Research Paves Way for Development of Cyborg Moth ‘Biobots’

Matt Shipman | News Services | 919.515.6386

Dr. Alper Bozkurt | 919.515.7349

Release Date: 08.20.14

North Carolina State University researchers have developed methods for electronically manipulating the flight muscles of moths and for monitoring the electrical signals moths use to control those muscles. The work opens the door to the development of remotely controlled moths, or “biobots,” for use in emergency response.

“In the big picture, we want to know whether we can control the movement of moths for use in applications such as search and rescue operations,” says Dr. Alper Bozkurt, an assistant professor of electrical and computer engineering at NC State and co-author of a paper on the work. “The idea would be to attach sensors to moths in order to create a flexible, aerial sensor network that can identify survivors or public health hazards in the wake of a disaster.”


A dog could run China’s banking system, says former statistics bureau spokesman

Yao Jingyuan predicts economic growth in 2014, but has harsh words for China’s banks

PUBLISHED: Tuesday, 24 December, 2013, 7:10pm

Jeremy Blum jeremy.blum@scmp.com


Yao Jingyuan. Photo: Xinhua

The former chief economist and spokesman of China’s National Bureau of Statistics estimated that the mainland’s economy grew 7.7 per cent in 2013, while also making a scathing criticism of China’s banking industry, likening it to an automated system that even a dog could successfully run.

“Banking in China has become like a highway toll system,” Yao Jingyuan said at a Saturday summit on China’s economy held at Nanjing University. “Banks charge every time money goes through them.”

Half of psychiatrists reject private and federal insurance, preferring cash

Contact: Jen Gundersen jeg2034@med.cornell.edu 646-317-7402 Weill Cornell Medical College

Researchers warn that just when the need for mental health services is recognized on a national level, access to help is declining at an alarming rate.

Spicy food on the menu 6000 to 23,000 years ago

 

Even in prehistoric Denmark, some liked it hot. Residues scraped from the inside of 6000-year-old pots found in the Baltic show they were used to cook meat and fish that were seasoned with a peppery, mustard-like spice.

Exactly when humans began to season their food is something of a mystery, says Oliver Craig at the University of York, UK. “Spices grow in the wild as part of the background flora,” he says. “So if you find the botanical remains of spices at a site you don’t know whether they were actually used in food or whether they just came from plants growing nearby.”

So although coriander seeds have been found at a 23,000-year-old site in Israel, we cannot be sure that they were used to flavour food.

Craig and Hayley Saul, also at York, have now found clear evidence that spices were intentionally added to food in northern Europe by around 6100 years ago – the earliest known evidence of spiced food in Europe, and perhaps anywhere in the world.

Mustard me up

Their team analysed deposits left inside 74 cooking pots from prehistoric sites in Denmark and Germany. They contained chemical signatures consistent with the presence of meat or fish, and phytoliths – microscopic mineral particles from plant tissue – similar to those associated with seeds of garlic mustard (Alliaria petiolata), a local plant with a strong peppery flavour but little nutritional value.

There were significantly more phytoliths in the pot residues than in the sediment at the site. This suggests the garlic mustard had been brought in from outside the site and deliberately added to the pots.
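
The logic of that comparison can be made concrete with a toy calculation. The sketch below is purely illustrative, with invented counts and a generic two-sample test; it is not the authors’ actual analysis.

```python
# Hypothetical phytolith counts per sample; NOT data from the study.
from scipy import stats

residue = [42, 55, 38, 61, 47, 53]   # invented counts from pot-residue samples
sediment = [4, 7, 2, 6, 5, 3]        # invented counts from site-sediment samples

# Welch's t-test: is the concentration in the residues higher than background?
t_stat, p_value = stats.ttest_ind(residue, sediment, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value would support deliberate addition of garlic mustard to
# the pots rather than incidental deposition from nearby plants.
```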

“In Europe we see spices coming in as imports a few thousand years ago, but what we’ve found is that Europeans were putting spice in food long before that,” says Craig.

In fact, the pots predate the arrival of agriculture in the region, says Craig: the chefs at work in Germany and Denmark were hunter-gatherers. “Quite often we associate the arrival of farming with the first use of new plants and spices,” he says. “But people were putting spice in foods before then. It’s probably always been part of our cuisine.”

Hot and healthy

That raises an obvious question: why are humans so keen on spicy food? There are a few possible explanations. We know, for instance, that even the Neanderthals exploited a variety of plants in their environments for their medicinal properties. So it might be significant that garlic mustard was once used medicinally as a disinfectant.

Another idea, first suggested by Paul Sherman at Cornell University in the 1990s, is that people began seasoning their food because some spices are antimicrobial and guard against food spoilage. In other words, humans may have learned to love spicy food for evolutionary reasons – because it was safer to eat.

“One of the things Hayley found is that there seems to have been a preference to put the garlic mustard in pots that contained fish,” says Craig. “That might be associated with covering the smell, or even have had a role in preserving the fish.”

Still, he prefers a simpler explanation. “I think there may not have necessarily been a functional role here,” he says. “It might simply be down to the aesthetic of taste. We just like these spices.”

Journal reference: PLoS One, DOI: 10.1371/journal.pone.0070583

Several studies support the role of choline in fetal development and throughout the lifespan – only 10% of the population meets requirements

2010 study posted for filing

Contact: Egg Nutrition News Bureau info@incredible-egg.org 312-233-1211 Egg Nutrition News Bureau

Essential nutrient in eggs may reduce risk of infant heart defects

A study published in the American Journal of Clinical Nutrition found that a choline-deficient diet is associated with increased risk for heart defects during prenatal development.[1] Choline is an essential nutrient required for normal cell activity, healthy brain and nerve function, liver metabolism and transportation of nutrients throughout the body. Research shows that only 10 percent or fewer of older children, men, women and pregnant women in America meet the Adequate Intake (AI) levels for choline, despite a growing body of science supporting the importance of choline, especially in healthy fetal development.[2]

Vital Role of Choline During Pregnancy

A growing body of science, conducted in both animals and humans, supports the need for more dietary choline. Researchers from McGill University and Cornell University examined the offspring of mice that consumed a choline-deficient diet during pregnancy compared to the offspring of mice that consumed a diet containing the recommended amount of choline. The researchers observed that heart defects were more prevalent among the offspring of mice consuming a choline-deficient diet. The study also found that low choline intake was associated with increased levels of homocysteine, an amino acid in the blood that, when elevated, is associated with an increased risk of cardiovascular disease and declined cognitive function.

“Choline is a complex nutrient that is intricately involved in fetal development, and this research reveals another piece of the puzzle,” according to Marie Caudill, Ph.D., R.D., associate professor at Cornell University. “Women with diets low in choline have two times greater risk of having babies with neural tube defects, so it’s essential that nutrition education during pregnancy and breastfeeding highlight the importance of dietary sources of choline.”

Another study, published in the June issue of Behavioral Neuroscience, reported that choline intake during pregnancy and lactation is associated with improved attention function.[3] The researchers observed that offspring of female mice consuming a diet supplemented with choline during pregnancy and lactation performed significantly better on attention tasks compared to offspring from mothers consuming a diet not supplemented with choline.

The Importance of Choline Throughout the Lifespan

Another study published in the American Journal of Clinical Nutrition examined adult dietary intake of choline and betaine (a nutrient related to choline) and found that higher intakes of choline and betaine were associated with lower blood homocysteine concentrations, especially in subjects with low blood levels of folate and vitamin B12.[4] Choline, like folate, is involved in breaking down homocysteine in the blood. Elevated homocysteine concentrations have been associated with increased risk of stroke, coronary heart disease and cognitive decline.

In May, a study published online in the Journal of Nutrition reported on the role of choline in the complex system that regulates DNA production and stability. Researchers studied the impact of choline intake on DNA damage in 60 Mexican-American men. They found that individuals with greater intakes of choline, even exceeding current dietary recommendations, exhibited the least amount of DNA damage.[5]

Focusing on a Choline-Rich Diet

“Choline is important for people of all ages, particularly moms and moms-to-be,” says Neva Cochran, M.S., R.D., nutrition communications consultant and nutrition writer and researcher for Woman’s World magazine. “It is easy to meet the recommended choline intake with delicious foods like an egg, which is an excellent source of choline and provides roughly one-quarter of a pregnant or breastfeeding woman’s choline needs.”

Cochran recommends the following choline-rich meal ideas as part of a balanced diet:

  • Basic Hard-Cooked Eggs – Prepare a batch of hard-cooked eggs on Sunday to have high-quality protein meals and snacks on hand throughout the week, which is especially important for moms-to-be.
  • Cereal Bowl Egg & Cheese Breakfast Burrito – Try this microwavable burrito bowl topped with cheese and salsa – a quick, easy breakfast that can be enjoyed in seconds.
  • Basic Frittata – Make fillings from your favorite foods or from leftovers. Use a combination of meat, seafood or poultry, cheese, vegetables and cooked pasta or grains.

 

###

Choline Resources

  • To learn more about choline and to download free educational materials, visit www.cholineinfo.org.
  • To learn more about prenatal nutrition and download a free copy of the Pregnancy Food Guide, visit http://www.pregnancyfoodguide.org/.
  • For more information on the nutritional benefits of eggs, visit the Egg Nutrition Center at www.enc-online.org.
  • For additional choline-rich egg recipes and preparation tips, visit the American Egg Board at www.incredibleegg.org.

 

About the American Egg Board (AEB)

AEB is the U.S. egg producers’ link to the consumer in communicating the value of The incredible edible egg™ and is funded from a national legislative checkoff on all egg production from companies with greater than 75,000 layers in the continental United States. The board consists of 18 members and 18 alternates from all regions of the country who are appointed by the Secretary of Agriculture. The AEB staff carries out the programs under the board’s direction. AEB is located in Park Ridge, Ill. Visit www.IncredibleEgg.org for more information.

About the Egg Nutrition Center (ENC)

The Egg Nutrition Center (ENC) is the health education and research center of the American Egg Board. Established in 1979, ENC provides science-based information to health promotion agencies, physicians, dietitians, nutritional scientists, media and consumers on issues related to egg nutrition and the role of eggs in the American diet. ENC is located in Park Ridge, IL. Visit www.enc-online.org for more information.

1. Chan J, Deng L, Mikael LG, Yan J, Pickell L, Wu Q, Caudill M, Rozen R. Low dietary choline and low dietary riboflavin during pregnancy influence reproductive outcomes and heart development in mice. Am J Clin Nutr 2010; 91:1035-43.

2. Jensen HH, Batres-Marquez P, Carriquiry A, Schalinske KL. Choline in the diets of the US population: NHANES, 2003-2004. The FASEB Journal 2007;21:lb219.

3. Moon J, Chen M, Gandhy SU, Strawderman M, Levitsky DA, Maclean KN, Strupp BJ. Perinatal choline supplementation improves cognitive functioning and emotion regulation in the Ts65Dn mouse model of Down syndrome. Behavioral Neuroscience 2010;124:346-361.

4. Lee JE, Jacques PF, Dougherty L, Selhub J, Giovannucci E, Zeisel SH, Cho E. Are dietary choline and betaine intakes determinants of total homocysteine concentration? Am J Clin Nutr 2010;91:1303-10.

5. Shin W, Yan J, Abratte CM, Vermeylen F, Caudill M. Choline intake exceeding current dietary recommendations preserves markers of cellular methylation in a genetic subgroup of folate-compromised men. J Nutr 2010;140:975-980.

Key nutrient in maternal diet promises ‘dramatic’ improvements for people with Down syndrome (choline)

2010 study posted for filing

Contact: John Carberry jjc338@cornell.edu 607-255-5353 Cornell University

ITHACA, N.Y. – A nutrient found in egg yolks, liver and cauliflower, taken by mothers during pregnancy and nursing, may offer lifelong “dramatic” health benefits to people with Down syndrome.

A new study done at Cornell University and published June 2 in the peer-reviewed journal Behavioral Neuroscience found that more choline during pregnancy and nursing could provide lasting cognitive and emotional benefits to people with Down syndrome. The work indicated greater maternal levels of the essential nutrient also could protect against neurodegenerative conditions such as Alzheimer’s disease.

“We found that supplementing the maternal diet with additional choline resulted in dramatic improvements in attention and some normalization of emotion regulation in a mouse model of Down syndrome,” said lead author Barbara Strupp, professor of nutritional sciences and of psychology.

In addition to mental retardation, individuals with Down syndrome often experience dementia in middle age as a result of brain neuron atrophy similar to that suffered by people with Alzheimer’s disease. Strupp said the improved mental abilities found in the Down syndrome mice following maternal choline supplements could indicate protection from such neurodegeneration “in the population at large.”

Strupp and her co-authors tested Down syndrome-model mice born to mothers that were fed a normal diet versus those given choline supplements during their three-week pregnancy and three-week lactation period. They also examined normal mice born to mothers with and without additional choline. The choline-supplemented mothers received about 4.5 times more choline (roughly comparable to levels at the higher range of human intake) than unsupplemented mothers.

Beginning at 6 months of age, the mice performed a series of behavioral tasks over a period of about six months to assess their impulsivity, attention span, emotional control and other mental abilities. The researchers found the unsupplemented Down syndrome-model mice became more agitated after a mistake than normal mice, jumping repeatedly and taking longer to initiate the next trial. The choline-supplemented Down syndrome-model mice showed partial improvement in these areas.

“I’m impressed by the magnitude of the cognitive benefits seen in the Down syndrome-model mice,” Strupp said. “Moreover, these are clearly lasting cognitive improvements, seen many months after the period of choline supplementation.”

Strupp said the results are consistent with studies by other researchers that found increased maternal choline intake improves offspring cognitive abilities in rats. However, this is the first study to evaluate the effects of maternal choline supplementation in a rodent model of Down syndrome.

Previous studies in humans and laboratory animals have found that supplementing the diets of adults with choline is largely ineffective in improving cognition.

“Although the precise mechanism is unknown, these lasting beneficial effects of choline observed in the present study are likely to be limited to increased intake during very early development,” Strupp said.

###

The study, funded in part by the National Institutes of Health, was part of the dissertation of Cornell doctoral candidate Jisook Moon. Other Cornell collaborators included Myla Strawderman, research associate in nutritional sciences, and David Levitsky, professor of nutrition and psychology. Strupp and collaborators have received additional NIH funding to study the neural mechanisms underlying the results observed in this study.

Long-term use of vitamin E may decrease COPD risk

2010 study posted for filing

Contact: Keely Savoie
ksavoie@thoracic.org
212-315-8620
American Thoracic Society

ATS 2010, NEW ORLEANS— Long-term, regular use of vitamin E in women 45 years of age and older may help decrease the risk of chronic obstructive pulmonary disease (COPD) by about 10 percent in both smokers and non-smokers, according to a study conducted by researchers at Cornell University and Brigham and Women’s Hospital.

“As lung disease develops, damage occurs to sensitive tissues through several proposed processes, including inflammation and damage from free radicals,” said Anne Hermetet Agler, doctoral candidate with Cornell University’s Division of Nutritional Sciences. “Vitamin E may protect the lung against such damage.”

The results of the study will be presented at the ATS 2010 International Conference in New Orleans.

“The findings from our study suggest that increasing vitamin E prevents COPD,” said Ms. Agler. “Previous research found that higher intake of vitamin E was associated with a lower risk of COPD, but the studies were not designed to answer the question of whether increasing vitamin E intake would prevent COPD. Using a large, randomized controlled trial to answer this question provided stronger evidence than previous studies.”

Ms. Agler and colleagues reviewed data compiled by the Women’s Health Study, a long-term trial that ended in 2004 and focused on the effects of aspirin and vitamin E in the prevention of cardiovascular disease and cancer in nearly 40,000 women aged 45 years and older. Study participants were randomized to receive either 600 IU of vitamin E or a placebo every other day during the course of the research.

Although fewer women taking vitamin E developed COPD, Ms. Agler noted the supplements appeared to have no effect on asthma, and women taking vitamin E supplements were diagnosed with asthma at about the same rate as women taking placebo pills. Importantly, Ms. Agler noted the decreased risk of COPD in women who were given vitamin E was the same for smokers as for non-smokers.

Ms. Agler said further research will explore the way vitamin E affects the lung tissue and function, and will assess the effects of vitamin E supplements on lung diseases in men.

“If results of this study are borne out by further research, clinicians may recommend that women take vitamin E supplements to prevent COPD,” Ms. Agler noted. “Remember that vitamin E supplements are known to have detrimental effects in some people; for example, vitamin E supplementation increased risk of congestive heart failure in cardiovascular disease patients. Broader recommendations would need to balance both benefits and risks.”

 

###

 

“Randomized Vitamin E Supplementation and Risk of Chronic Lung Disease (CLD) in the Women’s Health Study” (Session C103, Tuesday, May 18, 1:30- 4:00 p.m., CC-Room 353-355 (Third Level), Morial Convention Center; Abstract 3727)

82nd Health Research Report 31 MAY 2010 – Reconstruction

Health Research Report

82nd Issue 31 May 2010

Compiled By Ralph Turchiano

www.healthresearchreport.me www.vit.bz

www.youtube.com/vhfilm www.facebook.com/engineeringevil

www.engineeringevil.com

In this Issue:

1. Long-term use of vitamin E may decrease COPD risk

2. Eating processed meats, but not unprocessed red meats, may raise risk of heart disease and diabetes

3. Most patients survive common thyroid cancer regardless of treatment

4. ‘Fountain of youth’ steroids, such as pregnenolone and DHEA, could protect against heart disease

5. New evidence caffeine may slow Alzheimer’s disease and other dementias, restore cognitive function

6. High-fat ketogenic diet effectively treats persistent childhood seizures

7. Study: Yogurt-like drink DanActive reduced rate of common infections in daycare children

8. Estrogen-lowering drugs minimize surgery in breast cancer patients

9. Prenatal exposure to endocrine-disrupting chemicals linked to breast cancer

10. Anti-aging supplements may be best taken not too late in life

11. Folate prevents alcohol-induced congenital heart defects in mice

12. LSUHSC researcher finds surprising link between sugar in drinks and blood pressure

13. Dangerous lung worms found in people who eat raw crayfish

14. Some bisphosphonates users unfamiliar with drug’s possible side effects on oral health

15. You have no natural right to food

 

Public release date: 17-May-2010

Eating processed meats, but not unprocessed red meats, may raise risk of heart disease and diabetes

Boston, MA – In a new study, researchers from the Harvard School of Public Health (HSPH) have found that eating processed meat, such as bacon, sausage or processed deli meats, was associated with a 42% higher risk of heart disease and a 19% higher risk of type 2 diabetes. In contrast, the researchers did not find any higher risk of heart disease or diabetes among individuals eating unprocessed red meat, such as beef, pork or lamb. This work is the first systematic review and meta-analysis of the worldwide evidence for how eating unprocessed red meat and processed meat relates to risk of cardiovascular diseases and diabetes.

“Although most dietary guidelines recommend reducing meat consumption, prior individual studies have shown mixed results for relationships between meat consumption and cardiovascular diseases and diabetes,” said Renata Micha, a research fellow in the department of epidemiology at HSPH and lead author of the study. “Most prior studies also did not separately consider the health effects of eating unprocessed red versus processed meats.”

The study appears online May 17, 2010, on the website of the journal Circulation.

The researchers, led by Micha with HSPH colleagues Dariush Mozaffarian, assistant professor in the department of epidemiology, and Sarah Wallace, junior research fellow in the same department, systematically reviewed nearly 1,600 studies. Twenty relevant studies were identified, which included a total of 1,218,380 individuals from 10 countries on four continents (North America, Europe, Australia and Asia).

The researchers defined unprocessed red meat as any unprocessed meat from beef, lamb or pork, excluding poultry. Processed meat was defined as any meat preserved by smoking, curing or salting, or with the addition of chemical preservatives; examples include bacon, salami, sausages, hot dogs or processed deli or luncheon meats. Vegetable or seafood protein sources were not evaluated in these studies.

The results showed that, on average, each 50 gram (1.8 oz) daily serving of processed meat (about 1-2 slices of deli meats or 1 hot dog) was associated with a 42% higher risk of developing heart disease and a 19% higher risk of developing diabetes. In contrast, eating unprocessed red meat was not associated with risk of developing heart disease or diabetes. Too few studies evaluated the relationship between eating meat and risk of stroke to enable the researchers to draw any conclusions.
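
Those percentages are relative risks, and they only become tangible once applied to an absolute baseline. A minimal sketch, with invented baseline risks purely for illustration:

```python
# Relative risks reported by the study; the baseline risks below are invented.
rr_heart = 1.42       # per 50 g/day serving of processed meat
rr_diabetes = 1.19

baseline_heart = 0.05      # assumed 10-year risk of heart disease (hypothetical)
baseline_diabetes = 0.10   # assumed 10-year risk of type 2 diabetes (hypothetical)

print(f"Heart disease: {baseline_heart:.1%} -> {baseline_heart * rr_heart:.1%}")
print(f"Type 2 diabetes: {baseline_diabetes:.1%} -> {baseline_diabetes * rr_diabetes:.1%}")
# 5.0% -> 7.1% and 10.0% -> 11.9%: large relative increases,
# more modest in absolute terms.
```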

“Although cause-and-effect cannot be proven by these types of long-term observational studies, all of these studies adjusted for other risk factors, which may have been different between people who were eating more versus less meats,” said Mozaffarian. “Also, the lifestyle factors associated with eating unprocessed red meats and processed meats were similar, but only processed meats were linked to higher risk.”

“When we looked at average nutrients in unprocessed red and processed meats eaten in the United States, we found that they contained similar average amounts of saturated fat and cholesterol. In contrast, processed meats contained, on average, 4 times more sodium and 50% more nitrate preservatives,” said Micha. “This suggests that differences in salt and preservatives, rather than fats, might explain the higher risk of heart disease and diabetes seen with processed meats, but not with unprocessed red meats.”

Dietary sodium (salt) is known to increase blood pressure, a strong risk factor for heart disease. In animal experiments, nitrate preservatives can promote atherosclerosis and reduce glucose tolerance, effects which could increase risk of heart disease and diabetes.

Given the differences in health risks seen with eating processed meats versus unprocessed red meats, these findings suggest that these types of meats should be studied separately in future research for health effects, including cancer, the authors said. For example, higher intake of total meat and processed meat has been associated with higher risk of colorectal cancer, but unprocessed red meat has not been separately evaluated. They also suggest that more research is needed into which factors (especially salt and other preservatives) in meats are most important for health effects.

Current efforts to update the United States government’s Dietary Guidelines for Americans, which are often a reference for other countries around the world, make these findings particularly timely, the researchers say. They recommend that dietary and policy efforts should especially focus on reducing intake of processed meat.

“To lower risk of heart attacks and diabetes, people should consider which types of meats they are eating. Processed meats such as bacon, salami, sausages, hot dogs and processed deli meats may be the most important to avoid,” said Micha. “Based on our findings, eating one serving per week or less would be associated with relatively small risk.”

Public release date: 17-May-2010

Most patients survive common thyroid cancer regardless of treatment

Individuals with papillary thyroid cancer that has not spread beyond the thyroid gland appear to have favorable outcomes regardless of whether they receive treatment within the first year after diagnosis, according to a report in the May issue of Archives of Otolaryngology–Head & Neck Surgery, one of the JAMA/Archives journals.

Papillary thyroid cancer is commonly found on autopsy among individuals who died of other causes, according to background information in the article. “Studies published as early as 1947 demonstrated it, and more recently, a report has shown that nearly every thyroid gland might be found to have a cancer if examined closely enough,” the authors write. “The advent of ultrasonography and fine-needle aspiration biopsy has allowed many previously undetected cancers to be identified, changing the epidemiology of the disease. Over the past 30 years, the detected incidence of thyroid cancer has increased three-fold, the entire increase attributable to papillary thyroid cancer and 87% of the increase attributable to tumors measuring less than 2 centimeters.”

Louise Davies, M.D., M.S., of Dartmouth Medical School, Hanover, N.H., and Gilbert Welch, M.D., M.P.H., both also of the Department of Veterans Affairs Medical Center, White River Junction, Vt., and The Dartmouth Institute for Health Policy and Clinical Practice, Hanover, studied cancer cases and individual treatment data from National Cancer Institute registries. They then tracked cause of death through the National Vital Statistics System.

The researchers identified 35,663 patients with papillary thyroid cancer that had not spread to the lymph nodes or other areas at diagnosis. Of these, 440 (1.2 percent) did not undergo immediate, definitive treatment. Over an average of six years of follow-up, six of these patients died of their cancer. This was not significantly different from the rate of cancer death among the 35,223 individuals who did undergo treatment (161 over an average of 7.6 years of follow-up).

The 20-year survival rate from cancer was estimated to be 97 percent for those who did not receive treatment and 99 percent for those who did. “These data help put management decisions about localized papillary thyroid cancer in perspective: papillary thyroid cancers of any size that are confined to the thyroid gland, have no lymph node metastases at presentation and do not show extraglandular extension [reach beyond the thyroid gland] are unlikely to result in death due to the cancer,” the authors write.
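
One way to put those survival estimates in perspective is a naive number-needed-to-treat calculation. It assumes, contrary to the study’s observational design, that the two-point difference is entirely causal, so treat it as a rough illustration only:

```python
# Naive NNT from the reported 20-year cancer survival estimates.
# Causality is an assumption here; the study was observational.
survival_treated = 0.99
survival_untreated = 0.97

arr = survival_treated - survival_untreated   # absolute risk reduction
nnt = 1 / arr
print(f"ARR = {arr:.0%}, NNT = {nnt:.0f}")
# Even at face value, roughly 50 patients would need immediate treatment
# to avert one cancer death over 20 years.
```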

“Thus, clinicians and patients should feel comfortable considering the option to observe for a year or longer cancers that fall into this category,” they conclude. “When treatment is elected, the cancers in this category can be managed with either hemithyroidectomy [removal of part of the thyroid] or total thyroidectomy [removal of the complete gland], and the prognosis will be the same.”

Public release date: 17-May-2010

‘Fountain of youth’ steroids could protect against heart disease

Such as pregnenolone and DHEA

A natural defence mechanism against heart disease could be switched on by steroids sold as health supplements, according to researchers at the University of Leeds.

The University of Leeds biologists have identified a previously-unknown ion channel in human blood vessels that can limit the production of inflammatory cytokines – proteins that drive the early stages of heart disease.

They found that this protective effect can be triggered by pregnenolone sulphate – a molecule that is part of a family of ‘fountain-of-youth’ steroids. These steroids are so-called because of their apparent ability to improve energy, vision and memory.

Importantly, collaborative studies with surgeons at Leeds General Infirmary have shown that this defence mechanism can be switched on in diseased blood vessels as well as in healthy vessels.

So-called ‘fountain of youth’ steroids are made naturally in the body, but levels decline rapidly with age. This has led to a market in synthetically made steroids that are promoted for their health benefits, such as pregnenolone and DHEA. Pregnenolone sulphate is in the same family of steroids but it is not sold as a health supplement.

“The effect that we have seen is really quite exciting and also unexpected,” said Professor David Beech, who led the study. “However, we are absolutely not endorsing any claims made by manufacturers of any health supplements. Evidence from human trials is needed first.”

A chemical profiling study indicated that the protective effect was not as strong when cholesterol was present too. This suggests that the expected benefits of ‘fountain of youth’ steroids will be much greater if they are used in combination with cholesterol-lowering drugs and/or other healthy lifestyle strategies such as diet and exercise.

“These ‘fountain of youth’ steroids are relatively cheap to make and some of them are already available as commercial products. So if we can show that this effect works in people as well as in lab-based studies, then it could be a cost-effective approach to addressing cardiovascular health problems that are becoming epidemic in our society and world-wide,” Professor Beech added.

The paper is published in Circulation Research.

Public release date: 17-May-2010

New evidence caffeine may slow Alzheimer’s disease and other dementias, restore cognitive function

Researchers explore potential benefits of caffeine in special supplement to the Journal of Alzheimer’s Disease

Amsterdam, The Netherlands, May 17, 2010 – Although caffeine is the most widely consumed psychoactive drug worldwide, its potential beneficial effect for maintenance of proper brain functioning has only recently begun to be adequately appreciated. Substantial evidence from epidemiological studies and fundamental research in animal models suggests that caffeine may be protective against the cognitive decline seen in dementia and Alzheimer’s disease (AD). A special supplement to the Journal of Alzheimer’s Disease, “Therapeutic Opportunities for Caffeine in Alzheimer’s Disease and Other Neurodegenerative Diseases,” sheds new light on this topic and presents key findings.

Guest editors Alexandre de Mendonça, Institute of Molecular Medicine and Faculty of Medicine, University of Lisbon, Portugal, and Rodrigo A. Cunha, Center for Neuroscience and Cell Biology of Coimbra and Faculty of Medicine, University of Coimbra, Portugal, have assembled a group of international experts to explore the effects of caffeine on the brain. The resulting collection of original studies conveys multiple perspectives on topics ranging from molecular targets of caffeine, neurophysiological modifications and adaptations, to the potential mechanisms underlying the behavioral and neuroprotective actions of caffeine in distinct brain pathologies.

“Epidemiological studies first revealed an inverse association between the chronic consumption of caffeine and the incidence of Parkinson’s disease,” according to Mendonça and Cunha. “This was paralleled by animal studies of Parkinson’s disease showing that caffeine prevented motor deficits as well as neurodegeneration. Later, a few epidemiological studies showed that the consumption of moderate amounts of caffeine was inversely associated with the cognitive decline associated with aging as well as the incidence of Alzheimer’s disease. Again, this was paralleled by animal studies showing that chronic caffeine administration prevented memory deterioration and neurodegeneration in animal models of aging and of Alzheimer’s disease.”

Key findings presented in “Therapeutic Opportunities for Caffeine in Alzheimer’s Disease and Other Neurodegenerative Diseases”:

• Multiple beneficial effects of caffeine to normalize brain function and prevent its degeneration

• Caffeine’s neuroprotective profile and its ability to reduce amyloid-beta production

• Caffeine as a candidate disease-modifying agent for Alzheimer’s disease

• Positive impact of caffeine on cognition and memory performance

• Identification of adenosine A2A receptors as the main target for neuroprotection afforded by caffeine consumption

• Confirmation of data through valuable meta-analyses presented

• Epidemiological studies corroborated by meta-analysis suggesting that caffeine may be protective against Parkinson’s disease

• Several methodological issues must be solved before advancing to decisive clinical trials

Mendonça and Cunha also observe that “the daily follow-up of patients with AD has taught us that improvement of daily living may be a more significant indicator of amelioration than slight improvements in objective measures of memory performance. One of the most prevalent complications of AD is depression of mood, and the recent observations that caffeine might be a mood normalizer are of particular interest.”

Public release date: 17-May-2010

 

High-fat ketogenic diet effectively treats persistent childhood seizures

The high-fat ketogenic diet can dramatically reduce or completely eliminate debilitating seizures in most children with infantile spasms whose seizures persist despite medication, according to a Johns Hopkins Children’s Center study published online April 30 in the journal Epilepsia.

Infantile spasms, also called West syndrome, is a stubborn form of epilepsy that often does not get better with antiseizure drugs. Because poorly controlled infantile spasms may cause brain damage, the Hopkins team’s findings suggest the diet should be started at the earliest sign that medications aren’t working.

“Stopping or reducing the number of seizures can go a long way toward preserving neurological function, and the ketogenic diet should be our immediate next line of defense in children with persistent infantile spasms who don’t improve with medication,” says senior investigator Eric Kossoff, M.D., a pediatric neurologist and director of the ketogenic diet program at Hopkins Children’s.

The ketogenic diet, made up of high-fat foods and few carbohydrates, works by triggering biochemical changes that eliminate seizure-causing short circuits in the brain’s signaling system. It has been used successfully in several forms of epilepsy.

A small 2002 study by the same Hopkins team showed the diet worked well in a handful of children with infantile spasms. The new study is the largest analysis thus far showing just how effective the diet can be in children with this condition.

Of the 104 children treated by the Hopkins team, nearly 40 percent, or 38 children, became seizure-free for at least six months after being on the diet for anywhere from just a few days to 20 months. Of the 38, 30 have remained so without a relapse for at least two years.

After three months on the diet, one-third of the children had 90 percent fewer seizures, and after nine months on the diet, nearly half of the children in the study had 90 percent fewer seizures. Nearly two-thirds had half as many seizures after six months on the diet.

Nearly two-thirds of the children experienced improvement in their neurological and cognitive development, and nearly 30 percent were weaned off antiseizure medications after starting the diet.

Most of the children continued taking their medication even after starting the diet, the researchers say, because the two are not mutually exclusive and can often work in synergy.

Researchers also used the diet as first-line therapy in 18 newly diagnosed infants never treated with drugs, 10 of whom became seizure free within two weeks of starting the diet. The finding suggests that, at least in some children, the diet may work well as first-line therapy, but the researchers say they need further and larger studies to help them identify patients for whom the diet is best used before medications. Hopkins Children’s neurologists are actively using the ketogenic diet as first-line treatment in children with infantile spasms with promising results.

Side effects, including constipation, heartburn, diarrhea and temporary spikes in cholesterol levels, occurred in one-third of the children, with six percent of them experiencing diminished growth.

Despite these side effects, a recent study by Kossoff and his team showed that the ketogenic diet is safe long term.

Conflict of interest disclosure: Dr. Kossoff has received grant support from Nutricia Inc., for unrelated research. The terms of these arrangements are being managed by the Johns Hopkins University in accordance with its conflict-of-interest policies.

Public release date: 19-May-2010

Study: Yogurt-like drink DanActive reduced rate of common infections in daycare children

Washington, DC – The probiotic yogurt-like drink DanActive reduced the rate of common sicknesses such as ear infections, sinusitis, the flu and diarrhea in daycare children, say researchers who studied the drink in the largest known probiotic clinical trial to be conducted in the United States. An additional finding, however, showed no reduction in the number of school days missed. The study, led by Daniel Merenstein of Georgetown University School of Medicine (GUSOM), was funded by The Dannon Company, Inc., and published today online in the European Journal of Clinical Nutrition.

Probiotic foods are continuing to increase in popularity and some are marketed for the potential benefits of probiotics such as Lactobacillus casei (L. casei) DN-114 001, the probiotic in DanActive. Studies in other countries have found that probiotics, which are live micro-organisms, produce positive health benefits in children, including the reduction of school days missed due to infections. However, most of the research was conducted outside the United States in structured conditions not comparable to normal everyday living.

“We were interested in a study that resembled how children in the U.S. consume drinks that are stored in home refrigerators and consumed without study personnel observation,” says the study’s lead author Daniel Merenstein, MD, director of research in the Department of Family Medicine at GUSOM.

“…To our knowledge this is the largest probiotic clinical trial conducted in the U.S. and provides much needed data,” say the authors of the study. “We studied a functional food, not a medicinal product; parents will thus feed their children without any physician input and we felt it was best to assess [the drink] under similar conditions.”

The study, titled DRINK (Decreasing the Rates of Illness in Kids), was a randomized, double-blind, placebo-controlled study – the gold standard in clinical research design. It included 638 healthy children, aged three to six, who attended school five days a week. Parents were asked to give their child a daily strawberry yogurt-like drink. Some of the drinks were supplemented with the probiotic strain L. casei DN-114 001 (DanActive), while others had no probiotics (placebo). Neither the study coordinators, the children, nor the parents knew which drink was given to which participant until the study ended. In addition to phone interviews with researchers, parents kept daily diaries of their child’s health and the number of drinks consumed.

Researchers found a 19 percent decrease in common infections among the children who drank the yogurt-like drink with L. casei DN-114 001 compared to those whose drink did not have the probiotic. More specifically, those who drank DanActive had 24 percent fewer gastrointestinal infections (such as diarrhea, nausea, and vomiting), and 18 percent fewer upper respiratory tract infections (such as ear infections, sinusitis and strep). However, the reduction in infections did not result in fewer missed school days or activities – also a primary outcome of the study.
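
As with any relative reduction, the practical size of the effect depends on the underlying infection rate. A sketch applying the reported percentages to an invented baseline (the figure of six infections per child-year is hypothetical, and each category is treated as if it had that same baseline):

```python
# Applying the reported relative reductions to an invented baseline rate.
baseline = 6.0  # hypothetical infections per child per year

for label, reduction in [("all common infections", 0.19),
                         ("gastrointestinal", 0.24),
                         ("upper respiratory", 0.18)]:
    expected = baseline * (1 - reduction)
    print(f"{label}: {baseline:.1f} -> {expected:.1f} per year")
# A real but modest shift, consistent with the study's finding that
# missed school days did not measurably change.
```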

“Our study had mixed results,” says Merenstein. “Children in school or daycare are especially susceptible to these illnesses. We did find some differences in infection rates but this did not translate to fewer missed school days or change in daily activity. It is my hope that safe and tolerable ways to reduce illnesses could eventually result in fewer missed school days which means fewer work days missed by parents.”

“It is important that more of these products are put under the microscope by independent academic researchers,” he concludes.

Public release date: 20-May-2010

Estrogen-lowering drugs minimize surgery in breast cancer patients

A nationwide study has confirmed the benefit of giving estrogen-lowering drugs before surgery to breast cancer patients. The treatment increased the likelihood that women could undergo breast-conservation surgery, also called lumpectomy, instead of mastectomy.

The study’s chair, Matthew J. Ellis, MD, PhD, the Anheuser-Busch Endowed Chair in Medical Oncology and a breast cancer specialist with the Alvin J. Siteman Cancer Center at Barnes-Jewish Hospital and Washington University School of Medicine in St. Louis, will present the findings June 7 at the annual meeting of the American Society of Clinical Oncology.

Sponsored by the American College of Surgeons Oncology Group, the study took place at 118 hospitals across the country and involved 352 postmenopausal women with estrogen-receptor positive (ER+) breast tumors. The participants received aromatase inhibitors for 16 weeks before surgery for breast cancer, and the extent of their tumors was monitored before and after the drug treatment.

The lead investigator at the Washington University site was Julie A. Margenthaler, MD, assistant professor of surgery and a breast surgeon at the Siteman Cancer Center.

Aromatase inhibitors are also referred to as estrogen-lowering agents because they interfere with the body’s production of estrogen, a hormone that stimulates the growth of ER+ breast tumors. ER+ is the most common form of breast cancer, accounting for three-quarters of cases.

All women in the study had stage II or III breast cancer, in which tumors are about an inch or larger in size and may have spread to the lymph nodes in the underarm area. Participants were placed in one of three groups at the study’s start:

• marginal, meaning breast-conservation surgery was possible but likely to be disfiguring or to require several surgical procedures;

• mastectomy-only, meaning breast-conservation surgery was not possible; and

• inoperable, meaning mastectomy would not completely remove the cancer.

After the 16-week aromatase inhibitor therapy, the women were reevaluated to see which surgical option was appropriate for them. The results showed that 82 percent of women in the marginal group, 51 percent in the mastectomy-only group and 75 percent in the inoperable group had successful breast-conservation surgery instead of mastectomy.

“Aromatase inhibitor therapy shrank the tumors in many of these women and improved surgical outcomes,” Ellis says. “These results will encourage a change in practice across the country so that more women can benefit from the currently underutilized approach of administering estrogen-lowering agents before surgery.”

The study participants were randomly assigned to receive one of three estrogen-lowering agents: exemestane (25 mg daily), letrozole (2.5 mg daily) or anastrozole (1 mg daily). No statistically significant difference in effectiveness was found among the three drugs.

Ellis explains that there are other benefits to using estrogen-lowering agents before surgery.

“ER+ breast cancer can be thought of as a chronic disease because patients generally take estrogen-lowering agents for many years after surgery to repress recurrence,” Ellis says. “In other chronic diseases, such as hypertension or diabetes, a patient’s response to treatment is continually monitored. But we’ve never done that with breast cancer. By treating breast cancer patients with estrogen-lowering drugs for three or four months before surgery, we can monitor treatment response and then specifically tailor surgical and post-surgical treatment based on this response.”

 

Public release date: 21-May-2010

Prenatal exposure to endocrine-disrupting chemicals linked to breast cancer

A study in mice reveals that prenatal exposure to endocrine-disrupting chemicals, like bisphenol-A (BPA) and diethylstilbestrol (DES), may program a fetus for life. Therefore, adult women who were exposed prenatally to BPA or DES could be at increased risk of breast cancer, according to a new study accepted for publication in Hormones & Cancer, a journal of The Endocrine Society.

Endocrine-disrupting chemicals are substances in the environment that interfere with hormone biosynthesis, metabolism or action resulting in adverse developmental, reproductive, neurological and immune effects in both humans and wildlife. These chemicals are designed, produced and marketed largely for specific industrial purposes.

“BPA is a weak estrogen and DES is a strong estrogen, yet our study shows both have a profound effect on gene expression in the mammary gland (breast) throughout life,” said Hugh Taylor, MD, of the Yale University School of Medicine in New Haven, Conn. and lead author of the study. “All estrogens, even ‘weak’ ones can alter the development of the breast and ultimately place adult women who were exposed to them prenatally at risk of breast cancer.”

In this study, researchers treated pregnant mice with BPA or DES and then looked at the offspring as adults. When the offspring reached adulthood, their mammary glands still produced higher levels of EZH2, a protein that helps regulate the expression of many genes. Higher EZH2 levels are associated with an increased risk of breast cancer in humans.

“We have demonstrated a novel mechanism by which endocrine-disrupting chemicals regulate developmental programming in the breast,” said Taylor. “This study generates important safety concerns about exposures to environmental endocrine disruptors such as BPA and suggests a potential need to monitor women exposed to these chemicals for the development of breast lesions as adults.”

Ralph’s Note – How many more warnings do we need? If this stuff is not banned now, we may be responsible for the deaths of many generations to come.

 

Public release date: 24-May-2010

Anti-aging supplements may be best taken not too late in life

Anti-aging supplements made up of mixtures might be better than single compounds at preventing decline in physical function, according to researchers at the University of Florida’s Institute on Aging. In addition, it appears that such so-called nutraceuticals should be taken before very old age for benefits such as improvement in physical function.

The findings from rat studies, published last week in the journal PLoS One, have implications for how dietary supplementation can be used effectively in humans.

“I think it is important for people to focus on good nutrition, but for those of advanced age who are running out of energy and not moving much, we’re trying to find a supplement mixture that can help improve their quality of life,” said Christiaan Leeuwenburgh, Ph.D., senior author of the paper and chief of the biology of aging division in the UF College of Medicine.

Scientists do not fully understand all the processes that lead to loss of function as people age. But more and more research points to the mitochondrial free radical theory of aging: the idea that, as people age, oxidative damage piles up in individual cells until the energy-generating system inside some cells stops working properly.

To address that problem, many anti-aging studies and supplements are geared toward reducing the effects of free radicals.

The UF researchers investigated the potential anti-aging benefits of a commercially available mixture marketed for relieving chronic fatigue and protecting against muscle aging. The supplement contains the antioxidant coenzyme Q10, creatine — a compound that aids muscle performance — and ginseng, which also has been shown to have antioxidant properties.

The study gauged the effects of the mixture on physical performance as well as on two mechanisms that underlie the aging process and many age-related disorders: dysfunction of the cells’ energy producing powerhouses, known as mitochondria, and oxidative stress.

The researchers fed the supplement to middle-aged 21-month-old and late-middle-aged 29-month-old rats — corresponding to 50- to 65-year-old and 65- to 80-year-old humans, respectively — for six weeks, and measured how strongly their paws could grip. Grip strength in rats is analogous to physical performance in humans, and deterioration in grip strength can provide useful information about muscle weakness or loss seen in older adults.

Grip strength improved 12 percent in the middle-aged rats compared with controls, but no improvement was found in the older group.

Measurements of the function of mitochondria corresponded with the grip strength findings. Stress tests showed that mitochondrial function improved 66 percent compared with controls in middle-aged rats but not in the older ones. That suggests that supplementation might be of greater effect before major age-related functional and other declines have set in, the researchers said.

“It is possible that there is a window during which these compounds will work, and if the intervention is given after that time it won’t work,” said Jinze Xu, Ph.D., first author of the paper and a postdoctoral researcher at UF.

The researchers are working to identify the optimal age at which various interventions can enhance behavioral or physical performance. Very few studies have been done to show the effect of interventions on the very old.

Interestingly, although the older rats had no improvement in physical performance or mitochondrial function, they had lowered levels of oxidative damage.

That shows that reduction of oxidative stress damage is not always matched by functional changes such as improvement in muscle strength.

As a result, research must focus on compounds that promote proper functioning of the mitochondria, since mitochondrial health is essential in older animals for reducing oxidative stress, the researchers said. And clinical trials need to be performed to test the effectiveness of the supplements in humans.

“It’s going to be very important to focus less on oxidative stress and biomarkers, and focus on having sufficient energy,” Leeuwenburgh said. “If energy declines, then you have an increased chance for oxidative stress or failure of repair mechanisms that recognize oxidative damage — we’re seeing that the health of mitochondria is central to aging.”

It is possible that although the supplement reduced oxidative stress damage, the damage in the much older animals was already too great for energy to be restored.

The different compounds in the mixture acted to produce effects that single compounds did not, because each component affected a different biochemical pathway in the body, addressing both oxidative stress and mitochondrial function, researchers said.

“People are catching on that using a single compound is not a good strategy — you have to use multiple compounds and target one or multiple pathways,” Leeuwenburgh said.

Public release date: 24-May-2010

Folate prevents alcohol-induced congenital heart defects in mice

University of South Florida study suggests high dose needed very early in pregnancy to protect developing heart

Tampa, FL (May 24, 2010) — A new animal study has found that high levels of the B-vitamin folate (folic acid) prevented heart birth defects induced by alcohol exposure in early pregnancy, the kind of exposure that underlies fetal alcohol syndrome.

Researchers at the University of South Florida College of Medicine and All Children’s Hospital report that the protection was afforded only when folate was administered very early in pregnancy and before the alcohol exposure. The dose that best protected against heart defects in mice was considerably higher than the current dietary recommendation of 400 micrograms (0.4 milligrams) daily for women of child-bearing age.

The findings were published online earlier this month in the American Journal of Obstetrics and Gynecology.

While more research is needed, the study has implications for re-evaluating folate supplementation levels during early pregnancy, said principal investigator Kersti Linask, PhD, the Mason Professor of Cardiovascular Development at USF and Children’s Research Institute/All Children’s Hospital.

“Congenital heart defects can occur in the developing embryo at a time when women typically do not even know they are pregnant – 16 to 18 days following conception. They may have been drinking alcohol or using prescription drugs without realizing this could be affecting embryonic development,” Dr. Linask said.

“We found that we could prevent alcohol-associated defects from arising in the mice — provided folate was given in relatively high concentrations very early in pregnancy around conception.”

In the USF study, two randomly assigned groups of pregnant mice were fed diets supplemented with folate in adjusted doses known from epidemiological studies to rescue human embryos from craniofacial birth defects. From the day after conception, one group received a high dose of folate supplementation (10.5 milligrams/kilogram) and the second received a moderate dose (6.2 mg/kg). A third control group ate a normal folate-supplemented diet (3.3 mg/kg) determined to maintain the general health of the pregnant mice, but not to rescue embryos from birth defects.

During the first week of pregnancy, the mice in all three groups were then administered injections of alcohol simulating a single binge drinking event in humans.

Following this alcohol exposure, Doppler ultrasound confirmed that 87 percent of the embryos of pregnant mice in the third group – those not receiving folate supplementation beyond what was present in their normal diets – had developed heart valve defects. The affected embryos were also smaller in size and their heart muscle walls appeared thinner.

Between days 15 and 16 of pregnancy in the mice – equivalent to 56 days of gestation in humans – ultrasound also showed that the high-folate diet protected heart valve development against lasting defects and restored heart function and embryonic size to near-normal levels. The moderate-folate diet provided only partial protection; in this group, 58 percent of the mouse embryos developed heart valves that functioned abnormally, with a backflow of blood.

The researchers suggest that folate fortification may be most effective at preventing heart birth defects when administered at significantly higher levels than the doses currently recommended to prevent pregnancy complications — both in normal women (0.4 milligrams recommended daily) and even in women who have delivered an infant with a spinal birth defect (4 milligrams daily). Although higher folate levels did not cause adverse side effects in the pregnant mice, Dr. Linask notes, the safety and effectiveness of higher doses must be proven with human trials.
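To put the mouse doses in human terms, a rough conversion helps. The short Python sketch below is an illustration only, not a calculation from the study: it assumes the standard FDA body-surface-area scaling for mice (multiply a mg/kg mouse dose by 3/37) and a 60-kg adult.

# Back-of-the-envelope human-equivalent doses for the three mouse diets.
# Assumptions, not data from the paper: FDA body-surface-area scaling
# (Km mouse = 3, Km human = 37) and a 60-kg adult body weight.
MOUSE_TO_HUMAN = 3 / 37
BODY_WEIGHT_KG = 60

for label, mouse_dose_mg_per_kg in [("high", 10.5), ("moderate", 6.2), ("control", 3.3)]:
    daily_mg = mouse_dose_mg_per_kg * MOUSE_TO_HUMAN * BODY_WEIGHT_KG
    print(f"{label}: {mouse_dose_mg_per_kg} mg/kg in mice ≈ {daily_mg:.0f} mg/day in a 60-kg adult")

Under these assumptions, even the control diet scales to roughly 16 mg/day, far above the 0.4 mg/day recommended for women of child-bearing age, which underscores why these mouse doses cannot be carried over to humans without the trials the authors call for.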

The heart is the first organ to form and function during embryonic development of vertebrates. The USF researchers suggest that folate supplementation thwarts alcohol’s damaging effect on an important early signaling pathway that plays a vital role in early heart development and subsequently in valve formation.

 

Public release date: 24-May-2010

LSUHSC researcher finds surprising link between sugar in drinks and blood pressure

New Orleans, LA – Research led by Liwei Chen, MD, PhD, Assistant Professor of Public Health at LSU Health Sciences Center New Orleans, has found an association between sugary drinks and blood pressure: by cutting consumption of sugary drinks by just one serving a day, people can lower their blood pressure. The research is published online in Circulation: Journal of the American Heart Association.

“We found no association for diet beverage consumption or caffeine intake and blood pressure,” notes Dr. Chen, “suggesting that sugar may actually be the nutrient that is associated with blood pressure and not caffeine which many people would suspect.”

The research, which was supported by a grant from the National Heart, Lung, and Blood Institute of the National Institutes of Health, analyzed dietary intake and blood pressure of 810 adults measured at baseline, 6 and 18 months. After known risk factors of high blood pressure were controlled for, a reduction in sugar-sweetened beverage consumption of one serving per day was associated with a drop of 1.8 mm Hg in systolic pressure and 1.1 mm Hg in diastolic blood pressure over 18 months.

After additional adjustment for weight change over the same period, a reduction in the consumption of sugar-sweetened beverages was still significantly associated with blood pressure reduction.

“By reducing the amount of sugar in your diet, you are also reducing the number of calories you consume and may lose weight,” adds Dr. Chen. “But even among those whose weight was stable, we still found that people who drank fewer sugary sodas lowered their blood pressure.”

Elevated blood pressure continues to be one of the most common and important health problems in the United States. According to the American Heart Association, about 74.5 million people in the United States age 20 and older – one in three – have high blood pressure. It is estimated that high blood pressure killed 56,561 Americans in 2006. From 1996 to 2006, the death rate from high blood pressure increased 19.5 percent, and the actual number of deaths rose 48.1 percent.

Normal blood pressure, measured in millimeters of mercury (mm Hg), is defined as systolic (top number) less than 120 and diastolic (bottom number) less than 80. High blood pressure (hypertension) is a systolic pressure of 140 or higher or a diastolic pressure of 90 or higher. Pressures falling between these ranges are considered prehypertension.
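Written out as a decision rule, the categories above look like the following minimal Python sketch, which uses only the thresholds quoted in this article (current clinical guidelines define further sub-stages that are omitted here):

def classify_bp(systolic, diastolic):
    # Thresholds as quoted above: either number in the hypertensive
    # range is enough; both must be low for a normal reading.
    if systolic >= 140 or diastolic >= 90:
        return "hypertension"
    if systolic < 120 and diastolic < 80:
        return "normal"
    return "prehypertension"

print(classify_bp(118, 76))   # normal
print(classify_bp(130, 85))   # prehypertension
print(classify_bp(142, 88))   # hypertension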

High blood pressure, which usually has few symptoms, if any, is an established risk factor for stroke, cardiovascular disease, kidney failure, and shortened life expectancy.

“More research is needed to establish the causal relationship, but in the meantime, people can benefit right now by reducing their intake of sugary drinks by at least one serving per day,” concludes Dr. Chen.

Public release date: 25-May-2010

Dangerous lung worms found in people who eat raw crayfish

If you’re headed to a freshwater stream this summer and a friend dares you to eat a raw crayfish – don’t do it. You could end up in the hospital with a severe parasitic infection.

Physicians at Washington University School of Medicine in St. Louis have diagnosed a rare parasitic infection in six people who had consumed raw crayfish from streams and rivers in Missouri. The cases occurred over the past three years, but three have been diagnosed since last September, the latest in April. Before these six, only seven such cases had ever been reported in North America, where the parasite, Paragonimus kellicotti, is common in crayfish.

“The infection, called paragonimiasis, is very rare, so it’s extremely unusual to see this many cases in one medical center in a relatively short period of time,” says Washington University infectious diseases specialist Gary Weil, MD, professor of medicine and of molecular microbiology, who treated some of the patients. “We are almost certain there are other people out there with the infection who haven’t been diagnosed. That’s why we want to get the word out.”

Paragonimiasis causes fever, cough, chest pain, shortness of breath and extreme fatigue. The infection is generally not fatal, and it is easily treated if properly diagnosed. But the illness is so unusual that most doctors are not aware of it. Most of the patients had received multiple treatments for pneumonia and undergone invasive procedures before they were referred to Barnes-Jewish Hospital or St. Louis Children’s Hospital at Washington University Medical Center.

The half-inch, oval-shaped parasitic worms at the root of the infection primarily travel from the intestine to the lungs. They also can migrate to the brain, causing severe headaches or vision problems, or under the skin, appearing as small, moving nodules.

Some of the patients had been in and out of the hospital for months as physicians tried to diagnose their mysterious illness and treat their symptoms, which also included a buildup of fluid around the lungs and around the heart. One patient even had his gallbladder removed, to no avail.

“Some of these invasive procedures could have been avoided if the patients had received a prompt diagnosis,” says Michael Lane, MD, an infectious diseases fellow at the School of Medicine who treated some of the patients. “We hope more doctors will now have this infection on their radar screens for patients with an unexplained lingering fever, cough and fatigue.”

Once the diagnosis is made, paragonimiasis is easily treated with an oral drug, praziquantel, taken three times a day for only two days. Symptoms begin to improve within a few days and are typically gone within seven to 10 days. All the patients have completely recovered, even one patient who temporarily lost his vision when parasites invaded the brain.

The recent infections, which occurred in patients ages 10-32, have prompted the Missouri Department of Health & Senior Services to issue a health advisory alerting doctors across the state. The department also printed posters warning people not to eat raw crayfish and placed them in campgrounds and canoe rental businesses near popular Missouri streams. Thoroughly cooking crayfish kills the parasite, so cooked crayfish pose no health risk.

Paragonimiasis is far more common in East Asia, where many thousands of cases are diagnosed annually in people who consume raw or undercooked crabs that contain Paragonimus westermani, a cousin of the parasite in North American crayfish.

While the U.S. Centers for Disease Control and Prevention has an antibody test to identify Paragonimus westermani infection, the test is not sensitive in patients with the P. kellicotti parasite, which makes diagnosis a real challenge. Diagnostic clues include elevated levels of white blood cells called eosinophils. These cells typically are elevated in patients with worm parasites, but they can also be elevated in more common illnesses, including cancer, autoimmune disease and allergy. X-rays also show excess fluid around the lungs and sometimes the heart.

“You have to be a bit of a detective and be open to all the clues,” says Washington University infectious diseases specialist Thomas Bailey, MD, professor of medicine, who diagnosed and treated the first case at the School of Medicine.

As a case in point, the first patient who sought treatment at Washington University had had a fever and cough for several weeks. His chest X-ray showed fluid around the lungs, and blood tests showed elevated levels of eosinophils.

The “aha moment” for Bailey occurred when the patient’s wife mentioned that his symptoms developed about a week after he ate raw crayfish from a Missouri river, and Bailey recalled that in Asia eating raw or undercooked crabs can lead to a paragonimus infection. With a quick search of the medical literature, Bailey learned that rare cases of North American paragonimiasis had been described in patients eating raw crayfish. The scenario fit perfectly with his patient.

“That’s the interesting thing about being an infectious diseases doctor,” Bailey says. “Every time you see a new patient you have to be open to the possibility that the diagnosis could be something highly unusual.”

Crayfish are common throughout North America, where hundreds of species live in rivers, streams, lakes and ponds. The parasite P. kellicotti has a complex life cycle. It lives in snails and crayfish but only causes a dangerous infection if it is ingested by mammals, including dogs, cats and humans, that eat crayfish raw.

No one knows why more cases of paragonimiasis are being diagnosed now, but doctors and researchers at Washington University are studying the parasite and hope to develop a better diagnostic test for the infection. For now, the message for physicians is to consider paragonimiasis in patients with cough, fever and eosinophilia. The simple message for the public is: “Do not eat raw crayfish,” Weil says.

 

Public release date: 26-May-2010

Some bisphosphonates users unfamiliar with drug’s possible side effects on oral health

CHICAGO, May 26, 2010 – People undergoing bisphosphonate therapy to prevent or treat osteoporosis (a thinning of the bones) may be unfamiliar with the drug and possible adverse side effects on oral health, according to a study in the May issue of the Journal of the American Dental Association (JADA).

Use of bisphosphonates has been associated with a small risk of developing bisphosphonate-associated osteonecrosis of the jaw (BON) that occurs spontaneously or after the patient has undergone dental surgery. BON is a rare but serious condition that can cause severe damage to the jaw bone. The prevalence of BON is between three and 12 percent for patients who receive bisphosphonates intravenously for cancer therapy and less than one percent for patients who receive bisphosphonates orally for osteoporosis or osteopenia.

In the study, the authors sought to determine whether patients taking bisphosphonates had knowledge about the medical indication for the therapy and how long the treatment would last. They also wanted to know whether participants’ physicians told them about possible adverse reactions.

The researchers interviewed 73 participants (71 women, two men) seeking routine care in a dental clinic. These participants, whose average age was 66 years (range, 44 to 88 years), also were undergoing bisphosphonate treatment. Eighty-four percent of the participants stated they knew why they were receiving bisphosphonate therapy. However, 80 percent said they were unsure about the duration of the therapy, and 82 percent could not recall receiving information from their physicians about the risk of experiencing adverse reactions, including oral osteonecrosis.

“The results of our small study show that patients who take bisphosphonates may not be aware that BON can develop after they undergo invasive dental care,” the authors wrote. “We believe that a more effective communication process between prescribing physicians, dentists and patients using bisphosphonates is needed.”

The American Dental Association Advisory Committee on Medication-induced Osteonecrosis of the Jaw recommends that dental patients on bisphosphonate therapy inform their dentist of that treatment. The Committee believes that it is always appropriate for physicians to encourage patients to visit the dentist regularly for professional cleanings and oral exams, as recommended by their dentist. This is especially important for patients whose oral health is put at risk by medications or medical problems.

Public release date: 26-May-2010

 

You have no natural right to food

The Farm-to-Consumer Legal Defense Fund (FTCLDF), an organization whose mission includes “defending the rights and broadening the freedoms of family farms and protecting consumer access to raw milk and nutrient dense foods”, recently filed a lawsuit against the FDA over its ban on interstate sales of raw milk. The suit alleges that such a restriction is a direct violation of the United States Constitution. The suit drew a surprisingly cold response from the FDA about its views on food freedom (and freedoms in general).

In a dismissal notice issued to the Iowa District Court where the suit was filed, the FDA officially made public its views on health and food freedom. These views will shock you, but they reveal the true evil intent of the FDA and why it is truly a rogue federal agency.

The FDA essentially believes that nobody has the right to choose what to eat or drink. You are only “allowed” to eat or drink what the FDA gives you permission to. There is no inherent right or God-given right to consume any foods from nature without the FDA’s consent.

This is no exaggeration. It’s exactly what the FDA said in its own words.

You have no natural right to food

The FTCLDF highlighted a few of the key phrases from the FDA’s response document in a recent email to its supporters. They include the following two statements from the FDA:

“There is no ‘deeply rooted’ historical tradition of unfettered access to foods of all kinds.” [p. 26]

 

“Plaintiffs’ assertion of a ‘fundamental right to their own bodily and physical health, which includes what foods they do and do not choose to consume for themselves and their families’ is similarly unavailing because plaintiffs do not have a fundamental right to obtain any food they wish.” [p.26]

There’s a lot more in the document, which primarily addresses the raw milk issue, but these statements alone clearly reveal how the FDA views the concept of health freedom. Essentially, the FDA does not believe in health freedom at all. It believes that it is the only entity granted the authority to decide for you what you are able to eat and drink.

The State, in other words, may override your food decisions and deny you free access to the foods and beverages you wish to consume. And the State may do this for completely unscientific reasons — even purely political reasons — all at its whim.

 

Ralph’s note – Who would have ever guessed that we would lose our freedom to eat? It’s not lost yet, but obviously some would like to see that happen.

________________________________

 

These reports are done with appreciation of all the doctors, scientists, and other medical researchers who sacrificed their time and effort in order to give people the ability to empower themselves, without base aspirations for fame or fortune. Just honorable people, doing honorable things.

U of I study: Lack of omega-6 fatty acid linked to severe dermatitis

2010 study posted for filing

Contact: Phyllis Picklesimer p-pickle@illinois.edu 217-244-2827 University of Illinois at Urbana-Champaign

URBANA – University of Illinois scientists have learned that a specific omega-6 fatty acid may be critical to maintaining skin health.

“In experiments with mice, we knocked out a gene responsible for an enzyme that helps the body to make arachidonic acid. Without arachidonic acid, the mice developed severe ulcerative dermatitis. The animals were very itchy, they scratched themselves continuously, and they developed a lot of bleeding sores,” said Manabu Nakamura, a U of I associate professor of food science and human nutrition.

When arachidonic acid was added to the animals’ diet, the itching went away, he said.

Nakamura’s team has been focusing on understanding the function of omega-3 and -6 fatty acids, and doctoral student Chad Stroud developed a mouse model to help them understand the physiological roles of these fats. By knocking out genes, they can create deficiencies of certain fats and learn about their functions.

“Knocking out a gene that enables the body to make the delta-6-desaturase enzyme has led to some surprising discoveries. In this instance, we learned that arachidonic acid is essential for healthy skin function. This new understanding may have implications for treating the flaky, itchy skin that sometimes develops without an attributable cause in infants,” he said.

Nakamura explained that our bodies make arachidonic acid from linoleic acid, an essential fatty acid that we must obtain through our diets. It is found mainly in vegetable oils.

Scientists have long attributed healthy skin function to linoleic acid, which is important because it provides the lipids that coat the outer layer of the skin, keeping the body from losing water and energy, losses that would retard growth, the scientist said.

But skin function seems to be more complicated than that. These itchy mice had plenty of linoleic acid. They just couldn’t convert it to arachidonic acid because the gene to make the necessary enzyme had been knocked out, he noted.

Arachidonic acid is also essential to the production of prostaglandins, compounds that can lead to inflammatory reactions and are important to immune function. Common painkillers like aspirin and ibuprofen work by inhibiting the conversion of arachidonic acid to prostaglandins.

“We usually think of inflammation as a bad thing, but in this case, prostaglandins prevented dermatitis, which is an inflammatory reaction. We measured prostaglandin levels in the animals’ skin, and when we fed arachidonic acid to the knockout mice, they resumed making these important chemical compounds,” he said.

Nakamura cautioned that there are still things they don’t understand about the function of this omega-6 fatty acid. “This new knowledge is a starting point in understanding the mechanisms that are involved, and we need to do more research at the cellular level.”

###

 

The study was published in a recent issue of the Journal of Lipid Research. Co-authors are Chad K. Stroud, Takayuki Y. Nara, Manuel Roqueta-Rivera, Emily C. Radlowski, Byung H. Cho, Mariangela Segre, Rex A. Hess, and Wanda M. Haschek, all of the U of I, and Peter Lawrence, Ying Zhang, and J. Thomas Brenna of Cornell University. Funding was provided in part by a USDA National Needs Fellowship Award and a grant from the National Institutes of Health.

Researchers explore new ways to prevent spinal cord damage using a precursor of nicotinamide adenine dinucleotide (NAD+), the active form of vitamin B3

2009 study posted for filing

Contact: Andrew Klein
ank2017@med.cornell.edu
212-821-0560
New York- Presbyterian Hospital/Weill Cornell Medical Center/Weill Cornell Medical College

Weill Cornell Medical College team receives $2.5 million New York State research grant to undertake laboratory study

NEW YORK (November 5, 2009) — Substances naturally produced by the human body may one day help prevent paralysis following a spinal cord injury, according to researchers at Weill Cornell Medical College. A recent $2.5 million grant from the New York State Spinal Cord Injury Research Board will fund their research investigating this possibility.

The Weill Cornell team believes that permanent nerve damage may be avoided by raising levels of a compound that converts to nicotinamide adenine dinucleotide (NAD+) — the active form of vitamin B3. The compound would potentially be administered immediately following spinal cord injury.

“Boosting NAD+ after injury may prevent permanent nerve death,” explains Dr. Samie Jaffrey, associate professor of pharmacology at Weill Cornell Medical College. “Our study is aimed at synthesizing a molecule that, when given soon after injury, may augment the body’s production of NAD+ and rescue these cells before they are stressed beyond recovery.”

The compound, called nicotinamide riboside (NR) — a natural NAD+ precursor found in foods like milk — as well as other NR derivatives, has already been shown to protect against cell death and axonal degeneration in cultured cells and in models of spinal cord injury. In 2007, the authors reported results of laboratory experiments finding that NR can increase NAD+ concentrations by as much as 270 percent compared with untreated control cells. No other known agent has been shown to achieve increases of this magnitude in cells.

NAD+ is known to play a key role in human cells by activating proteins called sirtuins that help the cells survive under stress. Sirtuins, which can be activated by compounds like resveratrol (found in large concentrations in the skin of grapes used to make red wine), have been shown to possess anti-aging and healing properties. The researchers believe that quickly increasing NAD+ levels may help to activate sirtuins in the cells and prevent cell death. This is especially important because when cells and tissues experience extreme trauma, NAD+ levels drop quickly.

In the newly funded research, the Weill Cornell team will conduct a lab study to see how NR compounds can raise NAD+ levels in cells that are stressed to the point that they will die within three to four hours, and instead survive as a consequence of treatment. In a separate study, Dr. Brett Langley from the Burke Rehabilitation Center in Westchester, N.Y. — a hospital affiliated with Weill Cornell Medical College — will test the compounds in mice with spinal injuries, with the hope of observing physical recovery and improvement in behavioral testing.

“We hope to show that a natural compound that can be produced cheaply and efficiently could be the key to preventing permanent injury,” explains Dr. Anthony Sauve, associate professor of pharmacology at Weill Cornell Medical College. “We also believe that the compound would be perfectly safe to use in humans, since it is a vitamin that has not been shown to have negative effects on the body when artificially elevated.”

Dr. Sauve has patented and pioneered a way to produce compounds that regulate NAD+ and specializes in making an array of NAD derivatives to determine which one best augments NAD+ levels in neurons.

“If this study is successful in animal testing, we hope to study the compound clinically,” says Dr. Jaffrey.

 

###

 

New York State Spinal Cord Injury Research Board

The New York State Spinal Cord Injury Research Board distributes research grants to find a cure for spinal cord injuries. More than 16,000 New Yorkers suffer from spinal cord injuries. In July 1998, landmark legislation was enacted to create the New York State Spinal Cord Injury Research Board and allocate funding to the Spinal Cord Injury Research Trust Fund. The purpose of the fund is to assist leading researchers with ongoing and new efforts to find a cure for spinal cord injuries. Since its inception, the Board has recommended more than $54 million in research awards to some of New York State’s finest research teams.

Weill Cornell Medical College

Weill Cornell Medical College, Cornell University’s medical school located in New York City, is committed to excellence in research, teaching, patient care and the advancement of the art and science of medicine, locally, nationally and globally. Weill Cornell, which is a principal academic affiliate of NewYork-Presbyterian Hospital, offers an innovative curriculum that integrates the teaching of basic and clinical sciences, problem-based learning, office-based preceptorships, and primary care and doctoring courses. Physicians and scientists of Weill Cornell Medical College are engaged in cutting-edge research in areas such as stem cells, genetics and gene therapy, geriatrics, neuroscience, structural biology, cardiovascular medicine, transplantation medicine, infectious disease, obesity, cancer, psychiatry and public health — and continue to delve ever deeper into the molecular basis of disease and social determinants of health in an effort to unlock the mysteries of the human body in health and sickness. In its commitment to global health and education, the Medical College has a strong presence in places such as Qatar, Tanzania, Haiti, Brazil, Austria and Turkey. Through the historic Weill Cornell Medical College in Qatar, the Medical College is the first in the U.S. to offer its M.D. degree overseas. Weill Cornell is the birthplace of many medical advances — including the development of the Pap test for cervical cancer, the synthesis of penicillin, the first successful embryo-biopsy pregnancy and birth in the U.S., the first clinical trial of gene therapy for Parkinson’s disease, the first indication of bone marrow’s critical role in tumor growth, and most recently, the world’s first successful use of deep brain stimulation to treat a minimally conscious brain-injured patient. For more information, visit www.med.cornell.edu.

Why We Need Insects–Even “Pesky” Ones

A large natural population of evening primrose (yellow flowers) in Ithaca, New York.

October 4, 2012


At first blush, many people would probably love to get rid of insects, such as pesky mosquitoes, ants and roaches. But a new study indicates that getting rid of insects could trigger some unwelcome ecological consequences, such as the rapid loss of desired traits in plants, including their good taste and high yields.

Specifically, the study–described in the Oct. 5, 2012 issue of Science and funded by the National Science Foundation–showed that evening primroses grown in insecticide-treated plots quickly lost, through evolution, defensive traits that helped protect them from plant-eating moths. The protective traits lost included the production of insect-deterring chemicals and later blooms that gave evening primroses temporal distance from plant-eating larvae that peak early in the growing season.

These results indicate that once the plants no longer needed their anti-insect defenses, they lost those defenses. What’s more, they did so quickly–in only three or four generations.

Anurag Agrawal, the leader of the study and a professor of ecology and evolutionary biology at Cornell University, explains, “We demonstrated that when you take moths out of the environment, certain varieties of evening primrose were particularly successful. These successful varieties have genes that produce less defenses against moths.”

In the absence of insects, the evening primroses apparently stopped investing energy in their anti-insect defenses, and so these defenses disappeared through natural selection. Agrawal says that he was “very surprised” by how quickly this process occurred, and that such surprises “tell us something about the potential speed and complexities of evolution. In addition, experiments like ours that follow evolutionary change in real-time provide definitive evidence of evolution.”

Agrawal believes that his team’s study results are applicable to many other insect-plant interactions beyond evening primroses and moths. Here’s why: The ubiquitous consumption of plants by insects represents one of the dominant species interactions on Earth. With insect-plant relationships so important, it is widely believed that many plant traits originally evolved solely as defenses against insects. Some of these anti-insect plant defenses, such as the bitter taste of some fruits, are desirable.

“This experimental demonstration of how rapid evolution can shape ecological interactions supports the idea that we need to understand feedbacks between evolutionary and ecological processes in order to be able to predict how communities and ecosystems will respond to change,” said Alan Tessier, a program director in NSF’s Directorate for Biological Sciences.

“One of the things farmers are trying to do is breed agricultural crops to be more resistant to pests,” said Agrawal. “Our study indicates that various genetic tradeoffs may make it difficult or impossible to maintain certain desired traits in plants that are bred for pest resistance.”

In addition, oils produced by evening primroses have been used medicinally for hundreds of years and are beginning to be used as herbal remedies. Agrawal’s insights about pests that attack these plants and about chemical compounds produced by these plants may ultimately be useful to the herbal and pharmaceutical industries.

Agrawal says that most previous real-time experiments on evolution have been conducted with bacteria in test tubes in laboratories. “One of the things we were excited about is that we were able to repeat that kind of experiment in nature. You can expect to see a lot more of this kind of thing in the future. We will keep our experiment running as a long-term living laboratory.”

More information about this study is available from a Cornell University press release.

-NSF-


The upside to allergies: cancer prevention

Contact: Kevin Stacey
kstacey@uchicago.edu
773-834-0386
University of Chicago Press Journals


A new article in the December issue of The Quarterly Review of Biology provides strong evidence that allergies are much more than just an annoying immune malfunction. They may protect against certain types of cancer.

The article, by researchers Paul Sherman, Erica Holland and Janet Shellman Sherman from Cornell University, suggests that allergy symptoms may protect against cancer by expelling foreign particles, some of which may be carcinogenic or carry absorbed carcinogens, from the organs most likely to come into contact with them. In addition, allergies may serve as early warning devices that let people know when there are substances in the air that should be avoided.

Medical researchers have long suspected an association between allergies and cancer, but extensive study on the subject has yielded mixed, and often contradictory, results. Many studies have found inverse associations between the two, meaning cancer patients tended to have fewer allergies in their medical history. Other studies have found positive associations, and still others found no association at all.

In an attempt to explain these contradictions, the Cornell team reexamined nearly 650 previous studies from the past five decades. They found that inverse allergy-cancer associations are far more common with cancers of organ systems that come in direct contact with matter from the external environment—the mouth and throat, colon and rectum, skin, cervix, pancreas and glial brain cells. Likewise, only allergies associated with tissues that are directly exposed to environmental assaults—eczema, hives, hay fever and animal and food allergies—had inverse relationships to cancers.

Such inverse associations were found to be far less likely for cancers of more isolated tissues like the breast, meningeal brain cells and prostate, as well as for myeloma, non-Hodgkin’s lymphoma and myelocytic leukemia.

The relationship between asthma and lung cancer, however, is a special case. A majority of the studies that the Cornell team examined found that asthma correlates to higher rates of lung cancer. “Essentially, asthma obstructs clearance of pulmonary mucous, blocking any potentially prophylactic benefit of allergic expulsion,” they explain. By contrast, allergies that affect the lungs other than asthma seem to retain the protective effect.

So if allergies are part of the body’s defense against foreign particle invaders, is it wise to turn them off with antihistamines and other suppressants? The Cornell team says that studies specifically designed to answer this question are needed.

“We hope that our analyses and arguments will encourage such cost/benefit analyses,” they write. “More importantly, we hope that our work will stimulate reconsideration…of the current prevailing view … that allergies are merely disorders of the immune system which, therefore, can be suppressed with impunity.”

 

###

 

Sherman, Paul W., Erica Holland, Janet Shellman Sherman, “Allergies: Their Role In Cancer Prevention,” The Quarterly Review of Biology December 2008

Since 1926, The Quarterly Review of Biology has been dedicated to providing insightful historical, philosophical, and technical treatments of important biological topics.

Wild parrots name their babies [video]: calls rival human language

Wild green-rumped parrotlet parents give their babies their own individual names

Wild pair of green-rumped parrotlets, Forpus passerinus, photographed in Venezuela. Male (left) and female (right).

People who live with parrots know that they can mimic their human care-givers as well as many of the common sounds in their environment. Although such mimicry is delightful, it does raise the question of what purpose vocal mimicry serves for wild parrots.

One proposed hypothesis for parrots’ remarkable ability to mimic sounds in their environment is that it develops and maintains social cohesion. For example, several species of wild parrots studied to date demonstrate the ability to readily imitate their flock mates’ calls. This ability is important for psittacines: when an individual parrot moves from one locale to another, it learns the calls of the local parrot flock as part of forming a social bond with those birds.

But research in spectacled parrotlets, Forpus conspicillatus, went further: it showed that each parrot has its own signature call — a unique sound that is used only for recognising that particular individual (doi:10.1007/s002650050481). Basically, each parrot has its own name. Interestingly, similar to human culture, members of each parrot family have names that sound more like each other than like those of other parrot families. But how do young parrots acquire their special signature calls (their names)? Do they learn their names from their parents, or are they born knowing their names?

To answer this question, Karl Berg, a graduate student in Neurobiology and Behavior at Cornell University in Ithaca, NY, assembled a team of researchers and studied a wild population of green-rumped parrotlets, Forpus passerinus, in Guarico, Venezuela. Because this particular population has been carefully documented for decades, it provided an excellent opportunity to study the social dynamics of wild parrots.


Slightly smaller than a domestic canary, green-rumped parrotlets are the smallest parrot species in the Americas. They are resident in open forest and scrubland throughout much of tropical South America, and they have small ranges. These tiny, mostly green, parrots are slightly sexually dimorphic cavity nesters, laying between five and seven eggs in a termite nest, in a tree cavity — or in a hollow pipe.


To distinguish between the two hypotheses (social name learning versus biological name inheritance), Mr Berg and his team of researchers set up inconspicuous video cameras and audio recorders inside and outside 17 nest cavities in PVC pipes in 2007 and 2008. When the resident female parrotlet had completed her clutch, nine of those nests were swapped between unrelated birds that lived far enough apart that they did not come into any auditory contact with each other. (The other eight nests were controls that remained with their biological parents.)


The researchers then recorded and analyzed the sounds in each nest cavity (figure 1 a & b):

Figure 1 (a, b). Least-squares regression of contact call similarities within pairs and within sibling groups of green-rumped parrotlet nests. Dotted lines indicate confidence intervals. (a) r2 = 0.64, p < 0.02; (b) r2 = 0.62, p < 0.01.

The team found that each adult had its own unique contact call, and that call was more similar to the bird’s mate’s call than to calls produced by adults at other nests. They also recorded and analyzed the nestlings’ contact calls (figure 1 c & d):

Figure 1 (c, d). Mean canonical scores of nestlings as a function of canonical scores of siblings within nests, (c) in 2007 and (d) in 2008. Dotted lines indicate confidence intervals. (c) r2 = 0.71, p < 0.02; (d) r2 = 0.37, p < 0.15.
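For readers unfamiliar with the statistics reported in figure 1, the r2 and p values come from an ordinary least-squares fit. The short Python sketch below shows how such values are computed; the numbers are made up for illustration and are not data from Berg et al.:

from scipy import stats

# Hypothetical canonical scores for six birds and their mates --
# illustrative values only, not measurements from the study.
bird_scores = [0.8, 1.4, 2.1, 2.9, 3.3, 4.0]
mate_scores = [1.0, 1.2, 2.4, 2.6, 3.6, 3.9]

fit = stats.linregress(bird_scores, mate_scores)
print(f"r2 = {fit.rvalue ** 2:.2f}, p = {fit.pvalue:.4f}")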
As expected, the parrot nestlings’ calls were more variable than those of the adults, but sibling parrots tended to show strong similarities in their contact call structure. Like their parents, nestlings’ calls were more similar to their siblings’ than to nestling calls at other nests (although this finding was significant only in 2007).

But were the foster parents learning their adopted nestlings’ innate contact calls, or were the nestlings learning calls that their parents assigned to them? The researchers had anticipated this question by recording the foster parents’ calls before the parents heard their adopted nestlings’ calls. Spectrographic analysis showed that it was the parents who assigned signature calls — names — to the young parrots, rather than the other way around. Further, all parrot nestlings adopted contact calls that were notably similar to those their parents — whether biological or foster — vocalized to them in the first weeks of their lives. Taken together, these data indicate that nestling parrots learn their names from their parents: parrot names are the result of social learning rather than biological inheritance.

It’s likely that parrots evolved the ability to mimic sounds for social reasons, although those precise reasons are still unknown. But since this ability allows families to recognise each other by voice, it is likely that such vocal recognition is important for restricting parental care to one’s own fledglings after parrot families begin moving to communal foraging and roosting sites.

These findings have a number of interesting implications as well. For example, can parrots recall and distinguish particular individuals and identify family members, even after being separated for years? This also raises the possibility that parrots may have a concept of individuality and even of self awareness.



Sources:

Karl S. Berg, Soraya Delgado, Kathryn A. Cortopassi, Steven R. Beissinger, & Jack W. Bradbury (2012). Vertical transmission of learned signatures in a wild parrot. Proceedings of the Royal Society B: Biological Sciences, 279 (1728): 585-591. doi:10.1098/rspb.2011.0932

Ralf Wanker, Jasmin Apcin, Bert Jennerjahn, & Birte Waibel (1998). Discrimination of different social companions in spectacled parrotlets (Forpus conspicillatus): evidence for individual vocal recognition. Behavioral Ecology and Sociobiology, 43 (3), 197-202. doi:10.1007/s002650050481


http://www.guardian.co.uk/science/grrlscientist/2012/sep/22/1

 

Nutrient in Eggs and Meat May Influence Gene Expression from Infancy to Adulthood: Choline

 

 

Implications for Wide Range of Disorders – Hypertension to Mental Health Problems

 

September 20, 2012

 

Just as women are advised to get plenty of folic acid around the time of conception and throughout early pregnancy, new research suggests another very similar nutrient may one day deserve a spot on the obstetrician’s list of recommendations.

 

Consuming greater amounts of choline – a nutrient found in eggs and meat – during pregnancy may lower an infant’s vulnerability to stress-related illnesses, such as mental health disturbances, and chronic conditions, like hypertension, later in life.

 

In an early study in The FASEB Journal, nutrition scientists and obstetricians at Cornell University and the University of Rochester Medical Center found that higher-than-normal amounts of choline in the diet during pregnancy changed epigenetic markers – modifications on our DNA that tell our genes to switch on or off, to go gangbusters or keep a low profile – in the fetus. While epigenetic markers don’t change our genes, they make a permanent imprint by dictating their fate: If a gene is not expressed – turned on – it’s as if it didn’t exist.

 

The finding became particularly exciting when researchers discovered that the affected markers were those that regulated the hypothalamic-pituitary-adrenal or HPA axis, which controls virtually all hormone activity in the body, including the production of the hormone cortisol that reflects our response to stress and regulates our metabolism, among other things.

 

More choline in the mother’s diet led to a more stable HPA axis and consequently less cortisol in the fetus. As with many aspects of our health, stability is a very good thing: Past research has shown that early exposure to high levels of cortisol, often a result of a mother’s anxiety or depression, can increase a baby’s lifelong risk of stress-related and metabolic disorders.

 

“The study is important because it shows that a relatively simple nutrient can have significant effects in prenatal life, and that these effects likely continue to have a long-lasting influence on adult life,” said Eva K. Pressman, M.D., study author and director of the high-risk pregnancy program at the University of Rochester Medical Center. “While our results won’t change practice at this point, the idea that maternal choline intake could essentially change fetal genetic expression into adulthood is quite novel.”

 

Pressman, who advises pregnant women every day, says choline isn’t something people think a lot about because it is already present in many things we eat and there is usually no concern of choline deficiency. Though much more research has focused on folate – functionally very similar to choline and used to decrease the risk of neural tube defects like spina bifida – a few very compelling studies sparked her interest, including animal studies on the role of choline in mitigating fetal alcohol syndrome and changing outcomes in Down syndrome.

 

A long-time collaborator with researchers at Cornell, Pressman joined a team led by Marie Caudill, Ph.D., R.D., professor in the Division of Nutritional Sciences at Cornell, in studying 26 pregnant women in their third trimester. The women were assigned to take either 480 mg of choline per day, an amount slightly above the standard recommendation of 450 mg per day, or about double that amount, 930 mg per day. The choline was derived from the diet and from supplements and was consumed up until delivery.

 

The team found that higher maternal choline intake led to a greater amount of DNA methylation, a process in which methyl groups – one carbon atom linked to three hydrogen atoms – are added to our DNA. Choline is one of a handful of nutrients that provides methyl groups for this process. The addition of a single methyl group is all it takes to change an individual’s epigenome.

 

Measurements of cord blood and samples from the placenta showed that increased choline, via the addition of methyl groups, altered epigenetic markers that govern cortisol-regulating genes. Higher choline lessened the expression of these genes, leading to 33 percent lower cortisol in the blood of babies whose moms consumed 930 mg per day.

 

Study authors say the findings raise the exciting possibility that choline may be used therapeutically in cases where excess maternal stress from anxiety, depression or other prenatal conditions might make the fetal HPA axis more reactive and more likely to release greater-than-expected amounts of cortisol.

 

While more research is needed, Caudill says that her message to pregnant women would be to consume a diet that includes choline-rich foods such as eggs, lean meat, beans and cruciferous vegetables like broccoli. For women who limit their consumption of animal products, which are richer sources of choline than plant foods, she adds that supplemental choline may be warranted, as choline is generally absent from prenatal vitamin supplements.

 

“One day we might prescribe choline in the same way we prescribe folate to all pregnant women,” notes Pressman, the James R. Woods Professor in the Department of Obstetrics and Gynecology. “It is cheap and has virtually no side effects at the doses provided in this study. In the future, we could use choline to do even more good than we are doing right now.”

 

In addition to Pressman and Caudill, several scientists and clinicians from the Division of Nutritional Science and the Statistical Consulting Unit at Cornell and the Cayuga Medical Center in Ithaca, N. Y., participated in the research. The study was funded by the Egg Nutrition Center, the National Cattlemen’s Beef Association, the Nebraska Beef Council, the U.S. Department of Agriculture and the President’s Council of Cornell Women. The funding sources had no role in the study design, interpretation of the data, or publication of the results.

 

 

 

For Media Inquiries:

Emily Boynton

(585) 273-1757

Email Emily Boynton

 

People Aren’t Smart Enough for Democracy to Flourish, Scientists Say

By: Natalie Wolchover, Life’s Little Mysteries Staff Writer
Date: 28 February 2012 Time: 12:35 PM ET

The democratic process relies on the assumption that citizens (the majority of them, at least) can recognize the best political candidate, or best policy idea, when they see it. But a growing body of research has revealed an unfortunate aspect of the human psyche that would seem to disprove this notion, and imply instead that democratic elections produce mediocre leadership and policies.

The research, led by David Dunning, a psychologist at Cornell University, shows that incompetent people are inherently unable to judge the competence of other people, or the quality of those people’s ideas. For example, if people lack expertise on tax reform, it is very difficult for them to identify the candidates who are actual experts. They simply lack the mental tools needed to make meaningful judgments.

As a result, no amount of information or facts about political candidates can override the inherent inability of many voters to accurately evaluate them. On top of that, “very smart ideas are going to be hard for people to adopt, because most people don’t have the sophistication to recognize how good an idea is,” Dunning told Life’s Little Mysteries.

He and colleague Justin Kruger, formerly of Cornell and now of New York University, have demonstrated again and again that people are self-delusional when it comes to their own intellectual skills. Whether the researchers are testing people’s ability to rate the funniness of jokes, the correctness of grammar, or even their own performance in a game of chess, the duo has found that people always assess their own performance as “above average” — even people who, when tested, actually perform at the very bottom of the pile.

We’re just as undiscerning about the skills of others as about ourselves. “To the extent that you are incompetent, you are a worse judge of incompetence in other people,” Dunning said. In one study, the researchers asked students to grade quizzes that tested for grammar skill. “We found that students who had done worse on the test itself gave more inaccurate grades to other students.” Essentially, they didn’t recognize the correct answer even when they saw it.

The reason for this disconnect is simple: “If you have gaps in your knowledge in a given area, then you’re not in a position to assess your own gaps or the gaps of others,” Dunning said. Strangely though, in these experiments, people tend to readily and accurately agree on who the worst performers are, while failing to recognize the best performers.

The most incompetent among us serve as canaries in the coal mine, signifying a larger quandary in the concept of democracy: truly ignorant people may be the worst judges of candidates and ideas, Dunning said, but we all suffer from a degree of blindness stemming from our own personal lack of expertise.

Mato Nagel, a sociologist in Germany, recently implemented Dunning and Kruger’s theories by computer-simulating a democratic election. In his mathematical model of the election, he assumed that voters’ own leadership skills were distributed on a bell curve — some were really good leaders, some, really bad, but most were mediocre — and that each voter was incapable of recognizing the leadership skills of a political candidate as being better than his or her own. When such an election was simulated, candidates whose leadership skills were only slightly better than average always won.
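The article does not give Nagel’s actual model code, but its flavor is easy to sketch. The Python toy below is an independent operationalization under stated assumptions: voters rate every candidate (a range-style vote, since the ballot mechanics are not described), each rating is capped at the voter’s own skill level, and each rating carries some judgment noise. Treat it as an illustration, not a reconstruction of Nagel’s simulation.

import random
import statistics

def run_election(n_voters=50, n_candidates=10, judgment_noise=1.0):
    """Simulate one election; return (winner's skill, best skill in the field)."""
    candidates = [random.gauss(0, 1) for _ in range(n_candidates)]
    totals = [0.0] * n_candidates
    for _ in range(n_voters):
        own_skill = random.gauss(0, 1)
        for i, c in enumerate(candidates):
            # A voter cannot perceive skill above his or her own level,
            # so ratings are capped there; noise models imperfect judgment.
            totals[i] += min(c, own_skill) + random.gauss(0, judgment_noise)
    winner = max(range(n_candidates), key=totals.__getitem__)
    return candidates[winner], max(candidates)

results = [run_election() for _ in range(500)]
print(f"mean winner skill: {statistics.mean(w for w, _ in results):.2f}")
print(f"mean best-in-field: {statistics.mean(b for _, b in results):.2f}")

Run repeatedly, the winners cluster above the population average of 0 yet often fall short of the best candidate in the field: the perception cap screens out clearly weak candidates while the noise blurs the differences among strong ones.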

Nagel concluded that democracies rarely or never elect the best leaders. Their advantage over dictatorships or other forms of government is merely that they “effectively prevent lower-than-average candidates from becoming leaders.”