Vital Signs - Vox Humana English School & Shanti



Hotter is better for removing allergens in laundry

SAN FRANCISCO -- A new study finds that the heat setting you choose when doing laundry makes all the difference when it comes to killing dust mites. The researchers found that washing laundry in hot water--140 degrees Fahrenheit (60 °C) or higher--kills all house dust mites, compared with just 6.5% of dust mites in laundry washed at 104 degrees Fahrenheit (40 °C), or warm water. The study is being presented at the American Thoracic Society 2007 International Conference, on Sunday, May 20.

Hotter water temperatures are also more effective in removing dog dander and pollen, says lead researcher Jung-Won Park, M.D., Ph.D., of Yonsei University in Seoul, Korea.

There is an alternative to washing in hot water that’s also effective, Dr. Park found: washing at a lower temperature (86-104 °F, or 30-40 °C), then rinsing the laundry twice with cold water for three minutes each.

In the study, researchers compared allergen levels on cotton sheets after they were washed in various temperature settings. They found that since more pollen was left on the sheets when they were washed at a cooler temperature (86 °F, or 30 °C), rinsing the sheets was especially important when using this temperature setting.

Green tea may protect the bladder from becoming inflamed

ANAHEIM, Calif. -- Herbal agents could be used to treat inflammatory bladder diseases, according to a preliminary study that looked at the ability of green tea to protect bladder cells from inflammation. The University of Pittsburgh School of Medicine study, being presented at the annual meeting of the American Urological Association (AUA) in Anaheim, Calif., found that components of green tea protected bladder cells from damage in culture. The study is Abstract 299 in the AUA proceedings.

Green tea, reported to have many health benefits, is rich in powerful antioxidants that make it a possible remedy for many medical conditions. It is composed of catechins, plant metabolites that give the tea its antioxidant properties.

"We discovered that catechins found in green tea protected both normal and cancerous bladder cells from inflammation when we exposed the cells to hydrogen peroxide," said Michael B. Chancellor, M.D., professor of urology and gynecology at the University of Pittsburgh School of Medicine. "Although further studies are needed, these results indicate herbal supplements from green tea could be a treatment option for various bladder conditions that are caused by injury or inflammation."

In the study, normal and cancerous bladder cells were exposed to two major catechin components of green tea, epigallocatechin gallate (EGCG) and epicatechin gallate (ECG), for 23 hours. Both significantly protected cell lines from exposure to hydrogen peroxide, which damages or kills cells. The concentrations of EGCG and ECG used in the study were at levels that may be achieved through dietary intake.

Repeated exposure to one person's opinion can be nearly as influential as exposure to opinions from several people

Study examines people's tendency to conclude that a familiar opinion is the same as a popular opinion

WASHINGTON -- Whether people are making financial decisions in the stock market or worrying about terrorism, they are likely to be influenced by what others think. And, according to a new study in this month’s Journal of Personality and Social Psychology, published by the American Psychological Association (APA), repeated exposure to one person’s viewpoint can have almost as much influence as exposure to shared opinions from multiple people. This finding shows that hearing an opinion multiple times increases the recipient’s sense of familiarity and in some cases gives a listener a false sense that an opinion is more widespread than it actually is.

In a series of six experiments involving 1,044 students from the University of Michigan, Princeton University, Rutgers University, University of Michigan-Dearborn, the University of Toledo and Harvard University, researchers sought to understand individuals’ accuracy in identifying group norms and opinions. In each experiment, students were divided into three groups: a three-person control group, a single-opinion group and a repeated-opinions group.

Participants in the three-person control group read three opinion statements, each made by a different group member. Participants in the repeated-opinions group read the same three statements, but all were attributed to one group member. Those in the single-opinion group read one opinion statement from one group member.

The studies found that an opinion is more likely to be assumed to be the majority opinion when multiple group members express it. However, the study also showed that hearing one person express the same opinion multiple times had nearly the same effect on listeners’ perception of the opinion’s popularity as hearing multiple people state it.

Researchers examined the underlying processes that take place when individuals estimate the shared attitude of a group of people and how that estimation of collective opinion can be influenced by repetition from a single source. Since gauging public opinion is such an essential component in guiding our social interactions, this research has implications in almost every facet of modern day life.

"This study conveys an important message about how people construct estimates of group opinion based on subjective experiences of familiarity," states lead author Kimberlee Weaver, Ph.D., of Virginia Polytechnic Institute and State University. "The repetition effect observed in this research can help us to understand how our own impressions are influenced by what we perceive to be the reality of others. For example, a congressman may get multiple phone calls from a small number of constituents requesting that a certain policy be implemented or changed, and from those requests must decide how voters in his or her state feel about the issue. This study sheds light on the cognitive processes that take place that may influence such a decision."

UCLA imaging study reveals how pure oxygen harms the brain

Adding a little carbon dioxide could prevent lasting damage

It's a scenario straight out of "Grey's Anatomy" -- a paramedic or doctor plops a mask over the face of a person struggling to breathe and begins dispensing pure oxygen.

Yet growing research suggests that inhaling straight oxygen can actually harm the brain. For the first time, a new UCLA brain-imaging study reveals why. Published in the May 22 edition of Public Library of Science (PLoS) Medicine, the findings fly in the face of national guidelines for medical practice and recommend a new approach: adding carbon dioxide to the gas mix to preserve brain function in patients.

"For decades, the medical community has championed 100 percent oxygen as the gold standard for resuscitation. But no one has reported what happens inside our brains when we inhale pure oxygen," explained Ronald Harper, distinguished professor of neurobiology at the David Geffen School of Medicine at UCLA. "What we discovered adds to a compelling body of evidence for modifying a widely practiced standard of care in the United States."

Harper's team used functional magnetic resonance imaging (fMRI) to capture detailed pictures of what occurs inside the human brain during two different breathing scenarios. The technique detects subtle increases in blood flow triggered by the activation of different parts of the brain, causing these regions to glow or "light up" on the color scan.

The researchers scanned the brains of 14 healthy children, ages 8 to 15, as they inhaled 100 percent oxygen through a mouthpiece, and monitored their breathing and heart rates. After waiting eight minutes for the youngsters' breathing to return to normal, the team added 5 percent carbon dioxide to the gas mixture and repeated the scan.

A comparison of the two scans revealed dramatic differences.

"When the children inhaled pure oxygen, their breathing quickened, resulting in the rapid exhalation of carbon dioxide from their bodies," said coauthor Paul Macey, associate researcher in neurobiology. "The drop in carbon dioxide narrowed their blood vessels, preventing oxygen from reaching tissue in the brain and heart."

That's when something surprising happened on the MRI scan.

Three brain structures suddenly lit up: the hippocampus, which helps control blood pressure; the cingulate cortex, which regulates pain perception and blood pressure; and the insula, which monitors physical and emotional stress.

All this activity awakened the hypothalamus, which regulates heart rate and hormonal outflow. Activation of the hypothalamus triggered a cascade of harmful reactions and released chemicals that can injure the brain and heart.

"Several brain areas responded to 100 percent oxygen by kicking the hypothalamus into overdrive," explained Harper. "The hypothalamus overreacted by dumping a massive flood of hormones and neurotransmitters into the bloodstream. These chemicals interfere with the heart's ability to pump blood and deliver oxygen – the opposite effect you want when you're trying to resuscitate someone."

When the children inhaled the carbon dioxide-oxygen mix, the hypothalamus' hyperactivity vanished from the MRI scan.

"Adding carbon dioxide to the oxygen relaxed the blood vessels, allowed oxygen to reach the heart and brain, calmed the hypothalamus and slowed the release of dangerous chemicals," said Macey.

"Pure oxygen kindles the match that fuels a forest fire of harm to the body," said Harper. "But a little whiff of carbon dioxide makes it all go away."

Based on their findings, the researchers strongly encourage healthcare providers to add carbon dioxide to oxygen dispensation, especially when resuscitating infants or administering oxygen for more than a few minutes. The new direction could hold particular implications for patients of stroke, heart attack, carbon monoxide poisoning and any long-term oxygen therapy.

"When in doubt about a case, the current medical approach is to increase oxygen levels and wait to see if the patient improves," explained Harper. "But no one has ever scanned patients' brains to examine how they respond to oxygen therapy."

Earlier data on high oxygen's harmful effects have already resulted in policy changes overseas. Instead of using straight oxygen, many European hospitals now resuscitate patients with room air, which contains a mixture of nitrogen, oxygen and carbon dioxide; or with a blend of oxygen and carbon dioxide.

Climate change threatens wild relatives of key crops

At risk are vital genetic resources for resisting drought, pests

ROME, ITALY -- Wild relatives of plants such as the potato and the peanut are at risk of extinction, threatening a valuable source of genes that are necessary to boost the ability of cultivated crops to resist pests and tolerate drought, according to a new study released today by scientists of the Consultative Group on International Agricultural Research (CGIAR). The culprit is climate change, the researchers said.

According to the study, in the next 50 years as many as 61 percent of the 51 wild peanut species analyzed and 12 percent of the 108 wild potato species analyzed could become extinct as the result of climate change. Most of those that remained would be confined to much smaller areas, further eroding their capacity to survive. The study also examined wild relatives of cowpea, a nutritious legume farmed widely in Africa. It found that only two of 48 species might disappear. However, the authors predict that most wild cowpeas will decline in numbers because climatic changes will push them out of many areas they currently inhabit.

"Our results would indicate that the survival of many species of crop wild relatives, not just wild potato, peanuts and cowpea, are likely to be seriously threatened even with the most conservative estimates regarding the magnitude of climate change," said the study’s lead author, Andy Jarvis, who is an agricultural geographer working at two CGIAR-supported centers – the Colombia-based International Center for Tropical Agriculture and Bioversity International, with headquarters in Rome. "There is an urgent need to collect and store the seeds of wild relatives in crop diversity collections before they disappear. At the moment, existing collections are conserving only a fraction of the diversity of wild species that are out there."

Extinction of crop wild relatives threatens food production because they contain genes for traits such as pest resistance and drought tolerance, which plant breeders use to improve the performance of cultivated varieties. The reliance on wild relatives to improve their cultivated cousins on the farm is expected to intensify as climate change makes it too hot, too cold, too wet or too dry for many existing crop varieties to continue producing at their current levels.

The results of the study were announced on International Biodiversity Day, organized by the Convention on Biological Diversity (CBD).

Jarvis and his colleagues looked specifically at the effects of climate change on the three crops in Africa and South America. The scientists focused on the two continents because this allowed them to consider how known populations of wild plants would fare in a wide variety of growing conditions. They found the impact of climate change is likely to be more pronounced in some species than in others but that, in general, all three groups of species would suffer.

Though not apparent to the average consumer, the wild relatives of crops play an important role in food production. All food crops originated from wild plants. But when they were domesticated, their genetic variation was narrowed significantly as farmers carefully selected plants with traits such as those related to taste and appearance as well as to yield. When trouble arises on the farm—attacks by pests or disease or, more recently, stressful growing conditions caused by climate change—breeders tend to dip back into the gene pool of the robust wild relatives in search of traits that will allow the domesticated variety to overcome the threat.

In recent years, genes available in wild relatives have helped breeders develop new types of domesticated potatoes that can fight devastating potato blight and new types of wheat more likely to survive drought conditions. Wild relatives of the peanut have helped breeders provide farmers with varieties that can survive a plant pest known as the root knot nematode, and resist a disease called early leaf spot. In fact, according to the report, more than half of new domesticated peanut varieties developed in the last five years have incorporated traits from wild relatives. Cowpea wild relatives are known to be a reservoir of genes that could confer resistance to major insect pests. In the US alone, the value of the improved yield and quality derived from wild species is estimated to be in the hundreds of millions of dollars a year.

Jarvis said the vulnerability of a wild plant to climate change can depend on its ability to adapt by, for example, extending its range as warming in its native regions becomes too hot to handle. One reason wild peanut plants appear to be so vulnerable to climate change is they are largely found in flat lands and would have to migrate a long way to reach cooler climates, a predicament exacerbated by the fact that peanuts bury their seeds underground, a meter or less from the parent plant. That limits the speed at which seeds can move into more favorable climates. By contrast, plants in mountainous locations could theoretically survive by extending their range slightly up a slope, even by only a few meters, to find cooler weather. What scientists must do, Jarvis said, is identify which wild relatives are most likely to suffer from climate change and give them priority for conservation.

"The irony here is that plant breeders will be relying on wild relatives more than ever as they work to develop domesticated crops that can adapt to changing climate conditions," said Annie Lane, the coordinator of a global project on crop wild relatives led by Bioversity International. "Yet because of climate change, we could end up losing a significant amount of these critical genetic resources at precisely the time they are most needed to maintain agricultural production."

Research that identifies crop wild relatives threatened by climate change is part of a broader CGIAR effort to anticipate and blunt the effects of global warming on agriculture. In the local, national, and international policy arenas, CGIAR researchers are generating innovative options to foster adaptation to climate change. In addition, new research at CGIAR-supported centers focuses on understanding the impacts of shifting climate patterns on natural resources, such as water, fisheries, and forests, and on planning for improved management of these resources to meet the needs of growing populations as the climate changes.

Alarming acceleration in CO2 emissions worldwide

Stanford, CA -- Between 2000 and 2004, worldwide CO2 emissions increased at over three times the rate seen during the 1990s, rising from 1.1% per year in the 1990s to 3.1% per year in the early 2000s. The research, published May 21-25 in the early online edition of the Proceedings of the National Academy of Sciences, also found that the accelerating growth rate is largely due to the increasing energy intensity of economic activity (the energy required to produce a unit of gross domestic product) and the carbon intensity of the energy system (the amount of carbon per unit of energy), coupled with increases in population and in per-capita gross domestic product. “No region is decarbonising its energy supply,” the study states.
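As a rough back-of-the-envelope illustration (not from the study itself), compounding shows why the jump from 1.1% to 3.1% annual growth matters so much over even a single decade:

```python
# Illustrative only: compare how the two reported annual growth rates
# compound over time. Figures are the article's rates, not study data.

def compound_growth(rate: float, years: int) -> float:
    """Total growth factor after `years` of constant annual growth `rate`."""
    return (1 + rate) ** years

# Growth factor over one decade at each rate.
factor_1990s = compound_growth(0.011, 10)  # ~1.12, i.e. emissions ~12% higher
factor_2000s = compound_growth(0.031, 10)  # ~1.36, i.e. emissions ~36% higher
```

At the 1990s rate, a decade adds roughly 12% to annual emissions; at the early-2000s rate, the same decade adds roughly 36%.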

The research showed that the increases in energy and carbon intensity constitute a reversal of a long-term trend toward greater energy efficiency and reduced carbon intensities. “Despite the scientific consensus that carbon emissions are affecting the world’s climate, we are not seeing evidence of progress in managing those emissions in either the developed or developing countries. In many parts of the world, we are going backwards,” remarked co-author of the study Chris Field, director of the Carnegie Institution’s Department of Global Ecology.

The research also shows that the actual global emissions since 2000 grew faster than in the highest of the scenarios developed by the Intergovernmental Panel on Climate Change (IPCC). “The trends relating energy to economic growth are definitely headed in the wrong direction,” Field commented.

The acceleration of carbon emissions is greatest in the exploding economies of developing regions, particularly China, where the increases mainly reflect rising per capita gross domestic product. The study divided the world into the USA, the European Union, Japan, the nations of the former Soviet Union, China, India, and three regions covering the rest of the world.

Between 2000 and 2004 the developing countries accounted for a large majority of the growth in emissions, even though they contributed only about 40% of total emissions. In 2004, 73% of the growth in global emissions came from the developing and least developed economies, which comprise 80% of the world’s population. That same year the developed areas (including the former Soviet Union) contributed about 60% of total emissions. These countries account for 77% of the cumulative emissions since the start of the industrial revolution.

Between 1980 and 2004, total emissions in the developed areas (USA, Europe, Japan, and other smaller economies) increased as a result of fast growth in per-capita gross domestic product, coupled with relatively slight increases in population. This growth was partially offset by decreases in the amount of energy needed to make each unit of product.

The study emphasizes that the growth in emissions can be caused by a variety of factors and that managing emissions in a growing economy requires progress in both the energy intensity of the economic system and the carbon intensity of the energy system. According to Field, “solving the first part of the puzzle requires shifting more of the economy toward activities like service industries and information technology, where emissions can be lower, and emphasizing energy efficiency. Solving the second requires deploying new sources of non-emitting energy like wind, solar, and nuclear power.”

Carnegie president Richard A. Meserve notes that “the impacts of carbon dioxide in our atmosphere are the result of cumulative emissions. This study is a signal that global action is urgently needed to reverse the adverse trends or the challenge of responding to climate change will be more difficult.”

A drug's brand name skews patient treatment choices

The brand name of a drug can strongly influence treatment decisions by patients, according to a randomized trial of decision aids by researchers from McMaster University.

The unexpected finding regarding the strong influence that drug names have on treatment choices emerged from a study undertaken by Dr. Anne Holbrook and McMaster colleagues who wanted to see if certain features of a decision aid – i.e., format and graphic presentation of data on benefits and harms of treatment options – make a difference. They designed a decision aid for anticoagulant drug therapy for atrial fibrillation in 3 formats (decision board, decision booklet with audiotape, or interactive computer program) with 2 types of data presentation (pie graph or pictogram). The treatment options were identified initially as "treatment A" (warfarin), "treatment B" (acetylsalicylic acid) and "treatment C" (no treatment). The authors found that the participants’ comprehension of the condition and treatment options improved significantly with the decision aid, regardless of the format or graphic presentation of data. Virtually all (96%) of the participants felt that the decision aid helped them make their treatment choice.

But unexpectedly, they also discovered that after participants were shown the true treatment names, 36% changed their initial choice (including 46% of those who initially chose warfarin and 78% of those who initially chose no treatment), even though the risks and benefits of each treatment were clearly laid out in the decision aid.

In a related commentary, Dr. Annette O'Connor notes that many common medical decisions exist in a "grey zone," where the best choice differs depending on how patients weigh the benefits and risks of each treatment option. She discusses how patient decision aids differ from educational aids and how they help patients clarify and express their personal values.

----------------

p. 1583 Influence of decision aids on patient preferences for anticoagulant therapy: a randomized trial

— A. Holbrook et al

p. 1597 Using decision aids to help patients navigate the "grey zone" of medical decision-making

— A.M. O'Connor

'Star Trek'-type scanning may reveal genetic activity of tumors, Stanford study shows

STANFORD, Calif. -- Peering into the body and visualizing its molecular secrets, once the stuff of science fiction, is one step closer to reality with a study from researchers at the Stanford University School of Medicine and the University of California, San Diego School of Medicine.

The research team is reporting that by looking at images from radiology scans - such as the CT scans a cancer patient routinely gets - radiologists can discern most of the genetic activity of a tumor. Such information could lead to diagnosing and treating patients individually, based on the unique characteristics of their disease. The study will be published May 21 in the advance online edition of Nature Biotechnology.

"Potentially in the future one can use imaging to directly reveal multiple features of diseases that will make it much easier to carry out personalized medicine, where you are making diagnoses and treatment decisions based exactly on what is happening in a person," said co-senior author Howard Chang, MD, PhD, assistant professor of dermatology at Stanford, who led the genomics arm of the study.

The study's other senior author is Michael Kuo, MD, assistant professor of interventional radiology at UCSD, who said their work will help doctors obtain the molecular details of a specific tumor or disease without having to remove body tissue for a biopsy. "Ideally, we would have personalized medicine achieved in a noninvasive manner," said Kuo, who spearheaded the project in 2001 while he was a radiology resident at Stanford.

In some ways, the work brings to mind a device that science fiction fans may recall from the TV series, "Star Trek." "In almost every episode of 'Star Trek,' there is a device called a tricorder, which they used noninvasively to scan living or nonliving matter to determine its molecular makeup," said Chang. "Something like that would be very, very useful."

In real life, this approach would avoid the pain and risk of infection and bleeding from a biopsy and would not destroy tissue, so the same site could be tested again and again.

At the time the project started at Stanford in 2001, the medical school was ground zero for studies of DNA microarrays - the lab tools that can screen thousands of genes at a time, developed by biochemistry professor Patrick Brown, MD, PhD. Microarrays have proven to be extremely useful for identifying groups of genes that are more active or less active in a disease such as cancer, compared with normal tissue.

"Radiology - while making great technological advances towards capturing more and more information - seemed to be largely oblivious to a fundamental shift in medicine towards genomic, personalized medicine that was beginning to take place," said Kuo, who is also the director of the Center for Translational Medical Systems at UCSD. "Being there at Stanford, I was aware of that shift and I was trying to think of what are the ways that we as radiologists could merge and integrate that data so we could take advantage of it."

A problem with using biopsied material for microarrays is that the tissue is destroyed in the process. Thus, there is no opportunity to re-test the same tissue after, say, a course of chemotherapy. Imaging through MRI or CT, however, leaves all organs intact and functioning.

To increase the research team's expertise in the areas of genomics and computational biology, Kuo brought in Chang and the paper's lead author, Eran Segal, PhD, in 2004. Chang had been using the gene activity patterns of microarrays to predict cancer outcome. Segal developed algorithms during his doctoral studies at Stanford that played a critical role in the analysis of the massive amounts of data encompassed in the study.

"When we look at noninvasive images, there are lots and lots of different patterns that had no known meaning," said Chang. "We thought that maybe we could come up with a way to systematically connect the gene activity seen with microarrays to imaging patterns, to translate meaning into three different types of languages, from genes to images and then to outcome of the disease process."

Their method was similar to what archeologists might do to recover a lost language. Chang compared their process - translating genetic activity patterns into medical imaging terminology - to the breakthrough that occurred when archeologists uncovered the Rosetta Stone in 1799. On the stone was the same text written in three scripts: hieroglyphic, Egyptian and Greek. Every time certain letters showed up in Greek, a certain set of symbols would show up in hieroglyphics. That correspondence allowed previously undecipherable hieroglyphic writing to be understood.

The first step for the researchers was the equivalent of finding words for the hieroglyphics: to define the language of radiology. Kuo and his radiology colleagues initially defined mutually agreeable terminology for more than 100 features that appeared on scans. As their work progressed, they found they only needed 28 of them to capture maximal information.

They then matched those imaging features with a vast stockpile of microarray data generated from human liver cancer samples. They also could compare their data with how the cancer patient fared.

What they found is that two very different aspects of cancer - how it looks by imaging and how it behaves on a molecular level - have a strong connection. Out of the 5,000 or more genes that have different activity in cancerous tissue, the researchers could reconstruct 80 percent of gene expression based on looking at standard CT scans the patients had undergone.

"Clearly, we are very far from clinical applications of these tools that we developed," said Segal, who is now a computational biologist at the Weizmann Institute of Science in Rehovot, Israel. "But the fact that we saw strong connections between the imaging features and the molecular gene activity data suggests that this could be a promising and fruitful research direction."

Much like being able to identify the aromas in a wine once the lexicon of wine-tasting is learned, radiologists - already experts in recognizing the visual differences between normal and pathological tissues - simply need to know what to look for and what it means.

"They already have the skills, so it's not a quantum leap by any stretch - if this were ultimately to be validated on a large scale - for this to be implemented," said Kuo.

Topical retinol helps reduce wrinkles associated with natural skin aging

Applying vitamin A to the skin appears to improve the wrinkles associated with natural aging and may help to promote the production of skin-building compounds, according to a report in the May issue of Archives of Dermatology, one of the JAMA/Archives journals.

The wrinkles and brown spots associated with aging appear first and most prominently on skin exposed to the sun, according to background information in the article. "Human skin not exposed to the sun also ages but less dramatically," the authors write. "In intrinsic, natural or chronological aging, skin loses its youthful appearance by becoming thinner, laxer and more finely wrinkled. These changes are readily appreciated by inspecting the upper inner arm." Thinner skin results from a reduced production of the protein collagen and may slow wound healing, presenting a public health issue. "Safe and effective therapies to reverse the atrophy of natural skin aging do not exist currently," the authors note.

Reza Kafi, M.D., then of the University of Michigan Medical School, Ann Arbor, and now of Stanford Medical School, Palo Alto, Calif., and colleagues assessed the effectiveness of vitamin A (retinol) lotion in 36 elderly individuals (average age 87 years). Researchers applied a lotion containing 0.4 percent retinol to participants’ right or left upper inner arms, and lotion with no retinol to the other arm, up to three times a week for 24 weeks. Wrinkles, roughness and overall severity of aging were each graded on a scale from zero (none) to nine (severe) before treatment and two, four, eight, 16 and 24 weeks after beginning treatment. In addition, 4-millimeter biopsy specimens of skin were taken from both arms at the beginning and end of the 24-week treatment period.

A total of 23 individuals completed the full study and 13 withdrew from the study prior to completion. When the researchers included the individuals who had dropped out of the study by assuming their skin did not change after their last measurement, wrinkles, roughness and overall aging severity were all significantly reduced in the retinol-treated arm compared with the control arm. The skin biopsies revealed that the retinol increased the production of glycosaminoglycan and procollagen, structural components of the skin.
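The dropout-handling method the researchers describe is known as "last observation carried forward": a participant who leaves the study is assumed to keep his or her last recorded score for all remaining visits. A minimal sketch, with hypothetical scores:

```python
# Last-observation-carried-forward (LOCF), sketched on made-up data.
# Scores use the article's 0 (none) to 9 (severe) wrinkle scale.

def carry_forward(scores):
    """Replace each missing entry (None) with the last observed value."""
    filled, last = [], None
    for s in scores:
        if s is not None:
            last = s
        filled.append(last)
    return filled

# Hypothetical participant who dropped out after the week-8 visit.
visits = [6, 5, 5, None, None]   # weeks 2, 4, 8, 16, 24
print(carry_forward(visits))     # [6, 5, 5, 5, 5]
```

This is a conservative assumption here: a dropout is credited with no further improvement, so any benefit found across the whole group cannot come from imagined gains in people who left.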

"Topical retinol improves fine wrinkles associated with natural aging," the authors conclude. "Significant induction of glycosaminoglycan, which is known to retain substantial water, and increased collagen production are most likely responsible for wrinkle effacement [reduction]. With greater skin matrix synthesis [production of compounds that form new skin], retinol-treated aged skin is more likely to withstand skin injury and ulcer formation along with improved appearance."

Cure for hepatitis C announced by VCU researcher

Disease is leading cause of cirrhosis, liver cancer and the need for transplants

RICHMOND, Va. – The use of peginterferon alone, or in combination with ribavirin, points to a cure for hepatitis C, the leading cause of cirrhosis, liver cancer and the need for liver transplant, a Virginia Commonwealth University researcher said today.

Mitchell Shiffman, M.D., professor in the VCU School of Medicine, and chief of hepatology and medical director of the Liver Transplant Program at the Virginia Commonwealth University Medical Center, is one of the lead investigators in the study, which was presented at the 38th annual Digestive Disease Week conference in Washington, D.C. VCU was among about 40 sites worldwide studying pegylated interferon alfa-2a, manufactured by Roche Inc.

Nearly all -- 99 percent -- of patients with hepatitis C who were treated successfully with peginterferon alone, or in combination with ribavirin, had no detectable virus up to seven years later. Researchers say these data validate the use of the word "cure" in describing hepatitis C treatment, since successful treatment is defined as having undetectable hepatitis C virus in the blood six months after treatment.

"We at VCU are encouraged by this data because it is rare in the treatment of life-threatening viral diseases that we can tell patients they may be cured," Shiffman said. "In hepatitis C today, we are able to help some patients achieve an outcome that effectively enables them to put their disease behind them."

The results are based on a long-term follow-up study designed to determine if the virus re-emerges in patients who have achieved treatment success. The study reviewed 997 patients, either mono-infected with chronic HCV or co-infected HCV and HIV, who achieved a sustained viral response (SVR) following treatment with either Pegasys (peginterferon alfa-2a) monotherapy or combination therapy with Pegasys and ribavirin.

After successful treatment, researchers monitored serum levels of HCV once a year for an average of 4.1 years (range 0.4 to 7 years). Of the 997 patients, 989 maintained undetectable levels of HCV. The remaining eight patients tested positive for HCV at an average of two years following treatment completion. The study found that these eight patients exhibited no consistency in age, gender or HCV genotype, and it has not yet been determined if these patients experienced a relapse or if they were re-infected with HCV.

Hepatitis C is a blood-borne infectious disease of the liver and a leading cause of cirrhosis, liver cancer and the need for liver transplants. According to the Centers for Disease Control and Prevention, an estimated 4.1 million Americans have been infected with hepatitis C, and 3.2 million are chronically infected. The number of new infections per year declined from an average of 240,000 in the 1980s to about 26,000 in 2004, the latest year for which statistics are available. The CDC estimates the number of hepatitis C-related deaths could increase to 38,000 annually by the year 2010, surpassing annual HIV/AIDS deaths.

UF researchers awaken vision cells in blind mice

GAINESVILLE, Fla. -- University of Florida researchers used gene therapy to restore sight in mice with a form of hereditary blindness, a finding that has bearing on many of the most common blinding diseases.

Writing online in today’s (May 21) edition of Nature Medicine, scientists describe how they used a harmless virus to deliver corrective genes to mice with a genetic impairment that robs them of vision.

The discovery shows that it is possible to target and rescue cone cells — the most important cells for visual sharpness and color vision in people.

"Cone vision defines whether someone is blind or not," said William W. Hauswirth, Ph.D., the Rybaczki-Bullard professor of ophthalmic molecular genetics in the College of Medicine and a member of the UF Genetics Institute. "If you can usefully deliver a gene specifically to cone cells, there are implications for all blinding diseases, not just inherited ones. Even in two very common types of blindness, age-related macular degeneration and diabetic retinopathy, if you can target cones you might be able to rescue that vision."

Scientists experimented with mice with a form of hereditary blindness called achromatopsia, which affects about 1 in 30,000 Americans by disabling cone photoreceptors in the retina. The disease results in nearly complete color blindness and extremely poor central vision.

Within two months of the gene therapy injection into the subretinal space of the mouse eyes, scientists measured the electrical activity in the retinas, finding that 19 of the 21 treated eyes positively responded to therapy, and 17 of those 19 had electrical readings from their retinas on par with those taken in normal mice.

When the mice were between 6 and 7 months old, tests showed 18 of the 21 treated eyes continued to respond normally.

In addition, a separate, smaller group of treated mice was evaluated using an exam akin to an eye test at the doctor’s office.

In experiments overseen by Robert B. Barlow, Ph.D., a professor of ophthalmology at State University of New York Upstate Medical University, the mice were surrounded by four computer monitors that simulated the appearance of being inside a moving drum that had vertical stripes on the walls.

Scientists knew the mice could see the stripes because sighted animals naturally move their heads in the same direction as the moving stripes. By making the stripes ever-narrower — similar to how the letters get smaller toward the bottom of an eye chart — researchers could assess the mice’s visual abilities.

As a group, all of the mice displayed normal visual acuity in their treated eyes.

"People can talk and tell us what they see," said lead researcher John J. Alexander, Ph.D., a postdoctoral fellow in the department of ophthalmology at UF. "Animals are much more difficult. What makes this test so fantastic is that it involves an animal’s natural response, and the results tell us that the animals’ brains are involved in the process, that they are actually seeing something."

In addition to cones, which number about 6 million in the retina, the eye’s rod cells are important for low-light and peripheral vision and exist in much greater amounts, with populations of more than 100 million.

But treating cones could play a role in diseases that begin with the destruction of rods, such as retinitis pigmentosa, which affects about 1 in 3,000 Americans.

"This is the first to my knowledge of a cone-targeted gene therapy that restores function in an animal model where cones are the primary defect," said Richard Weleber, Ph.D., a professor of molecular and medical genetics at Oregon Health & Science University who was not involved in the research. "This validates the concept that it is possible to deliver a gene therapy targeting the cone system, and that is incredibly important for a number of degenerative diseases."

HIV in breast milk killed by flash-heating, new study finds

Berkeley -- A simple method of flash-heating breast milk infected with HIV successfully inactivated the free-floating virus, according to a new study led by researchers at the Berkeley and Davis campuses of the University of California.

Notably, the technique - heating a glass jar of expressed breast milk in a pan of water over a flame or single burner - can be easily applied in the homes of mothers in resource-poor communities.

The findings, to appear in the July 1 print issue of the Journal of Acquired Immune Deficiency Syndromes, but now available online, provide hope that mothers with HIV in developing nations will soon be able to more safely feed their babies.

"We conducted this research to help HIV-positive mothers and their infants who do not have safe alternatives to breastfeeding," said Kiersten Israel-Ballard, a doctoral candidate at UC Berkeley's School of Public Health and lead author of the study. "HIV can be transmitted to the baby via breastfeeding. But for infants in developing countries where infant mortality is already so high from diarrhea and other illnesses, they can't afford to lose the antibodies, other anti-infective agents and the optimal nutrition found in breast milk. This study shows that an easy-to-implement heating method can kill the HIV in breast milk."

This line of research began when HIV-positive women in Zimbabwe asked how they could make their milk safe for their babies. Israel-Ballard conducted a study there that indicated that HIV-positive women wanted to attempt the flash-heating method. The World Health Organization (WHO) recommends heat treating HIV-infected breast milk, but there has been little research into a simple method that a mom in a developing country could use.

Studies by this research team have shown that flash-heating breast milk can kill bacteria while retaining most of the milk's nutritional and antimicrobial properties, as well as a majority of its important antibodies.

"Many people in this field were skeptical that this would work," said Barbara Abrams, UC Berkeley professor of epidemiology and maternal and child health, and senior author on the study. "We wanted to be sure that there was scientific evidence that flash-heated milk was truly free of HIV, nutritious and immunologically beneficial. This study was done in response to the concerns of the mothers in Zimbabwe, and in addition provides evidence that field studies are warranted."

Banks that collect, store and dispense human milk already pasteurize milk, but the method they commonly use requires thermometers and timers that may be hard to obtain in resource-poor communities.

Flash-heating is a type of pasteurization that brings the milk to a higher temperature for a shorter period of time, a method known to better protect the anti-infective and nutritional properties of breast milk than the one typically used in human milk banks. Moreover, the low-tech materials used for this study are readily available in local communities in the developing world, and the heating method can be easily incorporated into a mother's normal daily routine.

Of the 700,000 children who become infected with HIV each year, an estimated 40 percent contract the virus from prolonged breastfeeding. WHO recommends that HIV-positive mothers avoid breastfeeding when safe feeding alternatives are available.

But in regions of the world where mothers cannot afford the cost of infant formula, water is contaminated, or other socio-cultural conditions make replacement feeding difficult, WHO recommends exclusive breastfeeding for up to six months.

"The risks and benefits of heating HIV-contaminated breast milk are different for women in developing countries than for women in the United States," said Dr. Caroline Chantry, a pediatrician and infant nutrition researcher with UC Davis Children's Hospital, and co-author of the paper. "Here we have access to safe water and formula, so it makes less sense for HIV-positive mothers in developed countries to take the risks associated with feeding babies their breast milk."

Studies indicate that when babies are breastfed exclusively, there is a 3 to 4 percent risk of HIV transmission. However, when babies are given formula or other foods in addition to breast milk, there is a significant three- to four-fold increase in the risk of HIV transmission, possibly because allergens and contaminants in solid foods and formula can compromise the epithelial lining of a baby's digestive tract, making it easier for viruses to pass through.

For this reason, WHO guidelines have recommended that after six months of exclusive breastfeeding, HIV-positive mothers wean their babies as soon as other foods are available. Even then, while weaning may decrease the risk of HIV transmission, studies have shown that it increases the risk of malnutrition, diarrhea and other diseases that can lead to infant mortality.

"Early cessation of breastfeeding has been tried in several recent studies, and the results suggest that stopping breastfeeding early increased the risk of infant illness, growth failure and death, and actually outweighed the risk of transmitting HIV through breast milk," said Abrams. "This has been a desperate dilemma for mothers in developing countries. Our method of flash-heating breast milk could be particularly important at the time the mother stops nursing. Roughly 300,000 infants contract HIV from breastfeeding each year. Even if only a small proportion of HIV-positive mothers in resource-poor countries can successfully express and flash-treat their milk, this simple, inexpensive and potentially sustainable method could still save thousands of babies from HIV infection while providing most of the health benefits of human milk."

This study reflects results from the first stage of research, headed by Abrams, into the effects of flash-heating breast milk. Chantry will head the next stage of field trials, which involve moving this technique out of the lab and into the homes of women in Africa. The researchers are seeking funding to assess the flash-heating method's feasibility for babies in local communities in developing countries.

"Clinical trials are urgently needed to substantiate that mothers can express, flash-heat and store their milk safely, and to test the impact of this method on actual HIV transmission," said Chantry. "What is important about this study is that women have the right to an informed choice. It's amazing to me that in our paternalistic society, people so often readily dismiss the possibility that women would be willing to express and heat their milk to prevent their babies from getting infected with HIV."

Of the 98 samples of breast milk collected from 84 HIV-positive women in Durban, South Africa, only 30 had detectable levels of HIV before heating. Not all breast milk from HIV-positive mothers contains HIV naturally. Milk had been hand expressed into clean, locally purchased glass food jars provided by the researchers.

For each sample of HIV-infected milk, researchers set aside 50 milliliters in the original collection jars and used the remainder as unheated controls. The uncovered jars were placed in a 1-quart pan filled with 450 milliliters of water. The water and milk were heated together over a single-burner butane stove. Once the water reached a rolling boil, the breast milk was immediately removed and allowed to cool.

The researchers checked the temperature of the milk at 15-second intervals and determined that the flash-heated milk reached a peak temperature of 163 degrees Fahrenheit (72.9 degrees Celsius), and typically stayed hotter than 132 degrees Fahrenheit (56 degrees Celsius) for more than six minutes.
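The paired Fahrenheit and Celsius figures above can be cross-checked with the standard conversion formula; a minimal sketch (the function name is ours, for illustration):

```python
def celsius_to_fahrenheit(c):
    # Standard conversion: F = C * 9/5 + 32.
    return c * 9 / 5 + 32

# Peak temperature reported in the study: 72.9 C.
print(round(celsius_to_fahrenheit(72.9), 1))  # 163.2, reported as 163 F
# Temperature the milk stayed above for six-plus minutes: 56 C.
print(round(celsius_to_fahrenheit(56), 1))    # 132.8, reported as 132 F
```

Both results agree with the rounded Fahrenheit values given in the study.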

Viral analysis of the flash-heated and unheated breast milk found that cell-free HIV had been inactivated in all of the heated samples.

The researchers note that they used a reverse transcriptase (RT) assay to test for an enzyme produced by viable HIV since traditional tests for HIV do not distinguish between dead and live viruses. The RT test, however, cannot detect HIV within cells, but preliminary data suggest that flash-heat inactivates cell-associated HIV as well.

"We hope this technique will not only provide HIV-free breast milk that is safe to consume, but that the milk also retains the antibodies and nutrition that will help keep their infants healthy," said Israel-Ballard. "Mothers in Africa have told us they will do anything to keep their babies alive, and this work is ultimately about providing them with viable options to do just that."

Vital Signs

Outcomes: Alcohol Is Tied to Lower Risk of One Type of Kidney Cancer

By NICHOLAS BAKALAR

Scientists have discovered yet another reason that alcohol might be good for you. Using pooled data from 12 studies and more than 750,000 subjects, researchers found that moderate alcohol consumption -- about a drink a day -- is associated with a decreased risk of renal cell carcinoma, one type of kidney cancer.

The paper, which appears in the May 16 issue of The Journal of the National Cancer Institute, covered only prospective studies that involved at least 25 cases of renal cell cancer, that assessed long-term intake of a variety of foods and beverages, and that included information on nondietary factors. During the 7- to 20-year follow-up, researchers found 1,430 cases of renal cell cancer.

The researchers found that people who drank two-tenths of an ounce to one-half an ounce of alcohol a day -- beer, wine or liquor -- reduced their risk of renal cell cancer by 18 percent, and those who drank a half-ounce or more reduced their risk by 28 percent. There is about a half-ounce of alcohol in 1 1/2 ounces of hard liquor, 12 ounces of beer or a 5-ounce glass of wine.
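The half-ounce equivalence follows from typical alcohol-by-volume figures -- roughly 40 percent for spirits, 5 percent for beer and 12 percent for wine, which are our assumed values, not numbers from the study:

```python
# Pure alcohol per serving = serving volume (oz) * alcohol by volume (ABV).
# ABV values below are typical figures, assumed for illustration.
drinks = {
    "hard liquor, 1.5 oz": (1.5, 0.40),
    "beer, 12 oz": (12.0, 0.05),
    "wine, 5 oz": (5.0, 0.12),
}
for name, (volume_oz, abv) in drinks.items():
    print(f"{name}: {volume_oz * abv:.2f} oz pure alcohol")  # 0.60 each
```

Each standard serving works out to about 0.6 ounces of pure alcohol, consistent with the article's "about a half-ounce."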

The study had limitations in that it lacked a measure of alcohol use over time, depended on self-reports and had no information on family history of renal cell cancer.

Jung Eun Lee, the lead author and a fellow at Brigham and Women’s Hospital in Boston, would not encourage anyone to start drinking. Rather, she said, maintaining a healthy weight and avoiding smoking are essential as the “principal means to reduce renal cell cancer.”

Mysteries to Behold in the Dark Down Deep: Seadevils and Species Unknown

By WILLIAM J. BROAD

When, more than 70 years ago, William Beebe became the first scientist to descend into the abyss, he described a world of twinkling lights, silvery eels, throbbing jellyfish, living strings as “lovely as the finest lace” and lanky monsters with needlelike teeth.

“It was stranger than any imagination could have conceived,” he wrote in “Half Mile Down” (Harcourt Brace, 1934). “I would focus on some one creature and just as its outlines began to be distinct on my retina, some brilliant, animated comet or constellation would rush across the small arc of my submarine heaven and every sense would be distracted, and my eyes would involuntarily shift to this new wonder.”

From left: Caulophryne jordani, or fanfin seadevil; Marrus orthocanna; and Grimpoteuthis, or "Dumbo octopus."

Beebe sketched some of the creatures, because no camera of the day was able to withstand the rigors of the deep and record the nuances of this cornucopia of astonishments.

Colleagues reacted coolly. Some accused Beebe of exaggeration. One reviewer suggested that his heavy breathing had fogged the window of the submarine vessel, distorting the undersea views.

Today, the revolution in lights, cameras, electronics and digital photography is revealing a world that is even stranger than the one that Beebe struggled to describe.

The images arrayed here come from “The Deep: The Extraordinary Creatures of the Abyss” (University of Chicago Press, 2007), by Claire Nouvian, a French journalist and film director. In its preface, Ms. Nouvian writes of an epiphany that began her undersea journey.

“It was as though a veil had been lifted,” she says, “revealing unexpected points of view, vaster and more promising.”

The photographs she has selected celebrate that sense of the unexpected. Bizarre species from as far down as four and a half miles are shown in remarkable detail, their tentacles lashing, eyes bulging, lights flashing. The eerie translucence of many of the gelatinous creatures seems to defy common sense. They seem to be living water.

On page after page, it is as if aliens had descended from another world to amaze and delight. A small octopus looks like a child’s squeeze toy. A seadevil looks like something out of a bad dream. A Ping-Pong tree sponge rivals artwork that might be seen in an upscale gallery.

Interspersed among 220 color photographs are essays by some of the world’s top experts on deep-sea life that reflect on what lies beneath. For example, Laurence Madin of Woods Hole Oceanographic Institution notes the violence that air and gravity do to creatures without internal or external skeletons when they are pulled up to the deck of a ship, obliterating their varieties of form and function.

“This unattractive jello-like mass,” he writes, “is the unfair land version of amazing and delicate creatures that can display their true beauty only in their natural watery environment.” The photographs in the book right that wrong, and not just for jellyfish.

One shows a dense colony of brittle stars, their arms intertwined and overlapping, their masses in the distance merging with the blackness of the seabed, alive, inhabiting a place once thought to be a lifeless desert.

Craig M. Young of the Oregon Institute of Marine Biology writes in the book that the diversity of life in the abyss “may exceed that of the Amazon Rain Forest and the Great Barrier Reef combined.”

Beebe, who ran the tropical research department at the New York Zoological Society, surely had intimations of what lay beyond the oceanic door he had opened. “The Deep” brings much of that dark landscape to light, even while noting that a vast majority of the planet’s largest habitat remains unexamined, awaiting a new generation of explorers.

Q & A

Taking a Dim View

By C. CLAIBORNE RAY

Q. Does wearing someone else’s prescription glasses for a short time do any damage to a child’s eyes?

A. The short answer is no, said Dr. Pamela F. Gallin, director of pediatric ophthalmology at the Morgan Stanley Children’s Hospital of New York-Presbyterian. It would be uncomfortable but not harmful to wear the wrong prescription for, say, five minutes, she said.

But there are potential risks for young children who wear the wrong prescription for an extended period, Dr. Gallin said.

“When we put on someone else’s glasses, which presumably are the incorrect prescription, then an out-of-focus image is presented to our retina,” she explained, much like a distorted image in an amusement park mirror.

In 2 percent to 5 percent of adults, a condition called amblyopia, also known as lazy eye, is present, she said, almost invariably because such an out-of-focus image has been presented to one eye for a long time.

“The risk arises in children because our vision develops in the back of the brain until the ages of 7 to 9,” she said, so for harm to be done by wearing the incorrect prescription, it would have to be in a child 9 or younger.

This risk varies with the duration of use and the child’s age. The visual part of the brain becomes less plastic as we mature, Dr. Gallin said, and it takes more time to change the brain at 9 years than at 1 year. Doctors estimate a week per year of life to make a change of any amount.
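The week-per-year estimate above is a simple linear rule of thumb; a minimal sketch (the function name is ours, not from the article):

```python
def estimated_weeks_for_change(age_years):
    # Rule of thumb quoted by Dr. Gallin: roughly one week of
    # wearing a prescription per year of the child's life for
    # a change of any amount to occur.
    return age_years

print(estimated_weeks_for_change(1))  # about 1 week for a 1-year-old
print(estimated_weeks_for_change(9))  # about 9 weeks for a 9-year-old
```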

Einstein researchers discover 'radiation-eating' fungi

Finding could trigger recalculation of Earth's energy balance and help feed astronauts

Scientists have long assumed that fungi exist mainly to decompose matter into chemicals that other organisms can then use. But researchers at the Albert Einstein College of Medicine of Yeshiva University have found evidence that fungi possess a previously undiscovered talent with profound implications: the ability to use radioactivity as an energy source for making food and spurring their growth.

"The fungal kingdom comprises more species than any other plant or animal kingdom, so finding that they're making food in addition to breaking it down means that Earth's energetics—in particular, the amount of radiation energy being converted to biological energy—may need to be recalculated," says Dr. Arturo Casadevall, chair of microbiology & immunology at Einstein and senior author of the study, published May 23 in PLoS ONE.

The ability of fungi to live off radiation could also prove useful to people: "Since ionizing radiation is prevalent in outer space, astronauts might be able to rely on fungi as an inexhaustible food source on long missions or for colonizing other planets," says Dr. Ekaterina Dadachova, associate professor of nuclear medicine and microbiology & immunology at Einstein and lead author of the study.

Those fungi able to "eat" radiation must possess melanin, the pigment found in many if not most fungal species. But up until now, melanin's biological role in fungi—if any—has been a mystery.

"Just as the pigment chlorophyll converts sunlight into chemical energy that allows green plants to live and grow, our research suggests that melanin can use a different portion of the electromagnetic spectrum—ionizing radiation—to benefit the fungi containing it," says Dr. Dadachova.

The research began five years ago when Dr. Casadevall read on the Web that a robot sent into the still-highly-radioactive damaged reactor at Chernobyl had returned with samples of black, melanin-rich fungi that were growing on the reactor's walls. "I found that very interesting and began discussing with colleagues whether these fungi might be using the radiation emissions as an energy source," says Dr. Casadevall.

To test this idea, the Einstein researchers performed a variety of in vivo tests using three genetically diverse fungi and four measures of cell growth. The studies consistently showed that ionizing radiation significantly enhances the growth of fungi that contain melanin.

For example, two types of fungi—one that was induced to make melanin (Cryptococcus neoformans) and another that naturally contains it (Wangiella dermatitidis)—were exposed to levels of ionizing radiation approximately 500 times higher than background levels. Both species grew significantly faster (as measured by the number of colony-forming units and dry weight) than when exposed to standard background radiation.

The researchers also carried out physico-chemical studies into melanin's ability to capture radiation. By measuring the electron spin resonance signal after melanin was exposed to ionizing radiation, they showed that radiation interacts with melanin to alter its electron structure. This is an essential step for capturing radiation and converting it into a different form of energy to make food.

Dr. Casadevall notes that the melanin in fungi is no different chemically from the melanin in our skin. "It's pure speculation but not outside the realm of possibility that melanin could be providing energy to skin cells," he says. "While it wouldn't be enough energy to fuel a run on the beach, maybe it could help you to open an eyelid."

Salt increases ulcer-bug virulence

Scientists have identified yet another risk from a high-salt diet. High concentrations of salt in the stomach appear to induce gene activity in the ulcer-causing bacterium Helicobacter pylori, making it more virulent and increasing the likelihood of an infected person developing a severe gastric disease.

"Apparently the stomach pathogen H. pylori closely monitors the diets of those people whom it infects. Epidemiological evidence has long implied that there is a connection between H. pylori and the composition of the human diet. This is especially true for diets rich in salt," says Hanan Gancz, of the Uniformed Services University of the Health Sciences in Bethesda, Maryland, who presents the research May 22, 2007 at the 107th General Meeting of the American Society for Microbiology in Toronto.

H. pylori is a spiral-shaped bacterium that can live in the acidic environment of the stomach and duodenum, the section of intestine just below the stomach. It is the most common cause of ulcers of the stomach and duodenum, accounting for up to 90% of duodenal ulcers and up to 80% of gastric ulcers. Infection with H. pylori also causes gastritis, and infected persons have a 2- to 6-fold increased risk of developing mucosa-associated lymphoid tissue (MALT) lymphoma and gastric cancer compared with uninfected counterparts.

H. pylori infection is common in the United States and is most often found in persons from lower income groups and older adults. About 20% of persons less than 40 years of age and about 50% of persons over 60 years of age are infected. Most infected people do not have symptoms and only a small percentage go on to develop disease.

Previous research has focused on the effects diet has on the stomach environment where H. pylori resides, but until now scientists have overlooked the microorganism's own response to these dietary cues. Working from the epidemiological evidence that H. pylori infection combined with a high-salt diet results in an increased incidence of severe gastric maladies, Gancz and colleagues decided to look at the direct effect a high concentration of salt has on both the growth and gene expression of the bacterium.

"We noted that H. pylori growth rate shows a sharp decline at high salt concentrations. Moreover, bacterial cells exposed to increased salt exhibited striking morphological changes: cells became elongated and formed long chains," says Gancz. "We conclude that H. pylori exposed to high levels of salt in vitro exhibit a defect in cell division."

They also discovered that transcription of two genes responsible for the virulence of the bacterium was increased under high-salt conditions.

"The altered expression patterns of some virulence genes may partially explain the increased disease risk that is associated with a high salt diet in H. pylori infected individuals," says Gancz.

Scientists find war vets' hand dexterity determines susceptibility to PTSD

Geisinger researchers investigate link between PTSD and levels of dual dexterity

DANVILLE, PA. – A recent study conducted by investigators with the Geisinger Center for Health Research shows a clear link between combat veterans' use of both hands for common tasks and the likelihood that they will experience post-traumatic stress disorder (PTSD).

Combat veterans with an extreme level of mixed handedness are nearly twice as likely to develop PTSD after combat as veterans who use both hands less often, according to the study, which is being published in the May issue of Psychosomatic Medicine.

The study also found that veterans with extreme mixed handedness and high combat exposure were nearly five times more likely to have PTSD than those with lower degrees of mixed handedness.

Joseph Boscarino, PhD, MPH and Stuart Hoffman, DO of the Geisinger Center for Health Research measured PTSD and handedness among a national sample of 2,490 Vietnam veterans exposed to combat.

"These findings suggest the possibility of a pre-existing biological vulnerability for PTSD," said Boscarino, the study's principal investigator. "We know generally what type of soldier is likely to suffer from PTSD, before they go into combat."

While other studies on handedness and PTSD have yielded similar results, those prior studies were too small to draw significant conclusions. Boscarino's groundbreaking study examined a much larger group of patients, and therefore the results are more applicable to a large group of veterans.

"Given the research, it might be beneficial to screen people entering high-risk occupations such as the military for handedness," Boscarino said. "If pre-screening doesn't occur, the healthcare community should at least make sure that these people receive adequate post stress exposure help."

In today's context, even brief psycho-social interventions for military personnel returning from Iraq and Afghanistan could significantly reduce the risk of PTSD, said Boscarino, a Vietnam combat veteran himself.

Although therapy doesn't necessarily have to be extensive, it should occur shortly after a person has experienced a traumatic event such as combat or a natural disaster. Treatment may be critical to avoiding depression, PTSD and substance abuse related problems following such exposures, Boscarino said.

It has been theorized that people with a lesser degree of cerebral lateralization, as measured by mixed handedness, would have a greater likelihood of developing PTSD. This is because the right brain hemisphere is believed to be significant in threat identification and in the regulation of emotional responses.

People with reduced cerebral lateralization for language, as indexed by increased mixed-handedness, were thought to be more sensitive to perceived threat and prone to experience emotions more intensely. This was because their cerebral organization was thought to give primacy to right-hemisphere contributions in cognitive processes.

"What we've found is a near-conclusive link between handedness and a person's predisposition toward PTSD," Boscarino said. "These findings may be useful in mitigating some of the adverse outcomes associated with traumatic stressor exposures."

Bubonic Plague Kills a Monkey at the Denver Zoo

By MINDY SINK

DENVER — The death of a monkey at the Denver Zoo from bubonic plague has prompted officials to change the habitats of some zoo animals and renew efforts to keep visitors from feeding the urban wildlife here.

The animal, an 8-year-old female hooded capuchin monkey named Spanky, was the first zoo animal to be infected with the plague since an outbreak was detected last month in squirrels and a rabbit in City Park, just outside the zoo.

Bubonic plague, which came to be called the Black Death as it killed millions of people throughout Europe in the 14th century, is carried by fleas that infect rodents. Today, it is found mainly in rural areas of the West. While it can be deadly in humans and some animals, bubonic plague is treatable.

The zoo’s senior veterinarian, Dr. Dave Kenny, said that the capuchin monkeys were recently moved to their summer habitat, an island with tall trees that squirrels can climb. Dr. Kenny said he thought that Spanky “found a nest of ground squirrels” that carried the plague.

“Because it was pretty acute,” he said, “it makes the most sense that she ingested an infected squirrel.”

Spanky appeared lethargic last Tuesday and was found dead last Wednesday morning. The cause was confirmed Friday.

While none of the 17 other capuchin monkeys at the zoo have shown signs of illness, all are being treated with antibiotics and have been moved back to cages where visitors can still see them swinging around.

“It’s not particularly shocking,” said Steve Feldman, a spokesman for the Association of Zoos and Aquariums, a nonprofit zoo accreditation group in Silver Spring, Md. “And it shouldn’t be alarming to the public, either. Animals in zoos are kept appropriately separated from the visiting public and receive the highest level of veterinary care.”

There are 4,000 animals at the Denver Zoo, and Dr. Kenny said it was not known how susceptible many exotic animals were to the plague. To prevent any possible exposure, zoo officials decided Tuesday not to open a summer exhibit where visitors have been previously allowed to pet Nubian and pygmy goats. In addition, some animals will be fed indoors, rather than outside where leftover grains could attract squirrels.

After the plague was discovered, zoo officials put up large signboards with pictures of squirrels that read, “I know I am cute, but please don’t feed or touch me!” Although officials say that the chance of contamination from hungry squirrels to humans is “slim to none,” they want to discourage any interactions. Staff members will also patrol the zoo to “share information” about the plague and monitor squirrel activity.

“We see plague every year in rural and semirural areas,” said Dr. John Pape, an epidemiologist with the Colorado Department of Public Health and Environment. “It is unusual to see it in the center of an urban area, but it is not unprecedented.”

Dr. Pape said the infection did not come as a complete surprise because squirrels had “24/7 access” to the zoo and could get into the wild-animal enclosures.

On Tuesday, neither rain nor plague kept visitors from the zoo, and employees and volunteers said visitors seemed unconcerned about the health risk.

“I’ve been kind of expecting questions, but I didn’t get any yet,” said a zoo docent, Melissa DeCost, who was showing schoolchildren an anteater’s skull. “I would tell them not to feed the squirrels though.”

Glenda Reynolds, who was serving as a chaperone to a group of schoolchildren from Cheyenne, Wyo., said there was little concern about the plague. “We already tell our kids not to touch or feed the animals here,” Ms. Reynolds said. “No kids asked about it.”

But at the caged capuchin monkey exhibit, Ryan Picket, 9, was pointing and shouting, “These are the monkeys that died.” Ryan said he learned of the plague from watching the news on television. “I thought it was kind of sad, and I was kind of nervous to come to the zoo today.”

Ryan’s father, Matt Picket, was less dramatic. “I don’t think it’s really that big of a deal,” Mr. Picket said. “There are so few squirrels around, and we are telling the kids not to touch or feed them.”

The current spread of plague here could be wiped out with the dry heat of summer. “If it gets hot and dry,” Dr. Pape said, “the fleas won’t survive.”

Captive shark had 'virgin birth'

Female hammerhead sharks can reproduce without having sex, scientists confirm.

The evidence comes from a shark at Henry Doorly Zoo in Nebraska which gave birth to a pup in 2001 despite having had no contact with a male.

Genetic tests by a team of scientists from Belfast, Nebraska and Florida prove conclusively that the young animal possessed no paternal DNA, the journal Biology Letters reports.

The bonnethead is a species in the hammerhead group

The type of reproduction exhibited had been seen before in bony fish but never in cartilaginous fish such as sharks.

Parthenogenesis, as this type of reproduction is known, occurs when an egg cell is triggered to develop as an embryo without the addition of any genetic material from a male sperm cell.

Population concern

The puzzle over the hammerhead birth was reported widely in 2001, but it is only with the emergence of new DNA profiling techniques that scientists have now been able to show irrefutably what happened.

The investigation of the birth was conducted by the research team from Queen's University Belfast, the Southeastern University in Florida, and Henry Doorly Zoo itself.

The scientists say the discovery raises important issues about shark conservation.

In the wild, these animals have come under extreme pressure through overfishing and many species have experienced sharp declines.

If dwindling shark groups resort to parthenogenesis to reproduce because females have difficulty finding mates, this is likely to weaken populations still further, the researchers warn.

The reason is that asexual reproduction reduces genetic diversity and this makes it harder for organisms to adapt - to changed environmental conditions or the emergence of a new disease, for example.

With normal sex, the mixing of maternal and paternal DNA introduces genetic novelty which can give animals new traits that might be advantageous in their new circumstances.

Sex marks

Dr Paulo Prodohl, a co-author on the Biology Letters paper from Queen's School of Biological Sciences, said: "Vertebrates in general have evolved away from parthenogenesis to boost genetic diversity and enhance evolutionary potential.

"The concern for sharks is that not only could we be reducing their numbers but we could be making them less fit as well." "Our findings will now have to be taken into consideration for any conservation management strategy, especially for overexploited species."

The birth of the hammerhead (of the bonnethead species, Sphyrna tiburo) at Henry Doorly was as tragic as it was puzzling. The new pup was soon killed by a stingray before keepers could remove it from its tank.

At the time, some theorised that a male tiger shark kept at the zoo could have been the father - but the institution's three bonnethead females had none of the bite marks that are usually inflicted on their gender during shark sex. Some even suggested that one of the females could have had sex in the wild and stored the sperm in her body - but the three-year period in captivity made this explanation highly unlikely.

The new tests on the dead pup's tissues now show the newborn's DNA only matched up with one of the females - and there was none of any male origin.

Although extremely rare in vertebrates, parthenogenesis (from the Greek for "virgin birth") occurs in a number of lower animals. Insects such as bees and ants use it to produce their drones, for example.
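The DNA-profiling logic behind the finding can be sketched simply: at every marker locus, each of the pup's alleles must already be present in the mother's genotype; any allele the mother lacks would have to be paternal. The genotypes below are invented for illustration and are not the study's data:

```python
def has_paternal_dna(pup, mother):
    """Return True if any pup allele at any locus is absent from the
    mother's genotype at that locus (i.e., it must be paternal)."""
    for locus, alleles in pup.items():
        maternal = set(mother[locus])
        if any(a not in maternal for a in alleles):
            return True
    return False

# Hypothetical microsatellite genotypes (allele lengths) at two loci
mother = {"locus1": (120, 124), "locus2": (88, 92)}
pup_parthenogenetic = {"locus1": (120, 120), "locus2": (92, 92)}
pup_sexual = {"locus1": (120, 130), "locus2": (92, 96)}

print(has_paternal_dna(pup_parthenogenetic, mother))  # False
print(has_paternal_dna(pup_sexual, mother))           # True
```

A pup whose every allele matches the mother, across enough loci, effectively excludes any father, which is the "no paternal DNA" result the team reported.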

Plant extract may block cannabis addiction

22:00 22 May 2007

Roxanne Khamsi, news service

A drug which reduces the desire for marijuana and blocks its effect on the brain has been successfully tested in rats. Scientists say the findings may translate into better therapies for cannabis addiction in humans.

Rodents given a compound derived from a plant in the buttercup family lose their hankering for a synthetic version of tetrahydrocannabinol (THC) - the active compound in marijuana. The treatment also blocked a reward response in the animals' brains when they did receive synthetic THC.

In the first part of the experiment, Steven Goldberg at the National Institute on Drug Abuse in Maryland, US, and his colleagues placed rats in a cage with a lever the animals could push. Each time the rats leaned on the lever, they received a dose of the synthetic THC through a small tube running into their body.

Over a period of three weeks the rats learned to enjoy the effects of synthetic THC and frequently self-administered the drug. By comparison, rats that received saline solution did not press the lever often.

Goldberg's team then injected the rats with a compound derived from the seeds of the Delphinium brownii plant, which is in the buttercup family. The compound, known as methyllycaconitine (MLA), had a dramatic effect on the animals' behaviour.

Blocking dopamine

On the day that they received MLA they pushed the lever for synthetic THC 70% less than before. The drug did not seem to otherwise change the rats' movement and coordination, and had no other apparent side effects.

The scientists also took a close look at the effects of MLA on the rats' brains. They used a technique called microdialysis to take tiny fluid samples from a reward-signalling area of the brain known as the nucleus accumbens, which sits in the basal forebrain. When rats receive synthetic THC, levels of the reward chemical dopamine normally shoot up in the nucleus accumbens - but MLA blocked the release of dopamine in this brain region.

"The increases in dopamine are virtually non-existent because of MLA," says Goldberg. He adds that MLA did not lower dopamine levels below normal amounts. This is important, says Goldberg, because it suggests that a similar therapy for humans would not interfere with normal reward signalling in the brain.

He notes that the drug rimonabant, which makes monkeys less likely to self-administer THC, has been linked to depression in humans.

The exact mechanism by which MLA works remains a mystery. Scientists know that MLA binds to specific cell receptors in the brain called alpha-7 nicotinic receptors. They speculate that cannabis indirectly triggers these receptors, but cannot do so when the receptors are blocked by MLA.

Human potential

There is a genuine need for medications to help cannabis addicts overcome their drug problem, according to Goldberg: "About 10% of the people who experiment with it go on to heavy use and have trouble voluntarily giving it up. I think there is a proportion of the population who need ways to make them stop."

Drug-makers have recently made medications such as Chantix available to help people quit tobacco smoking. But researchers say that these drugs affect different nicotinic receptors than those triggered by THC.

And while some people have pushed rimonabant as a possible remedy for addiction, Goldberg says that more options - such as one based on MLA - must be explored: "Each patient is different and what works in one might not work in another."

Journal reference: Journal of Neuroscience (DOI: 10.1523/JNEUROSCI.0027-07.2007)

The 300th Birthday of the Man Who Organized All of Nature

By JAMES BARRON

As a warm-up to a birthday celebration, it was sedate, and also spartan.

There was no cake. There were no candles. There was just a bunch of middle-aged men standing around in the back room of the library at the New York Botanical Garden yesterday, talking about the birthday boy and sex.

They were talking about the genius of genuses — or genera, to cite the preferred plural. Either way, Carl Linnaeus, the Swedish naturalist, was born 300 years ago today and is remembered as the man who gave the world modern taxonomy, the science of classifying organisms.

Among the things said about Linnaeus at the Botanical Garden in the Bronx was this: “The terminology he used, you could construe as botanical pornography.”

That was as racy as it got.

The man who mentioned “botanical pornography” was Robbin C. Moran, a Linnaeus expert and the garden’s curator of ferns. He described Linnaeus as an egotist who once declared, “God creates, Linnaeus arranges.” Dr. Moran said Linnaeus’s contemporary, Albrecht von Haller, topped that, calling Linnaeus “the second Adam” because Adam was the one who named everything the first time around, in the Garden of Eden.

Linnaeus is known for some firsts of his own, besides introducing his system of nomenclature for living things. He was the first to use a centigrade thermometer the way it is used today. (Anders Celsius was the first to divide the range between the freezing point and boiling point of water into 100 units, but made zero the boiling point and 100 the freezing point.)
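Because Celsius's original scale ran the opposite way, converting one of his readings to the modern scale is simply a subtraction from 100:

```python
def original_celsius_to_modern(reading):
    """Celsius's original scale put 0 at boiling and 100 at freezing,
    so the modern value is 100 minus the original reading."""
    return 100 - reading

print(original_celsius_to_modern(100))  # 0   (freezing point of water)
print(original_celsius_to_modern(0))    # 100 (boiling point of water)
```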

Linnaeus was also the first person who figured out how to grow bananas in Europe. (Imitating the monsoon climates of Asia, he let the soil dry out, then bombarded it with water.)

Dr. Moran said Linnaeus presented his banana crop to the king and queen of Sweden. And yes, they had no bananas in Sweden until after World War I — at least not commercially. “The banana plant was a novelty,” Dr. Moran said. “Even to see the plant was a marvel.”

So what about the name of the man who named things? His last name was derived, appropriately enough, from the name of a tree.

“A European basswood,” Dr. Moran said.

Linnaeus’s father was born Nils Ingemarsson. Dr. Moran said Swedes were switching to Latinized surnames around the time Nils went to theology school. He decided to name himself after the linden tree that grew on the family farm. That gave him the name Lin, which was Latinized to Linnaeus. He became Nils Linnaeus; his son was Carl Linnaeus but became Carl von Linné after Sweden made him a noble in 1761.

As if to make things more confusing, a paper by Dr. Moran points out that Linnaeus later classified the linden tree from the family farm as Tilia cordata.

Gordon L. McDaniel, the library’s head of cataloging services, pulled out a microfilmed copy of Linnaeus’s baptismal record. Then he and Dr. Moran looked at half a dozen rare books by and about Linnaeus from the library’s collection.

In one, prepared for a wealthy Dutchman who had hired Linnaeus to catalog his garden, Linnaeus’s likeness appears twice in the same engraving in the front of the book. The engraver used Linnaeus’s face to represent the god Apollo.

“I don’t know of any other scientist who depicted himself as a Greek god,” Dr. Moran said. (In his writing, he has taken his claim beyond science, saying he knows of “no other Enlightenment or Renaissance scholar” who had let himself be shown that way.)

Linnaeus also appears as himself, putting a garland on the Dutchman.

Dr. Moran and Dr. McDaniel moved on to another book. “Now, the sexual system,” Dr. Moran said. “This is where it gets good.”

Linnaeus, Dr. Moran said, decided that plants enjoy sex. He came to that conclusion as a 22-year-old university student and handed in a paper about it on New Year’s Day, when, by custom, Swedish university students usually made up a flattering verse about a professor.

“This was a bombshell,” Dr. Moran said. “Science had been dominated by Aristotle, who said that plants don’t move, so plants don’t have sex.”

Going public with statements like “every animal feels the sexual urge” was, Dr. Moran said, “gutsy” of Linnaeus, who also declared in the New Year’s paper, “Yes, love comes even to the plants.” He also wrote about plants as being brides and bridegrooms, the latter embracing the former.

“The biggest objection to Linnaeus’s sexual system of plant classification was that it was immodest,” Dr. Moran said. “You couldn’t teach it to women and young people.”

Linnaeus also found a way to have the last word. “Linnaeus got even as only a taxonomist can,” Dr. Moran said. “He named smelly, ugly plants after his critics.”

So he named a weed Siegesbeckia, after Johann Siegesbeck, a German who called Linnaeus’s work “loathsome harlotry” and also said, “Who would have thought that bluebells, lilies and onions could be up to such immorality?”

Adult brain cells rediscover their inner child

Hopkins study shows adult-born nerves experience brief period of child-like learning

You may not be able to relive your youth, but part of your brain can. Johns Hopkins researchers have found that newly made nerves in an adult brain's learning center experience a one-month period when they are just as active as the nerves in a developing child. The study, appearing this week in Neuron, suggests that new adult nerves have a deeper role than simply replacing dead ones.

Song and his colleagues tracked the chemical signals received by newly made nerve cells in the adult mouse hippocampus, a brain structure dedicated to learning and memory, by injecting virus particles to light up nerve progenitor cells. Any freshly made nerves glowed green and became permanently marked for later identification.

"In essence, we stamped a birth date on new adult nerve cells," says Hongjun Song, Ph.D., assistant professor of neurology at Johns Hopkins' Institute for Cell Engineering. "The brief heightened activity we saw may help explain how adults continue to adapt to new experiences even though adult brains are more hardwired than children's brains," he adds. The slow and gradual addition of new nerve cells may be like a fine-tuning system, allowing adults to incorporate fresh information without altering our basic brain circuitry.

When they looked at brains from these mice, the researchers noticed that hippocampal nerves that were between 1 and 2 months old could dramatically increase or decrease the amount of signaling chemicals they receive from neighboring nerves. This ability of nerves to modulate their chemical inputs, known as synaptic plasticity, is especially high in developing brains but tends to become less intense in adults.

While the exact contribution adult-born neurons make to overall learning and memory remains mysterious, Song notes that these results are promising for any future nerve stem cell therapy. "If we can implant or stimulate these adult stem cells in damaged areas, it's possible we can do more than fill in lost nerve connections," he says. "We might be able to rejuvenate an aging brain."

Moderate drinking lowers women's risk of heart attack

BUFFALO, N.Y. -- Women who regularly enjoy an alcoholic drink or two have a significantly lower risk of having a non-fatal heart attack than women who are life-time abstainers, epidemiologists at the University at Buffalo have shown.

Moderation is the key, however. Women in the study who reported being intoxicated at least once a month were nearly three times more likely to suffer a heart attack than abstainers, results showed.

One difference in the protective pattern among drinkers involved those who drank primarily liquor. Women who preferred liquor to wine experienced a borderline increase in risk of heart attack, results showed.

The study is published in the May 2007 issue of the journal Addiction.

"These findings have important implications, because heart disease is the leading cause of death for women," said Joan M. Dorn, Ph.D., associate professor of social and preventive medicine in the UB School of Public Health and Health Professions and first author on the study.

Women seem to have a quicker reaction to a smaller amount of alcohol, she noted: "Overdoing it is harmful, and what is too much depends on each individual. In some women, one drink can cause intoxication."

Moderate alcohol consumption has been shown to lower the risk of heart attack, but most studies have been done with men. The current study compared alcohol drinking volume and drinking patterns of women who had been hospitalized due to a heart attack, with age-matched controls without heart problems.

Women who had a prior heart attack, coronary bypass surgery, angioplasty, angina or a previous diagnosis of cardiovascular disease were excluded from the study.

Participants -- 320 heart attack patients and 1,565 controls -- were enrolled between 1996 and 2001. Extensive information was collected on the type of beverage consumed, serving size for each beverage and number of drinks consumed during the two years prior to the heart attack, or for controls, two years prior to the interview.

The researchers computed several variables. Drinking status was categorized as lifetime abstainers (women who reported never having 12 or more drinks in their lifetime or in any 1-year period); non-current drinkers (those who didn't consume at least one drink per month during the reference period); and current drinkers.

Additional variables calculated were: total ounces of alcohol consumed; drinks per drinking day; drinking frequency; drinking primarily with food; beverage preference -- wine, beer, liquor, or some of each; and frequency of intoxication -- current drinkers who stated they drank enough to get drunk or very high, once or more a month, and less than once a month.
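The drinking-status definitions above amount to a simple decision rule, sketched here with illustrative field names (not the study's actual coding):

```python
def drinking_status(lifetime_drinks, max_drinks_any_year, drinks_per_month):
    """Classify a participant per the study's stated definitions:
    - lifetime abstainer: never 12+ drinks in lifetime or in any one year
    - non-current drinker: under 1 drink/month in the reference period
    - current drinker: otherwise
    """
    if lifetime_drinks < 12 and max_drinks_any_year < 12:
        return "lifetime abstainer"
    if drinks_per_month < 1:
        return "non-current drinker"
    return "current drinker"

print(drinking_status(5, 3, 0))      # lifetime abstainer
print(drinking_status(500, 60, 0))   # non-current drinker
print(drinking_status(500, 60, 20))  # current drinker
```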

Results showed that in this population-based study, women who drank moderately had a significantly lower risk of heart attack than abstainers, and the benefits were greatest in women who had a drink daily. A lower risk for drinkers than abstainers also was evident in women who drank with food, as well as without, and in those who primarily drank wine or a variety of alcoholic beverages.

Similar, but weaker, associations were found when patterns and volume were analyzed among drinkers only. Among these women, drinking alcohol in moderation in general was more important than the actual amount consumed. However, getting drunk at least once a month puts women at a significantly increased risk of heart attack, negating any of alcohol's potential protective effect.

Dorn emphasized that no one should interpret these finding as a reason to begin consuming alcohol, because alcohol brings with it risks for other conditions, such as breast cancer.

"I certainly wouldn't recommend that women start drinking, but among those who do, if they are concerned about heart health, the message is that a small amount is OK."

Plants that produce more vitamin C may result from UCLA-Dartmouth discovery

UCLA and Dartmouth scientists have identified a crucial enzyme in plant vitamin C synthesis, which could lead to enhanced crops. The discovery now makes clear the entire 10-step process by which plants convert glucose into vitamin C, an important antioxidant in nature.

"If we can find ways to enhance the activity of this enzyme, it may be possible to engineer plants to make more vitamin C and produce better crops," said Steven Clarke, UCLA professor of chemistry and biochemistry, director of UCLA's Molecular Biology Institute and co-author of the research study, to be published as a 'Paper of the Week' in the Journal of Biological Chemistry and currently available online.

"We hit on gold," Clarke said, "because we now have a chance to improve human nutrition and to increase the resistance of plants to oxidative stress. Plants may grow better with more vitamin C, especially with more ozone in the atmosphere due to pollution."

Carole Linster, a UCLA postdoctoral fellow in chemistry and biochemistry and lead author of the study, discovered the controlling enzyme, GDP-L-galactose phosphorylase, which catalyzes a key step in the biosynthetic pathway by which plants manufacture vitamin C.

"Our finding leads to attractive approaches for increasing the vitamin C content in plants," Linster said. "We now have two strategies to provide enhanced protection against oxidative damage: Stimulate the endogenous activity of the identified enzyme or engineer transgenic plants which overexpress the gene that encodes the enzyme."

When life on Earth began, there was almost no oxygen, Clarke noted.

"Two billion years ago, plants devised an efficient way to get sunlight to make sugar from carbon dioxide that produced oxygen as a waste product; that waste product probably killed off most of all living species at that time," Clarke said. "The only organisms that survived developed defenses against it, and one of the best defenses is vitamin C. Plants learned how to make vitamin C to protect themselves."

Prior to the new research, vitamin C may have been the most important small molecule whose biosynthetic pathway remained a mystery.

An essential vitamin for humans, vitamin C is also an important antioxidant for animals and plants. Humans do not have the ability to make vitamin C and must get it from dietary sources, especially plants. It was not until 1998 that a biosynthetic pathway was proposed to explain how plants make this compound. Research confirmed much of the pathway, although one crucial missing link continued to baffle scientists and remained unknown until this new research.

Clarke, who studies the biochemistry of aging, said the finding is an example of serendipity in science.

The research started as an effort to understand the role of a gene in Caenorhabditis elegans, a tiny worm used as a model for aging studies by Tara Gomez, a former UCLA undergraduate in Clarke's laboratory and now a graduate student at the California Institute of Technology. The gene's sequence suggested that it was related to a family of genes altered in cancer, known as HIT genes; these genes are studied in the laboratory of Charles Brenner at the Norris Cotton Cancer Center at Dartmouth Medical School.

Collaboration between Clarke's and Brenner's laboratories revealed a similarity between the worm gene and the product of the VTC2 gene of Arabidopsis thaliana, a small roadside plant. Mutations in this gene had been previously linked to low levels of vitamin C. Linster and Gomez were able to express and to purify the plant VTC2 enzyme from bacteria. The research team, led by Linster, produced the GDP-L-galactose substrate and reconstituted in test tubes the mysterious seventh step in vitamin C synthesis.

Clarke and Brenner liken the first six steps in vitamin C synthesis to a roadmap in which there are multiple possible routes from glucose to a variety of cellular compounds. Once the GDP-L-galactose compound takes the exit marked "VTC2," however, the atoms are reconfigured to make vitamin C. The remaining three steps, like a curving driveway, "require some turns but no real choices and no backing up," Brenner said.

The researchers are still studying what VTC2-related genes do in animals and how these genes may relate to aging and cancer.

Follow the 'green' brick road?

Bricks made from coal-fired power plant waste pass safety test

Researchers have found that bricks made from fly ash--fine ash particles captured as waste by coal-fired power plants--may be even safer than predicted. Instead of leaching minute amounts of mercury as some researchers had predicted, the bricks apparently do the reverse, pulling minute amounts of the toxic metal out of ambient air.

Each year, roughly 25 million tons of fly ash from coal-fired power plants are recycled, generally as additives in building materials such as concrete, but 45 million tons go to waste. Fly ash bricks both find a use for some of that waste and counter the environmental impact from the manufacture of standard bricks.
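The tonnage figures above imply that only about a third of fly ash currently finds a use; the quick arithmetic:

```python
# Annual fly ash tonnage from the article: 25 million tons recycled,
# 45 million tons wasted
recycled, wasted = 25e6, 45e6
total = recycled + wasted

print(int(total))                     # 70000000 tons produced per year
print(round(recycled / total * 100))  # 36 (percent currently recycled)
```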

"Manufacturing clay brick requires kilns fired to high temperatures," said Henry Liu, a longtime National Science Foundation (NSF) awardee and the president of Freight Pipeline Company (FPC), which developed the bricks. "That wastes energy, pollutes air and generates greenhouse gases that contribute to global warming. In contrast, fly ash bricks are manufactured at room temperature. They conserve energy, cost less to manufacture, and don't contribute to air pollution or global warming."

Once colored and shaped, the FPC bricks are similar to their clay counterparts, both in appearance and in meeting or exceeding construction-material standards.

Supported by NSF's Small Business Innovation Research (SBIR) program, Liu has been working since 2004 to develop the bricks. The first phase of support enabled him to make fly ash bricks more durable by engineering them to resist freezing and thawing due to weather. Liu is now working from a second-phase SBIR award to test the brick material's safety and prepare it for market.

"Green manufacturing is a focus for the nation," said Tom Allnutt of NSF's SBIR program, who oversaw Liu's award. "Liu's innovative use of fly ash to manufacture high quality building materials will potentially decrease some of the negative environmental impact of coal-fired power generation while meeting increasing demands for greener building materials."

While researchers need to study the bricks further to determine how the mercury adsorption occurs and how tightly the metal is trapped, the new findings suggest the bricks will not have a negative impact on indoor air quality.

On average, air contains low amounts of mercury that can range from less than 1 nanogram per cubic meter (ng/m3) to tens of ng/m3--a small fraction of the Environmental Protection Agency limit for continuous exposure.

Inside a confined experimental chamber, the bricks did not raise mercury levels in the surrounding air (originally more than 1 ng/m3), and instead appeared to lower the concentration to roughly 0.5 ng/m3.
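The figures above can be sanity-checked with some quick arithmetic. This is only a sketch: the EPA reference concentration used here (300 ng/m3 for chronic inhalation of elemental mercury) is an assumption supplied for illustration, not a number given in the article.

```python
# Rough arithmetic behind the ambient-air and chamber figures quoted above.
# ASSUMPTION: 300 ng/m^3 as the EPA chronic-inhalation reference level;
# the article only says ambient levels are "a small fraction" of the limit.
EPA_LIMIT_NG_M3 = 300.0

ambient_high = 30.0     # high end of the quoted ambient range, ng/m^3
chamber_before = 1.0    # "more than one nanogram" per m^3
chamber_after = 0.5     # "roughly half a nanogram" per m^3

fraction_of_limit = ambient_high / EPA_LIMIT_NG_M3
removal = 1 - chamber_after / chamber_before

print(f"High-end ambient air: {fraction_of_limit:.0%} of the assumed limit")  # → 10%
print(f"Chamber mercury removed by bricks: about {removal:.0%}")              # → 50%
```

Even at the high end of the ambient range, airborne mercury sits an order of magnitude below the assumed exposure limit, which is consistent with the article's "small fraction" wording.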

Engineers from FPC of Columbia, Mo., developed the bricks with NSF support and reported their findings on mercury leaching at the May 7-10, 2007, World of Coal Ash Conference in Cincinnati, Ohio.

New study indicates that people may need more dietary choline than previously thought

Eggs one of the best sources of the nutrient

Washington, D.C. -- A new study published in the May issue of the American Journal of Clinical Nutrition indicates that the current recommended Adequate Intake (AI) for choline may, in fact, be inadequate for some people.1 Choline is an essential nutrient for normal functioning of all cells, including those involved with liver metabolism, brain and nerve function, memory, and the transportation of nutrients throughout the body.

In this depletion-repletion study, 57 adult subjects (26 men, 16 premenopausal women and 15 postmenopausal women) were fed a diet containing 550 mg of choline for 10 days, then fed less than 50 mg a day of choline for up to 42 days.

• When deprived of the nutrient, 77 percent of men, 80 percent of postmenopausal women and 44 percent of premenopausal women developed fatty liver or muscle damage.

• Six men (23 percent) developed these signs while consuming the initial 550 mg of daily choline, even though 550 mg is the current AI for men.

• Nineteen percent of the subjects required as much as 825 mg of daily choline to prevent or reverse the organ dysfunction associated with the low-choline diet, an amount significantly higher than the current AI.

• For all participants, blood homocysteine levels increased during choline depletion. Other studies have associated high homocysteine levels with heart disease.

"These study results clearly indicate that some adults, notably men and post-menopausal women, need more choline than is recommended by the current AI," says study co-author Kerry-Ann da Costa, PhD, a research assistant professor at the University of North Carolina at Chapel Hill. "We hope these findings will aid the Institute of Medicine in refining the Dietary Reference Intake (DRI) of this nutrient."

This is the most complete study of choline requirements to date and the first to include women. Its division of participants into two groups – one receiving dietary supplementation of folic acid and one not – also showed that susceptibility to choline deficiency was not altered by folic acid supplementation.

Closing the Choline Gap

Additional population research demonstrates that choline intake is far below the current AI, raising concern that many individuals' intakes may be too low to meet their needs.

• Research conducted at Iowa State University found that only 10 percent or less of older children, men, women and pregnant women in America get the AI of choline each day.2

• A separate study presented this month at the National Nutrient Data Bank Conference found that choline intake decreases with age and that adults ages 71 and older consume an average of about 264 milligrams per day – roughly half of the AI for choline.3

Eggs, beef liver, chicken liver and wheat germ are considered excellent sources of choline. Two eggs contain 280 milligrams of choline – about half the recommended daily intake.
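The "half the recommended daily supply" claim can be checked directly against the AI figure quoted earlier in the article (550 mg per day for men); the per-egg value of 140 mg is simply the quoted 280 mg split across two eggs.

```python
# Checking the egg arithmetic against the article's AI of 550 mg/day for men.
AI_MEN_MG = 550
choline_per_egg_mg = 140  # 280 mg quoted per two eggs

two_eggs = 2 * choline_per_egg_mg
share_of_ai = two_eggs / AI_MEN_MG

print(f"Two eggs: {two_eggs} mg, {share_of_ai:.0%} of the 550 mg AI")  # → 280 mg, 51%
```

So two eggs supply just over half the AI for men – close enough that the article's "half" shorthand holds.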

"Eggs are a practical food that can help people get the choline they need, along with several other nutrients, at just 75 calories an egg," says registered dietitian Maye Musk. "Choline is actually found in the yolk of the egg, so people who consistently only eat egg whites may be missing out on a key nutrient opportunity."

Why Choline Matters

The importance of dietary choline has been well-established.

• A 2004 study in the American Journal of Epidemiology linked poor dietary choline to adverse outcomes during pregnancy, including a four-fold increased risk of having a baby with a neural tube defect.4

• A research review published in the Annual Review of Nutrition suggests that choline plays an important role in normal fetal development, particularly during the stages involved in learning and life-long memory function.5

Mars rover's disability leads to major water discovery

* 15:49 23 May 2007

* news service

* David L Chandler

Ironically, the most severe mechanical failure ever experienced by either of NASA's two Mars rovers has led to one of their most important discoveries: the first silica found on the planet, a telltale sign that ancient water was involved in its formation.

The rover Spirit, roaming inside a large crater called Gusev, suffered a failure in 2006 that froze one of its six wheels. Ever since then, the rover has had to drag the immobilised wheel along as it moves, significantly slowing down its progress and scraping away a layer of soil as it rolls.

"It produced an inadvertent trenching tool," said rover lead scientist Steven Squyres, who presented the findings on Tuesday at a meeting of the American Geophysical Union in Acapulco, Mexico.

Spirit's mini-Thermal Emission Spectrometer found that an unusually bright patch of soil revealed in one of these trenches produced a strong signature of silica, a mineral whose formation usually depends on water and which had never been seen before by either rover.

"Nobody's ever found high-silica soil on Mars," Squyres told New Scientist. "Silica is particularly soluble in water, especially hot water."

An immobilised wheel on the Spirit rover uncovered a bright patch of soil that is high in silica – a mineral whose formation usually depends on water (Image: NASA/JPL/Cornell)

Lots of water

There are two known ways that such silica-rich soil – which is 90% pure silicon dioxide – could have formed. Water heated by subsurface volcanic activity, with lots of silica dissolved in it, could have percolated up into the soil, and then as it evaporated left the silica behind.

Or hot, highly acidic steam from a volcanic eruption – essentially concentrated sulfuric acid – could have rained down on soil that contained a variety of minerals, and leached away everything except the silica. Both mechanisms occur on Earth: the action of acidic steam is seen around fumaroles in places such as Yellowstone, and the water percolating through volcanic soil is common around volcanoes in Hawaii.

Either way, there was a lot of water involved. "Both of these involve substantial interactions of water with hot volcanic material," Squyres said. And the site is just a few metres away from the formation known as Home Plate, a bedrock formation produced by the interaction between water and very hot basaltic lava in a volcanic explosion. Because of the new find, the small depression where Spirit is currently exploring has now been dubbed "silica valley".

In hopes of figuring out which way it formed, the team will use other instruments to try to determine the composition of the remaining 10% of the soil, Squyres said. "We're using the good front wheel to intentionally scuff up the soil [to study the composition]," he said. "You don't see this stuff unless you scrape it up."

Bugs struck down by 'super-oxidised' water

* 23 May 2007

* news service

* Andy Coghlan

WATER washes away many things, but could it be used to kill harmful viruses, fungi and bacteria in wounds? The developers of a form of "super-oxidised" water certainly think so - and they claim it may do so more effectively than bleach, without harming human tissue.

Information on the product, called Microcyn, was presented last week at Global Healthcare, a biomedical business conference in Monte Carlo, Monaco. It revealed that wounds of patients with diabetes treated with the product and an antibiotic healed within 43 days on average, compared with 55 days for patients given the standard treatment of iodine plus an antibiotic.

Oxychlorine ions are the key ingredient, rapidly piercing the walls of free-living microbes and killing them. Human cells are spared because they are tightly bound together in a matrix, says Hoji Alimi, founder of Oculus, the company in Petaluma, California, that developed Microcyn. "Microcyn only kills cells it can completely surround," he says.

Ordinarily, water consists of hydroxyl and hydrogen ions as well as H2O molecules. However, by exposing purified water to sodium chloride through a semi-permeable membrane and then using electrolysis, various oxychlorine ions are formed too. These kill microbes and viruses, but are present in much lower amounts than in bleach, which also contains a slightly different combination of ions, including large amounts of the highly reactive hypochlorite ion.

Despite containing 300 times less hypochlorite than bleach, Microcyn killed 10 strains of bleach-resistant bacteria, according to a study by Eileen Thatcher of Sonoma State University in Rohnert Park, California. "It may be that other, unusual ions [that are] in Microcyn but not bleach are instantly lethal to bugs," says Thatcher. Alimi has also found a way to stabilise the ions by making them react with and regenerate each other during storage, so that the fluid remains active for up to two years.

While Microcyn was officially approved in the US for cleaning wounds around two years ago, some physicians have also been using it "off label" to accelerate healing by repeatedly applying it to the wound. "When you spray it on, you see the treated tissue 'pink up' and go beefy, which is good because it means the oxygen supply has resumed," says Cheryl Bongiovanni, director of wound care at the Lake District Hospital in Lakeview, Oregon, who has used Microcyn on around 1000 diabetic patients with leg and foot wounds over the past 18 months. Official phase II trials to test the product's wound-healing potential are currently taking place in the US and Europe.

"It does seem promising," says Andrew Boulton of the Manchester Royal Infirmary in the UK, who is conducting one such trial. "Hopefully it will confirm our initial good experience."

Tracy Kelly of Diabetes UK says that 15 per cent of people with diabetes who develop foot ulcers eventually suffer amputations. "We would welcome any safe, effective treatment which could help hasten recovery," she says.

Ancient gene kit came in handy for limbs

* 23 May 2007

* news service

* Jeff Hecht

The hands you hold this magazine with are not quite the dramatic evolutionary innovations you might have thought. The master genes controlling development in the primitive animal known as the paddlefish turn out to be unexpectedly similar to those controlling the development of limbs in land animals. Rather than evolving a new set of control genes for their limbs, it seems that our amphibian ancestors adapted the genes their own ancestors used to develop fins.

The paddlefish Polyodon spathula is often referred to as a "living fossil" – an organism resembling no living species, only fossils. Most previous genetic work on fish has been done on the more highly evolved zebrafish. This appeared to show that the Hox family of control genes differed between land animals and fish, implying land animals had evolved new genes to control growth of hands and feet.

Hox genes control the alignment and polarisation of body structures in all animals, separating head from tail. In fish they arrange the structures of fins, and in land animals the structure of limbs. Zebrafish develop their fins in a single stage, in which Hox genes produce parallel stripes that underlie fin structures. Mice and chickens have a second phase of development in which Hox genes turn on only in the regions that become a hand or foot.

Cartilaginous stain, showing the pectoral fin, the equivalent of the arms of land vertebrates (Image: MC Davis)

"The logical explanation was that since fish don't have hands, they don't have a second stage of Hox gene development, so [addition of] the second stage should correlate with evolution of the hand," says Marcus Davis of the University of Chicago. But zebrafish are highly evolved, so he wondered if they had lost the ancestral form of fin development.

Davis and colleagues looked at Hox genes in the paddlefish, a primitive relative of the sturgeon, because it is relatively unevolved. They found that the little arm fins of paddlefish develop in two phases, implying that the second phase of Hox gene expression had evolved long before arms and legs, but was lost in zebrafish (Nature, vol 447, p 473).

That ancient set of genes played a key role in helping vertebrates crawl onto land. Tiktaalik, the fish with feet discovered last year (New Scientist, 8 April 2006, p 14), "already had the toolkit needed to modify the part of the limbs furthest out", Davis says. As ancestral amphibians moved onto land, they used their existing genetic tools to adapt their limbs.

"Here's something we thought was invented from scratch, but it was there in a deep ancestor of tetrapods," says Sean Carroll, a developmental biologist at the University of Wisconsin, Madison.

US approves birth control pill that blocks menstruation

* 20:22 23 May 2007

* news service

* New Scientist and Reuters

A birth control pill that may eliminate a woman's monthly menstrual period has received approval from US drug regulators.

Lybrel, made by pharmaceutical giant Wyeth, is meant to be taken every day to indefinitely stop monthly menstrual bleeding and prevent pregnancy. It contains two hormones widely used in other oral contraceptives – levonorgestrel (a progestin) and ethinyl estradiol (an oestrogen).

Traditional birth control pills are usually taken for 21 days followed by seven days of placebo pills or no pills, which allows a period of bleeding to occur.

For years, some women have been stopping their periods by taking birth control pills continuously, with no break whatsoever. Lybrel, however, is the first pill approved for that use.

It takes time for periods to be suppressed, so most women will have intermittent bleeding or spotting during the first year of use, according to the US Food and Drug Administration (FDA), which has approved the drug for sale.

Unscheduled bleeding

"The convenience of having no scheduled menstruation should be weighed against the inconvenience of unscheduled bleeding or spotting," the agency said in a statement.

Wyeth studied more than 2400 women aged 18 to 49. In the main study, 59% who took Lybrel for one year had no bleeding or spotting during the last month.

Bleeding and spotting may occur on four or five days each month, said Daniel Shames, deputy director of the FDA office that reviews contraceptives. It decreases over time for most women who stay on the drug for a year. About half of the women in Wyeth's studies dropped out before that time, he notes.

Blood clots and strokes

The new contraceptive has potential major side effects such as blood clots and strokes, similar to those of traditional birth control.

Vanessa Cullins at the reproductive advocacy non-profit Planned Parenthood welcomed Lybrel as a new option. "Women who have been presented with the option of extended hormonal use opt for and like it. It's turning out to be fairly popular," she says.

But others have questioned the idea of eliminating a natural process because it may be inconvenient.

"Menstrual manipulation appears to be another in a long line of attempts to medicalise women's natural biological life events," says sociologist Jean Elson of the University of New Hampshire in Durham, US.

Wyeth said Lybrel should be available in pharmacies in July.

'Teaching gap' exists among US and Asian math teachers, study says

US teachers less effective in use of analogies in math instruction, UC Irvine study finds

Irvine, Calif. -- Compared with math teachers in the high-achieving school systems of Hong Kong and Japan, teachers in the United States provide fewer of the cognitive supports that could help students learn more. This could contribute to the lower performance of U.S. students on international math tests, a UCI researcher has found.

The findings are published in the May 25 issue of Science.

The study analyzed how analogies – a reasoning practice that involves connecting two concepts, often a better-known concept to a less familiar one – are used in the United States, Hong Kong and Japan. Analogies are known to be helpful for learning mathematical concepts, but only if teachers use enough imagery and gestures to direct students' attention to the analogous relations. These strategies, or cognitive supports, are necessary to ensure that students notice and understand the analogies.

U.S. teachers incorporate analogies into their lessons as often as teachers in Hong Kong and Japan, but they less frequently utilize spatial supports, mental and visual imagery, and gestures that encourage active reasoning. Less cognitive support may result in students retaining less information, learning in a less conceptual way, or misunderstanding the analogies and learning something different altogether.

"There is no guarantee that without these cues, the students are actually benefiting from the analogies and thinking about math in a comparative way," said Lindsey Richland, assistant professor of education and co-author of the study.

Richland and research colleagues analyzed videotapes of math lessons from the large-scale video portion of the 1999 Trends in International Mathematics and Science Study. That study found U.S. teachers engaged students in complex connected reasoning and problem solving significantly less than teachers in countries where students score higher in math. Richland examined the instructional uses of analogy in the videotapes and coded the frequency of teaching strategies that provide cognitive supports for students’ reasoning.

The "teaching gap" with respect to analogy could be attributed to different cultural orientations to relational reasoning. However, the authors conclude U.S. math teachers could improve the effectiveness of their analogies through slight adjustments in their instruction.

"Teachers are already using analogies; we’re not recommending going into the classroom and changing the way they’re doing things. But if teachers could be more attentive to the use of these kinds of supports, the students would be likely to benefit and learn a lot more," Richland said.

Magnets may make the brain grow stronger

* 24 May 2007

* news service

* Linda Geddes

Could magnets make the mind grow stronger? In mice at least, stimulating the brain with a magnetic coil appears to promote the growth of new neurons in areas associated with learning and memory. If the effect is confirmed in humans, it might open up new ways of treating age-related memory decline and diseases like Alzheimer's.

Transcranial magnetic stimulation (TMS) has been used experimentally to treat a range of brain disorders, including depression and schizophrenia, and to rehabilitate people after a stroke. TMS uses a magnetic coil to induce electric fields in the brain tissue - activating or deactivating groups of neurons, although the exact mechanism has remained unknown. One theory was that it aided learning and memory by strengthening brain circuits through a process called long-term potentiation (LTP).

To investigate, Fortunato Battaglia at the City University of New York and his colleagues gave mice TMS for five days, then analysed their brains for evidence of LTP or cell proliferation.

They confirmed that TMS enhanced LTP in all areas of the brain tested, by modifying key glutamate receptors so that they stayed active for longer. The team also saw large increases in the proliferation of stem cells in the dentate gyrus of the hippocampus. These cells divide throughout life and are now believed to play a crucial role in memory and mood regulation (See "Memories are made of this?").

"The effect on the stem cells is the most exciting finding," says Battaglia, who presented his results at a meeting of the American Academy of Neurology in Boston earlier this month. Physical exercise and some antidepressants also promote neuron growth, but they can be difficult to target to specific areas.

Battaglia thinks TMS could eventually be used to improve learning and memory in people with age-related memory decline and Alzheimer's - which is associated with a loss of neurons in the hippocampus, among other areas. His team is now running a trial to test this theory.

John Rothwell, a TMS researcher at the Institute of Neurology at University College London, says this is the first time TMS has been shown to enhance neurogenesis, but he questions whether TMS could stimulate neuron growth elsewhere in the brain. However, even if TMS cannot replace lost neurons, Rothwell believes it could still slow down the progression of diseases like Alzheimer's by enhancing LTP. "It may be a way of reinforcing connections that are becoming weaker," he says.

Memories are made of this?

Brain cells generated during adulthood may play a greater role in memory formation than previously thought.

New cells are generated in at least two areas of the brain throughout life. Researchers had speculated that they could be involved in the formation of new memories, but until recently there was little evidence to support this.

Now Hongjun Song at Johns Hopkins University School of Medicine in Baltimore, Maryland, and his colleagues have shown that young adult-grown neurons display similar properties to neurons in the developing nervous system - potentially providing a mechanism by which new neurons could integrate into existing brain circuits and modify them.

The team labelled dividing cells in the hippocampi of mice, and then measured how easily they were activated in response to an electrical stimulus at different ages. Cells aged between 1 and 1.5 months were easier to activate and displayed a greater degree of activation than younger or older cells (Neuron, vol 54, p 559).

"It indicates that adult neurogenesis continuously provides a pool of highly excitable and flexible neurons, facilitating the formation of new connections within the adult brain," says Josef Bischofberger, a neuroscientist at the University of Freiburg in Germany. "At the same time, maturation will slowly 'cool down' the new cells, making them reliable units for stable representation and storage of the newly learned memories."

Song believes the new cells may modify existing networks, helping them incorporate new information and adapt to changing conditions, such as disease.

'Probiotics' could save frogs from extinction

* 13:26 24 May 2007

* news service

* Catherine Brahic

"Probiotics" could be used to tackle a disease which is decimating amphibian populations around the world.

The idea, now tested in the lab, is to use naturally occurring bacteria that kill the fungus which causes the condition. Chytridiomycosis, as the disease is known, has been identified as one of the main threats to the survival of up to a third of the world's amphibian species (see Global frog crisis defies explanation).

Reid Harris, at James Madison University, Virginia, US, and his team have identified over 20 probiotic bacteria that kill the fungus in a Petri dish. Harris, who describes the frogs' backs as mini-ecosystems, home to dozens of bacteria, says there could be many more.

Less weight loss

So far, they have tested two of the bacteria on live salamanders and found that they have different effects on infected animals.

Salamanders that were treated with Pedobacter cryoconitis and then infected with the chytrid fungus were able to clear the infection 30% faster than those that were not treated.

Those that were given Pseudomonas reactans did not clear the fungus any faster, but suffered a less severe infection. "Salamanders lose weight in response to the fungus – about 30% of their weight in 45 days," explains Harris. "For the ones that were given Pseudomonas reactans, this [weight loss] was halved."

Harris says the next step is to run the same experiments on frogs, which are even more vulnerable to the fungus. Salamanders are not killed by the chytrid fungus, but were convenient animals for a first round of experiments because they are fairly common.

The researchers are preparing to test the two bacteria on mountain yellow-legged frogs soon. Initially, testing will continue in the lab, but Harris wants to see it expanded to a field setting.

Mountain yellow-legged frogs live around lakes and ponds in California's Sierra Nevada. Populations are not very large and the frogs do not tend to stray far from the water's edge, so Harris thinks it should be possible to apply bacteria to a large proportion of the wild frogs and observe how it protects the population from future chytrid epidemics.

Not a panacea

Although it would be difficult to protect large populations of frogs in a rainforest in this way, Harris says it might be possible to treat small endemic populations of frogs in the tropics.

He also suggests spraying bacteria over the scene of a chytrid epidemic, pointing out that similar methods have been used to kill the gypsy moth caterpillar. The US National Parks Service has already contacted him about his research, he says.

Probiotics will not be a panacea, however. Nobody knows what is causing the upsurge of chytrid fungus infections. "If human activities are causing a change that's making frogs more vulnerable to chytrid fungi - increasing their stress levels, for instance - then the bacteria will not address this underlying problem," he told New Scientist.

Hubble's successor could be fixed in space after all

* 15:29 24 May 2007

* news service

* David Shiga

Hubble's successor, the James Webb Space Telescope (JWST), may be serviceable in space after all. Although the mission was originally expected to be beyond any possible help after it launches in 2013, NASA officials say they are now looking into minor modifications to the design to allow a servicing spacecraft to dock with it.

JWST will provide an infrared view of unprecedented clarity, allowing it to glimpse the universe's first galaxies and study developing solar systems. It will be placed 1.5 million kilometres from Earth in the direction opposite the Sun, at a spot called L2, almost four times the distance to the Moon.
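The "almost four times the distance to the Moon" comparison checks out. This sketch assumes the standard mean Earth-Moon distance of 384,400 km, which is not stated in the article.

```python
# Verifying the L2-versus-Moon distance comparison quoted above.
# ASSUMPTION: mean Earth-Moon distance of 384,400 km (a standard figure).
L2_DISTANCE_KM = 1_500_000
MOON_DISTANCE_KM = 384_400

ratio = L2_DISTANCE_KM / MOON_DISTANCE_KM
print(f"L2 is {ratio:.1f} times the Earth-Moon distance")  # → 3.9 times
```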

It was assumed from the project's inception that it would be impossible to service at this location. But NASA officials now say they are studying the possibility of making small modifications to JWST so that a potential future servicing mission could go there and dock with the telescope.

The James Webb Space Telescope will be able to see the first galaxies forming in the early universe (Illustration: NASA)

Under study

Edward Weiler, who heads NASA's Goddard Space Flight Center in Greenbelt, Maryland, US, the NASA centre in charge of JWST, said the telescope would be modified to allow for docking, according to a published report.

Weiler was not available for further comment, but Eric Smith, program scientist for JWST at NASA headquarters in Washington, DC, told New Scientist that modifying the spacecraft to facilitate docking is only an idea that NASA is studying at the moment and that no decision would be made about any changes until after a March 2008 review of the spacecraft design.

"What we've actually asked the JWST project to do is a study of what it would take to make the JWST 'grappleable'," Smith told New Scientist. "Is there some small feature you could put on the observatory that would allow a future spacecraft to more or less grab hold of it?"

Smith says the docking feature could allow both human and robotic missions to dock, although there are no official plans to send humans to L2. "The first thing might be to consider how a remote [robotic] craft might get it," he says.

Dangerous radiation

The study is not trying to determine what components could be fixed with a servicing mission, and is limited instead to simply hashing out a design for a docking feature, Smith says.

The new capabilities NASA is developing to send humans to the Moon and beyond have prompted the study of possible modifications to facilitate docking for crewed spacecraft, Smith says.

But Matt Mountain, director of the Space Telescope Science Institute, in Baltimore, Maryland, US, which is responsible for Hubble's science operations and will manage JWST's operations, says any servicing missions are likely to be done by robots rather than humans, since the environment at L2 is hazardous.

JWST will be placed at a spot called L2, 1.5 million kilometres from Earth (Illustration: ESA)

"It's fundamentally not a great place to have people," he told New Scientist. "There's no protection from radiation – there's no Moon to bury yourself in and there's no magnetic field from Earth."

Topping up

Simple robotic missions would be most likely, he says. For example, a small robot could be sent to shake the spacecraft if a solar panel fails to unfurl properly. "A simple micro-satellite goes out, grabs this thing and shakes the spacecraft and tries to loosen [the panel]," he says.

More complicated possibilities could include refuelling the spacecraft. Fuel is used by thrusters that, in combination with internal reaction wheels, point the telescope at different targets, while keeping the spacecraft stable. The fuel supply is the primary factor limiting the spacecraft's five- to 10-year lifetime.

Mountain points out that a mission called Orbital Express is already testing out the technology needed to refuel a spacecraft in orbit (see Estranged satellite pair reunited at last). "Who knows what robots might be able to do 10 years hence," he says. "Maybe we might be able to refuel JWST somehow."

If the spacecraft could be refuelled, the next limiting factor would be the gradual degradation of the spacecraft's sunshade, a set of thin plastic sheets that protect its instruments from sunlight. Micrometeorites are expected to poke holes in it, letting light through and eventually making it impossible to keep the spacecraft cool enough for observations.

Spray on

It might be possible for a robot to fix this, too, Mountain says. "You could imagine a robot with a camera with some kind of spray-on stuff that it puts on as it goes around," he says. "You could imagine this thing whizzing around and when it finds a hole, it squirts something in it."

Replacing the spacecraft's instruments would be much more difficult, and is beyond current abilities, he says, but he does not rule it out for the future. "We should never underestimate the imagination and ingenuity of engineers to find ways to fix things that go wrong," he says, noting that the upcoming Hubble servicing mission will involve changing some parts that were not designed to be accessible.

It is hoped that there will be no problems deploying JWST, Mountain says, noting that Northrop Grumman, which is building the spacecraft, has a good record of deploying things in space successfully. "But let's at least make sure we can grab the thing," he says. "The first step to being able to fix James Webb is to be able to grab it."

Our solar system started with a nudge, not a bang

* 19:00 24 May 2007

* news service

* Zeeya Merali

Our solar system came into existence with a nudge, rather than a bang, according to a meteorite analysis that rules out a popular theory for the formation of our planetary system.

Most astrophysicists believe that the solar system formed from a cloud of gas and dust when a nearby supernova exploded, compressing the dust and triggering the birth of the Sun and planets, says Martin Bizzarro of the University of Copenhagen in Denmark.

To investigate, Bizzarro and his colleagues looked for iron-60, an isotope produced by supernovae, in meteorites that formed during the first million years in the solar system's history. "To our great surprise, there was no iron-60, ruling out the supernova trigger mechanism," says Bizzarro.

The team found another isotope, aluminium-26, suggesting an alternative trigger. Aluminium-26 only forms in extremely massive stars, around 30 times the mass of the Sun, and such stars release a great amount of energy in winds loaded with aluminium-26, says Bizzarro. These winds could have buffeted the gas cloud, causing the solar system to form, he says.

There was also evidence of iron-60 in meteorites dating from a few million years later, suggesting that this massive star exploded at a later date, injecting iron-60 into the youthful solar system.

The team are now looking for evidence of other supernovae in our solar system's vicinity. "This could have been a very crowded and dynamic neighbourhood," says Bizzarro.

Journal reference: Science (vol 316, p 1178)

National Briefing | Science and Health

Report Seeks F.D.A. Regulation of Tobacco

A report from the Institute of Medicine, part of the National Academy of Sciences, urged Congress and the president to give the Food and Drug Administration the authority to regulate tobacco. The report also asked that the agency be authorized to enforce standards for nicotine reduction and to regulate claims by companies that their products reduce risk. The report said cigarettes contained carcinogens and other dangerous toxins and would be banned if federal laws did not exempt tobacco. A bill before Congress would give the F.D.A. regulatory authority, but the agency’s commissioner, Dr. Andrew C. von Eschenbach, expressed skepticism, saying that if nicotine levels were reduced, smokers would change their habits to maintain current levels. The report also called for higher tobacco taxes and a national ban on indoor smoking.

Drinking 4 or more cups of coffee a day may help prevent gout

Long-term study links increased coffee consumption to decreased risk of gout in men over age 40

Coffee is a habit for more than 50 percent of Americans, who drink, on average, 2 cups per day. This widely consumed beverage is regularly investigated and debated for its impact on health conditions from breast cancer to heart disease. Among its complex effects on the body, coffee or its components have been linked to lower insulin and uric acid levels on a short-term basis or cross-sectionally. These and other mechanisms suggest that coffee consumption may affect the risk of gout, the most prevalent inflammatory arthritis in adult males.

To examine how coffee consumption might aggravate or protect against this common and excruciatingly painful condition, researchers at the Arthritis Research Centre of Canada, University of British Columbia in Canada, Brigham and Women’s Hospital, Harvard Medical School, and Harvard School of Public Health in Boston conducted a prospective study of 45,869 men over age 40 with no history of gout at baseline. Over 12 years of follow-up, Hyon K. Choi, MD, DrPH, and his associates evaluated the relationship between the intake of coffee and the incidence of gout in this high-risk population. Their findings, featured in the June 2007 issue of Arthritis & Rheumatism, provide compelling evidence that drinking 4 or more cups of coffee a day dramatically reduces the risk of gout for men.

Subjects were drawn from an ongoing study of some 50,000 male health professionals, 91 percent white, who were between 40 and 75 years of age in 1986 when the project was initiated. To assess coffee and total caffeine intake, Dr. Choi and his team used a food-frequency questionnaire, updated every 4 years. Participants chose from 9 frequency responses – ranging from never to 2 to 4 cups per week to 6 or more per day – to record their average consumption of coffee, decaffeinated coffee, tea, and other caffeine-containing comestibles, such as cola and chocolate.

Through another questionnaire, the researchers documented 757 newly diagnosed cases meeting the American College of Rheumatology criteria for gout during the follow-up period. Then, they determined the relative risk of incident gout for long-term coffee drinkers divided into 4 groups – less than 1 cup per day, 1 to 3 cups per day, 4 to 5 cups per day, and 6 or more cups per day – as well as for regular drinkers of decaffeinated coffee, tea, and other caffeinated beverages. They also evaluated the impact of other risk factors for gout – body mass index, history of hypertension, alcohol use, and a diet high in red meat and high-fat dairy foods among them – on the association between coffee consumption and gout among the study participants.

Most significantly, the data revealed that the risk for developing gout decreased with increasing coffee consumption. The risk of gout was 40 percent lower for men who drank 4 to 5 cups a day and 59 percent lower for men who drank 6 or more cups a day than for men who never drank coffee. There was also a modest inverse association with decaffeinated coffee consumption. These findings were independent of all other risk factors for gout. Tea drinking and total caffeine intake were both shown to have no effect on the incidence of gout among the subjects. On the mechanism of these findings, Dr. Choi speculates that components of coffee other than caffeine may be responsible for the beverage’s gout-prevention benefits. Among the possibilities, coffee contains the phenol chlorogenic acid, a strong antioxidant.

While not prescribing 4 or more cups a day, this study can help individuals make an informed choice regarding coffee consumption. "Our findings are most directly generalizable to men age 40 years and older, the most gout-prevalent population, with no history of gout," Dr. Choi notes. "Given the potential influence of female hormones on the risk of gout in women and an increased role of dietary impact on uric acid levels among patients with existing gout, prospective studies of these populations would be valuable."

Aggressive treatment for whiplash does not promote faster recovery

Whiplash, the most common traffic injury, leads to neck pain, headache and other symptoms, resulting in a significant burden of disability and health care utilization. Although there are few effective treatments for whiplash, a growing body of evidence suggests that the type and intensity of treatment received shortly after the injury have a long-lasting influence on the prognosis. A new study published in the June 2007 issue of Arthritis Care & Research examined whether the association between early types of care and recovery time shown in an earlier study was reproducible with whiplash compensated under tort insurance.

A previous study led by Pierre Côté, of the University of Toronto in Toronto, Canada, found that patients compensated under no-fault insurance had a longer recovery if they visited general practitioners numerous times and/or consulted chiropractors or specialists than if they just visited general practitioners once or twice. In the current study, the authors examined patterns of care for 1,693 patients with whiplash injuries who were compensated under tort insurance.

The results showed that increasing the intensity of care to more than 2 visits to a general practitioner, 6 visits to a chiropractor, or adding chiropractic care to general practitioner care was associated with slower recovery. "The results agree with our previous analysis in a cohort of patients compensated under a no-fault insurance scheme and support the hypothesis that the prognosis of whiplash injuries is influenced by the type and intensity of care received within the first month after injury," the authors state.

They note that effective care, if medically needed, improves the prognosis of patients and that practice guidelines recommend treatment shortly after the injury. However, it may be that doctors responding to pressure from patients use treatments, schedule follow-up visits and refer patients to specialists when not medically needed. "This in turn may lead to adverse outcomes and even prolong recovery by legitimizing patients’ fears and creating unnecessary anxiety," according to the authors. It is also possible that early aggressive treatment delays recovery by encouraging the use of passive coping strategies. "Reliance on frequent clinical care, a form of passive coping strategy, may have a negative effect on recovery by reinforcing the patients’ belief that whiplash injuries often lead to disability," the authors state. They cite another study showing that whiplash patients who used coping strategies such as wishing for pain medication or believing that they couldn’t do anything to lessen the pain had a slower recovery than those who did not use such strategies.

Unlike the previous study, the current one did not show a slower recovery for patients who consulted a general practitioner and a specialist. This suggests that the insurance system (tort versus no-fault) can affect the association between certain patterns of care and recovery because it may influence how patients perceive their medical needs, the pressure they put on clinicians to be referred, and how insurers require them to legitimize their injury. The authors conclude that further trials "are essential to understand the influence of health care provision in preventing or facilitating disability."

Adult stem cells from human umbilical cord blood successfully engineered to make insulin

GALVESTON, Texas -- In a fundamental discovery that someday may help cure type 1 diabetes by allowing people to grow their own insulin-producing cells for a damaged or defective pancreas, medical researchers here have reported that they have engineered adult stem cells derived from human umbilical cord blood to produce insulin.

The researchers announced their laboratory finding, which caps nearly four years of research, in the June 2007 issue of the medical journal Cell Proliferation, posted online this week. Their paper calls it "the first demonstration that human umbilical cord blood-derived stem cells can be engineered" to synthesize insulin.

"This discovery tells us that we have the potential to produce insulin from adult stem cells to help people with diabetes," said Dr. Randall J. Urban, senior author of the paper, professor and chair of internal medicine at the University of Texas Medical Branch at Galveston and director of UTMB’s Nelda C. and Lutcher H. J. Stark Diabetes Center. Stressing that the reported discovery is extremely basic research, Urban cautioned: "It doesn’t prove that we’re going to be able to do this in people — it’s just the first step up the rung of the ladder."

The lead author of the paper, UTMB professor of internal medicine/endocrinology Larry Denner, said that by working with adult stem cells rather than embryonic stem cells, doctors practicing so-called regenerative medicine eventually might be able to extract stem cells from an individual’s blood, then grow them in the laboratory to large numbers and tweak them so that they are directed to create a needed organ. In this way, he said, physicians might avoid the usual pitfall involved in transplanting cells or organs from other people — organ rejection, which requires organ recipients to take immune-suppressing drugs for the rest of their lives.

Huge numbers of stem cells are thought to be required to create new organs. Researchers might remove thousands of donor cells from an individual and grow them in the laboratory into billions of cells, Denner explained. Then, for a person with type 1 diabetes, researchers might engineer these cells to become islets of Langerhans, the cellular masses that produce the hormone insulin, which allows the body to utilize sugar, synthesize proteins and store neutral fats, or lipids. "But we’re a long way from that," Denner warned.

Denner said this research, which reflects a fruitful collaboration with co-authors Drs. Colin McGuckin and Nico Forraz at the University of Newcastle Upon Tyne in the United Kingdom, used human umbilical cord blood because it is an especially rich source of fresh adult stem cells and is easily available from donors undergoing Caesarian section deliveries in UTMB hospitals. "However," he added, "embryonic stem cell research was absolutely necessary to teach us how to do this."

Embryonic stem cells have been engineered to produce cardiac, neural, blood, lung and liver progenitor cells that perform many of the functions needed to help replace cells and tissues injured by many diseases, the paper notes. Among the insights into cell and tissue engineering gained from work with embryonic stem cells, it adds, are those "relevant to the engineering of functional equivalents of pancreatic, islet-like, glucose-responsive, insulin-producing cells to treat diabetes."

The researchers said they tested adult stem cells in the laboratory to ensure that they were predisposed to divide. Then they used a previously successful method in which complex signals produced by the embryonic mouse pancreas were used to direct adult stem cells to begin developing, or "differentiating," into islet-like cells.

As they grew these adult stem cells in the laboratory, the researchers conducted other tests in which the cells to be engineered showed evidence of a characteristic, or marker, known as SSEA-4 that was previously thought to exist only in embryonic cells. They also found that, just as embryonic cells have been shown to do, these adult stem cells produced both C-peptide, a part of the insulin precursor protein, and insulin itself. Confirming the presence of the C-peptide was especially crucial, the researchers suggested, because although insulin is often found in the growth media with which the cells are nurtured and is often taken up by such cells, the presence of the C-peptide proves that at least some of the insulin was produced, or synthesized, by the engineered cells.

Using drugs as weapons 'unsafe'

UK doctors fear public safety could be compromised by the growing interest of world governments in using drugs for law enforcement.

A report by the British Medical Association points to the example of the Moscow theatre siege of October 2002 where over 120 hostages died.

The Russian authorities had used a drug delivered through the air-conditioning system to end the siege.

Medics argue the innocent are inevitably harmed alongside criminals.

Indiscriminate

It is impossible to deliver the right drug in the right dose to the right individuals in a way that is both effective and does not cause significant deaths, the BMA's Board of Science concludes.

The anaesthetic drug used in Moscow killed one in six of those present in the theatre.

It warns that using powerful drugs in this way may constitute a violation of international conventions which prohibit the use of chemical weapons.

And future advances in drug development may spawn more sophisticated and sinister agents, it says.

Dr Vivienne Nathanson, head of science and ethics at the BMA, said their concern surrounded drugs that could poison and kill at the wrong dose, rather than less harmful agents used for riot control such as tear gas.

She explained: "It is disingenuous of governments to describe drugs as non-lethal - there is no difference between a drug and a poison except the dose.

"It is virtually impossible to control the amount of a drug delivered or to ensure it acts without producing toxic effects or causing death."

Vigilance

She said doctors needed to be aware that their medical knowledge might be called upon for the development of drugs for military purposes, as well as antidotes and treatments.

She urged medics to advocate against the use of drugs for law enforcement and not be involved in the training of military or law enforcement personnel in the administration of drugs as weapons.

According to the report, experts in some countries, including the United States and China, are pushing for legislation to allow the use of chemical weapons beyond the current narrow definition of riot control.

Dr Nathanson said: "It is absolutely essential that we do not allow an extension of the use of chemical weapons or a re-writing of the law that bans them. If we do, that will put all of us at risk."

'Living plugs' smooth ant journey

A scientific study of the teamwork of army ants has discovered how they are prepared to let their fellow ants walk all over them to get the job done.

Scientists from the University of Bristol observed that, when ants were foraging on rough terrain, some of them used their own bodies to plug potholes.

They even chose which of them was the best fit to lie across each hole.

The flatter surface provided the rest of the group, which can number 200,000, with a faster route between prey and nest.

The research, published in the journal Animal Behaviour, said that the team first noticed the unusual behaviour of the army ants (Eciton burchellii) in the insects' native rainforest home in Panama.

To investigate this further, the researchers inserted wooden planks, drilled with a variety of different sized holes, into the army ants' trails.

They found that the ants did indeed plug the holes, but the team also discovered that individuals would size-match themselves to a hole for the best fit.

The ants plug gaps to smooth the trail

Wobbling about

"The ants have a very large size range within their colony, measuring from 2mm up to 1cm (0.08-0.4in)," explained Dr Scott Powell, a biologist at the University of Bristol and an author of the paper.

"When the ants bump into a hole they cannot cross, they edge their way around it and then spread their legs and wobble back and forth to check their fit.

"If they are too big, then they carry on and another ant will come along and measure itself in the same way. This carries on until an appropriately sized ant plugs the hole."

At this point, Dr Powell told the BBC News website, the ant becomes a "living surface" remaining in place for hours at a time while thousands of foragers walk back and forth across the trail.

"At the end of the day, when the traffic eventually diminishes, the ant that forms this motionless plug will detect that and pop out of the hole and run home," Dr Powell said.

The scientists found ant-plugged smoother surfaces speeded up the route from prey to nest and also increased the daily prey intake, which for army ants consists of other species of ants and other bugs.

Dr Powell said: "Broadly, our research demonstrates that a simple but highly specialised behaviour performed by a minority of ant workers can improve the performance of the majority, resulting in a clear benefit for the society as a whole."

Co-author Professor Nigel Franks, also from the University of Bristol, added: "I think every road user who has ever inwardly cursed as their vehicle bounced across a pothole - jarring every bone in their body - will identify with this story.

"When it comes to rapid road repairs, the ants have their own do-it-yourself highways agency."

Tracks in ancient lake show dinosaurs swam

Twelve footprints found in the bed of an ancient lake in northern Spain have thrown up the first compelling evidence that some land dinosaurs could swim, say researchers.

The 15-metre-long (49 feet) track in sandstone "strongly suggests a floating animal clawing the sediment" as it swam against a current, they say. The creature is believed to have been a theropod – a member of a large group of carnivorous dinosaurs – that lived in the Early Cretaceous, 125 million years ago.

The track sequence in the former lakebed consists of six asymmetrical pairs of two or three S-shaped scratch marks. Each set measures about 50 centimetres (20 inches) in length and 15 cm (6 in) in width.

The prints paint a picture of a large, buoyant dinosaur whose clawed feet raked the sediment as it swam in a depth of about 3.2 metres (10.4 feet) of water, according to the team. Ripple marks on the surface of the rocks indicate the dinosaur was swimming against a current, and struggling to maintain a straight path, they say.

"The dinosaur swam with alternating movements of the two hind limbs, a pelvic paddle swimming motion," says Loïc Costeur, one of the team at the University of Nantes, France. "It is a swimming style of amplified walking, with movements similar to those used by modern bipeds, including aquatic birds."

Scratch marks made by a swimming theropod. A to D are left-side claw prints, E to H are right-side claw prints. The marked asymmetry suggests the dinosaur was probably struggling to maintain a straight path. The scale bar is 10 centimetres (Image: Loic Costeur)

New niches

The question as to whether dinosaurs could swim has been debated for years. But until now, no conclusive evidence had come to light.

Previously discovered fossils showed swimming tracks apparently left by other dinosaurs such as sauropods - long-necked animals like Diplodocus. But some of these have been disputed and were not as detailed as the new ones.

The new tracks provide the first definitive evidence of active swimming behaviour in dinosaurs and are the best record of swimming by theropods, the researchers say.

Asked to speculate as to which theropod may have made the tracks, Costeur cautiously pointed to Allosaurus - a bipedal carnivorous dinosaur with a large skull balanced by a long, heavy tail. Some allosaurs could reach more than 10 metres (33 feet) in length.

The discovery opens up new avenues in dinosaur research, said Costeur. Computer modelling will be able to reveal more about anatomy and biomechanics, "as well as our view of the ecological niches in which they lived."

The theropod swam on the shores of a Cretaceous lake at Cameros, Spain (Image: Guillaume Suan, University Lyon)

The Virgen del Campo tracks are located at the Cameros Basin in La Rioja, Spain, at the site of a delta to a former lake. The basin is already known as a treasure trove of footprints of walking theropods.

Journal reference: Geology (vol 35, p 507)
