Tuesday, August 30, 2011

News in Brief: Genes & Cells

A family without fingerprints and the long-term harm of sleep skimping in this week’s news
Web edition: Friday, August 19th, 2011

No fingerprints
Members of a Swiss family who lack fingerprints have a mutation in a gene called SMARCAD1, Eli Sprecher of Tel Aviv University in Israel and colleagues report. The mutation, which also reduces the density of sweat glands in the hands, leads to a problem with a version of the gene’s protein that is made only in the skin, the team writes in the August 12 American Journal of Human Genetics. The researchers don’t yet know whether the gene is involved in creating the pattern of fingerprints or just in building the skin ridges that make up the print. —Tina Hesman Saey

Chronic sleep loss can cause permanent harm
Repeatedly skimping on sleep can add up to permanent health damage, a new study in rats suggests. Carol Everson and Aniko Szabo of the Medical College of Wisconsin in Milwaukee restricted rats’ sleep in repeated 10-day bouts, with two days in between to rest up. The rats lost weight despite chowing down on far more food and water than usual. The animals’ hormones were messed up, and they developed other problems as well, the researchers report August 11 in PLoS ONE. Even after four months of recovery, the rats still had hormone imbalances and were eating 20 percent more and drinking 35 percent more than rested rats. —Tina Hesman Saey


Found in: Genes & Cells


Monday, August 29, 2011

Antidepressants show signs of countering Alzheimer’s

Mice and human data link treatment to less plaque in the brain
Web edition: Monday, August 22nd, 2011

Widely used antidepressants may reduce the ominous brain plaques associated with Alzheimer’s disease, a new study in mice and humans finds.

Brain scans of people who have taken antidepressants reveal fewer clumps of the protein amyloid-beta, a target of Alzheimer’s prevention strategies, than scans of people who have not taken the drugs.

Many in the field voiced caution about the results. But if borne out by further study, the findings may point to a new, relatively safe way to treat and prevent Alzheimer’s disease, which is the sixth leading cause of death in the United States.

“I think this is a wonderful piece of news, and I think there’s going to be a lot of excitement about this,” says internist Michael Weiner, who leads the Alzheimer’s Disease Neuroimaging Initiative at the Veterans Affairs Medical Center campus of the University of California, San Francisco. “It points the way towards a possible approach to treating Alzheimer’s disease that people have not been talking about very much.”

In the study, mice genetically engineered to overproduce amyloid-beta, or A-beta, were given one of three selective serotonin reuptake inhibitors, a class of antidepressants that boost circulating levels of the chemical messenger serotonin in the brain. After a single dose of the antidepressants, A-beta levels dropped in the fluid that surrounds mouse brain cells, researchers report online the week of August 22 in the Proceedings of the National Academy of Sciences. A full day after receiving the drug, the mice’s A-beta levels fell by nearly a quarter.

Long-term, chronic administration of the drug had a larger effect. Engineered mice that took the SSRI citalopram for four months had about half the A-beta plaques in their brains as mice that hadn’t had the drug. This reduction seems to happen through a protein called ERK, which serves as the middleman between brain cells’ serotonin-sensing proteins and A-beta production.

Figuring out the details of this process may open the door for developing new ways to prevent A-beta buildup, says study coauthor John Cirrito of the Washington University School of Medicine in St. Louis.

To see if a similar effect might be happening in people, the scientists scanned the brains of 186 cognitively normal elderly people and looked for signs of A-beta plaques. The team used a compound called PIB that binds to big clumps of A-beta in the brain and glows on a PET scan.

Of these participants, 52 reported that they had taken an antidepressant in the last five years. These people, researchers found, had about half the A-beta load in their brains as the people who hadn’t taken an antidepressant. What’s more, the length of time the participants took the drugs correlated inversely with the density of A-beta plaques in the brain — the longer the antidepressant use, the less plaque.

“We think there are influences going in two opposite directions,” says study coauthor and psychiatrist Yvette Sheline, also of Washington University. “We think depression pushes you toward dementia, but antidepressant treatment pushes you toward protection.”

Finding similar results in mice and humans lends the study credibility, Weiner says. “When you have animal data and human data coming together, then you start to get really excited,” he says.

Still, Weiner and others caution that it would be premature to conclude that antidepressants protect against A-beta buildup or that fewer plaques necessarily translate into less disease.

The study uncovered an association — not a clear-cut cause and effect, Weiner notes. “We cannot say with certainty that the reason why people who took the SSRIs have lower cortical amyloid is due to the fact that they took SSRIs,” he says.

And molecular neuroscientist Heather Snyder of the Alzheimer’s Association in Chicago points out that even if antidepressants are shown to reduce A-beta, scientists still don’t know how A-beta levels affect the brain. “We don’t really know what modulating amyloid will do to cognition,” she says. “And we don’t know if we need to reduce it by 10 percent or 20 percent, or if it needs to be completely reversed.”

Another confounding factor is that A-beta can take several forms in the brain, from small molecules to large, sticky clumps, and some forms may be more dangerous than others. Interpreting the A-beta clumps that PIB detects in human brain scans remains challenging.

“We’re being very cautious,” Cirrito says. “There are a lot of people on these drugs and we don’t want to get anybody overly excited without reason.” He and his colleagues plan to test whether acute doses of SSRIs change A-beta levels in the cerebrospinal fluid of healthy human subjects.

Even if the new findings are replicated in larger studies, a major question about Alzheimer’s and antidepressants remains, Sheline says. “The real question is — which this paper sheds no light on — does that mean that long-term, they [SSRI-treated people] will have less of a risk of dementia? And that’s exactly the big study that needs to be done.”


Found in: Body & Brain and Genes & Cells


News in Brief: Molecules/Matter & Energy

Metamaterial warp drives, secrets of coffee rings and more in this week's news
Web edition: Sunday, August 21st, 2011

Toy warp drive proposed
A physicist who previously created a toy universe out of metamaterials that bend light in unusual ways has now dreamed up a way to use those materials to build a warp drive. Simulations by Igor Smolyaninov at the University of Maryland in College Park show that moving faster than light is still impossible in toy universes, just as it is in the real one. But the right material should allow superspeed travel at up to one-quarter light speed — by riding in a moving spacetime bubble, he reports in an upcoming issue of Physical Review B. —Devin Powell

Butterflies sip like sponges
To sip nectar, a butterfly uses a proboscis that looks like a straw but also works like a sponge. At the small scales of butterfly existence, liquid is just too thick to be slurped, a team led by researchers at Clemson University in South Carolina reports. X-ray images of butterfly feeding tubes reveal pores that draw fluid upward by capillary action, the same process that pulls water through a paper towel. This anatomy may help butterflies dine on a greater variety of foods, and its principle could be borrowed to design probes that sample the liquid inside cells, the researchers report online August 17 in the Journal of the Royal Society Interface. —Devin Powell

Coffee ring regulation
Starbucks tables everywhere rejoice: There’s a way to prevent the coffee ring effect. The quiet drying of spilled coffee on a countertop produces a crusty edge because round particles flee in a frenzy from the center of the spill. But oblong particles in the liquid can’t flee that well and end up spreading out uniformly, University of Pennsylvania scientists report in the Aug. 18 Nature. Upping the number of elongated particles compared to spherical ones can prevent ring formation, suggesting that particle shape influences fluid interactions. The finding could help in designing better inks, paints and even foams and lotions. —Rachel Ehrenberg


Found in: Matter & Energy and Molecules


Sunday, August 28, 2011

News in Brief: Body & Brain

Leukemia gene therapy, the brain tickle of beautiful voices and more in this week's news
Web edition: Thursday, August 18th, 2011

Gene therapy for leukemia
Tweaking immune cells to attack cancer cells in leukemia patients can bring about remission, a small study shows. Scientists at the University of Pennsylvania genetically altered immune T cells to target malignant cells in chronic lymphocytic leukemia patients and mass-produced the T cells before injecting them into three patients. The modified cells gravitated to bone marrow, where they killed malignant cells. In two of the three patients tested, the cancer went into remission, and a portion of the genetically modified T cells persisted, possibly as a cadre of defenders on standby. The researchers report the findings in the Aug. 10 Science Translational Medicine. —Nathan Seppa

Gout drug lessens flares
A drug approved in 2010 for treatment of gout has proved its worth, reducing symptoms over six months in many people who had failed to get relief from standard medications. The routine therapy for gout fails in about 3 percent of the 5 million to 6 million people in the United States who have it. A drug called pegloticase, which is given by intravenous infusion over two hours, gained approval last year for chronic gout. Two clinical trials in the United States, Canada and Mexico of people who had failed to improve on standard drugs now show that 65 of 169 patients getting pegloticase (Krystexxa) every two or four weeks for six months had reduced uric acid in the blood, a standard measure. The biweekly group had better symptom reduction. —Nathan Seppa

Live longer with exercise
Just 15 minutes of moderate daily exercise seems to extend life. A team of U.S. and Taiwanese researchers kept track of physical activity levels in more than 400,000 adults age 20 and older in Taiwan using questionnaires. Compared with sedentary people who didn’t exercise, those putting in 92 minutes a week — 15 minutes a day on average — were 14 percent less likely to die over an average follow-up period of eight years. The benefits applied to both sexes and to all age groups, the researchers report online August 16 in the Lancet.  —Nathan Seppa

Granddaddy’s stress changes grandson’s brain
What your grandfather experienced in the womb may change your brain. Grandfather mice who were stressed out in utero went on to produce grandsons with more feminine brains, a study in the Aug. 17 Journal of Neuroscience shows. In male descendants of stressed-out grandfathers, genes important for brain development switched their behavior to become more like the gene activity in female mice’s brains. These results may offer a way to link stress and neurological disorders that strike males and females differently, such as autism spectrum disorders, Christopher Morgan and Tracy Bale of the University of Pennsylvania write in the study. —Laura Sanders


Beautiful voices tickle the brain
Attractive voices tickle the part of the brain that normally handles visual input, a new study finds. In the study, participants listened to different voices saying “had” and later rated how attractive the voices were. Voices rated more attractive were associated with greater brain activity in a region near the part of the brain that responds to faces, an international team of scientists reports in an upcoming Cerebral Cortex. That the brain detects and responds to vocal beauty suggests that people may be tuned in to hidden, nonverbal forms of communication. —Laura Sanders

Why autistic brains confuse pronouns
The brains of people with autism behave abnormally when grappling with pronouns such as “you” and “I.” While answering a question that contained the word “you,” adults with autism had a weaker connection between two key brain regions than unaffected participants, Akiko Mizuno of Carnegie Mellon University in Pittsburgh and colleagues report in an upcoming Brain. This weak brain connection vanished when the question omitted pronouns and instead used people’s names. The results help explain why children with autism often have trouble with the concept of self-identity, sometimes referring to themselves as “you.” —Laura Sanders


Found in: Body & Brain


News in Brief: Humans

Prehistoric assembly lines, a trigger for riots and more in this week's news
Web edition: Monday, August 22nd, 2011

Early start for advanced tools
Large-scale production of sophisticated stone tools, using standardized assembly steps, emerged a surprisingly long time ago. Between 400,000 and 200,000 years ago, unidentified members of the genus Homo regularly made slender, sharp-edged blades and other animal-butchery implements at Qesem Cave in what’s now Israel, say archaeologist Ron Shimelmitz of Tel Aviv University and his colleagues. Analyses of thousands of Qesem stone artifacts, and experimental reproductions of these finds, indicate that the implements were made in stages requiring as much planning as Neandertal tools that didn’t appear until 200,000 years ago, the researchers will report in the Journal of Human Evolution.  —Bruce Bower

A fate worse than death
Patients in persistent vegetative states often get tagged as having less mental capacity than the dead. Ending up in biological limbo is also regarded as a fate worse than death, say psychologist Kurt Gray of the University of Maryland in College Park and his colleagues. Religious and non-religious participants alike attributed less mental activity to vegetative patients than to the dead, due to afterlife beliefs and a tendency to assume that deceased but unseen people still have minds, the scientists will report in Cognition. Religious people also usually advocated life support for vegetative patients despite regarding such states as worse than death. —Bruce Bower

Food fights
Violent protests in North Africa and the Middle East in 2008 and 2011 coincided with large spikes in global food prices, a new study shows. The analysis, which spans January 1990 to May 2011, suggests that high global food prices are a precipitating condition for social unrest, say Yaneer Bar-Yam and colleagues at the New England Complex Systems Institute in Cambridge, Mass. Their study, posted at arXiv.org on August 11, also calculates a food price threshold above which vulnerable populations typically find themselves in desperate straits. This indicator could help guide policy interventions. —Rachel Ehrenberg


Found in: Humans


Saturday, August 27, 2011

Science & the Public: Blacks far less likely than whites to land NIH grants

Among minority scientists applying for National Institutes of Health research grants, blacks alone face a substantially lower likelihood of being successful than whites, a new study finds. This investigation, which was prompted by the research agency itself, will catalyze further probes and a host of changes, promises NIH director Francis Collins.

The findings emerge from a study published online Aug. 18 in Science. Donna Ginther of the University of Kansas in Lawrence and her colleagues probed the success rates of Ph.D. investigators working at U.S. institutions in winning NIH grants between 2002 and 2006. Over that time, more than 40,000 individuals submitted a total of 83,188 applications. Each hoped for the opportunity to dip deeply from a pool of funds averaging about $1.4 billion annually.

Researchers who described themselves as Hispanic had grant success rates comparable to whites. Initially, Asians appeared to have a 4 percentage-point lower success rate than whites. But when Ginther’s group reanalyzed the data, restricting the applicants to U.S. citizens, the Asian disadvantage vanished. (This suggests, Collins says, that applicants who didn’t fare well might have been non-native English speakers facing a language barrier in articulating their ideas cogently.)

But nothing erased the black disadvantage, Ginther’s team found.

“These data are deeply troubling,” Collins says. “Even after controlling for all of the factors that were considered to be important for predicting success” — like education and NIH training, the labs in which they ended up working, the number of research papers they’ve authored — “black applicants were 10 percentage points less likely than white applicants to receive research grants.”

Numerically, he told reporters, that translates to about 27 percent of white researchers’ grant applications winning funding, compared with only about 17 percent of those submitted by blacks.
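
Those two figures express the same gap in two ways: ten percentage points in absolute terms, or (equivalently) a roughly one-third lower relative chance of funding. A minimal Python sketch of that arithmetic, using only the approximate rates quoted above rather than any raw data from the study:

```python
# Approximate funding rates quoted by Collins (illustration only).
white_rate = 0.27  # ~27 percent of proposals from white applicants funded
black_rate = 0.17  # ~17 percent of proposals from black applicants funded

# The gap in absolute terms: percentage points.
gap_points = (white_rate - black_rate) * 100
print(f"Absolute gap: {gap_points:.0f} percentage points")  # -> 10

# The same gap in relative terms: a black applicant's proposal is funded
# about 37 percent less often than a white applicant's.
relative_gap = 1 - black_rate / white_rate
print(f"Relative gap: {relative_gap:.0%}")  # -> 37%
```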

“It is certainly a situation that we all agree is unacceptable and requires intervention,” Collins says.

On the drawing board
He and Lawrence Tabak, the principal deputy director of NIH, broadly outline an action plan to deal with the problem in a report that also appears online in Science. Details, however, await the findings of a pair of new advisory groups that Collins created.

Tabak co-chairs one of these: a diversity in biomedical research working group. He reports that in its initial teleconference, the group discussed how to probe obstacles to the recruitment of first-rate applications from minority scientists and to ensure that those proposing the best science have an equally high chance of landing NIH funds.

“This is a committee that is not designed to produce a glossy report that will sit on shelves,” Tabak says, but rather to provide Collins “tangible action items” by next June.

One issue NIH plans to take on immediately: how to adequately “blind” grant reviewers to applicants so that any chance of bias is minimized — without also eliminating important details that might reflect the quality of a scientist’s research or resources.

NIH already strips off a researcher’s stated race or ethnicity from a grant proposal before reviewers judge its merit. However, names and the background of the principal investigator remain — which in some instances may offer clues, such as a first name like Kwame or L’Shaniqua, or a reference to an applicant having attended a historically black college.

In an experiment set to begin soon, Collins promised to investigate whether further blinding is needed. His agency will simultaneously subject some collections of grant applications to two review panels, then compare their results. One panel will receive applications that contain the same information as in the past; the other will receive proposals from which all names and identifying characteristics have been removed.

Boosting help, enriching opportunities
While the magnitude of the black disadvantage might have surprised NIH, the agency has long known that blacks play a bit role in biomedical research. They constitute 10.2 percent of the U.S. population, Collins notes, yet only 1.2 percent of researchers leading projects financed by NIH (through investigator-led, or R01, grants).

The newly quantified disadvantage highlighted by Ginther’s team might reflect weaker training of black researchers on how to craft a winning application, less access to mentoring on research design, some subtle bias on the part of grant-review committees — or a combination of all these. Collins vows his new advisory groups will look into each.

For instance, not all grant applications are scored for quality. Those that reviewers feel will fall within the lower half are just rejected.

Ginther’s group now reports that among grant proposals that were scored highly, race and ethnicity had no impact on funding success. This suggests that blacks may face some disadvantage in drafting a compelling proposal.

Collins now promises NIH will soon offer extra assistance to inexperienced grant applicants and be “supporting innovative approaches to encourage more extensive and effective local mentoring of junior faculty” by the academic institutions at which they work. He adds that his new advisory groups will also consider how NIH might encourage researchers who weren’t successful the first time around to reapply for a grant, since most successful applicants have to try a few times before they land one — and blacks are less likely to reapply.

Sitting in on grant deliberations should help young faculty of all ethnic backgrounds get a better taste of how research proposals are judged and what’s most likely to wow their peers. Scientists recruited to review grants, however, tend to be older, more experienced individuals. Collins aims to change this by actively recruiting “promising junior faculty” to become reviewers: “We aim to have 50 of these early career reviewers assigned to each of NIH’s three rounds of grant reviews in the 2012 fiscal year, which begins Oct. 1.”

At a briefing for reporters, Tabak noted that NIH has already begun sharing what it’s learned with officials at the National Science Foundation, Department of Defense, Energy Department and Education Department. From those discussions, he says, it sounds like these agencies are also open to investigating whether racial or ethnic disparities may exist among applicants for their grants.


Found in: Biomedicine and Food Science


News in Brief: Atom & Cosmos

Getting to supernova
White dwarf stars often don’t waltz together when they die. A new survey of 41 “Type Ia” supernovas, the explosions these dwarfs die in when they suck too much material off another star, found sodium gas flowing away from many of the explosions. But white dwarfs are mostly made of carbon and oxygen, so the sodium gas was probably thrown off by an ordinary or giant star. That suggests that the dwarfs weren’t eating their own kind. Knowing how Type Ia supernovas form is important because astronomers use their brightness to help measure cosmic distances. An international team of researchers reports the discovery in the Aug. 12 Science. —Alexandra Witze

Sunspots rising
Magnetic fields lurking deep beneath the sun’s surface could signal the imminent emergence of sunspots. Using the Solar and Heliospheric Observatory satellite, scientists have spotted such fields, much stronger than models had predicted, up to 65,000 kilometers below the surface. Over the course of a day or two, the magnetic disturbances rise upward and eventually trigger the formation of sunspots. Knowing sunspots are coming could help people better prepare for telecommunications and other outages caused by space weather, Stathis Ilonidis of Stanford University and colleagues write in the Aug. 19 Science. —Alexandra Witze


Found in: Atom & Cosmos


Butterfly species a master of disguise

Tasty Heliconius numata butterflies (right) mimic the wing patterns of foul-tasting Melinaea butterflies (left) to avoid getting eaten. Genetic tricks using a supergene allow the butterflies to faithfully masquerade as other species. © Mathieu Joron

The tasty Heliconius numata butterfly evades predators by copying the wing patterns of foul-tasting Melinaea butterflies. H. numata, also known as the passion-vine butterfly, has to get the pattern exactly right or a sharp-eyed bird will spot the fake and gobble it up.

Born mimics, members of the species lock in wing patterns with flipped-around bits of DNA, Richard ffrench-Constant of the University of Exeter in England and colleagues report online August 14 in Nature.

The flipped DNA causes six or more genes — on a section of a chromosome important in setting wing patterns in butterflies and peppered moths — to be inherited as a single unit, a supergene. Different versions of the supergene allow H. numata to adopt seven different wing patterns reminiscent of several bad-tasting species.

Other Heliconius butterflies mimic only one nonpalatable species. Researchers aren’t sure if other butterfly species use DNA flipping to determine their patterns. 


Found in: Genes & Cells and Life


Friday, August 26, 2011

Stress spears deployed service personnel

LAS VEGAS — Soldiers fighting at the tip of the spear — the leading edge of combat — confront fighting, suffering and dying. But the success of those soldiers’ operations depends on a huge network of service and support personnel who themselves face considerable and often overlooked war stress, says military sociologist Wilbur Scott of the U.S. Air Force Academy in Colorado Springs.

After returning from one or more deployments, National Guard combat service personnel — including clerks, truck drivers, medics and supply officers — displayed slightly less emotional resilience and described having experienced more stress while overseas and after returning home than their comrades engaged in combat, Scott reported August 20 at the annual meeting of the American Sociological Association.

In particular, combat service personnel cited deployment stress triggered by exposure to danger, life-threatening situations and death.

Their responses reflect the changed nature of warfare, Scott suggested. In Iraq and Afghanistan, counterinsurgency efforts have replaced conventional warfare. “While those in combat arms typically are thought of as being at the tip of the spear, this thinking applies more accurately to conventional settings rather than those encountered in Iraq and Afghanistan,” Scott said.

Combat units not only fight and kill but establish relationships with local officials, head local building projects and encourage trust in local governments. Service personnel work in the midst of operations, where they can encounter guerilla attacks or roadside bombs.

Poor coping upon returning from National Guard deployment — whether among former service personnel or combat troops — usually involved excessive alcohol drinking, abuse of prescription drugs and carrying a gun for protection, Scott said. Those behaviors are a potentially deadly mix.

Such findings underscore the need to provide programs that ease veterans back into civilian life, commented military sociologist Bradford Booth of ICF International, a private research and consulting firm in Fairfax, Va. Booth directed a pilot study of a government-funded program for 4,000 National Guard members returning home in 2007 from a year in Iraq.

In that study, those who attended three once-a-month training sessions — which focused on mental health issues, strengthening marriages, financial counseling and other topics — displayed signs of adjusting better than vets who declined to attend training.

Although traditionally regarded as “weekend warriors,” National Guard volunteers now regularly get deployed for up to one year in Iraq and Afghanistan.

Scott and his colleagues surveyed 1,460 Army National Guard soldiers, including 969 deployed solely to war zones in Afghanistan or Iraq.

Whether combat or service personnel, soldiers returning from war zones were most likely to report having grappled with and found some meaning in their military experiences. Yet they also reported the most symptoms of post-traumatic stress disorder and the worst difficulties readjusting to civilian life.

“Experiencing personal growth from going through tough times doesn’t mean you’re doing well,” Scott said.


Found in: Humans and Science & Society


Lager's mystery ingredient found

Galls — growths resulting from a fungal infection — grow on southern beech trees in Northern Patagonia. Scientists recently found a wild yeast that gave rise to a hybrid yeast used to brew lagers in such galls on trees growing in and near two national parks in Argentina. Diego Libkind

Lager beers got their start in Bavaria, but it was a little South American spice that really kicked things off.

Scientists have known for decades that a hybrid species of yeast, Saccharomyces pastorianus, is the microbe that ferments lagers. It’s also well known that one parent of S. pastorianus is the common baking and brewing yeast Saccharomyces cerevisiae. But the other parent of lager yeast has eluded scientists, who have scoured Europe and North America looking for it.

Turns out they were looking in the wrong hemisphere. An international team of researchers led by Chris Todd Hittinger of the University of Wisconsin–Madison and Diego Libkind of the Argentinean National Council for Scientific and Technical Research in Bariloche has tracked the missing wild parent of lager yeast to the beech forests of Patagonia. The researchers report the capture of the newly discovered yeast, dubbed Saccharomyces eubayanus, online the week of August 22 in the Proceedings of the National Academy of Sciences.

“I got chills reading [about] it, I was so excited,” says Barbara Dunn, a comparative geneticist at Stanford University who has been on the trail of the wild yeast herself. “It is incredibly surprising that it is from Patagonia.”

Libkind found S. eubayanus in galls — pale peach, balloonlike structures resulting from fungal infections — on Patagonian beech trees. The galls, full of sugar, house S. eubayanus and another wild yeast that ferment the sugars. It’s generally chilly in Patagonia, just the way lager yeast like it. Lagers are brewed at 4° to 9° Celsius (39° to 48° Fahrenheit).

The S. cerevisiae yeast used for making ales, wines and other alcoholic beverages don’t like the cold, preferring temperatures of about 15° to 25° C (59° to 77° F). So when Germans started brewing beers in the winter to avoid summertime contaminants such as molds, bacteria and other things that skunk beer, the new brewing conditions would have favored the creation of a hybrid lager yeast based on S. cerevisiae and a cold-loving relative, says Antonis Rokas, an evolutionary biologist at Vanderbilt University.
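
The Fahrenheit figures in these two paragraphs follow from the usual conversion F = C × 9/5 + 32; a quick sketch that reproduces the quoted ranges:

```python
def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

# Fermentation ranges quoted above for lager vs. ale yeasts.
for label, lo, hi in [("lager", 4, 9), ("ale", 15, 25)]:
    print(f"{label}: {lo}-{hi} C = {c_to_f(lo):.0f}-{c_to_f(hi):.0f} F")
# lager: 4-9 C = 39-48 F
# ale: 15-25 C = 59-77 F
```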

The surprise is where that partner came from. Bavarian brewers started making cold-brewed beers in the 15th century, before Columbus crossed the Atlantic (although the lagers didn’t have a big breakout in popularity until much later). S. eubayanus probably hopped a ship for Europe sometime early in the 16th century. The researchers aren’t sure exactly how S. eubayanus got to Bavaria; perhaps by hitching a ride on pieces of beech wood or barrels made of beech, or on fruit or even in the belly of a fruit fly. However it got to Europe, when S. eubayanus arrived it found a ready-made niche and a partner to merge with, Rokas speculates.

It’s also possible that S. eubayanus lives or lived in some forgotten pocket of the Old World, Hittinger says. “Obviously we haven’t searched every habitat on the entire globe.”

Knowing the identity of the wild parent may help scientists learn how the lager hybrids formed and how domestication genetically changed the yeast, Rokas says. Brewers may also be able to create new hybrid strains that can be tailored for modern brewing practices.


Found in: Genes & Cells


Thursday, August 25, 2011

Spoilers freshen up stories

People who read the last page of a mystery novel first may be on to something. Giving away plot surprises generally makes readers like stories better, say psychology graduate student Jonathan Leavitt and psychologist Nicholas Christenfeld, both of the University of California, San Diego.

Volunteers especially enjoyed classic short stories, including mysteries and tales with ironic twists, after seeing spoiler paragraphs that revealed how the yarns ended, Leavitt and Christenfeld report in a paper published online August 12 in Psychological Science.

“Spoilers may enhance story enjoyment by making texts easier to read and understand, leading to deeper comprehension, or they may reduce readers’ anxiety about what’s to come, allowing them to focus on a story’s aesthetic details,” Leavitt says. These responses could explain why a favorite book can be read many times with undiminished pleasure, he suggests.

It’s also possible that spoilers amplify stories’ appeal by increasing tension, Leavitt adds. Giving away the ending of, say, Oedipus Rex may elicit pleasurable tension as a reader contemplates the title character marching unknowingly to his doom.

“The impact of suspense on enjoyment seems likely to be more complicated than the simple take-home point of this new article,” remarks psychologist Leigh Ann Vaughn of Ithaca College in New York. Spoilers may apply a pleasant oomph to well-told tales but could easily intensify readers’ distaste for unappealing or boring stories, Vaughn suggests.

Leavitt and Christenfeld recruited 819 college students to read short ironic-twist tales, mysteries and literary stories. For each story, the researchers created a spoiler paragraph that revealed the outcome in a seemingly inadvertent way. Data from students who had previously read these stories were excluded.

Each volunteer read one story after first reading a separate spoiler paragraph, a second story with the spoiler incorporated as the opening paragraph and a third narrative with no spoiler.

Some spoilers in the new study revealed ironic plot twists, such as mentioning that a condemned man’s apparent escape from hanging is just a momentary fantasy, or they demystified crimes, such as divulging that an apparent target of attempted murder turned out to be the perpetrator.

Spoilers also spiced up subtler stories. One paragraph overtly disclosed that a teenage boy and girl watching a couple struggle with a baby were glimpsing their own future, while the couple relived their own past upon seeing the teens.

Overall, participants reported liking all stories best after first reading spoilers. Spoilers incorporated into the beginning of the text had no effect on story enjoyment, yielding no more pleasure than unspoiled stories.

That result may be due to people’s general belief that spoilers ruin stories. So giving away outcomes in a way that seems unintended — as in artfully worded prefatory paragraphs or book reviews — may be necessary to enrich reader satisfaction, Leavitt proposes.

Birthday presents wrapped in cellophane, like stories preceded by spoilers, would give recipients an unexpected thrill, he predicts. “In both cases, the outcome is known, yet there is still pleasure in unraveling the clues, or the plastic wrap, as the case may be,” Leavitt says.


Found in: Humans and Psychology


News in Brief: Earth & Environment

Antarctic ice flows, atmospheric response to nuclear fallout and more in this week's news
Web edition: Saturday, August 20th, 2011

Moving Antarctic ice forms a tributary system across the continent in this map based on satellite radar. Science/AAAS

Icy flows

A new map of Antarctica assembled from satellite radar data reveals how ice shifted around on the frozen continent between 2007 and 2009. Ice flowing through narrow channels forms a river-like tributary system that accounts for much of the movement, a team led by Eric Rignot of the University of California, Irvine reports online August 18 in Science.  Researchers want to better understand how great ice sheets like Antarctica’s respond as global temperatures rise. —Alexandra Witze

Arctic may get breather in sea-ice losses

Roughly half of the recent loss of summertime Arctic sea ice appears to be due to greenhouse gas emissions as a result of human activity, the rest from natural climate variability, report scientists at the National Center for Atmospheric Research in Boulder, Colo. Their new analyses also indicate that even as global warming continues, summer Arctic sea ice losses could pause. “We could see a 10-year period of stable ice or even an increase,” says lead author Jennifer Kay. Her team’s analyses appeared online August 11 in Geophysical Research Letters. —Janet Raloff

Japanese fallout left electric signature

Radioactive fallout from the tsunami-crippled Fukushima reactors in Japan caused a rapid and dramatic response in the atmosphere’s electric field. The effect, brought about by a change in the atmosphere’s charge, showed up 150 kilometers southwest of the nuclear facility. Similar changes have occurred elsewhere after nuclear weapons tests and the Chernobyl accident, usually as rain washed fallout from the air. The Japanese radiation signature instead marked the arrival of fallout carried by dry low-altitude winds, scientists from Japan and Sweden reported online August 12 in Geophysical Research Letters. They recommend that early-warning electrical sensors be installed around all reactors to detect fallout. —Janet Raloff


Found in: Earth and Environment


Wednesday, August 24, 2011

Magnitude 5.8 earthquake hits Virginia

A map released by the USGS reveals the epicenter of Tuesday’s earthquake in Virginia (red star), as well as areas of "very strong" shaking with the potential for "moderate" damage (orange). Less intense shaking fades to turquoise. USGS

The seismically sleepy mid-Atlantic states were struck by a magnitude 5.8 earthquake on August 23. The quake hit Virginia at 1:51 p.m. EDT on Tuesday, 135 kilometers southwest of Washington, D.C., and ranks among the largest temblors to strike the state in recorded history.

The quake’s epicenter was located in central Virginia about 61 kilometers northwest of Richmond, at a preliminary estimated depth of 6 kilometers. Reports of the quake were posted on the U.S. Geological Survey’s Did You Feel It? website from as far away as Ohio and New York. That’s not unusual for earthquakes on the East Coast, which are commonly felt over a wide area.

“An event of the same size is felt over a much larger area in the East, compared to the West,” says Charles Ammon, an earthquake seismologist at Pennsylvania State University in University Park.

Vibrations from an 1897 temblor centered on Virginia’s Giles County, magnitude 5.9, reached Georgia and Pennsylvania and as far westward as Kentucky and Indiana. Chimneys crumbled and visible fissures appeared during this event.

On Tuesday, buildings rattled and lights flickered as people took to the streets in the nation’s capital. Both the U.S. Capitol Building and the Pentagon were evacuated, the Associated Press reported — as was the building that houses the USGS Earthquake Hazards Program in Reston, Va. While it’s unclear if the city’s buildings suffered much structural damage from the quake, three pinnacles did snap off the central tower of the Washington National Cathedral, says Richard Weinberg, media spokesperson. Stone masons and engineers are assessing the full extent of the damage.

Earthquakes in the mid-Atlantic tend to be small — like last year’s magnitude 3.6 temblor beneath Germantown, Md. The East Coast lacks the active fault systems created by colliding continental plates characteristic of California and the West Coast. The crust that underlies the mid-Atlantic region tends to be older, cooler and more stable. The recent quake likely started on a previously unknown fault that has been dormant for a long time in the middle of the Central Virginia Seismic Zone.
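
Earthquake magnitude is a logarithmic scale, so the jump from last year's magnitude 3.6 to this week's 5.8 is bigger than it sounds. A minimal sketch using the standard Gutenberg-Richter energy-magnitude relation (a textbook formula, not a calculation from the USGS report):

```python
def radiated_energy_joules(magnitude):
    """Approximate radiated seismic energy via the standard
    Gutenberg-Richter relation: log10(E) = 1.5 * M + 4.8."""
    return 10 ** (1.5 * magnitude + 4.8)

ratio = radiated_energy_joules(5.8) / radiated_energy_joules(3.6)
print(f"M5.8 released about {ratio:.0f} times the energy of M3.6")
# -> about 1995 times, i.e. roughly 2,000-fold
```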

The East Coast isn’t entirely immune to big ones, though. In 1886 an earthquake registering between magnitude 6.6 and 7.3 hit Charleston, S.C., damaging an estimated 2,000 buildings and killing dozens of people.

A small magnitude 2.8 aftershock gently rocked Virginia again at 2:46 p.m. Rumors in some initial news reports that this latest earthquake is a foreshock preceding a larger event are just speculation.

“There’s no scientific method that could identify a foreshock,” says Ammon.


Found in: Earth and Earth Science


The Iceman's last meal: goat

Researchers examine the stomach contents of a famous 5,300-year-old mummy
Web edition: Monday, August 22nd, 2011

Scientists have only now discovered the stomach of the 5,300-year-old Iceman mummy, because many of the internal organs had shrunk and moved from their original positions. South Tyrol Museum of Archaeology

Outside of the Nancy Grace show, few people have had their final hours poked, prodded and scrutinized as much as Ötzi, the “Iceman” who died high in the Italian Alps 5,300 years ago.

Hikers discovered his frozen, mummified body in 1991. Two decades later, scientists have a good idea of what happened to Ötzi: Fleeing pursuers, he retreated to the mountains only to be shot in the back with an arrow. But even today, the Iceman is still giving up surprises.

New, more detailed radiological images of the mummy have revealed his stomach for the first time and shown that he didn’t die hungry. Less than an hour before his murder, Ötzi ate a big meal consisting mostly of the wild goat called ibex, reports a team led by Albert Zink, head of the Institute for Mummies and the Iceman in Bolzano, Italy.

“We now think that he must have felt quite safe, because otherwise he wouldn’t have had this big meal,” Zink says. “This was a really big surprise.” The work was published online August 17 in the Journal of Archaeological Science.

The Iceman may have eaten meat fairly regularly, the scientists suggest; the new scans also uncovered three gallstones, a sign that his diet could have been richer in animal products than researchers thought. And newfound signs of heavy strain in Ötzi’s knees may mean he walked a lot in mountainous terrain — as opposed to being a valley dweller who wandered up high just before his death.

In life, Ötzi was a brown-eyed, long-haired man in his mid-40s who stood 5 feet 3 inches tall, average height for the Copper Age. In death, he became one of the world’s best-preserved mummies, thanks to the ice that encased him soon after his murder. When climbers found Ötzi sticking out of a retreating glacier front in September 1991, scientists rushed his body into a climate- and humidity-controlled cell so he wouldn’t thaw.

Until now, the closest researchers had gotten to Ötzi’s last meal was locating and taking samples from his colon. It contained the remains of several meals, including the meat of red deer and ibex along with vegetables and grains like einkorn, a local wheat.

But in 2005 Zink’s team took new and more detailed X-ray computed tomography images of the mummy, quickly sliding the frozen corpse in and out of a hospital scanner. Those images revealed an organ once thought to be part of the colon but now recognizable as the long-sought stomach. After death, many of the Iceman’s organs shrank and moved from their original locations, and nobody had recognized the stomach because it had shifted into the upper abdomen, Zink says.

In November, the researchers pulled some of Ötzi’s stomach contents out through an incision in the abdominal wall. Preliminary DNA analysis of the fatty tissue shows it came from an ibex.

“What we have found is that he consumed an omnivorous diet,” says Klaus Oeggl, a paleobotanist at the University of Innsbruck in Austria who is analyzing the nonmeat parts of the stomach contents. The Iceman’s last several meals contained a mix of meat, vegetables and grains, but with a lot more meat in his final meal.

To Zink, a full stomach suggests that Ötzi wasn’t actively fleeing from his pursuers just before he died. Oeggl, however, speculates that the Iceman could have gotten a head start on those chasing him, then sat down for a break before an enemy surprised and shot him from behind.

“I’ve been on top of this particular mountain, and it’s an ideal place to stop and have a rest, maybe have something to eat,” adds Frank Rühli, head of the Center for Evolutionary Medicine at the University of Zurich in Switzerland. Once the arrow hit Ötzi, Rühli and other scientists reported in 2007, it tore open an artery and sent him into fatal hemorrhagic shock. He died on the spot.

Other evidence supports the theory that the Iceman had been under stress in the days before he died. In his last 33 hours, Ötzi moved from up near the timber line to down among the trees and up again into the realm of ice, as shown by pollen grains from various alpine plant species lodged in his body.  

Ötzi also had a deep laceration on his right hand that he received at least several days before he died.

One factor that may have made the Iceman’s life uncomfortable — though it certainly didn’t kill him — was the state of his teeth. Rühli and his colleagues recently took a close look at Ötzi’s teeth and found that the Iceman had a lot of cavities. “The whole oral health of the Iceman was much worse than we had thought before,” says Rühli.

In October, Iceman scientists will gather in Bolzano for a 20th anniversary symposium to talk about what they’ve learned about the life and death of Ötzi. That will probably include the first complete analysis of the Iceman’s nuclear DNA, which has been finished but not yet formally published.


Found in: Humans


Tuesday, August 23, 2011

The world's oldest profession: chef

CHEW, CHEW, SLEEP, CHEW: This chimp and other non-human primates spend nearly half their time eating, but new research demonstrates that the cooking skills of Homo erectus allowed the lineage to save time and extract more nutrients, paving the way for bigger brains. Ronan Donovan

Nearly 2 million years ago, it seems the original naked chef was cooking up a storm. Homo erectus, the extinct hominid that’s a mere branch or so away from humans on the family tree, was the first to master cooking, new evidence suggests. This seminal event had huge implications for hominid evolution, giving the ancestors of modern humans time and energy for activities such as running, thinking deep thoughts and inventing things like the wheel and beer-can chicken.

“In the big picture, eating cooked food has huge ramifications,” says Harvard’s Chris Organ, a coauthor of the new study. Cooking and other food-processing techniques aren’t just time-savers; they provide a bigger nutritional punch than a raw diet. The new work is further evidence that cooking literally provided food for thought, making it easier for the body to extract calories from the diet that could then be used to grow a nice, big brain.

Humans are the only animals who cook, and compared to our living primate relatives we spend very little time gathering and eating food. We also have smaller jaws and teeth.

Homo erectus also had small teeth relative to others in the human lineage, and the going idea was that hominids must have figured out how to soften up their food by the time that H. erectus evolved. But behavioral traits such as the ability to whip up a puree or barbecue ribs don’t fossilize, so a rigorous test of the H. erectus-as-chef hypothesis was lacking.

Organ and his colleagues, including Harvard’s Richard Wrangham, an early champion of the cooking hypothesis, decided to quantify the time one would expect humans to spend eating by looking at body size and feeding time in our living primate relatives. After building a family tree of primates, the researchers found that people spend a tenth as much time eating relative to their body size compared with their evolutionary cousins — a mere 4.7 percent of daily activity rather than the expected 48 percent if humans fed like other primates.
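
In outline, that expected value comes from fitting feeding time against body mass across primate species and reading off the prediction for a human-sized primate; the published analysis also corrects for the species' shared ancestry, which the minimal sketch below (with made-up illustrative numbers) omits:

```python
import numpy as np

# Hypothetical (body mass in kg, percent of day spent feeding) pairs
# standing in for real primate data; the actual study used measured
# values plus a phylogenetic correction that this sketch leaves out.
mass_kg = np.array([3.5, 9.0, 35.0, 120.0])
feeding_pct = np.array([25.0, 30.0, 40.0, 55.0])

# Fit feeding time as a linear function of log body mass.
slope, intercept = np.polyfit(np.log10(mass_kg), feeding_pct, 1)

# Prediction for a roughly human-sized (60 kg) primate, to be compared
# with the 4.7 percent of the day humans actually spend eating.
expected_pct = slope * np.log10(60.0) + intercept
print(f"Expected feeding time: {expected_pct:.0f}% of daily activity")
```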

Then the team looked at tooth size within the genus Homo. From H. erectus on down to H. sapiens, teeth are much smaller than would be predicted based on what is seen in other primates, the team reports online the week of August 22 in the Proceedings of the National Academy of Sciences.

“Tooth size becomes dramatically smaller than what we would expect,” says paleoanthropologist David Strait of the University at Albany in New York, who was not involved with the work. “This is really compelling indirect evidence the human lineage became adapted to and dependent on cooking their food by the time Homo erectus evolved.”


Found in: Humans


Monday, August 22, 2011

Parkinson's protein comes in fours

Stabilizing the ties that bind a protein important in Parkinson’s disease to its buddies might help fend off the disease, a new study of the protein’s structure suggests.

Alpha-synuclein builds up in tough aggregates in the brains of patients with Parkinson’s disease. Researchers thought that this protein was normally a floppy, snakelike molecule.

But now, neuroscientist and neurologist Dennis Selkoe of Brigham and Women’s Hospital and Harvard Medical School and his colleagues show that alpha-synuclein normally forms bands of four molecules in living cells. These quartets (scientists call them tetramers) of alpha-synuclein molecules resist the clumping that leads single molecules of the protein down the path to brain cell destruction, Selkoe and colleagues report online August 14 in Nature.

Discovering that alpha-synuclein works in groups of four could be important in treating or preventing Parkinson’s disease, says Patrik Brundin, a neuroscientist at Lund University in Sweden. The findings suggest that loner alpha-synuclein molecules could be “part of the ‘bad guy’ pathway, and stabilizing it as a tetramer might help avoid the disease,” he says.

No one yet knows whether quartets of alpha-synuclein disintegrate into single molecules in the brains of people with Parkinson’s disease, leading to big brain-cell-killing plaques. Studies comparing normally aging brains with those of people with the disease may help answer those and other questions raised by the study, Brundin says.

The new discovery may also help researchers figure out what alpha-synuclein’s day job is, says Joakim Bergström, a molecular biologist and biochemist at Uppsala University in Sweden. The protein was thought to be important in grasping lipids — fats and other molecules that make up membranes, carry messages and perform other functions in a cell — but earlier experiments have given mixed results. In the new study, the researchers found that the alpha-synuclein quartet can hold much more lipid than single molecules of the protein can.

The finding that the protein hangs out in groups of four “potentially explains a lot of contradictory information in the literature,” about alpha-synuclein’s behavior, says Chad Rienstra, a structural biologist at the University of Illinois at Urbana-Champaign.

Rienstra suggests that contradictory information may have come from the way scientists produce samples of the protein for study. Most researchers studying the protein induce bacteria to make large quantities of the molecule.

But the bacteria-produced version of the protein doesn’t fold up the same way as it does in human cells, Selkoe’s group found. In human red blood and brain cells, the protein winds up into spiral structures called alpha helices and then forms hardy four-man bands.

Even so, it’s far too soon to throw out decades of data suggesting that the protein is a loner, says Hilal Lashuel, a protein biochemist at the Swiss Federal Institute of Technology in Lausanne. The group’s claims “should be treated with caution until reproduced,” he says. That’s something he has tried to do without success since hearing about the study at a meeting earlier this year.

Lashuel is concerned about some of the experimental methods used in the study and the group’s interpretation of the results, which may have larger implications for Parkinson’s disease research. “I’m worried that people will start questioning all the previous data,” he says. Abandoning that previous work could prove costly if the new finding ultimately doesn’t pan out, he says. “My view is that any time we waste, we take away from patients and their families.”


Found in: Genes & Cells


White dwarfs gobble Earthlike treats

Astronomers studying the atmospheres of planet-munching white dwarf stars have found that some stellar meals included the same ingredients as Earth.

Remains of rocky bodies that once circled the white dwarfs pepper the gas envelopes around the dead stars. The ratios of elements in these remains — called “pollution,” since it mars the star’s normally pristine hydrogen or helium atmosphere — tell astronomers what the bodies were made of and where they might have come from. Although about as common as normal stars in the Milky Way, white dwarfs aren’t the most obvious choice for astronomers looking for traces of extrasolar planets — but, it turns out, the dense, collapsed stars may be incredibly useful.

Each of two polluted white dwarf stars snarfed at least 10 sextillion grams of rocky dust, roughly equal to the mass of the dwarf planet Ceres (a sextillion equals 1 with 21 zeros after it). And, one of the stars ate something very similar in composition to the Earth, astronomers report online August 7 at arXiv.org and in an upcoming issue of the Astrophysical Journal.

“This means that planetlike rocky material is forming at Earthlike distances or temperatures from these stars,” says astronomer and study coauthor Ben Zuckerman of UCLA. Zuckerman notes that it’s still unclear whether the material is from a planet, planetlike bodies or an asteroid, but it is clear that there’s a lot of it.

For years, astronomers thought the dwarfs were simply catching dust during their interstellar travels. Now, scientists think the atmospheric debris signals the presence of ancient orbiting planetary systems. Zuckerman says that between 25 and 30 percent of white dwarfs have orbital systems that contain both large planets and smaller rocky bodies. After the dwarf forms, larger, Jupiter-mass planets can perturb the orbits of smaller bodies and bounce them toward the star.

“This is the first hint that despite all the oddball planetary systems we see, some of them must be more like our own,” says astronomer John Debes of NASA’s Goddard Space Flight Center in Greenbelt, Md., who was not involved in the study. “We think that most of these systems that show pollution must in some way approximate ours.”

Using the Keck I telescope on Mauna Kea in Hawaii, Zuckerman and his team peered closely at two helium-dominated white dwarfs and determined the ratios of the polluting elements. Star PG1225-079 has a mix of elements — including magnesium, iron and nickel — in ratios resembling those found in bulk Earth and some elements that are two or three times more abundant (calcium, for example).

The other star, HS2253+8023, munched material that contains more than 85 percent oxygen, magnesium, silicon and iron — very much like Earth. That’s indicative of a rocky parent body forming in an area with conditions similar to those where the Earth first formed.

“I’ve never seen so much detail in spectra,” says astronomer Jay Holberg of the University of Arizona in Tucson, who was not involved in the study. “People have seen iron and calcium and other things in these stars, but [this group has] gone off and found a whole slew of other elements.”

Incredibly dense, white dwarfs are about the size of Earth but as massive as the sun, and mark the final stage of stellar evolution for more than 90 percent of the stars in the Milky Way. But before reaching that stage, stars puff up into red giants, a process that can rearrange an orbiting system and gobble up any bodies too close by. Then the stars collapse — and some survivors of that initial expansion and contraction might be chucked inward by larger bully planets.


Found in: Atom & Cosmos and Planetary Science


Sunday, August 21, 2011

Moms talk, daughters' hormones listen

A comforting voice packs a biological punch that instant messages lack
Web edition: Friday, August 12th, 2011

Now hear this: A mother’s encouraging words heard over the phone biologically aid her stressed-out daughter about as much as in-person comforting from mom and way more than receiving instant messages from her.

That’s consistent with the idea that people and many other animals have evolved to respond to caring, familiar voices with hormonal adjustments that prompt feelings of calm and closeness, say biological anthropologist Leslie Seltzer of the University of Wisconsin–Madison and her colleagues. Written exchanges such as instant messaging, texting and Facebook postings can’t apply biological balm to frazzled nerves, the researchers propose in a paper published online July 29 in Evolution and Human Behavior.

Seltzer’s group found that 7- to 12-year-old girls who talked to their mothers in person or over the phone after a stressful lab task displayed drops in levels of cortisol, a stress hormone, accompanied by the release of oxytocin, a hormone linked to love and trust between partners in good relationships. Girls who instant messaged with their mothers after the lab challenge showed no oxytocin response, and their cortisol levels rose as high as those of girls who had no contact with their mothers.

“At least in our subjects, instant messaging falls short of the endocrine payoff of speech or physical contact with a loved one after a stressful event,” Seltzer says.

It makes sense that speech, with ancient evolutionary roots, can trigger biological markers of reassurance, comments psychologist Jeffry Simpson of the University of Minnesota in Minneapolis. Mothers may have expressed support better in speech than in writing, or the tone of their voices could have had a special impact on daughters, Simpson says.

Unfamiliarity with instant messaging, especially among mothers, may have undercut the ability of digital connections to alleviate daughters’ stress in the new study, suggests psychologist Sandra Calvert, director of Georgetown University’s Children’s Digital Media Center in Washington, D.C. Still, “mom’s voice is very important to all of us who are daughters,” Calvert says.

Seltzer’s team studied 68 girls who reported good relationships with their mothers. Each girl spoke about a preselected topic for five minutes and then tried to solve mental arithmetic problems for five minutes in front of two strangers who maintained neutral facial expressions. Youngsters said that these tasks caused them considerable stress. Researchers tracked cortisol in saliva samples and oxytocin in urine samples.

Afterward, girls were randomly assigned to talk with their mothers in person, over the phone, via instant messaging or not at all. Mothers were told to offer as much emotional support to their daughters as possible.

Although this study found no hormonal benefit for instant messaging between mothers and daughters, children may profit biologically when such messages come from peers, remarks psychologist Kaveri Subrahmanyam of California State University, Los Angeles. A 2009 study found that instant messaging with an unknown peer for 12 minutes eased the sting of rejection among teens excluded from a group game in the lab.


Found in: Humans and Psychology

View the original article here

The color of controversy

When it comes to the safety of dyeing food, the one true shade is gray. Artificial colorings have been around for decades, and for just about as long, people have questioned whether tinted food is a good idea. In the 1800s, when merchants colored their products with outright poisons, critics had a pretty good case. Today’s safety questions, though, aren’t nearly so black and white — and neither are the answers.

Take the conclusions reached by a recent government inquiry: Depending on your point of view, an official food advisory panel either affirmed that food dyes were safe, questioned whether they were safe enough or offered a conclusion that somehow merged the two. It was a glass of cherry Kool-Aid half full or half empty.

About the only thing all sides agree on is that there would be no discussion if shoppers didn’t feast with their eyes. Left alone, margarine would be colorless, cola wouldn’t be dark, peas and pickles might not be so vibrantly green, and kids’ cereals would rarely end up with the neon hues of candy. But as the 1990s flop of Crystal Pepsi showed, consumers expect their food to look a certain way.

Some of the earliest attempts to dye food used substances such as chalk or copper — or lead, once a favorite for candy — that turned out to be clearly harmful. Most of the added colors in use today were originally extracted from coal tar but now are mostly derived from petroleum. Overseeing the safety of artificial food color was one of the reasons the U.S. Food and Drug Administration was founded (with its current name, in 1930). And the issue of food dye safety has continued to attract government notice, sometimes in dramatic ways, such as the time investigators demanded to know why trick-or-treaters became ill in 1950 after eating Halloween candy dyed with orange No. 1.

The most recent government attention came in March, when an FDA advisory panel made up of scientists, consumers and industry representatives held a two-day hearing to try to determine whether food dyes cause hyperactivity in children. It is a debate that has gone on, in some incarnation, for more than 30 years. Though scientific attention has grown, the disagreement lingers, partly because the issue is complicated to study and partly because dyes, if harmful, probably affect only a subset of children who have some yet-undiscovered genetic sensitivity.

Over the years, skeptics of any connection have seized on uncertainties and logistical flaws in the research that could lead to misleading results. Still, many scientists say the studies are strong enough to warrant some kind of government action. And some of them are now criticizing the FDA, saying that, in retrospect, questions about the hyperactivity-dye link were presented to the advisory panel in a way that meant inaction was almost a foregone conclusion.

“To me, the whole process was defective,” says Bernard Weiss, a psychologist in the Department of Environmental Medicine at the University of Rochester School of Medicine and Dentistry in New York who was invited to speak before the panel. The main question that committee members were assigned was whether “a causal relationship between consumption of certified color additives in food and hyperactivity in children in the general population has not been established” (a conclusion ultimately supported by 11 of 14 voting panel members).
Weiss calls that “a ridiculous question,” not only because of its tortured, negative wording, but also because even those concerned about food dyes acknowledge that the science has not shown a link to hyperactivity in all kids.

Untrue colors

Nine different artificial dyes are currently approved for use in the United States; many of these chemicals have been staples of the food industry for generations. While the FDA does not have data on consumption, it does keep track of how much dye of each type gets the OK for use in products; the amount per capita has increased fivefold since the 1950s. Dyes have never been without criticism — a “pure food” movement was well under way even by the late 1800s. But specific concern about hyperactivity and other neurological effects first arose in 1975, when Ben Feingold, former chief allergist at Kaiser Permanente Medical Center in San Francisco, hypothesized that food additives were contributing to hyperactivity. His book Why Your Child is Hyperactive drew largely on his own clinical observations.

In 1976 in the journal Pediatrics, researchers published a study that compared a regular diet with a diet that eliminated artificial flavors and colors in 15 hyperactive children. After eating what has since become known as the “Kaiser Permanente elimination diet” or the “Feingold diet,” children showed an improvement in symptoms such as difficulty paying attention.

Three decades of studies since then have accumulated evidence linking food dyes to an exacerbation of hyperactivity. But the controversy remains unsettled. Skeptics have a lot of ammunition, pointing out that findings often have been inconsistent and confusing. To set up a study of food dyes, researchers have to juggle a lot of variables at once — including how big a dose of dyes to give, which ones to give and the fine art of having parents and teachers document symptoms that aren’t easy to measure.

Other factors also complicate the research. Studies have used mixtures of dyes, making it difficult to tease out the possible effects of any individual color. Also, it may be that only an unknown subset of children are affected: In a scientific analysis, the children not affected might outnumber those who are, blunting the overall findings when data are lumped together.
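A quick simulation shows how strong that blunting can be. The Python sketch below uses made-up numbers, not data from any trial: a real effect confined to one child in five shrinks to a fifth of its size once everyone is averaged together.

    import random

    random.seed(1)
    n_children = 1000
    sensitive_fraction = 0.2   # suppose only one child in five reacts to the dye
    true_effect = 0.5          # symptom-score rise in sensitive children only

    # Everyone gets baseline noise; only the sensitive subgroup gets the effect.
    scores = [random.gauss(0, 1)
              + (true_effect if random.random() < sensitive_fraction else 0)
              for _ in range(n_children)]

    print(sum(scores) / n_children)  # about 0.1, one-fifth of the true effect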

Finally, evidence suggests that dyes may not be the lone culprit. Children who appear to be sensitive to dyes may also have neurological reactions to other ingredients, even naturally occurring components such as wheat and chocolate. In some studies, children were given the dyes in cookies; if the children react to wheat or milk as well, the “placebo” might not have been the placebo scientists thought.

In the end, the disagreement comes down to this: How much evidence is necessary to add product warnings about (or ban, as some consumer groups want) chemicals that offer no nutritional benefit and are consumed each day by millions of healthy children?

GOING NATURAL: Natural food dyes include betanin (derived from beetroot), compounds from the seeds of the achiote tree and curcumin (from turmeric). From top: oksana2010/ShutterStock; Dr. Morley Read/ShutterStock; Le Do/ShutterStock

Europe gets the blues

Food safety advocates believe the substantial suggestion of harm, even without proof, is enough to take action. So does the European Parliament, which in 2008 dictated that foods with certain dyes had to contain warnings that the chemicals “may have an adverse effect on activity and attention in children.” Neither the FDA nor American lawmakers have gone that far, saying that the levels of dye currently in foods are safe.

Most dyes have no set cap on the amount that can be used, just stipulations requiring manufacturers to use only enough to achieve the desired color. “When the FDA established legal limits on dyes, they did not consider children,” says Laura Anderko, a researcher in public health at Georgetown University Medical Center in Washington, D.C. And it is not known, she says, what the lasting effects of constant exposure might be. “Kids, they have a long shelf life. If they are exposed at an early age — depending on those kinds of petrochemicals that are consumed — it could mean lifelong impacts,” she says.

The color industry says any link between food coloring and hyperactivity remains unproven. “We don’t see any strong compelling data at this point that there is a neurological effect,” says Sean Taylor, a chemist at Verto Solutions in Washington, D.C., and a representative of the International Association of Color Manufacturers. He notes that the dyes on the market today have been consumed in populations worldwide, without any apparent harm, for decades. In animal toxicity tests, Taylor says, most of the dyes in food are excreted, and the small amounts absorbed are broken down by the liver.

More than a dozen clinical studies have tried to investigate the relationship between food dyes and hyperactivity. In 2004, psychiatrists David Schab of Columbia University and Nhi-Ha Trinh of Harvard University published a meta-analysis of all 15 known double-blind, placebo-controlled trials — meaning those in which neither the researchers nor the participants knew who was getting the dyes. That study, in the Journal of Developmental & Behavioral Pediatrics, reported that the results “strongly suggest an association” between food dyes and hyperactivity, though the researchers included a long list of caveats.
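For the statistically curious, pooling trials in a meta-analysis is usually done by inverse-variance weighting, in which each study’s effect size counts in proportion to its precision. Here is a minimal sketch of the standard fixed-effect version in Python, using hypothetical effect sizes and standard errors rather than the figures from the 2004 paper:

    import math

    # (effect size, standard error) for four hypothetical trials
    trials = [(0.30, 0.15), (0.12, 0.20), (0.45, 0.25), (0.05, 0.18)]

    weights = [1 / se ** 2 for _, se in trials]   # precision = 1 / variance
    pooled = sum(w * d for (d, _), w in zip(trials, weights)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))

    print(f"pooled effect: {pooled:.2f} +/- {1.96 * pooled_se:.2f} (95% CI)")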

Following the 2004 meta-analysis, the British Food Standards Agency (the equivalent of the U.S. FDA) commissioned large studies to further examine whether food dyes, along with a common food preservative, affected children’s behavior. Unlike most previous investigations, these new experiments included children from the general population who had no history of hyperactivity.

In those studies, researchers from the University of Southampton gave two groups of children (one of toddlers, one of school-age children) beverages containing one of two mixes of food dyes and the preservative sodium benzoate, or a placebo, and asked parents and educators to note any behavior changes. The older children also took a computerized test designed to measure attention.

The results, published in 2007 in the Lancet, “lend strong support for the case that food additives exacerbate hyperactive behaviors,” the researchers write. “Our results are consistent with those from previous studies and extend the findings to show significant effects in the general population.” The scientists recognized the potential political impact of their findings: “The implications of these results for the regulation of food additive use could be substantial.”

And in Europe, they were. While the European Food Safety Authority did not think the evidence was strong enough to prompt action, the European Parliament was convinced. Dyes are not banned outright, but warning labels alone have been enough to change the way many products are made. A strawberry sundae at McDonald’s in the United States gets a boost of crimson from red No. 40. In Great Britain, a McDonald’s strawberry sundae gets its red only from strawberries.

In 2008, the year warning labels took effect in Europe, the D.C.-based Center for Science in the Public Interest (the same food watchdogs known to denounce the nutritional wasteland of convenience foods and movie popcorn) petitioned the FDA to ban the dyes. A long list of scientists and researchers signed on to the center’s appeal. “Food manufacturers voluntarily could substitute safe natural colors or other ingredients (such as fruit or fruit juices) for dyes, but that’s unlikely to happen throughout the food supply without the level playing field provided by government regulation,” the document stated. “Accordingly, the Food and Drug Administration ... should ban the use of dyes in all foods; until such action takes effect, the FDA should require a prominent warning notice on product labels.”

While no large trials have been published since 2007, the government took the Center for Science in the Public Interest petition seriously enough to hold the hearings in March, asking members of its Food Advisory Committee to decide whether the evidence establishes a link between food dyes and hyperactivity in children in the general population.

Even Michael Jacobson, executive director of the Center for Science in the Public Interest, says he would answer “no.” To him and others, though, it was not the right question to ask. Better, he says, would have been to assess whether food dyes pose a danger to certain children, in the same way that allergens affect only susceptible people. Few products, no matter how dangerous, affect everyone in the population. “Even smoking does not affect everybody,” he says.

Metabolic black box

No one knows which children may be at risk, because the biology behind any potential neurological effect associated with hyperactivity isn’t clear. Taylor, the color industry chemist, says that animal studies find that the dye molecules do not easily get through intestinal cell walls, and that most of the dye passes through the body without leaving the digestive system.

Laura Stevens, a nutrition researcher at Purdue University in Indiana, acknowledges that this is the case. “In animals, very little of it is absorbed,” she says. “It is excreted in the feces.” But that doesn’t necessarily negate the idea of any effects on the body, she says; effects could come through metabolites, or through indirect mechanisms.

As examples, she cites two studies by British researchers. In one, published in the Journal of Nutritional Medicine in 1990, the scientists investigated how the yellow dye tartrazine affected the zinc levels of 10 hyperactive boys, compared with 10 nonhyperactive peers. (Zinc is a mineral important for proper brain function.) The team found that zinc levels dropped in the blood and increased in the urine among the hyperactive kids after tartrazine consumption. Another study, published in the Journal of Nutritional and Environmental Medicine in 1997, found a similar drop in zinc levels, and an increase in hyperactivity, in some children consuming tartrazine.

Newer research suggests that dyes trigger the release of histamines, which are part of the body’s immune system. An experiment reported last year in the American Journal of Psychiatry suggested that differences in genes that control histamines might explain why some children are affected and others are not.

But studies are few. In truth, Stevens says, aside from extrapolations from animal studies, the metabolic fate of dyes in humans is a black box. She and her colleagues at Purdue are among those trying to look at food dye metabolism in humans. “If there’s any chance at all there’s a problem, this should be addressed,” she says.

Ultimately, the future of food dyes may not rest with scientists or government regulators, but with consumers, says Ron Wrolstad, an agricultural chemist at Oregon State University in Corvallis.

“A lot of times now, particularly with natural colorants, it will be a marketing decision rather than a regulatory ruling,” he says. The snack food giant Frito-Lay, for instance, has announced, and heavily publicized, a commitment to use fewer artificial dyes in its products. A company spokeswoman said in December that the move was in response to consumers wanting more snacks “made with real food ingredients.”

“My personal opinion is that the synthetics don’t cause you any harm, but I don’t think they do you any good,” Wrolstad says. While other researchers are looking for harmful effects of synthetic dyes, Wrolstad is looking for beneficial effects of natural, plant-derived colors. “A lot of these compounds have antioxidant properties,” he says.

But just as the idea of harm from synthetic colors isn’t universally accepted, neither is the suggestion of benefit from dyes extracted from plants. “I would feel a lot more comfortable if we had some data on those, too,” Weiss says.

In the meantime, dyes of all kinds will continue to dominate the grocery aisle unless shoppers demand otherwise. In the food business, the most influential color is green.

In the limelight

Though concern over a link to hyperactivity has prompted the latest attacks on food dyes, artificial colorings have drawn public scrutiny, for economic as well as health reasons, for more than a century.

1850s A Victorian-era domestic standby, Enquire Within Upon Everything, described how bread could be tested at home for the presence of alum, a metallic salt used to give the dietary staple a whiter, more desirable color. As early as the Middle Ages, some bread makers were rumored to produce very white bread on the cheap by adding chalk.

1890s One effort used by the dairy industry to prevent newly invented and relatively cheap margarine from undercutting the popularity of butter was the push for regulations that would tax or ban margarine with the yellow tint of butter. (Naturally, margarine is colorless.) Anticoloring laws were adopted in 30 states, and some legislatures went so far as to demand that margarine be dyed pink. Because of the restrictions, some margarine manufacturers sold yellow dye packets with their products, so consumers could color their own margarine at home.

1950s In 1950, children became ill after eating Halloween candy containing orange No. 1, which had been approved for use in food by the U.S. Food and Drug Administration. The reports led to a public outcry, and along with other concerns, led the FDA to re-evaluate the safety of food colorings. Several dyes were delisted, and the Color Additive Amendments of 1960 established the current regulatory protocol.

1970s In the 1970s, it was red No. 2’s turn to cause a stir. Russian studies had suggested that the dye caused rats to develop intestinal tumors and was toxic to the gonads and embryos. Though the tests were largely debunked, when combined with earlier studies showing breast tumors in female rats fed the dye, the findings were enough to lead to a public health scare. The FDA banned red No. 2, and many manufacturers removed red products regardless of whether they contained the dye. Mars didn’t bring back red M&Ms until the late ’80s.

1990s Natural food dyes have caused controversy too. The reddish cochineal extract and carmine came to the attention of the Center for Science in the Public Interest in 1998. The dyes, made from a type of female scale insect, had been used for hundreds of years, exempt from certification because they are natural. Recorded allergic reactions, along with anecdotal reports of outrage among vegetarians and kosher-keeping Jewish people who were unknowingly consuming insect products, prompted demands for labeling. The FDA agreed to require manufacturers to list the dyes as ingredients on the product label, but consumers have to figure out for themselves that the products come from animals.

Added color 

The U.S. Food and Drug Administration currently certifies nine synthetically produced food dyes (three popular colorings are described below). Such dyes can transform colorless products, giving faded veggies a more vibrant hue and making children’s candies more fun.

Brilliant blue  Designated as blue No. 1 by the FDA, this dye is found in ice creams, ice pops, baked goods and a host of blue raspberry–flavored beverages. It shows up in ranch-flavored chips, prepared guacamole and mixed-berry applesauce. The dye was approved by the FDA in 1969.

Allura red Red No. 40 is found in strawberry-flavored drinks, ice creams and cream cheeses; some Nutri-Grain bars; licorice; and most other red sweets. It was approved by the FDA in 1971 and, in terms of consumption, is currently the most-used food dye.

Tartrazine  Yellow No. 5 is in products such as Mountain Dew, Peeps, Doritos and Cheez Doodles. It’s commonly found in relish, pickles, lemon-flavored seasonings and boxed macaroni and cheese. The dye was approved by the FDA in 1969.



View the original article here

Saturday, August 20, 2011

BOOK REVIEW: The Man of Numbers: Fibonacci’s Arithmetic Revolution by Keith Devlin


Leonardo of Pisa, also known as Fibonacci, is best remembered today for introducing a sequence of numbers: 0, 1, 1, 2, 3, 5 and so on, each number after 0 and 1 equaling the sum of the two before it. The Fibonacci sequence is closely connected to the “golden ratio” used in art and architecture and turns up frequently in mathematics and nature.
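For readers who want to see the rule in action, here is a minimal Python sketch (an illustration, nothing from Devlin’s book). It also shows the ratio of consecutive terms homing in on the golden ratio, (1 + 5^0.5) / 2, or about 1.618:

    def fibonacci(n):
        """Return the first n Fibonacci numbers: 0, 1, 1, 2, 3, 5, ..."""
        seq = [0, 1]
        while len(seq) < n:
            seq.append(seq[-1] + seq[-2])
        return seq[:n]

    fibs = fibonacci(20)
    print(fibs[:8])             # [0, 1, 1, 2, 3, 5, 8, 13]
    print(fibs[-1] / fibs[-2])  # 1.618034..., approaching the golden ratio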

For Devlin, NPR’s “Math Guy,” Leonardo of Pisa is much more: he’s the man who brought arithmetic to the West — a celebrity of 13th century Italy.

“The likes of Apple Computer’s Steve Jobs and Microsoft’s Bill Gates will always be linked to the rise of the personal computer, and in this way Leonardo should be linked to the rise of modern arithmetic,” Devlin writes.

The book, which draws heavily on academic sources, pieces together the little that is known about Leonardo’s life. As a young man, he traveled to North Africa and learned Hindu-Arabic numerals and al-jabr, the basis of modern algebra. At the time, merchants and traders in Italy still used Roman numerals, which were impractical for multiplication or division.

Leonardo’s Liber Abbaci was the first comprehensive arithmetic textbook in Europe. It helped change the way business was done, with hundreds of math problems based on everyday commerce such as converting currency or calculating profits.

A mathematician himself, Devlin demonstrates Leonardo’s original methods by walking the reader through copious calculations that read like cooking recipes. Readers with the patience to tackle the math, though, will discover a legacy that extends across eight centuries to the textbooks of schoolchildren today and the numbers that run the modern world. 

Walker & Co., 2011, 183 p., $25



View the original article here

Particle physicists chasing ghosts

Wispy neutrinos could explain why matter dominates the universe Web edition : Friday, August 12th, 2011

PROVIDENCE, R.I. — Two experiments on different continents have found hints that particles called neutrinos can shape-shift in an unexpected way.

This behavior may be the key to understanding why these particles are so weird, says neutrino physicist Jennifer Raaf of the Fermi National Accelerator Laboratory in Batavia, Ill., the nation’s largest particle physics lab. Raaf presented an overview of recent neutrino findings August 9 at a meeting of the American Physical Society’s Division of Particles and Fields.

The new results also bode well for future experiments with neutrinos that may one day help scientists understand why the universe contains vastly more matter than antimatter. These experiments are part of the changing landscape of particle physics in the United States. With Fermilab’s Tevatron, once the most powerful particle collider in the world, shutting down soon, the government laboratory is reconfiguring itself to focus on projects that require particularly intense beams and look for extremely rare events.

“Neutrinos will play a big role moving forward,” says Young-Kee Kim, deputy director at Fermilab.

In the bestiary of particle physics, neutrinos are the neutral counterparts to the three charged leptons: the familiar electron and the heavier and more exotic muon and tau. Neutrinos are loners by nature, rarely interacting with the rest of the universe. But they do occasionally change form. That process, called oscillation, may offer clues about why the universe contains so little antimatter.
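In the textbook two-flavor approximation, the odds of a flavor change follow a compact formula: P = sin^2(2θ) × sin^2(1.27 Δm² L / E), with the mass splitting Δm² in electron volts squared, the baseline L in kilometers and the beam energy E in giga-electron volts. The Python sketch below plugs in round, illustrative values, not the T2K or MINOS fit parameters:

    import math

    def oscillation_probability(theta, dm2_ev2, length_km, energy_gev):
        """Two-flavor probability that a neutrino changes flavor in flight."""
        return (math.sin(2 * theta) ** 2
                * math.sin(1.27 * dm2_ev2 * length_km / energy_gev) ** 2)

    # A T2K-like 295 km baseline; the mixing angle, mass splitting and beam
    # energy here are round illustrative numbers, not measured values.
    print(oscillation_probability(theta=0.15, dm2_ev2=2.4e-3,
                                  length_km=295, energy_gev=0.6))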

In June the T2K experiment in Japan reported evidence that muon neutrinos occasionally oscillate into electron neutrinos. Six electron neutrinos appeared in a nearly pure beam of muon neutrinos traveling from an accelerator at the J-PARC facility to an underground detector 295 kilometers away.

Days later, physicists at the MINOS experiment announced finding traces of this oscillation in neutrinos traveling 735 kilometers from Fermilab to a mine in Minnesota. Those results, presented August 9 at the physics meeting, help to narrow T2K’s estimate of how often this changeup happens.

Taken together, the two sightings have less than a one-in-a-thousand chance of both being flukes, according to a recent analysis by a team of physicists in Italy and Germany. That’s below the standard for claiming a discovery but good enough to warrant further study, says Ed Kearns, a neutrino physicist at Boston University and T2K team member.

“This helps us justify future experiments,” he says. “It makes a big difference in our confidence going forward.”
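How do two independent hints become one fluke probability? One standard recipe is Fisher’s method, which multiplies the evidence together; the sketch below uses SciPy with placeholder p-values and is not the Italian-German team’s actual calculation:

    # Fisher's method for combining independent p-values, via SciPy.
    # The two inputs are placeholders, not the published significances.
    from scipy.stats import combine_pvalues

    statistic, combined_p = combine_pvalues([0.007, 0.11], method="fisher")
    print(f"combined p-value: {combined_p:.4f}")  # smaller than either alone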

If confirmed, this oscillation will be a crucial piece of information for a neutrino experiment now under construction at Fermilab. The NOvA experiment, which is currently testing its first prototype detector, could help scientists work out the differences in the masses of the different kinds of neutrinos, a long-standing puzzle.

Another project, called LBNE (for Long-Baseline Neutrino Experiment), also hopes to extend this line of research. LBNE would send beams of neutrinos and antineutrinos from Fermilab to a detector 1,300 kilometers away, giving the particles more time to change identity — and the scientists a better shot at understanding whether neutrinos behave differently than their antimatter counterparts.

LBNE is still on the drawing board, though, and its future is clouded by uncertainties about funding. Last December, the National Science Foundation pulled out of a plan to build a laboratory in an old mine in South Dakota that would house not only a detector for the LBNE experiment but also a variety of other underground experiments. That decision ups the price tag for the Department of Energy, the experiment’s main funder. With the DOE’s budget uncertain, physicists are exploring different ways to try to lower the costs as much as possible.

“We’re expecting a clear decision about what’s right for LBNE by the end of this year,” William Brinkman, director of the DOE’s Office of Science, told a roomful of physicists during an August 11 forum at the physics meeting.


Found in: Atom & Cosmos and Matter & Energy

View the original article here

Friday, August 19, 2011

BOOK REVIEW: The New Universe and the Human Future: How a Shared Cosmology Could Transform the World by Nancy Ellen Abrams and Joel R. Primack

By Nancy Ellen Abrams and Joel R. Primack

Living only for the present, using up natural resources, polluting the environment without considering future generations — can humans ever change? Lawyer and popular-culture lecturer Abrams and her husband Primack, an astrophysicist noted for his work on dark matter, argue that people might, if only they learned a little cosmology.

Echoing the words of Joseph Campbell, who studied the myths of ancient and modern peoples, the authors argue that the world needs a modern understanding of human beginnings — a common story. The origin of the universe — with concepts such as the Big Bang, cosmic inflation, dark matter and dark energy — could become this overarching story for all humankind.

Abrams and Primack are careful to explain that they’re not discounting religion as a way for people to connect with each other and understand the meaning of the universe. But the authors believe that by understanding concepts of cosmology, a more global understanding of the human role in the cosmos will emerge.

“We need to feel in our bones that something much bigger is going on than our petty quarrels and our obsession with getting and spending, and that the role we each play in this very big something is what really defines the meaning and purpose of our lives,” they write.

The authors tell the cosmology story well and illustrate it with stunning images, in the book and online at www.new-universe.org. But it’s unclear whether a universal understanding of cosmic origins can ever take hold, since those who disagree may never pick up the book in the first place. 

Yale Univ. Press, 2011, 256 p., $28



View the original article here

Financial world dominated by a few deep pockets

Economic “superentity” controls more than one-third of global wealth Web edition : Monday, August 15th, 2011

POWER BALL: A central core of extremely powerful actors (red dots) dominates international corporate finance, a new mathematical analysis finds. Credit: Vitali et al, 2011

Conventional wisdom says a few sticky, fat fingers control a disproportionate slice of the world economy’s pie. A new analysis suggests that the conventional wisdom is right on the money.

Diagramming the relationships between more than 43,000 corporations reveals a tightly connected core of top economic actors. In 2007, a mere 147 companies controlled nearly 40 percent of the monetary value of all transnational corporations, researchers report in a paper published online July 28 at arXiv.org.

“This is empirical evidence of what’s been understood anecdotally for years,” says information theorist Brandy Aven of the Tepper School of Business at Carnegie Mellon in Pittsburgh.

The analysis is a first effort to document the international web of relationships among companies and to examine who owns shares — and how many — in whom. Tapping into the financial information database Orbis, scientists from ETH Zurich in Switzerland examined transnational companies, which they defined as having at least 10 percent of their holdings in more than one country. Then the team looked at upstream and downstream connections, yielding a network of 600,508 economic actors connected through more than a million ownership ties.

This network takes on a bowtie shape, with a large number of diffuse actors in the wings and a few major players tangled up in the tie’s knot. So while it’s true that ownership of publicly held corporations is broadly distributed, says complex systems scientist James Glattfelder, a coauthor of the new work, “take a step back and it’s all flowing into the same few hands.”
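The knot of that bowtie can be made concrete with a toy calculation: build a directed graph of who holds shares in whom, then pull out the largest strongly connected component, the set of firms whose ownership chains loop back on themselves. The sketch below uses invented firms and stakes, not the Orbis data the researchers analyzed:

    import networkx as nx

    g = nx.DiGraph()
    ownership = [                     # (shareholder, company, stake), all invented
        ("FundA", "BankB", 0.12), ("BankB", "FundA", 0.08),
        ("BankB", "InsurerC", 0.20), ("InsurerC", "FundA", 0.15),
        ("FundA", "WidgetCo", 0.30),  # WidgetCo sits out in the bowtie's wing
    ]
    for owner, company, stake in ownership:
        g.add_edge(owner, company, weight=stake)

    # The bowtie's knot: the largest set of firms that all reach one another
    # through chains of shareholdings.
    core = max(nx.strongly_connected_components(g), key=len)
    print(sorted(core))  # ['BankB', 'FundA', 'InsurerC']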

While any man on the street may have predicted this outcome, the economic literature portrays markets as so dynamic that they lack hot spots of control, Glattfelder says.

Researchers aren’t sure what to make of the core’s interconnectedness. On the one hand, it could expose the whole network to risk.

“Imagine a disease spreading,” says Aven. “If you have a high school where everyone’s sleeping together and one person gets syphilis, then everyone gets syphilis.”

But on the flip side, she notes, interconnectedness can lead to better self-policing and positive behaviors, such as fair labor practices or environmentally friendly policies.

And even though the status of many players in the analysis has changed drastically since 2007 (the now-defunct Lehman Brothers, for instance, was a key element of the core), the analysis shows that ownership is becoming increasingly concentrated and increasingly transnational, says Gerald Davis of the University of Michigan in Ann Arbor.

Because interpreting and analyzing these kinds of data is difficult, he says, the analysis serves more as “an impression of the moon’s surface you get with a telescope. It’s not a street map.”

Ownership can be difficult to study internationally because holding shares in a mutual fund doesn’t necessarily mean the same thing in the U.S. as it does in communist China. And even within a single country ownership can be hard to tease out, says economist Matthew Jackson of Stanford University. For example, when an individual invests in a mutual fund or even purchases shares through an institution like Merrill Lynch, the firm is often still the official owner of the assets. And even when shareholders do have voting rights, they may not exercise them.

“This becomes worrisome if everyone is like me and says I’ll let Vanguard do the voting,” says Jackson. “Maybe we should be a little bit worried. I don’t know if we should be.”


Found in: Numbers and Science & Society

View the original article here

Thursday, August 18, 2011

News in Brief: Earth & Environment

Methane from rice, killer fungus and more in this week’s news Web edition : Saturday, August 13th, 2011

Methane puzzle
After years of climbing, atmospheric concentrations of methane, a potent greenhouse gas, have held steady for nearly 30 years — and for reasons that have not been obvious. Using different data and methods, two new papers in the Aug. 11 Nature link the arrested growth in atmospheric methane with human activities. Chemical markers studied by one U.S. team suggest the plateau traces to a larger global drop in the combustion of fossil fuels during the 1980s than had been recognized. Another group, at the University of California, Irvine, linked much of the change instead to reduced methane emissions from rice production in Asia. —Janet Raloff

Pumping groundwater raises sea level
The removal of groundwater for drinking, irrigation and other uses appears to have contributed more than 6 percent of the global sea level rise observed since 1900, a new study finds. Leonard Konikow of the U.S. Geological Survey in Reston, Va., used data sources from around the world to show that over this time, groundwater extraction has transferred some 4,500 cubic kilometers of water into the oceans. And the rate has increased since 1950 — and especially since 2000, when groundwater releases are estimated to have been raising sea level by some 0.40 millimeters per year. Konikow presents his analyses in a paper that will appear in Geophysical Research Letters. —Janet Raloff
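Those numbers hang together under a quick back-of-the-envelope check in Python. The ocean surface area (roughly 3.61 × 10^8 square kilometers) and the average 20th century rise (roughly 1.7 millimeters per year) are standard round figures supplied here, not values from the study:

    OCEAN_AREA_KM2 = 3.61e8     # assumed global ocean surface area
    volume_km3 = 4500           # groundwater moved into the oceans since 1900

    rise_mm = volume_km3 / OCEAN_AREA_KM2 * 1e6   # 1 km = 1e6 mm
    print(f"equivalent sea level rise: {rise_mm:.1f} mm")        # ~12.5 mm

    total_rise_mm = 1.7 * 110   # assumed average rise over ~110 years
    print(f"share of the total: {rise_mm / total_rise_mm:.1%}")  # ~6.7%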

Killer fungus caused extinction
A fungus may have contributed to the world’s great die-off some 250 million years ago. Paleontologists have puzzled over stringy-looking fossils called Reduviasporonites, which appear in rocks deposited worldwide during the mass extinction that ended the Permian period of geologic time. Now, scientists in the Netherlands, England and California suggest the fossils resemble modern-day soil fungi that include many plant pathogens. Fungi spreading through Permian-age plants may thus have helped kill off forests, the scientists proposed online in Geology on Aug. 5. —Alexandra Witze

Beyond Eyjafjallajökull
Europe had better be prepared for more volcanic ash clouds. Prompted by last year’s Eyjafjallajökull eruption in Iceland, a team including Graeme Swindles of the University of Leeds in England searched for signs of volcanic ash that fell across northern Europe over the last 7,000 years. Over the last millennium, the team found, ash clouds arrived on average about once every 56 years. For any given decade, the chance of an ashfall is 16 percent, the scientists reported August 5 in Geology. —Alexandra Witze
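The decade odds follow if ashfalls are treated as a Poisson process, striking at random with one event every 56 years on average (a modeling assumption implied, though not spelled out, in the brief):

    import math

    mean_interval_years = 56
    decade_years = 10
    p_at_least_one = 1 - math.exp(-decade_years / mean_interval_years)
    print(f"chance of ashfall in any given decade: {p_at_least_one:.0%}")  # 16%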


Found in: Earth and Environment

View the original article here