Unseen Is Free

Tuesday, December 1, 2015

Why Foods That Make You Fart Are a Good Thing

Although renowned for creating delight in children, farts are not considered the best way to make friends and influence people. But there is an upside: the production of gas means that your body is hosting the right kinds of bacteria. To encourage these ‘good’ bugs – known as our microbiome – we need to eat fibre.

Credit: CSIRO

“Fermentable components of dietary fibre have a critical role in feeding the gut microbiome,” said Dr Trevor Lockett, Head of the Gut Health and Nutrition Group at CSIRO Food and Nutrition.

“This part of fibre is fermented mostly to short chain fatty acids, a process which creates gas.”

Fermentation is a chemical process that breaks down carbohydrates in fibre: bacteria do it in our bowels to create food for themselves. Molecules that improve the health of their host – that’s you – are also produced. Dr Lockett presented an update on his group’s latest research at ‘Bugs, Bowels and Beyond’, the 2015 National Scientific Conference of the Australian Society for Medical Research held in Adelaide, South Australia last week.

Credit: CSIRO

He focused in particular on recent findings describing how different dietary components influence the microbiome and shape its production not only of gas, but also of molecules that are beneficial in the large intestine.

“For example, we know now that bacteria living in the large intestine produce a short chain fatty acid known as butyrate, which can reduce inflammation by stimulating regulatory immune cells,” he said.

“We’re now seeking to expand some of this work to see if we can improve inflammatory bowel disease.”

The component of food that survives the digestive processes of the stomach and small intestine to feed the microbiome in the large intestine is known as resistant starch.

You can improve the proportion of resistant starch in your diet by eating unrefined whole grains, pulses and legumes, unripe bananas and cooked and cooled foods such as potatoes, pasta and rice. Dr Lockett explained that in addition to conducting fundamental research, scientists at the CSIRO work across the fields of nutrition and agriculture for product development.

“As our understanding of the beneficial components of dietary fibre has improved, we’ve been able to inform colleagues who are growing grains for cereal purposes,” he explained.

“If we can include and enrich resistant starch in marketable grains, perhaps we can drive health benefits.”

Dr Lockett’s CSIRO colleague Dr Bianca Benassi-Evans – who is based in the South Australian Health and Medical Research Institute (SAHMRI) building – has recently trialed two non-genetically modified barley grains to determine their impact on bowel health. Comparing the two candidate CSIRO barley grains against regular barley and puffed rice as breakfast meals in a sample of 20 adults, she found both to have desirable features.

“Our grains increased the acidity in stool samples, and increased bowel production of butyrate, a short chain fatty acid,” she said.

“Both of these outcomes are biomarkers of good bowel health.”

Additional studies are currently under way, and Dr Benassi-Evans hopes these grains may eventually reach your supermarket's cereal aisle.

“What you choose in that supermarket can have dramatic effects on gut health,” she said.

Which breakfast products make you fart is a question you’ll have to determine yourself.

CSIRO Food and Nutrition in South Australia has laboratories and clinical consulting facilities at SAHMRI and The University of Adelaide.


Unveiling the Turbulent Times of a Dying Star

All the stars in the sky will eventually die - and some will really go out with a bang.

When a dying star goes supernova, it explodes with such ferocity that it outshines the entire galaxy in which it lived, spewing material and energy across unimaginable distances at near-light speed.
Visualization of a supernova emitting jets 
Credit: NASA/GSFC Dana Barry

In some cases, these cosmic cataclysms defy expectations, blasting not symmetrically in all directions - as an exploding firework might - but instead launching two narrow beams, known as jets, in opposite directions.

Understanding how these jets are created is a vexing challenge, but an international research team has recently employed powerful computer simulations to sleuth out some answers.

The team - led by Philipp Mösta (NASA Einstein Fellow at UC Berkeley), with Caltech researchers Christian Ott, David Radice and Luke Roberts, Perimeter Institute computational scientist Erik Schnetter, and Roland Haas of the Max Planck Institute for Gravitational Physics - published their findings Nov. 30 in Nature.

Their work sheds light on an explosive chain reaction that creates jets and, over time, helps create the structure of the universe as we know it.
Supercomputer visualization of the toroidal magnetic field in a collapsed, massive star, showing how in a span of 10 milliseconds the rapid differential rotation revs up the star's magnetic field to a million billion times that of our sun (yellow is positive, light blue is negative). Red and blue represent weaker positive and negative magnetic fields, respectively. From left to right are shown: 500m, 200m, 100m, and 50m simulations. 
Simulations and visualization by Philipp Mösta.

"We were looking for the basic mechanism, the core engine, behind how a collapsing star could lead to the formation of jets," said Schnetter, who designed computer programs for the simulations employed by the research team to model dying stars.

That core engine, the team discovered, is a highly turbulent place. Any turbulent system - like an aging car with a deteriorating suspension on a bumpy road - is bound to get progressively more chaotic. In certain types of supernovae, that turbulence is caused by what is known as magnetorotational instability - a type of rapid change within the magnetic field of a spinning system, like some stars.
Supercomputer visualization of the toroidal magnetic field in a collapsed, massive star, showing how in a span of 10 milliseconds the rapid differential rotation revs up the star's magnetic field to a million billion times that of our sun (yellow is positive, light blue is negative). Red and blue represent weaker positive and negative magnetic fields, respectively.
Credit: Robert R. Sisneros (NCSA) and Philipp Mösta.

Prior to the work of Schnetter and colleagues, this instability was believed to be a possible driver of jet-formation in supernovae, but the evidence to support that belief was scant.

Uncovering such evidence, Schnetter says, required something of a scientific perfect storm.

"You need to have the right people, with the right expertise and the right chemistry between them, you need to have the right understanding of physics and mathematics and computer science, and in the end you need the computer hardware that can actually run the experiment."

They assembled the right people and found the computational horsepower they needed at the University of Illinois at Urbana-Champaign.

The team used Blue Waters, one of the world's most powerful supercomputers, to run simulations of supernova explosions - simulations so complex that no typical computer could handle the number-crunching required. On Blue Waters, the simulations provided an unprecedented glimpse into the extreme magnetic forces at play in stellar explosions.

The 3D simulations revealed an inverse cascade of magnetic energy in the core of spinning stars, which builds up with enough intensity to launch jets from the stellar poles.

Though the simulations do not take into account every chaotic variable inside a real supernova, they achieve a new level of understanding that will drive follow-up research with more specialized simulations.

Deepening our understanding of supernova explosions is an ongoing process, Schnetter says, and one that may help us better understand the origins of - to borrow a phrase from Douglas Adams - life, the universe, and everything.

The formation of galaxies, stars, and even life itself are fundamentally connected to energy and matter blasted outward in exploding stars. Even our own Sun, which supports all life on our planet, is known to be the descendant of earlier supernovae.

So the study of stellar explosions is, Schnetter says, deeply connected to some of the most fundamental questions humans can ask about the universe. A nice bonus, he adds, is that supernovae are also really awesome explosions.

"These are some of the most powerful events in the universe," he says. "Who wouldn't want to know more about that?"

Contacts and sources:
Eamon O'Flynn
Perimeter Institute for Theoretical Physics

Missing Link Found Between Turbulence In Collapsing Star And Hypernova, Gamma-Ray Burst

A supercomputer simulation of a mere 10 milliseconds in the collapse of a massive star into a neutron star proves that these catastrophic events, often called hypernovae, can generate the enormous magnetic fields needed to explode the star and fire off bursts of gamma rays visible halfway across the universe.

The results of the simulation, published online Nov. 30 in advance of publication in the journal Nature, demonstrate that as a rotating star collapses, the star and its attached magnetic field spin faster and faster, forming a dynamo that revs the magnetic field to a million billion times the magnetic field of Earth.

A field this strong is sufficient to focus and accelerate gas along the rotation axis of the star, creating two jets that ultimately can produce oppositely directed blasts of highly energetic gamma rays.

The first electrical generators were dynamos, generating current as wires rotated through a magnetic field. Stellar dynamos generate electrical currents as magnetic fields move through space, while the currents in turn boost the magnetic field, resulting in a feedback loop that produces monster magnetic fields.
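The feedback loop described above implies roughly exponential field growth, which is why such a short window can suffice. A hypothetical back-of-envelope (the e-folding time below is an illustrative assumption, not a figure from the paper) shows how a million-billion-fold amplification can fit inside roughly 10 milliseconds:

```python
import math

# If a dynamo feedback loop grows the field exponentially,
# B(t) = B0 * exp(t / tau), then amplifying the field a million billion
# (1e15) times takes ln(1e15) e-foldings, whatever the starting field.
amplification = 1e15
e_folds = math.log(amplification)
print(round(e_folds, 1))  # ~34.5 e-foldings

# With an assumed, purely illustrative e-folding time of ~0.3 ms, the
# whole amplification fits inside a ~10 ms simulation window.
tau_ms = 0.29
print(round(e_folds * tau_ms, 1), "ms")
```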

"A dynamo is a way of taking the small-scale magnetic structures inside a massive star and converting them into larger and larger magnetic structures needed to produce hypernovae and long gamma-ray bursts," said Philipp Mösta, a UC Berkeley postdoctoral fellow and first author of the paper. "That kicks off the process."

"People had believed this process could work out," he said. "Now we actually show it."

Key to this success was a computer simulation at finer detail than ever before, though one that required 130,000 computer cores operating in parallel over a span of two weeks on Blue Waters, one of the most powerful supercomputers in the world. It is located at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.

Hypernovae produce heavy elements

Astrophysicists like Mösta are trying to improve their models of what stars do when they reach the ends of their lives, hoping to explain strange cosmic phenomena - like gamma-ray bursts and hypernovae that flash 10 times brighter than the average supernova - and understand how some of the very heavy elements found in nature are made.

"Now we have the first prototype model that allows us to ask the question, How are heavy elements made in these powerful supernova explosions?" said Eliot Quataert, a UC Berkeley professor of astronomy who was not involved with the study.

Supercomputer visualization of the toroidal magnetic field in a collapsed, massive star, showing how in a span of 10 milliseconds the rapid differential rotation revs up the star's magnetic field to a million billion times that of our sun (yellow is positive, light blue is negative). Red and blue represent weaker positive and negative magnetic fields, respectively.
Credit: Robert R. Sisneros (NCSA) and Philipp Mösta.

"The breakthrough here is that Philipp's team starts from a relatively weak magnetic field and shows it building up to be a very strong and large-scale coherent magnetic field of the kind that is usually assumed to be there when people make models of gamma-ray bursts," Quataert said.

Brightest events in universe

Gamma-ray bursts are so brief and energetic - long bursts last about 100 seconds, with wavelengths far outside the visible or ultraviolet bands - that they were not observed until 1967 by satellites looking for evidence of nuclear bomb tests. Most are billions of light years away in distant galaxies, so the fact we can see them at all means they are among the brightest events in the universe.

Observations over the last 50 years have led astronomers to propose that the bursts are produced during the extremely rare explosions of massive stars - stars 25 times the mass of the sun or larger - but the details of how such a hypernova generates focused beams of gamma rays are still being worked out. These stellar explosions are typically classified as Type Ic broad-line supernovae.

It is thought that jets held together by ultra-strong magnetic fields are required to power these explosions, Mösta said, but one of the missing links was how a star with a normal magnetic field, like that of the sun, could amplify it a quadrillion (10¹⁵) times. One possibility is that energy stored in the rotation of the collapsed star could be transformed into magnetic energy. These strong magnetic fields may also be critical to help accelerate charged particles to a speed and energy able to generate a gamma ray.

"We expect only a small fraction of stars to be spinning rapidly enough before collapse to explain pulsar spin periods of milliseconds," said co-author Christian Ott, a professor of theoretical astrophysics at the California Institute of Technology. "But if a star is spinning this fast, then there is a lot of energy in the rotation. The problem has been how to extract that and dump it into the explosion."

Creating ultra-strong magnetic fields

A core-collapse supernova occurs when hydrogen fusion in the core - which powers stars during most of their lifetime - stops after all the hydrogen is used up and the star begins to fuse helium and then carbon and oxygen. When the star finally fuses all these elements into iron, fusion stops entirely and the pressure at the core of the star can no longer support the gravitational weight of the surrounding material.

Within one second, the inner star out to a radius of about 1,500 kilometers collapses to a neutron star about 10 to 15 kilometers across, containing the mass of about 1.4 suns. This creates an outward-moving shock wave that plows into the outer layers of the star. As the inner star collapses to a neutron star, it increases its spin just as pirouetting ice skaters spin faster as they draw in their arms.
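The ice-skater analogy can be made quantitative with conservation of angular momentum. A rough sketch using the radii quoted above, plus the simplifying assumption of a uniform sphere so that the moment of inertia scales as MR²:

```python
# Conservation of angular momentum: I * omega = const, with I ~ M * R^2
# for a uniform sphere (a simplification for illustration). Collapsing
# from ~1,500 km down to ~12.5 km spins the core up by (R_i / R_f)^2.
R_initial_km = 1500.0   # inner star before collapse (from the article)
R_final_km = 12.5       # neutron star, midpoint of the 10-15 km range
spin_up = (R_initial_km / R_final_km) ** 2
print(f"spin-up factor ~ {spin_up:.0f}x")  # ~14400x faster rotation
```

The same squared scaling is what whips up the differential rotation that feeds the dynamo.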

Theorists have attempted to explain how massive, rotating stars generate strong magnetic fields after they have collapsed by a process called magnetorotational instability: Layers of the star rotate at different speeds, creating turbulence that molds the embedded magnetic fields into kilometer-wide magnetic flux tubes much like magnetic flares on the sun. But can this process generate the much larger-scale magnetic fields needed to drive an explosion?

"What we have done are the first global extremely high-resolution simulations of this that actually show that you create this large global field from a purely turbulent one," Mösta said. "The simulations also demonstrate a mechanism to form magnetars, neutron stars with an extremely strong magnetic field, which may be driving a particular class of very bright supernovae."

Quataert compares the process to how small-scale turbulence in Earth's atmosphere coalesces into large-scale hurricanes.

Mösta and his colleagues found that the key to this process in a rapidly rotating neutron star is a shear zone about 15 to 35 kilometers from the star where the different layers are rotating at very different speeds, causing turbulence large enough to create a dynamo.

Mösta is working on simulations that encompass more than 10 milliseconds of the star's evolution after collapse, or "post-bounce," to better understand how the collapsing matter and outflowing material interact with the swirling magnetic fields.

Other co-authors with Mösta and Ott are David Radice and Luke Roberts of Caltech in Pasadena, Erik Schnetter of the Perimeter Institute and the University of Guelph in Ontario, Canada, and Roland Haas of the Max Planck Institute for Gravitational Physics in Potsdam-Golm, Germany.

This research was supported by the National Science Foundation (AST-1212170, PHY-1151197, OCI-0905046), the National Aeronautics and Space Administration's Einstein Fellowship Program and the Sherman Fairchild Foundation.

Contacts and sources:
Robert Sanders
University Of California - Berkeley

Monday, November 30, 2015

Rare Fossil of a Horned Dinosaur Found from 'Lost Continent'

A rare fossil from eastern North America of a dog-sized horned dinosaur has been identified by a scientist at the University of Bath. The fossil provides evidence of an east-west divide in North American dinosaur evolution.

During the Late Cretaceous period, 66-100 million years ago, the land mass that is now North America was split into two continents by a shallow sea, the Western Interior Seaway, which ran from the Gulf of Mexico to the Arctic Ocean. Dinosaurs living in the western continent, called Laramidia, were similar to those found in Asia.

However, few fossils of animals from the eastern 'lost continent' of Appalachia have been found because these areas are densely vegetated, making it difficult to discover and excavate fossils.

This is an illustration of the ceratopsian dinosaur from the Late Cretaceous period of eastern North America

Credit: Dr Nick Longrich

Dr Nick Longrich, from the Milner Centre for Evolution based in the University of Bath's Department of Biology & Biochemistry, studied one of these rare fossils, a fragment of a jaw bone kept in the Peabody Museum at Yale University. It turned out to be a member of the horned dinosaurs - the Ceratopsia. His study, published in the journal Cretaceous Research, highlights it as the first fossil from a ceratopsian dinosaur identified from this period of eastern North America.

Ceratopsia is a group of plant-eating horned dinosaurs that lived in the Cretaceous period. The fossil in question comes from the leptoceratopsids, smaller cousins of the better-known Triceratops, and the animal was about the size of a large dog.

The specimen studied by Longrich was too incomplete to identify the exact species accurately, but showed a strange twist to the jaw, causing the teeth to curve downward and outwards in a beak shape. The jaw was also more slender than that of Ceratopsia found in western North America, suggesting that these dinosaurs had a different diet to their western relatives, and had evolved along a distinct evolutionary path.

Dr Nick Longrich explained: "Just as many animals and plants found in Australia today are quite different to those found in other parts of the world, it seems that animals in the eastern part of North America in the Late Cretaceous period evolved in a completely different way to those found in the western part of what is now North America due to a long period of isolation.

"This adds to the theory that these two land masses were separated by a stretch of water, stopping animals from moving between them, causing the animals in Appalachia to evolve in a completely different direction, resulting in some pretty weird looking dinosaurs.

"Studying fossils from this period, when the sea levels were very high and the landmasses across the Earth were very fragmented, is like looking at several independent experiments in dinosaur evolution.

"At the time, many land masses - eastern North America, Europe, Africa, South America, India, and Australia - were isolated by water.

"Each one of these island continents would have evolved its own unique dinosaurs, so there are probably many more species out there to find."

Contacts and sources:
Vicky Just
University of Bath

Dopamine Affects Learning: Abrupt Dopamine Increases Occur as a Person Perceives Stimuli That Predict Rewards

If you've ever lacked the drive to start a new project, try imagining the joy of completing it, say University of Michigan researchers.

Both the motivation to get started and the satisfaction of finishing, they say, are a function of dopamine.

Credit: University of Michigan 

In a new study, U-M researchers Arif Hamid and Joshua Berke, professor of psychology and biomedical engineering, argue that dopamine levels continuously signal how good or valuable the current situation is for obtaining a reward. This message helps people decide how vigorously to work toward a goal, while also allowing them to learn from mistakes.

"We provide a new theoretical account for how dopamine affects learning (what to do later) and motivation (getting fired up to go now) simultaneously," said study lead author Hamid, U-M neuroscience doctoral student.

For many years, researchers have known that dopamine is important for arousal, movement, mood and executing activities with haste and vigor. Aspects of these normal dopamine functions are highlighted in disorders, such as Parkinson's disease and depression. Drugs that elevate brain dopamine levels, like cocaine or amphetamines, produce euphoric feelings of well-being, in addition to heightened arousal and attention.

Aside from affecting immediate mood and behavior, dopamine also produces changes in the brain that are persistent, sometimes lasting a lifetime.

"This is basically how we stamp in memories of what the smell of cookies or the McDonald's sign mean: predictors of delicious, calorie rich rewards," Hamid said.

An abrupt rise in dopamine when a person perceives stimuli that predict rewards is a dominant mechanism of reward learning within the brain, a concept similar to Russian physiologist Ivan Pavlov's dogs hearing a bell and salivating in response to the stimulus, he said.

Hamid said the precise mechanism by which one neurotransmitter can achieve both invigorating and learning functions is counterintuitive, and decades of neuropsychological research have attempted to resolve exactly how it does so.

One theory, spearheaded by U-M psychologists Kent Berridge and Terry Robinson, suggests that dopamine invigorates actions toward desired goals. For example, rats with almost no brain dopamine will not retrieve food a few inches away while they're starving.

Another theory suggests dopamine is a "teaching signal," like a coach who tells his player "good job" or "bad job" to encourage a future reward. In the current study, U-M researchers describe those dopamine fluctuations as a continuous cheer to motivate, with brief moments of criticism.
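The "teaching signal" idea is commonly formalized as a reward prediction error. A minimal temporal-difference sketch (a textbook toy model, not the authors' analysis, with an arbitrary learning rate) shows how the error, and the hypothetical dopamine burst it stands in for, shrinks as a cue comes to predict the reward:

```python
# Toy temporal-difference update: the prediction error delta plays the
# role of the dopamine "good job / bad job" signal. As the cue's value
# is learned, the per-trial error shrinks toward zero.
alpha = 0.5            # learning rate (arbitrary, for illustration)
reward = 1.0
cue_value = 0.0        # learned prediction V(cue)
errors = []

for trial in range(10):
    delta = reward - cue_value     # prediction error on this trial
    errors.append(delta)
    cue_value += alpha * delta     # update the cue's learned value

print(round(errors[0], 3), round(errors[-1], 3))  # 1.0 0.002
```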

They measured dopamine levels in rats while they performed a decision-making task, and compared it with how motivated the rats were and how much they learned. They also increased dopamine levels to artificially motivate the rats and repeatedly made them learn to perform actions that did not produce rewards.

The findings appear in the current issue of Nature Neuroscience.

The study's other authors include Jeffrey Pettibone, Omar Mabrouk, Vaughn Hetrick, Robert Schmidt, Caitlin Vander Weele, Robert Kennedy and Brandon Aragona.

The work was supported by grants from the National Institute on Drug Abuse (DA032259, DA007281), National Institute of Mental Health (MH093888, MH101697), National Institute on Neurological Disorders and Stroke (NS078435, NS076401) and National Institute of Biomedical Imaging and Bioengineering (EB003320). R.S. was supported by the BrainLinks-BrainTools Cluster of Excellence funded by the German Research Foundation (DFG grant number EXC1086).

Contacts and sources:
Jared Wadley
University of Michigan

Rice Basket Study Rethinks Roots Of Human Culture

A new study from the University of Exeter has found that teaching is not essential for people to learn to make effective tools.

The results counter established views about how human tools and technologies come to improve from generation to generation and point to an explanation for the extraordinary success of humans as a species. The study reveals that although teaching is useful, it is not essential for cultural progress because people can use reasoning and reverse engineering of existing items to work out how to make tools.

Although teaching is useful, it is not essential for cultural progress. 
  Rice in basket
Credit: Wikipedia

The capacity to improve the efficacy of tools and technologies from generation to generation, known as cumulative culture, is unique to humans and has driven our ecological success. It has enabled us to inhabit the coldest and most remote regions on Earth and even maintain a permanent base in space. Why our cumulative culture has boomed while that of other species has not, however, remains a mystery.

It had long been thought that the human capacity for cumulative culture was down to special methods of learning from others - such as teaching and imitation - that enable information to be transmitted with high fidelity.

To test this idea, the researchers recreated conditions encountered during human evolution by asking groups of people to build rice baskets from everyday materials. Some people made baskets alone, while others were in ’transmission chain’ groups, where each group member could learn from the previous person in the chain either by imitating their actions, receiving teaching or simply examining the baskets made by previous participants.

Teaching produced the most robust baskets but after six attempts all groups showed incremental improvements in the amount of rice their baskets could carry.

Dr Alex Thornton from the Centre for Ecology and Conservation at the University of Exeter’s Penryn Campus in Cornwall said: “Our study helps uncover the process of incremental improvements seen in the tools that humans have used for millennia. While a knowledgeable teacher clearly brings important advantages, our study shows that this is not a limiting factor to cultural progress. Humans do much more than learn socially, we have the ability to think independently and use reason to develop new ways of doing things. This could be the secret to our success as a species.”

The results of the study shed light on ancient human society and help to bridge the cultural chasm between humans and other species. The researchers say that to fully understand those elements that make us different from other animals, future work should focus on the mental abilities of individuals and not solely mechanisms of social learning.

'Cognitive requirements of cumulative culture: teaching is useful but not essential' by Elena Zwirner and Alex Thornton is published in Scientific Reports.

Contacts and sources:
University of Exeter
Louise Vennells

Prescription for Bullying: Kids on ADHD Meds Twice As Likely To Face Bullies

Kids and teens who take medications like Ritalin to treat attention-deficit hyperactivity disorder are twice as likely to be physically or emotionally bullied by peers than those who don't have ADHD, a new University of Michigan study found.

At even higher risk were middle and high school students who sold or shared their medications--those kids were four-and-a-half times likelier to be victimized by peers than kids without ADHD.

The main findings are the same for both sexes, said the study's first author, Quyen Epstein-Ngo, research assistant professor at the U-M Institute for Research on Women and Gender and a fellow at the U-M Injury Center. Carol Boyd, professor of nursing, is the principal investigator.

Credit: The Simpsons

It's long been known that kids with ADHD have a harder time making and keeping friends and are bullied and victimized more often. This study is believed to be the first to look at how stimulant medications affect their relationships with peers.

"Many youth with ADHD are prescribed stimulant medications to treat their ADHD and we know that these medications are the most frequently shared or sold among adolescents," said Epstein-Ngo, a licensed clinical psychologist.

The U-M researchers surveyed nearly 5,000 middle and high school students over four years. About 15 percent were diagnosed with ADHD and roughly 4 percent were prescribed stimulants within the past 12 months.

Of those who took ADHD meds, 20 percent reported being approached to sell or share them, and about half of them did.
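A rough back-of-envelope on the figures quoted above (the article says "nearly 5,000" and "roughly 4 percent," so these counts are illustrative approximations, not the study's exact numbers):

```python
# Approximate counts implied by the survey percentages; all inputs are
# rounded figures from the article, so treat the outputs as rough.
students = 5000
with_adhd = round(students * 0.15)     # ~750 diagnosed with ADHD
prescribed = round(students * 0.04)    # ~200 prescribed stimulants
approached = round(prescribed * 0.20)  # ~40 asked to sell or share
diverted = round(approached * 0.5)     # ~20 who actually did
print(with_adhd, prescribed, approached, diverted)  # 750 200 40 20
```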

Credit: nobullying.com 

When looking at the overall figures, relatively few students were asked to divert their medications or did. However, Epstein-Ngo said the numbers don't tell the entire story.

"Having a diagnosis of ADHD has lifelong consequences," she said. "These youth aren't living in isolation. As they transition into adulthood, the social effects of their ADHD diagnosis will impact a broad range of people with whom they come into contact."

From 2003 to 2011, there was a 42 percent increase in ADHD cases diagnosed in the U.S., and between 2007 and 2011, there was a 27 percent increase in stimulant-treated ADHD.

Epstein-Ngo said the findings shouldn't scare parents away from considering a stimulant medication. Rather, the study reinforces why parents must talk to kids about never sharing their medications.

"For some children stimulant medications are immensely helpful in getting through school," Epstein-Ngo said. "This study doesn't say 'don't give your child medication.' It suggests that it's really important to talk to your children about who they tell."

It's unclear why kids with prescriptions for stimulant medications are more at risk for bullying and victimization, but Epstein-Ngo said it's probably several factors.

"Is it a function of the fact that they are in riskier situations, or are they being coerced and forced to give up their medications? Probably a little bit of both," she said.

Epstein-Ngo believes the biggest takeaway is to have compassion for kids with ADHD.

"I think the biggest misconception about ADHD is that these kids aren't trying hard enough, and that's just not the case," she said. "If these kids could do better they would. With the proper support and treatment they can overcome this." 

The study, "Diversion of ADHD stimulants and peer victimization among adolescents," is scheduled to appear in the Journal of Pediatric Psychology. It was funded by the National Institute on Drug Abuse.

Contacts and sources: 
Laura Bailey
University of Michigan

Earth's First Ecosystems Were More Complex Than Previously Thought, Study Finds

Computer simulations have allowed scientists to work out how a puzzling 555-million-year-old organism with no known modern relatives fed, revealing that some of the first large, complex organisms on Earth formed ecosystems that were much more complex than previously thought.

The international team of researchers from Canada, the UK and the USA, including Dr Imran Rahman from the University of Bristol, UK, studied fossils of an extinct organism called Tribrachidium, which lived in the oceans some 555 million years ago. Using a computer modelling approach called computational fluid dynamics, they were able to show that Tribrachidium fed by collecting particles suspended in the water, a strategy known as suspension feeding that had not previously been documented in organisms from this period of time.

Computer simulations have allowed scientists, led by Dr Imran Rahman of the University of Bristol, UK to work out how this 555-million-year-old organism with no known modern relatives fed. Their research reveals that some of the first large, complex organisms on Earth formed ecosystems that were much more complex than previously thought.
Credit: M. Laflamme

Tribrachidium lived during a period of time called the Ediacaran, which ranged from 635 million to 541 million years ago. This period was characterised by a variety of large, complex organisms, most of which are difficult to link to any modern species. It was previously thought that these organisms formed simple ecosystems characterised by only a few feeding modes, but the new study suggests they were capable of more types of feeding than previously appreciated.

Dr Simon Darroch, an Assistant Professor at Vanderbilt University, said: "For many years, scientists have assumed that Earth's oldest complex organisms, which lived over half a billion years ago, fed in only one or two different ways. Our study has shown this to be untrue: Tribrachidium and perhaps other species were capable of suspension feeding. This demonstrates that, contrary to our expectations, some of the first ecosystems were actually quite complex."

Co-author Dr Marc Laflamme, an Assistant Professor at the University of Toronto Mississauga, added: "Tribrachidium doesn't look like any modern species, and so it has been really hard to work out what it was like when it was alive. The application of cutting-edge techniques, such as CT scanning and computational fluid dynamics, allowed us to determine, for the first time, how this long-extinct organism fed."

Such simulations have allowed scientists, led by Dr Imran Rahman of the University of Bristol, UK, to work out how this 555-million-year-old organism with no known modern relatives fed. Their research reveals that some of the first large, complex organisms on Earth formed ecosystems that were much more complex than previously thought.
Credit: I.A. Rahman

Computational fluid dynamics is a method for simulating fluid flows that is commonly used in engineering, for example in aircraft design, but this is one of the first applications of the technique in palaeontology (following up previous research carried out at Bristol).

Dr Rahman, a Research Fellow in Bristol's School of Earth Sciences said: "The computer simulations we ran allowed us to test competing theories for feeding in Tribrachidium. This approach has great potential for improving our understanding of many extinct organisms."

Co-author Dr Rachel Racicot, a postdoctoral researcher at the Natural History Museum of Los Angeles County added: "Methods for digitally analysing fossils in 3D have become increasingly widespread and accessible over the last 20 years. We can now use these data to address any number of questions about the biology and ecology of ancient and modern organisms."

The research was funded by the UK's Royal Commission for the Exhibition of 1851.

The study is published today in the journal Science Advances.

Contacts and sources:
Hannah Johnson
University of Bristol

Citation: 'Suspension feeding in the enigmatic Ediacaran organism Tribrachidium demonstrates complexity of Neoproterozoic ecosystems' by Imran A. Rahman, Simon A. F. Darroch, Rachel A. Racicot and Marc Laflamme in Science Advances.

Friday, November 27, 2015

Vitamin B May Counter Negative Effects of Pesticide on Fertility

Women exposed to DDT who have adequate B vitamin intake more likely to get and stay pregnant than those who are deficient

Women who have adequate levels of B vitamins in their bodies are more likely to get and stay pregnant even when they also have high levels of a common pesticide known to have detrimental reproductive effects, according to new Johns Hopkins Bloomberg School of Public Health research.

Credit: Johns Hopkins Bloomberg School of Public Health

The findings, published in the December issue of the American Journal of Clinical Nutrition, suggest that B vitamins may have a protective effect that counteracts the levels of DDT in their bodies. DDT, a known endocrine disruptor, is still used to kill mosquitoes in many countries where malaria remains a serious public health concern. The United States banned the pesticide in 1972; China, where the study was conducted, followed suit in 1984. DDT, however, can remain in the body and environment for decades.

“Our previous work has shown that high levels of DDT in the body can increase the risk of early miscarriage,” says study leader Xiaobin Wang, MD, ScD, MPH, the Zanvyl Krieger Professor and Director of the Center on the Early Life Origins of Disease at the Johns Hopkins Bloomberg School of Public Health. “This study tells us that improved nutrition may modify the toxic effects of DDT, by better preparing the body to cope with environmental toxins and stressors. We have shown that women with high levels of DDT who also had high levels of B vitamins had a better chance of getting and staying pregnant than those who were deficient in those vitamins.”

The findings suggest that looking at toxins and nutrition in isolation doesn’t paint a full picture of how these different factors intersect. Wang says studies like this one may provide a model for how future research groups can examine the relationship between other toxins and nutrients.

For the study, conducted between 1996 and 1998, Wang and her colleagues recruited and followed a group of female Chinese textile workers who were trying to get pregnant. Every day for up to a year, researchers tested the urine of the women in the study, detecting their levels of hCG, the hormone that signals conception. This approach allowed researchers to determine whether a woman was pregnant much earlier than she might normally find out, in the days or even weeks before she realized she’d missed a menstrual period. It also allowed the researchers to determine whether women had an early pregnancy loss (miscarriage before six weeks of pregnancy). Levels of DDT, DDE (a major degradation product of DDT) and B vitamins were measured in the women before conception.

A 2003 study by Wang and her colleagues showed that one-third of all conceptions end before women even know they are pregnant. “This is a very vulnerable period,” she says.

Among the 291 women ultimately included in the study, there were 385 conceptions, 31 percent of which were lost before six weeks. Women with high DDT levels and sufficient levels of vitamin B had a 42 percent greater chance of early miscarriage than women with lower DDT levels. But women with high DDT levels and vitamin B deficiencies were twice as likely to suffer a miscarriage before six weeks of gestation. The researchers looked at three types of B vitamins – B-6, B-12 and folic acid – and determined that the risk to a pregnancy was higher with B-12 and folic acid deficiencies and with deficiencies in more than one type of B vitamin.
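The group comparisons above boil down to risk ratios: the incidence of early loss in an exposed group divided by the incidence in a reference group. A minimal sketch of that arithmetic follows; the counts are invented for illustration and are not the study's actual group sizes.

```python
# Relative risk: incidence in an exposed group divided by incidence
# in a reference group. All counts below are hypothetical.
def relative_risk(events_exposed, n_exposed, events_ref, n_ref):
    return (events_exposed / n_exposed) / (events_ref / n_ref)

# Reference group (low DDT): 100 early losses per 1,000 conceptions (invented).
rr_high_ddt_sufficient_b = relative_risk(142, 1000, 100, 1000)  # 1.42: 42% greater risk
rr_high_ddt_deficient_b = relative_risk(200, 1000, 100, 1000)   # 2.0: twice the risk
```

A risk ratio of 1.0 would mean no difference between the groups; the sketch simply shows how "42 percent greater" and "twice as likely" translate into ratios of 1.42 and 2.0.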

The researchers also found that women with high DDT and low B vitamin levels took nearly twice as long to conceive in the first place.

The standard of care in many nations is to give an iron-folate supplement to women once they seek prenatal health care, which typically occurs between eight and 12 weeks of gestation, if at all. But that supplement is rarely taken prior to conception, meaning it likely comes too late to prevent early pregnancy loss. And unlike in the United States, where many foods are fortified with folic acid, that is not the norm around the world.

Difficulty conceiving is prevalent in both developed and developing countries. In the United States, the percentage of married women between the ages of 15 and 44 who had difficulty achieving and maintaining pregnancy increased from 8 percent in 1982 to 11.8 percent in 2002, an increase that cannot be completely explained by the age of the women.

Better nutrition – including fortifying foods with B vitamins – in countries where DDT is still in wide use could improve pregnancy outcomes, Wang says. She says there may also be implications in the United States, particularly among women who immigrate from countries where DDT is still common and among low-income women whose diets may not include foods high in B vitamins such as leafy green vegetables and beans. Even women from the United States may have DDT in their systems, which may come from imported foods or even from local food grown in soil still contaminated with DDT.

“Health care providers need to make sure women get adequate micronutrients including B vitamins in their diets not only during pregnancy but before they even conceive,” she says. “Otherwise, we may miss that critical window.”

“Preconception serum 1,1,1-trichloro-2,2-bis(p-chlorophenyl)ethane and B-vitamin status: independent and joint effects on women’s reproductive outcomes” was written by Fengxiu Ouyang, Matthew P Longnecker, Scott A Venners, Sara Johnson, Susan Korrick, Jun Zhang, Xiping Xu, Parul Christian, Mei-Cheng Wang and Xiaobin Wang.

The research was supported in part by grants from the National Institutes of Health’s National Institute of Child Health and Human Development (R01 HD32505), the National Institute of Environmental Health Sciences (R01 ES11682 and R03 ES022790), and the Intramural Research Program of the National Institute of Environmental Health Sciences.

Contacts and sources:
Johns Hopkins Bloomberg School of Public Health

Researchers Find Link Between Air Pollution and Heart Disease

Researchers from the Johns Hopkins Bloomberg School of Public Health have found a link between higher levels of a specific kind of air pollution in major urban areas and an increase in cardiovascular-related hospitalizations such as for heart attacks in people 65 and older.

The findings, published in the November issue of Environmental Health Perspectives, are the strongest evidence to date that coarse particulate matter – airborne pollutants that range in size from 2.5 to 10 microns in diameter and can be released into the air by farming, construction projects or even wind in the desert – affects public health. It has long been understood that smaller particles, which typically come from automobile exhaust or power plants, can damage the lungs and even enter the bloodstream. This is believed to be the first study that clearly implicates the larger particles, which are still smaller in diameter than a human hair.

Schematic drawing of the causes and effects of air pollution: (1) greenhouse effect, (2) particulate contamination, (3) increased UV radiation, (4) acid rain, (5) increased ground level ozone concentration, (6) increased levels of nitrogen oxides.
Credit: Wikimedia Commons

“We suspected that there was an association between coarse particles and health outcomes, but we didn’t have the research to back that up before,” says study leader Roger D. Peng, PhD, an associate professor of biostatistics at the Bloomberg School. “This work provides the evidence, at least for cardiovascular disease outcomes. I don’t feel like we need another study to convince us. Now it’s time for action.”

The researchers also studied respiratory diseases but did not find a correlation between high levels of coarse particles and hospitalizations for those illnesses.

For the national study, Peng and his colleagues studied data from an air monitoring network set up by the U.S. Environmental Protection Agency (EPA) in 110 large urban counties in the United States and linked it to Medicare data on hospitalizations in those same areas from 1999 to 2010. The hospitalizations covered people ages 65 and older.

Counties were included in the study if they had more than 20,000 Medicare enrollees in 2010 and had equipment that monitored fine and coarse particles for at least 200 days of the study period. Over that period, there were 6.37 million cardiovascular and 2.51 million respiratory emergency hospitalizations across the 110 counties.

Smog in New York City
Credit: Wikimedia Commons

The researchers found that on days when coarse particle levels were higher, cardiovascular hospitalizations were also higher that same day. They did not find a correlation in the following days.

As part of the Clean Air Act, the EPA more closely regulates finer particles, which are more likely to come from manmade sources. States work to reduce those levels through various mechanisms, including stronger car emissions standards or adding scrubbers to coal-fired power plants. In some areas, coarse particles may be more difficult to reduce, as they can come from natural sources.

The coarse particles enter the respiratory tract and can trigger systemic health problems, though the mechanism is not fully understood.

The findings varied by geographic region. While there were higher concentrations of coarse particles found in the western United States, there were more cardiovascular events requiring hospitalization in the eastern United States.

“Just because the particles are the same size doesn’t mean they are made of the same material,” Peng says. “It’s possible that the chemical composition of the particles in the east could make them more toxic.”

Peng says that the EPA’s monitoring network is not designed to measure coarse particles, and that a national monitoring network for particles of that size may be needed. In the past, he says, the EPA has proposed tighter regulations on coarse particles, but they were never finalized, in part because there wasn’t enough evidence.

“It’s worth revisiting given this new data,” he says.

Contacts and sources: 
Stephanie Desmon
Johns Hopkins Bloomberg School of Public Health

Citation: “Ambient Coarse Particulate Matter and Hospital Admissions in the Medicare Cohort Air Pollution Study, 1999-2010” was written by Helen Powell, Jenna R. Krall, Yun Wang, Michelle L. Bell and Roger D. Peng.

The research was supported by grants from the National Institutes of Health’s National Institute of Environmental Health Sciences (R01ES019560, R01ES019587 and R21ES021427); the National Institute on Aging (T32AG000247) and the EPA.

Thursday, November 26, 2015

Rapid Plankton Growth in Ocean Seen As Sign of Carbon Dioxide Loading

A microscopic marine alga is thriving in the North Atlantic to an extent that defies scientific predictions, suggesting swift environmental change as a result of increased carbon dioxide in the ocean, a study led by a Johns Hopkins University scientist has found.

What these findings mean remains to be seen, as does whether the rapid growth in the tiny plankton's population is good or bad news for the planet.

A scanning electron microscope image of a coccolithophore, which can measure from 5 to 15 microns across, less than a fifth the width of a human hair.  

Image: Amy Wyeth, Bigelow Laboratory for Ocean Sciences

Published recently in the journal Science, the study details a tenfold increase in the abundance of single-cell coccolithophores between 1965 and 2010, and a particularly sharp spike since the late 1990s in the population of these pale-shelled floating phytoplankton.

"Something strange is happening here, and it's happening much more quickly than we thought it should," said Anand Gnanadesikan, associate professor in the Morton K. Blaustein Department of Earth and Planetary Sciences at Johns Hopkins and one of the study's five authors.

Gnanadesikan said the Science report certainly is good news for creatures that eat coccolithophores, but it's not clear what those are. "What is worrisome," he said, "is that our result points out how little we know about how complex ecosystems function." The result highlights the possibility of rapid ecosystem change, suggesting that prevalent models of how these systems respond to climate change may be too conservative, he said.

The team's analysis of Continuous Plankton Recorder survey data from the North Atlantic Ocean and North Sea since the mid-1960s suggests rising carbon dioxide in the ocean is causing the coccolithophore population spike, said Sara Rivero-Calle, a Johns Hopkins doctoral student and lead author of the study. A stack of laboratory studies supports the hypothesis, she said. Carbon dioxide is a greenhouse gas already fingered by scientific consensus as one of the triggers of global warming.

"Our statistical analyses on field data from the CPR point to carbon dioxide as the best predictor of the increase" in coccolithophores, Rivero-Calle said. "The consequences of releasing tons of CO2 over the years are already here and this is just the tip of the iceberg."

The CPR survey is a continuing study of plankton, floating organisms that form a vital part of the marine food chain. The project was launched by a British marine biologist in the North Atlantic and North Sea in the early 1930s. It is conducted by commercial ships trailing mechanical plankton-gathering contraptions through the water as they sail their regular routes.

William M. Balch of the Bigelow Laboratory for Ocean Sciences in Maine, a co-author of the study, said scientists might have expected that ocean acidity due to higher carbon dioxide would suppress these chalk-shelled organisms. It didn't. On the other hand, their increasing abundance is consistent with a history as a marker of environmental change.

"Coccolithophores have been typically more abundant during Earth's warm interglacial and high CO2 periods," said Balch, an authority on the algae. "The results presented here are consistent with this and may portend, like the 'canary in the coal mine,' where we are headed climatologically."

Coccolithophores are single-cell algae that cloak themselves in a distinctive cluster of pale disks made of calcium carbonate, or chalk. They play a role in cycling calcium carbonate, a factor in atmospheric carbon dioxide levels. In the short term they make it more difficult to remove carbon dioxide from the atmosphere, but in the long term—tens and hundreds of thousands of years—they help remove carbon dioxide from the atmosphere and oceans and confine it in the deep ocean.

In vast numbers and over eons, coccolithophores have left their mark on the planet, helping to show significant environmental shifts. The White Cliffs of Dover are white because of massive deposits of coccolithophores. But closer examination shows the white deposits interrupted by slender, dark bands of flint, a product of organisms that have glassy shells made of silicon, Gnanadesikan said.

"These clearly represent major shifts in ecosystem type," Gnanadesikan said. "But unless we understand what drives coccolithophore abundance, we can't understand what is driving such shifts. Is it carbon dioxide?"

The study was supported by the Sir Alister Hardy Foundation for Ocean Science, which now runs the CPR, and by the Johns Hopkins Applied Physics Laboratory. Other co-authors are Carlos del Castillo, a former biological oceanographer at APL who now leads NASA's Ocean Ecology Laboratory, and Seth Guikema, a former Johns Hopkins faculty member now at the University of Michigan.

Contacts and sources:
Arthur Hirsch
Johns Hopkins University

Diamonds May Be Pervasive, Not Rare

Diamonds may not be as rare as once believed, but this finding in a new Johns Hopkins University research report won’t mean deep discounts at local jewelry stores.

“Diamond formation in the deep Earth, the very deep Earth, may be a more common process than we thought,” said Johns Hopkins geochemist Dimitri A. Sverjensky, whose article, co-written with doctoral student Fang Huang, appears today in the online journal Nature Communications. The report says the results “constitute a new quantitative theory of diamond formation,” but that does not mean it will be easier to find gem-quality diamonds and bring them to market.

Raw diamonds
Credit: www.indus-global.com

For one thing, the prevalence of diamonds near the Earth’s surface – where they can be mined – still depends on relatively rare volcanic magma eruptions that raise them from the depths where they form. For another, the diamonds being considered in these studies are not necessarily the stuff of engagement rings, unless the recipient is equipped with a microscope. Most are only a few microns across and are not visible to the unaided eye.

Using a chemical model, Sverjensky and Huang found that these precious stones could be born in a natural chemical reaction that is simpler than the two main processes that up to now have been understood to produce diamonds. Specifically, their model – yet to be tested with actual materials – shows that diamonds can form with an increase in acidity during interaction between water and rock.

The common understanding up to now has been that diamonds are formed in the movement of fluid by the oxidation of methane or the chemical reduction of carbon dioxide. Oxidation results in a higher oxidation state, or a loss of electrons. Reduction means a lower oxidation state, and collectively the two are known as “redox” reactions.

“It was always hard to explain why the redox reactions took place,” said Sverjensky, a professor in the Morton K. Blaustein Department of Earth and Planetary Sciences in the university’s Krieger School of Arts and Sciences. The reactions require different types of fluids to be moving through the rocks encountering environments with different oxidation states.

The new research showed that water could produce diamonds as its pH falls naturally – that is, as it becomes more acidic – while moving from one type of rock to another, Sverjensky said.

The finding is one of many in about the last 25 years that expands scientists’ understanding of how pervasive diamonds may be, Sverjensky said.

“The more people look, the more they’re finding diamonds in different rock types now,” Sverjensky said. “I think everybody would agree there’s more and more environments of diamond formation being discovered.”

Nobody has yet put a number on the greater abundance of diamonds, but Sverjensky said scientists are working on that with chemical models. It’s impossible to physically explore the great depths at which diamonds are created: roughly 90 to 120 miles below the Earth’s surface at intense pressure and at temperatures about 1,650 to 2,000 degrees Fahrenheit.
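For readers working in metric units, the depth and temperature figures above convert as follows. This is a trivial sketch; the conversion constants are standard, and the input values are the rounded figures quoted in the article.

```python
MILES_TO_KM = 1.609344  # standard mile-to-kilometre conversion

def fahrenheit_to_celsius(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

# Diamond-forming depths: roughly 90 to 120 miles below the surface.
depth_km = (90 * MILES_TO_KM, 120 * MILES_TO_KM)  # about 145 to 193 km

# Diamond-forming temperatures: about 1,650 to 2,000 degrees Fahrenheit.
temp_c = (fahrenheit_to_celsius(1650), fahrenheit_to_celsius(2000))  # about 899 to 1,093 °C
```

So the depths in question are roughly 145 to 193 kilometres, at temperatures of roughly 900 to 1,100 degrees Celsius.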

The deepest drilling exploration ever made was about 8 or 9 miles below the surface, he said.

If the study doesn’t shake the diamond markets, it promises to help shed light on fluid movement in the deep Earth, which helps account for the carbon cycle on which all life on the planet depends.

“Fluids are the key link between the shallow and the deep Earth,” Sverjensky said. “That’s why it’s important.”

This research was supported by grants from the Sloan Foundation through the Deep Carbon Observatory (Reservoirs and Fluxes and Extreme Physics and Chemistry programs) and by a U.S. Energy Department grant, DE-FG-02-96ER-14616.

Contacts and sources:
Arthur Hirsch
Johns Hopkins University 

Massive 'Development Corridors' In Africa Could Spell Environmental Disaster

In sub-Saharan Africa, dozens of major 'development corridors,' including roads, railroads, pipelines, and port facilities, are in the works to increase agricultural production, mineral exports, and economic integration. And, if all goes according to plan, it's going to be a disaster, say researchers reporting in the Cell Press journal Current Biology on Nov. 25. They assessed the potential environmental impacts of 33 planned or existing corridors -- spanning 53,000 kilometers -- that would crisscross much of the African continent, opening up vast areas of sparsely populated land.

This photo shows a savannah elephant, one of many animals expected to be affected by poaching with a dramatic expansion of African roads.
Credit: William Laurance

"In terms of development pressures, these corridors would be the biggest thing to hit Africa -- ever," says William Laurance of the Centre for Tropical Environmental and Sustainability Science at James Cook University in Cairns, Australia.

Earlier this year, Laurance and his colleagues issued a warning that this unprecedented expansion of infrastructure would come at a great cost. In the new study, he and his colleagues sought to quantify those costs by mapping each corridor along with the estimated human occupancy and the environmental values, including endangered and endemic vertebrates, plant diversity, critical habitats, carbon storage, and climate-regulation services, inside a 50-kilometer-wide band overlaid onto each corridor. They also assessed the potential for each corridor to facilitate increases in agricultural production.

Over 53,000 kilometers of development corridors have been proposed or are underway in sub-Saharan Africa. These corridors are intended to rapidly accelerate agriculture, mining, and land colonization, often in sparsely populated areas. In this Current Biology video abstract, Distinguished Professor William Laurance describes how the corridors were found to vary greatly in their potential socioeconomic benefits and likely environmental costs, with some corridors posing severe risks for key African ecosystems.

Credit: Laurance et al./Current Biology 2015

"We found striking variability in the likely environmental costs and economic benefits of the 33 'development corridors' that we assessed," Laurance says. "Some of the corridors seem like a good idea, and some appear to be a really bad idea. Several of the corridors could be environmentally disastrous, in our view."

Based on the findings, Laurance says he thinks some of the planned development corridors should be cancelled altogether. His biggest concerns fall in areas near the equator, such as the Congo Basin, West Africa, and the rich tropical savannahs. Other corridors should proceed only with the most stringent safeguards and mitigation measures in place.

Change won't come easily. "The proponents of these projects include some very powerful economic interests, and no one can dispute the dire need to increase food security and economic development in Africa, where populations are growing very rapidly," Laurance says. "The trick -- and it's going to be a great challenge -- will be to amp up African food production without creating an environmental crisis in the process."

However, he and his colleagues were surprised to find that many of the proposed corridors are planned for places where the agricultural potential appears to be very limited, because of poor soils or climates or the remoteness of the area in question.

"One of the key justifications for these corridors is to ramp up farm and food production, but in fact it appears that massive mining investments--securing access to high-volume minerals such as iron ore and coal--are actually a key driver for a number of the corridors," Laurance says.

This photo shows deforestation in Africa, one of the expected effects of poorly planned road expansion.
Credit: William Laurance

The researchers are calling for key stakeholders -- African governments, international lenders and donors, private investors, and others -- to carefully scrutinize the development corridors. He and his team hope to advance a research agenda aimed at better environmental assessment and planning for those corridors that do move forward. They also plan to follow up this comprehensive survey with more detailed studies of key corridors and to develop more local partnerships to advance this work.

"Africa is now facing a 'decade of decision,'" Laurance says. "The stakes are enormous. Once any particular development corridor is established, Pandora's Box will be opened and there won't be very much that one can do to control the onslaught of hunting, habitat disruption, and legal and illegal mining activities. The only real chance to manage this situation is to stop those corridors that are most likely to cause truly profound environmental damage and to create stringent land-use planning around those corridors that do proceed."

This work was supported by the Australian Research Council, the Australian Council for International Agricultural Research, and James Cook University.

Contacts and sources:
Joseph Caputo
Cell Press

Current Biology, Laurance et al.: "Estimating the Environmental Costs of Africa's Massive 'Development Corridors'" http://www.cell.com/current-biology


Aging Star’s Weight Loss Secret Revealed

VY Canis Majoris is a stellar goliath, a red hypergiant, one of the largest known stars in the Milky Way. It is 30 to 40 times the mass of the Sun and 300,000 times more luminous. In its current state, the star would encompass the orbit of Jupiter, having expanded tremendously as it enters the final stages of its life.

The new observations of the star used the SPHERE instrument on the VLT. The adaptive optics system of this instrument corrects images to a higher degree than earlier adaptive optics systems. This allows features very close to bright sources of light to be seen in great detail [1]. SPHERE clearly revealed how the brilliant light of VY Canis Majoris was lighting up clouds of material surrounding it.

In this very close-up view from SPHERE the star itself is hidden behind an obscuring disc. The crosses are artefacts due to features in the instrument.
VLT image of the surroundings of VY Canis Majoris seen with SPHERE

Credit: ESO

And by using the ZIMPOL mode of SPHERE, the team could not only peer deeper into the heart of this cloud of gas and dust around the star, but they could also see how the starlight was scattered and polarised by the surrounding material. These measurements were key to discovering the elusive properties of the dust.

Careful analysis of the polarisation results revealed these grains of dust to be comparatively large particles, 0.5 micrometres across, which may seem small, but grains of this size are about 50 times larger than the dust normally found in interstellar space.

This wide-field view shows the sky around the very brilliant red hypergiant star VY Canis Majoris, one of the largest stars known in the Milky Way. The star itself appears at the centre of the picture, which also includes clouds of glowing red hydrogen gas, dust clouds and the bright star cluster around the bright star Tau Canis Majoris towards the upper right. This picture was created from images forming part of the Digitized Sky Survey 2.
Credit: ESO/Digitized Sky Survey 2. Acknowledgement: Davide De Martin

Throughout their expansion, massive stars shed large amounts of material -- every year, VY Canis Majoris sees 30 times the mass of the Earth expelled from its surface in the form of dust and gas. This cloud of material is pushed outwards before the star explodes, at which point some of the dust is destroyed, and the rest cast out into interstellar space. This material is then used, along with the heavier elements created during the supernova explosion, by the next generation of stars, which may make use of the material for planets.

This chart shows the location of the very brilliant red hypergiant star VY Canis Majoris, one of the largest stars known in the Milky Way. Most of the stars visible to the naked eye on a clear and dark night are shown and the location of VY Canis Majoris is marked with a red circle. This star is visible in a small telescope and has a strikingly red colour.

Credit: ESO, IAU and Sky & Telescope

Until now, it had remained mysterious how the material in these giant stars' upper atmospheres is pushed away into space before the host star explodes. The most likely driver has always seemed to be radiation pressure, the force that starlight exerts. As this pressure is very weak, the process relies on large grains of dust to ensure a broad enough surface area to have an appreciable effect [2].

This video sequence takes you on a voyage from a broad vista of the sky into a close-up look at one of the biggest stars in the Milky Way, VY Canis Majoris. The final image comes from the SPHERE instrument on ESO’s Very Large Telescope in Chile.

Credit: ESO/Digitized Sky Survey 2/N. Risinger (skysurvey.org)
Music: Johan B. Monell

"Massive stars live short lives," says lead author of the paper, Peter Scicluna, of the Academia Sinica Institute for Astronomy and Astrophysics, Taiwan. "When they near their final days, they lose a lot of mass. In the past, we could only theorise about how this happened. But now, with the new SPHERE data, we have found large grains of dust around this hypergiant. These are big enough to be pushed away by the star's intense radiation pressure, which explains the star's rapid mass loss."

The large grains of dust observed so close to the star mean that the cloud can effectively scatter the star's visible light and be pushed by the radiation pressure from the star. The size of the dust grains also means that much of the dust is likely to survive the radiation produced by VY Canis Majoris' inevitable dramatic demise as a supernova [3]. This dust then contributes to the surrounding interstellar medium, feeding future generations of stars and encouraging planet formation around them.


[1] SPHERE/ZIMPOL uses extreme adaptive optics to create diffraction-limited images, which come a lot closer than previous adaptive optics instruments to achieving the theoretical limit of the telescope if there were no atmosphere. Extreme adaptive optics also allows much fainter objects to be seen very close to a bright star.

The images in the new study are also taken in visible light -- shorter wavelengths than the near-infrared regime, where most earlier adaptive optics imaging was performed. These two factors result in significantly sharper images than earlier VLT images. Even higher spatial resolution has been achieved with the VLTI, but the interferometer does not create images directly.

[2] The dust particles must be large enough for the starlight to push them, yet not so large that gravity simply pulls them back. Too small and the starlight would effectively pass through the dust; too large and the dust would be too heavy to push. The dust the team observed around VY Canis Majoris was precisely the right size to be most effectively propelled outwards by the starlight.
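The balance described in this footnote can be illustrated with a rough order-of-magnitude sketch. The luminosity, mass, grain density and efficiency factor below are assumed round numbers for a luminous hypergiant, not figures from the paper; the formula is the standard ratio of radiation-pressure force to gravity on a spherical grain in the geometric-optics limit (grain larger than the wavelength, so radiation-pressure efficiency Q of order one).

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
L_sun = 3.828e26     # solar luminosity, W
M_sun = 1.989e30     # solar mass, kg

def beta(grain_radius, L=3e5 * L_sun, M=17 * M_sun, rho=2000.0, Q=1.0):
    """Ratio of radiation-pressure force to gravity on a dust grain:
    beta = 3 L Q / (16 pi G M c rho a).
    Both forces fall off as 1/r^2, so beta is independent of distance."""
    return 3 * L * Q / (16 * math.pi * G * M * c * rho * grain_radius)

# A ~0.5-micron grain around a very luminous star: radiation wins easily.
print(beta(0.5e-6))
# beta scales as 1/a: larger grains feel weaker radiation pressure per unit mass.
print(beta(50e-6))
```

Because beta falls as the grain radius grows, sufficiently massive grains stop being pushed, which is the "too heavy" limit the footnote describes; the "too small" limit comes from Q dropping sharply once the grain is much smaller than the wavelength of the starlight, which this geometric-optics sketch does not capture.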

[3] The explosion will be soon by astronomical standards, but there is no cause for alarm, as this dramatic event is not likely for hundreds of thousands of years. It will be spectacular as seen from Earth -- perhaps as bright as the Moon -- but not a hazard to life here.

Contacts and sources:
Peter Scicluna
Academia Sinica Institute for Astronomy and Astrophysics, Taiwan

Richard Hook
ESO Public Information Officer
Garching bei München, Germany

This research was presented in a paper entitled "Large dust grains in the wind of VY Canis Majoris", by P. Scicluna et al., to appear in the journal Astronomy & Astrophysics. Research paper - http://www.eso.org/public/archives/releases/sciencepapers/eso1546/eso1546a.pdf

The team is composed of P. Scicluna (Academia Sinica Institute for Astronomy and Astrophysics, Taiwan), R. Siebenmorgen (ESO, Garching, Germany), J. Blommaert (Vrije Universiteit, Brussels, Belgium), M. Kasper (ESO, Garching, Germany), N.V. Voshchinnikov (St. Petersburg University, St. Petersburg, Russia), R. Wesson (ESO, Santiago, Chile) and S. Wolf (Kiel University, Kiel, Germany).

Brain Training: Researchers at Johns Hopkins Solve Puzzle of How We Learn

A new study sheds light on the relationship between stimuli and delayed rewards, explaining how Pavlov's dogs were conditioned to drool.

More than a century ago, Pavlov figured out that dogs fed after hearing a bell eventually began to salivate when they heard the ring. A Johns Hopkins University-led research team has now figured out a key aspect of why.

In an article published in the journal Neuron, Johns Hopkins neuroscientist Alfredo Kirkwood settles a mystery of neurology that has stumped scientists for years: Precisely what happens in the brain when we learn, or how Pavlov's dogs managed to associate an action with a delayed reward to create knowledge. For decades scientists had a working theory of how it happened, but Kirkwood's team is now the first to prove it.

"If you're trying to train a dog to sit, the initial neural stimuli, the command, is gone almost instantly—it lasts as long as the word sit," said Kirkwood, a professor with the university's Zanvyl Krieger Mind/Brain Institute. "Before the reward comes, the dog's brain has already turned to other things. The mystery was, 'How does the brain link an action that's over in a fraction of a second with a reward that doesn't come until much later?'"

Credit: Johns Hopkins University

The working theory—which Kirkwood's team has validated—is that invisible "eligibility traces" effectively tag the synapses activated by the stimulus, so that the connection can be cemented as true learning when the reward arrives.

In the case of a dog learning to sit, when the dog gets a treat or a reward, neuromodulators like dopamine flood the dog's brain with "good feelings." Though the brain has long since processed the sit command, eligibility traces respond to the neuromodulators, prompting a lasting synaptic change.

The team was able to prove the theory by isolating cells in the visual cortex of a mouse. When they stimulated the axon of one cell with an electrical impulse, they sparked a response in another cell. By doing this repeatedly, they mimicked the synaptic response between two cells as they process a stimulus and create an eligibility trace. When the researchers later flooded the cells with neuromodulators, simulating the arrival of a delayed reward, the response between the cells strengthened or weakened, showing the cells had "learned" and were able to do so because of the eligibility trace.
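The same idea appears in reinforcement learning, where "eligibility traces" are a standard bookkeeping device. The toy sketch below is a computational analogy of the biological mechanism described above, not a model from the paper: a stimulus sets a trace on a synapse, the trace decays while the brain moves on, and a weight change happens only if the reward arrives while the trace is still appreciable. The decay and learning-rate values are arbitrary illustrative choices.

```python
def run_trial(reward_delay, decay=0.9, learning_rate=0.5):
    """Return the synaptic weight change for a reward arriving
    `reward_delay` time steps after the stimulus."""
    trace = 1.0   # stimulus tags the synapse at time 0
    weight = 0.0
    for _ in range(reward_delay):
        trace *= decay          # trace fades while the brain attends to other things
    # delayed reward (neuromodulator) arrives; only still-eligible synapses update
    weight += learning_rate * trace
    return weight

print(run_trial(reward_delay=3))   # reward soon after stimulus: large weight change
print(run_trial(reward_delay=30))  # reward long after: the trace has mostly faded
```

The decaying trace is what lets a fraction-of-a-second stimulus be linked to a reward seconds later, exactly the gap Kirkwood describes.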

"This is the basis of how we learn things through reward," Kirkwood said, "a fundamental aspect of learning."

In addition to a greater understanding of the mechanics of learning, these findings could enhance teaching methods and lead to treatments for cognitive problems.

Researchers included Johns Hopkins postdoctoral fellow Su Hong; Johns Hopkins graduate student Xiaoxiu Tie; former Johns Hopkins research associate Kaiwen He; along with Marco Huertas and Harel Shouval, neurobiology researchers at the University of Texas at Houston; and Johannes W. Hell, a professor of pharmacology at University of California, Davis. The research was supported by grants from JHU's Science of Learning Institute and National Institutes of Health.

Contacts and sources:
Jill Rosen
 Johns Hopkins University

Scientists Get First Glimpse of Black Hole Eating Star, Ejecting High-Speed Flare

An international team of astrophysicists led by a Johns Hopkins University scientist has for the first time witnessed a star being swallowed by a black hole and ejecting a flare of matter moving at nearly the speed of light.

The finding reported Thursday in the journal Science tracks the star -- about the size of our sun -- as it shifts from its customary path, slips into the gravitational pull of a supermassive black hole and is sucked in, said Sjoert van Velzen, a Hubble fellow at Johns Hopkins.

"These events are extremely rare," van Velzen said. "It's the first time we see everything from the stellar destruction followed by the launch of a conical outflow, also called a jet, and we watched it unfold over several months."

Artist’s conception of a star being drawn toward a black hole and destroyed (left), and the black hole later emitting a “jet” of plasma composed of debris left from the star’s destruction. 
Modified from an original image by Amadeo Bachar.

Black holes are areas of space so dense that irresistible gravitational force stops the escape of matter, gas and even light, rendering them invisible and creating the effect of a void in the fabric of space. Astrophysicists had predicted that when a black hole is force-fed a large amount of gas, in this case a whole star, then a fast-moving jet of plasma - elementary particles in a magnetic field - can escape from near the black hole rim, or "event horizon." This study suggests this prediction was correct, the scientists said.

"Previous efforts to find evidence for these jets, including my own, were late to the game," said van Velzen, who led the analysis and coordinated the efforts of 13 other scientists in the United States, the Netherlands, Great Britain and Australia.

Supermassive black holes, the largest of black holes, are believed to exist at the center of most massive galaxies. This particular one lies at the lighter end of the supermassive black hole spectrum, at only about a million times the mass of our sun, but still packing the force to gobble a star.

The first observation of the star being destroyed was made by a team at the Ohio State University, using an optical telescope in Hawaii. That team announced its discovery on Twitter in early December 2014.

After reading about the event, van Velzen contacted an astrophysics team led by Rob Fender at the University of Oxford in Great Britain. That group used radio telescopes to follow up as fast as possible. They were just in time to catch the action.

By the time it was done, the international team had data from satellites and ground-based telescopes that gathered X-ray, radio and optical signals, providing a stunning "multi-wavelength" portrait of this event.

It helped that the galaxy in question is closer to Earth than those studied previously in hopes of tracking a jet emerging after the destruction of a star. This galaxy is about 300 million light years away, while the others were at least three times farther away. One light year is 5.88 trillion miles.
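As a quick sanity check on the figures quoted above:

```python
# Using the article's own conversion: one light year is about 5.88 trillion miles.
miles_per_ly = 5.88e12
galaxy_ly = 300e6                  # the galaxy is ~300 million light years away

print(galaxy_ly * miles_per_ly)    # distance in miles, ~1.76e21
```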

The first step for the international team was to rule out the possibility that the light was from a pre-existing expansive swirling mass called an "accretion disk" that forms when a black hole is sucking in matter from space. That helped to confirm that the sudden increase of light from the galaxy was due to a newly trapped star.

"The destruction of a star by a black hole is beautifully complicated, and far from understood," van Velzen said. "From our observations, we learn the streams of stellar debris can organize and make a jet rather quickly, which is valuable input for constructing a complete theory of these events."

Van Velzen last year completed his doctoral dissertation at Radboud University in the Netherlands, where he studied jets from supermassive black holes. In the last line of the dissertation, he expressed his hope to discover these events within four years. It turned out to take only a few months after the ceremony for his dissertation defense.

Van Velzen and his team were not the only ones to hunt for radio signals from this particular unlucky star. A group at Harvard observed the same source with radio telescopes in New Mexico and announced its results online. Both teams presented results at a workshop in Jerusalem in early November. It was the first time the two competing teams had met face to face.

"The meeting was an intense, yet very productive exchange of ideas about this source," van Velzen said. "We still get along very well; I actually went for a long hike near the Dead Sea with the leader of the competing group."

Support for this study came from sources including NASA, the Netherlands Organisation for Scientific Research (NWO), the European Research Council, the International Centre for Radio Astronomy Research, the Alfred P. Sloan Foundation and the Australian Research Council.

Contacts and sources:
Arthur Hirsch
Johns Hopkins University

Study Shows White Matter Brain Damage Caused By 'Skunk-Like' Cannabis

Smoking high potency 'skunk-like' cannabis can damage a crucial part of the brain responsible for communication between the two brain hemispheres, according to a new study by scientists from King's College London and Sapienza University of Rome.

Researchers have known for some time that long-term cannabis use increases the risk of psychosis, and recent evidence suggests that alterations in brain function and structure may be responsible for this greater vulnerability. However, this new research, published today in Psychological Medicine, is the first to examine the effect of cannabis potency on brain structure.

The corpus callosum
Credit: King's College London

Exploring the impact of cannabis potency is particularly important since today's high potency 'skunk-like' products have been shown to contain higher proportions of Δ9-tetrahydrocannabinol (THC) than they did around a decade ago. In experimental studies THC has been shown to induce psychotic symptoms, and 'skunk-like' products high in THC are now thought to be the most commonly used form of cannabis in the UK.

Dr Paola Dazzan, Reader in Neurobiology of Psychosis from the Institute of Psychiatry, Psychology & Neuroscience (IoPPN) at King's College London, and senior researcher on the study, said: 'We found that frequent use of high potency cannabis significantly affects the structure of white matter fibres in the brain, whether you have psychosis or not.

High potency cannabis
Credit: Wikimedia Commons

'This reflects a sliding scale where the more cannabis you smoke and the higher the potency, the worse the damage will be.'

Diffusion Tensor Imaging (DTI), a Magnetic Resonance Imaging (MRI) technique, was used to examine white matter in the brains of 56 patients who had reported a first episode of psychosis at the South London and Maudsley NHS Foundation Trust (SLaM), as well as 43 healthy participants from the local community.

The researchers specifically examined the corpus callosum, the largest white matter structure in the brain, which is responsible for communication between the left and right hemispheres. White matter consists of large bundles of nerve cell projections (called axons), which connect different regions of the brain, enabling communication between them.

The corpus callosum is particularly rich in cannabinoid receptors, on which the THC content of cannabis acts.

The study found that frequent use of high potency cannabis was linked to significantly higher mean diffusivity (MD), a marker of damage in white matter structure.
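For readers unfamiliar with the measure, mean diffusivity can be sketched numerically. The tensor values below are made-up illustrative numbers, not study data: at each image voxel, DTI fits a symmetric 3×3 diffusion tensor, and MD is simply the average of its three eigenvalues.

```python
import numpy as np

# A hypothetical diffusion tensor for one voxel (units: mm^2/s), symmetric 3x3.
D = np.array([[1.7e-3, 0.1e-3, 0.0],
              [0.1e-3, 0.4e-3, 0.0],
              [0.0,    0.0,    0.3e-3]])

eigvals = np.linalg.eigvalsh(D)   # the three principal diffusivities
md = eigvals.mean()               # mean diffusivity = (l1 + l2 + l3) / 3
print(md)
```

Water diffuses more freely where axonal structure is degraded, so a higher MD in the corpus callosum is read as a sign of less intact white matter.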

Dr Tiago Reis Marques, a senior research fellow from the IoPPN at King's College London, said: 'This white matter damage was significantly greater among heavy users of high potency cannabis than in occasional or low potency users, and was also independent of the presence of a psychotic disorder.'

Dr Dazzan added: 'There is an urgent need to educate health professionals, the public and policymakers about the risks involved with cannabis use.

'As we have suggested previously, when assessing cannabis use it is extremely important to gather information on how often and what type of cannabis is being used. These details can help quantify the risk of mental health problems and increase awareness on the type of damage these substances can do to the brain.'

This research was funded primarily by the NIHR Biomedical Research Centre at the South London and Maudsley (SLaM) NHS Foundation Trust and King's College London.

The study was led by Dr Tiago Reis Marques and Dr Paola Dazzan of the IoPPN at King's, and Dr Silvia Rigucci of Sapienza University of Rome.

Contacts and sources:
Jack Stonebridge
King's College London

Loss of Carbon in Martian Atmosphere Explained

Mars is blanketed by a thin, mostly carbon dioxide atmosphere -- one that is far too thin to keep water from freezing or quickly evaporating. However, geological evidence has led scientists to conclude that ancient Mars was once a warmer, wetter place than it is today. To produce a more temperate climate, several researchers have suggested that the planet was once shrouded in a much thicker carbon dioxide atmosphere. For decades that left the question, "Where did all the carbon go?"

The solar wind stripped away much of Mars' ancient atmosphere and is still removing tons of it every day. But scientists have been puzzled by why they haven't found more carbon -- in the form of carbonate -- captured into Martian rocks. They have also sought to explain the ratio of heavier and lighter carbons in the modern Martian atmosphere.

This graphic depicts paths by which carbon has been exchanged among Martian interior, surface rocks, polar caps, waters and atmosphere, and also depicts a mechanism by which it is lost from the atmosphere with a strong effect on isotope ratio. 
Image Credit: Lance Hayashida/Caltech 

Now a team of scientists from the California Institute of Technology and NASA's Jet Propulsion Laboratory, both in Pasadena, offer an explanation of the "missing" carbon, in a paper published today by the journal Nature Communications.

They suggest that 3.8 billion years ago, Mars might have had a moderately dense atmosphere. Such an atmosphere -- with a surface pressure equal to or less than that found on Earth -- could have evolved into the current thin one, not only minus the "missing" carbon problem, but also in a way consistent with the observed ratio of carbon-13 to carbon-12, which differ only by how many neutrons are in each nucleus.

"Our paper shows that transitioning from a moderately dense atmosphere to the current thin one is entirely possible," says Caltech postdoctoral fellow Renyu Hu, the lead author. "It is exciting that what we know about the Martian atmosphere can now be pieced together into a consistent picture of its evolution -- and this does not require a massive undetected carbon reservoir."

When considering how the early Martian atmosphere might have transitioned to its current state, there are two possible mechanisms for the removal of the excess carbon dioxide. Either the carbon dioxide was incorporated into minerals in rocks called carbonates or it was lost to space.

An August 2015 study used data from several Mars-orbiting spacecraft to inventory carbonates, showing there are nowhere near enough in the upper half mile (about one kilometer) of the crust to contain the missing carbon from a thick early atmosphere during a time when networks of ancient river channels were active, about 3.8 billion years ago.

The escaped-to-space scenario has also been problematic. Because various processes can change the relative amounts of carbon-13 to carbon-12 isotopes in the atmosphere, "we can use these measurements of the ratio at different points in time as a fingerprint to infer exactly what happened to the Martian atmosphere in the past," says Hu. The first constraint is set by measurements of the ratio in meteorites that contain gases released volcanically from deep inside Mars, providing insight into the starting isotopic ratio of the original Martian atmosphere. The modern ratio comes from measurements by the SAM (Sample Analysis at Mars) instrument on NASA's Curiosity rover.

One way carbon dioxide escapes to space from Mars' atmosphere is called sputtering, which involves interactions between the solar wind and the upper atmosphere. NASA's MAVEN (Mars Atmosphere and Volatile Evolution) mission has yielded recent results indicating that about a quarter pound (about 100 grams) of particles is stripped from today's Martian atmosphere every second via this process, likely the main driver of atmospheric loss. Sputtering slightly favors loss of carbon-12, compared to carbon-13, but this effect is small. The Curiosity measurement shows that today's Martian atmosphere is far more enriched in carbon-13 -- in proportion to carbon-12 -- than it should be as a result of sputtering alone, so a different process must also be at work.
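To put the MAVEN rate in perspective, here is a back-of-the-envelope total, assuming (unrealistically, as a rough lower bound, since loss rates were likely far higher under the young sun) that today's ~100 g/s rate held constant since the river-channel era:

```python
rate_kg_per_s = 0.1            # ~100 grams of atmosphere lost per second
seconds_per_year = 3.156e7
years = 3.8e9                  # since the era of active river channels

total_kg = rate_kg_per_s * seconds_per_year * years
print(total_kg)                # ~1.2e16 kg
```

Even this conservative extrapolation removes a mass comparable to the entire present-day Martian atmosphere, which illustrates why escape to space is a plausible sink for the missing carbon.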

Hu and his co-authors identify a mechanism that could have significantly contributed to the carbon-13 enrichment. The process begins with ultraviolet (UV) light from the sun striking a molecule of carbon dioxide in the upper atmosphere, splitting it into carbon monoxide and oxygen. Then, UV light hits the carbon monoxide and splits it into carbon and oxygen. Some carbon atoms produced this way have enough energy to escape from the atmosphere, and the new study shows that carbon-12 is far more likely to escape than carbon-13.

Modeling the long-term effects of this "ultraviolet photodissociation" mechanism, the researchers found that a small amount of escape by this process leaves a large fingerprint in the carbon isotopic ratio. That, in turn, allowed them to calculate that the atmosphere 3.8 billion years ago might have had a surface pressure slightly lower than that of Earth's atmosphere today.
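The "small escape, large fingerprint" logic can be illustrated with a toy Rayleigh-fractionation sketch. The numbers below are assumed for illustration, not taken from the paper: if the escape process removes carbon-12 more readily than carbon-13, the isotope ratio of the remaining reservoir drifts upward even when the total loss is modest.

```python
def enrichment(f_remaining, x):
    """Return R/R0, the 13C/12C ratio relative to its starting value,
    after a fraction f_remaining of the 12C reservoir is left.
    x < 1 is the escape efficiency of 13C relative to 12C
    (Rayleigh distillation: R/R0 = f**(x - 1))."""
    return f_remaining ** (x - 1.0)

# With half the carbon retained and a strong preference for 12C escape
# (x = 0.6), the remaining atmosphere is enriched in 13C by roughly 30%.
print(enrichment(0.5, 0.6))
```

The stronger the mass preference of the escape mechanism (the smaller x), the less total loss is needed to reproduce a given carbon-13 enrichment, which is why a strongly fractionating process like UV photodissociation lets a moderately dense early atmosphere evolve into today's thin, carbon-13-rich one.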

"This solves a long-standing paradox," said Bethany Ehlmann of Caltech and JPL, a co-author of both today's publication and the August one about carbonates. "The supposed very thick atmosphere seemed to imply that you needed this big surface carbon reservoir, but the efficiency of the UV photodissociation process means that there actually is no paradox. You can use normal loss processes as we understand them, with detected amounts of carbonate, and find an evolutionary scenario for Mars that makes sense."

Contacts and sources:
Deborah Williams-Hedges
California Institute of Technology, Pasadena

Guy Webster
Jet Propulsion Laboratory, Pasadena, Calif.