Unseen Is Free


Tuesday, December 31, 2013

Global Temperatures To Rise At Least 4°C By 2100


Global average temperatures will rise at least 4°C by 2100, and potentially more than 8°C by 2200, if carbon dioxide emissions are not reduced, according to new research published in Nature. The scientists found that the global climate is more sensitive to carbon dioxide than most previous estimates suggested.


The research also appears to resolve one of the great unknowns of climate sensitivity: the role of cloud formation, and whether it will have a positive or negative effect on global warming.

Prof Steven Sherwood explains research into cloud mixing that indicates our climate is highly sensitive to a doubling of carbon dioxide. His findings suggest Equilibrium Climate Sensitivity is at least 3°C. As a result, unless we curb emissions, global temperatures will rise 4°C by 2100 and more than 8°C by 2200.

Credit: UNSW TV - University of New South Wales

“Our research has shown climate models indicating a low temperature response to a doubling of carbon dioxide from preindustrial times are not reproducing the correct processes that lead to cloud formation,” said lead author Prof Steven Sherwood, of the University of New South Wales’ Centre of Excellence for Climate System Science.

“When the processes are correctly represented in the climate models, the level of climate sensitivity is far higher. Previously, estimates of the sensitivity of global temperature to a doubling of carbon dioxide ranged from 1.5°C to 5°C. This new research takes away the lower end of climate sensitivity estimates, meaning that global average temperatures will increase by 3°C to 5°C with a doubling of carbon dioxide.”
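
To see what these numbers imply, equilibrium warming is commonly approximated as scaling with the logarithm of the CO2 concentration ratio. The short sketch below applies that textbook relation at the revised lower and upper sensitivity bounds; it is an illustration only, not a calculation from the study.

    import math

    def equilibrium_warming(c_ratio, sensitivity):
        """Textbook approximation dT = S * log2(C/C0), where S is the
        equilibrium climate sensitivity (warming per CO2 doubling)."""
        return sensitivity * math.log2(c_ratio)

    # A doubling of CO2 (C/C0 = 2) at the revised 3-5 degree C bounds:
    for s in (3.0, 5.0):
        print(f"S = {s:.0f} C/doubling -> {equilibrium_warming(2.0, s):.1f} C")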

The key to this narrower but much higher estimate lies in real-world observations of the role of water vapour in cloud formation.

Observations show that when water vapour is taken up by the atmosphere through evaporation, the resulting updraughts can either rise to 15 km to form clouds that produce heavy rains, or rise just a few kilometres before returning to the surface without forming rain clouds.

When updraughts rise only a few kilometres they reduce total cloud cover because they pull more vapour away from the higher cloud forming regions.

However water vapour is not pulled away from cloud forming regions when only deep 15km updraughts are present.

The researchers found climate models that show a low global temperature response to carbon dioxide do not include enough of this lower-level water vapour process. Instead they simulate nearly all updraughts as rising to 15 km and forming clouds.

When only the deeper updraughts are present in climate models, more clouds form and there is an increased reflection of sunlight. Consequently the global climate in these models becomes less sensitive in its response to atmospheric carbon dioxide.

However, real-world observations show this behaviour is wrong.

When the processes in climate models are corrected to match the observations in the real world, the models produce cycles that take water vapour to a wider range of heights in the atmosphere, causing fewer clouds to form as the climate warms.

This increases the amount of sunlight and heat entering the atmosphere and, as a result, increases the sensitivity of our climate to carbon dioxide or any other perturbation.

The result is that when water vapour processes are correctly represented, the sensitivity of the climate to a doubling of carbon dioxide, which will occur in the next 50 years, means we can expect a temperature increase of at least 4°C by 2100.

“Climate sceptics like to criticize climate models for getting things wrong, and we are the first to admit they are not perfect, but what we are finding is that the mistakes are being made by those models which predict less warming, not those that predict more,” said Prof. Sherwood.

“Rises in global average temperatures of this magnitude will have profound impacts on the world and the economies of many countries if we don’t urgently start to curb our emissions.”

Contacts and sources:
Alvin Stone
University of New South Wales

Cloudy Weather On Alien Super-Earth Revealed By Hubble Space Telescope

Weather forecasters on exoplanet GJ 1214b would have an easy job. Today's forecast: cloudy. Tomorrow: overcast. Extended outlook: more clouds.

A team of scientists led by researchers in the Department of Astronomy and Astrophysics at the University of Chicago report they have definitively characterized the atmosphere of a super-Earth class planet orbiting another star for the first time.

This image shows an artist's view of exoplanet GJ 1214b.

Credit: NASA, ESA, & G. Bacon/STScI, STScI-PRC14-06

The scrutinized planet, known as GJ 1214b, is classified as a super-Earth because its mass is intermediate between those of Earth and Neptune. Recent searches for planets around other stars ("exoplanets") have shown that super-Earths like GJ 1214b are among the most common types of planets in the Milky Way galaxy. Because no such planets exist in our Solar System, the physical nature of super-Earths is largely unknown.

Previous studies of GJ 1214b yielded two possible interpretations of the planet's atmosphere. Its atmosphere could consist entirely of water vapor or some other type of heavy molecule, or it could contain high-altitude clouds that prevent the observation of what lies underneath.

This rendering shows the size of GJ 1214b and another, larger exoplanet compared to Earth and Neptune.

Credit: NASA & ESA, STScI-PRC14-06b

But now a team of astronomers led by UChicago's Laura Kreidberg and Jacob Bean has detected clear evidence of clouds in the atmosphere of GJ 1214b from data collected with the Hubble Space Telescope. The Hubble observations used 96 hours of telescope time spread over 11 months, the largest Hubble program ever devoted to studying a single exoplanet.

The researchers describe their work as an important milestone on the road to identifying potentially habitable, Earth-like planets beyond our Solar System. The results appear in the Jan. 2 issue of the journal Nature.

"We really pushed the limits of what is possible with Hubble to make this measurement," said Kreidberg, a third-year graduate student and first author of the new paper. "This advance lays the foundation for characterizing other Earths with similar techniques."

"I think it's very exciting that we can use a telescope like Hubble that was never designed with this in mind, do these kinds of observations with such exquisite precision, and really nail down some property of a small planet orbiting a distant star," explained Bean, an assistant professor and the project's principal investigator.

GJ 1214b is located just 40 light-years from Earth, in the direction of the constellation Ophiuchus. Because of its proximity to our solar system and the small size of its host star, GJ 1214b is the most easily observed super-Earth. It transits, or passes in front of its parent star, every 38 hours, giving scientists an opportunity to study its atmosphere as starlight filters through it.

Kreidberg, Bean and their colleagues used Hubble to precisely measure the spectrum of GJ 1214b in near-infrared light, finding what they consider definitive evidence of high clouds blanketing the planet. These clouds hide any information about the composition and behavior of the lower atmosphere and surface.

The planet was discovered in 2009 by the MEarth Project, which monitors two thousand red dwarf stars for transiting planets. The planet was next targeted for follow-up observations to characterize its atmosphere. The first spectra, which were obtained by Bean in 2010 using a ground-based telescope, suggested that the planet's atmosphere either was predominantly water vapor or hydrogen-dominated with high-altitude clouds.


The University of Chicago's Jacob Bean and Laura Kreidberg led a team of scientists that has definitively characterized the atmosphere of a super-Earth class planet orbiting another star for the first time.
Credit: Rob Kozloff/University of Chicago

More precise Hubble observations made in 2012 and 2013 allowed the team to distinguish between these two scenarios. The news is about what they didn't find. The Hubble spectra revealed no chemical fingerprints whatsoever in the planet's atmosphere. This allowed the astronomers to rule out cloud-free atmospheres made of water vapor, methane, nitrogen, carbon monoxide, or carbon dioxide.
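
The reasoning behind ruling out these cloud-free cases rests on atmospheric scale height: the lighter the gas, the puffier the atmosphere and the larger the features it imprints on the transit spectrum. A minimal sketch of that logic, using approximate literature values for GJ 1214b rather than numbers from the paper:

    # Scale height H = k*T / (mu * m_u * g) sets the amplitude of
    # transit spectral features. All parameter values are approximate.
    K_B = 1.381e-23  # Boltzmann constant, J/K
    M_U = 1.661e-27  # atomic mass unit, kg

    T = 505.0        # the ~450 F quoted for GJ 1214b, in kelvin
    g = 8.9          # surface gravity, m/s^2 (approximate literature value)

    for label, mu in [("H2-dominated", 2.3), ("water vapor", 18.0)]:
        H = K_B * T / (mu * M_U * g)
        print(f"{label:13s} atmosphere: H ~ {H / 1000.0:.0f} km")

    # ~200 km versus ~26 km: a cloud-free hydrogen atmosphere would show
    # large spectral features, a heavy water-vapor atmosphere much smaller
    # ones. Hubble's precision sufficed to detect even the compact
    # water-vapor case, so a featureless spectrum points to high clouds.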

The best explanation for the new data is that there are high-altitude clouds in the atmosphere of the planet, though their composition is unknown. Models of super-Earth atmospheres predict clouds could be made out of potassium chloride or zinc sulfide at the scorching temperatures of 450 degrees Fahrenheit found on GJ 1214b. "You would expect very different kinds of clouds to form than you would expect, say, on Earth," Kreidberg said.

The launch of NASA's next major space telescope, the 6.5m James Webb Space Telescope (JWST), later this decade should reveal more about such worlds, Kreidberg said. "Looking forward, JWST will be transformative," she said. "The new capabilities of this telescope will allow us to peer through the clouds on planets like GJ 1214b. But more than that, it may open the door to studies of Earth-like planets around nearby stars."


Citation: "Clouds in the atmosphere of the super-Earth exoplanet GJ 1214b," by Laura Kreidberg, Jacob L. Bean, Jean-Michel Désert, Björn Benneke, Drake Deming, Kevin B. Stevenson, Sara Seager, Zachory Berta-Thompson, Andreas Seifahrt, & Derek Homeier.

Contacts and sources:
Steve Koppes
University of Chicago

Monday, December 30, 2013

Soft Biological Robots Will Move Through The Body

Increasingly small robots that carry out their functions even inside the human body: no, this isn’t a sci-fi dream but a close possibility. On one condition: the miniaturization of these devices requires them to acquire the same “softness” and flexibility as biological tissues.

Simulation of euglenid movement 
  Credit: SISSA

This is the opinion of scientists like Antonio De Simone, from SISSA (the International School for Advanced Studies) in Trieste, and Marino Arroyo, from the Polytechnic University of Catalonia, who have just published a paper in the Journal of the Mechanics and Physics of Solids: taking inspiration from unicellular aquatic micro-organisms, they studied the locomotion mechanisms of “soft robots”.


"Forget cogwheels, pistons and levers: miniaturized robots of the future will be “soft”. “If I think of the robots of tomorrow, what comes to mind are the tentacles of an octopus or the trunk of an elephant rather than the mechanical arm of a crane or the inner workings of a watch.  And if I think of micro-robots then I think of unicellular organisms moving in water. The robots of the future will be increasingly like biological organisms” explains Antonio De Simone who, together with Marino Arroyo of the Polytechnic University of Catalonia, has just published a study in the Journal of the Mechanics and Physics of Solids. 


De Simone and his team at SISSA have been studying the movement of euglenids, unicellular aquatic organisms, for several years. One of the aims of De Simone’s research – which has recently been awarded a European Research Council Advanced Grant of 1,300,000 euro – is to transfer the knowledge acquired about euglenids to micro-robotics, a field that represents a promising challenge for the future. Micro-robots may in fact carry out a number of important functions for human health, for example by delivering drugs directly to where they are needed, re-opening occluded blood vessels, or helping to close wounds, to name just a few.

To do this, these tiny robots will have to be able to move around efficiently. “Imagine trying to miniaturize a device made up of levers and cogwheels: you can’t go below a certain minimal size. Instead, by mimicking biological systems we can go all the way down to cell size, and this is exactly the direction research is taking. We, in particular, are working on movement and studying how certain unicellular organisms with highly efficient locomotion move.”

In their study, De Simone and Arroyo simulated euglenid species with different shapes and locomotion methods, based chiefly on cell body deformation and swelling, to describe in detail the mechanics and characteristics of the movement obtained.

“Our work not only helps to understand the movement mechanism of these unicellular organisms, but it provides a knowledge base to plan the locomotion system of future micro-robots”.

De Simone and Arroyo’s paper has been selected to appear in the special issue of the J. Mech. Phys. Solids published to celebrate the 60th anniversary of the prestigious journal.


Contacts and sources:
SISSA

Sunday, December 29, 2013

Is Space-Time Smooth Or Grainy?

Smooth or grainy? Is space-time continuous, or is it made up of very fine (10⁻³⁵ metres, on the “Planck scale”) but discrete grains if we look at it very close up? If the latter were true, scientists think, this would lead to deviations from the theory of special relativity formulated by Albert Einstein more than 100 years ago.

Credit: NASA, ESA, J. Rigby (Goddard Space Flight Center), K. Sharon (Kavli Institute)

In some theoretical scenarios, the “non-continuity” of space-time implies violations of the invariance of physical laws under the so-called Lorentz transformations (which establish that physical laws are the same in all inertial reference frames, and which are at the basis of special relativity). Since the 1990s, physicists have devised several methods (often based on phenomena connected to high-energy astrophysics) to test these deviations from standard physics. Stefano Liberati, coordinator of the Astroparticle Physics group of the International School for Advanced Studies (SISSA) of Trieste, recently published a systematic review presenting the state of the art in this field and the constraints that can be placed on the various models that predict violations of special relativity.

The paper is an invited Topical Review published in the journal Classical and Quantum Gravity. This journal periodically asks leading world experts to “sum up” what is known in a specific field of study. The review has now been selected as one of the journal’s Highlight papers for 2013.

“Physicists have been wondering about the nature of space-time for years. We’ve been asking ourselves whether it is continuous at all scales, as we perceive it in our daily experience, or whether at very small sizes it presents an irregular grain that we, in our direct experience, are unable to perceive”, explains Liberati. “Imagine looking at a slab of marble from some distance: it will probably seem to have a uniform texture. However, on closer inspection, for example using a powerful microscope, you can see that the marble is porous and irregular”.

“In a certain sense physicists have been trying to do something similar with space-time: to find something that acts as a microscope to find out whether at very small length scales there is indeed some irregularity. In my paper I presented a systematic overview of the experiments and observations that can be exploited to investigate the existence of these irregularities. Special relativity is one of the cornerstones of modern physics and as such it is very important to test its validity, insofar as current observations allow us”.

Contacts and sources:
SISSA

Cloud Atlas Reshapes Astronomers' Views Of Stellar Birthplaces

A multi-year study of the Whirlpool galaxy (M51) has shaken up astronomers' views of the properties of giant molecular clouds. The new study, which mapped more than 1,500 such clouds, shows that, far from drifting in isolation, they are embedded in a kind of fog of molecular hydrogen far denser than anyone expected, which permeates the whole of the galactic disc. Pressure exerted by this fog is crucial in determining whether or not new stars will form within the clouds. The study, led by Eva Schinnerer from the Max Planck Institute for Astronomy, made extensive use of the millimeter telescopes of IRAM, the Institut de Radioastronomie Millimétrique.

Molecular hydrogen in the Whirlpool Galaxy M51. The bluish features show the distribution of hydrogen molecules in M51, the raw material for forming new stars. The PAWS team has used this data to create a catalogue of more than 1,500 molecular clouds.

The background is a color image of M51 by the Hubble Space Telescope. Superimposed in blue is the CO(1-0) radiation emitted by carbon monoxide (CO) molecules, as measured for the PAWS study using the millimeter telescopes of the Institut de Radioastronomie Millimétrique. The CO molecules are used as tracers for molecular hydrogen.

Credit: PAWS team/IRAM/NASA HST/T. A. Rector (University of Alaska Anchorage)

Most of a galaxy's stars are born within giant molecular clouds – accumulations of hydrogen molecules with total masses between a thousand and several million times that of our Sun. As a region within such a cloud collapses under its own gravity, it contracts until pressure and temperature are high enough for nuclear fusion to set in: a new star is born.

Now, a new study challenges astronomers' traditional views about these stellar birthplaces. Study leader Eva Schinnerer (Max Planck Institute for Astronomy) explains: "Over the past four years, we have created the most complete map yet of giant molecular clouds in another spiral galaxy similar to our own Milky Way, reconstructing the amounts of hydrogen molecules and correlating them with the presence of new or older stars. The picture that is emerging is quite different from what astronomers thought these clouds should be like." The survey, known as PAWS, targeted the Whirlpool galaxy, also known as M51, at a distance of about 23 million light-years in the constellation Canes Venatici ("Hunting dogs").

Annie Hughes, a post-doctoral researcher at MPIA involved in the study, says: "We used to think of giant molecular clouds as solitary objects, drifting within the surrounding interstellar medium of rarefied gas in isolated splendor: the main repository of a galaxy's supply of hydrogen molecules. But our study shows that 50% of the hydrogen is outside the clouds, in a diffuse, disk-shaped hydrogen fog permeating the galaxy!"

This "fog" of surrounding gas plays an important role in star formation. So does a structural feature characteristic of spiral galaxies: the spiral arms, which slowly move through the galaxy like ripples on a lake, and are somewhat more densely filled with stars and gas than the rest of the galactic disk. Sharon Meidt, another MPIA post-doctoral researcher involved in the study, says: "These clouds are definitely not isolated. On the contrary, interactions between clouds, fog, and overall galactic structure appear to hold the key to whether or not a cloud will form new stars. When the molecular fog moves relative to the galaxy's spiral arms, the pressure it exerts on any clouds within is reduced, in line with a physical law known as Bernoulli's principle. Clouds feeling this reduced pressure are unlikely to form new stars."

Incidentally, Bernoulli's law is also thought to be responsible for part of the well-known shower-curtain effect: shower curtains blowing inward when one takes a hot shower, another display of reduced pressure.

Jerome Pety of the Institut de Radioastronomie Millimétrique (IRAM), which operates the telescopes used for the new observations, says: "It's good to see our telescopes live up to their full potential. A study that needed such extensive observation time, and required both an interferometer to discern vital details and our 30 m antenna to put those details into a larger context, would not have been possible at any other observatory."

Schinnerer concludes: "So far, the Whirlpool galaxy is one example which we have studied in depth. Next, we need to check that what we have found also applies to other galaxies. For our next steps, we hope to profit from both the extension NOEMA of the compound telescope on the Plateau de Bure and from the newly opened compound telescope ALMA in Chile, which will allow in-depth studies of more distant spiral galaxies."

Contacts and sources: 
Eva Schinnerer
Max Planck Institute for Astronomy

More information about the project can be found on the PAWS home page: http://www.mpia.de/home/PAWS

Friday, December 27, 2013

Toys, Books, Cribs Harbor Bacteria For Long Periods

Streptococcus biofilms persisted on objects and surfaces in a daycare center, in some cases after a cleaning.

Numerous scientific studies have concluded that two common bacteria that cause colds, ear infections, strep throat and more serious infections cannot live for long outside the human body. So conventional wisdom has long held that these bacteria won’t linger on inanimate objects like furniture, dishes or toys.

But University at Buffalo research published in Infection and Immunity shows that Streptococcus pneumoniae and Streptococcus pyogenes do persist on surfaces for far longer than has been appreciated. The findings suggest that additional precautions may be necessary to prevent infections, especially in settings such as schools, daycare centers and hospitals.

This SEM image shows a mature pneumococcal biofilm: the nearly round structures of S. pneumoniae bacteria are organized within a matrix of smaller, oddly shaped material that makes them more resistant to environmental stresses and antimicrobial agents.
Credit: Laura Marks

“These findings should make us more cautious about bacteria in the environment since they change our ideas about how these particular bacteria are spread,” says senior author Anders Hakansson, PhD, assistant professor of microbiology and immunology in the UB School of Medicine and Biomedical Sciences. “This is the first paper to show directly that these bacteria can survive well on various surfaces, including hands, and potentially spread between individuals.”

S. pneumoniae, a leading cause of ear infections in children and morbidity and mortality from respiratory tract infections in children and the elderly, is widespread in daycare centers and a common cause of hospital infections, says Hakansson. And in developing countries, where fresh water, good nutrition and common antibiotics may be scarce, S. pneumoniae often leads to pneumonia and sepsis, killing one million children every year.

S. pyogenes commonly causes strep throat and skin infections in school children but also can cause serious infection in adults.

The UB researchers found that in the daycare center, four out of five stuffed toys tested positive for S. pneumoniae, and several surfaces, such as cribs, tested positive for S. pyogenes, even after being cleaned. The testing was done just prior to the center's opening in the morning, so many hours had passed since the last human contact.

Hakansson and his co-authors became interested in the possibility that some bacteria might persist on surfaces when they published work last year showing that bacteria form biofilms when colonizing human tissues. They found that these sophisticated, highly structured biofilm communities are hardier than other forms of bacteria.

“Bacterial colonization doesn’t, by itself, cause infection but it’s a necessary first step if an infection is going to become established in a human host,” he explains. “Children, the elderly and others with compromised immune systems are especially vulnerable to these infections.”

He explains that studies of how long bacteria survive on inanimate objects have used cultures grown in laboratory media, called broth-grown planktonic bacteria, and invariably show that bacteria die rapidly.

“But we knew that this form of bacteria may not represent how they actually grow in the host,” says Hakansson. “Since discovering that biofilms are key to the pathogenesis of S. pneumoniae, we wanted to find out how well biofilm bacteria survive outside the body.”

The UB experiments found that month-old biofilm of S. pneumoniae and S. pyogenes from contaminated surfaces readily colonized mice, and that biofilms survived for hours on human hands and persisted on books and soft and hard toys and surfaces in a daycare center, in some cases, even after being well-cleaned.

“In all of these cases, we found that these pathogens can survive for long periods outside a human host,” says Hakansson. But, he says, the scientific literature maintains that you can only become infected by breathing in infected droplets expelled through coughing or sneezing by infected individuals.

“Commonly handled objects that are contaminated with these biofilm bacteria could act as reservoirs of bacteria for hours, weeks or months, spreading potential infections to individuals who come in contact with them,” concludes Hakansson. He cautions that more research should be done to understand under what circumstances this type of contact leads to spread between individuals.

“If it turns out that this type of spread is substantial, then the same protocols that are now used for preventing the spread of other bacteria, such as intestinal bacteria and viruses, which do persist on surfaces, will need to be implemented especially for people working with children and in health-care settings,” he adds.

Hakansson, who is affiliated with the Witebsky Center for Microbial Pathogenesis and Immunology and the New York State Center of Excellence in Bioinformatics and Life Sciences, both at UB, performed the study with co-authors Laura R. Marks, an MD/PhD candidate, and Ryan M. Reddinger, a PhD candidate, both in the Department of Microbiology and Immunology at UB.

The research was funded by the UB Dept. of Microbiology and Immunology and the UB medical school.

Contacts and sources:
University at Buffalo


Doomed Planet Foreshadows Earth's Fate

A group of astronomers that includes Amelia Bayo and Luigi Mancini of the Max Planck Institute for Astronomy has found a doomed planet that within 55 million years – a mere blink of the eye on astronomical scales – will be swallowed whole by its host star.

The evolution of planetary systems is intimately linked to the evolution of their host stars. Our understanding of the whole planetary evolution process is based on the large diversity of planets observed so far. To date, only a few tens of planets have been discovered orbiting stars ascending the Red Giant Branch. Although several theories have been proposed, the question of how planets die remains open due to the small number statistics, making clear the need to enlarge the sample of planets around post-main-sequence stars.

Artist's impression of the planet Kepler-91b which will be swallowed by its host star shortly (astronomically speaking).
Credit: Max Planck Institute for Astronomy/ David Cabezas Jimeno

This is a fate similar to that facing the Earth in roughly 5 billion years. The planet, known as Kepler-91b, currently orbits perilously close to its host star. The star, a red giant with a radius currently 6 times that of our sun, will swell to engulf the unhappy planet.

The team leader is Jorge Lillo of the Center of Astrobiology (CAB) in Madrid. The initial discovery was made with NASA's Kepler Space Telescope, based on observations taken in 2009–2012, but the object was only unambiguously identified as a planet when Lillo and his colleagues followed up on the initial observations using the 2.2 m telescope at Calar Alto Observatory in Andalusia.

Original article (accepted for publication in Astronomy & Astrophysics)


Contacts and sources:
Amelia Bayo (co-author)
Max Planck Institute for Astronomy

Sharks, Bees And Humans: What They Share When Hunting

A research team led by UA anthropologist David Raichlen has found that the Hadza tribe’s movements while foraging can be described by a mathematical pattern called a Lévy walk – a pattern that also is found in the movements of many other animals.

The Hadza people of Tanzania wore wristwatches with GPS trackers that followed their movements while hunting or foraging. Data showed that humans join a variety of other species including sharks and honeybees in using a Lévy walk pattern while foraging.

Photo by Brian Wood/Yale University
A mathematical pattern of movement called a Lévy walk describes the foraging behavior of animals from sharks to honey bees, and now for the first time has been shown to describe human hunter-gatherer movement as well. The study, led by University of Arizona anthropologist David Raichlen, was published today in the Proceedings of the National Academy of Sciences.

The Lévy walk pattern appears to be ubiquitous in animals, similar to the golden ratio, phi, a mathematical ratio that has been found to describe proportions in plants and animals throughout nature.

“Scientists have been interested in characterizing how animals search for a long time,” said Raichlen, an associate professor in the UA School of Anthropology, “so we decided to look at whether human hunter-gatherers use similar patterns.”

Funded by a National Science Foundation grant awarded to study co-author Herman Pontzer, Raichlen and his colleagues worked with the Hadza people of Tanzania.

The Hadza are among the last big-game hunters in Africa, and one of the last groups on Earth still foraging on foot with traditional methods. “If you want to understand human hunter-gatherer movement, you have to work with a group like the Hadza,” Raichlen said.

One of the last hunter-gatherer tribes on Earth, the Hadza people of Tanzania still hunt on foot with traditional foraging methods. “If you want to understand human hunter-gatherer movement, you have to work with a group like the Hadza,” said UA anthropologist David Raichlen, who led the study.

 Photo by Brian Wood/Yale University

Members of the tribe wore wristwatches with GPS units that tracked their movement while on hunting or foraging bouts. The GPS data showed that while the Hadza use other movement patterns, the dominant theme of their foraging movements is a Lévy walk – the same pattern used by many other animals when hunting or foraging.

“Detecting this pattern among the Hadza, as has been found in several other species, tells us that such patterns are likely the result of general foraging strategies that many species adopt, across a wide variety of contexts,” said study co-author Brian Wood, an anthropologist at Yale University who has worked with the Hadza people since 2004.

“This movement pattern seems to occur across species and across environments in humans, from East Africa to urban areas,” said Adam Gordon, study co-author and a physical anthropologist at the University at Albany, State University of New York. “It shows up all across the world in different species and links the way that we move around in the natural world. This suggests that it’s a fundamental pattern likely present in our evolutionary history.”

The Lévy walk, which involves a series of short movements in one area and then a longer trek to another area, is not limited to searching for food. Studies have shown that humans sometimes follow a Lévy walk while ambling around an amusement park. The pattern also can be used as a predictor for urban development.
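
Quantitatively, a Lévy walk draws its step lengths from a heavy-tailed power law, so most steps are short but occasional steps are very long. The sketch below is a generic illustration of that process in two dimensions, not the study's analysis code; the exponent mu and minimum step length are assumed values chosen for demonstration.

    import math
    import random

    def levy_walk(n_steps, mu=2.0, l_min=1.0, seed=1):
        """Generate a 2-D Levy walk. Step lengths follow a power law
        p(l) ~ l**(-mu) for l >= l_min, sampled by inverse transform;
        headings are uniform. 1 < mu <= 3 gives the heavy-tailed regime."""
        rng = random.Random(seed)
        x, y = 0.0, 0.0
        path = [(x, y)]
        for _ in range(n_steps):
            step = l_min * (1.0 - rng.random()) ** (-1.0 / (mu - 1.0))
            heading = rng.uniform(0.0, 2.0 * math.pi)
            x += step * math.cos(heading)
            y += step * math.sin(heading)
            path.append((x, y))
        return path

    print(levy_walk(1000)[-1])  # end point: clusters of short steps
                                # punctuated by rare long relocations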

“Think about your life,” Raichlen said. “What do you do on a normal day? Go to work and come back, walk short distances around your house? Then every once in a while you take these long steps, on foot, bike, in a car or on a plane. We tend to take short steps in one area and then take longer strides to get to another area.”

Following a Lévy walk pattern does not mean that humans don’t consciously decide where they are going, Raichlen said. “We definitely use memories and cues from the environment as we search,” he explained, “but this pattern seems to emerge in the process.”

In future studies, Raichlen and his colleagues hope to understand the reasons for using a Lévy walk and whether the pattern is determined by the distribution of resources in the environment.

“We’re very interested in studying why the Hadza use this pattern, what’s driving their hunting strategies and when they use this pattern versus another pattern,” said Pontzer, a member of the research team and an anthropologist at Hunter College in New York.

“We'd really like to know how and why specific environmental conditions or individual traits influence movement patterns,” added Wood.

Describing human movement patterns could also help anthropologists to understand how humans transported raw materials in the past, how our home ranges expanded and how we interact with our environment today, Raichlen noted.

“We can characterize these movement patterns across different human environments, and that means we can use this movement pattern to understand past mobility,” Raichlen said. “Also, finding patterns in nature is always fun.”


Contacts and sources:
David Raichlen
University of Arizona School of Anthropology

Wednesday, December 25, 2013

Mysterious Phobos, 360 Degree View

The innermost moon of Mars, Phobos, is seen here in full 360 degree glory. The images were taken by the High Resolution Stereo Camera (HRSC) on ESA's Mars Express at various times throughout the mission's 10 years.

The moon's parallel sets of grooves are perhaps the most striking feature, along with the giant 9 km-wide Stickney impact crater that dominates one face of the 27 x 22 x 18 km moon.


The origin of the moon's grooves is a subject of much debate. One idea assumes that the crater chains are associated with impact events on the moon itself.



Another idea suggests they result from Phobos moving through streams of debris thrown up from impacts 6000 km away on the surface of Mars, with each 'family' of grooves corresponding to a different impact event.

Mars Express has imaged Phobos from a wide range of distances, but will make its closest flyby yet on 29 December 2013, at just 45 km above the moon.

Although this is too close to take images, gravity experiments will give insight into the interior structure of Phobos.

Credits: ESA/DLR/FU Berlin (G. Neukum)

Monday, December 23, 2013

Enormous Aquifer Discovered Under Greenland Ice Sheet

Buried underneath compacted snow and ice in Greenland lies a large liquid water reservoir that has now been mapped by researchers using data from NASA's Operation IceBridge airborne campaign.

A team of glaciologists serendipitously found the aquifer while drilling in southeast Greenland in 2011 to study snow accumulation. Two of their ice cores were dripping water when the scientists lifted them to the surface, despite air temperatures of minus 4 F (minus 20 C). The researchers later used NASA's Operation IceBridge radar data to define the extent of the water reservoir, which spreads over 27,000 square miles (69,930 square km) – an area larger than the state of West Virginia. The water in the aquifer has the potential to raise global sea level by 0.016 inches (0.4 mm).

Glaciologist Lora Koenig (left) operates a video recorder that has been lowered into the bore hole to observe the ice structure of the aquifer in April 2013.

Image Credit:University of Utah/Clément Miège

"When I heard about the aquifer, I had almost the same reaction as when we discovered Lake Vostok [in Antarctica]: it blew my mind that something like that is possible," said Michael Studinger, project scientist for Operation IceBridge, a NASA airborne campaign studying changes in ice at the poles. "It turned my view of the Greenland ice sheet upside down – I don't think anyone had expected that this layer of liquid water could survive the cold winter temperatures without being refrozen."

Southeast Greenland is a region of high snow accumulation. Researchers now believe that the thick snow cover insulates the aquifer from cold winter surface temperatures, allowing it to remain liquid throughout the year. The aquifer is fed by meltwater that percolates from the surface during the summer.

The new research is being presented in two papers: one led by University of Utah's Rick Forster that was published on Dec. 22 in the journal Nature Geoscience and one led by NASA's Lora Koenig that has been accepted for publication in the journal Geophysical Research Letters. The findings will significantly advance the understanding of how melt water flows through the ice sheet and contributes to sea level rise.

When a team led by Forster accidentally drilled into water in 2011, they weren't able to continue studying the aquifer because their tools were not suited to work in an aquatic environment. Afterward, Forster's team determined the extent of the aquifer by studying radar data from Operation IceBridge together with ground-based radar data. The top of the water layer clearly showed in the radar data as a return signal brighter than the ice layers.

An ice core segment extracted from the aquifer by Koenig's team, with trapped water collecting at the lower left of the core.

Image Credit: NASA's Goddard Space Flight Center/Ludovic Brucker

Koenig, a glaciologist with NASA's Goddard Space Flight Center in Greenbelt, Md., co-led another expedition to southeast Greenland with Forster in April 2013 specifically designed to study the physical characteristics of the newly discovered water reservoir. Koenig's team extracted two cores of firn (aged snow) that were saturated with water. They used a water-resistant thermoelectric drill to study the density of the ice and lowered strings packed with temperature sensors down the holes, and found that the temperature of the aquifer hovers around 32 F (zero C), warmer than they had expected it to be.

Koenig and her team measured the top of the aquifer at around 39 feet (12 meters) under the surface. This was the depth at which the boreholes filled with water after extracting the ice cores. They then determined the amount of water in the water-saturated firn cores by comparing them to dry cores extracted nearby. The researchers determined the depth at which the pores in the firn close, trapping the water inside the bubbles – at this point, there is a change in the density of the ice that the scientists can measure. This depth is about 121 feet (37 meters) and corresponds to the bottom of the aquifer. Once Koenig’s team had the density, depth and spatial extent of the aquifer, they were able to come up with an estimated water volume of about 154 billion tons (140 metric gigatons). If this water was to suddenly discharge to the ocean, this would correspond to 0.016 inches (0.4 mm) of sea level rise.
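
The quoted sea-level figure follows from spreading the stored water volume over the global ocean. A quick back-of-envelope check, not taken from the papers (ocean area and water density are standard round values):

    # Sea-level equivalent of the aquifer's estimated water mass.
    water_mass_kg = 140e9 * 1000.0  # 140 metric gigatons, in kg
    density_kg_m3 = 1000.0          # fresh water
    ocean_area_m2 = 3.61e14         # global ocean surface, ~3.61e8 km^2

    rise_m = (water_mass_kg / density_kg_m3) / ocean_area_m2
    print(f"sea-level equivalent ~ {rise_m * 1000.0:.2f} mm")  # ~0.39 mm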

Researchers think that the perennial aquifer is a heat reservoir for the ice sheet in two ways: melt water carries heat when it percolates from the surface down the ice to reach the aquifer. And if the trapped water were to refreeze, it would release latent heat. Altogether, this makes the ice in the vicinity of the aquifer warmer, and warmer ice flows faster toward the sea.

"Our next big task is to understand how this aquifer is filling and how it's discharging," said Koenig. "The aquifer could offset some sea level rise if it's storing water for long periods of time. For example after the 2012 extreme surface melt across Greenland, it appears that the aquifer filled a little bit. The question now is how does that water leave the aquifer on its way to the ocean and whether it will leave this year or a hundred years from now."


Contacts and sources:
Rob Gutro
NASA/Goddard Space Flight Center

'Runaway' Mechanism In Intermediate-Depth Earthquakes Discovered

Researchers find immense heating at high pressures helps spread intermediate-depth quakes.

Nearly 25 percent of earthquakes occur more than 50 kilometers below the Earth’s surface, when one tectonic plate slides below another, in a region called the lithosphere. Scientists have thought that these rumblings from the deep arise from a different process than shallower, more destructive quakes. But limited seismic data, and difficulty in reproducing these quakes in the laboratory, have combined to prevent researchers from pinpointing the cause of intermediate and deep earthquakes. 

Local seismometers detect clusters of intermediate-depth earthquakes in and around the Colombian city of Bucaramanga. The source of the quakes, more than 50 kilometers below the surface, is known as the "Nest."
Image courtesy of the researchers

Now a team from MIT and Stanford University has identified a mechanism that helps these deeper quakes spread. By analyzing seismic data from a region in Colombia with a high concentration of intermediate-depth earthquakes, the researchers identified a “runaway process” in which the sliding of rocks at great depths causes surrounding temperatures to spike. This influx of heat, in turn, encourages more sliding — a feedback mechanism that propagates through the lithosphere, generating an earthquake.

German Prieto, an assistant professor of geophysics in MIT’s Department of Earth, Atmospheric and Planetary Sciences, says that once thermal runaway starts, the surrounding rocks can heat up and slide more easily, raising the temperature very quickly.

“What we predict is for medium-sized earthquakes, with magnitude 4 to 5, temperature can rise up to 1,000 degrees Centigrade, or about 1,800 degrees Fahrenheit, in a matter of one second,” Prieto says. “It’s a huge amount. You’re basically allowing rupture to run away because of this large temperature increase.”

Prieto says that understanding deeper earthquakes may help local communities anticipate how much shaking they may experience, given the seismic history of their regions.

He and his colleagues have published their results in the journal Geophysical Research Letters.

Water versus heat: two competing theories

The majority of Earth’s seismic activity occurs at relatively shallow depths, and the mechanics of such quakes is well understood: Over time, abutting plates in the crust build up tension as they shift against each other. This tension ultimately reaches a breaking point, creating a sudden rupture that splinters through the crust.

However, scientists have determined that this process is not feasible for quakes that occur far below the surface. Essentially, higher temperatures and pressures at these depths would make rocks behave differently than they would closer to the surface, gliding past rather than breaking against each other.

By way of explanation, Prieto draws an analogy to glass: If you try to bend a glass tube at room temperature, with enough force, it will eventually shatter. But with heating, the tube will become much more malleable, and bend without breaking.

So how do deeper earthquakes occur? Scientists have proposed two theories: The first, called dehydration embrittlement, is based on the small amounts of water in rocks’ mineral composition. At high pressure and heat, rocks release water, which lubricates surrounding faults, creating fractures that ultimately set off a quake.

The second theory is thermal runaway: Increasing temperatures weaken rocks, promoting slippage that spreads through the lithosphere, further increasing temperatures and causing more rocks to slip, resulting in an earthquake.

Probing the nest

Prieto and his colleagues found new evidence in support of the second theory by analyzing seismic data from a region of Colombia that experiences large numbers of intermediate-depth earthquakes — quakes whose epicenters are 50 to 300 kilometers below the surface. This region, known as the Bucaramanga Nest, hosts the highest concentration of intermediate-depth quakes in the world: Since 1993, more than 80,000 earthquakes have been recorded in the area, making it, in Prieto’s view, an “ideal natural laboratory” for studying deeper quakes.

The researchers analyzed seismic waves recorded by nearby surface seismometers and calculated two parameters: stress drop, or the total amount of energy released by an earthquake, and radiated seismic energy, or the amount of that energy that makes it to the surface as seismic waves — energy that is manifested in the shaking of the ground.

The stronger a quake is, the more energy, or heat, it generates. Interestingly, the MIT group found that only 2 percent of a deeper quake’s total energy is felt at the surface. Prieto reasoned that much of the other 98 percent may be released locally as heat, creating an enormous temperature increase that pushes a quake to spread.
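
The scale of that heating can be seen with a standard shear-heating estimate: if nearly all the frictional work on a thin fault zone stays put as heat, the temperature rise is the shear stress times the slip divided by the heat capacity of the sheared layer. The sketch below uses illustrative values, not the paper's parameters:

    # Shear heating in a thin fault zone: dT = tau * slip / (rho * c * w),
    # assuming almost all frictional work is retained locally as heat.
    tau = 60e6    # shear stress, Pa (illustrative assumption)
    slip = 0.5    # coseismic slip, m (plausible for a magnitude 4-5 event)
    rho = 3300.0  # rock density, kg/m^3
    c = 1000.0    # specific heat of rock, J/(kg*K)
    w = 0.01      # shear-zone thickness, m (a centimeter-scale layer)

    dT = tau * slip / (rho * c * w)
    print(f"temperature rise ~ {dT:.0f} K")  # ~900 K, the same order as
    # the ~1,000 degrees C spike Prieto describes for such events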

Prieto says the study provides strong evidence for thermal runaway as the likely mechanism for intermediate-depth earthquakes. Such knowledge, he says, may be useful for communities around Bucaramanga in predicting the severity of future quakes.

“Usually people in Bucaramanga feel a magnitude 4 quake every month or so, and every year they experience a larger one that can shake significantly,” Prieto says. “If you’re in a region where you have intermediate-depth quakes and you know the size of the region, you can make a prediction of the type of magnitudes of quakes that you can have, and what kind of shaking you would expect.”

While scientists have not focused as much on intermediate-depth earthquakes because most do not cause significant damage, Hiroo Kanamori, a professor emeritus of geophysics at the California Institute of Technology, says more knowledge about these quakes is warranted, as some have been destructive in the past.

“Some intermediate events have caused very strong shaking,” says Kanamori, who was not involved in the research. “For example, one of the intermediate-depth aftershocks of the magnitude 9 Tohoku-Oki earthquake in 2011 caused ground motion accelerations as large as, [and] at some locations [even larger than, those] of the main shock. Thus, a better physical understanding of the mechanism of intermediate earthquakes has important implications for hazard mitigation.”

Prieto, a native Colombian, plans to deploy seismic stations above the Bucaramanga Nest to better understand the activity of deeper quakes.


Contacts and sources:
Kimberly Allen
Massachusetts Institute of Technology

Why Does It Snow So Much In Frozen North

When the snow doesn’t show signs of stopping, most of us just mumble a few choice words and get out the snow shovel. Scientists, however, wonder where all that snow is coming from, particularly in pristine places like the Arctic. Raymond Shaw and his colleagues may have found an answer.

Here’s the conundrum: Snow doesn’t just materialize out of thin air. For those delicate, six-sided crystals of ice to form, they need a nucleus, a speck of dust, where water molecules can cling and order their structure as they freeze. Those ice-forming nuclei are relatively rare. Yet, over the Arctic, where the atmosphere is very clean and the ocean is covered with ice, sometimes it snows interminably. With bazillions of snowflakes crystallizing over dust specks and falling to Earth, why don’t the clouds run out of nuclei? And why doesn’t it quit snowing?
Credit: NOAA Climate Program Office, NABOS 2006 Expedition

The same question applies to a lesser degree in places like Lake Superior, whose soggy clouds drop countless megatons of snow on the hapless residents of Minnesota, Wisconsin and Michigan.

“Within a few hours, you basically purge the atmosphere of all those particles,” said Shaw, a physicist at Michigan Technological University. “So how can it snow for days on end?”

To answer the question, Shaw and his colleagues, including graduate student Fan Yang, set about developing a model to describe how ice crystals form, grow and fall, and they backed it up using data on arctic clouds, which are very well studied. They hoped that by characterizing just how snow comes into being, they would uncover clues to the puzzle.

What they discovered surprised them. As the number of snow crystals increases, their total mass soars in proportion to that number raised to the power 2.5. “Our first guess would have been that if you triple the number of crystals, you triple the mass,” said Shaw. “It turns out to be a much stronger relationship than that.” For example, if you triple the number of crystals, the mass goes up by a factor of about 16. Simply put, the more crystals you have, the bigger they are.
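
The "factor of about 16" is just this power law evaluated for a tripling. A one-line check (the 2.5 exponent comes from the article; the code is merely illustrative):

    # Total crystal mass scales as (number of crystals)**2.5, so
    # tripling the crystal count multiplies the total mass by:
    print(3 ** 2.5)  # ~15.6, the factor of about 16 quoted above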

Their model hinges on the idea that ice crystals are forming on atmospheric particles that were previously thought to be useless for making ice crystals. “The key assumption we made was that there’s a hidden source of ice nuclei that’s always there, but they are just really, really low efficiency,” said Shaw. “The consensus in the research community has been that you need special pieces of dust to catalyze the ice. We thought, ‘What if there was more stuff out there that would produce ice if you just wait long enough? Maybe when you put it in contact with a drop of water, it doesn’t freeze immediately. But if you wait an hour, or two hours, it does.’ Our model assumes that the atmosphere is full of those really inefficient nuclei.”

Those inefficient nuclei are behind those big crystals that show up during heavy snowfalls.

“The mass of an ice crystal is related to its growth time,” Shaw said. “The longer it’s in the cloud, the bigger it will be.” So, when there’s an updraft that keeps crystals from falling, snowflakes that form on regular, snow-forming particles get larger and larger. During that time, many more snowflakes have a chance to form on weak nuclei.

Eventually, all the snow crystals get too heavy for the updraft to support, and they tumble earthward. By then, they are huge, and there are lots of them. Not only is that borne out in Shaw’s model, it also appears to fit with data gathered from arctic clouds.

They don’t know what those weak nuclei are, or where they come from. But the scientists on Shaw’s team are confident enough in their existence that they are looking for them in lab experiments.

“By assuming they are there, we got this mathematical prediction that fits with the experimental data,” said Shaw. “So there’s indirect evidence that these inefficient nuclei are there. This could be a solution to the mystery.”

The study was funded by the US Department of Energy. An article describing their work, “A Minimalist Approach to Modeling Complex Arctic Clouds,” coauthored by Shaw, Yang and Mikhail Ovchinnikov, a research scientist at Pacific Northwest National Laboratory, was published in the July 2013 edition of Geophysical Research Letters.


Contacts and sources:
Michigan Technological University

Dazzling New Images Of Saturn And Its Moons

This holiday season, feast your eyes on images of Saturn and two of its most fascinating moons, Titan and Enceladus, in a care package from NASA's Cassini spacecraft. All three bodies are dressed and dazzling in this special package assembled by Cassini's imaging team.

The globe of Saturn, seen here in natural color, is reminiscent of a holiday ornament in this wide-angle view from NASA's Cassini spacecraft.

Image Credit: NASA/JPL-Caltech/Space Science Institute

"During this, our tenth holiday season at Saturn, we hope that these images from Cassini remind everyone the world over of the significance of our discoveries in exploring such a remote and beautiful planetary system," said Carolyn Porco, Cassini imaging team leader, based at the Space Science Institute, Boulder, Colo. "Happy holidays from all of us on Cassini."

Winter is approaching in the southern hemisphere of Saturn and with this cold season has come the familiar blue hue that was present in the northern winter hemisphere at the start of NASA's Cassini mission.

Image Credit: NASA/JPL-Caltech/Space Science Institute

Two views of Enceladus are included in the package and highlight the many fissures, fractures and ridges that decorate the icy moon's surface. Enceladus is a white, glittering snowball of a moon, now famous for the nearly 100 geysers that are spread across its south polar region and spout tiny icy particles into space. Most of these particles fall back to the surface as snow. Some small fraction escapes the gravity of Enceladus and makes its way into orbit around Saturn, forming the planet's extensive and diffuse E ring. Because scientists believe these geysers are directly connected to a subsurface, salty, organic-rich, liquid-water reservoir, Enceladus is home to one of the most accessible extraterrestrial habitable zones in the solar system.

Using a special spectral filter, the high-resolution camera aboard NASA's Cassini spacecraft was able to peer through the hazy atmosphere of Saturn's moon Titan.

Image Credit: NASA/JPL-Caltech/Space Science Institute

Packaged along with Saturn and Enceladus is a group of natural-color images of Saturn's largest moon, Titan, highlighting two of Titan's most outstanding features. Peering through the moon's hazy, orange atmosphere, the Cassini narrow-angle camera spots dark, splotchy features in the polar regions of the moon. These features are the lakes and seas of liquid methane and ethane for which the moon is renowned. 

Lakes Through the Haze: Saturn's moon Titan imaged by NASA's Cassini spacecraft.
Image Credit: NASA/JPL-Caltech/Space Science Institute

Titan is the only other place in the solar system that we know has stable liquids on its surface, though in Titan's case, the liquids are ethane and methane rather than water. At Titan's south pole, a swirling high-altitude vortex stands out distinctly against the darkness of the moon's un-illuminated atmosphere. Titan's hazy atmosphere and surface environment are believed to be similar in certain respects to the early atmosphere of Earth.

NASA's Cassini captures a still and partially sunlit Enceladus.
Image Credit: NASA/JPL-Caltech/Space Science Institute

This view looks toward the side of Enceladus (313 miles or 504 kilometers across) that faces backward in the moon's orbit around Saturn. North on Enceladus is up. The images were taken with the Cassini spacecraft narrow-angle camera on April 7, 2010, using filters sensitive to ultraviolet, visible and infrared light (spanning wavelengths from 338 to 750 nanometers).

But the planet that towers over these moons is a celestial wonder itself. The north and south poles of Saturn are highlighted and appear drastically different from each other, as seen in new natural-color views. The globe of Saturn resembles a holiday ornament in a wide-angle image overlooking its north pole, bringing into view the hexagonal jet stream and rapidly spinning polar vortex that reside there. And the planet's south pole, now in winter, looking very different than the springtime north, displays brilliant blue hues, reminiscent of a frosty winter wonderland.

"Until Cassini arrived at Saturn, we didn't know about the hydrocarbon lakes of Titan, the active drama of Enceladus' jets, and the intricate patterns at Saturn's poles," said Linda Spilker, the Cassini project scientist at NASA's Jet Propulsion Laboratory, Pasadena, Calif. "Spectacular images like these highlight that Cassini has given us the gift of knowledge, which we have been so excited to share with everyone."

Launched in 1997, Cassini has explored the Saturn system for more than nine years. NASA plans to continue the mission through 2017, with the anticipation of much more groundbreaking science and imagery to come.

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory (JPL), a division of the California Institute of Technology in Pasadena, manages the Cassini-Huygens mission for NASA's Science Mission Directorate, Washington. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging team consists of scientists from the U.S., England, France, and Germany. The imaging team is based at the Space Science Institute in Boulder, Colo.

The new images are available online at: http://www.nasa.gov/cassini , http://saturn.jpl.nasa.gov and http://ciclops.org .


Contacts and sources:
Jia-Rui Cook
Jet Propulsion Laboratory, Pasadena, Calif.

Steve Mullins
Space Science Institute

Sunday, December 22, 2013

Scientists Solve A Decades-Old Mystery In The Earth's Upper Atmosphere

New research published in the journal Nature resolves decades of scientific controversy over the origin of the extremely energetic particles known as ultra-relativistic electrons in the Earth's near-space environment and is likely to influence our understanding of planetary magnetospheres throughout the universe.

Schematic illustration of electron acceleration by 'chorus'. The top panel shows electron fluxes before (left) and after (right) a geomagnetic storm. The injection of low-energy plasma sheet electrons into the inner magnetosphere (1) causes chorus wave excitation in the low-density region outside the cold plasmasphere (2). Local energy diffusion associated with wave scattering leads to the development of strongly enhanced phase space density just outside the plasmapause (3). Subsequently, radial diffusion can redistribute the accelerated electrons inwards or outwards from the developing peak (4).
Credit: UCLA

Discovering the processes that control the formation and ultimate loss of these electrons in the Van Allen radiation belts — the rings of highly charged particles that encircle the Earth at a range of about 1,000 to 50,000 kilometers above the planet's surface — is a primary science objective of the recently launched NASA Van Allen Probes mission. Understanding these mechanisms has important practical applications, because the enormous amounts of radiation trapped within the belts can pose a significant hazard to satellites and spacecraft, as well as to astronauts performing activities outside a craft.

Ultra-relativistic electrons in the Earth's outer radiation belt can exhibit pronounced variability in response to activity on the sun and changes in the solar wind, but the dominant physical mechanism responsible for radiation-belt electron acceleration has remained unresolved for decades. Two primary candidates for this acceleration have been "inward radial diffusive transport" and "local stochastic acceleration" by very low-frequency plasma waves.

In research published Dec. 19 in Nature, lead author Richard Thorne, a distinguished professor of atmospheric and oceanic sciences in the UCLA College of Letters and Science, and his colleagues report on high-resolution satellite measurements of high-energy electrons during a geomagnetic storm on Oct. 9, 2012, which they have numerically modeled using a newly developed data-driven global wave model.

Their analysis reveals that scattering by intense, natural very low-frequency radio waves known as "chorus" in the Earth's upper atmosphere is primarily responsible for the observed relativistic electron build-up.

The team's detailed modeling, together with previous observations of peaks in electron phase space density reported earlier this year by Geoff Reeves and colleagues in the journal Science, demonstrates the remarkable efficiency of natural wave acceleration in the Earth's near-space environment and shows that radial diffusion was not responsible for the observed acceleration during this storm, Thorne said.
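
The inference from a growing peak to local acceleration can be made concrete with the standard radial diffusion equation, given here in a textbook form rather than as reproduced from the paper:

    \[
    \frac{\partial f}{\partial t}
      = L^2 \frac{\partial}{\partial L}
        \left( \frac{D_{LL}}{L^2}\,\frac{\partial f}{\partial L} \right)
    \]

Here f is the electron phase space density, L labels the drift shell and D_LL is the radial diffusion coefficient. At a local maximum of f, ∂f/∂L = 0 and ∂²f/∂L² < 0, so the right-hand side reduces to D_LL ∂²f/∂L² < 0: pure radial diffusion can only flatten such a peak, never build one. A peak that grows in place, as observed during the October 2012 storm, therefore requires a local source of acceleration such as chorus-wave scattering.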

Co-authors of the new research include Qianli Ma, a graduate student who works in Thorne's lab; Wen Li, Binbin Ni and Jacob Bortnik, researchers in Thorne's lab; and members of the science teams on the Van Allen Probes, including Harlan Spence of the University of New Hampshire (principal investigator for RBSP-ECT) and Craig Kletzing of the University of Iowa (principal investigator for EMFISIS).

The local wave-acceleration process is a "universal physical process" and should also be effective in the magnetospheres of Jupiter, Saturn and other magnetized plasma environments in the cosmos, Thorne said. He thinks the new results from the detailed analysis of Earth will influence future modeling of other planetary magnetospheres.

The Van Allen radiation belts were discovered in the Earth's upper atmosphere in 1958 by a team led by space scientist James Van Allen.

The new research was funded by NASA, which launched the twin Van Allen Probes in the summer of 2012.


Contacts and sources:
UCLA

Adult Stem Cells Found To Suppress Cancer While Dormant

Researchers at UCLA's Eli and Edythe Broad Center of Regenerative Medicine and Stem Cell Research have discovered a mechanism by which certain adult stem cells suppress their ability to initiate skin cancer during their dormant phase — an understanding that could be exploited for better cancer-prevention strategies.

Adult stem cell displaying typical ultrastructural characteristics.
Credit: Wikipedia

The study, which was led by UCLA postdoctoral fellow Andrew White and William Lowry, an associate professor of molecular, cell and developmental biology who holds the Maria Rowena Ross Term Chair in Cell Biology in the UCLA College of Letters and Science, was published online Dec. 15 in the journal Nature Cell Biology.

Hair follicle stem cells, the tissue-specific adult stem cells that generate the hair follicles, are also the cells of origin for cutaneous squamous cell carcinoma, a common skin cancer. These stem cells cycle between periods of activation (during which they can grow) and quiescence (when they remain dormant).

Using mouse models, White and Lowry applied known cancer-causing genes to hair follicle stem cells and found that during their dormant phase, the cells could not be made to initiate skin cancer. Once the cells entered their active period, however, they began to form tumors.

"We found that this tumor suppression via adult stem cell quiescence was mediated by PTEN, a gene important in regulating the cell's response to signaling pathways," White said. "Therefore, stem cell quiescence is a novel form of tumor suppression in hair follicle stem cells, and PTEN must be present for the suppression to work."

Understanding cancer suppression through quiescence could better inform preventative strategies for certain patients, such as organ transplant recipients, who are particularly susceptible to squamous cell carcinoma, and for those taking the drug vemurafenib for melanoma, another type of skin cancer. The study also may reveal parallels between squamous cell carcinoma and other cancers in which stem cells have a quiescent phase.


Contacts and sources: 
Shaun Mason
University of California - Los Angeles

Solar Activity Not A Key Cause Of Climate Change, Study Shows

Climate change has not been strongly influenced by variations in heat from the sun, a new scientific study shows.

The findings overturn a widely held scientific view that lengthy periods of warm and cold weather in the past might have been caused by periodic fluctuations in solar activity.

A solar cycle: a montage of ten years' worth of Yohkoh SXT images, demonstrating the variation in solar activity during a sunspot cycle, from after August 30, 1991, to September 6, 2001. 
Credit: The Yohkoh mission of ISAS (Japan) and NASA (US).

Research examining the causes of climate change in the northern hemisphere over the past 1,000 years has shown that until the year 1800, volcanic eruptions were the key driver of periodic changes in climate. Eruptions tend to block sunlight from reaching the Earth, causing cooler, drier weather. Since 1900, greenhouse gases have been the primary cause of climate change.

The findings indicate that periods of low solar activity should not be expected to have a large impact on temperatures on Earth, and they should improve scientists' understanding of past climate and help with climate forecasting.

Scientists at the University of Edinburgh carried out the study using records of past temperatures constructed from tree-ring data and other historical sources. They compared this record with computer-based simulations of past climate, run with both large and small variations in solar activity.

They found that the simulations featuring weak solar variations matched the temperature records best, indicating that solar activity has had a minimal impact on temperature over the past millennium.
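
In essence, the test is a correlation between reconstructed and simulated temperature series. The Python sketch below illustrates the idea on synthetic data; the forcing shapes, amplitudes and 210-year solar period are illustrative assumptions, not the study's actual inputs or method.

    import numpy as np

    rng = np.random.default_rng(0)
    years = np.arange(1000, 2000)

    # Synthetic stand-ins for a proxy temperature reconstruction and two
    # climate-model runs that share the same volcanic forcing but assume
    # different strengths of solar influence.
    volcanic = -0.5 * (rng.random(years.size) < 0.02)      # occasional cooling spikes
    solar_strong = 0.5 * np.sin(2 * np.pi * years / 210)   # exaggerated solar effect
    solar_weak = 0.1 * solar_strong                        # weak solar effect

    reconstruction = volcanic + solar_weak + 0.05 * rng.standard_normal(years.size)
    model_strong = volcanic + solar_strong + 0.05 * rng.standard_normal(years.size)
    model_weak = volcanic + solar_weak + 0.05 * rng.standard_normal(years.size)

    # The run whose solar forcing best matches the "real" series should
    # correlate more strongly with the reconstruction.
    for name, run in [("strong sun", model_strong), ("weak sun", model_weak)]:
        r = np.corrcoef(reconstruction, run)[0, 1]
        print(f"{name}: r = {r:.2f}")

Running the sketch, the weak-sun run correlates better with the synthetic reconstruction, mirroring the paper's conclusion that models with modest solar variability best reproduce the proxy record.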

The study, published in Nature Geoscience, was supported by the Natural Environment Research Council.

Dr Andrew Schurer, of the University of Edinburgh's School of GeoSciences, said: "Until now, the influence of the sun on past climate has been poorly understood. We hope that our new discoveries will help improve our understanding of how temperatures have changed over the past few centuries, and improve predictions for how they might develop in future. Links between the sun and anomalously cold winters in the UK are still being explored."


Contacts and sources:
Catriona Kelly
University of Edinburgh

Saturday, December 21, 2013

8,400+ Terrorist Attacks in 2012, New Data Shows

Although terrorism touched 85 countries in 2012, just three - Pakistan, Iraq and Afghanistan - suffered more than half of 2012's attacks (54 percent) and fatalities (58 percent), according to new data released today by the National Consortium for the Study of Terrorism and Responses to Terrorism (START) Global Terrorism Database (GTD), based at the University of Maryland. The next five most frequently targeted countries were India, Nigeria, Somalia, Yemen and Thailand.


"While terrorist attacks have in large part moved away from Western Europe and North America to Asia, the Middle East and Africa, worldwide terrorism is reaching new levels of destructiveness," said Gary LaFree, START director and professor of criminology and criminal justice at UMD.

In addition to illustrating a continued shift in location of attacks, the new data -- with more than 8,400 terrorist attacks killing more than 15,400 people in 2012 -- also show an increase in attacks and fatalities over the past decade. The previous record for attacks was set in 2011 with more than 5,000 incidents; for fatalities, the previous high was 2007 with more than 12,500 deaths.

It is important to note that beginning with 2012 data collection, START made several important changes to the GTD collection methodology, improving the efficiency and comprehensiveness of the process. As a result of these improvements, a direct comparison between 2011 and 2012 likely overstates the increase in total attacks and fatalities worldwide during this time period. However, analysis of the data indicates that this increase began before the shift in data collection methodology, and important developments in key conflicts around the world suggest that considerable upward trends remain even when accounting for the possibility of methodological artifacts.

In the 1970s, most attacks occurred in Western Europe. In the 1980s, Latin America saw the most terrorist acts. Beginning with the 1990s, South Asia, North Africa and the Middle East have seen steadily rising numbers of attacks, a trend that has accelerated in recent years.

"The other striking development in recent years is the incredible growth in terrorist attacks linked to al-Qaida affiliates," LaFree said.

Though al-Qaida central was not directly responsible for any attacks in 2012, the six deadliest terrorist groups in the world were all affiliated to some extent with the organization. These include the Taliban (more than 2,500 fatalities), Boko Haram (more than 1,200 fatalities), al-Qaida in the Arabian Peninsula (more than 960 fatalities), Tehrik-e Taliban Pakistan (more than 950 fatalities), al-Qaida in Iraq (more than 930 fatalities) and al-Shabaab (more than 700 fatalities).

Attacks in Yemen, Nigeria and Iraq were among the deadliest in 2012.

On Jan. 5, unidentified Sunni perpetrators in Dhi Qar, Al Anbar and Baghdad, Iraq, bombed various Shiite civilian targets, including pilgrims and laborers, in six separate attacks. Nearly 120 people were killed across all six attacks, including at least two suicide bombers.

In Nigeria on Jan. 20, approximately 190 people were killed in bombings targeting government, police, media, schools, utilities and private citizens, primarily in Kano. Boko Haram claimed responsibility for the attacks, indicating that they were carried out in response to Nigerian authorities detaining and killing Boko Haram members.

On March 4, al-Qaida in the Arabian Peninsula attacked a series of military targets in Zinjibar, Yemen, killing a total of 195 soldiers and kidnapping 73. More than 40 perpetrators were also killed in these attacks, and the hostages were released the following month.

GTD data files and documentation are available for download from the START website for users who would like to conduct custom analysis of the data. In addition to the methodological improvements made in the collection and coding process, the GTD team has added elements to improve the experience for those using the database. 2012 is the first year of data that includes geocodes for all attacks that occurred worldwide. Efforts to geocode the historical data back to 1970 are ongoing, and with the current update the geocoding process was completed for an additional 20 countries in North Africa and Southeast Asia. This information makes it possible for analysts to explore geospatial patterns of terrorist violence and more easily identify the sub-national concentrations of attacks.

Other new variables include target subtypes, which systematically classify targets into more specific categories. For example, while previous versions of the data allowed users to explore a subset of attacks against transportation targets, now analysts can easily identify attacks that target buses (42 percent of all transportation attacks), trains (33 percent), bridges and tunnels (9 percent), stations (7 percent), roads (4 percent), subways (2 percent), or taxis (1 percent).
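
For users downloading the files, a breakdown like the one above takes only a few lines of pandas. This is a minimal sketch: the column names targtype1_txt and targsubtype1_txt follow the GTD codebook's naming scheme, but the file name and encoding are assumptions to check against the release actually obtained from START.

    import pandas as pd

    # Load the GTD export (file name and encoding may differ by release).
    gtd = pd.read_csv("globalterrorismdb.csv", encoding="latin-1", low_memory=False)

    # Restrict to attacks on transportation targets, then tally subtypes.
    transport = gtd[gtd["targtype1_txt"] == "Transportation"]
    shares = transport["targsubtype1_txt"].value_counts(normalize=True) * 100
    print(shares.round(1))   # buses, trains, bridges/tunnels, stations, ...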

"This update includes a number of improvements that we have been working on for several years, in response to common requests from users," said Erin Miller, GTD program manager. "We are always happy to get feedback on what types of information would make this a more useful resource and better serve the needs of researchers and practitioners."

According to Miller, the most commonly requested feature is the ability to distinguish between international and domestic attacks. To address this need, the GTD team developed a set of indicators that classify attacks as international or domestic across several dimensions, including logistics (whether the perpetrator group crossed a border to carry out the attack) and ideology (whether the perpetrator group was attacking a target of a different nationality, regardless of where the attack took place). The domestic/international indicators and other new variables are currently included in the downloadable data files. START plans to incorporate them into the online user interface in a future update. More information about the new variables can be found in the GTD Codebook.
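
A minimal sketch of how such indicators could be queried, assuming indicator columns named INT_LOG and INT_IDEO with 1 = international, 0 = domestic and -9 = unknown; the names and codes should be confirmed against the GTD Codebook.

    import pandas as pd

    gtd = pd.read_csv("globalterrorismdb.csv", encoding="latin-1", low_memory=False)

    # Keep attacks where the logistics dimension is known (-9 marks unknown).
    known = gtd[gtd["INT_LOG"].isin([0, 1])]
    print(f"{known['INT_LOG'].mean() * 100:.1f}% logistically international")

    # Attacks that are international on either dimension.
    either = gtd[(gtd["INT_LOG"] == 1) | (gtd["INT_IDEO"] == 1)]
    print(len(either), "attacks international by logistics or ideology")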

With this data release, the GTD now contains information on more than 113,000 domestic and international terrorist attacks between 1970 and 2012 that resulted in more than 243,000 deaths and more than 324,000 injuries. These attacks are defined as the threatened or actual use of illegal force and violence by a non-state actor to attain a political, economic, religious or social goal through fear, coercion or intimidation.

The GTD is funded through START by the Department of Homeland Security Science and Technology Directorate's Office of University Programs, the U.S. Department of State's Bureau of Counterterrorism, and the Department of Homeland Security Science and Technology Directorate's Resilient Systems Division.


Contacts and sources:
University of Maryland 

Radiation Belt Yields Clues To Unsolved Mystery

New research using data from NASA's Van Allen Probes mission helps resolve decades of scientific uncertainty over the origin of ultra-relativistic electrons in Earth's near space environment, and is likely to influence our understanding of planetary magnetospheres throughout the universe.

Graphic depiction of NASA's Van Allen Probes orbiting within Earth's radiation belts.
Image Credit: NASA

Understanding the processes that control the formation and ultimate loss of such relativistic electrons is a primary science objective of the Van Allen Probes and has important practical applications, because of the enormous amounts of radiation trapped within the two Van Allen radiation belts. The belts, consisting of high-energy electrons and protons discovered above Earth's upper atmosphere in 1958 by James Van Allen, can pose a significant hazard to satellites and spacecraft, as well as to astronauts performing activities outside a spacecraft.

Such electrons in the Earth's outer radiation belt can exhibit pronounced increases in intensity in response to activity on the sun and changes in the solar wind, but the dominant physical mechanism responsible for this radiation-belt electron acceleration has remained unresolved for decades.

Two primary candidates for electron acceleration exist, one external and one internal. From outside the belts, a theoretical process known as inward radial diffusive transport has been developed. From within the belts, scientists hypothesize that the electrons are undergoing strong local acceleration from very low frequency plasma waves. Controversies also exist as to the very nature of the wave acceleration: Is it stochastic – that is, a linear and diffusive process – or is it non-linear and coherent?
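
The two pictures leave different statistical fingerprints. The toy Monte Carlo below (deliberately simplified, with arbitrary units and kick sizes, and not the model used in the study) contrasts them: zero-mean random kicks spread a population diffusively in energy, while a coherent process marches every electron upward in lockstep.

    import numpy as np

    rng = np.random.default_rng(1)
    n, steps = 100_000, 500
    stochastic = np.ones(n)   # electron energies, arbitrary units
    coherent = np.ones(n)

    for _ in range(steps):
        stochastic += 0.01 * rng.standard_normal(n)  # random kicks, zero mean
        coherent += 0.01                             # identical gain every step

    print(f"stochastic: mean {stochastic.mean():.2f}, spread {stochastic.std():.2f}, "
          f"99.9th pct {np.percentile(stochastic, 99.9):.2f}")
    print(f"coherent:   mean {coherent.mean():.2f}, spread {coherent.std():.2f}")

Even with zero average kick, the stochastic case lifts a tail of electrons well above their starting energy; in the real belts the kick statistics depend on the chorus-wave spectrum and on energy, so this is only a schematic of the diffusive picture.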

In research published Dec. 19, 2013, in Nature, lead author Richard Thorne and colleagues report on high-resolution measurements, made by the Van Allen Probes, which suggest that local acceleration is at work. The team observed high-energy electrons during a geomagnetic storm of Oct. 9, 2012, which they analyzed together with a data-driven global wave model. Their analysis reveals that linear, stochastic scattering by intense, natural very low-frequency radio waves -- known as chorus waves -- in Earth's upper atmosphere can account for the observed relativistic electron build-up.

"The successful point-by-point comparison of radiation belt features observed by the Van Allen Probes with the predictions of the state of the art model developed by Richard Thorne and his group dramatically demonstrates the significance of in situ particle acceleration within Earth's radiation belts," said David Sibeck, mission scientist for the Van Allen Probes at NASA's Goddard Space Flight Center in Greenbelt, Md.

The detailed modeling reported in Nature, together with previous observations of peaks in electron phase space density reported earlier this year in the journal Science by Geoff Reeves at Los Alamos National Laboratory in New Mexico and colleagues, demonstrates the remarkable efficiency of natural wave acceleration in Earth's near space environment. Their research shows that radial diffusion was not responsible for the observed acceleration during this storm, said Thorne, a scientist at the University of California at Los Angeles.

The local wave acceleration process is a universal physical process and should also be effective in the magnetospheres of Jupiter, Saturn and other magnetized plasma environments in the cosmos, Thorne said. He thinks the new results from the detailed analysis at Earth will influence future modeling of other planetary magnetospheres.

"This new finding is of paramount importance to unlocking the multitude of processes behind particle behavior in the belts," says Barry Mauk, project scientist for the Van Allen Probes at the Johns Hopkins Applied Physics Laboratory in Laurel, Md. "To have one of the primary science objectives of the mission met within just over a year of launch is a testament to the quality and quantity of the data the instruments on the probes are gathering, and to the teams analyzing them."

The research was funded by NASA, which launched the twin Van Allen Probes in the summer of 2012. The Johns Hopkins Applied Physics Laboratory built and operates the probes for NASA's Science Mission Directorate. The Van Allen Probes are the second mission in NASA's Living With a Star program, managed by NASA Goddard. The program explores aspects of the connected sun-Earth system that directly affect life and society.

For more on the Van Allen Probes: www.nasa.gov/vanallenprobes


Contacts and sources:
Rob Gutro
NASA/Goddard Space Flight Center

Stuart Wolpert and Karen C. Fox
University of California, Los Angeles and NASA's Goddard Space Flight Center

Friday, December 20, 2013

Birth Of Black Hole Kills The Radio Star

Astronomers led by a Curtin University researcher have discovered a new population of exploding stars that “switch off” their radio transmissions before collapsing into a black hole.

These exploding stars use all of their energy to emit one last strong beam of highly energetic radiation – known as a gamma-ray burst – before they die.

Credit: NASA

Until now, it was thought that all gamma-ray bursts were followed by a radio afterglow – a premise that a team of Australian astronomers from the Centre for All-sky Astrophysics (CAASTRO) at Curtin University and the University of Sydney originally set out to confirm.

“But we were wrong. After studying an ultra-sensitive image of gamma-ray bursts with no afterglow, we can now say the theory was incorrect and our telescopes have not failed us,” lead researcher and Curtin research fellow Dr Paul Hancock said.


The technique used to create the ultra-sensitive image was recently published in The Astrophysical Journal.

It allowed 200 separate observations to be stacked on top of each other, producing a far more sensitive image of a gamma-ray burst's location – yet no trace of a radio afterglow was found.
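
The gain from stacking is statistical: averaging N independent images leaves a steady source untouched while random noise shrinks roughly as 1/sqrt(N), so 200 stacked observations are about 14 times more sensitive than one. A minimal numpy illustration on synthetic images (not the actual radio data):

    import numpy as np

    rng = np.random.default_rng(2)
    n_obs, size = 200, 64
    sky = np.zeros((size, size))
    sky[32, 32] = 0.5              # a source at half the single-image noise level

    # Each observation is the same faint sky plus independent unit-variance noise.
    stack = np.mean(
        [sky + rng.standard_normal((size, size)) for _ in range(n_obs)],
        axis=0,
    )

    noise = stack.std()            # roughly 1/sqrt(200), about 0.07
    print(f"stacked noise: {noise:.3f}")
    print(f"source significance: {stack[32, 32] / noise:.1f} sigma")

A source invisible in any single image stands out at high significance in the stack; in the study the same procedure revealed no source at all, which is what allowed the team to place such a deep limit on any radio afterglow.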

“In our research paper we argue that there must be two distinct types of gamma-ray burst, likely linked to differences in the magnetic field of the exploding star,” Dr Hancock said.

“Gamma-ray bursts are thought to mark the birth of a black hole or neutron star – both of which have super-dense cores. But neutron stars have such strong magnetic fields (a million times stronger than those of black holes) that producing gamma-rays is more difficult.

Credit: NASA

“We think that those stars that collapse to form a neutron star have energy left over to produce the radio afterglow, whereas those that become black holes put all their energy into one final powerful gamma-ray flash.”

New work is underway to test the team’s theory and to see if there are other subtle ways in which the two types of bursts differ.

“We now have to take a whole new look at gamma-ray bursts – so far this work has shown that being wrong is sometimes more interesting than being right,” Dr Hancock said.

Telescope facilities such as the Australia Telescope Compact Array in northern New South Wales and the Karl Jansky Very Large Array in the US both have observing programs to search for gamma-ray burst afterglows and have been recently upgraded to increase their sensitivity.

The research report can be found at http://arxiv.org/abs/1308.4766


Contacts and sources:
Megan Meates
Curtin University