Unseen Is Free

Monday, September 30, 2013

Don Quixote Comet Found, 11 Miles Long And "Sopping Wet," Third Largest Near Earth Object

For 30 years, a large near-Earth asteroid wandered its lone, intrepid path, passing before the scrutinizing eyes of scientists armed with telescopes while keeping something to itself. The object, known as Don Quixote, whose journey stretches to the orbit of Jupiter, now appears to be a comet.

Credit: Northern Arizona University

The discovery resulted from an ongoing project coordinated by researchers at Northern Arizona University, Flagstaff, Ariz., using NASA's Spitzer Space Telescope. Through a lot of focused attention and a little luck, they found evidence of comet activity, which had evaded detection for three decades.

The results show that Don Quixote is not, in fact, a dead comet, as previously believed, but an active one with a faint coma and tail. This object, the third-largest near-Earth asteroid known, skirts Earth on an erratic, extended orbit and is "sopping wet," said David Trilling of Northern Arizona University, with large deposits of carbon dioxide and presumably water ice. Don Quixote is about 11 miles (18 kilometers) long.

"This discovery of carbon dioxide emission from Don Quixote required the sensitivity and infrared wavelengths of the Spitzer telescope and would not have been possible using telescopes on the ground," said Michael Mommert, who conducted the research at the German Aerospace Center, Berlin, before moving to Northern Arizona University. This discovery implies that carbon dioxide and water ice might be present on other near-Earth asteroids, as well.

Don Quixote’s coma and tail (left) as seen in infrared light by NASA’s Spitzer Space Telescope. After image processing (right), the tail is more apparent. 
Image courtesy NASA/JPL-Caltech/DLR/NAU

The implications have less to do with a potential impact, which is extremely unlikely in this case, and more to do with "the origins of water on Earth," Trilling said. Impacts with comets like Don Quixote over geological time may be the source of at least some of it, and the amount on Don Quixote represents about 100 billion tons of water -- roughly the same amount that can be found in Lake Tahoe, Calif.
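As a rough sanity check on that comparison, the quoted mass does convert to a volume in the same range as Lake Tahoe's often-cited ~150 cubic kilometres (the lake figure is an outside number, not from the release):

```python
# Convert 100 billion metric tons of water ice to a volume of liquid water.
water_mass_tons = 100e9                    # figure quoted in the article
water_mass_kg = water_mass_tons * 1000.0   # 1 metric ton = 1000 kg
density_kg_per_m3 = 1000.0                 # density of liquid water
volume_m3 = water_mass_kg / density_kg_per_m3
volume_km3 = volume_m3 / 1e9               # 1 km^3 = 1e9 m^3
print(volume_km3)                          # -> 100.0, same order as Lake Tahoe
```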

Mommert presented the results at the European Planetary Science Congress in London on Sept. 10.

First Cloud Map Of Alien Planet, Kepler-7b

Astronomers using data from NASA's Kepler and Spitzer space telescopes have created the first cloud map of a planet beyond our solar system, a sizzling, Jupiter-like world known as Kepler-7b.

The planet is marked by high clouds in the west and clear skies in the east. Previous studies from Spitzer have resulted in temperature maps of planets orbiting other stars, but this is the first look at cloud structures on a distant world.

Kepler-7b (left), which is 1.5 times the radius of Jupiter (right), is the first exoplanet to have its clouds mapped.
Image Credit: NASA/JPL-Caltech/MIT

"By observing this planet with Spitzer and Kepler for more than three years, we were able to produce a very low-resolution 'map' of this giant, gaseous planet," said Brice-Olivier Demory of Massachusetts Institute of Technology in Cambridge. Demory is lead author of a paper accepted for publication in the Astrophysical Journal Letters. "We wouldn't expect to see oceans or continents on this type of world, but we detected a clear, reflective signature that we interpreted as clouds."

Kepler has discovered more than 150 exoplanets, which are planets outside our solar system, and Kepler-7b was one of the first. The telescope's problematic reaction wheels prevent it from hunting planets any more, but astronomers continue to pore over almost four years' worth of collected data.

Kepler's visible-light observations of Kepler-7b's moon-like phases led to a rough map of the planet that showed a bright spot on its western hemisphere. But these data were not enough on their own to decipher whether the bright spot was coming from clouds or heat. The Spitzer Space Telescope played a crucial role in answering this question.

Like Kepler, Spitzer can fix its gaze at a star system as a planet orbits around the star, gathering clues about the planet's atmosphere. Spitzer's ability to detect infrared light means it was able to measure Kepler-7b's temperature, estimating it between 1,500 and 1,800 degrees Fahrenheit (1,100 and 1,300 Kelvin). This is relatively cool for a planet that orbits so close to its star -- within 0.6 astronomical units -- and, according to astronomers, too cool to be the source of light Kepler observed. Instead, they determined, light from the planet's star is bouncing off cloud tops located on the west side of the planet.
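The quoted range checks out with the standard Fahrenheit-to-Kelvin conversion; a minimal sketch:

```python
def fahrenheit_to_kelvin(f):
    """Standard conversion: Kelvin = (F - 32) * 5/9 + 273.15."""
    return (f - 32) * 5.0 / 9.0 + 273.15

# The article's 1,500-1,800 F range maps to roughly 1,100-1,300 K:
for f in (1500, 1800):
    print(f, "F =", round(fahrenheit_to_kelvin(f)), "K")
# -> 1500 F = 1089 K
# -> 1800 F = 1255 K
```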

"Kepler-7b reflects much more light than most giant planets we've found, which we attribute to clouds in the upper atmosphere," said Thomas Barclay, Kepler scientist at NASA's Ames Research Center in Moffett Field, Calif. "Unlike those on Earth, the cloud patterns on this planet do not seem to change much over time -- it has a remarkably stable climate."

The findings are an early step toward using similar techniques to study the atmospheres of planets more like Earth in composition and size.

"With Spitzer and Kepler together, we have a multi-wavelength tool for getting a good look at planets that are billions of miles away," said Paul Hertz, director of NASA's Astrophysics Division in Washington. "We're at a point now in exoplanet science where we are moving beyond just detecting exoplanets, and into the exciting science of understanding them."

Kepler identified planets by watching for dips in starlight that occur as the planets transit, or pass in front of their stars, blocking the light. This technique and other observations of Kepler-7b previously revealed that it is one of the puffiest planets known: if it could somehow be placed in a tub of water, it would float. The planet was also found to whip around its star in just less than five days.
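The transit technique rests on a simple relation: the fractional dip in starlight equals the square of the planet-to-star radius ratio. A sketch with illustrative numbers (the Sun-sized host star below is an assumption for the example, not Kepler-7's measured parameters):

```python
R_JUP_IN_R_SUN = 0.10045   # Jupiter's radius in solar radii (approximate)

def transit_depth(planet_radius_rjup, star_radius_rsun):
    """Fractional dimming during transit: (R_planet / R_star)**2."""
    ratio = planet_radius_rjup * R_JUP_IN_R_SUN / star_radius_rsun
    return ratio ** 2

# A 1.5-Jupiter-radius planet crossing a Sun-sized star dims it by ~2.3%.
print(f"{transit_depth(1.5, 1.0):.2%}")
```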

Explore all 900-plus exoplanet discoveries with NASA's "Eyes on Exoplanets," a fully rendered 3D visualization tool available for download. The program is updated daily with the latest findings from NASA's Kepler mission and ground-based observatories around the world as they search for planets like our own.

Other authors include: Julien de Wit, Nikole Lewis, Adras Zsom and Sara Seager of Massachusetts Institute of Technology; Jonathan Fortney of the University of California, Santa Cruz; Heather Knutson and Jean-Michel Desert of the California Institute of Technology, Pasadena; Kevin Heng of the University of Bern, Switzerland; Nikku Madhusudhan of Yale University, New Haven, Conn.; Michael Gillon of the University of Liège, Belgium; Vivien Parmentier of the French National Center for Scientific Research, France; and Nicolas Cowan of Northwestern University, Evanston, Ill. Lewis is also a NASA Sagan Fellow.

The technical paper is available online.

NASA's Jet Propulsion Laboratory, Pasadena, Calif., manages the Spitzer Space Telescope mission for NASA. Science operations are conducted at the Spitzer Science Center at Caltech. Data are archived at the Infrared Science Archive housed at the Infrared Processing and Analysis Center at Caltech. Caltech manages JPL for NASA.

Household Plastic Molecules Found On Titan

NASA's Cassini spacecraft has detected propylene, a chemical used to make food-storage containers, car bumpers and other consumer products, on Saturn's moon Titan.
This is the first definitive detection of the plastic ingredient on any moon or planet, other than Earth.

A small amount of propylene was identified in Titan's lower atmosphere by Cassini's Composite Infrared Spectrometer (CIRS). This instrument measures the infrared light, or heat radiation, emitted from Saturn and its moons in much the same way our hands feel the warmth of a fire.

Propylene is the first molecule to be discovered on Titan using CIRS. By isolating the same signal at various altitudes within the lower atmosphere, researchers identified the chemical with a high degree of confidence. Details are presented in a paper in the Sept. 30 edition of the Astrophysical Journal Letters.

"This chemical is all around us in everyday life, strung together in long chains to form a plastic called polypropylene," said Conor Nixon, a planetary scientist at NASA's Goddard Space Flight Center in Greenbelt, Md., and lead author of the paper. "That plastic container at the grocery store with the recycling code 5 on the bottom -- that's polypropylene."

CIRS can identify a particular gas glowing in the lower layers of the atmosphere from its unique thermal fingerprint. The challenge is to isolate this one signature from the signals of all other gases around it.

The detection of the chemical fills in a mysterious gap in Titan observations that dates back to NASA's Voyager 1 spacecraft and the first-ever close flyby of this moon in 1980.

Voyager identified many of the gases in Titan's hazy brownish atmosphere as hydrocarbons, the chemicals that primarily make up petroleum and other fossil fuels on Earth.

On Titan, hydrocarbons form after sunlight breaks apart methane, the second-most plentiful gas in that atmosphere. The newly freed fragments can link up to form chains with two, three or more carbons. The family of chemicals with two carbons includes the flammable gas ethane. Propane, a common fuel for portable stoves, belongs to the three-carbon family.

Voyager detected all members of the one- and two-carbon families in Titan's atmosphere. From the three-carbon family, the spacecraft found propane, the heaviest member, and propyne, one of the lightest members. But the middle chemicals, one of which is propylene, were missing.
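The "missing middle" can be made concrete with the three-carbon family's standard chemical formulas (textbook chemistry, not from the release): each member has the same three carbons but a different hydrogen count.

```python
# Three-carbon hydrocarbons, heaviest to lightest by hydrogen count.
three_carbon_family = [
    ("propane",   "C3H8", "detected by Voyager"),
    ("propylene", "C3H6", "the missing middle member, now found by CIRS"),
    ("propyne",   "C3H4", "detected by Voyager"),
]
for name, formula, status in three_carbon_family:
    print(f"{name:9s} {formula}  ({status})")
```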

As researchers continued to discover more and more chemicals in Titan's atmosphere using ground- and space-based instruments, propylene was one that remained elusive. It was finally found as a result of more detailed analysis of the CIRS data.

"This measurement was very difficult to make because propylene's weak signature is crowded by related chemicals with much stronger signals," said Michael Flasar, Goddard scientist and principal investigator for CIRS. "This success boosts our confidence that we will find still more chemicals long hidden in Titan's atmosphere."

Cassini's mass spectrometer, a device that looks at the composition of Titan's atmosphere, had hinted earlier that propylene might be present in the upper atmosphere. However, a positive identification had not been made.

"I am always excited when scientists discover a molecule that has never been observed before in an atmosphere," said Scott Edgington, Cassini's deputy project scientist at NASA's Jet Propulsion Laboratory (JPL) in Pasadena, Calif. "This new piece of the puzzle will provide an additional test of how well we understand the chemical zoo that makes up Titan's atmosphere."

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. JPL manages the mission for NASA's Science Mission Directorate in Washington. The CIRS team is based at Goddard.

For more information about the Cassini mission, visit:

Turning Algae Into Fuel

Blue-green in colour, slimy and present in seas and fresh water worldwide - the presence of microalgae is not generally met with great excitement. But this may be about to change. A team of European scientists is on a mission to prove that microalgae can be used to produce bioethanol as a biofuel for less than EUR 0.40 a litre.
The EU-funded project DEMA ('Direct Ethanol from MicroAlgae') is focusing on cyanobacteria - microalgae found in almost every terrestrial and aquatic habitat, including oceans, lakes, damp soil and rocks. They obtain their energy via photosynthesis.

The research team is seeking to improve biofuel production at two levels. First, the team will introduce the capacity to produce ethanol through metabolic engineering - altering the chemical reactions within the cyanobacteria's cells so that they can produce bioethanol effectively.

The bioethanol will then be secreted by the algae and filtered from the medium through a membrane.

The DEMA team will develop and demonstrate the technology, and is confident that the process, once fine-tuned, will be superior to any other put forward so far in scientific literature.

Biofuels have the potential to significantly reduce transport's carbon output and its impact on climate change. Using microalgae to produce biofuels has many advantages over other forms of biomass: microalgae occur naturally, grow quickly and, because they do not grow on land, do not compete with food crops.

The project brings together nine partners from both academia and industry from six EU countries. It is coordinated by the University of Limerick in Ireland and has received almost EUR 5 million from the EU under the energy strand of the Seventh Framework Programme (FP7). The project started work in December 2012 and completes its work in May 2017.

Niacin, The Fountain Of Youth

The vitamin niacin has a life-prolonging effect, as Michael Ristow has demonstrated in roundworms. From his study, the ETH-Zurich professor also concludes that so-called reactive oxygen species are healthy, disagreeing not only with the general consensus but also with many of his peers.

Roundworms live longer when fed the food supplement niacin (inverted microscopic photo). 
 Photo: Michael Ristow / ETH Zurich

Who would not want to live a long and healthy life? A freely available food supplement could help in this respect, scientists from ETH Zurich have demonstrated in roundworms. Vitamin B3 – also known as niacin – and its metabolite nicotinamide in the worms’ diet caused them to live for about one tenth longer than usual.

As an international team of researchers headed by Michael Ristow, a professor of energy metabolism, has now experimentally demonstrated, niacin and nicotinamide take effect by promoting formation of so-called free radicals. “In roundworms, these reactive oxygen species prolong life,” says Ristow.

“No scientific evidence for usefulness of antioxidants”

This might seem surprising as reactive oxygen species are generally considered to be unhealthy. Ristow’s view also contradicts the textbook opinion championed by many other scientists. Reactive oxygen species are known to damage somatic cells, a condition referred to as oxidative stress. Particular substances, so-called antioxidants, which are also found in fruit, vegetables and certain vegetable oils, are capable of neutralising these free radicals. Many scientists believe that antioxidants are beneficial to health.

“The claim that intake of antioxidants, especially in tablet form, promotes any aspect of human health lacks scientific support,” says Ristow. He does not dispute that fruit and vegetables are healthy. However, this may rather be caused by other compounds contained therein, such as so-called polyphenols. “Fruit and vegetables are healthy, despite the fact that they contain antioxidants,” says the ETH-Zurich professor. Based on the current and many previous findings he is convinced that small amounts of reactive oxygen species and the oxidative stress they trigger have a health-promoting impact. “Cells can cope well with oxidative stress and neutralise it,” says Ristow.

Substance mimics endurance sport

In earlier studies on humans, Ristow demonstrated that the health-enhancing effect of endurance sports is mediated via an increased formation of reactive oxygen species – and that antioxidants abolish this effect. Based on the present study, he concludes that niacin brings about a similar metabolic condition to exercise. “Niacin tricks the body into believing that it is exercising – even when this is not the case,” says Ristow. Such compounds are known as “exercise mimetics”.

The researchers conducted their experiments on the model organism Caenorhabditis elegans. This worm, which is merely one millimetre in length, can be easily maintained and has a lifespan of only a month, making it the ideal model organism for ageing research.

Also relevant for humans

The results of the study may also be of relevance for humans, says Ristow. After all, the metabolic pathway initiated by niacin is very similar in roundworms and higher organisms. Whether niacin has similar effects on the life expectancy of mice is the subject of Ristow’s current research. Previous studies also suggest a health-enhancing effect of niacin in humans with elevated blood cholesterol levels.

Niacin and nicotinamide have been approved as dietary supplements for decades. Ristow could easily envisage the substances being used broadly for therapeutic purposes in the future. A whole series of foods naturally contain niacin, including meat, liver, fish, peanuts, mushrooms, rice and wheat bran. Whether nutritional uptake is sufficient for a health-enhancing or lifespan-extending effect, however, remains to be demonstrated, says Ristow.

Disputed impact of enzymes

The latest study on the effects of niacin and nicotinamide is based on a particular class of enzymes, the sirtuins, which convert niacin into nicotinamide. Moreover, they are also involved in gene regulation, helping to down-regulate the activity of certain genes. Scientists have long disputed whether sirtuins have a life-prolonging impact.

Ristow and his team’s work now suggests that the activity of sirtuins actually prolongs life in roundworms. According to the study, however, the life-prolonging effect is not down to gene regulation, as has often been supposed in the past. Instead, the effect is due to the conversion of niacin into nicotinamide. Studying genetically modified roundworms that were unable to convert nicotinamide into certain other metabolic products, the scientists did not observe any lifespan extension, even after overexpression of sirtuins, which otherwise lead to an increased life expectancy.

Contacts and sources: 
Michael Ristow
ETH Zurich

Citation: Schmeisser K et al.: Role of Sirtuins in Lifespan Regulation is Linked to Methylation of Nicotinamide. Nature Chemical Biology, 2013, Advance Online Publication, doi: 10.1038/nchembio.1352

Are Black Holes Hairy?

A black hole. A simple and clear concept, at least according to the hypothesis of Roy Kerr, who in 1963 proposed a "clean" black hole model that remains the current theoretical paradigm. In reality, things may be quite different. According to new research carried out by a group of scientists that includes Thomas Sotiriou, a physicist at the International School for Advanced Studies (SISSA) in Trieste, black holes may be much "dirtier" than Kerr believed.

Credit: SISSA

According to the traditional model, black holes are defined by only two quantities: mass and angular momentum (a measure of the black hole's rotation). Once their progenitor has collapsed (a high-mass star, for instance, that implodes at the end of its life cycle), its memory is lost forever. All that is left is a quiescent black hole with almost no distinctive features: mass and angular momentum aside, all black holes look almost the same.

According to Sotiriou, things may not have occurred this way. "Black holes, according to our calculations, may have hair", explains Sotiriou, referring to a well-known statement by physicist John Wheeler, who claimed that "black holes have no hair". Wheeler meant that mass and angular momentum are all one needs to describe them.

"Although Kerr's 'bald' model is consistent with General Relativity, it might not be consistent with some well-known extensions of Einstein's theory, called tensor-scalar theories", adds Sotiriou. "This is why we have carried out a series of new calculations that enabled us to focus on the matter that normally surrounds realistic black holes, those observed by astrophysicists. This matter forces the pure and simple black hole hypothesized by Kerr to develop a new 'charge' (the hair, as we call it) which anchors it to the surrounding matter, and probably to the entire Universe."

The experimental confirmation of this new hypothesis may come from observations carried out with interferometers, instruments capable of recording gravitational waves. "According to our calculations, the growth of the black hole's hair," concludes Sotiriou, "is accompanied by the emission of distinctive gravitational waves. In the future, recordings by these instruments may challenge Kerr's model and broaden our knowledge of the origins of gravity."

Contacts and sources: 
Federica Sgorbissa
International School of Advanced Studies (SISSA)

Ice Age Coming Again: History Of Climate Must Be Rewritten Say Researchers

Geologists and geophysicists of the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI), discovered traces of large ice sheets from the Pleistocene on a seamount off the north-eastern coast of Russia.

Credit: Alfred Wegener Institute

These marks confirm for the first time that within the past 800,000 years in the course of ice ages, ice sheets more than a kilometer thick also formed in the Arctic Ocean. The climate history for this part of the Arctic now needs to be rewritten, report the AWI scientists jointly with their South Korean colleagues in the title story of the current issue of the scientific journal Nature Geoscience.

AWI geologist Dr. Frank Niessen and colleagues had already discovered the first signs of conspicuous scour marks and sediment deposits on the ocean floor north of Wrangel Island (Russia) on a Polarstern expedition in 2008. However, they were unable to gather extensive proof until last year, during an Arctic expedition on the South Korean research vessel Araon.

"After we had analysed the bathymetric and seismic data from our first voyage, we knew exactly where we needed to search and survey the ocean floor with the swath sonar of the Araon on the second expedition," said Frank Niessen, the first author of the study.

The result of this research is a topographic map of the Arlis Plateau, a seamount on which deep, parallel-running furrows can be discerned on the upper plateau and the sides – over an area of 2500 square kilometres and to an ocean depth of 1200 metres. "We knew of such scour marks from places like the Antarctic and Greenland. They arise when large ice sheets become grounded on the ocean floor and then scrape over the ground like a plane with dozens of blades as they flow. The remarkable feature of our new map is that it shows very clearly that there were four or more generations of ice masses, which in the past 800,000 years moved from the East Siberian Sea in a north-easterly direction far into the deep Arctic Ocean," says Frank Niessen.

These new findings overturn the traditional textbook view of the history of Arctic glaciations. “Previously, many scientists were convinced that mega-glaciations always took place on the continents – a fact that has also been proven for Greenland, North America, and Scandinavia. However, it was assumed that the continental shelf region of North-eastern Siberia became exposed in these ice ages and turned into a vast polar desert in which there was not enough snow to enable a thick ice shield to form over the years. Our work now shows that the opposite was true. With the exception of the last ice age 21,000 years ago, ice sheets formed repeatedly in the shallow areas of the Arctic Ocean. These sheets were at least 1200 metres thick and presumably covered an area as large as Scandinavia," says Frank Niessen.

The AWI scientists still cannot say for certain, however, under what climate conditions these ice sheets formed and when exactly they left their marks on the bottom of the Arctic Ocean. "We theorize that the East Siberian ice sheets arose during various ice ages when the average global temperature was around five to eight degrees Celsius cooler than it is today. But evidently this relatively minor temperature difference was often sufficient to allow initially thin ocean ice to grow into an immense ice cap. It is an example of just how sensitively the Arctic reacts to changes in the global climate system," says the geologist.

Credit: Alfred Wegener Institute

In a next step, the AWI researchers now want to try collecting sediment samples from deeper layers of the ocean floor with a sediment core drill and thus learn more details about the prehistoric ice sheets. "Our long-term goal is to reconstruct the exact chronology of the glaciations so that, with the aid of the known temperature and ice data, the ice sheets can be modeled. On the basis of the models, we then hope to learn what climate conditions prevailed in Eastern Siberia during the ice ages and how, for example, the moisture distribution in the region evolved," says Frank Niessen. This knowledge should then help predict possible changes in the Arctic as a consequence of climate change more accurately.

Frank Niessen and his colleagues are anticipating a great number of surprising discoveries in the Arctic Ocean in the future. "As the Arctic Ocean sea-ice cover continues to shrink, more formerly unexplored ocean area becomes accessible. Today less than ten percent of the Arctic Ocean floor has been surveyed as thoroughly as the Arlis Plateau," says the AWI geologist. And this study would not have succeeded were it not for the outstanding cooperation of the AWI scientists with researchers of the South Korean Polar Research Institute KOPRI. 

"We complemented each other perfectly in this research. Our South Korean colleagues had the expedition and ship time, we knew the coordinates of the area in which we now found the evidence of the mega-glaciations," says Frank Niessen.

Ice ages: About 2.7 million years ago the global climate cooled considerably, and Greenland has had a permanent ice cap ever since. Since then there have been around 55 alternations between ice ages and warm periods. About 800,000 years ago the magnitude and duration of the glaciations in the Northern Hemisphere increased considerably, and the world climate has since alternated regularly between two extremes: each cycle of an ice age followed by a warm period now lasts 100,000 years.

Warm periods or interglacials, such as the "Holocene" in which we are living, only lasted about 10,000 to 15,000 years. Afterwards the ice masses on the continents began to grow once again, causing the sea level to drop as much as 130 meters compared to today at the peaks of the glacial cycles. Vast areas of the northern continents were then covered by kilometer-thick ice masses, which, for example, expanded over and over again from Scandinavia even into Northern Germany. The last time that this happened was 21,000 years ago. Thus far there has been little research on the role of the Arctic Ocean in this interplay.
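The cycle counts quoted above are roughly self-consistent. Assuming ~100,000-year cycles over the last 800,000 years and the remaining alternations spread over the earlier part of the 2.7-million-year span (that split is an assumption for this back-of-envelope check), the earlier cycles average out near 40,000 years:

```python
# Back-of-envelope check of the glacial-cycle counts quoted above.
total_span_years = 2_700_000     # start of the cold climate era
total_cycles = 55                # alternations between ice ages and warm periods
recent_span = 800_000            # era of long, 100,000-year cycles
recent_cycles = recent_span // 100_000          # 8 long cycles
earlier_cycles = total_cycles - recent_cycles   # 47 earlier cycles
earlier_avg = (total_span_years - recent_span) / earlier_cycles
print(round(earlier_avg))        # -> 40426, i.e. roughly 40,000-year cycles
```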

Sunday, September 29, 2013

Major Breakthrough: Microbes Make Gasoline

A major scientific breakthrough in the development of renewable energy sources and other important chemicals

The research team succeeded in producing 580 mg of gasoline per liter of cultured broth by converting fatty acids generated in vivo.

For many decades, we have been relying on fossil resources to produce liquid fuels such as gasoline, diesel, and many industrial and consumer chemicals for daily use. However, increasing strains on natural resources as well as environmental issues including global warming have triggered a strong interest in developing sustainable ways to obtain fuels and chemicals.

E. coli bacteria
Credit: KAIST

Gasoline, the petroleum-derived product most widely used as a transportation fuel, is a mixture of hydrocarbons, additives, and blending agents. The hydrocarbons, called alkanes, consist only of carbon and hydrogen atoms. Gasoline contains a combination of straight-chain and branched-chain alkanes consisting of 4-12 carbon atoms linked by direct carbon-carbon bonds.

Previously, through metabolic engineering of Escherichia coli (E. coli), there have been a few research results on the production of long-chain alkanes, which consist of 13-17 carbon atoms, suitable for replacing diesel. However, there has been no report on the microbial production of short-chain alkanes, a possible substitute for gasoline.

In the paper (entitled “Microbial Production of Short-chain Alkanes”) published online in Nature on September 29, a Korean research team led by Distinguished Professor Sang Yup Lee of the Department of Chemical and Biomolecular Engineering at the Korea Advanced Institute of Science and Technology (KAIST) reported, for the first time, the development of a novel strategy for microbial gasoline production through metabolic engineering of E. coli.

The research team engineered the fatty acid metabolism to provide fatty acid derivatives that are shorter than normal intracellular fatty acid metabolites, and introduced a novel synthetic pathway for the biosynthesis of short-chain alkanes. This allowed the development of a platform E. coli strain capable of producing gasoline for the first time. Furthermore, this platform strain, if desired, can be modified to produce other products such as short-chain fatty esters and short-chain fatty alcohols.

In this paper, the Korean researchers described detailed strategies for 1) screening of enzymes associated with the production of fatty acids, 2) engineering of enzymes and fatty acid biosynthetic pathways to concentrate carbon flux towards the short-chain fatty acid production, and 3) converting short-chain fatty acids to their corresponding alkanes (gasoline) by introducing a novel synthetic pathway and optimization of culture conditions. Furthermore, the research team showed the possibility of producing fatty esters and alcohols by introducing responsible enzymes into the same platform strain.

Professor Sang Yup Lee said, “It is only the beginning of the work towards sustainable production of gasoline. The titre is rather low due to the low metabolic flux towards the formation of short-chain fatty acids and their derivatives. We are currently working on increasing the titre, yield and productivity of bio-gasoline. Nonetheless, we are pleased to report, for the first time, the production of gasoline through the metabolic engineering of E. coli, which we hope will serve as a basis for the metabolic engineering of microorganisms to produce fuels and chemicals from renewable resources.”

This research was supported by the Advanced Biomass Research and Development Center of Korea (ABC-2010-0029799) through the Global Frontier Research Program of the Ministry of Science, ICT and Future Planning (MSIP) through the National Research Foundation (NRF), Republic of Korea. Systems metabolic engineering work was supported by the Technology Development Program to Solve Climate Changes on Systems Metabolic Engineering for Biorefineries (NRF-2012-C1AAA001-2012M1A2A2026556) by MSIP through NRF.

Contacts and sources:
Dr. Sang Yup Lee, Distinguished Professor of the Department of Chemical and
Biomolecular Engineering, KAIST
Metabolic & Biomolecular Engineering National Research Laboratory 

Advanced Wood Burning Cookstoves Get African Trial

Benefits of clean cookstoves examined in a two-year study

The Cooking and Pneumonia Study (CAPS) received its first batch of 100 advanced cookstoves at its trial sites in Malawi (pictured). During the next 6 months the sites will receive eight 40ft container loads containing 10,000 cookstoves. This cluster randomised trial investigates an advanced cookstove intervention to prevent pneumonia in children under 5 in Malawi.

The two-year study will track 10,000 children aged under five who live in randomised villages in Chikhwawa and Chilumba in Malawi. The homes of the children involved in the study will be supplied with two clean cookstoves to see if the new stoves will stop the children getting pneumonia, a major cause of death in this age group.

Whereas various ongoing cookstove interventions target issues around ecological degradation and fuel security, this intervention will assess, through a randomised trial, the health benefits of clean cookstoves designed to reduce the effects of domestic smoke inhalation. This is a problem in low- and middle-income countries around the world, where open fires, used for heating, cooking and lighting, are commonly burned inside the main living quarters of homes.

Head of LSTM's Clinical Sciences Department and Co-Principal Investigator for CAPS Professor Stephen Gordon, said: "It is the poorest people in the world who use open fires in their homes, which are often basic structures with poor ventilation. The harmful effects of the smoke inhalation lie between passive and active cigarette smoking. Chronic obstructive pulmonary disease (COPD) is an example of one of these harmful effects."

Credit: Liverpool School of Tropical Medicine

Working alongside Professor Gordon, is Co-Principal Investigator Dr Kevin Mortimer, a Respiratory Consultant at Liverpool's Aintree University Hospital and Senior Clinical Lecturer at LSTM. Dr Mortimer said: "The cookstoves burn the same fuel used in the open fires but much more efficiently, which reduces the amount of smoke emitted by around ninety per cent. The implications could be enormous and benefit millions of people around the world."

The study was recently awarded £2.7 million by the Joint Global Health Trials Scheme, a partnership of the UK Department for International Development (DfID), the Medical Research Council (MRC) and the Wellcome Trust.

It is estimated that just under half of the world's population live in such conditions, with those affected being primarily women and children who spend the most time in the home. Excessive smoke inhalation can cause pneumonia, COPD and cardiovascular disease, accounting for more than two million deaths annually. The importance of this problem has been highlighted by the Global Alliance for Clean Cookstoves, which is backed by former US secretary of state Hillary Clinton.

Contacts and sources:

Lady Thatcher And Tony Blair Used ‘Hubristic Language’, Research Finds

A new study has found that British Prime Ministers Tony Blair and the late Lady Thatcher used hubristic language during their respective periods in office.

It has been suggested that a number of Prime Ministers may have developed a personality disorder known as Hubris syndrome while in power. Researchers at St George’s, University of London have discovered that this personality change was reflected in both Blair’s and Thatcher’s use of language.

Credit: St George’s, University of London

Hubris is commonly associated with a loss of contact with reality and an overestimation of one's own competence, accomplishments or capabilities. It is characterised by a pattern of exuberant self-confidence, recklessness and contempt for others, and is most particularly recognised in subjects holding positions of significant power.

Fourteen clinical symptoms of Hubris syndrome have been described*. Subjects demonstrating at least three of these could be diagnosed with the disorder.

Researchers at St George’s, University of London searched for evidence of some of these clinical features in the language used by three British Prime Ministers – Margaret Thatcher, Tony Blair and John Major – by examining transcribed samples of spoken language taken from Prime Minister’s Questions. They hypothesised that frequent use of certain words and phrases, such as ‘sure’, ‘certain’ and ‘confident’, the first-person pronouns ‘I’ and ‘me’, and references to God or history, might show up during ‘hubristic’ periods.

They found that ‘I’ and ‘me’ and the word ‘sure’ were among the strongest positive correlations over time in Tony Blair’s speech. Blair's use of the word 'important' also increased with time. Words and phrases that became more frequent with time in the speeches of Lady Thatcher and Tony Blair also included the phrase ‘we shall’, while phrases that included the word ‘duties’ diminished. The authors also found that language became more complex and less predictable during hubristic periods.

For example, Lady Thatcher’s language became more complex at the end of her term of office, when her decisions and judgements were opening deep divisions within her own party. The same happened to Tony Blair’s speech during the run-up to the invasion of Iraq.

These linguistic patterns were not reflected in the language of John Major. The relative frequency of the word ‘we’ compared to ‘I’ was in fact higher throughout the terms of office of both Thatcher and Blair than at any point of Major’s premiership.

Additionally, the changes over time in words and phrases adopted by both Thatcher and Blair appeared to mirror the time course of hubristic behaviour.
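The trend analysis described above (tracking how often marker words such as ‘I’, ‘me’ and ‘sure’ occur as a term of office progresses) can be sketched in a few lines. This is a simplified illustration, not the authors' method; the transcripts, the word list and the plain Pearson correlation are hypothetical stand-ins:

```python
from collections import Counter

def relative_frequency(transcript, targets):
    """Relative frequency of target words in a transcript, per 1,000 words."""
    words = transcript.lower().split()
    counts = Counter(words)
    total = len(words) or 1
    return 1000.0 * sum(counts[t] for t in targets) / total

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

# Hypothetical transcripts, ordered by time in office:
sessions = [
    "we shall consider the matter carefully",
    "I am sure that I was right to decide",
    "I am certain and I am confident in me",
]
freqs = [relative_frequency(s, {"i", "me"}) for s in sessions]
trend = pearson(list(range(len(sessions))), freqs)
```

A sustained positive `trend` would correspond to the marker words becoming more frequent over time, the pattern the researchers report for Thatcher and Blair but not for Major.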

The research is published in the journal Cortex.

Dr Peter Garrard, the lead researcher, from St George’s, University of London, said:

“Hubris syndrome represents a radical change in a person’s outlook, style and attitude after they acquire positions of power or great influence. They become obsessed with their self-image, excessively confident in their own judgement and dismissive of others, often leading to rash, ill thought-out decisions. In other words, the acquisition of power can bring about a change in personality: it is as if power, almost literally ‘goes to their head’.

“This work shows us that language can reflect this highly characteristic personality change. Spontaneous language production is an automatic process: to some extent we can influence how we come across in our choice of words, but most of the time we don't. The way we use language reflects the cultural and social environment, but biological factors, including personality, are also important.

“Hubristic behaviour is widespread, and not confined to politics: hubristic overconfidence in the financial sector almost certainly contributed to the recent banking crisis.

“We need to refine these language based measures and apply them more widely using other sets of digital samples from the past, with a view to detecting hubris and preventing its potentially catastrophic consequences.”

Dr Garrard’s next research project will analyse language from recordings and transcripts of bank annual general meetings (AGMs) in the years leading up to the financial crisis.

Contacts and sources: 

Citation: Peter Garrard, Vassiliki Rentoumi, Christian Lambert, David Owen, "Linguistic biomarkers of Hubris syndrome", Cortex, available online 18 September 2013.

Superfast Switching Of Quantum Light Sources

Usually, an elementary light source – such as an excited atom or molecule – emits light of a particular color at an unpredictable instant in time. Recently, however, scientists from the MESA+ Institute for Nanotechnology of the UT, FOM and the Institute for Nanoscience and Cryogenics (CEA/INAC) in France have shown that a light source can be coaxed to emit light at a desired moment in time, within an ultrashort burst.

Cartoon of the superfast emission of a light source. The light source is embedded in an optical resonator where it spontaneously emits a photon. During the emission of the photon the favored color of the resonator is quickly switched – symbolized by a hammer to match the color of the light source. During this short interval the light source is triggered to emit an ultrashort burst of photons within a desired moment in time.
Credit:  University of Twente

The superfast switching of a light source has applications in fast stroboscopes without laser speckle, in the precise control of quantum systems and for ultrasecure communication using quantum cryptography. The theoretical results were published in Optics Express.

Spontaneous emission of light from excited sources, such as atoms, molecules or quantum dots, is a fundamental process with many applications in modern technology, such as LEDs and lasers. As the term 'spontaneous emission' indicates, the emission is random in nature and it is therefore impossible to predict the exact emission time of a photon.

SEM picture of a semiconductor micropillar with a diameter of 1 µm. It consists of a central GaAs λ-layer sandwiched between two Bragg stacks made from alternating layers of GaAs and AlAs. The structures were made in Grenoble by molecular beam epitaxy (MBE) and subsequent nanostructuring.
Credit:  University of Twente

 However, for several applications it is desirable to receive single photons exactly when they are needed with as little uncertainty as possible. This property is crucial for ultra-secure communication using quantum cryptography and in quantum computers. Therefore, the important goal is to fabricate a quantum light source such that it emits a single photon exactly at a desired moment in time.

Switching light emission

The average emission time of quantum light sources can be reduced by locating them in various nanostructures, like optical resonators or waveguides. But the distribution of emission times is always exponential in time in a usual stationary environment. In addition, the smallest uncertainty in the emission time is limited by both the maximum intensity in the resonator and the variations in the preparation time of the emitter. The Dutch-French team proposes to overcome these limitations by quickly switching the resonator length, in which the light source is located. The time duration of the switch should be much shorter than the average emission time. The result is that the favored color of the resonator only matches the emission color of the light source within a short time interval. Only within this short time frame are the photons emitted by the light source into the resonator.
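A toy Monte Carlo sketch of the idea, assuming a simple exponential emission model and an idealized switching window (both the decay time and the window are made-up values, not the paper's parameters), shows how the switch bunches the emitted photons into a short burst:

```python
import random

random.seed(1)
tau = 1.0            # assumed average spontaneous-emission time (arbitrary units)
window = (0.2, 0.3)  # assumed interval when the resonator color matches the emitter

# In a stationary cavity, spontaneous-emission times follow an exponential distribution.
emission_times = [random.expovariate(1.0 / tau) for _ in range(100_000)]

# With a switched resonator, only photons whose emission falls inside the short
# matching window end up in the cavity mode: the burst is tightly bunched.
switched = [t for t in emission_times if window[0] <= t <= window[1]]

spread_stationary = max(emission_times) - min(emission_times)
spread_switched = max(switched) - min(switched)
```

The spread of the switched emission can never exceed the window length, which here is an order of magnitude shorter than the average emission time.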

Ultrafast light source

The researchers propose to use quantum dot light sources, which can easily be integrated in semiconductor optical resonators with lengths on the order of microns. The switching of the resonator will be achieved by shining an ultrashort laser pulse at the micropillar resonator during the emission time of the quantum dots. 

The spontaneous emission intensity from a light source as function of time after excitation at time zero. The emission in a usual stationary environment follows an exponential curve (dashed curve); whereas the photons emitted by the light source placed in the switched optical resonator (red curve) can be bunched within a time window that is much shorter than the average emission time. The short intense burst of light is marked by the red area.
Credit:  University of Twente

This quickly changes the refractive index in the resonator and thereby the effective resonator length. The switching time can be directly controlled by the arrival time of the short laser pulse and by the lifetime of the excited electrons. These controlled light switches have great prospects for creating incoherent ultrafast light sources for fast stroboscopes without laser speckle, for quantum cryptography, for quantum information and for studying ultrafast cavity quantum electrodynamics.

Contacts and sources:
University of Twente

Citation: Henri Thyrrestrup, Alex Hartsuiker, Jean-Michel Gérard, and Willem L. Vos, "Non-exponential spontaneous emission dynamics for emitters in a time-dependent optical cavity", Optics Express, Vol. 21, Issue 20, pp. 23130-23144 (2013).

High Risk Situations Interpreted With Surveillance Software

The Center for Engineering and Industrial Development (CIDESI) developed software for a surveillance system that detects the behavior of people and analyzes the movement of objects, a method that can distinguish when an unusual event is happening, such as an assault or an accident.

The prototype of this software is already installed in the research center's Mechatronics Laboratory in Queretaro, in central Mexico.

The prototype consists of four cameras placed in the laboratory; they can detect situations that a traditional surveillance system does not, for example, how people behave when they enter the building for the first time. If the way they walk is different, the technology developed in CIDESI records it.

Hugo Jiménez Hernández, head of research at CIDESI, said the interesting thing is that the system analyzes the behavior of moving objects, then decides when an unusual event is happening and reports it.

“In CIDESI’s design, all surveillance cameras work as a data gathering point and decide whether something is relevant or not; when one of the cameras detects something unusual it sends a signal, focuses and analyzes the event. When uncommon things happen, the response time is very fast and the number of events it detects increases.”

With this technology, the unusual event is sent to a central server to analyze the flow of moving objects, so that when it locates the event, it reports and saves the corresponding video fragment along with the time and the code of the camera that registered it.

This technology can also compile behavioral statistics. For example, if the system were installed at a high-traffic crossroad, where an accident would be more likely to occur at rush hour, the software would report as much in order to help prevent it.

This information is registered in a database that compiles and analyzes the statistics, particularly those recorded on the day of an unusual event, in order to flag potential risk situations. In the case of an event that requires extra attention, the system can respond within seconds, so an alarm can be activated.
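As a rough illustration of this kind of statistical flagging (a hypothetical z-score test, not CIDESI's actual algorithm), a camera's per-minute motion counts could be compared against their historical distribution:

```python
def is_unusual(history, value, threshold=3.0):
    """Flag a measurement as unusual if it deviates from the historical
    mean by more than `threshold` standard deviations (a z-score test)."""
    n = len(history)
    mean = sum(history) / n
    variance = sum((x - mean) ** 2 for x in history) / n
    std = variance ** 0.5 or 1.0  # guard against a zero-variance history
    return abs(value - mean) / std > threshold

# Hypothetical per-minute motion counts from one camera:
normal_traffic = [10, 11, 9, 10, 12, 10, 9, 11]
print(is_unusual(normal_traffic, 50))  # an abrupt surge -> True
print(is_unusual(normal_traffic, 10))  # ordinary traffic -> False
```

Only measurements flagged this way would need to be saved and reviewed, which matches the article's point that no search through every hour of video is required.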

“This technology only records unusual situations and doesn’t require the large resources of traditional surveillance systems, making it a low-cost and efficient information system. To detect an anomaly, no sequential search through every hour of surveillance video is needed.”

For the time being, this technology has potential clients in both the private and public sectors. For example, the Ministry of Transport in Querétaro could employ it as a tool to monitor roads and avenues, and private companies could use such a surveillance system in malls or other busy locations.

This technological development was accomplished with the help of the Center of Research in Applied and Advanced Technology (CICATA), the National Polytechnic Institute (IPN) and the Autonomous University of Queretaro (UAQ). It is currently in the process of being patented and copyrighted. (Agencia ID)

Contacts and sources:
Investigación y Desarrollo

Saturday, September 28, 2013

New Insight On Sun's Atmosphere

Three months after the flight of the solar observatory Sunrise – carried aloft by a NASA scientific balloon in early June 2013 -- scientists from the Max Planck Institute for Solar System Research in Germany have presented unique insights into a layer on the sun called the chromosphere. Sunrise provided the highest-resolution images to date in ultraviolet light of this thin corrugated layer, which lies between the sun's visible surface and the sun's outer atmosphere, the corona.

The right image, captured by the Sunrise balloon-borne telescope, shows a region of the chromosphere in close proximity to two sunspots. It serves as a close-up of the left image, which was captured by NASA's Solar Dynamics Observatory. Both images were taken on July 16, 2013.
SDO (left) and Sunrise (right) showing the same area of the sun.
Image Credit: NASA/SDO/MPS

With its one-meter mirror, Sunrise is the largest solar telescope to fly above the atmosphere. The telescope weighed in at almost 7,000 pounds and flew some 20 miles up in the air. Sunrise was launched from Kiruna in the north of Sweden and, after five days drifting over the Atlantic, it landed on the remote Boothia Peninsula in northern Canada, gathering information about the chromosphere throughout its journey.

The temperature in the chromosphere rises from 6,000 K/10,340 F/5,727 C at the surface of the sun to about 20,000 K/35,540 F/19,730 C. It's an area that's constantly in motion, with different temperatures of hot material mixed over a range of heights, stretching from the sun's surface to many thousands of miles up. The temperatures continue to rise further into the corona, and no one knows exactly what powers any of that heating.
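For reference, the Kelvin figures convert to Celsius and Fahrenheit with the standard formulas:

```python
def kelvin_to_celsius(k):
    """Celsius = Kelvin - 273.15."""
    return k - 273.15

def celsius_to_fahrenheit(c):
    """Fahrenheit = Celsius * 9/5 + 32."""
    return c * 9 / 5 + 32

# Chromosphere temperatures at the surface and upper layer:
print(round(kelvin_to_celsius(6000)))                         # -> 5727
print(round(celsius_to_fahrenheit(kelvin_to_celsius(6000))))  # -> 10340
print(round(kelvin_to_celsius(20000)))                        # -> 19727
```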

"In order to solve this riddle it is necessary to take as close a look as possible at the chromosphere – in all accessible wavelengths," said Sami Solanki, the principal investigator for Sunrise from the Max Planck Institute. Sunrise used an instrument that was able to filter particular ultraviolet wavelengths of light that are only emitted from the chromosphere.

Two images of the chromosphere as captured by the Sunrise solar observatory that flew on a NASA balloon in July 2013. On the left a typical pattern can be seen: dark areas surrounded by bright rims. On the right, the images show bright, stretched structures on the edges of the darker sunspots.
Image Credit:

Sunrise's extremely high-resolution images in this wavelength painted a complex picture of the chromosphere. Where the sun is quiet and inactive, dark regions with a diameter of around 600 miles can be discerned surrounded by bright rims. This pattern is created by the enormous flows of solar material rising up from within the sun, cooling off and sinking down again. Especially eye-catching are bright points that flash up occasionally—much richer in contrast in these ultraviolet images than have been seen before. Scientists believe these bright points to be signs of what's called magnetic flux tubes, which are the building blocks of the sun's magnetic field. The magnetic field is of particular interest to scientists since it is ultimately responsible for all of the dynamic activity we see on our closest star.

"These first analyses are extremely promising," said Solanki. "They show that the ultraviolet radiation from the chromosphere is highly suitable for visualizing detailed structures and processes."

The researchers now hope that the next months will provide more new insights – and are looking forward to a close collaboration with colleagues from NASA’s Interface Region Imaging Spectrograph, or IRIS, mission. IRIS launched on June 27, only weeks after the end of the Sunrise mission, and also studies the ultraviolet radiation from the chromosphere and corona. Michael Knoelker at the High Altitude Observatory in Boulder, Colo., is the NASA principal investigator for Sunrise.

Contacts and sources:
Karen C. Fox
NASA's Goddard Space Flight Center 

Friday, September 27, 2013

Human Robots Getting Closer: They Will Learn From Experience

A robot that feels, sees and, in particular, thinks and learns like us. It still seems like science fiction, but if UT researcher Frank van der Velde has his way, it won't stay that way. In his work he wants to implement the cognitive processes of the human brain in robots. The research should lead to the arrival of the latest version of the iCub robot in Twente. This human robot (humanoid) blurs the boundaries between robot and human.

Decades of scientific research into cognitive psychology and the brain have given us knowledge about language, memory, motor skills and perception. We can now use that knowledge in robots, but Frank van der Velde’s research goes even further. 

“The application of cognition in technical systems should also mean that the robot learns from its experiences and the actions it performs. A simple example: a robot that spills too much when pouring a cup of coffee can then learn how it should be done.”

Possible first iCub in the Netherlands

The arrival of the iCub robot at the University of Twente should mark the next step in this research. Van der Velde submitted an application together with fellow UT researchers Stefano Stramigioli, Vanessa Evers, Dirk Heylen and Richard van Wezel, all active in robotics and cognition research.
At the moment, twenty European laboratories have an iCub, which was developed in Italy (thanks to a European FP7 grant for the IIT); the Netherlands is still missing from the list. Moreover, a newer version is currently being developed, with, for example, haptic sensors. In February it will be announced whether the robotics group will actually bring the latest iCub to the UT.

The robot costs a quarter of a million euros, and NWO (Netherlands Organisation for Scientific Research) will reimburse 75% of the costs. TNO (Netherlands Organisation for Applied Scientific Research) and the universities of Groningen, Nijmegen, Delft and Eindhoven will then also be able to make use of it. Within the UT, the iCub can be deployed in different laboratories thanks to a special transport system.

Robot guide dog

“The possibilities are endless,” realises Van der Velde. “The new iCub has a skin and fingers that have a much better sense of touch and can feel force. That makes interaction with humans much more natural. We want to ensure that this robot continues to learn and understands how people function. This research ensures, for example, that robots actually gather knowledge by focusing on certain objects or persons. In areas of application like healthcare and nursing, such robots can play an important role. A good example would be that in ten years’ time you see a blind person walking with a robot guide dog.”

Nano-neural circuits

A recent line of research that is in line with this profile is the development of electronic circuits that resemble a web of neurons in the human brain. Contacts have already been made to start this research in Twente. In the iCub robot, this can for example be used for the robot’s visual perception. This requires a lot of relatively simple operations that must all be performed in parallel. This takes a lot of time and energy in the current systems. With electronic circuits in the form of a web of nerve cells this is much easier.

“These connections are only possible at the nanoscale, that is to say, a scale at which the material is only a few atoms thick. In combination with the iCub robot, we can investigate how the robot’s experiences are recorded in such materials and how the robot is controlled by nano-neural circuitry. The bottleneck of existing technical systems is often their energy consumption and size. The limits of Moore’s Law, the proposition that the number of transistors in a circuit doubles every two years through technological advances, are being reached. In this area, we are therefore also on the verge of many new applications.”
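The doubling Van der Velde refers to can be written as a simple formula; the 1971 baseline used below (the Intel 4004's roughly 2,300 transistors) is an illustrative assumption:

```python
def transistors(year, base_year=1971, base_count=2300):
    """Moore's law: transistor counts double roughly every two years.
    The baseline (Intel 4004, ~2,300 transistors in 1971) is illustrative."""
    return base_count * 2 ** ((year - base_year) / 2)

print(transistors(1973))  # one doubling -> 4600.0
```

Extrapolated to 2013, this idealized curve already predicts billions of transistors per chip, which is why circuit feature sizes are approaching atomic scales.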

Frank van der Velde

Frank van der Velde has the Technical Cognition chair within the Department of Cognitive Psychology and Ergonomics of the Faculty of Behavioural Sciences. He is affiliated with the research institute CTIT and participates in the European ConCreTe (Concept Creation Technology) project. In the middle of this month, he delivered his inaugural lecture for the position of professor at the University of Twente. 

Van der Velde has long had a fascination for cognition and technical systems. He refers to robots as ‘he’ or ‘she’. The example of a robot guide dog comes from his experiences with blindness in his personal circle of acquaintances. He is not afraid that robots will eventually dominate humanity. “It will never go that far. The pouring of coffee that I was talking about; let’s first make sure that it no longer makes a mess.”

Contacts and sources:
Jochem Vreeman
University of Twente

Mars More Earth-Like Than Expected Say Scientists

During the nearly 14 months that it has spent on the red planet, Curiosity, the Mars Science Laboratory (MSL) rover, has scooped soil, drilled rocks, and analyzed samples by exposing them to laser beams, X-rays, and alpha particles using the most sophisticated suite of scientific instruments ever deployed on another planet. One result of this effort was evidence reported last March that ancient Mars could have supported microbial life.

But Curiosity is far more than a one-trick rover, and in a paper published today in the journal Science, a team of MSL scientists reports its analysis of a surprisingly Earth-like martian rock that offers new insight into the history of Mars's interior and suggests parts of the red planet may be more like our own than we ever knew.

Martian Rock "Jake Matijevic" Obtained By Curiosity's Mast Camera    
 Credit: NASA/JPL-Caltech/MSSS

The paper—whose lead author is Edward Stolper, Caltech's William E. Leonhard Professor of Geology, provost, and interim president—is one of five appearing in the journal with results from the analysis of data and observations obtained during Curiosity's first 100 martian days (sols). The other papers include an evaluation of fine- and coarse-grained soil samples and detailed analyses of the composition and formation process of a windblown drift of sand and dust.

"The results presented go beyond the question of habitability," says John Grotzinger, MSL project scientist and Caltech's Fletcher Jones Professor of Geology. "Mars Science Laboratory also has a major mission objective to explore and characterize the geological environment at all scales and also the atmosphere. In doing this we learn about the fundamental physical and chemical properties that distinguish the terrestrial planets from each other and also what they share in common."

The paper by Stolper and his colleagues—including Caltech senior research scientist Michael Baker and graduate student Megan Newcombe—examines in detail a 50-centimeter-tall pyramid-shaped rock named "Jake_M" (after MSL surface operations systems chief engineer Jacob "Jake" Matijevic, who passed away two weeks after Curiosity's landing).

The rock was encountered by Curiosity a few weeks after it landed, during its slow drive across Gale Crater on the way toward the crater's central peak, Mount Sharp. Visual inspection of the dark gray rock suggested that it was probably a fine-grained basaltic igneous rock formed by the crystallization of magma near the planet's surface. The absence of obvious mineral grains on its essentially dust-free surface further suggested that it would have a relatively uniform (i.e., homogeneous) chemical composition.

For that reason, MSL's scientists decided it would be a good test case for comparing the results obtained by two of the rover's scientific instruments, the Alpha Particle X-ray Spectrometer (APXS) and ChemCam, both of which are used to measure the chemical compositions of rocks, sediments, and minerals.

The APXS analyses, however, produced some unanticipated results. Far from being similar in its chemical composition to the many martian igneous rocks analyzed by the Spirit and Opportunity rovers on the surface of Mars or to martian meteorites found on Earth, Jake_M is highly enriched in sodium and potassium, making it chemically alkaline.

Although Jake_M is very different from known martian rocks, Stolper and colleagues realized that it is very similar in its chemical composition to a relatively rare type of terrestrial igneous rock, known as a mugearite, which is typically found on ocean islands and in continental rift zones.

"We realized right away that although nothing like it had ever been found on Mars, Jake_M is similar in composition to terrestrial mugearites, which although uncommon are very well known to igneous petrologists who study volcanic rocks on Earth," Stolper says. "In fact, if this rock were found on Earth, we would be hard pressed, based on its elemental composition, to tell it was not an Earth rock." However, he notes, "such rocks are so uncommon on Earth that it would be highly unlikely that, if you landed a spacecraft on Earth in a random location, the first rock you encountered within a few hundred meters of your landing site would be an alkaline rock like Jake_M."

On both Earth and Mars, basaltic liquids form by partial melting of rocks deep inside the planet. By analogy with terrestrial mugearites, Jake_M probably evolved from such a partial melt that cooled as it ascended toward the surface from the martian interior; as it cooled, crystals formed, and the chemical composition of the remaining liquid changed (just as, in the making of rock candy, a sugar-water solution becomes less sweet as it cools and sugar crystallizes from it).

"The minerals that crystallize have different elemental compositions than the melt and are either more dense or less dense than the liquid and thus tend to physically separate, that is, to settle to the bottom of the magma chamber or float to the top, causing the chemical composition of the remaining liquid to change," Baker explains.

The MSL team then modeled the conditions required to produce a residual liquid similar in composition to Jake_M by crystallization of plausible partial melts. From those results, they inferred that the cooling and crystallization that eventually produced Jake_M probably occurred at pressures of several kilobars, the equivalent of the pressure at a depth of a few tens of kilometers beneath the martian surface. The modeling also suggested—particularly by analogy with terrestrial mugearites—that the martian magmas were relatively rich in dissolved water.

According to Stolper, Baker, and their colleagues, Jake_M probably originated via the melting of a relatively alkali- and water-rich martian mantle that was different from the sources of other known martian basalts. Because the primitive martian mantle is believed to have been as much as two times richer in sodium and potassium than Earth's mantle, the researchers say that, in hindsight, it might not be surprising if alkaline magmas, which are so uncommon on Earth, are more common on Mars.

Moreover, Stolper adds, "there are many hypotheses for origin of alkaline magmas on Earth that are similar to Jake_M. Perhaps the most plausible is that regions deep in the mantle become enriched in alkalis by a process known as metasomatism, in which the chemical compositions of rocks are altered by the flow of water- and carbon-dioxide-rich fluids. The existence of Jake_M may be evidence that such processes also occur in the interior of Mars."

Intriguingly, the potassium-rich nature of many of the sedimentary rocks that have been analyzed by the MSL mission may turn out to reflect the presence of such a region enriched in alkalis in the mantle underlying Gale Crater.

However, he says, "with only one rock having this odd chemical composition, we don't want to get carried away. Is it a one-off, or is it a representative of an important class of igneous rocks from the Gale Crater region? Determining the answer to this will be an important goal for the ongoing MSL mission."

"The paper by Stolper et al. shows that the internal composition of Mars is more similar to Earth than we had thought and illustrates how even a single rock can provide insight into the evolution of the planet as a whole," Grotzinger says.

The work in the paper, "The Petrochemistry of Jake_M: A Martian Mugearite," was supported by grants from the National Science Foundation, the National Aeronautics and Space Administration, the Canadian Space Agency, and the Centre National d'Études Spatiales.

Contacts and sources:
Written by Kathy Svitil

Breathing Underwater: Evidence Of Microscopic Life In Oceanic Crust

Although long thought to be devoid of life, the bottom of the deep ocean is now known to harbor entire ecosystems teeming with microbes. Scientists have recently documented that oxygen is disappearing from seawater circulating through deep oceanic crust, a significant first step in understanding the way life in the "deep biosphere" beneath the sea floor is able to survive and thrive.

Dr. Beth Orcutt (front, second from left) of Bigelow Laboratory for Ocean Sciences examines oceanic crust samples with Dr. Wolfgang Bach of the University of Bremen, Germany, during IODP Expedition 336 to the Mid-Atlantic Ridge flank.
Credit: Photo courtesy of Jennifer T. Magnusson.

A team of researchers led by Dr. Beth Orcutt of the Bigelow Laboratory for Ocean Sciences used the JOIDES Resolution, a sophisticated 470-foot scientific drilling vessel operated by the international Integrated Ocean Drilling Program (IODP), to sample the muddy and sandy sediments that blanket the rocks of the seafloor and to drill into the hard crustal rocks themselves, considered by many to be the largest reservoir of life on Earth. The goal was to understand how microbes can "breathe" and get the energy necessary to live in this remote environment.

The team measured oxygen concentrations in sediment cores collected above the rocky oceanic crust, almost three miles below the sea surface, on the western edge of the remote Mid-Atlantic Ridge. These measurements then allowed the researchers to determine oxygen concentration in seawater circulating in the rocks of the oceanic crust itself.

"Our computer models showed that the crustal oxygen concentrations in the region were most likely the result of microbial life forms scavenging oxygen in the crust as seawater moves through fractures and cracks deep in the rocks," said Orcutt. "Under the cold conditions of the crust in this area, purely chemical oxygen consumption is minimal, which suggests that microbes in the oceanic crust are responsible for using the oxygen that's down there."
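The kind of modeling Orcutt describes can be illustrated with a toy calculation: dissolved oxygen in fluid moving through crustal fractures decays with travel time if it is consumed at a first-order rate. This sketch is illustrative only; the starting concentration, consumption rate, and flow speed below are assumed round numbers, not values from the study.

```python
import math

def oxygen_along_flowpath(c0_uM, k_per_yr, v_m_per_yr, distances_m):
    """First-order consumption of dissolved O2 in fluid moving at speed v:
    c(x) = c0 * exp(-k * x / v), where x / v is the fluid travel time."""
    return [c0_uM * math.exp(-k_per_yr * x / v_m_per_yr) for x in distances_m]

# Illustrative numbers only (not from the study): seawater enters the
# crust with ~100 uM O2 and moves laterally at ~1 m/yr.
distances = [0, 100, 500, 1000]
profile = oxygen_along_flowpath(100.0, 0.005, 1.0, distances)
for x, c in zip(distances, profile):
    print(f"{x:5d} m: {c:6.1f} uM O2")
```

Comparing a measured downstream oxygen drop against the near-zero abiotic consumption expected at cold temperatures is what lets the team attribute the loss to microbes.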

"We know there's a vast reservoir of life in the ocean crust, but unless we take steps to quantify its metabolism, we'll never know how vast it is," said co-author Dr. Sam Hulme, from Moss Landing Marine Laboratories.

Another co-author of the paper, Dr. Geoff Wheat of the University of Alaska Fairbanks, pointed out that the chemical composition of seawater within the pore spaces between sediment grains provides important information about what reactions occur there and how fast they happen. "This result sets the stage for more directed experiments to understand how microbes use the oxygen for growth in a place with little food," Wheat said.

"One of the biggest goals of the international scientific ocean drilling research community is to understand how life functions in the vast 'deep biosphere' buried alive below the seafloor, but it's very challenging to access and explore the hard rocks that make up the base of the seafloor," Orcutt added. "Our results are the first to document the removal of oxygen in the rocky crustal environment — something that had been expected but not shown until now. With this information, we can start to unravel the complex mystery of life below the seafloor."

"Detecting life by measuring oxygen in subseafloor environments with vigorous seawater flow is not an easy task," agreed Dr. Wolfgang Bach, a scientist at the University of Bremen in Germany, and another coauthor of the paper. "Imagine an extraterrestrial life-detection task force landing on Earth with oxygen probes as the only life-detection tool. If they ended up in a well-ventilated meeting room stuffed with delegates, they'd conclude from the measurements they'd be making that respiration was minimal, hence life is slow, if not absent. Doing these measurements in an environment where we think we know the direction of flow of seawater and detecting a gradient in oxygen makes all the difference in making inferences about subseafloor life."

"Tiny microbial life on Earth is responsible for big tasks like global chemical cycling. In order to understand how important elements like oxygen — which we all need to breathe — move around Earth, we need to understand how quickly it is consumed in the largest aquifer on Earth, oceanic crust," said Orcutt.

The Nature Communications paper, "Oxygen consumption in subseafloor basaltic oceanic crust," is an outcome of the 2011 IODP Expedition #336 to the western flank of the Mid-Atlantic Ridge, funded in part by the National Science Foundation (NSF).

Bigelow Laboratory is an independent, non-profit center for global ocean research, ocean science education, and technology transfer in coastal Maine. A recognized leader in Maine's emerging innovation economy, the Laboratory's research ranges from microbial oceanography to the large-scale processes that drive ocean systems and global environmental conditions.

The new research findings were published in the journal Nature Communications on September 27, 2013, and are helping to redefine our concepts of the limits of life on our planet.

Contacts and sources:
Tatiana Brailovskaya
Bigelow Laboratory for Ocean Sciences

Thursday, September 26, 2013

Lunar Orbiters Discover Source Of Space Weather Near Earth

Solar storms — powerful eruptions of solar material and magnetic fields into interplanetary space — can cause what is known as "space weather" near Earth, resulting in hazards that range from interference with communications systems and GPS errors to extensive power blackouts and the complete failure of critical satellites.

An artist's depiction of magnetic reconnection, the process that powers the phenomenon known as space weather.
Credit: UCLA

New research published today increases our understanding of Earth's space environment and how space weather develops.

Some of the energy emitted by the sun during solar storms is temporarily stored in Earth's stretched and compressed magnetic field. Eventually, that solar energy is explosively released, powering Earth's radiation belts and lighting up the polar skies with brilliant auroras. And while it is possible to observe solar storms from afar with cameras, the invisible process that unleashes the stored magnetic energy near Earth had defied observation for decades.

In the Sept. 27 issue of the journal Science, researchers from the UCLA College of Letters and Science, the Austrian Space Research Institute (IWF Graz) and the Japan Aerospace Exploration Agency (JAXA) report that they finally have measured the release of this magnetic energy close up using an unprecedented alignment of six Earth-orbiting spacecraft and NASA's first dual lunar orbiter mission, ARTEMIS.

Space weather begins to develop inside Earth's magnetosphere, the giant magnetic bubble that shields the planet from the supersonic flow of magnetized gas emitted by the sun. During solar storms, some solar energy enters the magnetosphere, stretching the bubble out into a long, teardrop-shaped tail that extends more than a million miles into space.

The stored magnetic energy is then released by a process called "magnetic reconnection." This event can be detected only when fast flows of energized particles pass by a spacecraft positioned at exactly the right place at the right time.

Still from an animation of the space weather process: a solar storm eruption influences Earth's magnetic field, resulting in an explosive burst of energy known as magnetic reconnection.

Luckily, this happened in 2008, when NASA's five Earth-orbiting THEMIS satellites discovered that magnetic reconnection was the trigger for near-Earth substorms, the fundamental building blocks of space weather. However, there was still a piece of the space weather puzzle missing: There did not appear to be enough energy in the reconnection flows to account for the total amount of energy released for typical substorms.

In 2011, in an attempt to survey a wider area of the Earth's magnetosphere, the THEMIS team repositioned two of its five spacecraft into lunar orbits, creating a new mission dubbed ARTEMIS after the Greek goddess of the hunt and the moon. From afar, these two spacecraft provided a unique global perspective of energy storage and release near Earth.

(For more on ARTEMIS, visit the ARTEMIS websites hosted by UC Berkeley, NASA and UCLA.)

Similar to a pebble creating expanding ripples in a pond, magnetic reconnection generates expanding fronts of electricity, converting the stored magnetic energy into particle energy. Previous spacecraft observations could detect these energy-converting reconnection fronts for a split second as the fronts went by, but they could not assess the fronts' global effects because data were collected at only a single point.

By the summer of 2012, however, an alignment among THEMIS, ARTEMIS, JAXA's Geotail satellite and the U.S. National Oceanic and Atmospheric Administration's GOES satellite was finally able to capture data accounting for the total amount of energy that drives space weather near Earth.

During this event, reported in the current Science paper, a tremendous amount of energy was released.

"The amount of power converted was comparable to the electric power generation from all power plants on Earth — and it went on for over 30 minutes," said Vassilis Angelopoulos, a professor in the UCLA Department of Earth, Planetary and Space Sciences, principal investigator for ARTEMIS and THEMIS, and lead author of the research in Science. "The amount of energy released was equivalent to a 7.1 Richter-scale earthquake."
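The two comparisons in the quote are mutually consistent, which can be checked with the standard Gutenberg-Richter energy-magnitude relation, log10 E = 1.5 M + 4.8 (E in joules). Spreading a magnitude-7.1 earthquake's energy over 30 minutes gives a power on the order of a terawatt, which is roughly the scale of worldwide electric power generation:

```python
import math

def quake_energy_joules(magnitude):
    """Gutenberg-Richter energy-magnitude relation: log10(E[J]) = 1.5*M + 4.8."""
    return 10 ** (1.5 * magnitude + 4.8)

E = quake_energy_joules(7.1)   # ~2.8e15 J
P = E / (30 * 60)              # average power if released over 30 minutes
print(f"Energy: {E:.2e} J")
print(f"Average power: {P / 1e12:.1f} TW")  # on the order of a terawatt
```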

 Space weather process: Solar storm eruption influences Earth's magnetic field, resulting in explosive burst of energy known as magnetic reconnection.
Credit: UCLA

Trying to understand how gigantic explosions on the sun can have effects near Earth involves tracking energy from the original solar event all the way to Earth. It is like keeping tabs on a character in a play who undergoes many costume changes, researchers say, because the energy changes frequently along its journey: Magnetic energy causes solar eruptions that lead to flow energy as particles hurtle away, or to thermal energy as the particles heat up.

Near Earth, that energy can go through all the various changes in form once again. Understanding the details of each step in the process is crucial for scientists to achieve their goal of someday predicting the onset and intensity of space weather.

Using ARTEMIS, a clear picture emerged of the total energy stored, and the entire fleet of satellites tracked the energy fronts at high time resolution, Angelopoulos said.

The spacecraft and satellites observed two expanding energy fronts launched symmetrically on either side of the magnetic reconnection site, one moving toward Earth and the other away from it, past the moon. The magnetic energy was transformed into particle and wave energy during its quarter-million-mile journey from its origin within a narrow region only a few dozen miles across.

This, the researchers said, explains why single-satellite measurements in the past could not account for much of the energy release. The multi-satellite fleet, however, showed that the energy conversion continued for up to 30 minutes after the onset of reconnection.

"We have finally found what powers Earth's aurora and radiation belts," Angelopoulos said. "It took many years of mission planning and patience to capture this phenomenon on multiple satellites, but it has certainly paid off. We were able to track the total energy and see where and when it is converted into different kinds of energy."

With the full, global picture of energy storage and transfer in the magnetosphere, scientists can now focus their attention on the physics of the energy conversion and its eventual dissipation in order to improve space weather forecasts.

What scientists learn on Earth can also inform our knowledge elsewhere in our solar system. The sun's eruptions are also controlled largely by magnetic reconnection, and intense auroras at Jupiter create the most powerful electromagnetic emissions in the solar system besides the sun, Angelopoulos said.

Similar emissions from planets orbiting other stars may one day reveal the interior structure of distant worlds. Since the sun's surface and very distant planets cannot yet be visited, there is no place better than Earth's own space environment to study energy transformation on large and small scales with a coordinated fleet of highly capable satellites, he said.

NASA is currently building the Heliophysics System Observatory, which combines existing and future satellite resources in space, including THEMIS, ARTEMIS, the recently launched twin Van Allen Radiation Belt Probes, and the four Magnetospheric MultiScale satellites, which will be launched in 2014 (and which have involved UCLA scientific and hardware participation).

"It is a very exciting time ahead," said David Sibeck, THEMIS/ARTEMIS project scientist at NASA's Goddard Space Flight Center. "Never before did we have the possibility for so many high-quality observatories lining up."

ARTEMIS stands for Acceleration, Reconnection, Turbulence and Electrodynamics of the Moon's Interaction with the Sun. THEMIS (Time History of Events and Macroscale Interactions during Substorms) was launched Feb. 17, 2007, from Cape Canaveral, Fla., to impartially resolve the trigger mechanism of substorms. Themis was the blindfolded Greek goddess of order and justice.

THEMIS and ARTEMIS are part of NASA's Explorer program, managed by the Goddard Space Flight Center. UC Berkeley's Space Sciences Laboratory is responsible for mission operations and built several of the on-board and ground-based instruments. Austria, Canada, France and Germany contributed instrumentation, operations and science. ATK (formerly Swales Aerospace) built the THEMIS spacecraft. ARTEMIS's orbit design, navigation and execution were the result of a collaborative effort among NASA's Jet Propulsion Laboratory, the Goddard Space Flight Center and UC Berkeley. UCLA scientists built the ground magnetometers.

Contacts and sources:
By Emmanuel Masongsong and UCLA Newsroom
University of California - Los Angeles

Researchers Publish Enormous Catalog Of More Than 300,000 Nearby Galaxies

More than 83,000 volunteer citizen scientists. Over 16 million galaxy classifications. Information on more than 300,000 galaxies. This is what you get when you ask the public for help in learning more about our universe.

This galaxy, NGC 4565, is a disk galaxy viewed at nearly an edge-on angle. Galaxies like these are of particular interest for their links to star formation and the speeds at which galaxies rotate.

The project, named Galaxy Zoo 2, is the second phase of a crowdsourcing effort to categorize galaxies in our universe. Researchers say computers are good at automatically measuring properties such as size and color of galaxies, but more challenging characteristics, such as shape and structure, can currently only be determined by the human eye.

An international group of researchers, led by the University of Minnesota, has just produced a catalog of this new galaxy data. The catalog is 10 times larger than any previous catalog of its kind and is available online; a paper describing the project and data was published today in the Monthly Notices of the Royal Astronomical Society.


"This catalog is the first time we’ve been able to gather this much information about a population of galaxies," said Kyle Willett, a physics and astronomy postdoctoral researcher in the University of Minnesota’s College of Science and Engineering and the paper’s lead author. "People all over the world are beginning to examine the data to gain a more detailed understanding of galaxy types."

Between February 2009 and April 2010, more than 83,000 Galaxy Zoo 2 volunteers from around the world looked at images online gathered from the Sloan Digital Sky Survey. They answered questions about each galaxy, including whether it had spiral arms, how many, and whether it had a galactic bar, a long extended feature that represents a concentration of stars. Each image was classified an average of 40 to 45 times to ensure accuracy. In all, more than 16 million classifications of more than 300,000 galaxies were gathered, representing about 57 million computer clicks.
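The aggregation step can be sketched as a simple consensus over independent votes per image. The real catalog uses weighted and debiased vote fractions, so the helper and threshold below are illustrative assumptions, not the project's actual pipeline:

```python
from collections import Counter

def consensus(votes, threshold=0.8):
    """Aggregate independent classifications of one galaxy image.
    Returns (top answer, vote fraction, confident?). The published catalog
    uses weighted, debiased fractions; this shows only the core idea."""
    counts = Counter(votes)
    answer, n = counts.most_common(1)[0]
    frac = n / len(votes)
    return answer, frac, frac >= threshold

# ~42 independent classifications of one image
votes = ["spiral"] * 38 + ["elliptical"] * 4
print(consensus(votes))  # -> ('spiral', 0.904..., True)
```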

When volunteers were asked why they got involved in the project, the most common answer was because they enjoyed contributing to science. Researchers estimate that the effort of the volunteers on this project represents about 30 years of full-time work by one researcher.

"With today’s high-powered telescopes, we are gathering so many new images that astronomers just can’t keep up with detailed classifications," said Lucy Fortson, a professor of physics and astronomy in the University of Minnesota’s College of Science and Engineering and one of the co-authors of the research paper. "We could never have produced a data catalog like this without crowdsourcing help from the public."

Fortson said Galaxy Zoo 2 is similar to a census of the galaxies. With this new catalog, researchers now have a snapshot of the different types of galaxies as they are today. The next catalog will tell us about galaxies in the distant past. The catalogs together will let us understand how our universe is changing.

To help create the next catalog, volunteer citizen scientists continue to be needed for the project. To participate, visit the Galaxy Zoo website. No special skills are needed, and volunteers can start classifying galaxies and helping the scientists within minutes of going to the website.

In addition to Fortson and Willett, other authors of the research paper include Chris Lintott, Oxford Astrophysics and Adler Planetarium; Steven Bamford, University of Nottingham; Karen Masters, Robert Nichol and Daniel Thomas, University of Portsmouth and South East Physics Network; Brooke Simmons and Robert Simpson, Oxford Astrophysics; Kevin Casteels, University of Barcelona; Edward Edmondson and Thomas Melvin, University of Portsmouth; Sugata Kaviraj, Oxford Astrophysics and University of Hertfordshire; William Keel, University of Alabama; M. Jordan Raddick, Johns Hopkins University; Kevin Schawinski, ETH Zurich; Ramin Skibba, University of California, San Diego; and Arfon Smith, Adler Planetarium.

The research was funded primarily by the National Science Foundation and the Leverhulme Trust. Galaxy Zoo is one of the many online citizen science projects made available by the team.

To read the full research paper entitled "Galaxy Zoo 2: detailed morphological classifications for 304,122 galaxies from the Sloan Digital Sky Survey," visit the Monthly Notices of the Royal Astronomical Society website.

Contacts and sources:

NASA, Homeland Security Test Disaster Recovery Tool

NASA and the U.S. Department of Homeland Security are collaborating on a first-of-its-kind portable radar device to detect the heartbeats and breathing patterns of victims trapped in large piles of rubble resulting from a disaster.

The prototype technology, called Finding Individuals for Disaster and Emergency Response (FINDER), can locate individuals buried as deep as 30 feet (about 9 meters) in crushed materials, hidden behind 20 feet (about 6 meters) of solid concrete, and from a distance of 100 feet (about 30 meters) in open spaces.

A new portable radar device can detect the heartbeats and breathing of victims trapped under rubble in a disaster.

Image Credit: NASA/JPL-Caltech

Developed in conjunction with Homeland Security's Science and Technology Directorate, FINDER is based on remote-sensing radar technology developed by NASA's Jet Propulsion Laboratory in Pasadena, Calif., to monitor the location of spacecraft JPL manages for NASA's Science Mission Directorate in Washington.

This picture is from a test of the Finding Individuals for Disaster and Emergency Response (FINDER) prototype technology at the Virginia Task Force 1 Training Facility in Lorton, VA.

Image Credit: DHS/John Price

"FINDER is bringing NASA technology that explores other planets to the effort to save lives on ours," said Mason Peck, chief technologist for NASA and principal advisor on technology policy and programs. "This is a prime example of intergovernmental collaboration and expertise that has a direct benefit to the American taxpayer."

The technology was demonstrated to the media today at the DHS's Virginia Task Force 1 Training Facility in Lorton, Va. Media participated in demonstrations that featured the device locating volunteers hiding under heaps of debris. FINDER also will be tested further by the Federal Emergency Management Agency this year and next.

"The ultimate goal of FINDER is to help emergency responders efficiently rescue victims of disasters," said John Price, program manager for the First Responders Group in Homeland Security's Science and Technology Directorate in Washington. "The technology has the potential to quickly identify the presence of living victims, allowing rescue workers to more precisely deploy their limited resources."

The technology works by beaming microwave radar signals into the piles of debris and analyzing the patterns of signals that bounce back. NASA's Deep Space Network regularly uses similar radar technology to locate spacecraft: a radio signal is sent to a spacecraft, and the time it takes for the signal to return reveals how far away the spacecraft is. This technique is used for science research, too. For example, the Deep Space Network monitors the location of the Cassini spacecraft in its orbit around Saturn to learn about the ringed planet's internal structure.
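The round-trip timing technique described above reduces to distance = c × t / 2, since the signal travels out and back. A minimal sketch, using a rough round number for the Earth-Saturn distance purely for illustration:

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(seconds):
    """Radar ranging: one-way distance is half the round-trip
    light travel time multiplied by the speed of light."""
    return C * seconds / 2

# Saturn is very roughly 1.4e12 m away; round trip takes ~2.6 hours.
rt = 2 * 1.4e12 / C
print(f"Round trip: {rt / 60:.0f} minutes")              # ~156 minutes
print(f"Recovered range: {range_from_round_trip(rt):.3e} m")
```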

"Detecting small motions from the victim's heartbeat and breathing from a distance uses the same kind of signal processing as detecting the small changes in motion of spacecraft like Cassini as it orbits Saturn," said James Lux, task manager for FINDER at JPL.

In disaster scenarios, the use of radar signals can be particularly complex. Earthquakes and tornadoes produce twisted and shattered wreckage, such that any radar signals bouncing back from these piles are tangled and hard to decipher. JPL's expertise in data processing helped with this challenge. Advanced algorithms isolate the tiny signals from a person's moving chest by filtering out other signals, such as those from moving trees and animals.
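One way to see how periodic chest motion can be separated from slow clutter is in the frequency domain: breathing (roughly 0.1-0.5 Hz) and heartbeats (roughly 1-2 Hz) occupy distinct bands. The toy example below is not the JPL algorithm; the sample rate, signal frequencies, and brute-force DFT are illustrative assumptions:

```python
import math

FS = 50  # sample rate, Hz (assumed for illustration)

def band_power(signal, f_lo, f_hi):
    """Crude DFT band power: sum |X(f)|^2 over frequency bins in [f_lo, f_hi]."""
    n = len(signal)
    total = 0.0
    for k in range(1, n // 2):
        f = k * FS / n
        if f_lo <= f <= f_hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
            total += re * re + im * im
    return total

# Synthetic "chest motion": breathing at 0.3 Hz plus a weaker heartbeat at
# 1.2 Hz, buried under large slow clutter (e.g. swaying debris) at 0.05 Hz.
n = FS * 20  # 20 seconds of data
sig = [math.sin(2 * math.pi * 0.3 * i / FS)
       + 0.2 * math.sin(2 * math.pi * 1.2 * i / FS)
       + 2.0 * math.sin(2 * math.pi * 0.05 * i / FS)
       for i in range(n)]

# The breathing band stands out even though the clutter is 2x stronger,
# because the clutter's energy falls outside the band of interest.
print("breathing band dominates heartbeat band:",
      band_power(sig, 0.1, 0.5) > band_power(sig, 0.9, 1.5))
```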

Similar technology has potential applications in NASA's future human missions to space habitats. The astronauts' vital signs could be monitored without the need for wires.

The Deep Space Network, managed by JPL, is an international network of antennas that supports interplanetary spacecraft missions and radio and radar astronomy observations for the exploration of the solar system and the universe. The network also supports selected Earth-orbiting missions.


Contacts and sources:
Dwayne Brown  

Whitney Clavin
Jet Propulsion Laboratory