Unseen Is Free

Wednesday, August 20, 2014

Neanderthals 'Overlapped' With Modern Humans For Up To 5,400 Years

Neanderthals and modern humans both lived in Europe for between 2,600 and 5,400 years, according to a new paper published in the journal Nature. For the first time, scientists have constructed a robust timeline showing when the last Neanderthals died out.

The image shows a Neanderthal model from the Natural History Museum. The Museum carried out the research in collaboration with Oxford.
Credit: University of Oxford

Significantly, the research paper says there is strong evidence to suggest that Neanderthals disappeared at different times across Europe rather than being rapidly replaced by modern humans.

A team, led by Professor Thomas Higham of the University of Oxford, obtained new radiocarbon dates for around 200 samples of bone, charcoal and shell from 40 key European archaeological sites. The sites, ranging from Russia in the east to Spain in the west, were either linked with the Neanderthal tool-making industry, known as Mousterian, or were ‘transitional’ sites containing stone tools associated with either early modern humans or Neanderthals.

The chronology was pieced together during a six-year research project by building mathematical models that combine the new radiocarbon dates with established archaeological stratigraphic evidence. The results showed that both groups overlapped for a significant period, giving ‘ample time’ for interaction and interbreeding. The paper adds, however, it is not clear where interbreeding may have happened in Eurasia or whether it occurred once or several times.
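
The intuition behind this kind of model building can be shown with a toy example. In the Python sketch below, a single stratigraphic constraint (a lower layer must be older than the layer above it) tightens two simulated radiocarbon dates. The numbers are invented, and the team's actual Bayesian models were far more elaborate; this only illustrates why combining dates with stratigraphy helps.

```python
import numpy as np

# Two simulated dates with large individual uncertainties (invented numbers).
rng = np.random.default_rng(1)
age_lower = rng.normal(41_000, 800, 100_000)  # date from the lower layer
age_upper = rng.normal(40_000, 800, 100_000)  # date from the upper layer

# Stratigraphy says the lower layer must be older: keep only consistent draws.
keep = age_lower > age_upper

print(f"lower layer, unconstrained: {age_lower.mean():,.0f} "
      f"+/- {age_lower.std():,.0f} yr")
print(f"lower layer, constrained:   {age_lower[keep].mean():,.0f} "
      f"+/- {age_lower[keep].std():,.0f} yr")
```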

Professor Thomas Higham said: "Other recent studies of Neanderthal and modern human genetic make-up suggest that both groups interbred outside Africa, with 1.5%-2.1% or more of the DNA of modern non-African human populations originating from Neanderthals."

He added, "We believe we now have the first robust timeline that sheds new light on some of the key questions around the possible interactions between Neanderthals and modern humans. The chronology also pinpoints the timing of the Neanderthals’ disappearance, and suggests they may have survived in dwindling populations in pockets of Europe before they became extinct."



In 2011, another Nature paper featuring Dr Katerina Douka of the Oxford team obtained some very early dates (around 45,000 years old) for the so-called ‘transitional’ Uluzzian stone-tool industry of Italy and identified dental remains at the site of Grotta del Cavallo, Apulia, as those of anatomically modern humans.

Under the new timeline published today, the Mousterian industry (attributed to Neanderthals and found across vast areas of Europe and Eurasia) is shown to have ended between 41,030 and 39,260 years ago. This strongly suggests an extensive overlap between Neanderthals and modern humans lasting several thousand years. For the first time, the scientific team has specified how long this overlap lasted, at 95% probability.

The Uluzzian also contains objects, such as shell beads, that scholars widely believe signify symbolic or advanced behaviour in early human groups. One or two of the Châtelperronian sites of France and northern Spain (currently, although controversially, associated with Neanderthals) contain some similar items. 

This supports the theory first advanced several years ago that the arrival of early modern humans in Europe may have stimulated the Neanderthals into copying aspects of their symbolic behaviour in the millennia before they disappeared. The paper also presents an alternative theory: that the similar start dates of the two industries could mean that Châtelperronian sites are associated with modern humans and not Neanderthals after all.

There is currently no evidence to show that Neanderthals and early modern humans lived closely together, regardless of whether the Neanderthals were responsible for the Châtelperronian culture, the paper says. Rather than modern humans rapidly replacing Neanderthals, there seems to have been a more complex picture ‘characterised by a biological and cultural mosaic that lasted for several thousand years’. 

The Châtelperronian industry follows the Mousterian in archaeological layers at all sites where both occur. Importantly, however, the Châtelperronian industry appears to have started significantly before the end of the Mousterian at some sites in Europe. This suggests that if Neanderthals were responsible for both cultures, there may have been some regional variation in their tool-making, says the paper.

Professor Higham said: ‘Previous radiocarbon dates have often underestimated the age of samples from sites associated with Neanderthals because the organic matter was contaminated with modern particles. We used ultrafiltration methods, which purify the extracted collagen from bone, to avoid the risk of modern contamination. This means we can say with more confidence that we have finally resolved the timing of the disappearance of our close cousins, the Neanderthals. Of course the Neanderthals are not completely extinct because some of their genes are in most of us today.’
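
The scale of the contamination problem Higham describes is easy to see with the standard age equation, t = -8033 ln(F), where F is the measured fraction of ‘modern’ radiocarbon and 8,033 years is the Libby mean life. The short Python sketch below, with illustrative contamination levels, shows how even a trace of modern carbon makes a genuinely 45,000-year-old sample appear far younger.

```python
import math

# Conventional radiocarbon age from the measured fraction of modern carbon F:
# t = -8033 * ln(F), with 8033 yr the Libby mean life.
def radiocarbon_age(F):
    return -8033.0 * math.log(F)

# A genuinely 45,000-year-old sample retains very little 14C:
F_true = math.exp(-45_000 / 8033)  # about 0.4% of the modern level

# Mixing in even a little modern carbon (F = 1.0) swamps that tiny signal.
for c in (0.0, 0.005, 0.01, 0.02):
    F_measured = (1 - c) * F_true + c * 1.0
    print(f"{c:4.1%} modern contamination -> apparent age "
          f"{radiocarbon_age(F_measured):,.0f} yr")
```

With just 1 percent modern carbon the sample appears only about 34,500 years old, precisely the kind of underestimate that ultrafiltration is designed to prevent.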

Previous research had suggested that the Iberian Peninsula (modern-day Spain and Portugal) and the site of Gorham’s Cave, Gibraltar, might have been the final places in Europe where Neanderthals survived. Despite extensive dating work, the research team could not confirm the previous dates. The paper suggests that poor preservation of the dating material could have led to contamination, producing falsely young dates in the earlier studies.


Contacts and sources:
University of Oxford

Tuesday, August 19, 2014

Solar Energy That Doesn't Block The View

A team of researchers at Michigan State University has developed a new type of solar concentrator that, when placed over a window, generates solar energy while still allowing people to see through the window.

It is called a transparent luminescent solar concentrator and can be used on buildings, cell phones and any other device that has a clear surface.

And, according to Richard Lunt of MSU’s College of Engineering, the key word is “transparent.”

Solar power with a view: MSU doctoral student Yimu Zhao holds up a transparent luminescent solar concentrator module.
Photo by Yimu Zhao.

Research into producing energy with solar cells placed around luminescent plastic-like materials is not new. Past efforts, however, have yielded poor results – the energy production was inefficient and the materials were highly colored.

“No one wants to sit behind colored glass,” said Lunt, an assistant professor of chemical engineering and materials science. “It makes for a very colorful environment, like working in a disco. We take an approach where we actually make the luminescent active layer itself transparent.”

The solar harvesting system uses small organic molecules developed by Lunt and his team to absorb specific nonvisible wavelengths of sunlight.

“We can tune these materials to pick up just the ultraviolet and the near infrared wavelengths that then ‘glow’ at another wavelength in the infrared,” he said.

The “glowing” infrared light is guided to the edge of the plastic where it is converted to electricity by thin strips of photovoltaic solar cells.

A transparent luminescent solar concentrator waveguide is shown with colorful traditional luminescent solar concentrators in the background. The new LSC can create solar energy but is not visible on windows or other clear surfaces.
Photo by G.L. Kohuth   

“Because the materials do not absorb or emit light in the visible spectrum, they look exceptionally transparent to the human eye,” Lunt said.

One of the benefits of this new development is its flexibility. While the technology is at an early stage, it has the potential to be scaled to commercial or industrial applications at an affordable cost.

“It opens a lot of area to deploy solar energy in a non-intrusive way,” Lunt said. “It can be used on tall buildings with lots of windows or any kind of mobile device that demands high aesthetic quality like a phone or e-reader. Ultimately we want to make solar harvesting surfaces that you do not even know are there.”

Lunt said more work is needed to improve the technology’s energy-producing efficiency. It currently achieves a solar conversion efficiency close to 1 percent, but the team aims to reach efficiencies beyond 5 percent when fully optimized. The best colored LSC has an efficiency of around 7 percent.
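
For a rough sense of scale, the sketch below converts those efficiencies into watts for a single window. The irradiance and window area are assumptions chosen for illustration; only the roughly 1 percent and 5 percent figures come from the article.

```python
# Back-of-the-envelope output of a transparent LSC window.
SOLAR_IRRADIANCE = 1000.0  # W/m^2, typical peak sunlight (assumed)
WINDOW_AREA = 2.0          # m^2, one office window (assumed)

for efficiency in (0.01, 0.05):  # ~1% today, >5% targeted (from the article)
    power_w = SOLAR_IRRADIANCE * WINDOW_AREA * efficiency
    print(f"{efficiency:.0%} efficiency -> about {power_w:.0f} W per window")
```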

The research was featured on the cover of a recent issue of the journal Advanced Optical Materials.

Other members of the research team include Yimu Zhao, an MSU doctoral student in chemical engineering and materials science; Benjamin Levine, assistant professor of chemistry; and Garrett Meek, doctoral student in chemistry.



Contacts and sources:
Tom Oswald
Michigan State University

Climate Change Will Threaten Fish By Drying Out Southwest U.S. Streams, Study Predicts

Fish species native to a major Arizona watershed may lose access to important segments of their habitat by 2050 as surface water flow is reduced by the effects of climate warming, new research suggests.

Most of these fish species, found in the Verde River Basin, are already threatened or endangered. Their survival relies on easy access to various resources throughout the river and its tributary streams. The species include the speckled dace (Rhinichthys osculus), roundtail chub (Gila robusta) and Sonora sucker (Catostomus insignis).

Speckled Dace
Credit: Wikipedia

A key component of these streams is hydrologic connectivity – a steady flow of surface water throughout the system that enables fish to make use of the entire watershed as needed for eating, spawning and raising offspring.

Models that researchers produced to gauge the effects of climate change on the watershed suggest that by the mid-21st century, the network will experience a 17 percent increase in the frequency of stream drying events and a 27 percent increase in the frequency of zero-flow days.

“We have portions of the channel that are going to dry more frequently and for longer periods of time,” said lead author Kristin Jaeger, assistant professor in The Ohio State University School of Environment and Natural Resources. “As a result, the network will become fragmented, contracting into isolated, separated pools.

Kristin Jaeger
Credit: OSU

“If water is flowing throughout the network, fish are able to access all parts of it and make use of whatever resources are there. But when systems dry down, temporary fragmented systems develop that force fish into smaller, sometimes isolated channel reaches or pools until dry channels wet up again.”

This study covers climate change’s effects on surface water availability from precipitation and temperature changes. It does not take into account any withdrawals of groundwater that will be needed during droughts to support the estimated 50 percent or more increase in Arizona’s population by 2050.

“These estimates are conservative,” said Jaeger, who conducted the study with co-authors Julian Olden and Noel Pelland of the University of Washington. The study is published in the Proceedings of the National Academy of Sciences.

The researchers used a rainfall runoff model, the Soil and Water Assessment Tool (SWAT), which incorporates the study basin’s elevation, terrain, soil, land use, vegetation coverage, and both current and future climate data, including precipitation and temperature.

“It’s a hydrological model that routes water received from precipitation through the landscape, a portion of which eventually becomes streamflow in the river,” Jaeger said. “We partitioned the watershed into many smaller pieces all linked to each other, with nodes placed 2 kilometers apart throughout the entire river network to evaluate if that portion of the river channel at an individual node supported streamflow for a given day.”

Jaeger describes the river network, as envisioned by this model, as a mosaic of wet and dry patches. Piecing data from all of those nodes together, the researchers established an index of connectivity for the entire watershed, which predicts that the mid-century and late-century climate will reduce connectivity by 6 to 9 percent over the course of a year and by up to 12 to 18 percent during spring spawning months.
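
A connectivity index of this general kind is straightforward to sketch. In the toy Python version below, nodes along a channel are wet (1) or dry (0) on a given day, and connectivity is scored as the longest contiguous wet run divided by channel length; even a few dry nodes fragment the network sharply. This is an illustrative stand-in, not the study's actual metric.

```python
import numpy as np

def connectivity_index(wet):
    """Fraction of the channel covered by the longest contiguous wet run."""
    longest, current = 0, 0
    for w in wet:
        current = current + 1 if w else 0
        longest = max(longest, current)
    return longest / len(wet)

# 50 nodes spaced along a channel (cf. the 2 km node spacing described above).
fully_wet = np.ones(50, dtype=int)
fragmented = fully_wet.copy()
fragmented[[10, 11, 30]] = 0  # three reaches dry out

print(connectivity_index(fully_wet))   # 1.00 -> whole network reachable
print(connectivity_index(fragmented))  # 0.38 -> largest fragment only
```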

“The index decreases that are predicted by the model will affect spawning the most,” said Jaeger, who also holds an appointment with the Ohio Agricultural Research and Development Center. “During the spring spawning period, fish are more mobile, traveling longer distances to access necessary habitat. Projected decreased connectivity compromises access to different parts of the network.”

Flowing portions of the system will diminish by between 8 and 20 percent in spring and early summer, leaving longer stretches of channel that will dry more frequently and for longer periods of time. These changes will reduce available habitat for fish and force them to travel longer distances for resources once channels rewet, Jaeger said.

The fish are already subject to stressors on the system, including both surface and groundwater extraction for irrigation and drinking water, loss of habitat and the introduction of nonnative species that prey on the native fish, Jaeger noted. The overall system’s connectivity is already compromised because of existing dry conditions in the American Southwest.

“These fish are important cogs in the wheel of this greater ecosystem,” Jaeger said. “Loss of endemic species is a big deal in and of itself, and native species evaluated in this study are particularly evolved to this watershed. In this river network that currently supports a relatively high level of biodiversity, the suite of endemic fish species are filling different niches in the ecosystem, which allows the system to be more resilient to disturbances such as drought.

“If species are pushed over the edge to extinction, then what they bring to the ecosystem will be lost and potentially very difficult to replace.”

This project was funded by the Department of Defense Strategic Environmental Research and Development Program.


Contacts and sources:
Emily Caldwell
Ohio State University

First Indirect Evidence Of So-Far Undetected Strange Baryons

New supercomputing calculations provide the first evidence that particles predicted by the theory of quark-gluon interactions but never before observed are being produced in heavy-ion collisions at the Relativistic Heavy Ion Collider (RHIC), a facility that is dedicated to studying nuclear physics.

Brookhaven theoretical physicist Swagato Mukherjee
Credit: BNL 

These heavy strange baryons, containing at least one strange quark, still cannot be observed directly, but instead make their presence known by lowering the temperature at which other strange baryons "freeze out" from the quark-gluon plasma (QGP) discovered and created at RHIC, a U.S. Department of Energy (DOE) Office of Science user facility located at DOE's Brookhaven National Laboratory.

RHIC is one of just two places in the world where scientists can create and study a primordial soup of unbound quarks and gluons—akin to what existed in the early universe some 14 billion years ago. The research is helping to unravel how these building blocks of matter became bound into hadrons, particles composed of two or three quarks held together by gluons, the carriers of nature's strongest force.

"Baryons, which are hadrons made of three quarks, make up almost all the matter we see in the universe today," said Brookhaven theoretical physicist Swagato Mukherjee, a co-author on a paper describing the new results in Physical Review Letters.

"The theory that tells us how this matter forms—including the protons and neutrons that make up the nuclei of atoms—also predicts the existence of many different baryons, including some that are very heavy and short-lived, containing one or more heavy 'strange' quarks. Now we have indirect evidence from our calculations and comparisons with experimental data at RHIC that these predicted higher mass states of strange baryons do exist," he said.

Berndt Mueller, Associate Laboratory Director for Nuclear and Particle Physics at Brookhaven, added, "This finding is particularly remarkable because strange quarks were one of the early signatures of the formation of the primordial quark-gluon plasma. Now we're using this QGP signature as a tool to discover previously unknown baryons that emerge from the QGP and could not be produced otherwise."
 
Freezing point depression and supercomputing calculations

The evidence comes from an effect on the thermodynamic properties of the matter that nuclear physicists detect coming out of collisions at RHIC. Specifically, the scientists observe certain more-common strange baryons (omega baryons, cascade baryons, lambda baryons) "freezing out" of RHIC's quark-gluon plasma at a lower temperature than would be expected if the predicted extra-heavy strange baryons didn't exist.

"It's similar to the way table salt lowers the freezing point of liquid water," said Mukherjee. "These 'invisible' hadrons are like salt molecules floating around in the hot gas of hadrons, making other particles freeze out at a lower temperature than they would if the 'salt' wasn't there."

To see the evidence, the scientists performed calculations using lattice QCD, a technique that uses points on an imaginary four-dimensional lattice (three spatial dimensions plus time) to represent the positions of quarks and gluons, and complex mathematical equations to calculate interactions among them, as described by the theory of quantum chromodynamics (QCD).

"The calculations tell you where you have bound or unbound quarks, depending on the temperature," Mukherjee said.

The scientists were specifically looking for fluctuations of conserved baryon number and strangeness and exploring how the calculations fit with the observed RHIC measurements at a wide range of energies.

The calculations show that including the predicted but "experimentally uncharted" strange baryons yields a better fit with the data, providing the first evidence that these so-far unobserved particles exist and exert their effect on the freeze-out temperature of the observable particles.

These findings are helping physicists quantitatively plot the points on the phase diagram that maps out the different phases of nuclear matter, including hadrons and quark-gluon plasma, and the transitions between them under various conditions of temperature and density.

"To accurately plot points on the phase diagram, you have to know what the contents are on the bound-state, hadron side of the transition line—even if you haven't seen them," Mukherjee said. "We've found that the higher mass states of strange baryons affect the production of ground states that we can observe. And the line where we see the ordinary matter moves to a lower temperature because of the multitude of higher states that we can't see."

The research was carried out by the Brookhaven Lab's Lattice Gauge Theory group, led by Frithjof Karsch, in collaboration with scientists from Bielefeld University, Germany, and Central China Normal University. The supercomputing calculations were performed using GPU-clusters at DOE's Thomas Jefferson National Accelerator Facility (Jefferson Lab), Bielefeld University, Paderborn University, and Indiana University with funding from the Scientific Discovery through Advanced Computing (SciDAC) program of the DOE Office of Science (Nuclear Physics and Advanced Scientific Computing Research), the Federal Ministry of Education and Research of Germany, the German Research Foundation, the European Commission Directorate-General for Research & Innovation and the GSI BILAER grant. The experimental program at RHIC is funded primarily by the DOE Office of Science.

Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.


Contacts and sources: 
Karen McNulty Walsh
DOE/Brookhaven National Laboratory


Seafood Substitutions Can Expose Consumers To Unexpectedly High Mercury

New measurements from fish purchased at retail seafood counters in 10 different states show the extent to which mislabeling can expose consumers to unexpectedly high levels of mercury, a harmful pollutant.

Fishery stock "substitutions"—which falsely present a fish of the same species, but from a different geographic origin—are the most dangerous mislabeling offense, according to new research by University of Hawai‘i at Mānoa scientists.

Chilean sea bass fillet 
Photo courtesy Flickr user Artizone

“Accurate labeling of seafood is essential to allow consumers to choose sustainable fisheries,” said UH Mānoa biologist Peter B. Marko, lead author of the new study published in the scientific journal PLOS One. “But consumers also rely on labels to protect themselves from unhealthy mercury exposure. Seafood mislabeling distorts the true abundance of fish in the sea, defrauds consumers, and can cause unwanted exposure to harmful pollutants such as mercury.”

The study included two kinds of fish: those labeled as Marine Stewardship Council- (MSC-) certified Chilean sea bass, and those labeled simply as Chilean sea bass (uncertified). The MSC-certified version is supposed to be sourced from the Southern Ocean waters of South Georgia, near Antarctica, far away from man-made sources of pollution. MSC-certified fish is often favored by consumers seeking sustainably harvested seafood but is also potentially attractive given its consistently low levels of mercury.

In a previous study, the scientists had determined that fully 20 percent of fish purchased as Chilean sea bass were not genetically identifiable as such. Further, of those Chilean sea bass positively identified using DNA techniques, 15 percent had genetic markers that indicated that they were not sourced from the South Georgia fishery.

In the new study, the scientists used the same fish samples to collect detailed mercury measurements. When they compared the mercury in verified, MSC-certified sea bass with the mercury levels of verified, non-certified sea bass, they found no significant difference in the levels. That’s not the story you would have expected based on what is known about geographic patterns of mercury accumulation in Chilean sea bass.

Fish market in Oahu's Chinatown
Photo courtesy Flickr user Michelle Lee.

“What’s happening is that the species are being substituted,” Marko explained. “The ones that are substituted for MSC-certified Chilean sea bass tend to have very low mercury, whereas those substituted for uncertified fish tend to have very high mercury. These substitutions skew the pool of fish used for MSC comparison purposes, making certified and uncertified fish appear to be much more different than they actually are.”
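
The skew Marko describes is easy to reproduce in a small simulation. In the Python sketch below, genuine certified and uncertified fish differ only modestly in mercury, but low-mercury substitutes in the certified pool and high-mercury substitutes in the uncertified pool more than double the apparent gap. All mercury values are invented, in arbitrary units.

```python
import numpy as np

rng = np.random.default_rng(2)

# Genuine fish: a modest true difference between the two labels (invented).
true_certified = rng.normal(0.30, 0.05, 80)    # verified South Georgia fish
true_uncertified = rng.normal(0.40, 0.05, 80)  # verified, other stocks

# Substitutions: ~20% of each retail pool is a different fish entirely.
certified_pool = np.concatenate(
    [true_certified, rng.normal(0.10, 0.02, 20)])    # low-Hg substitutes
uncertified_pool = np.concatenate(
    [true_uncertified, rng.normal(0.90, 0.10, 20)])  # high-Hg substitutes

print("true difference:    ",
      round(true_uncertified.mean() - true_certified.mean(), 2))
print("apparent difference:",
      round(uncertified_pool.mean() - certified_pool.mean(), 2))
```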

But there’s another confounding factor. Even within the verified, MSC-certified Chilean sea bass samples, certain fish had very high mercury levels—up to 2 or 3 times higher than expected, and sometimes even greater than some countries’ import limits.

Marko and his team again turned to genetics to learn more about these fishes’ true nature. “It turns out that the fish with unexpectedly high mercury originated from some fishery other than the certified fishery in South Georgia,” said Marko. “Most of these fish had mitochondrial DNA that indicated they were from Chile. Thus, fishery stock substitutions are also contributing to the pattern by making MSC-certified fish appear to have more mercury than they really should have.”

The bottom line: Most consumers already know that mercury levels vary between species, and many public outreach campaigns have helped educate the public about which fish species to minimize or avoid. Less appreciated is the fact that mercury varies considerably within a species.

“Because mercury accumulation varies within a species’ geographic range, according to a variety of environmental factors, the location where the fish is harvested matters a great deal,” Marko said.

“Although on average MSC-certified fish is a healthier option than uncertified fish, with respect to mercury contamination, our study shows that fishery-stock substitutions can result in a larger proportional increase in mercury,” Marko said. “We recommend that consumer advocates take a closer look at the variation in mercury contamination depending on the geographic source of the fishery stock when they consider future seafood consumption guidelines.”


Contacts and sources:
Peter Marko, Associate Professor, Biology
Talia Ogliore, PIO
University of Hawaiʻi at Mānoa


Citation:  Marko PB, Nance HA, van den Hurk P (2014) Seafood Substitutions Obscure Patterns of Mercury Contamination in Patagonian Toothfish (Dissostichus eleginoides) or “Chilean Sea Bass”. PLoS ONE 9(8): e104140. doi: 10.1371/journal.pone.0104140

Has The Puzzle Of Rapid Climate Change In The Last Ice Age Been Solved?


How rapid temperature changes might have occurred during times when the Northern Hemisphere ice sheets were at intermediate sizes  
The Northern Hemisphere in a cold (stadial) phase.

Map: Alfred-Wegener-Institut 
During the last ice age a large part of North America was covered with a massive ice sheet up to 3 km thick. The water stored in this ice sheet is part of the reason why the sea level was then about 120 meters lower than today.

A new report published in Nature shows that small variations in the climate system can result in dramatic temperature changes.

Over the past one hundred thousand years cold temperatures largely prevailed over the planet in what is known as the last ice age. However, the cold period was repeatedly interrupted by much warmer climate conditions. Scientists have long attempted to find out why these drastic temperature jumps of up to ten degrees took place in the far northern latitudes within just a few decades.

Now, for the first time, a group of researchers at the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI), has been able to reconstruct these climate changes during the last ice age using a series of model simulations. The surprising finding is that minor variations in ice sheet size can be sufficient to trigger abrupt climate changes.

The Northern Hemisphere in a warm phase (a brief, warm interstadial phase during glacial climates).
Map: Alfred-Wegener-Institut

The new study was published online in the scientific journal Nature last week and will be appearing in the 21 August print issue.

Young Chinese scientist Xu Zhang, lead author of the study who undertook his PhD at the Alfred Wegener Institute, explains, “The rapid climate changes known in the scientific world as Dansgaard-Oeschger events were limited to a period of time from 110,000 to 23,000 years before present. The abrupt climate changes did not take place at the extreme low sea levels, corresponding to the time of maximum glaciation 20,000 years ago, nor at high sea levels such as those prevailing today - they occurred during periods of intermediate ice volume and intermediate sea levels.” 

The results presented by the AWI researchers, obtained by comparing simulated model data with data retrieved from ice cores and marine sediments, can explain the history of climate changes during glacial periods.

During the cold stadial periods of the last ice age, massive ice sheets covered northern parts of North America and Europe. Strong westerly winds drove the Arctic sea ice southward, even as far as the French coast. Since the extended ice cover over the North Atlantic prevented the exchange of heat between the atmosphere and the ocean, the strong driving forces for the ocean currents that prevail today were lacking. Ocean circulation, which is a powerful “conveyor belt” in the world’s oceans, was thus much weaker than at present, and consequently transported less heat to northern regions.

During the extended cold phases the ice sheets continued to thicken. When higher ice sheets prevailed over North America, typical in periods of intermediate sea levels, the prevailing westerly winds split into two branches. The major wind field ran to the north of the so-called Laurentide Ice Sheet and ensured that the sea ice boundary off the European coast shifted to the north. 

Ice-free seas permit heat exchange to take place between the atmosphere and the ocean. At the same time, the southern branch of the northwesterly winds drove warmer water into the ice-free areas of the northeast Atlantic and thus amplified the transportation of heat to the north. 

The modified conditions stimulated enhanced circulation in the ocean. Consequently, a thicker Laurentide Ice Sheet over North America resulted in increased ocean circulation and therefore greater transportation of heat to the north. The climate in the Northern Hemisphere became dramatically warmer within a few decades until, due to the retreat of the glaciers over North America and the renewed change in wind conditions, it began to cool off again.

“Using the simulations performed with our climate model, we were able to demonstrate that the climate system can respond to small changes with abrupt climate swings,” explains Professor Gerrit Lohmann, leader of the Paleoclimate Dynamics group at the Alfred Wegener Institute, Germany. 

Schematic depiction of current climate conditions in the Northern Hemisphere.
Map: Alfred-Wegener-Institut

The model simulations demonstrate that today’s climate is much more robust against such changes than the climate of phases with intermediate ice thickness and intermediate sea levels. It was during those phases of the last ice age that the most rapid temperature swings in the Northern Hemisphere took place.

Lohmann illustrates the new study’s significance with regard to contemporary climate change: “At medium sea levels, powerful forces, such as the dramatic acceleration of polar ice cap melting, are not necessary to result in abrupt climate shifts and associated drastic temperature changes.”

At present, the extent of Arctic sea ice is far less than during the last glacial period. The Laurentide Ice Sheet, the major driving force for ocean circulation during the glacials, has also disappeared. Climate changes following the pattern of the last ice age are therefore not to be anticipated under today’s conditions.

“There are apparently some situations in which the climate system is more resistant to change while in others the system tends toward strong fluctuations,” summarises Gerrit Lohmann. “In terms of the Earth’s history, we are currently in one of the climate system’s more stable phases. The preconditions which gave rise to rapid temperature changes during the last ice age do not exist today. But this does not mean that sudden climate changes can be excluded in the future.”



Contacts and sources:
Sina Loeschke
Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research

Citation:  Xu Zhang, Gerrit Lohmann, Gregor Knorr, Conor Purcell:Abrupt glacial climate shifts controlled by ice sheet changes. Nature, DOI: 10.1038/nature13592

Why Global Warming Is Taking A Break

The average temperature on Earth has barely risen over the past 16 years. ETH researchers have now found out why. And they believe that global warming is likely to continue again soon.

The number of sunspots (white area here) varies in multi-year cycles. As a result, solar irradiance, which influences the Earth's climate, also fluctuates. The photo shows a UV image of the sun.

 Image: Trace Project / NASA 

Global warming is currently taking a break: whereas global temperatures rose drastically into the late 1990s, the global average temperature has risen only slightly since 1998 – surprising, considering scientific climate models predicted considerable warming due to rising greenhouse gas emissions. 

Climate sceptics used this apparent contradiction to question climate change per se – or at least the harmful potential of greenhouse gases – as well as the validity of the climate models. Meanwhile, the majority of climate researchers continued to emphasise that the short-term ‘warming hiatus’ could largely be explained on the basis of current scientific understanding and did not contradict longer term warming.

Researchers have been looking into the possible causes of the warming hiatus over the past few years. For the first time, Reto Knutti, Professor of Climate Physics at ETH Zurich, has systematically examined all current hypotheses together with a colleague. In a study published in the latest issue of the journal Nature Geoscience, the researchers conclude that two important factors are equally responsible for the hiatus.

El Niño warmed the Earth

One of the important reasons is natural climate fluctuations, of which the weather phenomena El Niño and La Niña in the Pacific are the most important and well known. "1998 was a strong El Niño year, which is why it was so warm that year," says Knutti. In contrast, the counter-phenomenon La Niña has made the past few years cooler than they would otherwise have been.

Although climate models generally take such fluctuations into account, it is impossible to predict the year in which these phenomena will emerge, says the climate physicist. To clarify, he uses the stock market as an analogy: "When pension funds invest the pension capital in shares, they expect to generate a profit in the long term." 

At the same time, they are aware that their investments are exposed to price fluctuations and that performance can also be negative in the short term. However, what finance specialists and climate scientists and their models are not able to predict is when exactly a short-term economic downturn or a La Niña year will occur.

Longer solar cycles

According to the study, the second important reason for the warming hiatus is that solar irradiance has been weaker than predicted in the past few years. This is because the identified fluctuations in the intensity of solar irradiance are unusual at present: whereas the so-called sunspot cycles each lasted eleven years in the past, for unknown reasons the last period of weak solar irradiance lasted 13 years. 

Furthermore, several volcanic eruptions, such as Eyjafjallajökull in Iceland in 2010, have increased the concentration of suspended particles (aerosols) in the atmosphere, which has further weakened the solar irradiance arriving at the Earth's surface.

The scientists drew their conclusions from corrective calculations of climate models. In all climate simulations, they looked for periods in which the El Niño/La Niña patterns corresponded to the measured data from the years 1997 to 2012. With a combination of over 20 periods found, they were able to arrive at a realistic estimate of the influence of El Niño and La Niña. They also retroactively applied in the model calculations the actual measured values for solar activity and aerosol concentration in the Earth's atmosphere. Model calculations corrected in this way match the measured temperature data much more closely.
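
The window-matching step can be sketched in a few lines of Python: scan a long simulated ENSO record for 16-year stretches that correlate best with the observed 1997-2012 sequence, and keep the best matches for averaging. The series here are synthetic random data; this illustrates the selection idea only, not the study's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
obs_enso = rng.standard_normal(16)      # stand-in for observed ENSO, 1997-2012
model_enso = rng.standard_normal(2000)  # stand-in for a long simulated series

def best_windows(model, obs, n_keep=20):
    """Return the n_keep windows of length len(obs) that best match obs."""
    n = len(obs)
    scores = [(np.corrcoef(model[i:i + n], obs)[0, 1], i)
              for i in range(len(model) - n + 1)]
    return sorted(scores, reverse=True)[:n_keep]

for r, start in best_windows(model_enso, obs_enso)[:5]:
    print(f"model years {start}-{start + 15}: correlation {r:.2f}")
```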

Incomplete measured data

The discrepancy between the climate models and measured data over the past 16 years cannot solely be attributed to the fact that these models predict too much warming, says Knutti. The interpretation of the official measured data should also be critically scrutinised.

According to Knutti, the measured data is likely to be too low, since the global average temperature is estimated only from values obtained at weather stations on the ground, and these do not exist everywhere on Earth. From satellite data, for example, scientists know that the Arctic region in particular has become warmer over the past years, but because there are no weather stations in that area, this warming is under-represented in the ground-based record. As a result, the specified average temperature is too low.

Last year, British and Canadian researchers proposed an alternative temperature curve with higher values, in which they incorporated estimated temperatures from satellite data for regions with no weather stations. If the model data is corrected downwards, as suggested by the ETH researchers, and the measurement data is corrected upwards, as suggested by the British and Canadian researchers, then the model and actual observations are very similar.

Warming to recommence

Despite the warming hiatus, Knutti is convinced there is no reason to doubt either the existing calculations for the climate activity of greenhouse gases or the latest climate models. "Short-term climate fluctuations can easily be explained. They do not alter the fact that the climate will become considerably warmer in the long term as a result of greenhouse gas emissions," says Knutti. He believes that global warming will recommence as soon as solar activity, aerosol concentrations in the atmosphere and weather phenomena such as El Niño naturally start returning to the values of previous decades.


Contacts and sources:
Fabio Bergamin
ETH Zurich

Citation: Huber M, Knutti R: Natural variability, radiative forcing and climate response in the recent hiatus reconciled. Nature Geoscience, online publication 17 August 2014, doi: 10.1038/ngeo2228

Love Makes Sex Better For Most Women Says Study

Love and commitment can make sex physically more satisfying for many women, according to a Penn State Abington sociologist.

In a series of interviews, women between the ages of 20 and 68, from a range of backgrounds and all involved in heterosexual relationships, said that they believed love was necessary for maximum satisfaction in both sexual relationships and marriage. The benefits of being in love with a sexual partner are more than just emotional. Most of the women in the study said that love made sex physically more pleasurable.

Credit: Wikimedia Commons

"Women said that they connected love with sex and that love actually enhanced the physical experience of sex," said Beth Montemurro, associate professor of sociology.

Women who loved their sexual partners also said they felt less inhibited and more willing to explore their sexuality.

"When women feel love, they may feel greater sexual agency because they not only trust their partners but because they feel that it is OK to have sex when love is present," Montemurro said.

While 50 women of the 95 that were interviewed said that love was not necessary for sex, only 18 of the women unequivocally believed that love was unnecessary in a sexual relationship.

Older women who were interviewed indicated that this connection between love, sex and marriage remained important throughout their lifetimes, not just in certain eras of their lives.

The connection between love and sex may show how women are socialized to see sex as an expression of love, Montemurro said. Despite decades of the women's rights movement and an increased awareness of women's sexual desire, the media continue to send a strong cultural message for women to connect sex and love and to look down on girls and women who have sex outside of committed relationships.

"On one hand, the media may seem to show that casual sex is OK, but at the same time, movies and television, especially, tend to portray women who are having sex outside of relationships negatively," said Montemurro.

In a similar way, the media often portray marriage as largely sexless, even though the participants in the study said that sex was an important part of their marriage, according to Montemurro, who presented her findings today (Aug. 19) at the annual meeting of the American Sociological Association.

"For the women I interviewed, they seemed to say you need love in sex and you need sex in marriage," said Montemurro.

From September 2008 to July 2011, Montemurro conducted in-depth interviews with 95 women who lived in Pennsylvania, New Jersey and New York. The interviews generally lasted 90 minutes.

Although some of the women who were interviewed said they had sexual relationships with women, most of the women were heterosexual and all were involved in heterosexual relationships.

Funds from the Career Development Professorship and the Rubin Fund supported this work.


Contacts and sources:
Matt Swayne
Penn State

Secrets Of How Worms Wriggle Uncovered

An engineer at the University of Liverpool has found how worms coordinate their movement even when the brain cannot communicate with the body.

Dr Paolo Paoletti, alongside his colleague at Harvard, Professor L Mahadevan, has developed a mathematical model for earthworms and insect larvae which challenges the traditional view of how these soft bodied animals get around.

Earthworm movement is controlled and influenced by the contours of the surface they are moving across
Credit: University of Liverpool

The most widely accepted model is that of the central pattern generator (CPG), which states that the central brain of these creatures generates rhythmic contraction and extension waves along the body. However, this doesn’t account for the fact that some of these invertebrates can move along even when their ventral nerve cord is cut.

Local control

Instead, Dr Paoletti and Professor Mahadevan hypothesised that there is a far greater role for the body’s mechanical properties and the local nerves which react to the surface that the animal is travelling across.

Dr Paoletti said: “When we analyse humans running there is clearly local control over movements as by the time nerve signals travel from the foot to the brain and back again, you will have taken three steps – and would otherwise probably have fallen over.”

“We see much the same in these soft bodied animals. Rather than generating a constant wave of contraction and expansion, their movement is controlled and influenced by the contours of the surface they are moving across.”
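
That local-control idea lends itself to a toy simulation. In the Python sketch below, each segment contracts purely in response to the segment ahead of it, and a travelling peristaltic wave emerges with no central controller; because every rule is local, severing a long-range connection changes nothing, consistent with the cut nerve cord observation above. The published model is a continuum neuromechanical theory; this is only a cartoon of the principle.

```python
# Toy peristalsis driven by purely local rules (no central pattern generator).
N_SEGMENTS = 12
REST, CONTRACTED = "-", "="

body = [REST] * N_SEGMENTS
body[0] = CONTRACTED  # a contraction starts at the head

for step in range(10):
    print("".join(body))
    # Local reflex: a segment contracts only when its forward neighbour
    # contracted on the previous step; everything else relaxes.
    body = [REST] + [CONTRACTED if body[i - 1] == CONTRACTED else REST
                     for i in range(1, N_SEGMENTS)]
```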

Improving robots

Dr Paoletti, from the School of Engineering, and Professor Mahadevan created a mathematical and computational theory to explain this, then tested it under different circumstances and conditions, using simulated worms of different masses. They now believe that this new model could be of use in robotics.

He said: “Replicating the movement of animals in robots is very difficult and often involves the use of many sensors. This new model avoids using sophisticated sensors and control strategies, and could be used to improve robots used for entering confined spaces or which have to deal with difficult terrain.”

The paper, ‘A proprioceptive neuromechanical theory of crawling’, was published in the journal Proceedings of the Royal Society B.

Contacts and sources:
Jamie Brown
University of Liverpool

Life On Mars? Implications Of A Newly Discovered Mineral-Rich Structure

A new ovoid structure discovered in the Nakhla Martian meteorite is made of nanocrystalline iron-rich clay, contains a variety of minerals, and shows evidence of undergoing a past shock event from impact, with resulting melting of the permafrost and mixing of surface and subsurface fluids.


Based on the results of a broad range of analytical studies to determine the origin of this new structure, scientists present the competing hypotheses for how this ovoid formed, point to the most likely conclusion, and discuss how these findings impact the field of astrobiology in a fascinating article published in Astrobiology, a peer-reviewed journal from Mary Ann Liebert, Inc., publishers. The Open Access article is available on the Astrobiology website.

In the article, "A Conspicuous Clay Ovoid in Nakhla: Evidence for Subsurface Hydrothermal Alteration on Mars with Implications for Astrobiology," Elias Chatzitheodoridis, National Technical University of Athens, Greece, and Sarah Haigh and Ian Lyon, the University of Manchester, UK, describe the use of tools including electron microscopy, x-ray, and spectroscopy to analyze the ovoid structure. 

While the authors do not believe the formation of this structure involved biological materials, that is a possible hypothesis, and they note that evidence exists supporting the presence of niche environments in the Martian subsurface that could support life.

"This study illustrates the importance of correlating different types of datasets when attempting to discern whether something in rock is a biosignature indicative of life," says Sherry L. Cady, PhD, Editor-in-Chief of Astrobiology and Chief Scientist at the Pacific Northwest National Laboratory. 

"Though the authors couldn't prove definitively that the object of focus was evidence of life, their research strategy revealed a significant amount of information about the potential for life to inhabit the subsurface of Mars," says Cady.  


Contacts and sources:
Mary Ann Liebert, Inc., Publishers


Citation: A Conspicuous Clay Ovoid in Nakhla: Evidence for Subsurface Hydrothermal Alteration on Mars with Implications for Astrobiology
Chatzitheodoridis Elias, Haigh Sarah, and Lyon Ian. Astrobiology. August 2014, 14(8): 651-693. doi:10.1089/ast.2013.1069.
Published in Volume: 14 Issue 8: August 7, 2014
Online Ahead of Print: July 21, 2014

Antibiotics In Early Life May Alter Immunity Long-Term

A new UBC study aims to help scientists understand how different antibiotics affect bacteria that play a positive role in promoting a healthy immune system.
Photo: podphoto, iStock.

New University of British Columbia research found that receiving antibiotic treatments early in life can increase susceptibility to specific diseases later on.

Most bacteria living in the gut play a positive role in promoting a healthy immune system, but antibiotic treatments often do not discriminate between good and bad bacteria. The study published today in Journal of Allergy and Clinical Immunology helps scientists understand how different antibiotics affect good bacteria.

Kelly McNagny
Credit: UBC

“This is the first step to understanding which bacteria are absolutely necessary to develop a healthy immune system later in life,” says Kelly McNagny, a professor in the Dept. of Medical Genetics who led the research along with UBC microbiologist Brett Finlay.

The researchers tested the impact of two antibiotics, vancomycin and streptomycin, on newborn mice. They found that streptomycin increased susceptibility to a disease known as hypersensitivity pneumonitis later in life, but vancomycin had no effect.

The difference in each antibiotic’s long-term effects can be attributed to how they changed the bacterial ecosystem in the gut. Hypersensitivity pneumonitis is an allergic disease found in people with occupations such as farming, sausage-making, and cleaning hot tubs.

The researchers stress that infants should be treated with antibiotics when needed, but they hope these results will help pinpoint which bacteria make us less susceptible to disease. This could open up the possibility of boosting helpful bacteria through the use of probiotics.

“Probiotics could be the next big trend in parenting because once you know which bacteria prevent disease, you can make sure that children get inoculated with those bacteria,” says McNagny.

This research was supported by the Canadian Institutes of Health Research and AllerGen NCE, a national research network funded by Industry Canada through the Networks of Centres of Excellence (NCE) Program.


Contacts and sources:
University of British Columbia

Monday, August 18, 2014

Did An Exceptional Iceberg Sink The Titanic?

The iceberg thought to have been hit by Titanic, photographed by the chief steward of the liner Prinz Adalbert on the morning of 15 April 1912. The iceberg was reported to have a streak of red paint from a ship's hull along its waterline on one side.
Credit: Navigation Center, United States Coast Guard

While the sinking of the Titanic in 1912 is typically blamed on human, design and construction errors, a new Significance paper points to two other unfavorable factors outside human control: there were more icebergs than normal that year, and weather conditions had driven them further south, and earlier in the year, than usual.

The average sea-ice limit for April 1979–2013 (dotted), a typical Newfoundland maximum sea-ice limit for the early twentieth century (dashed and denoted as 1912) and the maximum iceberg limit for 1900–2000 are shown, in addition to the 48°N line. The location of the Titanic collision is shown by an "X". The blue shading shows depth, with the lightest blue denoting the continental shelf (< 1000 m depth)

Credit: Wiley

The paper also notes that iceberg discharge from glaciers is increasing, with more heavy iceberg years since the 1980s than before, and increasing global warming will likely cause this trend to continue.

“As use of the Arctic increases in the future with declining sea-ice, and as polar ice sheets are increasingly losing mass as well, the iceberg risk is likely to increase in the future, rather than decline,” said co-author Professor Grant Bigg.


Contacts and sources:
Nicole Weingartner

Invasion Of The Americas By Mosquito-Borne Virus Likely

While media attention has been focused recently on coronavirus cases in the Arabian peninsula and the Ebola outbreak in West Africa, experts note that another threat lies in the spread of Chikungunya fever, an illness that is transmitted by mosquitoes and can cause fever, joint and muscle pain, headaches, and rashes. While it does not often cause death, the symptoms can be severe and disabling, with no treatment available.

An A. aegypti mosquito biting a person
Credit: Wikipedia

The potential for worldwide spread of Chikungunya virus is much higher than the risk of dissemination of Middle East respiratory syndrome coronavirus or Ebola virus, and the number of cases expected from the introduction of Chikungunya virus into the Americas, Europe, or both is immeasurably higher.

Attention and funding should be directed to building up surveillance systems, organizing international coordination efforts, and rapidly developing countermeasures, according to a Clinical Microbiology and Infection article.

"Concerning worldwide globalization of Chikungunya, the question is not whether it can happen, but when it will happen," said lead author Dr. Remi Charrel.

Chikungunya virus is transmitted to people by mosquitoes. The most common symptoms of chikungunya virus infection are fever and joint pain. Other symptoms may include headache, muscle pain, joint swelling, or rash. Outbreaks have occurred in countries in Africa, Asia, Europe, and the Indian and Pacific Oceans.

In late 2013, chikungunya virus was found for the first time in the Americas on islands in the Caribbean. There is a risk that the virus will be imported to new areas by infected travelers. There is no vaccine to prevent or medicine to treat chikungunya virus infection. 

Travelers can protect themselves by preventing mosquito bites. When traveling to countries with chikungunya virus, use insect repellent, wear long sleeves and pants, and stay in places with air conditioning or that use window and door screens.

Contacts and sources:
Nicole Weingartner
Wiley

Pigs' Hearts Transplanted Into Baboon Hosts Remain Viable More Than A Year

NIH cardiac surgical scientists able to achieve prolonged survival by a combination of genetic engineering and new methods of suppressing the immune response

Investigators from the National Heart, Lung, and Blood Institute (NHLBI) of the National Institutes of Health (NIH) have successfully transplanted hearts from genetically engineered piglets into baboons' abdomens and had the hearts survive for more than one year, twice as long as previously reported.


This was achieved by using genetically engineered porcine donors and a more focused immunosuppression regimen in the baboon recipients, according to a study published in The Journal of Thoracic and Cardiovascular Surgery, an official publication of the American Association for Thoracic Surgery.

Cardiac transplantation is the treatment of choice for end stage heart failure. According to the NHLBI, approximately 3,000 people in the US are on the waiting list for a heart transplant, while only 2,000 donor hearts become available each year. For cardiac patients currently waiting for organs, mechanical assist devices are the only options available. These devices, however, are imperfect and experience issues with power supplies, infection, and problems with blood clots and bleeding.

Transplantation using an animal organ, or xenotransplantation, has been proposed as a valid option to save human lives. "Until we learn to grow organs via tissue engineering, which is unlikely in the near future, xenotransplantation seems to be a valid approach to supplement human organ availability. Despite many setbacks over the years, recent genetic and immunologic advancements have helped revitalize progress in the xenotransplantation field," comments lead investigator Muhammad M. Mohiuddin, MD, of the Cardiothoracic Surgery Research Program at the NHLBI.

Dr. Mohiuddin's group and other investigators have developed techniques on two fronts to overcome some of the roadblocks that previously hindered successful xenotransplantation. The first advance was the ability to produce genetically engineered pigs as a source of donor organs by NHLBI's collaborator, Revivicor, Inc. The pigs had the genes that cause adverse immunologic reactions in humans "knocked out" and human genes that make the organ more compatible with human physiology were inserted. The second advance was the use of target-specific immunosuppression, which limits rejection of the transplanted organ rather than the usual generalized immunosuppression, which is more toxic.

Pigs were chosen because their anatomy is compatible with that of humans and they have a rapid breeding cycle, among other reasons. They are also widely available as a source of organs.

In this study, researchers compared the survival of hearts from genetically engineered piglets that were organized into different experimental groups based on the genetic modifications introduced. The gene that synthesizes the enzyme alpha-1,3-galactosyltransferase was "knocked out" in all piglets, thus eliminating one immunologic rejection target.

The pig hearts also expressed one or two human transgenes to prevent blood from clotting. The transplanted hearts were attached to the circulatory systems of the host baboons, but placed in the baboons' abdomens. The baboons' own hearts, which were left in place, maintained circulatory function, and allowed the baboons to live despite the risk of organ rejection.

The researchers found that in one group (with a human gene), the average transplant survival was more than 200 days, dramatically surpassing the survival times of the other three groups (average survival 70 days, 21 days, and 80 days, respectively). Two of the five grafts in the long-surviving group stopped contracting on postoperative days 146 and 150, but the other three grafts were still contracting, at 200 to more than 500 days post-transplant, when the study was submitted for publication.

Prolonged survival was attributed to several modifications. This longest-surviving group was the only one that had the human thrombomodulin gene added to the pigs' genome. Dr. Mohiuddin explains that thrombomodulin expression helps avoid some of the microvascular clotting problems that were previously associated with organ transplantation.

Another difference was the type, strength, and duration of the antibody used for costimulation blockade to suppress the T and B cell immune response in the hosts. In several groups, longer transplant survival was observed with the use of anti-CD40 monoclonal antibodies, but the longest-surviving group was treated specifically with a high dose of a recombinant mouse-rhesus chimeric antibody (clone 2C10R4). In contrast, use of an anti-CD40 monoclonal antibody generated in a mouse (clone 3A8) did not extend survival. Anti-CD40 monoclonal antibodies also allow for faster recovery, says Dr. Mohiuddin.

No complications, including infections, were seen in the longest-surviving group. The researchers used surveillance video and telemetric monitoring to watch all groups for symptoms of complications such as abdominal bleeding, gastrointestinal bleeding, aspiration pneumonia, seizures, or blood disorders.

The goal of the current study was to evaluate the viability of the transplants. The researchers' next step is to use hearts from the genetically-engineered pigs with the most effective immunosuppression in the current experiments to test whether the pig hearts can sustain full life support when replacing the original baboon hearts.

"Xenotransplantation could help to compensate for the shortage of human organs available for transplant. Our study has demonstrated that by using hearts from genetically engineered pigs in combination with target-specific immunosuppression of recipient baboons, organ survival can be significantly prolonged. 

Based on the data from long-term surviving grafts, we are hopeful that we will be able to repeat our results in the life-supporting model. This has potential for paving the way for the use of animal organs for transplantation into humans," concludes Dr. Mohiuddin.


Contacts and sources:
Nicole Baritot
American Association for Thoracic Surgery

Artificial Cells Act Like The Real Thing

Cell-like compartments produce proteins and communicate with one another, similar to natural biological systems

Imitation, they say, is the sincerest form of flattery, but mimicking the intricate networks and dynamic interactions that are inherent to living cells is difficult to achieve outside the cell.

Now, as published in Science, Weizmann Institute scientists have created an artificial, network-like cell system that is capable of reproducing the dynamic behavior of protein synthesis. This achievement is not only likely to help gain a deeper understanding of basic biological processes, but it may, in the future, pave the way toward controlling the synthesis of both naturally-occurring and synthetic proteins for a host of uses.

(L-R) Eyal Karzbrun, Alexandra Tayar and Prof. Roy Bar-Ziv

Credit: Weizmann Institute of Science

The system, designed by PhD students Eyal Karzbrun and Alexandra Tayar in the lab of Prof. Roy Bar-Ziv of the Weizmann Institute’s Materials and Interfaces Department, in collaboration with Prof. Vincent Noireaux of the University of Minnesota, comprises multiple compartments “etched” onto a biochip.

These compartments – artificial cells, each a mere millionth of a meter in depth – are connected via thin capillary tubes, creating a network that allows the diffusion of biological substances throughout the system. Within each compartment, the researchers insert a cell genome – strands of DNA designed and controlled by the scientists themselves. 

In order to translate the genes into proteins, the scientists relinquished control to the bacterium E. coli: Filling the compartments with E. coli cell extract – a solution containing the entire bacterial protein-translating machinery, minus its DNA code – the scientists were able to sit back and observe the protein synthesis dynamics that emerged.

By coding two regulatory genes into the sequence, the scientists created a protein synthesis rate that was periodic, spontaneously switching from periods of being “on” to “off.” 

The amount of time each period lasted was determined by the geometry of the compartments. Such periodic behavior – a primitive version of cell cycle events – emerged in the system because the synthesized proteins could diffuse out of the compartment through the capillaries, mimicking natural protein turnover behavior in living cells. 

At the same time, fresh nutrients were continuously replenished, diffusing into the compartment and enabling the protein synthesis reaction to continue indefinitely. “The artificial cell system, in which we can control the genetic content and protein dilution times, allows us to study the relation between gene network design and the emerging protein dynamics. This is quite difficult to do in a living system,” says Karzbrun.

“The two-gene pattern we designed is a simple example of a cell network, but after proving the concept, we can now move forward to more complicated gene networks. One goal is to eventually design DNA content similar to a real genome that can be placed in the compartments.”
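The clock-setting role of geometry can be illustrated with a toy calculation. The sketch below (Python, with invented rates that are not from the paper) models a single protein synthesized at a constant rate and lost at a first-order rate set by how quickly it escapes the compartment; the escape rate fixes the protein's effective lifetime, and with it the timescale of any dynamics built on top.

    # Toy model of protein turnover in one compartment: constant synthesis at
    # rate beta, first-order loss at rate delta (set by compartment geometry).
    # Parameters are illustrative placeholders, not values from the study.
    import math

    def protein_level(beta, delta, t):
        """Analytic solution of dP/dt = beta - delta * P with P(0) = 0."""
        return (beta / delta) * (1.0 - math.exp(-delta * t))

    beta = 1.0                  # arbitrary synthesis rate (a.u. per minute)
    for delta in (0.02, 0.1):   # slow vs. fast escape through the capillary
        lifetime = 1.0 / delta  # effective protein lifetime in minutes
        print(f"delta={delta}: lifetime {lifetime:.0f} min, "
              f"plateau {protein_level(beta, delta, 10 * lifetime):.1f} a.u.")

A deeper compartment (slower escape) gives a longer protein lifetime and a higher plateau, which is one way to see why the oscillation period in the experiments tracked the geometry of the compartments.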

Fluorescent image of DNA (white squares) patterned in circular compartments connected by capillary tubes to the cell-free extract flowing in the channel at bottom. Compartments are 100 micrometers in diameter

Credit: Weizmann Institute of Science

The scientists then asked whether the artificial cells actually communicate and interact with one another like real cells. Indeed, they found that the synthesized proteins that diffused through the array of interconnected compartments were able to regulate genes and produce new proteins in compartments farther along the network. In fact, this system resembles the initial stages of morphogenesis – the biological process that governs the emergence of the body plan in embryonic development. 

“We observed that when we place a gene in a compartment at the edge of the array, it creates a diminishing protein concentration gradient; other compartments within the array can sense and respond to this gradient – similar to how morphogen concentration gradients diffuse through the cells and tissues of an embryo during early development. We are now working to expand the system and to introduce gene networks that will mimic pattern formation, such as the striped patterns that appear during fly embryogenesis,” explains Tayar.
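A rough sense of such a gradient can be had from a one-dimensional chain of compartments in which an edge compartment acts as a fixed source, neighbours exchange protein by diffusion, and every compartment loses protein to turnover. The following sketch uses invented rates and a generic discretization, not the actual chip geometry; at steady state the concentration falls off roughly exponentially with distance from the source.

    # Toy 1-D morphogen gradient: fixed source at compartment 0, nearest-
    # neighbour exchange, first-order loss everywhere. All rates invented.
    N = 10            # compartments in the chain
    D = 0.2           # per-step exchange rate between neighbours
    delta = 0.05      # per-step turnover/loss rate

    c = [0.0] * N
    c[0] = 1.0        # source compartment held at a fixed concentration

    for _ in range(5000):                             # relax to steady state
        new = c[:]
        for i in range(1, N):
            right = c[i + 1] if i + 1 < N else c[i]   # closed far end
            new[i] = c[i] + D * (c[i - 1] + right - 2 * c[i]) - delta * c[i]
        c = new
        c[0] = 1.0                                    # source replenished

    print(" ".join(f"{x:.2f}" for x in c))  # decays ~exponentially with distance

Compartments farther down the chain see an ever weaker signal, which is exactly the kind of positional information an embryo-like patterning network can read out.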

With the artificial cell system, according to Bar-Ziv, one can, in principle, encode anything: “Genes are like Lego in which you can mix and match various components to produce different outcomes; you can take a regulatory element from E. coli that naturally controls gene X, and produce a known protein; or you can take the same regulatory element but connect it to gene Y instead to get different functions that do not occur in nature.”

This research may, in the future, help advance the synthesis of fuels, pharmaceuticals, and chemicals, as well as the production of enzymes for industrial use.


Contacts and sources:
Yivsam Azgad
Weizmann Institute of Science

Toothless 'Dragon' Pterosaurs Dominated The Late Cretaceous Skies

A new study provides an exciting insight into the Late Cretaceous and the diversity and distribution of the toothless 'dragon' pterosaurs from the Azhdarchidae family. The research was published in the open access journal ZooKeys.

The azhdarchid pterosaurs derive their name from the Persian word for dragon, Aždarha. Interestingly, this derived and rather successful group of pterosaurs included some of the largest known flying animals of all time, with wingspans reaching between 10 and 12 m.

Azhdarchid pterosaurs
Credit:  Wikipedia

'Dragon' pterosaurs once had a worldwide distribution and were the last of their kind to survive on the planet, until some 66 million years ago. They dominated the skies of the Late Cretaceous and, unlike their predecessors, were characteristically toothless.

"This shift in dominance from toothed to toothless pterodactyloids apparently reflects some fundamental changes in Cretaceous ecosystems, which we still poorly understand," comments the author of the study Dr Alexander Averianov, Zoological Institute of the Russian Academy of Sciences.

Generally, the fossil record of pterosaurs is patchy and confined mostly to sedimentary deposits known as Konservat-Lagerstätten, where exceptional depositional conditions facilitated the preservation of fragile pterosaur bones. Unfortunately, such Lagerstätten are very rare for the Late Cretaceous, when most of the evolutionary history of Azhdarchidae took place, which makes these exciting creatures exceptionally hard to study.

"Azhdarchidae currently represent a real nightmare for paleontologists: most taxa are known from few fragmentary bones, which often do not overlap between named taxa, the few articulated skeletons are poorly preserved, and some of the best available material has remained undescribed for forty years." explains Dr Averianov about the difficulties studying the group.

Despite these difficulties, the number of localities where azhdarchid pterosaurs have been found is impressive and undoubtedly reflects the important role they played in Cretaceous ecosystems. These flying giants likely inhabited a large variety of environments, but seem to have been abundant near large lakes and rivers and most common in nearshore marine environments.

Contacts and sources:
Alexander Averianov
Pensoft Publishers

Citation: Averianov A (2014) Review of taxonomy, geographic distribution, and paleoenvironments of Azhdarchidae (Pterosauria). ZooKeys 432: 1-107. doi: 10.3897/zookeys.432.7913

Sun's Activity Influences Natural Climate Change

For the first time, a research team has been able to reconstruct the solar activity at the end of the last ice age, around 20,000-10,000 years ago, by analysing trace elements in ice cores in Greenland and cave formations from China.

Credit: UCAR

During the last glacial maximum, Sweden was covered in a thick ice sheet that stretched all the way down to northern Germany and sea levels were more than 100 metres lower than they are today, because the water was frozen in the extensive ice caps. 

The new study shows that the sun's variation influences the climate in a similar way regardless of whether the climate is extreme, as during the Ice Age, or as it is today.

"The study shows an unexpected link between solar activity and climate change. It shows both that changes in solar activity are nothing new and that solar activity influences the climate, especially on a regional level. Understanding these processes helps us to better forecast the climate in certain regions", said Raimund Muscheler, Lecturer in Quaternary Geology at Lund University and co-author of the study.

The sun's impact on the climate is a matter of current debate, especially as regards the less-than-expected global warming of the past 15 years. There is still a lot of uncertainty as to how the sun affects the climate, but the study suggests that the most important factor is not direct solar energy but rather its indirect effects on atmospheric circulation.

"Reduced solar activity could lead to colder winters in Northern Europe. This is because the sun's UV radiation affects the atmospheric circulation. Interestingly, the same processes lead to warmer winters in Greenland, with greater snowfall and more storms. The study also shows that the various solar processes need to be included in climate models in order to better predict future global and regional climate change", said Dr Muscheler.


Contacts and sources:
Raimund Muscheler
Lund University

Citation: Persistent link between solar activity and Greenland climate during the Last Glacial Maximum, Nature Geoscience

Zombie Ant Fungi Manipulate Hosts To Die On The 'Doorstep' Of The Colony

A parasitic fungus that must kill its ant hosts outside their nest in order to reproduce and transmit its infection manipulates its victims to die in the vicinity of the colony, ensuring a constant supply of potential new hosts, according to researchers at Penn State and colleagues at Brazil's Federal University of Viçosa.

After killing its host, the so-called zombie ant fungus grows from the cadaver and produces spores, which rain down on the forest floor to infect new hosts.
Credit: College of Agricultural Sciences, Penn State

Previous research shows that Ophiocordyceps camponoti-rufipedis, known as the "zombie ant fungus," controls the behavior of carpenter ant workers -- Camponotus rufipes -- causing them to die, with precision, attached to leaves in the understory of tropical forests, noted study lead author Raquel Loreto, a doctoral candidate in entomology in Penn State's College of Agricultural Sciences.

"After climbing vegetation and biting the veins or margins on the underside of leaves, infected ants die, remaining attached to the leaf postmortem, where they serve as a platform for fungal growth," Loreto said.

The fungus grows a stalk, called the stroma, which protrudes from the ant cadaver. A large round structure, known as the ascoma, forms on the stroma. Infectious spores then develop in the ascoma and are discharged onto the forest floor below, where they can infect foraging ants from the colony.

This fungal reproductive activity must take place outside the ant colony, in part because of the ants' social immunity, which is collective action taken to limit disease spread, explained study co-author David Hughes, assistant professor of entomology and biology, Penn State.

"Previous laboratory studies have shown that social immunity is an important feature of insect societies, especially for ants," Hughes said. "For the first time, we found evidence of social immunity in ant societies under field conditions."

The researchers tested social immunity by placing 28 ants freshly killed by the fungus inside two nests -- 14 in a nest with live ants and 14 in one with no ants. They found that the fungus was not able to develop properly in any of the 28 cadavers. In the nest with live ants, nine of the 14 infected cadavers disappeared, presumably removed by the ants in an effort to thwart the disease organism.

"Ants are remarkably adept at cleaning the interior of the nest to prevent diseases," Hughes said. "But we also found that this fungal parasite can't grow to the stage suitable for transmission inside the nest whether ants are present or not. This may be because the physical space and microclimate inside the nest don't allow the fungus to complete its development."

Next, the researchers set out to record the prevalence of the fungus among ant colonies within the study area, located at the Mata do Paraíso research station in southeast Brazil. After marking and searching 22 transects covering a total of 16,988 square meters, they discovered that all 17 nests found had ant cadavers attached to leaves beside the colony, suggesting a fungal prevalence of 100 percent at the ant population level.

In a more detailed, 20-month survey of four of those ant colonies, the scientists measured parasite pressure by mapping the precise locations of fungus-killed ants and foraging trails in close proximity to the nests.

"We limited our survey to the immediate area surrounding the nest because this is the zone the ants must walk through to leave and return to the colony," Loreto said. "To better understand the path workers ants took, we measured and mapped in 3-D the trails formed by the ants, and that allowed us to determine spatial location of potential new hosts, which would be on the foraging trails."

By measuring the position of manipulated ants and plotting these locations with respect to the nest, the researchers established that infected ants die on the "doorstep" of the colony.

"What the zombie fungi essentially do is create a sniper's alley through which their future hosts must pass," Hughes said. "The parasite doesn't need to evolve mechanisms to overcome the effective social immunity that occurs inside the nest. At the same time, it ensures a constant supply of susceptible hosts."

Despite the high prevalence of infected colonies and persistence of the fungus over time, the researchers did not observe colony collapse, suggesting that the parasite functions as a long-lasting but tolerable condition for the ants.

"We suggest that the parasite can be characterized as a 'chronic disease' that, as in humans, can be controlled but not cured," Loreto said.

The research, which was funded by CAPES-Brazil and Penn State, was published today (Aug. 18) in PLOS ONE.


Contacts and sources: 
A'ndrea Elyse Messer
Penn State

Sunday, August 17, 2014

8,000-Year-Old Mutation Key To Human Life At High Altitudes

In an environment where others struggle to survive, Tibetans thrive in the thin air on the Tibetan Plateau, with an average elevation of 14,800 feet.

A study led by University of Utah scientists is the first to find a genetic cause for the adaptation – a single DNA base pair change that dates back 8,000 years – and demonstrate how it contributes to the Tibetans' ability to live in low oxygen conditions. The study appears online in the journal Nature Genetics on Aug. 17, 2014.

This image depicts Tibetan locals living at 4,300 meters.
Credit: Tsewang Tashi, M.D.


"These findings help us understand the unique aspects of Tibetan adaptation to high altitudes, and to better understand human evolution," said Josef Prchal, M.D., senior author and University of Utah professor of internal medicine.

The story behind the discovery is as much about cultural diplomacy as it is about scientific advancement. Prchal traveled to Asia several times to meet with Chinese officials, and with representatives of exiled Tibetans in India, to obtain permission to recruit subjects for the study. But he quickly learned that without the trust of Tibetans, his efforts were futile. Wary of foreigners, they refused to donate blood for his research.


Josef Prchal, M.D., (at computer) enrolls Tibetans into the study in this image.
Credit: Tsewang Tashi, M.D.


After returning to the U.S., Prchal couldn't believe his luck upon discovering that a native Tibetan, Tsewang Tashi, M.D., had just joined the Huntsman Cancer Institute at the University of Utah as a clinical fellow. When Prchal asked for his help, Tashi quickly agreed. 

"I realized the implications of his work not only for science as a whole but also for understanding what it means to be Tibetan," said Tashi. In another stroke of luck, Prchal received a long-awaited letter of support from the Dalai Lama. The two factors were instrumental in engaging the Tibetans' trust: more than 90, both from the U.S. and abroad, volunteered for the study.

First author Felipe Lorenzo, Ph.D., spent years combing through the Tibetans' DNA and unlocking secrets from a "GC-rich" region that is notoriously difficult to penetrate. His hard work was worth it, for the Tibetans' DNA had a fascinating tale to tell. About 8,000 years ago, the gene EGLN1 changed by a single DNA base pair.

Today, a relatively short time later on the scale of human history, 88% of Tibetans have the genetic variation, and it is virtually absent from closely related lowland Asians. The findings indicate the genetic variation endows its carriers with an advantage.
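How strong would that advantage need to be? A back-of-the-envelope calculation with a textbook one-locus selection model gives a feel for it. The generation time, starting frequency, and constant selection coefficient in the sketch below are illustrative assumptions, not values from the study.

    # Rough check: what constant per-generation advantage s carries a variant
    # from rare to ~88% frequency in ~8,000 years? Simple haploid selection;
    # generation time and starting frequency are assumptions, not data.
    import math

    years, gen_time = 8000, 25
    t = years // gen_time               # about 320 generations
    p0, p_target = 1 / 2000, 0.88       # assumed start, observed frequency

    # Haploid selection: odds(p_t) = odds(p0) * (1 + s)**t, odds(p) = p/(1-p)
    odds_ratio = (p_target / (1 - p_target)) / (p0 / (1 - p0))
    s = math.exp(math.log(odds_ratio) / t) - 1

    print(f"{t} generations; required advantage s is about {s:.3f} per generation")

Under these rough assumptions, an advantage of only about 3% per generation is enough, which helps explain how a variant just 8,000 years old could already be carried by most Tibetans.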

Prchal collaborated with experts throughout the world to determine what that advantage is. In those without the adaptation, low oxygen causes their blood to become thick with oxygen-carrying red blood cells - an attempt to feed starved tissues - which can cause long-term complications such as heart failure. The researchers found that the newly identified genetic variation protects Tibetans by decreasing this over-response to low oxygen.

This image depicts University of Utah scientists Felipe Lorenzo, Ph.D., Josef Prchal, M.D., and Tsewang Tashi, M.D.
Credit: Tsewang Tashi, M.D.

These discoveries are but one chapter in a much larger story. The genetic adaptation likely causes other changes to the body that have yet to be understood. Plus, it is one of many as-yet-unidentified genetic changes that collectively support life at high altitudes.

Prchal says the implications of the research extend beyond human evolution. Because oxygen plays a central role in human physiology and disease, a deep understanding of how high altitude adaptations work may lead to novel treatments for various diseases, including cancer. "There is much more that needs to be done, and this is just the beginning," he said.

At the beginning of the project, while in Asia, Prchal was amazed at how Tashi was able to establish a common ground with Tibetans. He helped them realize they had something unique to contribute. "When I tell my fellow Tibetans, 'Unlike other people, Tibetans can adapt better to living at high altitude,' they usually respond by a little initial surprise quickly followed by agreement," Tashi explained.

"Its as if I made them realize something new, which only then became obvious."



Contacts and sources:
Julie Kiefer
University of Utah Health Sciences