
Unseen Is Free


Tuesday, May 26, 2015

Cosmic Maid Service: Supernovas, Black Holes Team up to Clean Galaxies

Supernovas just might be the maid service of the universe.

It seems these explosions that mark the end of a star's life work hand-in-hand with supermassive black holes to sweep out gas and shut down galaxies' star-forming factories.

Jets erupting from a supermassive black hole, such as the one in Centaurus A (shown in this color composite image), might clear the way for supernovas to sweep out gas and stop star formation.

 Photo credit: ESO/WFI (optical); MPIfR/ESO/APEX/A. Weiss et al. (submillimeter); NASA/CXC/CfA/R. Kraft et al. (X-ray).

Recent research, led by Michigan State University astronomers, finds that the black holes located at the cores of galaxies launch fountains of charged particles, which can stir up gas throughout the galaxy and temporarily interrupt star formation.

But unless something intervenes, the gas will eventually cool and start forming stars again.

One mega-outburst from the black hole, though, could heat the gas surrounding the galaxy enough to let supernovas take over and mop up the mess. A celestial cleaning partnership might help astronomers understand why some massive galaxies stopped forming stars billions of years ago.

"Our previous research had shown that black-hole outbursts can limit star formation in massive galaxies, but they can't completely shut it off," said team leader Mark Voit, MSU professor of physics and astronomy in the College of Natural Science. "Something else needs to keep sweeping out the gas that dying stars continually dump into a galaxy, and supernova sweeping appears to work perfectly for that."

Other members of the research team are Megan Donahue, MSU professor of physics and astronomy; Brian O'Shea, MSU associate professor of physics and astronomy; Greg Bryan, Columbia University professor of astronomy; Ming Sun, University of Alabama in Huntsville assistant professor of physics; and Norbert Werner, Stanford University research associate.

This research was recently published in Astrophysical Journal Letters and featured in Science News.

Contacts and sources:
Tom Oswald, Michigan State University

Could Left-Handed Cosmic Magnetic Field Explain Missing Antimatter?



The discovery of a 'left-handed' magnetic field that pervades the universe could help explain a long-standing mystery: the absence of cosmic antimatter.

A group of scientists, led by Prof. Tanmay Vachaspati from Arizona State University in the United States, with collaborators at Washington University and Nagoya University, announce their result in Monthly Notices of the Royal Astronomical Society.
An artist’s impression of the Fermi Gamma-ray Space Telescope (FGST) in orbit. 

Credit: NASA.  

Planets, stars, gas and dust are almost entirely made up of 'normal' matter of the kind we are familiar with on Earth. But theory predicts that there should be a similar amount of antimatter: particles just like normal matter, but with the opposite charge. For example, an antielectron (called a positron) has the same mass as its conventional counterpart, but a positive rather than negative charge.

In 2001 Prof. Vachaspati published theoretical models to try to solve this puzzle, which predict that the entire universe is filled with helical (screw-like) magnetic fields. He and his team were inspired to search for evidence of these fields in data from the NASA Fermi Gamma-ray Space Telescope (FGST).

FGST, launched in 2008, observes gamma rays (electromagnetic radiation with a shorter wavelength than X-rays) from very distant sources, such as the supermassive black holes found in many large galaxies. The gamma rays are sensitive to the effect of the magnetic field they travel through on their long journey to Earth. If the field is helical, it will imprint a spiral pattern on the distribution of gamma rays.

Vachaspati and his team see exactly this effect in the FGST data, allowing them not only to detect the magnetic field but also to measure its properties. The data show not only that the field is helical, but also that there is an excess of left-handedness: a fundamental discovery that for the first time suggests the precise mechanism that led to the absence of antimatter.

For example, mechanisms that occur nanoseconds after the Big Bang, when the Higgs field gave masses to all known particles, predict left-handed fields, while mechanisms based on interactions that occur even earlier predict right-handed fields.

 Illustration of the Fermi Gamma-ray Space Telescope (FGST) map of the sky with the central band removed to block out gamma rays originating in the Milky Way. Gamma rays of different energies are represented by dots of various colors: red dots represent arrival locations of very energetic gamma rays, green dots represent lower energy, and blue dots represent the lowest energy. 

Credit: Hiroyuki Tashiro.

The new analysis looks for spiral patterns in the distribution of gamma rays within patches on the sky, with the highest energy gamma ray at the center of the spiral and the lower energy gamma rays further along the spiral. A helical magnetic field in the universe gives an excess of spirals of one handedness - and FGST data shows an excess of left-handed spirals.
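
To make the idea of a handedness measurement concrete, here is a minimal sketch in Python of the kind of parity-odd statistic such an analysis relies on: the sign of the triple product of three photon arrival directions distinguishes left-handed from right-handed spirals. The mock arrays, energy binning, and patch radius below are illustrative assumptions, not the authors' actual Fermi-LAT pipeline.

import numpy as np

rng = np.random.default_rng(0)

def unit(v):
    # normalise rows to unit vectors (photon arrival directions on the sky)
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

# Mock arrival directions for three energy bins; a real analysis would use
# Fermi-LAT photon positions binned by energy.
n_low = unit(rng.normal(size=(500, 3)))
n_mid = unit(rng.normal(size=(500, 3)))
n_high = unit(rng.normal(size=(100, 3)))

def handedness_statistic(n_low, n_mid, n_high, radius=0.3):
    """Parity-odd triple-product statistic: a mean significantly away from
    zero signals an excess of one spiral handedness around the
    high-energy photons."""
    total, count = 0.0, 0
    for h in n_high:
        # lower-energy photons within an angular patch around the high-energy one
        near_mid = n_mid[n_mid @ h > np.cos(radius)]
        near_low = n_low[n_low @ h > np.cos(radius)]
        for m in near_mid:
            for l in near_low:
                total += np.dot(np.cross(m, l), h)
                count += 1
    return total / count if count else 0.0

print(handedness_statistic(n_low, n_mid, n_high))

With isotropic mock data the statistic hovers near zero; a helical intergalactic field of one handedness would pull it consistently to one sign.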

Prof. Vachaspati commented: "Both the planet we live on and the star we orbit are made up of 'normal' matter. Although it features in many science fiction stories, antimatter seems to be incredibly rare in nature. With this new result, we have one of the first hints that we might be able to solve this mystery."

This discovery has wide ramifications, as a cosmological magnetic field could play an important role in the formation of the first stars and could seed the stronger field seen in galaxies and clusters of galaxies in the present day.



Contacts and sources:
Dr Robert Massey
Royal Astronomical Society
 
Prof Tanmay Vachaspati
Director, Cosmology Initiative
Arizona State University

Citation: The new work appears in W. Chen et al., "Intergalactic magnetic field spectra from diffuse gamma rays", Monthly Notices of the Royal Astronomical Society, vol. 450, pp. 3371-3380, 2015, published by Oxford University Press.

Details of the earlier theoretical models appear in T. Vachaspati, "Estimate of the Primordial Magnetic Field Helicity", Physical Review Letters, vol. 87, p. 251302, 2001.

 

Monitoring Magnetospheres of Massive Stars

Queen's University PhD student Matt Shultz is researching magnetic, massive stars, and his research has uncovered questions concerning the behaviour of plasma within their magnetospheres.

A huge, billowing pair of gas and dust clouds are captured in this stunning NASA Hubble Space Telescope image of the supermassive star Eta Carinae
Credit: Nathan Smith (University of California, Berkeley), and NASA

Drawing upon the extensive dataset assembled by the international Magnetism in Massive Stars (MiMeS) collaboration, led by Mr. Shultz's supervisor, Queen's professor Gregg Wade, along with some of his own observations collected with both the Canada-France-Hawaii Telescope and the European Southern Observatory's Very Large Telescope, Mr. Shultz is conducting the first systematic population study of magnetosphere-host stars.

"All massive stars have winds: supersonic outflows of plasma driven by the stars' intense radiation. When you put this plasma inside a magnetic field you get a stellar magnetosphere," explains Mr. Shultz (Physics, Engineering Physics and Astronomy). "Since the 1980s, theoretical models have generally found that the plasma should escape the magnetosphere in sporadic, violent eruptions called centrifugal breakout events, triggered when the density of plasma grows beyond the ability of the magnetic field to contain.

"However, no evidence of this dramatic process has yet been observed, so the community has increasingly been calling that narrative into question."

Before now, obvious disagreements with theory had been noted primarily for a single, particularly well-studied star. Studying the full population of magnetic, massive stars with detectable magnetospheres, Mr. Shultz has determined that the plasma density within all such magnetospheres is far lower than the limiting value implied by the centrifugal breakout model. This suggests that plasma might be escaping gradually, maintaining magnetospheres in an essentially steady state.

"We don't know yet what is going on," says Mr. Shultz. "But, when centrifugal breakout was first identified as the most likely process for mass escape, only the simplest diffusive mechanisms were ruled out. Our understanding of space plasmas has developed quite a bit since then. We now need to go back and look more closely at the full range of diffusive mechanisms and plasma instabilities. There are plenty to choose from: the real challenge is developing the theoretical tools that will be necessary to test them."


Contacts and sources:
Anne Craig
Queen's University

New Tools: Intelligent Handheld Robots Help Inexperienced Users Complete Jobs Requiring Skill


What if handheld tools knew what needed to be done and could even guide and help inexperienced users to complete jobs that require skill? Researchers at the University of Bristol have developed and started studying a novel concept in robotics - intelligent handheld robots.

Novel intelligent handheld robot is shown.
Credit: University of Bristol

Historically, handheld tools have been blunt, unintelligent instruments that are unaware of the context they operate in, are fully directed by the user, and critically, lack any understanding about the task they are performing.

Dr Walterio Mayol-Cuevas and PhD student Austin Gregg-Smith, from the University's Department of Computer Science, have been working on the design of robot prototypes as well as on understanding how best to interact with a tool that "knows and acts". In particular, they have been comparing tools with increasing levels of autonomy.



Compared to other tools such as power tools that have a motor and perhaps some basic sensors, the handheld robots developed at Bristol are designed to have more degrees of motion to allow greater independence from the motions of the user, and importantly, are aware of the steps being carried out. This allows for a new level of co-operation between user and tool, such as the user providing tactical motions or directions and the tool performing the detailed task.

This image shows a novel intelligent handheld robot.

Credit: University of Bristol

Handheld robots aim to share physical proximity with users but are neither fully independent, as is a humanoid robot, nor part of the user's body, as are exoskeletons. The aim with handheld robots is to capitalise on the intuitiveness of using traditional handheld tools while adding embedded intelligence and action to allow for new capabilities.

Dr Mayol-Cuevas, Reader in Robotics, Computer Vision and Mobile Systems, said: "There are three basic levels of autonomy we are considering: no autonomy, semi-autonomous when the robot advises the user but does not act, and fully autonomous when the robot advises and acts even by correcting or refusing to perform incorrect user actions."

The Bristol team has been studying users' task performance and preferences on two generic tasks: picking and dropping different objects to form tile patterns, and aiming in 3D for simulated painting.

Austin Gregg-Smith, a PhD student who is sponsored by the James Dyson Foundation, added: "Our results indicate that users tend to prefer a tool that is fully autonomous, and there is evidence of a significant impact on completion time and reduced perceived workload with the autonomous handheld robot. However, users sometimes also expressed how different it is to work with this type of novel robot."

The researchers are currently investigating further topics on interaction, shared intelligence and new applications for field tasks, and due to the difficulties of starting in a new area of robotics, their robot designs are open source and available via http://www.handheldrobotics.org/



Contacts and sources: 
Joanne Fryer
University of Bristol


A paper about their recent work, which has been nominated for Best Cognitive Robotics Paper Award, Best Student Paper Award and Best Conference Paper Award, will be presented at this week's IEEE International Conference on Robotics and Automation (ICRA).

April 2015 Fourth Warmest on Record; Year-to-Date Warmest on Record

The globally averaged temperature over land and ocean surfaces for April 2015 was the fourth highest for the month since record keeping began in 1880. The year-to-date (January–April) globally averaged temperature was record high.

During January–April, the average temperature across global land and ocean surfaces was 1.44°F (0.80°C) above the 20th century average. This was the highest for January–April in the 1880–2015 record, surpassing the previous record of 2010 by 0.13°F (0.07°C).

Temperature anomalies for land and ocean are analyzed separately and then merged to form the global analysis.

Selected Climate Events & Anomalies for April 2015

 The most current data may be accessed via the Global Surface Temperature Anomalies page.

Temperatures

During January–April, the globally-averaged land surface temperature was 2.66°F (1.48°C) above the 20th century average. This was the highest for January–April in the 1880–2015 record, surpassing the previous record of 2007 by 0.05°F (0.03°C).

During January–April, the globally-averaged sea surface temperature was 0.99°F (0.55°C) above the 20th century average. This tied with 2010 as the second highest for January–April in the 1880–2015 record, trailing 1998 by 0.04°F (0.02°C).

In the atmosphere, 500-millibar height pressure anomalies correlate well with temperatures at the Earth's surface. The average position of the upper-level ridges of high pressure and troughs of low pressure—depicted by positive and negative 500-millibar height anomalies on the April 2015 and February–April 2015 maps—is generally reflected by areas of positive and negative temperature anomalies at the surface, respectively.
April


The average temperature across global land and ocean surfaces combined for April 2015 was 0.74°C (1.33°F) higher than the 20th century average. This was the fourth highest for April in the 136-year period of record, but also marks the lowest monthly departure from average since November 2014.

Examining the data beyond the traditional calendar year, the latest 12-month period (May 2014–April 2015) ties with the record set last month (April 2014–March 2015) as the warmest 12-month period among all months in the 136-year period of record, as shown in the table below. 

In fact, this record was set several times over the past year, and nine of the ten warmest 12-month periods have occurred within the past two years (September 1997–August 1998 ties as eighth warmest). Nine of these ten 12-month periods span two overlapping calendar years; only the full calendar year of 2014 is among the ten warmest 12-month periods (tied for sixth warmest).
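
For readers who want to see how such overlapping 12-month rankings are produced, here is a minimal sketch in Python assuming a pandas series of monthly global anomalies; the random numbers below are a stand-in for NOAA's actual global surface temperature dataset.

import numpy as np
import pandas as pd

# Synthetic monthly global temperature anomalies (°C relative to the
# 20th-century mean); the real series comes from NOAA.
idx = pd.date_range("1880-01-01", "2015-04-01", freq="MS")
anoms = pd.Series(np.random.default_rng(1).normal(0.2, 0.3, len(idx)), index=idx)

# Average every overlapping 12-month window, then rank the warmest periods.
rolling12 = anoms.rolling(12).mean().dropna()
for end, val in rolling12.sort_values(ascending=False).head(10).items():
    start = end - pd.DateOffset(months=11)
    print(f"{start:%b %Y} to {end:%b %Y}: {val:+.2f} °C")

Because every month starts a new 12-month window, consecutive record-warm windows overlap heavily, which is exactly why nine of the ten warmest periods straddle two calendar years.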

During April, the average temperature across global land and ocean surfaces was 1.33°F (0.74°C) above the 20th century average. This was the fourth highest for April in the 1880–2015 record.

The April globally-averaged land surface temperature was 2.00°F (1.11°C) above the 20th century average. This was the 10th highest for April in the 1880–2015 record.

The April globally-averaged sea surface temperature was 1.08°F (0.60°C) above the 20th century average. This was the highest for April in the 1880–2015 record, surpassing the previous record of 1998 by 0.05°F (0.03°C).

The average Arctic sea ice extent for April was 310,000 square miles (5.5 percent) below the 1981–2010 average. This was the second smallest April extent since records began in 1979, according to analysis by the National Snow and Ice Data Center based on data from NOAA and NASA. This extent was 30,000 square miles larger than the record small April extent that occurred in 2007.

Antarctic sea ice during April was 640,000 square miles (22.4 percent) above the 1981–2010 average. This was the largest April Antarctic sea ice extent on record, surpassing the previous record-large April extent of 2014 by 10,000 square miles.

According to data from NOAA analyzed by the Rutgers Global Snow Lab, the Northern Hemisphere snow cover extent during April was 50,000 square miles below the 1981–2010 average. This was the 22nd smallest April Northern Hemisphere snow cover extent in the 49-year period of record. Eurasia had a slightly larger-than-average April snow cover extent, while North America had its 15th smallest.


Source:  NOAA

Monday, May 25, 2015

Special Fats Proven Essential for Brain Growth

Research led by a Duke-NUS Graduate Medical School Singapore (Duke-NUS) scientist has proved that certain special fats found in blood are essential for human brain growth and function.

Duke-NUS Associate Professor David Silver co-led two studies, published in Nature Genetics, which showed that mutations in the protein Mfsd2a cause impaired brain development in humans. Mfsd2a is the transporter in the brain for a special type of fat called lysophosphatidylcholines (LPCs), which are composed of essential fatty acids like omega-3. These studies show, for the first time, the crucial role of these fats in human brain growth and function.

The difference between a brain with a normal Mfsd2a gene and a brain with a mutated Mfsd2a gene.
Credit: Duke University

In the first study, two families in Libya and Egypt with Mfsd2a mutations were identified with severely reduced brain size, or microcephaly. Their mutations eliminated Mfsd2a's ability to transport LPCs, which meant not enough LPCs were absorbed by the brain. In these families, children affected by these mutations died between one and six years of age. The study not only establishes a link between the transport of LPCs by Mfsd2a and human brain growth and function, it is also the first time a genetic disease has been related to LPC transport in humans. The research was co-led by senior author Professor Joseph Gleeson from Rockefeller University.

In a second, separate study, a family in North Pakistan was found to have another type of mutation in the Mfsd2a gene which reduced its transport activity. The individuals with this mutation also had microcephaly, but in this case it was not lethal. However, they did have intellectual disabilities, impaired control of their limbs, and absent speech. Like the first study, findings are proof of the importance of LPCs in brain development and function. The research was co-led by senior author Professor Andrew H. Crosby from Exeter University.

In 2014, Dr. Silver published a landmark study in Nature which served as a basis for these two studies. He and his team discovered that Mfsd2a is the transporter for LPCs. Prior to this breakthrough, LPCs were known to be found at high concentrations in our blood but their function was a mystery. Dr. Silver's team showed that mice genetically engineered without Mfsd2a failed to transport LPCs into their brains - which resulted in microcephaly. 

Since deficiency in DHA (an omega-3 fatty acid carried by LPCs) does not result in microcephaly in animals, this meant that LPCs are critical factors in brain growth and function. Also, while it was previously believed the brain made all the fat it needed, Dr. Silver's research showed that LPCs are transported there from the blood, across the blood-brain barrier. His work with the Rockefeller and Exeter teams proves this in humans.

"Our work confirms the essential role of LPCs in brain development and function in humans, and indicates that brain uptake of LPCs during foetal development and in adult life is important," said Dr. Silver, co-lead on both studies, based in the Cardiovascular and Metabolic Disorders Programme at Duke-NUS. 

"Now we are studying the functions of LPCs in the brain, and the implications for application are very exciting. We might be able to develop therapeutics in the future that could prevent and treat neurological disorders, and improve brain growth and function. We may even be able to target better brain nutrition for babies, mothers, and the aged."


Contacts and sources: 
Dharshini Subbiah
Duke University 

11 New Species Discovered in Madagascar

Madagascar is home to extraordinary biodiversity, but in the past few decades, the island's forests and associated biodiversity have been under greater attack than ever. Rapid deforestation is affecting the biotopes of hundreds of species, including the panther chameleon, a species with spectacular intra-specific colour variation. 

Shown here is a panther chameleon.
Credit: © Michel Milinkovitch

A new study by Michel Milinkovitch, professor of genetics, evolution, and biophysics at the University of Geneva (UNIGE), carried out in close collaboration with colleagues in Madagascar, reveals that this charismatic reptilian species, found only in Madagascar, is actually composed of eleven different species. The results of their research appear in the latest issue of the journal Molecular Ecology. The authors also discuss the urgent need to protect Madagascar's habitats.

In collaboration with Professor Achille Raselimanana of the University of Antananarivo, researchers from the Department of Genetics and Evolution in the UNIGE Faculty of Sciences, led by Michel Milinkovitch, sought the genetic keys behind the panther chameleon's incredible colour palette. Their analyses, performed on site in Madagascar, reveal the presence of eleven species rather than one.

A Talkative Drop of Blood

It took two expeditions, travelling from east to west, for the scientists to collect a drop of blood from each of 324 individuals and to document them through colour photographs. The DNA (mitochondrial and nuclear) of each specimen was sequenced and analysed in the laboratory, testing the hypothesis that a chameleon's dominant colour might be related to the geographic zone where it is found. Most importantly, the genetic material indicated strong genetic structure among geographically restricted lineages, revealing very low interbreeding among populations.

Shown here is a panther chameleon.
Credit: © Michel Milinkovitch

A Key for Turning Genetics into Color

The mathematical analyses of the 324 colour photographs demonstrated that subtle colour patterns could efficiently predict the assignment of individual chameleons to their corresponding genetic lineage, confirming that many of the geographical populations might need to be considered separate species. The scientists then distilled their analyses of the colour diversity into a classification key that allows most chameleons to be linked to their corresponding species with the naked eye. This case of hidden speciation confirms a major characteristic of Madagascar: it is among the most diverse places for life on Earth; a biodiversity hotspot.
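
As a rough illustration of the approach, not the authors' actual analysis, a lineage classifier of this kind can be sketched in a few lines of Python; the colour features and their distributions below are hypothetical stand-ins for the measurements taken from the 324 photographs.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_lineages, per_lineage = 11, 30

# Hypothetical colour features per photographed chameleon (e.g. mean hue,
# saturation, pattern contrast), clustered by genetic lineage.
X = np.vstack([rng.normal(loc=k, scale=1.5, size=(per_lineage, 3))
               for k in range(n_lineages)])
y = np.repeat(np.arange(n_lineages), per_lineage)

# Cross-validated accuracy of assigning an individual to its lineage
# from colour alone.
clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=5)
print(f"lineage assignment accuracy: {scores.mean():.2f}")

If colour measurements predict genetic lineage well above chance, as the study found, that is the statistical signature behind the naked-eye classification key.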

Madagascar, Unique but Precarious Conservatory

Each of the new chameleon species requires individual management, given that they each constitute a different part of the biodiversity of the whole. The visual classification key devised by the researchers could assist local biologists and trade managers to avoid local population over-harvesting. The task of biodiversity management is daunting because of the widespread destruction of the forest habitat for agricultural practices as well as for firewood and charcoal production by populations with very low living standards. These human activities threaten the survival of 400 species of reptile, 300 species of amphibians, 300 species of birds, 15,000 species of plants and countless species of invertebrates. In addition, approximately 80 to 90% of all living species found in Madagascar are endemic, meaning they exist nowhere else on earth.

Given the charismatic nature of chameleons, Milinkovitch hopes that, besides providing a better understanding of the genetic basis of colour variation in chameleons, his collaborative study with his Malagasy colleagues will help his colleague, Professor Raselimanana, to continue his difficult enterprise: raising awareness of the staggering but fragile biodiversity hosted by Madagascar.


Contacts and sources:
Michel Milinkovitch
University of Geneva (UNIGE)

From Chicken To Dinosaur: Scientists Experimentally Reverse The Evolution Of The Perching Toe

A unique adaptation in the foot of birds is the presence of a thumb-like opposable toe, which allows them to grasp and perch. However, in their dinosaur ancestors, this toe was small and non-opposable, and did not even touch the ground, resembling the dewclaws of dogs and cats. 
Remarkably, the embryonic development of birds provides a parallel of this evolutionary history: the toe starts out like that of their dinosaur ancestors, but then its base (the metatarsal) becomes twisted, making it opposable. Brazilian researcher João Botelho, working in the lab of Alexander Vargas at the University of Chile, decided to study the underlying mechanisms. Botelho observed that the twisting occurred shortly after the embryonic musculature of this toe was in place.
Credit: University of Chile
“This is one of the clearest examples of how indirect the morphological consequences of genetic change are mediated,” said Gunter Wagner, evolutionary geneticist and professor at Yale.
Bird embryos move a lot inside the egg during development, and the onset of movement at this toe coincided with the twisting of its base. Botelho also demonstrated that in this toe, genes of cartilage maturation were expressed at a much later stage than other digits: It retains many rapidly dividing stem cells for a much longer period. Such immature cartilage is highly plastic and easily transformed by muscular activity.

These observations suggested the toe is twisted as a result of mechanical forces imposed on it by the embryonic musculature. Definitive proof, however, would come from experiments. When Botelho applied Decamethonium bromide, a pharmacological agent capable of paralyzing embryonic musculature, the result was a non-opposable toe with a straight, non-twisted base identical to that of their dinosaur ancestors. 
Only a few experiments are known to recover dinosaur traits in birds (such as a dinosaur-like shank and tooth-like structures). The undoing of the perching digit is thus an important new addition, and the results have now been published in Scientific Reports, an open-access journal of the Nature Publishing Group.

The significance of this experiment, however, goes beyond the fact that a dinosaur-like toe is being retrieved. Evolutionary research often centers on mutations, but the development and evolution of the perching toe cannot be understood without the forces of embryonic muscular activity. 
The study is described as “true developmental mechanics” by Gunter Wagner, an evolutionary geneticist and professor at Yale. “This is one of the clearest examples of how indirect the morphological consequences of genetic change are mediated. The experiments prove that interactions among organ systems channel the directions of organismal evolution.” 


Contacts and sources:
University of Chile

Citation: João Francisco Botelho, Daniel Smith-Paredes, Sergio Soto-Acuña, Jorge Mpodozis, Verónica Palma & Alexander O. Vargas, “Skeletal plasticity in response to embryonic muscular activity underlies the development and evolution of the perching digit of birds”, Scientific Reports, 5, Article number: 9840, published 14 May 2015. doi:10.1038/srep09840. Link to article, images and supplementary info: http://www.nature.com/srep/2015/150514/srep09840/full/srep09840.html

The Viking's Grave and the Sunken Ship

Mapping archaeological digs takes plenty of time and a lot of measuring, photographing, drawing and note taking. Now, most of this work can be done with a technique called photogrammetry.

Photogrammetry is a method that uses two-dimensional images of an archaeological find to construct a 3D model.

Detailed image of a shield boss found in what is likely a Viking’s grave in Skaun. 
Photo: NTNU University Museum

You don’t need special glasses or advanced equipment to make use of this new technique. Together with precise measurements of the excavation, photogrammetry can create a complete, detailed map of an archaeological excavation site.

“This is still a very new technique,” say archaeologists Raymond Sauvage and Fredrik Skoglund of the Norwegian University of Science and Technology's University Museum.

Photogrammetry is in many ways much more precise than older, more time-consuming methods.

Viking graves
This method is already being put to use by archaeologists. When a possible Viking grave was found in Skaun in Sør-Trøndelag in 2014, the excavation site was mapped using photogrammetry.

The manner in which artefacts are found, how deeply they are buried and where they are placed in relation to each other can provide a lot of information to archaeologists studying a site.

A Viking archaeological site

Credit: NTNU

Photogrammetry also makes it easier for archaeologists to share their findings with others. The 3D models that are produced can be saved as normal PDF files, which can be sent to colleagues for input.

Saving time
The two archaeologists are very enthusiastic. A Russian company has developed the program that they’re using at the museum. The program is easy to use and gives good results. The development and use of the technique has exploded in recent years.

“There’s a lot more interest in photogrammetry now. The new program is readily available and inexpensive,” says Sauvage.

He explains that it provides the kind of quality and detail that you could only dream of a few years ago. Even though the method requires some work, it still saves a lot of time.
“In one day, you can get three million measurement points. Before, we were satisfied with 3000,” he says.

And those 3000 points could take a long time to find. This method can save archaeologists weeks of work with tape measures, sketching paper and cameras. The practical work in the field goes much quicker.

“This frees up a lot more time for things like research,” Skoglund says.

Old finds
Similar results have been achieved in the past using laser equipment and early versions of a photogrammetry program. But this has been very expensive, and takes a lot of time and resources.

The new program only costs a few hundred euros, meaning that it is much more widely available.

With a photogrammetry program, three or four pictures from different angles are enough to make a simple 3D model, although more images will provide a higher quality model. You can use any normal camera.

“The more images, the better the quality,” Sauvage says.
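
A minimal sketch of the underlying idea, reconstructing a sparse 3D point cloud from just two overlapping photos with the open-source OpenCV library. The image filenames and focal length here are placeholder assumptions, and the commercial program the museum uses handles many more views plus a dense reconstruction step; this is only the two-view core of the technique.

import cv2
import numpy as np

# Two overlapping photos of the same find from different angles
# (filenames are placeholders).
img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)

# Detect local features in each view and match them across views.
orb = cv2.ORB_create(5000)
k1, d1 = orb.detectAndCompute(img1, None)
k2, d2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
pts1 = np.float32([k1[m.queryIdx].pt for m in matches])
pts2 = np.float32([k2[m.trainIdx].pt for m in matches])

# Assumed pinhole camera intrinsics; serious work calibrates the camera.
h, w = img1.shape
K = np.array([[2000.0, 0, w / 2], [0, 2000.0, h / 2], [0, 0, 1]])

# Estimate the relative camera pose from the matches, then triangulate
# the matched points into a sparse 3D point cloud.
E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
_, R, t, inliers = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([R, t])
pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
cloud = (pts4d[:3] / pts4d[3]).T
print(f"{len(cloud)} reconstructed 3D points")

Each additional photograph adds more matched features and viewpoints, which is why more images yield a denser, higher-quality model.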

It is also possible to use images of old finds to build a 3D model based on them. For example, you could make a model using photos from previous excavations of Viking graves, and use this to explore how an excavation site changes over time.

Shipwreck
Marine archaeologist Skoglund has tried this with the Dutch ship “De Grawe Adler” (the Grey Eagle), which sank in 1696 near Strømsholmen in Hustadvika, on the coast of central Norway, and was discovered in 1982 when dredging for sand destroyed parts of the ship.

“I swam along the whole length of the wreck a few years ago and took pictures,” Skoglund says.

He did so without ever considering the possibility of making a 3D model of the wreck. The fact that the photos were taken underwater makes it slightly harder to put them together, but it is by no means impossible.

If the results are precise enough, they can be used to monitor the decomposition of the ship. Finds under water tend to be particularly fragile, but decomposition can be difficult to see. You can’t just dive down every few years to make sure that everything is OK. With this new method, the decomposition can be measured much more precisely, and appropriate protection measures can be put in place.

The future
The next step is likely to be able to put on a pair of 3D-glasses and virtually walk into an excavation site, although that may be a few years in coming.

There is one challenge, however: storing measurements digitally in a manner that will be useful for generations to come. Archaeologists working today are leaving behind measurements and notes on excavations that may be used hundreds of years in the future. A paper photo taken 100 years ago is just as good now as it was then, as long as you have it on hand. But nobody knows whether a PDF file will still be usable in the year 2115. This is a challenge facing all digitally stored information, and one that has yet to be solved.


Contacts and sources:

Go Fish! Ancient Birds Evolved Specialist Diving Adaptations

A new study of some primitive birds from the Cretaceous shows how several separate lineages evolved adaptations for diving.

Hesperornithiform birds lived at the same time as the dinosaurs; their fossils have been found in North America, Europe and Asia in rocks 65–95 million years old. Dr Alyssa Bell and Professor Luis Chiappe of the Dinosaur Institute, Natural History Museum of Los Angeles County, publishing in the Journal of Systematic Palaeontology, have undertaken a detailed analysis of their evolution, showing that separate lineages became progressively more adept at diving into water to catch fishes, like modern-day loons and grebes.

Evolution of diving specializations within the Hesperornithiformes.
Credit:   Taylor & Francis

The Hesperornithiformes are a highly derived but very understudied group of primitive birds from the Cretaceous period. This study is the first comprehensive phylogenetic analysis, or evaluation of evolutionary relationships, to ever be undertaken on the entire group.

The results of this study confirm that the Hesperornithiformes do form a single group (or clade), but that within this group the inter-relationships of the different taxa are more complex than previously thought. Additionally, this study finds that anatomical changes were accompanied by enlargement in overall body size, which increased lung capacity and allowed deeper diving.

Overall, this study provides evidence for understanding the evolution of diving adaptations among the earliest known aquatic birds.


Contacts and sources:
Taylor & Francis

Researchers Find The 'Key' To Quantum Network Solution

Scientists at the University of York’s Centre for Quantum Technology have made an important step in establishing scalable and secure high rate quantum networks.

Working with colleagues at the Technical University of Denmark (DTU), the Massachusetts Institute of Technology (MIT), and the University of Toronto, they have developed a protocol that achieves key rates at metropolitan distances three orders of magnitude higher than previously demonstrated.


Standard protocols of Quantum Key Distribution (QKD) exploit random sequences of quantum bits (qubits) to distribute secret keys in a completely secure fashion. Once these keys are shared by two remote parties, they can communicate confidentially by encrypting and decrypting binary messages. The security of the scheme relies on one of the most fundamental laws of quantum physics, the uncertainty principle.

Today's classical communications by email or phone are vulnerable to eavesdroppers, but quantum communications based on single particles of light (photons) can easily detect eavesdroppers, because eavesdropping invariably disrupts or perturbs the quantum signal. By making quantum measurements, two remote parties can estimate how much information an eavesdropper is stealing from the channel and can apply suitable protocols of privacy amplification to negate the effects of the information loss.
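
To make the sifting-and-error-estimation logic concrete, here is a toy classical simulation of a standard qubit protocol of this kind (BB84-style). Note that the York team's own protocol uses continuous variables rather than single qubits, so this sketch illustrates only the textbook scheme described above, with all parameters chosen for illustration.

import numpy as np

rng = np.random.default_rng(3)
n = 10_000  # number of transmitted photons

# Alice encodes random bits in randomly chosen bases
# (0 = rectilinear, 1 = diagonal).
alice_bits = rng.integers(2, size=n)
alice_bases = rng.integers(2, size=n)

# Bob measures in his own random bases; when bases match he recovers
# Alice's bit, otherwise his outcome is random.
bob_bases = rng.integers(2, size=n)
same_basis = alice_bases == bob_bases
bob_bits = np.where(same_basis, alice_bits, rng.integers(2, size=n))

# Sifting: keep rounds with matching bases, then publicly compare a
# random sample to estimate the error rate. An eavesdropper's
# measurements would raise this rate, revealing the attack.
sift_a, sift_b = alice_bits[same_basis], bob_bits[same_basis]
test = rng.random(len(sift_a)) < 0.1
qber = np.mean(sift_a[test] != sift_b[test])
print(f"sifted key: {len(sift_a)} bits, estimated error rate: {qber:.3f}")

Because roughly half the rounds are discarded in sifting and each round carries a single bit, qubit schemes like this are rate-limited, which is the bottleneck the continuous-variable approach addresses.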

However, the problem with QKD protocols based on simple quantum systems, such as single-photon qubits, is their low key-rate, despite their effectiveness in working over long distances. This makes them unsuitable for adaptation for use in metropolitan networks.

The team, led by Dr Stefano Pirandola, of the Department of Computer Science at York, overcame this problem, both theoretically and experimentally, using continuous-variable quantum systems. These allow the parallel transmission of many qubits of information while retaining the quantum capability of detecting and defeating eavesdroppers. The research is published in Nature Photonics.

Dr Pirandola said: “You want a high rate and a fast connection particularly for systems that serve a metropolitan area. You have to transmit a lot of information in the fastest possible way; essentially you need a quantum equivalent of broadband.

“Continuous-variable systems can use many more photons but are still quantum based. Our system reaches extremely high speeds, three orders of magnitude higher than ever before, over a distance of 25 kilometres. Its effectiveness decreases rapidly above that distance, however.

“Nevertheless, our protocol could be used to build high-rate quantum networks where devices securely connect to nearby access points or proxy servers.”

Dr Pirandola was funded by the Engineering and Physical Sciences Research Council.

The University of York leads a unique collaboration to exploit fundamental laws of quantum physics for the development of secure communication technologies and services for consumer, commercial and government markets.

The Quantum Communications Hub is one of four in the EPSRC’s new £155m National Network of Quantum Technology Hubs.

Contacts and sources:
University of York


Citation:  The paper ‘High-rate measurement-device independent quantum cryptography’ by Stefano Pirandola, Carlo Ottaviani, Gaetana Spedalieri, Christian Weedbrook, Samuel L Braunstein, Seth Lloyd, Tobias Gehring, Christian S Jacobsen and Ulrik L Andersen is published in Nature Photonics.

Friday, May 22, 2015

First Drug Approved For Radiation Exposure

As a result of research performed by scientists at the University of Maryland School of Medicine (UM SOM), the U.S. Food and Drug Administration has approved the use of a drug to treat the deleterious effects of radiation exposure following a nuclear incident. The drug, Neupogen®, is the first ever approved for the treatment of acute radiation injury.


Credit: Oregon State University

The research was done by Thomas J. MacVittie, PhD, professor, and Ann M. Farese, MA, MS, assistant professor, both in the University of Maryland School of Medicine (UM SOM) Department of Radiation Oncology's Division of Translational Radiation Sciences. The investigators did their research in a non-human clinical model of high-dose radiation.

"Our research shows that this drug works to increase survival by protecting blood cells," said Dr. MacVittie, who is considered one of the nation's leading experts on radiation research. "That is a significant advancement, because the drug can now be used as a safe and effective treatment for the blood cell effects of severe radiation poisoning."

Radiation damages the bone marrow, and as a result decreases production of infection-fighting white blood cells. Neupogen® counteracts these effects. The drug, which is made by Amgen, Inc., was first approved in 1991 to treat cancer patients receiving chemotherapy. Although doctors may use it "off label" for other indications, the research and the resulting approval would speed up access to and use of the drug in the event of a nuclear incident.

This planning is already under way. In 2013, the Biomedical Advanced Research and Development Authority (BARDA), an arm of the Department of Health and Human Services, bought $157 million worth of Neupogen® for stockpiles around the country in case of nuclear accident or attack.

Neupogen® is one of several "dual-use" drugs that are being examined for their potential use as countermeasures in nuclear incidents. These drugs have everyday medical uses, but also may be helpful in treating radiation-related illness in nuclear events. Dr. MacVittie and Ms. Farese are continuing their research on other dual-use countermeasures to radiation. They are now focusing on remedies for other aspects of radiation injury, including problems with the gastrointestinal tract and the lungs.

The research builds on 40 years of work that Dr. MacVittie and his team have conducted in the field of radiation research, during which they have helped to define the field. The Neupogen study is also part of a broad portfolio of research being conducted by faculty in the Department of Radiation Oncology. Among these are Minesh Mehta, MD, the medical director of the Maryland Proton Treatment Center, who is focusing on research into thoracic oncology, neuro-oncology, integrating imaging advances with radiation therapy, and innovative applications of new radiation therapy technologies to test biological concepts. 

Another researcher in the department is Zeljko Vujaskovic, MD, PhD, director of the Division of Translational Radiation Sciences. His research focuses on identifying potential biomarkers that predict individual patients' risk of injury, and on developing novel therapeutic interventions and strategies to prevent, mitigate, or treat radiation injury.

"In terms of both research and treatment, our department is leading the way in developing the most effective discovery-based clinical applications to help protect and heal patients," says William F. Regine, MD, professor and Isadore & Fannie Schneider Foxman Endowed Chair in Radiation Oncology at the UM SOM.

He added that research has served as the foundation for the Department of Radiation Oncology's recent development of four clinical modalities for the treatment of cancer through radiation:

Proton Treatment, a precise approach to cancer, which targets tumors while minimizing harm to surrounding tissues. Proton treatment uses protons traveling at about two-thirds the speed of light to precisely deliver beams of radiation to the tumor. This treatment will be available in the new 110,000 sq ft Maryland Proton Treatment Center before the end of the year;

Selective Internal Radiation Therapy, a precision modality for treating patients with particularly difficult to remove tumors involving the liver such as those from colorectal cancers;

Gammapod, a new, high-precision, noninvasive method of treating early-stage breast cancer;

Thermal Therapies, the use of "heat" in treating a broad spectrum of malignancies.

"The Department of Radiation Oncology's work is just one example of how the School of Medicine is discovering innovative ways to repurpose existing drugs that are able fight a broader array of critical diseases," said Dean E. Albert Reece, MD, PhD, MBA, who is also the vice president for Medical Affairs, University of Maryland, and the John Z. and Akiko K. Bowers Distinguished Professor and Dean of the School of Medicine. "We are particularly proud of the Neupogen research as it is not only important scientifically; it is crucial for our country's public health and its national security."


Contacts and sources: 
David Kohn

Machine Learning Pinpoints Rodent Reservoirs of Disease

Machine learning can pinpoint rodent species that harbor diseases and geographic hotspots vulnerable to new parasites and pathogens. So reports a new study in the Proceedings of the National Academy of Sciences led by Barbara A. Han, a disease ecologist at the Cary Institute of Ecosystem Studies.

Most emerging infectious diseases are transmitted from animals to humans, with more than a billion people suffering annually. Safeguarding public health requires effective surveillance tools.

A majority of new reservoir and hyper-reservoir rodent species are predicted to occur in the upper latitudes.

Credit: Han et al.


Han comments: "Historically, emerging infectious diseases have been dealt with reactively, with efforts focused on containing outbreaks after they've spread. We were interested in how machine learning could inform early warning surveillance by revealing the distribution of rodent species that are effective disease reservoirs."

With University of Georgia Odum School of Ecology colleagues John Paul Schmidt, Sarah E. Bowden, and John M. Drake, Han employed machine learning, a form of artificial intelligence, to reveal patterns in an extensive set of data on more than 2,000 rodent species, with variables describing species' life history, ecology, behavior, physiology, and geographic distribution.

The team developed a model that was able to predict known rodent reservoir species with 90% accuracy, and identified particular traits that distinguish reservoirs from non-reservoirs. They revealed over 150 new potential rodent reservoir species and more than 50 new hyper-reservoirs - animals that may carry multiple pathogens infectious to humans.
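
As a rough illustration of the approach (not the team's actual model or data), a trait-based reservoir classifier can be prototyped with scikit-learn; the trait columns and labels below are entirely synthetic.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n_species = 2000

# Synthetic trait table, one row per rodent species (the real study used
# dozens of life-history, ecological and physiological variables).
litter_size = rng.normal(4, 2, n_species)
age_at_maturity = rng.normal(120, 40, n_species)  # days
range_latitude = rng.normal(30, 20, n_species)    # degrees north
X = np.column_stack([litter_size, age_at_maturity, range_latitude])

# Mock labels encoding the paper's 'pace of life' pattern: fast-living,
# northern species are more likely to be reservoirs.
logit = 0.5 * litter_size - 0.02 * age_at_maturity + 0.03 * range_latitude - 1.0
y = (rng.random(n_species) < 1 / (1 + np.exp(-logit))).astype(int)

clf = GradientBoostingClassifier()
print(f"cross-validated accuracy: {cross_val_score(clf, X, y, cv=5).mean():.2f}")

Once trained on known reservoirs, such a model can be applied to species whose reservoir status is unknown, which is how a watch list of predicted reservoirs is generated.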

"This study shows the value of bringing new analysis techniques together with big data," commented study co-author John Drake. "By combining ecological and biomedical data into a common database, Barbara was able to use machine learning to find patterns that can inform an early warning system for rodent-borne disease outbreaks."

Flying squirrels are a known reservoir for anaplasmosis and one of the 50+ hyper-reservoir species predicted to carry additional pathogens infectious to humans.
Credit: Holly B. Vuong, Cary Institute of Ecosystem Studies/Rutgers University

Han explains: "Results equip us with a watch list of high-risk rodent species whose intrinsic traits make them effective at carrying infections transmissible to people. Such a list is increasingly important given accelerating rates of environmental change."

Among the take-home messages: rodents are not created equal in their ability to transmit disease. The riskiest reservoir species are those that mature quickly, reproduce early and often, and live in northern temperate areas with low levels of biodiversity. The paper adds to the growing body of knowledge that 'pace of life' affects infection tolerance in animals.

"Biologically-speaking, species that bear as many offspring as possible in a shorter period of time may tend to invest fewer resources in immune response compared to slower-living animals. This could make certain rodent species more effective disease reservoirs," notes Han.

Geographic areas found to have a high diversity of rodent reservoirs included North America, the Atlantic coast of South America, Europe, Russia, and parts of Central and East Asia. Predicted future hotspots of rodent reservoir diversity spanned arctic, temperate, tropical, and desert biomes, including China, Kazakhstan, and the Midwestern United States. A majority of new reservoir and hyper-reservoir species are predicted to occur in the upper latitudes.

"It was surprising to find more emerging rodent-borne diseases predicted for temperate zones than the tropics--given assumptions that the tropics are where new diseases originate," Drake commented. "This result shows how data-driven discovery can correct such stereotypes."

Findings provide a basis for targeted surveillance efforts, which are vital given the cost of monitoring for emerging infectious diseases. Han notes, "Turning our predictions into preventative measures will require collaboration with experts on the ground. It's where the real work begins. A start would be to look at the newly predicted rodent reservoirs and assess which have increasing contact with people through activities like urbanization, agricultural and hunting practices, and displacement from political or climate instability."



Contacts and sources:
Lori Quillen

Human and Dog Bond Goes Back Beyond 27,000 Years

Dogs' special relationship to humans may go back 27,000 to 40,000 years, according to genomic analysis of an ancient Taimyr wolf bone reported in the Cell Press journal Current Biology on May 21. Earlier genome-based estimates have suggested that the ancestors of modern-day dogs diverged from wolves no more than 16,000 years ago, after the last Ice Age.

This is an ancient Taimyr Wolf bone from the lower jaw. The animal lived approximately 27,000 to 40,000 years ago.
Credit: Love Dalén

The genome from this ancient specimen, which has been radiocarbon dated to 35,000 years ago, reveals that the Taimyr wolf represents the most recent common ancestor of modern wolves and dogs.

"Dogs may have been domesticated much earlier than is generally believed," says Love Dalén of the Swedish Museum of Natural History. "The only other explanation is that there was a major divergence between two wolf populations at that time, and one of these populations subsequently gave rise to all modern wolves." Dalén considers this second explanation less likely, since it would require that the second wolf population subsequently became extinct in the wild.

This image compares an ancient Taimyr Wolf bone from the lower jaw to a modern pipette.
Credit: Love Dalén

"It is [still] possible that a population of wolves remained relatively untamed but tracked human groups to a large degree, for a long time," adds first author of the study Pontus Skoglund of Harvard Medical School and the Broad Institute.

The researchers made these discoveries based on a small piece of bone picked up during an expedition to the Taimyr Peninsula in Siberia. Initially, they didn't realize the bone fragment came from a wolf at all; this was only determined using a genetic test back in the laboratory. But wolves are common on the Taimyr Peninsula, and the bone could have easily belonged to a modern-day wolf. On a hunch, the researchers decided to radiocarbon date the bone anyway. It was only then that they realized what they had: a 35,000-year-old bone from an ancient Taimyr wolf.

The DNA evidence also shows that modern-day Siberian Huskies and Greenland sled dogs share an unusually large number of genes with the ancient Taimyr wolf.

DNA from this small piece of a rib bone from an ancient Taimyr wolf suggests that dogs may have been domesticated 27,000 years ago.
Credit: Love Dalén

"The power of DNA can provide direct evidence that a Siberian Husky you see walking down the street shares ancestry with a wolf that roamed Northern Siberia 35,000 years ago," Skoglund says. To put that in perspective, "this wolf lived just a few thousand years after Neanderthals disappeared from Europe and modern humans started populating Europe and Asia."


Contacts and sources: 
Joseph Caputo
Cell Press

Citation: Current Biology, Skoglund et al.: "Ancient wolf genome reveals an early divergence of domestic dog ancestors and admixture into high-latitude breeds" http://dx.doi.org/10.1016/j.cub.2015.04.019

Sight Without Eyes: Octopus Sees With Skin

The octopus has a unique ability. It can change the color, pattern and even texture of its skin not only for purposes of camouflage but also as a means of communication. The most intelligent, most mobile and largest of all mollusks, these cephalopods use their almost humanlike eyes to send signals to pigmented organs in their skin called chromatophores, which expand and contract to alter their appearance.

This is a California two-spot octopus hatchling.
Credit: UCSB

A new study by UCSB scientists has found that the skin of the California two-spot octopus (Octopus bimaculoides) can sense light even without input from the central nervous system. The animal does so by using the same family of light-sensitive proteins called opsins found in its eyes -- a process not previously described for cephalopods. The researchers' findings appear in the Journal of Experimental Biology.

Video: UCSB Researchers Study Octopus Camouflage (UC Santa Barbara, via Vimeo).


"Octopus skin doesn't sense light in the same amount of detail as the animal does when it uses its eyes and brain," said lead author Desmond Ramirez, a doctoral student in the Department of Ecology, Evolution and Marine Biology (EEMB). "But it can sense an increase or change in light. Its skin is not detecting contrast and edge but rather brightness."

These are chromatophores in their contracted state (left) and at maximum expansion (right).

Credit: UCSB

As part of the experiment, Ramirez shone white light on the tissue, which caused the chromatophores to expand and change color. When the light was turned off, the chromatophores relaxed and the skin returned to its original hue. This process, Ramirez noted, suggests that light sensors are connected to the chromatophores and that this enables a response without input from the brain or eyes. He and his co-author, Todd Oakley, an EEMB professor, dubbed the process Light-Activated Chromatophore Expansion (LACE).

In order to record the skin's sensitivity across the spectrum, Ramirez exposed octopus skin to different wavelengths of light from violet to orange and found that chromatophore response time was quickest under blue light. Molecular experiments to determine which proteins were expressed in the skin followed. Ramirez found rhodopsin -- usually produced in the eye -- in the sensory neurons on the tissue's surface.

According to Oakley, this new research suggests an evolutionary adaptation. "We've discovered new components of this really complex behavior of octopus camouflage," said Oakley, who calls cephalopods the rock stars of the invertebrate world.

"It looks like the existing cellular mechanism for light detection in octopus eyes, which has been around for quite some time, has been co-opted for light sensing in the animal's skin and used for LACE," he explained. "So instead of completely inventing new things, LACE puts parts together in new ways and combinations."

Octopuses are not the only marine mollusks whose skin can sense light, but scientists don't know yet whether the skin of those other animals contains the light-sensitive opsins. If they do, Ramirez wants to understand how these two groups are related. "Do they all come from the same ancestral source or did they evolve multiple times?" he asked. "What kind of behaviors do the different groups share, and what kind of behaviors does the skin sensing light underlie?"

Ramirez and Oakley are conducting new experiments that will seek to answer those questions and more.


Contacts and sources:
Julie Cohen
University of California - Santa Barbara  

Thursday, May 21, 2015

Weird One-of-a-Kind Star Called "Nasty 1"

Astronomers using NASA's Hubble Space Telescope have uncovered surprising new clues about a hefty, rapidly aging star whose behavior has never been seen before in our Milky Way galaxy. In fact, the star is so weird that astronomers have nicknamed it "Nasty 1," a play on its catalog name of NaSt1. The star may represent a brief transitory stage in the evolution of extremely massive stars.

Astronomers using NASA's Hubble Space Telescope have uncovered surprising new clues about a hefty, rapidly aging star whose behavior has never been seen before in our Milky Way galaxy. Astronomers have nicknamed it 'Nasty 1,' a play on its catalog name of NaSt1.

Credit: NASA/Space Telescope

First discovered several decades ago, Nasty 1 was identified as a Wolf-Rayet star, a rapidly evolving star that is much more massive than our sun. The star loses its hydrogen-filled outer layers quickly, exposing its super-hot and extremely bright helium-burning core.

But Nasty 1 doesn't look like a typical Wolf-Rayet star. The astronomers using Hubble had expected to see twin lobes of gas flowing from opposite sides of the star, perhaps similar to those emanating from the massive star Eta Carinae, which is a Wolf-Rayet candidate. Instead, Hubble revealed a pancake-shaped disk of gas encircling the star. The vast disk is nearly 2 trillion miles wide, and may have formed from an unseen companion star that snacked on the outer envelope of the newly formed Wolf-Rayet. Based on current estimates, the nebula surrounding the stars is just a few thousand years old, and as close as 3,000 light-years from Earth.

"We were excited to see this disk-like structure because it may be evidence for a Wolf-Rayet star forming from a binary interaction," said study leader Jon Mauerhan of the University of California, Berkeley. "There are very few examples in the galaxy of this process in action because this phase is short-lived, perhaps lasting only a hundred thousand years, while the timescale over which a resulting disk is visible could be only ten thousand years or less."

In the team's proposed scenario, a massive star evolves very quickly, and as it begins to run out of hydrogen, it swells up. Its outer hydrogen envelope becomes more loosely bound and vulnerable to gravitational stripping, or a type of stellar cannibalism, by a nearby companion star. In that process, the more compact companion star winds up gaining mass, and the original massive star loses its hydrogen envelope, exposing its helium core to become a Wolf-Rayet star.

Another way Wolf-Rayet stars are thought to form is when a massive star ejects its own hydrogen envelope in a strong stellar wind streaming with charged particles. The binary-interaction model, in which a companion star strips the envelope, is gaining traction because astronomers now realize that at least 70 percent of massive stars are members of double-star systems. Direct mass loss alone also cannot account for the number of Wolf-Rayet stars relative to other, less-evolved massive stars in the galaxy.

"We're finding that it is hard to form all the Wolf-Rayet stars we observe by the traditional wind mechanism, because mass loss isn't as strong as we used to think," said Nathan Smith of the University of Arizona in Tucson, who is a co-author on the new NaSt1 paper. "Mass exchange in binary systems seems to be vital to account for Wolf-Rayet stars and the supernovae they make, and catching binary stars in this short-lived phase will help us understand this process."

But the mass transfer process in mammoth binary systems isn't always efficient. Some of the stripped matter can spill out during the gravitational tussle between the stars, creating a disk around the binary.

"That's what we think is happening in Nasty 1," Mauerhan said. "We think there is a Wolf-Rayet star buried inside the nebula, and we think the nebula is being created by this mass-transfer process. So this type of sloppy stellar cannibalism actually makes Nasty 1 a rather fitting nickname."

The star's catalogue name, NaSt1, is derived from the first two letters of the surnames of the two astronomers who discovered it in 1963, Jason Nassau and Charles Stephenson.

Viewing the Nasty 1 system hasn't been easy. The system is so heavily cloaked in gas and dust that it blocks even Hubble's view of the stars, so Mauerhan's team cannot measure the mass of each star, the distance between them, or the amount of material spilling onto the companion star.

Previous observations of Nasty 1 have provided some information on the gas in the disk. The material in the outer nebula, for example, is travelling at about 22,000 miles per hour, slow compared with similar stars. That comparatively low speed indicates that the star expelled its material through a less violent event than Eta Carinae's explosive outbursts, in which the gas travels at hundreds of thousands of miles per hour.

Nasty 1 may also be shedding the material sporadically. Past studies in infrared light have shown evidence for a compact pocket of hot dust very close to the central stars. Recent observations by Mauerhan and colleagues at the University of Arizona, using the Magellan telescope at Las Campanas Observatory in Chile, have resolved a larger pocket of cooler dust that may be indirectly scattering the light from the central stars. The presence of warm dust implies that it formed very recently, perhaps in spurts, as chemically enriched material from the two stellar winds collides at different points, mixes, flows away, and cools. Sporadic changes in the wind strength or the rate the companion star strips the main star's hydrogen envelope might also explain the clumpy structure and gaps seen farther out in the disk.

To measure the hypersonic winds from each star, the astronomers turned to NASA's Chandra X-ray Observatory. The observations revealed scorching hot plasma, indicating that the winds from both stars are indeed colliding, creating high-energy shocks that glow in X-rays. These results are consistent with what astronomers have observed from other Wolf-Rayet systems.

The chaotic mass-transfer activity will end when the Wolf-Rayet star runs out of material. Eventually, the gas in the disk will dissipate, providing a clear view of the binary system.

"What evolutionary path the star will take is uncertain, but it will definitely not be boring," said Mauerhan. "Nasty 1 could evolve into another Eta Carinae-type system. To make that transformation, the mass-gaining companion star could experience a giant eruption because of some instability related to the acquiring of matter from the newly formed Wolf-Rayet. Or, the Wolf-Rayet could explode as a supernova. A stellar merger is another potential outcome, depending on the orbital evolution of the system. The future could be full of all kinds of exotic possibilities depending on whether it blows up or how long the mass transfer occurs, and how long it lives after the mass transfer ceases."


Contacts and sources:
Ray Villard
NASA/Goddard Space Flight Center 

Bronze Age Egtved Girl Found in Denmark Came From Germany's Black Forest


The Bronze Age Egtved Girl came from far away, as revealed by strontium isotope analyses of the girl's teeth. The analyses show that she was born and raised outside Denmark's current borders, and strontium isotope analyses of her hair and a thumb nail also show that she travelled great distances during the last two years of her life.

The wool from the Egtved Girl's clothing, the blanket she was covered with, and the oxhide she was laid to rest on in the oak coffin all originate from a location outside present-day Denmark.

This is the Egtved Girl's grave, from 1370 BC.
Credit:  The National Museum of Denmark

The combination of the different provenance analyses indicates that the Egtved Girl, her clothing, and the oxhide come from Schwarzwald ("the Black Forest") in South West Germany - as do the cremated remains of a six-year-old child who was buried with the Egtved Girl. The girl's coffin dates the burial to a summer day in the year 1370 BC.

Senior researcher Karin Margarita Frei, from the National Museum of Denmark and the Centre for Textile Research at the University of Copenhagen, analysed the Egtved Girl's strontium isotope signatures. The analyses were carried out in collaboration with Kristian Kristiansen from the University of Gothenburg and with the Department of Geosciences and Natural Resource Management and the Centre for GeoGenetics, both at the University of Copenhagen.

The research has been possible through the support of The Danish National Research Foundation, European Research Council, the Carlsberg Foundation and L'Oréal Denmark-UNESCO For Women in Science Award.

The results have just been published in Scientific Reports.

The girl's movements mapped month by month

Strontium is an element found in the earth's crust, but its prevalence is subject to geological variation. Humans, animals, and plants absorb strontium through water and food. By measuring the strontium isotope signatures in archaeological remains, researchers can therefore determine where humans and animals lived and where plants grew. In that sense, strontium serves as a kind of GPS for scientists.
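The principle can be illustrated with a minimal sketch: compare a measured 87Sr/86Sr ratio against baseline ranges for candidate regions. The ratios and ranges below are invented for illustration and are not the study's actual values.

    # Illustrative sketch: match a measured 87Sr/86Sr ratio against
    # hypothetical regional baseline ranges (all values are invented).
    REGIONAL_BASELINES = {
        "Jutland, Denmark": (0.7080, 0.7110),
        "Black Forest, Germany": (0.7130, 0.7250),
        "Bornholm, Baltic Sea": (0.7120, 0.7400),
    }

    def candidate_regions(measured_ratio, baselines=REGIONAL_BASELINES):
        """Return the regions whose baseline range contains the measured ratio."""
        return [region for region, (lo, hi) in baselines.items()
                if lo <= measured_ratio <= hi]

    # A single measurement is often consistent with several regions,
    # which is why the clothing and oxhide were analysed as well:
    print(candidate_regions(0.7150))

Note how a lone measurement can match several regions at once; as described below, it was the combination of signatures from teeth, hair, wool, and oxhide that narrowed the girl's origin down.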

"I have analysed the strontium isotopic signatures of the enamel from one of the Egtved Girl's first molars, which was fully formed/crystallized when she was three or four years old, and the analysis tells us that she was born and lived her first years in a region that is geologically older than and different from the peninsula of Jutland in Denmark," Karin Margarita Frei says.

Karin Margarita Frei has also traced the last two years of the Egtved Girl's life by examining the strontium isotopic signatures in the girl's 23-centimetre-long hair. The analysis shows that she had been on a long journey shortly before she died, and this is the first time that researchers have been able to so accurately track a prehistoric person's movements.

"If we consider the last two years of the girl's life, we can see that, 13 to 15 months before her death, she stayed in a place with a strontium isotope signature very similar to the one that characterizes the area where she was born. Then she moved to an area that may well have been Jutland. After a period of c. 9 to 10 months there, she went back to the region she originally came from and stayed there for four to six months before she travelled to her final resting place, Egtved. Neither her hair nor her thumb nail contains a strontium isotopic signatures which indicates that she returned to Scandinavia until very shortly before she died. As an area's strontium isotopic signature is only detectable in human hair and nails after a month, she must have come to "Denmark" and "Egtved" about a month before she passed away," Karin Margarita Frei explains.

The Black Forest Girl

If the Egtved Girl was not born in Jutland, then where did she come from? Karin Margarita Frei suggests that she came from South West Germany, more specifically the Black Forest, which is located 500 miles south of Egtved.

Considered in isolation, the Egtved Girl's strontium isotope signature could indicate that she came from Sweden, Norway, or Western or Southern Europe; she could also have come from the island of Bornholm in the Baltic Sea. But when Karin Margarita Frei combines the girl's strontium isotope signatures with those of her clothing, she can pinpoint the girl's place of origin relatively accurately.

"The wool that her clothing was made from did not come from Denmark and the strontium isotope values vary greatly from wool thread to wool thread. This proves that the wool was made from sheep that either grazed in different geographical areas or that they grazed in one vast area with very complex geology, and Black Forest's bedrock is characterized by a similarly heterogeneous strontium isotopic range," Karin Margarita Frei says.

That the Egtved Girl in all probability came from the Black Forest region in Germany comes as no surprise to professor Kristian Kristiansen from the University of Gothenburg; the archaeological finds confirm that there were close relations between Denmark and Southern Germany in the Bronze Age.

"In Bronze Age Western Europe, Southern Germany and Denmark were the two dominant centres of power, very similar to kingdoms. We find many direct connections between the two in the archaeological evidence, and my guess is that the Egtved Girl was a Southern German girl who was given in marriage to a man in Jutland so as to forge an alliance between two powerful families," Kristian Kristiansen says.

According to him, Denmark was rich in amber and traded it for bronze. In Mycenaean Greece and in the Middle East, Baltic amber was as coveted as gold, and, through middlemen in Southern Germany, large quantities of amber were transported to the Mediterranean while large quantities of bronze came to Denmark as payment. In the Bronze Age, bronze was as valuable a raw material as oil is today, so Denmark became one of the richest areas of Northern Europe.

"Amber was the engine of Bronze Age economy, and in order to keep the trade routes going, powerful families would forge alliances by giving their daughters in marriage to each other and letting their sons be raised by each other as a kind of security," Kristian Kristiansen says.

A great number of Danish Bronze Age graves contain human remains that are as well-preserved as those found in the Egtved Girl's grave. Karin Margarita Frei and Kristian Kristiansen plan to examine these remains with a view to analysing their strontium isotope signatures.


Contacts and sources:

Senior researcher Karin Margarita Frei
National Museum of Denmark and University of Copenhagen

Professor Kristian Kristiansen
University of Gothenburg

Exploring the Mysteries of Cosmic Explosions

An automated software system developed at Los Alamos National Laboratory played a key role in the discovery of supernova iPTF 14atg, which could serve as a virtual Rosetta stone, providing insight into future supernovae and their underlying physics.

A Los Alamos National Laboratory simulation of an exploding white dwarf, in which the supernova drives an expanding shock wave that collides with a torus of material accreted from a companion star. 
Credit: Los Alamos National Laboratory

"Over the past decade, rapid advances in imaging and computing technology have completely transformed time-domain astronomy," said Przemek Wozniak, the principal investigator of a Laboratory Directed Research and Development (LDRD) project that funds the Laboratory's contributions to the research. "The Intermediate Palomar Transient Factory (iPTF) is a leader among the new breed of data-intensive sky monitoring surveys that seek to discover and understand transient events of astrophysical origin."

The Laboratory is partnering with an international consortium, led by the California Institute of Technology, to conduct the iPTF project.

Type Ia supernovae, such as supernova iPTF 14atg, occur in binary systems in which two stars orbit one another and one of them is a dense white dwarf. This supernova exhibited a rarely observed phenomenon that allowed scientists to understand the underlying physics of type Ia supernovae.

"The challenge in this work is to select transients from the torrent of images and quickly identify the ones that deserve further attention," Wozniak said. "Too many transients compete for scarce resources such as observing time on large telescopes. We are developing new machine learning technology that will allow us to tackle these big data challenges."

Researchers at Los Alamos developed an automated software system based on machine learning algorithms to reliably separate real astronomical transients from false detections. Without machine learning technology, Wozniak said, it would be impossible to find events such as iPTF 14atg before it is too late for the detailed follow-up observations that tell scientists about the broad spectral energy distribution of the radiation emitted by supernovae.
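A rough illustration of this kind of real/bogus classification is sketched below using scikit-learn. The feature names, training data, and model choice are placeholders for illustration, not the Laboratory's actual pipeline.

    # Minimal real/bogus classifier sketch for transient candidates.
    # Each row is one detection from a difference image; the feature
    # columns (e.g. width, ellipticity, flux ratio) are invented here.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 4))      # placeholder feature vectors
    y = rng.integers(0, 2, size=1000)   # 1 = real transient, 0 = artifact

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)

    # Rank fresh candidates by the probability that they are real, so
    # the most promising events win scarce telescope follow-up time.
    scores = clf.predict_proba(X_test)[:, 1]
    print("highest-ranked candidate score:", scores.max())

In a production survey the classifier would be trained on vetted detections, and its output used to prioritize which candidates receive follow-up, which is the optimization Wozniak describes.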

An important piece of the puzzle in the case of iPTF 14atg came from NASA's Swift satellite, which detected the event in time to catch the rapidly fading ultraviolet radiation from the young supernova.

"This excess UV emission is strong evidence that the supernova is interacting with its surrounding medium, such as an exploding white dwarf colliding with its companion star in the so-called single degenerate scenario," said Chris Fryer, a computational scientist at Los Alamos who leads the supernova simulation and modeling group at the Laboratory.

In this model, the ejecta from an exploding degenerate object called a white dwarf collide with a normal companion star, producing a UV transient lasting at most a few days. The competing double-degenerate model, which uses a pair of merging white dwarfs, predicts no UV excess.

Wozniak said Los Alamos is at the forefront of this fast-evolving field and well equipped to make important contributions in the future. He said the main idea is to automate and optimize the entire process of selecting, vetting and prioritizing transients in order to collect the most effective follow-up observations for events that matter.




Contacts and sources:
Nancy Ambrosiano
Los Alamos National Laboratory

Supernova Ignition Surprises Astronomers


Scientists have captured the early death throes of supernovae for the first time and found that the universe's benchmark explosions are much more varied than expected.

The scientists used the Kepler space telescope to photograph three type Ia supernovae in the earliest stages of ignition. They then tracked the explosions in detail to full brightness around three weeks later, and through the subsequent decline over the next few months.

Supernova SN2012fr, just below the center of the host galaxy, outshone the rest of the galaxy for several weeks.
       
Credit: Brad Tucker and Emma Kirby

They found the initial stages of a supernova explosion did not fit with the existing theories.

"The stars all blow up uniquely. It doesn't make sense," said Dr Brad Tucker from The Australian National University (ANU).

"It's particularly weird for these supernovae because even though their initial shockwaves are very different, they end up doing the same thing."

This is a timelapse compilation of the brightness of SN 2012fr over several weeks.  
Credit: ANU

Before this study, the earliest a type Ia supernova had been glimpsed was more than 2.5 hours after ignition, by which time the explosions all followed an identical pattern.
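Pinning down how long after ignition a supernova was first seen means extrapolating its light curve backwards. One common approach, shown here as a sketch and not necessarily the method used in this study, fits the early rise with a simple expanding-fireball model, F(t) = A(t - t0)^2, and solves for the ignition time t0 (the data points are invented):

    # Fit the early flux rise with the 'fireball' model F = A*(t - t0)**2
    # to estimate the ignition time t0. Observations are invented.
    import numpy as np
    from scipy.optimize import curve_fit

    def fireball(t, A, t0):
        return A * np.clip(t - t0, 0.0, None) ** 2

    t_obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # days since first image
    f_obs = np.array([0.1, 0.9, 2.4, 4.5, 7.3])  # relative flux

    (A, t0), _ = curve_fit(fireball, t_obs, f_obs, p0=(1.0, 0.5))
    print(f"estimated ignition time: t0 = {t0:.2f} days")

Deviations of the very earliest data points from such a smooth rise are exactly the kind of signature that revealed the varied initial shockwaves reported here.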

That identical later pattern led astronomers to theorise that supernovae, the brilliant explosions of dying stars, all occur through an identical process.

Astronomers had thought supernovae all happened when a dense star steadily sucked in material from a large nearby neighbour until it became so dense that carbon in the star's core ignited.

"Somewhat to our surprise the results suggest an alternative hypothesis, that a violent collision between two smallish white dwarf stars sets off the explosion," said lead researcher Dr Robert Olling, from the University of Maryland in the United States.

At the peak of their brightness, supernovae are brighter than the combined light of the billions of stars in their galaxy. Because of this brightness, astronomers have been able to use them to calculate distances to faraway galaxies.
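The distance measurement rests on type Ia supernovae being standard candles with a nearly uniform peak luminosity. Comparing that known absolute magnitude with the observed apparent magnitude gives the distance via the distance modulus; here is a minimal worked example, in which the observed magnitude is illustrative:

    # Distance from the distance modulus: m - M = 5*log10(d_pc) - 5.
    # Type Ia supernovae peak near absolute magnitude M ~ -19.3.
    M_PEAK = -19.3      # assumed standard-candle absolute magnitude
    m_observed = 16.7   # illustrative apparent peak magnitude

    d_parsec = 10 ** ((m_observed - M_PEAK + 5) / 5)
    print(f"distance = {d_parsec / 1e6:.0f} Mpc")  # about 158 Mpc

It was systematic deviations in exactly these distances that revealed the accelerating expansion of the universe mentioned below.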

Dr. Brad Tucker discusses the first observations of supernova ignition, which are challenging theories of how supernovae form.  
Credit: ANU

Measurements of distant supernovae led to the discovery that some unknown force, now called dark energy, is causing the accelerated expansion of the universe. Brian Schmidt from the ANU, Saul Perlmutter (Berkeley) and Adam Riess (Johns Hopkins) were awarded the Nobel Prize in 2011 for this discovery.

Dr Tucker said the new results did not undermine the discovery of dark energy.

"The accelerating universe will not now go away - they will not have to give back their Nobel prizes," he said.

"The new results will actually help us to better understand the physics of supernovae, and figure out what is this dark energy that is dominating the universe."

The findings are published in Nature.

Contacts and sources:
Dr. Brad Tucker
Australian National University

Infections Can Lower I.Q.

New research shows that infections can impair cognitive ability as measured on an IQ scale. The study is the largest of its kind to date, and it shows a clear correlation between infection levels and impaired cognition.

Anyone can suffer from an infection, for example in their stomach, urinary tract or skin. However, a new Danish study shows that a patient's troubles do not necessarily end once the infection has been treated. In fact, infections can affect cognitive ability as measured by an IQ test:

"Our research shows a correlation between hospitalisation due to infection and impaired cognition corresponding to an IQ score of 1.76 lower than the average. People with five or more hospital contacts with infections had an IQ score of 9.44 lower than the average. The study thus shows a clear dose-response relationship between the number of infections, and the effect on cognitive ability increased with the temporal proximity of the last infection and with the severity of the infection. 

An example of one kind of IQ test item, modeled after items in the Raven's Progressive Matrices test
Credit: Life of Riley

Infections in the brain affected the cognitive ability the most, but many other types of infections severe enough to require hospitalisation can also impair a patient's cognitive ability. Moreover, it seems that the immune system itself can affect the brain to such an extent that the person's cognitive ability measured by an IQ test will also be impaired many years after the infection has been cured," explains MD and PhD Michael Eriksen Benrós, who is affiliated with the National Centre for Register-Based Research at Aarhus BSS and the Mental Health Centre Copenhagen, University of Copenhagen.

He has conducted the research in collaboration with researchers from the University of Copenhagen and Aarhus University.

190,000 Danes participated in the study

The study is a nationwide register study tracking 190,000 Danes born between 1974 and 1994 who had their IQ assessed between 2006 and 2012. Of these individuals, 35% had had a hospital contact with infections before the IQ testing was conducted.
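A dose-response relationship like the one reported is typically estimated with a regression of IQ on the number of infection-related hospital contacts, adjusted for confounders. The sketch below is purely illustrative; the column names, simulated data, and simple linear model are assumptions, not the study's registers or statistical methods.

    # Illustrative dose-response regression on simulated register-style data.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 5000
    df = pd.DataFrame({
        "n_infections": rng.poisson(0.6, n),           # hospital contacts
        "parental_education": rng.normal(0.0, 1.0, n), # confounder (z-score)
    })
    # Simulate an outcome in which each infection lowers IQ by ~1.8 points.
    df["iq"] = (100 + 2.0 * df["parental_education"]
                - 1.8 * df["n_infections"]
                + rng.normal(0.0, 12.0, n))

    model = smf.ols("iq ~ n_infections + parental_education", data=df).fit()
    print(model.params["n_infections"])  # recovers roughly -1.8 per infection

The adjusted coefficient on the infection count is the kind of quantity behind the reported differences (1.76 points for any infection-related hospitalisation, 9.44 for five or more contacts); the actual study additionally modeled the timing and severity of the infections.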

According to Senior Researcher Michael Eriksen Benrós, part of the explanation of the increased risk of impaired cognition following an infection may be as follows:

"Infections can affect the brain directly, but also through peripheral inflammation, which affects the brain and our mental capacity. Infections have previously been associated with both depression and schizophrenia, and it has also been proven to affect the cognitive ability of patients suffering from dementia. This is the first major study to suggest that infections can also affect the brain and the cognitive ability in healthy individuals."

"We can see that the brain is affected by all types of infections. Therefore, it is important that more research is conducted into the mechanisms which lie behind the connection between a person's immune system and mental health," says Michael Eriksen Benrós.

He hopes that learning more about this connection will help to prevent the impairment of people's mental health and improve future treatment.

Experiments on animals have previously shown that the immune system can affect cognitive capabilities, and more recent minor studies in humans have also pointed in that direction. Normally, the brain is protected from the immune system, but with infections and inflammation the brain may be affected. 

Michael Eriksen Benrós' research suggests that it may be the immune system that causes the cognitive impairment, not just the infection, because many different types of infections were associated with a decrease in cognitive abilities. This is the first study to examine these correlations in this manner. 

The results suggest that the immune system's response to infections can possibly affect the brain and thereby also the person's cognitive ability. This is in line with previous studies, some of which have also been conducted by Dr. Michael Eriksen Benrós, which show that infections are associated with an increased risk of developing mental disorders such as depression or schizophrenia.

The researchers behind the study hope that their results will spur further research into the possible involvement of the immune system in the development of psychiatric disorders, and into whether the discovered correlations contribute to the development of mental disorders or are instead explained by, for example, a genetic liability toward acquiring infections in people with reduced cognitive ability. 

The study has been adjusted for social conditions and parental educational levels; however, it cannot be ruled out that heritable and environmental factors associated with infections might also influence the associations.


Contacts and sources:
Michael Eriksen Benrós, MD, PhD, Senior Researcher
Mental Health Centre Copenhagen, University of Copenhagen
National Centre for Register-Based Research, Aarhus University