Unseen Is Free


Monday, March 31, 2014

Invisibility Cloaks, Stealth Technology Breakthrough

Controlling and bending light around an object so it appears invisible to the naked eye is the theory behind fictional invisibility cloaks.

It may seem easy in Hollywood movies, but it is hard to achieve in real life, because no natural material has the properties needed to bend light in that way. Scientists have managed to create artificial nanostructures, called metamaterials, that can do the job. But the challenge has been making enough of the material to turn science fiction into a practical reality.

The work of Debashis Chanda at the University of Central Florida, however, may have just cracked that barrier. The cover story in the March edition of the journal Advanced Optical Materials explains how Chanda and fellow optical and nanotech experts were able to develop a larger swath of multilayer 3-D metamaterial operating in the visible spectral range. They accomplished this feat by using nanotransfer printing, which can potentially be engineered to modify the surrounding refractive index needed to control the propagation of light.

Assistant Professor Chanda works with students in his lab at the UCF NanoScience Technology Center.
Credit: UCF

"Such large-area fabrication of metamaterials following a simple printing technique will enable realization of novel devices based on engineered optical responses at the nanoscale," said Chanda, an assistant professor at UCF.

The nanotransfer printing technique creates metal/dielectric composite films, which are stacked together in a 3-D architecture with nanoscale patterns for operation in the visible spectral range. Controlling electromagnetic resonances over the 3-D space by structural manipulation allows precise control over the propagation of light. With this technique, the team can create larger pieces of the special material, which had previously been limited to micron-scale sizes.
 
By improving the technique, the team hopes to be able to create larger pieces of the material with engineered optical properties, which would make it practical to produce for real-life device applications. For example, the team could develop large-area metamaterial absorbers, which would enable fighter jets to remain invisible to detection systems.


Contacts and sources:
Zenaida Kotala
University of Central Florida

'Cosmic Barometer' Could Reveal Violent Events In Universe's Past

Scientists have developed a way of reading the universe’s ‘cosmic barometer’ to learn more about ancient violent events in space.

Exploding stars, random impacts involving comets and meteorites, and even near misses between two bodies can create regions of great heat and high pressure.

Credit: Imperial College London

Researchers from Imperial College London have now developed a method for analysing the pressure experienced by tiny samples of organic material that may have been ejected from dying stars before making a long journey through the cosmos. The researchers have investigated a type of aromatic hydrocarbon called dimethylnaphthalene, which should enable them to identify violent events in the history of the universe.

Samples of dimethylnaphthalene are found in meteorites. Previously, scientists were only able to investigate how those samples had been affected by heat. The Imperial researchers say their method for detecting periods when dimethylnaphthalenes have experienced high pressure will now allow a much more comprehensive analysis of organic materials.

Dr Wren Montgomery, co-author from the Department of Earth Science and Engineering at Imperial College London, says: “The ability to detect high pressure environments in space has tremendous implications for our ability to learn more about the formation of our solar system and the universe. Dimethylnaphthalenes are like microscopic barometers and thermometers recording changes in pressure and heat as they travel through space. Understanding these changes lets us probe their history, and with that, the history of the galaxy.”

In the study, the researchers placed a sample of dimethylnaphthalene, the width of a human hair, in the vice-like grip of two anvils made of gem-quality diamonds in a laboratory at the Swiss Light Source. They then applied pressure, recreating the type of high-pressure environment that dimethylnaphthalene could experience in space. Using infrared light from the synchrotron at the facility, Dr Montgomery and her colleagues were able to clearly determine the alterations that occur in the molecular structure of dimethylnaphthalene when it experiences high pressure.

By applying different pressures, the team were able to vary the change in the molecular structure of dimethylnaphthalene, giving an insight into how different types of pressures in space would alter the molecular structure of the organic material.

The researchers also recreated the experiments at the Paul Scherrer Institut in Switzerland and SOLEIL Synchrotron in France to verify their research.

The next step will see the team carrying out more lab work where they will be subjecting other types of aromatic hydrocarbons to a range of pressures experienced in space. Dimethylnaphthalene may not always be present in rock samples, so the researchers say it is important to build up a comprehensive catalogue of all aromatic hydrocarbons to understand more about high pressure zones.

This catalogue would be used by scientists in the field to detect molecular markers in their samples that indicate a particular pressure range. Combined with data about the mineralogy and chemistry of the space rock that the aromatic hydrocarbons are encased in, scientists could then deduce the types of violent events that the sample may have been exposed to many millions or billions of years ago on its way to Earth.

The team also believe that their new technique could be applied on Mars, potentially using the existing technology on board roving laboratories such as the one on the Mars Science Laboratory mission to glean information about sources of organic matter on the red planet. Recognising the pressures recorded in the aromatic hydrocarbons could help reveal whether the organic matter came from processes associated with ancient living organisms.

Professor Mark Sephton, co-author from the Department of Earth Science and Engineering at Imperial, says: “We now have another instrument to add to our celestial toolbox, which will help us to learn more about high pressure environments in space. Massive heat and pressure waves arcing out through space from cataclysmic events leave an indelible record in these cosmic barometers. It is really exciting to know that we now have a technique at our disposal that will help to reveal pivotal moments in the universe’s history.”

The research is published tomorrow in The Astrophysical Journal.


Contacts and sources:
By Colin Smith
Imperial College London



Citation: Wren Montgomery, Jonathan S. Watson and Mark Sephton (Imperial College London, South Kensington Campus), "An organic cosmo-barometer: distinct pressure and temperature effects for methyl substituted polycyclic aromatic hydrocarbons", The Astrophysical Journal, published in hard copy on Tuesday 1 April 2014.

Friday, March 28, 2014

Video: First Time Explosive Mechanism Of Solar Flare Creation Witnessed

Scientists have for the first time witnessed the mechanism behind explosive energy releases in the Sun's atmosphere, confirming new theories about how solar flares are created.

Video of magnetic field lines undergoing 'slipping reconnection' brings scientists a step closer to predicting when and where large flares will occur

  

New footage put together by an international team led by University of Cambridge researchers shows how entangled magnetic field lines looping from the Sun's surface slip around each other and lead to an eruption 35 times the size of the Earth and an explosive release of magnetic energy into space.

This is an image of a solar flare. Scientists have for the first time witnessed the mechanism behind explosive energy releases in the Sun's atmosphere.

Credit: NASA/SDO and AIA

The discovery of how this gigantic energy build-up is released brings us a step closer to predicting when and where large flares will occur, which is crucial for protecting the Earth from potentially devastating space weather. The study is published in The Astrophysical Journal.

While solar flares have long been a spectacular reminder of our star's power, they are also associated with Coronal Mass Ejections (CMEs) – eruptions of solar material with a twisted magnetic structure flying out of the Sun and into interplanetary space.

Space weather such as CMEs has been identified as a significant risk to the UK's infrastructure by the country's National Risk Register. Late last year, the UK's Met Office announced it would set up a daily space weather forecast in partnership with the USA's Space Weather Prediction Center (SWPC).

The paper's lead author, Dr Jaroslav Dudik, Royal Society Newton International Fellow at the University of Cambridge's Centre for Mathematical Sciences, said: "We care about this as during flares we can have CMEs and sometimes they are sent in our direction. Human civilisation is nowadays maintained by technology and that technology is vulnerable to space weather. Indeed, CMEs can damage satellites and therefore have an enormous financial cost."

"They can also threaten airlines by disturbing the Earth's magnetic field. Very large flares can even create currents within electricity grids and knock out energy supplies."

One such event hit the Earth before technology was as integrated into human civilization as it is now, but still had a marked effect. In 1859 the Carrington storm made night skies so bright that newspapers could be read as easily as in daylight and telegraph systems caught fire.

Knowing the standard scientific models are right is therefore very important. The standard 3D model of solar flares has shown that they occur in places where the magnetic field is highly distorted.

In these places, the magnetic field lines can continuously reconnect while slipping and flipping around each other. In doing so, new magnetic structures are created.

Long before the flare the magnetic field lines are un-entangled and they appear in a smooth arc between two points on the photosphere (the Sun's visible surface) – areas called field line footpoints.

In a smooth, non-entangled arc the magnetic energy levels are low, but entanglement occurs naturally as the footpoints move about each other. Their movement is driven by powerful convection currents rising and falling beneath the photosphere, which jostle the footpoints from below.

As the movement continues the entanglement of field lines causes magnetic energy to build up.

Like a group of straight cords which has been twisted, the lines will hold the energy until it becomes too great and then will release it, "straightening" back to the lower energy state.

Co-author Dr Helen Mason, Head of the Atomic Astro-Physics Group at the University of Cambridge, said: "You build the stress slowly until a point where they are no longer sustainable. The field lines say they have had enough and 'ping', they go back to something simple."

That "ping" creates the solar flare and CME. The word "ping" belies its power, of course: temperatures in the hotspots of the ejection can reach almost 20 million degrees Celsius.

The theory remained unconfirmed until Dudik was reviewing footage of the Sun for an unrelated project last year.

It is no surprise it has taken so long to make the discovery. The technology that created the video is part of the Solar Dynamics Observatory (SDO) satellite mission which was only launched in 2010 by NASA.

It watches the Sun in the ultra-violet with the Atmospheric Imaging Assembly (AIA) capturing ultra-high-definition images every 12 seconds.

The final piece of the theoretical jigsaw was put in place in 2012 by French scientists, in a paper published just six days before the flare occurred. Dudik admits that the serendipity of the discovery is hard to ignore. But in science, fortune favours the prepared: "Suddenly I knew what I was looking at," he said.

What Dudik witnessed was the ultra-violet dance caused by the magnetic field lines slipping around each other, continuously "unzipping" and reconnecting as the footpoints of the flare loops move around on the surface. But during the flare, the footpoint slipping motion is highly ordered and much faster than the random motions entangling the field before the flare.

Dudik's observations were helped by the sheer size of the flare he was looking at – it could encompass 35 Earths. Not only that, the flare was of the most energetic kind, known as an X Class flare, and it took around an hour to reach its maximum.

If it had happened in a smaller flare, the slipping motion might not have been visible, even with NASA's technology to help. Although only seen in an X Class flare to date, the mechanism might well be something which happens in all flares, said Dudik: "But we are not yet certain."

The importance of backing theory with observational evidence cannot be overstated, said Dr Mason: "In recent years there have been a lot of developments theoretically but unless you actually tie that down with observations you can speculate widely and move further away from the truth, not closer, without knowing it."



Contacts and sources:
Paul Holland
University of Cambridge

Autism Spiraling Out Of Control Says CDC: 10 Things You Need To Know

New data from CDC's Autism and Developmental Disabilities Monitoring (ADDM) Network show that the estimated number of children identified with autism spectrum disorder (ASD) continues to rise, and the picture of ASD in communities has changed. These new data can be used to promote early identification, plan for training and service needs, guide research, and inform policy so that children with ASD and their families get the help they need. 

CDC will continue tracking the changing number and characteristics of children with ASD, researching what puts children at risk for ASD, and promoting early identification, the most powerful tool we have now for making a difference in the lives of children. Learn the 10 things you need to know about CDC's latest ADDM Network report. You can also read the full report here.

10 Things You Need To Know About CDC's Latest Report from the Autism and Developmental Disabilities Monitoring Network

The following estimates are based on information collected from the health and special education (if available*) records of children who were 8 years old and lived in areas of Alabama, Arizona, Arkansas, Colorado, Georgia, Maryland, Missouri, New Jersey, North Carolina, Utah, and Wisconsin in 2010:

  1. About 1 in 68 children (or 14.7 per 1,000 8 year olds) were identified with ASD. It is important to remember that this estimate is based on 8-year-old children living in 11 communities. It does not represent the entire population of children in the United States.
  2. This new estimate is roughly 30% higher than the estimate for 2008 (1 in 88), roughly 60% higher than the estimate for 2006 (1 in 110), and roughly 120% higher than the estimates for 2002 and 2000 (1 in 150). We don't know what is causing this increase. Some of it may be due to the way children are identified, diagnosed, and served in their local communities, but exactly how much is unknown.
  3. The number of children identified with ASD varied widely by community, from 1 in 175 children in areas of Alabama to 1 in 45 children in areas of New Jersey.
  4. Almost half (46%) of children identified with ASD had average or above average intellectual ability (IQ greater than 85).
  5. Boys were almost 5 times more likely to be identified with ASD than girls. About 1 in 42 boys and 1 in 189 girls were identified with ASD.
  6. White children were more likely to be identified with ASD than black or Hispanic children. About 1 in 63 white children, 1 in 81 black children, and 1 in 93 Hispanic children were identified with ASD.
  7. Less than half (44%) of children identified with ASD were evaluated for developmental concerns by the time they were 3 years old.
  8. Most children identified with ASD were not diagnosed until after age 4, even though children can be diagnosed as early as age 2.
  9. Black and Hispanic children identified with ASD were more likely than white children to have intellectual disability. A previous study has shown that children identified with ASD and intellectual disability have a greater number of ASD symptoms and a younger age at first diagnosis. Despite the greater burden of co-occurring intellectual disability among black and Hispanic children with ASD, these new data show that there was no difference among racial and ethnic groups in the age at which children were first diagnosed.
  10. About 80% of children identified with ASD either received special education services for autism at school or had an ASD diagnosis from a clinician. This means that the remaining 20% of children identified with ASD had symptoms of ASD documented in their records, but had not yet been classified as having ASD by a community professional in a school or clinic.
Why is this information important and how can it be used?

CDC has been at the forefront of documenting changes in the number of children identified with ASD over the past decade. CDC data have motivated research to understand who is likely to develop ASD, why ASD develops, and how to best support individuals, families, and communities affected by ASD. More is understood about ASD than ever before, including which children are more likely to be identified, at what age they are likely to be diagnosed, and what factors may be putting children at risk for ASD. However, there remains an urgent need to continue the search for answers and provide help to people living with ASD.

The ADDM Network's latest information directs the focus on what we know now and what else we need to know to further characterize and address the needs of children with ASD and their families. Service providers (such as healthcare organizations and school systems), researchers, and policymakers can use ADDM Network data to support service planning, guide research into what factors put a child at risk for ASD and what interventions can help, and inform policies that promote improved outcomes in health care and education.
As a professional who works with children, what should I do if I think a child might have ASD?

You are a valuable resource to parents. They look to you for information on their child, and they trust you. You can follow a child's development, and encourage parents to do the same, by looking for developmental milestones—that is, how he or she plays, learns, speaks, acts, and moves. Visit CDC's "Learn the Signs. Act Early." website for free milestone checklists and other resources to help you track children's development.

The American Academy of Pediatrics recommends that children be screened for general development using standardized, validated tools at 9, 18, and 24 or 30 months and for ASD at 18 and 24 months or whenever a parent or provider has a concern. Learn more at www.cdc.gov/ncbddd/childdevelopment/screening.html.

Additional Resources
To learn more about autism spectrum disorder, visit www.cdc.gov/autism
To learn more about CDC's "Learn the Signs. Act Early" program, visit www.cdc.gov/ActEarly
To learn more about CDC's Study to Explore Early Development, visit www.cdc.gov/SEED

*Education records were either not available or available for only some children in 5 of the 11 sites.

Thursday, March 27, 2014

Satellite Time-Lapse Movie Shows U.S. East Coast Snowy Winter

A new time-lapse animation of data from NOAA's GOES-East satellite provides a good picture of why the U.S. East Coast experienced a snowier than normal winter. The new animation shows the movement of storms from January 1 to March 24.

This new animation of NOAA's GOES-East satellite imagery shows the movement of winter storms from January 1 to March 24 making for a snowier-than-normal winter along the U.S. East coast and Midwest.

Image Credit: NASA/NOAA GOES Project

Imagery from NOAA's Geostationary Operational Environmental Satellite, or GOES-East, covering January 1 to March 24 was compiled into three videos by the NASA/NOAA GOES Project at NASA's Goddard Space Flight Center in Greenbelt, Md. The three time-lapse videos cover the same period at different speeds, running 41 seconds, 1 minute 22 seconds and 2 minutes 44 seconds.

The movie of mid-day views from NOAA's GOES-East satellite ends three days after the vernal equinox. The vernal, or spring, equinox in the Northern Hemisphere occurred on March 20 at 12:57 p.m. EDT and marked the astronomical arrival of spring.

"The once-per-day imagery creates a stroboscopic slide show of persistent brutal winter weather," said Dennis Chesters of the NASA/NOAA GOES Project at NASA's Goddard Space Flight Center in Greenbelt, Md. who created the animation.

To create the video and imagery, NASA/NOAA's GOES Project takes the cloud data from NOAA's GOES-East satellite and overlays it on a true-color image of land and ocean created by data from the Moderate Resolution Imaging Spectroradiometer, or MODIS, instrument that flies aboard NASA's Aqua and Terra satellites. Together, those data created the entire picture of the storm and show its movement. After the storm system passes, the snow on the ground becomes visible.

Credit: NASA

According to NOAA's National Weather Service (NWS), as of the first day of spring Washington, D.C. had received 30.3 inches of snow for the 2013-2014 winter season. Washington's average winter snowfall is 15.3 inches, so the snowfall for the Nation's Capital was almost double that, exceeding it by 15.0 inches. An early spring snow on March 25 is expected to add to that total.

Further north in Boston, Mass., snowfall totals were even higher. The NWS reported that since July 1, 2013, 58.6 inches of snow had fallen in Boston. The average snowfall is 40.8 inches, so Boston was 17.8 inches over its normal snowfall.

The big snow story this winter has been across the Great Lakes region, which has also seen record amounts of snowfall. According to the NWS in Buffalo, the city had received 121.7 inches, or more than 10 feet of snow, as of March 24. Chicago has received 80 inches of snow, more than double its annual average of 34.4 inches.
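
The departures from normal quoted above are simple arithmetic; a minimal Python check, using only the NWS figures cited in this article, confirms them:

    # Season snowfall vs. normal (inches), from the NWS figures quoted above.
    totals = {
        "Washington, D.C.": (30.3, 15.3),
        "Boston": (58.6, 40.8),
        "Chicago": (80.0, 34.4),
    }

    for city, (observed, normal) in totals.items():
        surplus = observed - normal
        ratio = observed / normal
        print(f"{city:17s}: {surplus:+5.1f} in. above normal ({ratio:.1f}x the average)")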

GOES satellites provide the kind of continuous monitoring necessary for intensive data analysis. Geostationary describes an orbit in which a satellite is always in the same position with respect to the rotating Earth. This allows GOES to hover continuously over one position on Earth's surface, appearing stationary. As a result, GOES provide a constant vigil for the atmospheric "triggers" for severe weather conditions such as tornadoes, flash floods, hail storms and hurricanes.
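
As a rough illustration of what "geostationary" means in practice, Kepler's third law fixes the single altitude at which an orbital period matches one sidereal day. The short Python sketch below uses standard constants only, nothing specific to the GOES spacecraft, and reproduces the familiar figure of roughly 35,800 km above the surface.

    import math

    GM_EARTH = 3.986004418e14   # Earth's gravitational parameter (m^3/s^2)
    SIDEREAL_DAY = 86164.1      # one rotation of the Earth (s)
    EARTH_RADIUS = 6.371e6      # mean Earth radius (m)

    # Kepler's third law: T^2 = 4*pi^2 * r^3 / GM, solved for the orbital radius r.
    r = (GM_EARTH * SIDEREAL_DAY**2 / (4 * math.pi**2)) ** (1 / 3)
    print(f"Geostationary altitude: ~{(r - EARTH_RADIUS) / 1e3:,.0f} km")  # ~35,786 km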


Contacts and sources:
Rob Gutro
NASA/Goddard Space Flight Center

2014 Major League Baseball projections By NJIT Mathematician And Baseball Guru

As Opening Day rapidly approaches for most Major League Baseball teams, NJIT Associate Professor of Mathematical Sciences Bruce Bukiet has prepared his annual MLB projections for the upcoming season. And, to the chagrin of loyal Mets fan Bukiet, New York's National League club looks to be in store for a disappointing year. Bukiet, who developed a mathematical model for calculating expected MLB win totals that was published in Operations Research, forecasts a mere 68 wins and a last-place finish for the Metropolitans.

Dodger Stadium
Credit: Wikipedia

Bukiet's model can be used to project the number of games a team should be expected to win, the optimal batting order for a set of 9 batters, and how trading players will likely influence a team's number of wins. "This all began when I, because I am not very big or powerful, set out to prove that a singles hitter who gets on base frequently would contribute more to winning than a slugger who strikes out a lot," Bukiet recalls. "What I found was the opposite—the slugger will generate more wins."
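
Bukiet's published model is considerably more sophisticated than anything shown here, so the toy Monte Carlo below is only a sketch of the kind of comparison he describes: a lineup of high-contact singles hitters versus a lineup of high-strikeout sluggers, with all hit probabilities invented purely for illustration.

    import random

    # Toy half-inning simulator -- NOT Bukiet's model. All rates are made up.
    PROFILES = {
        "singles hitter": {"single": 0.30, "homer": 0.01},  # high contact, little power
        "slugger":        {"single": 0.15, "homer": 0.08},  # more outs, more home runs
    }

    def simulate_half_inning(profile, rng):
        """Runs scored in one half-inning by a lineup of identical hitters."""
        runs, outs, bases = 0, 0, [False, False, False]  # runners on 1st, 2nd, 3rd
        while outs < 3:
            r = rng.random()
            if r < profile["homer"]:
                runs += 1 + sum(bases)
                bases = [False, False, False]
            elif r < profile["homer"] + profile["single"]:
                if bases[2]:
                    runs += 1                       # runner on third scores
                bases = [True, bases[0], bases[1]]  # everyone else moves up one base
            else:
                outs += 1
        return runs

    rng = random.Random(0)
    for name, profile in PROFILES.items():
        mean = sum(simulate_half_inning(profile, rng) for _ in range(50_000)) / 50_000
        print(f"{name:14s}: {mean:.2f} runs per half-inning (toy numbers)")

With these made-up rates the slugger lineup scores more, in line with the finding Bukiet describes, though the outcome of such a toy depends entirely on the probabilities chosen.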

For the 2014 season, Bukiet's model pegs Boston, Detroit, and Oakland as American League Division winners, with Anaheim and Seattle narrowly edging Tampa Bay and the New York Yankees in the AL Wildcard chase. In the National League, the numbers say St. Louis, Washington, and Los Angeles will take the top spots in their respective divisions. San Francisco and Atlanta are predicted to fill the Wildcard slots.

National League (projected wins)
East: Washington 94, Atlanta 90, Philadelphia 79, Miami 69, New York Mets 68
Central: St. Louis 95, Milwaukee 86, Cincinnati 82, Pittsburgh 76, Chicago Cubs 57
West: Los Angeles 95, San Francisco 88, Arizona 83, San Diego 82, Colorado 67


American League (projected wins)
East: Boston 96, Tampa Bay 86, New York Yankees 86, Toronto 82, Baltimore 73
Central: Detroit 99, Kansas City 82, Cleveland 80, Chicago White Sox 65, Minnesota 63
West: Oakland 93, Anaheim 87, Seattle 87, Texas 85, Houston 55

"There are some unknowns that the model can't incorporate in projecting team win totals before the season, such as rookie performance and trades that have not yet occurred, but, sadly for my Mets, the forecasts have been very accurate," Bukiet noted. In fact, Bukiet's preseason expectations for the Mets have been within 3 games of the win total attained by the team in 9 of the last 10 seasons.

Of his annual projections, Bukiet said, "I publish these to promote the power and relevance of math. Applying mathematical models to things that people care about or enjoy, like baseball, shows that math can be fun as well as very useful."


Contacts and sources:

Bukiet, who serves as an associate professor of mathematical sciences and as associate dean of the College of Science and Liberal Arts at the New Jersey Institute of Technology, has done extensive research into the mathematical modeling of physical phenomena, including the healing of wounds, detonation waves, and the dynamics of human balance. He also has applied modeling to sports and gambling. http://www.egrandslam.com.

 




 
 

Record Quantum Entanglement Of Multiple Dimensions

An international team of researchers, directed by researchers from the Institute for Quantum Optics and Quantum Information of the Austrian Academy of Sciences, and with participation from the Universitat Autònoma de Barcelona, has managed to create an entanglement of 103 dimensions with only two photons. The previous record was 11 dimensions. The discovery could represent a great advance toward the construction of quantum computers with much higher processing speeds than current ones, and toward better encryption of information.

An example of a two-dimensional subspace is shown. The intensities and phases for two different modes in the z basis are demonstrated, and their superposition leads to a mode in the x basis. The y basis can be constructed similarly. 
Credit: PNAS 

The states in which elementary particles, such as photons, can be found have properties which are beyond common sense. Superpositions are produced, such as the possibility of being in two places at once, which defies intuition. In addition, when two particles are entangled a connection is generated: measuring the state of one (whether they are in one place or another, or spinning one way or another, for example) affects the state of the other particle instantly, no matter how far away from each other they are.

Scientists have spent years combining both properties to construct networks of entangled particles in a state of superposition. This in turn allows constructing quantum computers capable of operating at unimaginable speeds, encrypting information with total security and conducting experiments in quantum mechanics which would be impossible to carry out otherwise.

Until now, in order to increase the "computing" capacity of these particle systems, scientists have mainly turned to increasing the number of entangled particles, each of them in a two-dimensional state of superposition: a qubit (the quantum equivalent of an information bit, but with values that can be 1, 0 or a superposition of both). Using this method, scientists managed to entangle up to 14 particles, a real multitude given the experimental difficulty involved.

The research team was directed by Anton Zeilinger and Mario Krenn from the Institute for Quantum Optics and Quantum Information of the Austrian Academy of Sciences. It included the participation of Marcus Huber, researcher from the Group of Quantum Information and Quantum Phenomena from the UAB Department of Physics, as well as visiting researcher at the Institute of Photonic Sciences (ICFO). The team has advanced one more step towards improving entangled quantum systems.

In an article published this week in the journal Proceedings of the National Academy of Sciences (PNAS), the scientists describe how they managed to achieve a quantum entanglement of at least 103 dimensions with only two particles. "We have two Schrödinger cats which could be alive, dead, or in 101 other states simultaneously," Huber jokes, "plus, they are entangled in such a way that what happens to one immediately affects the other." The result sets a record for quantum entanglement of multiple dimensions with two particles, previously established at 11 dimensions.

Instead of entangling many particles with a qubit of information each, the scientists generated a single pair of entangled photons in which each photon could be in more than one hundred states, or in any superposition of these states; something much easier than entangling many particles. These highly complex states correspond to different modes in which the photons may find themselves, each with its own characteristic distribution of phase, angular momentum and intensity.
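
To make the idea of high-dimensional entanglement concrete, here is a minimal numerical sketch in Python. It builds an idealised maximally entangled pair of 103-dimensional systems (a convenient stand-in, not the actual photonic modes used in the experiment) and checks two standard diagnostics, the Schmidt rank and the entanglement entropy.

    import numpy as np

    d = 103  # dimensions per particle, matching the entanglement reported here

    # Maximally entangled two-qudit state |psi> = (1/sqrt(d)) * sum_k |k>|k>,
    # written as a d x d coefficient matrix.
    psi = np.zeros((d, d))
    np.fill_diagonal(psi, 1.0 / np.sqrt(d))

    # Schmidt coefficients are the singular values of the coefficient matrix.
    schmidt = np.linalg.svd(psi, compute_uv=False)
    probs = schmidt**2
    entropy = float(-np.sum(probs * np.log2(probs)))

    print(f"Schmidt rank: {int(np.sum(schmidt > 1e-12))}")   # 103
    print(f"Entanglement entropy: {entropy:.2f} bits")       # log2(103), about 6.69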

"This high dimension quantum entanglement offers great potential for quantum information applications. In cryptography, for example, our method would allow us to maintain the security of the information in realistic situations, with noise and interference. In addition, the discovery could facilitate the experimental development of quantum computers, since this would be an easier way of obtaining high dimensions of entanglement with few particles", explains UAB researcher Marcus Huber.

Now that the results demonstrate that obtaining high-dimensional entanglement is feasible, the scientists conclude in the article that the next step will be to investigate how these hundreds of spatial modes of the photons can be controlled experimentally in order to carry out quantum computing operations.


Contacts and sources:
Universitat Autònoma de Barcelona

Inspiration Linked To Bipolar Disorder Risk

Inspiration has been linked with people at risk of developing bipolar disorder for the first time in a study led by Lancaster University.

For generations, artists, musicians, poets and writers have described personal experiences of mania and depression, highlighting the unique association between creativity and bipolar disorder - experiences which are backed up by recent research. But, until now, the specific links between inspiration - the generation of ideas that form the basis of creative work - and bipolar disorder have received little attention.

Bipolar disorder, also known as manic-depressive illness, is a brain disorder that causes unusual shifts in mood, energy, activity levels, and the ability to carry out day-to-day tasks. Symptoms of bipolar disorder are severe. They are different from the normal ups and downs that everyone goes through from time to time. Bipolar disorder symptoms can result in damaged relationships, poor job or school performance, and even suicide. But bipolar disorder can be treated, and people with this illness can lead full and productive lives.

Credit: NIMH

Scientists are studying the possible causes of bipolar disorder. Most scientists agree that there is no single cause. Rather, many factors likely act together to produce the illness or increase risk. Bipolar disorder often develops in a person's late teens or early adult years. At least half of all cases start before age 25. Some people have their first symptoms during childhood, while others may develop symptoms late in life.

New research by Professor Steven Jones and Dr Alyson Dodd, of Lancaster University, and Dr June Gruber at Yale University, has shown that people at higher risk of developing bipolar disorder consistently report stronger experiences of inspiration than those at lower risk.

The paper ‘Development and Validation of a New Multidimensional Measure of Inspiration: Associations with Risk for Bipolar Disorder’, published in PLOS One this week, found a specific link between those people who found their source of inspiration within themselves and risk for bipolar disorder.

Professor Jones, co-director of Lancaster University’s Spectrum Centre, said: “It appears that the types of inspiration most related to bipolar vulnerability are those which are self-generated and linked with strong drive for success.

“Understanding more about inspiration is important because it is a key aspect of creativity which is highly associated with mental health problems, in particular bipolar disorder. People with bipolar disorder highly value creativity as a positive aspect of their condition. This is relevant to clinicians, as people with bipolar disorder may be unwilling to engage with treatments and therapies which compromise their creativity.”

As part of the study, 835 undergraduate students were recruited to complete online questionnaires from both Yale University in the U.S. and Lancaster University in the U.K.

They were asked to complete a questionnaire which measured their bipolar risk using The Hypomanic Personality Scale (HPS), a widely used and well-validated 48-item measure that captures episodic shifts in emotion, behaviour and energy.

They also completed a new questionnaire developed by the team which was designed to explore beliefs about inspiration, in particular its sources - whether individuals thought it came from within themselves, from others or from the wider environment. This measure was called the EISI (External and Internal Sources of Inspiration) measure.

The students who scored highly for a risk of bipolar also consistently scored more highly than the others for levels of inspiration and for inspiration which they judged to have come from themselves.

The researchers say that although this pattern was consistent, the effect sizes were relatively modest. So, although inspiration and bipolar risk are linked, it is important to explore other variables to get a fuller picture, and to conduct further research with individuals with a clinical diagnosis of bipolar disorder.

The research team is currently inviting UK-based individuals with a diagnosis of bipolar disorder to take part in an online survey exploring associations between inspiration, mood and recovery. Go to:


Contacts and sources:
Lancaster University
NIMH

Mars-Bound Comet Sprouts Multiple Jets Seen By Hubble Space Telescope


Comet Siding Spring is plunging toward the Sun along a roughly 1-million-year orbit. The comet, discovered in 2013, was within the radius of Jupiter's orbit when the Hubble Space Telescope photographed it on March 11, 2014. Hubble resolves two jets of dust coming from the solid icy nucleus. 

Credit: NASA, ESA, and J.-Y. Li (Planetary Science Institute)

These persistent jets were first seen in Hubble pictures taken on Oct. 29, 2013. The feature should allow astronomers to measure the direction of the nucleus's pole, and hence, rotation axis. The comet will make its closest approach to our Sun on Oct. 25, 2014, at a distance of 130 million miles, well outside Earth's orbit. 

On its inbound leg, Comet Siding Spring will pass within 84,000 miles of Mars on Oct. 19, 2014, which is less than half the Moon's distance from Earth. The comet is not expected to become bright enough to be seen by the naked eye.

[Left]  This is a Hubble Space Telescope picture of comet C/2013 A1 Siding Spring as observed on March 11, 2014. At that time the comet was 353 million miles from Earth. The solid icy nucleus is too small to be resolved by Hubble, but it lies at the center of a dust cloud, called a coma, that is roughly 12,000 miles across in this image.

Credit: NASA, ESA, and J.-Y. Li (Planetary Science Institute)

[Right]  When the glow of the coma is subtracted through image processing, which incorporates a smooth model of the coma's light distribution, Hubble resolves what appear to be two jets of dust coming off the nucleus in opposite directions. This means that only portions of the surface of the nucleus are presently active as they are warmed by sunlight, say researchers. These jets were first seen in Hubble pictures taken on Oct. 29, 2013. The feature should allow astronomers to measure the direction of the nucleus's pole, and hence, rotation axis.

Discovered in January 2013 by Robert H. McNaught at Siding Spring Observatory in New South Wales, Australia, the comet is falling toward the Sun along a roughly 1-million-year orbit and is now within the radius of Jupiter's orbit. The comet will make its closest approach to our Sun on Oct. 25, at a distance of 130 million miles — well outside Earth's orbit. On its inbound leg, Comet Siding Spring will pass within 84,000 miles of Mars on Oct. 19, 2014 (less than half the Moon's distance from Earth). The comet is not expected to become bright enough to be seen by the naked eye.

An earlier Hubble observation made on Jan. 21, 2014, caught the comet as Earth was crossing the comet's orbital plane. This special geometry allows astronomers to better determine the speed of the dust coming off the nucleus. "This is critical information that we need to determine how likely and how much the dust grains in the coma will impact Mars and Mars spacecraft," said Jian-Yang Li of the Planetary Science Institute in Tucson, Ariz.

This visible-light image was taken with Hubble's Wide Field Camera 3.

Compass and Scale Image for Comet C/2013 A1 Siding Spring (3 Epochs)
Credit: NASA, ESA, and J.-Y. Li (Planetary Science Institute)
 
This is a series of Hubble Space Telescope pictures of comet C/2013 A1 Siding Spring as observed on Oct. 29, 2013; Jan. 21, 2014; and March 11, 2014. The distances from Earth were, respectively, 376 million miles, 343 million miles, and 353 million miles. The solid icy nucleus is too small to be resolved by Hubble, but it lies at the center of a dusty coma that is roughly 12,000 miles across in these images.

Credit: NASA, ESA, and J.-Y. Li (Planetary Science Institute)

When the glow of the coma is subtracted through image processing, which incorporates a smooth model of the coma's light distribution, Hubble resolves what appear to be two jets of dust coming off the nucleus in opposite directions. The jets have persisted through the three Hubble visits, with their directions in the sky nearly unchanged. These visible-light images were taken with Hubble's Wide Field Camera 3.
 

Engineered Bacteria Makes Rocket Fuel

Researchers at the Georgia Institute of Technology and the Joint BioEnergy Institute have engineered a bacterium to synthesize pinene, a hydrocarbon produced by trees that could potentially replace high-energy fuels, such as JP-10, in missiles and other aerospace applications. With improvements in process efficiency, the biofuel could supplement limited supplies of petroleum-based JP-10, and might also facilitate development of a new generation of more powerful engines.
 
By inserting enzymes from trees into the bacterium, first author and Georgia Tech graduate student Stephen Sarria, working under the guidance of assistant professor Pamela Peralta-Yahya, boosted pinene production six-fold over earlier bioengineering efforts. Though a more dramatic improvement will be needed before pinene dimers can compete with petroleum-based JP-10, the scientists believe they have identified the major obstacles that must be overcome to reach that goal.

Georgia Tech researchers examine the production of the hydrocarbon pinene in a series of laboratory test tubes. Shown are (l-r) Pamela Peralta-Yahya, an assistant professor in the School of Chemistry and Biochemistry and the School of Chemical and Biomolecular Engineering, and Stephen Sarria, a graduate student in the School of Chemistry and Biochemistry.
Credit: Georgia Tech Photo: Rob Felt

Funded by Georgia Tech startup funds awarded to Peralta-Yahya's lab and by the U.S. Department of Energy's Office of Science, the research was reported February 27, 2014, in the journal ACS Synthetic Biology.

"We have made a sustainable precursor to a tactical fuel with a high energy density," said Peralta-Yahya, an assistant professor in the School of Chemistry and Biochemistry and the School of Chemical and Biomolecular Engineering at Georgia Tech. "We are concentrating on making a 'drop-in' fuel that looks just like what is being produced from petroleum and can fit into existing distribution systems."

Fuels with high energy densities are valuable in applications where minimizing fuel weight matters. The gasoline used to power automobiles and the diesel used mainly in trucks both contain less energy per liter than JP-10. The molecular arrangement of JP-10, which includes multiple strained rings of carbon atoms, accounts for its higher energy density.


By placing colonies of E. coli engineered to produce pinene into test tubes containing glucose, researchers were able to determine which enzyme combinations produced the hydrocarbon most efficiently.
Credit: Georgia Tech Photo: Rob Felt

The amount of JP-10 that can be extracted from each barrel of oil is limited, and sources of potentially comparable compounds such as trees can't provide much help. The limited supply drives the price of JP-10 to around $25 per gallon. That price point gives researchers working on a biofuel alternative a real advantage over scientists working on replacing gasoline and diesel.

"If you are trying to make an alternative to gasoline, you are competing against $3 per gallon," Peralta-Yahya noted. "That requires a long optimization process. Our process will be competitive with $25 per gallon in a much shorter time."

While much research has gone into producing ethanol and bio-diesel fuels, comparatively little work has been done on replacements for the high-energy JP-10.

Peralta-Yahya and collaborators set out to improve on previous efforts by studying alternative enzymes that could be inserted into the E. coli bacterium. They settled on two classes of enzymes – three pinene synthases (PS) and three geranyl diphosphate synthases (GPPS) – and experimented to see which combinations produced the best results.

Their results were much better than earlier efforts, but the researchers were puzzled because for a different hydrocarbon, similar enzymes produced more fuel per liter. So they tried an additional step to improve their efficiency. They placed the two enzymes adjacent to one another in the E. coli cells, ensuring that molecules produced by one enzyme would immediately contact the other. That boosted their production to 32 milligrams per liter – much better than earlier efforts, but still not competitive with petroleum-based JP-10.

Peralta-Yahya believes the problem now lies with built-in process inhibitions that will be more challenging to address.


Pamela Peralta-Yahya, an assistant professor in the Georgia Tech School of Chemistry and Biochemistry and the School of Chemical and Biomolecular Engineering, shows samples used to study the production of pinene by colonies of bioengineered E. coli.
Credit: Georgia Tech Photo: Rob Felt


"We found that the enzyme was being inhibited by the substrate, and that the inhibition was concentration-dependent," she said. "Now we need either an enzyme that is not inhibited at high substrate concentrations, or we need a pathway that is able to maintain low substrate concentrations throughout the run. Both of these are difficult, but not insurmountable, problems."

To be competitive, the researchers will have to boost their production of pinene 26-fold. Peralta-Yahya says that's within the range of possibilities for bioengineering the E. coli.
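
The gap to competitiveness can be expressed as a quick back-of-envelope calculation using only the figures quoted above (an illustration, not a number taken from the paper):

    # Implied target titer from the figures in this article.
    current_titer_mg_per_l = 32.0      # reported pinene production
    fold_improvement_needed = 26       # stated gap to a competitive process

    target_g_per_l = current_titer_mg_per_l * fold_improvement_needed / 1000.0
    print(f"Implied competitive titer: ~{target_g_per_l:.2f} g/L")  # ~0.83 g/L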

"Even though we are still in the milligrams per liter level, because the product we are trying to make is so much more expensive than diesel or gasoline means that we are relatively closer," she said.

Theoretically, it may be possible to produce pinene at a cost lower than that of petroleum-based sources. If that can be done – and if the resulting bio-fuel operates well in these applications – that could open the door for lighter and more powerful engines fueled by increased supplies of high-energy fuels. Pinene dimers, which result from the dimerization of pinene, have already been shown to have an energy density similar to that of JP-10.





Co-authors from the Joint BioEnergy Institute included Betty Wong, Hector Garcia Martin and Professor Jay D. Keasling, co-corresponding author of the paper.

CITATION: Stephen Sarria, et al., "Microbial Synthesis of Pinene," (ACS Synthetic Biology, 2014). (http://dx.doi.org/10.1021/sb4001382).

This work was started at the DOE Joint BioEnergy Institute (JBEI) and finished at the Georgia Institute of Technology. The work at JBEI was funded by the U.S. Department of Energy, Office of Science, Office of Biological and Environmental Research through contract DE-AC02-05CH11231 between Lawrence Berkeley National Laboratory and the U.S. Department of Energy. The work at the Georgia Institute of Technology was funded by startup funds awarded to the Peralta-Yahya laboratory. Any opinions expressed are those of the authors and do not necessarily represent the official views of the DOE.



Contacts and sources:
John Toon
Georgia Institute of Technology

Is It Safe To Pee In The Pool? An Answer To That Perennial Question

Sanitary-minded pool-goers who preach “no peeing in the pool,” despite ordinary and Olympic swimmers admitting to the practice, now have scientific evidence to back up their concern. 

Researchers are reporting that when mixed, urine and chlorine can form substances that can cause potential health problems. Their study appears in ACS’ journal Environmental Science & Technology.

Though competitive swimmers have admitted to peeing in the pool, the common practice should be a no-no, a new study finds.

Credit: Purestock/Thinkstock 

Jing Li, Ernest Blatchley III, and colleagues note that adding chlorine to pool water is the most common way to kill disease-causing microbes and prevent swimmers from getting sick. But as people swim, splash, play — and pee — in the pool, chlorine mixes with sweat and urine and makes other substances. Two of these compounds, trichloramine (NCl3) and cyanogen chloride (CNCl), are ubiquitous in swimming pools. The first is associated with lung problems, and the second can also affect the lungs, as well as the heart and central nervous system. But scientists have not yet identified all of the specific ingredients in sweat and urine that could cause these potentially harmful compounds to form. So Li's team looked at how chlorine interacts with uric acid, a component of sweat and urine.

They mixed uric acid and chlorine, and within an hour, both NCl3 and CNCl formed. Though some uric acid comes from sweat, the scientists calculated that more than 90 percent of the compound in pools comes from urine. They conclude that swimmers can improve pool conditions by simply urinating where they’re supposed to — in the bathrooms.

The authors acknowledge funding from the Chinese Universities Scientific Fund, the National Natural Science Foundation of China and the National Swimming Pool Foundation.


Contacts and sources:
Michael Bernstein
American Chemical Society
 

Phantom Fields Hide Dark Energy

Quintessence and phantom fields, two hypotheses formulated using data from satellites, such as Planck and WMAP, are among the many theories that try to explain the nature of dark energy. Now researchers from Barcelona and Athens suggest that both possibilities are only a mirage in the observations and it is the quantum vacuum which could be behind this energy that moves our universe.

Observations from Planck and other satellites help to constrain the equation of state of dark energy.

Credit: ESA et al.

Cosmologists believe that some three quarters of the universe is made up of a mysterious dark energy, which would explain its accelerated expansion. The truth is that they do not know what it could be, so they put forward possible solutions.

One is the existence of quintessence, an invisible gravitating agent that instead of attracting, repels and accelerates the expansion of the cosmos. From the Classical World until the Middle Ages, this term has referred to the ether or fifth element of nature, together with earth, fire, water and air.

Another possibility is the presence of an energy or phantom field whose density increases with time, causing an exponential cosmic acceleration. This would reach such a rate that it could break even the nuclear forces within atoms and end the universe in some 20 billion years, in what is called the Big Rip.

The experimental data underpinning these two hypotheses come from satellites such as the European Space Agency's (ESA) Planck and NASA's Wilkinson Microwave Anisotropy Probe (WMAP). Observations from the two probes are essential for solving the so-called equation of state of dark energy, the mathematical formula that characterises it, just as solids, liquids and gases have their own equations of state.
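
For readers unfamiliar with the term, this equation of state boils down to a single parameter w relating the pressure of dark energy to its energy density, and the competing hypotheses correspond to different ranges of w. The short Python sketch below encodes the standard textbook thresholds; it is a didactic aid only and is not taken from the paper discussed here.

    def classify_dark_energy(w):
        """Classify dark energy by its equation-of-state parameter w = p / (rho * c^2)."""
        if w < -1:
            return "phantom field (density grows with time)"
        if w == -1:
            return "cosmological constant (static vacuum energy)"
        if w < -1 / 3:
            return "quintessence-like (still accelerates the expansion)"
        return "non-accelerating component"

    for w in (-1.1, -1.0, -0.9, -0.2):
        print(f"w = {w:+.1f} -> {classify_dark_energy(w)}")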

Now researchers from the University of Barcelona (Spain) and the Academy of Athens (Greece) have used the same satellite data to demonstrate that the behaviour of dark energy does not need to resort to either quintessence or phantom energy in order to be explained. The details have been published in the Monthly Notices of the Royal Astronomical Society journal.

"Our theoretical study demonstrates that the equation of state of dark energy can simulate a quintessence field, or even a phantom field, without being one in reality. Thus, when we see these effects in the observations from WMAP, Planck and other instruments, what we are seeing is a mirage," Joan Solà, one of the authors, from the University of Barcelona, told SINC.

Nothing fuller than the quantum vacuum

"What we think is happening is a dynamic effect of the quantum vacuum, a parameter that we can calculate," explained the researcher. The concept of the quantum vacuum has nothing to do with the classic notion of absolute nothingness. "Nothing is more 'full' than the quantum vacuum since it is full of fluctuations that contribute fundamentally to the values that we observe and measure," Solà pointed out.

These scientists propose that dark energy is a type of dynamical quantum vacuum energy that acts in the accelerated expansion of our universe. This is in contrast to the traditional static vacuum energy or cosmological constant.

The drawback with this strange vacuum is that it is the source of puzzles such as the cosmological constant problem, a discrepancy between the observed value and the predictions of quantum theory that drives physicists mad.

"However, quintessence and phantom fields are still more problematic, therefore the explanation based on the dynamic quantum vacuum could be the more simple and natural one," concluded Solà.


Contacts and sources:
SINC
FECYT - Spanish Foundation for Science and Technology



Citation:  Spyros Basilakos, Joan Sola. "Effective equation of state for running vacuum: "mirage" quintessence and phantom dark energy". Monthly Notices of the Royal Astronomical Society 437(4), February 2014. DOI:10.1093/mnras/stt2135.

Wednesday, March 26, 2014

New Dwarf Planet Found At Solar System's Edge, Nicknamed Biden

The solar system has a new most-distant family member.

Scientists using ground-based observatories have discovered an object that is believed to have the most distant orbit found beyond the known edge of our solar system. Observations of the object, named 2012 VP113 -- possibly a dwarf planet -- were obtained and analyzed with a grant from NASA. A dwarf planet is an object in orbit around the sun that is large enough for its own gravity to pull it into a spherical, or nearly round, shape.

These images show the discovery of 2012 VP113 taken about 2 hours apart on Nov. 5, 2012. The motion of 2012 VP113 stands out compared to the steady state background of stars and galaxies.
Image Credit: Scott Sheppard/Carnegie Institution for Science

The detailed findings are published in the March 27 edition of Nature.

“This discovery adds the most distant address thus far to our solar system’s dynamic neighborhood map,” said Kelly Fast, discipline scientist for NASA's Planetary Astronomy Program, Science Mission Directorate (SMD) at NASA Headquarters, Washington. “While the very existence of the inner Oort Cloud is only a working hypothesis, this finding could help answer how it may have formed.”

a) Orbit diagram for the outer solar system. The Sun and terrestrial planets are at the center. The orbits of the four giant planets Jupiter, Saturn, Uranus and Neptune are shown by purple solid circles. The Kuiper Belt (including Pluto) is shown by the dotted light blue region just beyond the giant planets. Sedna's orbit is shown in orange, while 2012 VP113's orbit is shown in red. Both objects are currently near their closest approach to the Sun (perihelion). They would be too faint to detect when in the outer parts of their orbits. Notice that both orbits have similar perihelion locations on the sky and both are far away from the giant planet and Kuiper Belt regions. b) Plot of all the known bodies in the outer solar system with their closest approach to the Sun (perihelion) and eccentricity.

Credit: Scott S. Sheppard/Carnegie Institution for Science

The observations and analysis were led and coordinated by Chadwick Trujillo of the Gemini Observatory in Hawaii and Scott Sheppard of the Carnegie Institution in Washington. They used the National Optical Astronomy Observatory’s 13-foot (4-meter) telescope in Chile to discover 2012 VP113. The telescope is operated by the Foundation of Universities for Research in Astronomy, under contract with the National Science Foundation. The Magellan 21-foot (6.5-meter) telescope at Carnegie’s Las Campanas Observatory in Chile was used to determine the orbit of 2012 VP113 and obtain detailed information about its surface properties. 

The discovery images of 2012 VP113 (affectionately nicknamed "Biden" because of the VP in the provisional name). It has the most distant orbit known in our Solar System. Three images of the night sky, each taken about 2 hours apart, were combined into one. The first image was artificially colored red, second green and third blue. 2012 VP113 moved between each image as seen by the red, green and blue dots. The background stars and galaxies did not move and thus their red, green and blue images combine to show up as white sources.
Credit: Scott S. Sheppard/Carnegie Institution for Science

“The discovery of 2012 VP113 shows us that the outer reaches of our solar system are not an empty wasteland as once was thought,” said Trujillo, lead author and astronomer. “Instead, this is just the tip of the iceberg telling us that there are many inner Oort Cloud bodies awaiting discovery. It also illustrates how little we know about the most distant parts of our solar system and how much there is left to explore.”

Our known solar system consists of the rocky planets like Earth, which are close to the sun; the gas giant planets, which are further out; and the frozen objects of the Kuiper belt, which lie just beyond Neptune's orbit. Beyond this, there appears to be an edge to the solar system where, until now, only one object somewhat smaller than Pluto, Sedna, was known to reside for its entire orbit. The newly found 2012 VP113 has an orbit that stays even beyond Sedna's, making it the most distantly orbiting body known in the solar system.

Sedna was discovered beyond the Kuiper Belt edge in 2003, and it was not known if Sedna was unique, as Pluto once was thought to be before the Kuiper Belt was discovered in 1992. With the discovery of 2012 VP113, Sedna is not unique, and 2012 VP113 is likely the second known member of the hypothesized inner Oort cloud. The outer Oort cloud is the likely origin of some comets.

“The search for these distant inner Oort cloud objects beyond Sedna and 2012 VP113 should continue, as they could tell us a lot about how our solar system formed and evolved," says Sheppard.

Sheppard and Trujillo estimate that about 900 objects with orbits like those of Sedna and 2012 VP113, and with sizes larger than 621 miles (1,000 kilometers), may exist. 2012 VP113 is likely one of hundreds of thousands of distant objects that inhabit the region of our solar system scientists refer to as the inner Oort cloud. The total population of the inner Oort cloud is likely larger than those of the Kuiper Belt and the main asteroid belt.

“Some of these inner Oort cloud objects could rival the size of Mars or even Earth,” said Sheppard. “This is because many of the inner Oort cloud objects are so distant that even very large ones would be too faint to detect with current technology.”

2012 VP113’s closest approach to the sun brings it to about 80 times the distance of the Earth from the sun, a measurement referred to as an astronomical unit, or AU. The rocky planets and asteroids exist at distances ranging between 0.39 and 4.2 AU. The gas giants are found between 5 and 30 AU, and the Kuiper Belt (composed of hundreds of thousands of icy objects, including Pluto) ranges from 30 to 50 AU. In our solar system there is a distinct edge at 50 AU. Until 2012 VP113 was discovered, only Sedna, with a closest approach to the Sun of 76 AU, was known to stay significantly beyond this outer boundary for its entire orbit.

Credit: Scott S. Sheppard/Carnegie Institution for Science
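
For readers who want these distances in more familiar units, here is a minimal Python sketch (an editorial illustration, not part of the release) converting the AU figures quoted above into kilometers, using the standard value of one astronomical unit:

# Convert the orbital distances quoted above from astronomical units (AU)
# to kilometers. 1 AU is the mean Earth-Sun distance, about 149.6 million km.

AU_KM = 149_597_871  # kilometers per astronomical unit

distances_au = {
    "2012 VP113 perihelion": 80,
    "Sedna perihelion": 76,
    "Kuiper Belt outer edge": 50,
    "Neptune's orbit (approx.)": 30,
}

for label, d_au in distances_au.items():
    print(f"{label}: {d_au} AU = {d_au * AU_KM / 1e9:.1f} billion km")

At 80 AU, for example, 2012 VP113's closest approach still works out to roughly 12 billion kilometers from the sun.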

Both Sedna and 2012 VP113 were found near their closest approach to the sun, but both have orbits that extend to hundreds of AU, at which point they would be too faint to discover. The similarity of the orbits found for Sedna, 2012 VP113 and a few other objects near the edge of the Kuiper Belt suggests that the new object’s orbit might be influenced by the presence of a yet-unseen planet, perhaps up to 10 times the size of Earth. Further studies of this deep-space arena will continue.

Contacts and sources:
NASA
Scott S. Sheppard, Carnegie Institution for Science
Chad Trujillo, Gemini Observatory

Intercepting Asteroids To Avoid Armageddon

It sounds like the script for a Hollywood film: a giant meteorite from outer space heading straight for the Earth and threatening the destruction of mankind. And yet such a scenario does represent a real threat to our planet, as researchers reckon that we can expect an asteroid to collide with Earth every few hundred years.

If an asteroid hits, the consequences are clear. The Barringer Crater in Arizona is 1,200 meters wide and was made by an asteroid about 50 meters across.
Credit: © Stefan Seip/DLR
  
In real life, though, nobody wants to rely on a rescue plan hastily improvised at the last minute. That is why the European-funded research project NEOShield was set up, with research scientists from the Fraunhofer Institute for High-Speed Dynamics, Ernst-Mach-Institut, EMI, in Freiburg among those contributing to work on an asteroid impact avoidance system. The teams of researchers are developing concepts designed to avert such impacts by altering the orbits of asteroids racing toward Earth.

For an asteroid on a collision course with Earth, scientists can calculate the point in time and space at which the impact would occur. To prevent the impact, the asteroid has to be sped up or slowed down so that the Earth has either passed by or has yet to arrive when the asteroid reaches that hypothetical point of collision.
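
To put rough numbers on that timing argument, here is a back-of-envelope Python sketch (an editorial illustration, not the NEOShield team's analysis). It uses straight-line kinematics and assumed speeds; real deflection studies use full orbital mechanics, which generally make small nudges even more effective over time:

# How much must an asteroid's speed change so that Earth has moved roughly
# one Earth-diameter out of the way by the time the asteroid arrives?

EARTH_DIAMETER_KM = 12_742        # distance Earth must clear
EARTH_ORBITAL_SPEED_KM_S = 29.8   # Earth's speed along its orbit
ASTEROID_SPEED_KM_S = 20.0        # assumed speed of the asteroid (illustrative)
YEAR_S = 365.25 * 24 * 3600

# Time Earth needs to move one Earth-diameter along its orbit (~7 minutes).
clearance_time_s = EARTH_DIAMETER_KM / EARTH_ORBITAL_SPEED_KM_S

for warning_years in (1, 10, 50):
    lead_time_s = warning_years * YEAR_S
    # A speed change dv shifts the arrival time by about (dv / v) * lead time,
    # so the required dv is v * clearance_time / lead_time.
    dv_m_s = ASTEROID_SPEED_KM_S * 1000 * clearance_time_s / lead_time_s
    print(f"{warning_years:>2} years of warning: delta-v of roughly {dv_m_s * 100:.1f} cm/s")

Even with these crude assumptions the point stands: given years of warning, a speed change of only centimeters per second can be enough, which is why a high-speed impactor probe is considered a plausible tool.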

"One solution would be to launch a relatively solid space probe designed to hit the asteroid at high speed," says Professor Alan Harris from the German Aerospace Center’s Institute of Planetary Research as he explains the basic concept. Professor Harris leads the EU-funded NEOShield project. 


 
Meanwhile, scientists from Fraunhofer EMI are helping to research the foundations of this technique. "Asteroids are typically made of porous materials, so the first step is to build up a basic understanding of what happens when materials like that are hit by a foreign object," says Dr. Frank Schäfer, head of the spacecraft technology group at Fraunhofer EMI.

An asteroid approaches Earth.
Credit: © NASA

To do this, he and his team use a light gas gun, one of the fastest accelerator facilities in the world. Within the gun’s approximately one-and-a-half-meter barrel, millimeter-sized pellets are accelerated to speeds of almost 10 kilometers per second. That equates to around 36,000 kilometers per hour.
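
For a sense of scale, the following short Python sketch repeats the unit conversion and estimates the kinetic energy of such a pellet. The article does not state the pellet's size or material, so the 2-millimeter steel sphere below is purely an assumed, illustrative value:

import math

speed_m_s = 10_000                      # ~10 km/s, as quoted above
print(f"{speed_m_s * 3.6:,.0f} km/h")   # 10 km/s = 36,000 km/h

# Assumed pellet: a 2 mm steel sphere (density ~7,850 kg/m^3).
radius_m = 1e-3
mass_kg = 7850 * (4 / 3) * math.pi * radius_m ** 3   # roughly 0.03 g
energy_j = 0.5 * mass_kg * speed_m_s ** 2
print(f"pellet mass ~ {mass_kg * 1000:.2f} g, kinetic energy ~ {energy_j / 1000:.1f} kJ")

Even a fraction of a gram carries kilojoules of energy at these speeds, which is why such tiny projectiles are enough to probe how porous, asteroid-like material responds to an impact.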

In a target chamber, the Fraunhofer scientists fire these high-velocity mini projectiles at stone blocks that stand in for asteroids. The aim is to analyze as precisely as possible how the material reacts. High-speed cameras document each experiment, taking up to 30,000 pictures per second. As in vehicle crash testing, the Fraunhofer researchers are interested in quantifying the force of the collision. The data are scaled up to real asteroid dimensions and fed continuously into computer simulations.

In the long term, NEOShield project leader Professor Harris would like to see the defense techniques that are the subject of this research tested in international space missions: “This kind of test mission is bound to throw up a few surprises, and will teach us a great deal.”

Incidentally, Harris reckons that averting an oncoming asteroid from its collision course by means of a huge explosion – just like in a Hollywood film – could in fact be an option in an emergency, but only if time were pressing or the object concerned at least a kilometer in diameter.


Contacts and sources: 
Dr. Frank Schäfer
Fraunhofer Institute for High-Speed Dynamics, Ernst-Mach-Institut, EMI

More information:
The NEOShield Project 

Sugar Produced From Wood In Lignocellulose Biorefinery, New Raw Material Source For Biofuels and Chemicals

No more oil – renewable raw materials are the future. This motto not only applies to biodiesel, but also to isobutene, a basic product used in the chemical industry. In a pilot plant researchers now want to obtain this substance from sugar instead of oil for the first time. And in order not to threaten food supplies, in the long term the sugar should come from wood or straw and not from sugar beet. 

 In the pilot plant at the Fraunhofer Center for Chemical-Biotechnological Processes CBP researchers are producing oil substitutes from renewable raw materials. 

Credit: © Gunter Binsack / Fraunhofer CBP

Plastic, gasoline, rubber – very many items we use every day are based on oil. But this raw material is becoming increasingly scarce. Step by step, researchers are therefore investigating ways of replacing oil with renewable raw materials.

One well-known example is biodiesel, which comes not from oil sources but from fields of yellow-flowering rape. In the future another substance is to be produced from plants: isobutene, a basic chemical used in the chemical industry to make fuels, solvents, elastomers and even antiknock agents for fuel. Sugar, not oil, will be used to produce this isobutene. Researchers at the Fraunhofer Center for Chemical-Biotechnological Processes CBP in Leuna are planning to set up a pilot plant.

Valuable product of "digestion"

The basis for this was provided by staff at the company Global Bioenergies: they introduced the metabolic pathway that converts sugar to isobutene into a microorganism. If sugar is fed to this microorganism, it "digests" it – and out comes gaseous isobutene. To develop and construct the pilot plant, Global Bioenergies will receive 5.7 million euros from the German Federal Ministry of Education and Research (BMBF).

The company brought the CBP on board as a partner. "We have the expertise for both the biotechnological and the chemical processes, and we meet all the requirements for successfully getting the project off the ground," said a very pleased Gerd Unkelbach, Director of the CBP. "For example, we are able to handle isobutene, which forms explosive mixtures when mixed with air."

Construction of the 600 m² pilot plant will start in the CBP technical center in the fall of 2014, and it is planned to come into operation a year later. The large-scale process will run just as it does in the laboratory: sugar and the microorganism go into a fermenter, where the sugar is converted to gaseous isobutene. The isobutene is separated, purified, liquefied and filled into containers. Once the process has been transferred from the laboratory to the pilot scale, the plant will produce up to 100 tons of isobutene per year.

Sugar from wood

Sugar as a raw material has a big advantage over oil: it grows back. However, isobutene production then competes with the food industry, because the sugar that ends up in the pilot plant is lost as a foodstuff. For this reason the researchers want to change tack in the future, moving away from sugar beet toward sugar from renewable raw materials that are not suitable as food – wood, for example.

"The sugar we use is thus totally independent of food production" explained Unkelbach. The technological basis of this is already available at the CPB in the lignocellulose biorefinery. Here the researchers break wood down into its individual components: Cellulose, i.e. sugar, hemicelluloses and lignin.

Researchers will present these new activities at the Industrial GreenTec as part of the Hannover Trade Fair from 7 to 11 April in Hannover (Hall 6, Booth J18).


Contacts and sources: 
Gerd Unkelbach
Fraunhofer Center for Chemical-Biotechnological Processes CBP

Light As Bright As A Million Suns Illuminates Fossilized Plant

Scientists have used one of the brightest lights in the Universe to expose the biochemical structure of a 50 million-year-old fossil plant to stunning visual effect.

The team of palaeontologists, geochemists and physicists investigated the chemistry of exceptionally preserved fossil leaves from the Eocene-aged ‘Green River Formation’ of the western United States by bombarding the fossils with X-rays brighter than a million suns produced by synchrotron particle accelerators.

Credit: University of Manchester

Researchers from Britain’s University of Manchester and Diamond Light Source, together with the Stanford Synchrotron Radiation Lightsource in the US, have published their findings, along with striking images, in Metallomics; one of the images is featured on the cover of the latest edition of the Royal Society of Chemistry journal.

Lead author Dr Nicholas Edwards, a postdoctoral researcher at The University of Manchester, said: “The synchrotron has already shown its potential in teasing new information from fossils, in particular our group’s previous work on pigmentation in fossil animals. With this study, we wanted to use the same techniques to see whether we could extract a similar level of biochemical information from a completely different part of the tree of life.

“To do this we needed to test the chemistry of the fossil plants to see if the fossil material was derived directly from the living organisms or degraded and replaced by the fossilisation process.

“We know that plant chemistry can be preserved over hundreds of millions of years – this preserved chemistry powers our society today in the form of fossil fuels. However, this is just the ‘combustible’ part; until now no one has completed this type of study of the other biochemical components of fossil plants, such as metals.”

By combining the unique capabilities of two synchrotron facilities, the team were able to produce detailed images of where the various elements of the periodic table were located within both living and fossil leaves, as well as being able to show how these elements were combined with other elements.

The work shows that the distribution of copper, zinc and nickel in the fossil leaves was almost identical to that in modern leaves. Each element was concentrated in distinct biological structures, such as the veins and the edges of the leaves, and the way these trace elements and sulphur were attached to other elements was very similar to that seen in modern leaves and plant matter in soils.

Co-author Professor Roy Wogelius, from Manchester’s School of Earth, Atmospheric and Environmental Sciences, said: “This type of chemical mapping and the ability to determine the atomic arrangement of biologically important elements, such as copper and sulfur, can only be accomplished by using a synchrotron particle accelerator.

“In one beautiful specimen, the leaf has been partially eaten by prehistoric caterpillars – just as modern caterpillars feed – and their feeding tubes are preserved on the leaf. The chemistry of these fossil tubes remarkably still matches that of the leaf on which the caterpillars fed.”

Data from a suite of other techniques has led the team to conclude that the chemistry of the fossil leaves is not wholly sourced from the surrounding environment, as had previously been suggested, but represents that of the living leaves. Another modern-day connection may explain how these specimens remained so beautifully preserved over millions of years.

Manchester palaeontologist and co-author Dr Phil Manning said: “We think that copper may have aided preservation by acting as a ‘natural’ biocide, slowing down the usual microbial breakdown that would destroy delicate leaf tissues. This property of copper is used today in the same wood preservatives that you paint on your garden fence before winter approaches.”


Contacts and sources:
Aeron Haworth
University of Manchester

Look For Alien Life Around F-Type Stars, Say Physicists

Scientists searching for habitable planets beyond Earth shouldn’t overlook F-type stars in favor of their more abundant, smaller and cooler cousins, according to new research from University of Texas at Arlington physicists.

The Morgan–Keenan spectral classification


Stars fall into seven lettered categories according to their surface temperature, but they also differ in other factors such as mass, luminosity and abundance in the universe. Scientists looking for habitable planets typically have focused on the less massive end of the spectrum, where our own G-type Sun as well as the even less massive K and M-type stars reside.

F-types are in the middle of the scale, more massive and hotter than our Sun. Their increased ultraviolet radiation has been thought to be a limiting factor for sustaining life. In addition, there just aren’t as many of them.

Manfred Cuntz
Credit:   UT Arlington

But UT Arlington physics professor Manfred Cuntz contends: “F-type stars are not hopeless.”

Cuntz said: “There is a gap in attention from the scientific community when it comes to knowledge about F-type stars, and that is what our research is working to fill. It appears they may indeed be a good place to look for habitable planets.”

Cuntz and UT Arlington Ph.D. student Satoko Sato teamed with researchers from the University of Guanajuato in Mexico for a new work published this week by the International Journal of Astrobiology. They argue that since F-type stars have a wider habitability zone – the area where conditions are right for general Earth-type planets to develop and sustain life – they warrant additional consideration.

Satoko Sato
Credit:   UT Arlington

The researchers also explored the potential limitations caused by UV radiation by estimating the potential damage that carbon-based macromolecules necessary for life would sustain in F-type stars’ habitable zones. To do that, they used DNA as an example and compared estimates of DNA damage on planets in F-type star systems to the damage that would be done on Earth by the Sun.

The research included calculations for several different types of F-type stars at different points in their evolution, and it found encouraging results. In a few cases, the damage estimates were similar to the damage the Sun would do on an Earth without an atmosphere; the estimates were even lower when an atmosphere was assumed for the planet in the F-type system.
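
As a rough illustration of the scaling behind that comparison, the Python sketch below (an editorial simplification, not the authors' method, which modeled DNA damage with detailed action spectra) places a planet in the habitable zone, where it receives the same total flux as Earth, and then compares the ultraviolet share of that flux. The luminosity and UV fractions are assumed, illustrative values:

import math

def habitable_zone_au(luminosity_solar):
    # Distance at which total flux equals what Earth receives:
    # scales with the square root of the star's luminosity.
    return math.sqrt(luminosity_solar)

def uv_flux_vs_earth(luminosity_solar, uv_fraction, uv_fraction_sun=0.08):
    # Flux follows the inverse-square law; in the habitable zone the total
    # flux matches Earth's, so the UV ratio reduces to the ratio of UV fractions.
    d_au = habitable_zone_au(luminosity_solar)
    return (luminosity_solar * uv_fraction) / (d_au ** 2 * uv_fraction_sun)

# Illustrative F-type star: three times the Sun's luminosity, with an
# assumed 15% of its output in the ultraviolet (Sun assumed ~8%).
print(f"Habitable-zone distance: {habitable_zone_au(3.0):.2f} AU")
print(f"UV flux relative to Earth: {uv_flux_vs_earth(3.0, 0.15):.1f}x")

The takeaway matches the study's framing: a planet in an F-star's habitable zone sits farther from its star, so the extra UV exposure is a finite factor rather than an overwhelming one, and an atmosphere reduces it further.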

“Our study is a further contribution toward the exploration of the exobiological suitability of stars hotter and, by implication, more massive than the Sun … at least in the outer portions of F-star habitable zones, UV radiation should not be viewed as an insurmountable hindrance to the existence and evolution of life,” the study said.

The study is called “Habitability around F-type Stars” and is available online at http://journals.cambridge.org/action/displayIssue?jid=IJA&tab=firstview. Co-authors from the University of Guanajuato were Cecilia Maria Guerra Olvera, Dennis Jack and Klaus-Peter Schröder.

Pamela Jansma, dean of the UT Arlington College of Science, said the collaboration between Sato and Cuntz is representative of the research advances that can result from a strong faculty-graduate student mentorship.

“Astrophysics as it relates to habitable planets is an increasingly popular topic, and Dr. Cuntz and his student have enriched that conversation by weaving elements of theoretical biology and planetary science into their outstanding work,” Jansma said.

The new paper suggests that further research be done that would include detailed chemical models of planetary atmospheres, examples of specific star-planet systems with observational data and cases of F-type stars that are members of binary or higher-order systems.


Contacts and sources:
Traci Peterson

An Ominous Fate Awaits Earth Says Astronomer

New theory advanced for how collapsed stars become polluted, pointing to the ominous fate that awaits planet Earth

 
A decades-old space mystery has been solved by an international team of astronomers led by Professor Martin Barstow of the University of Leicester, President-elect of the Royal Astronomical Society.

Scientists from the University of Leicester and the University of Arizona investigated hot, young white dwarfs — the super-dense remains of Sun-like stars that ran out of fuel and collapsed to about the size of the Earth. Their research is featured in MNRAS, the Monthly Notices of the Royal Astronomical Society, published by Oxford University Press.

This is an artist's impression of debris around a white dwarf star.

Credit: NASA, ESA, STScI, and G. Bacon (STScI)

It has long been known that the atmospheres of many hot white dwarfs, essentially pure hydrogen or pure helium, are contaminated by other elements, such as carbon, silicon and iron. What was not known, however, was the origin of these elements, known in astronomical terms as metals.

"The precise origin of the metals has remained a mystery and extreme differences in their abundance between stars could not be explained," said Professor Barstow, a Pro-Vice-Chancellor at the University of Leicester whose research was assisted by his daughter Jo, a co-author of the paper, during a summer work placement in Leicester. She has now gone on to be an astronomer working in Oxford - on extra-solar planets.

"It was believed that this material was "levitated" by the intense radiation from deeper layers in the star," said Professor Barstow.

Now the researchers have discovered that many of the stars show signs of contamination by rocky material, the leftovers from a planetary system.


This is an artist's impression of a massive asteroid belt in orbit around a star. The new work with SDSS data shows that similar rubble around many white dwarfs contaminates these stars with rocky material and water.

Credit: NASA-JPL / Caltech / T. Pyle (SSC)

The researchers surveyed 89 white dwarfs, using the Far Ultraviolet Spectroscopic Explorer to obtain their spectra (dispersing the light by colour), in which the "fingerprints" of carbon, silicon, phosphorus and sulphur can be seen when these elements are present in the atmosphere.

"We found that in stars with polluted atmospheres the ratio of silicon to carbon matched that seen in rocky material, much higher than found in stars or interstellar gas.

"The new work indicates that at around a one-third of all hot white dwarfs are contaminated in this way, with the debris most likely in the form of rocky minor planet analogues. This implies that a similar proportion of stars like our Sun, as well as stars that are a little more massive like Vega and Fomalhaut, build systems containing terrestrial planets. This work is a form of celestial archaeology where we are studying the 'ruins' of rocky planets and/or their building blocks, following the demise of the main star.

"The mystery of the composition of these stars is a problem we have been trying to solve for more than 20 years. It is exciting to realise that they are swallowing up the left overs from planetary systems, perhaps like our own, with the prospect that more detailed follow-up work will be able to tell us about the composition of rocky planets orbiting other stars", said Professor Barstow.

The study also points to the ultimate fate of the Earth billions of years from now: ending up as contamination within the white dwarf that the Sun will become.




Contacts and sources:
Martin Barstow
University of Leicester

Black Market For Hackers Bigger And Badder Than Ever, May Surpass Illegal Drugs In Value

Black and gray markets for computer hacking tools, services and byproducts such as stolen credit card numbers continue to expand, creating an increasing threat to businesses, governments and individuals, according to a new RAND Corporation study.

According to the study: 
  • The cyber black market has evolved from a varied landscape of discrete, ad hoc individuals into a network of highly organized groups, often connected with traditional crime groups (e.g., drug cartels, mafias, terrorist cells) and nation-states.
  • The cyber black market does not differ much from a traditional market or other typical criminal enterprises; participants communicate through various channels, place their orders, and get products.
  • Its evolution mirrors the normal evolution of markets with both innovation and growth.
  • For many, the cyber black market can be more profitable than the illegal drug trade.
One dramatic example is the December 2013 breach of retail giant Target, in which data from approximately 40 million credit cards and 70 million user accounts was hijacked. Within days, that data appeared — available for purchase — on black market websites.


“Hacking used to be an activity that was mainly carried out by individuals working alone, but over the last 15 years the world of hacking has become more organized and reliable,” said Lillian Ablon, lead author of the study and an information systems analyst at RAND, a nonprofit research organization. “In certain respects, cybercrime can be more lucrative and easier to carry out than the illegal drug trade.”

The growth in cybercrime has been assisted by sophisticated and specialized markets that freely deal in the tools and the spoils of cybercrime. These include items such as exploit kits (software tools that can help create, distribute, and manage attacks on systems), botnets (a group of compromised computers remotely controlled by a central authority that can be used to send spam or flood websites), as-a-service models (hacking for hire) and the fruits of cybercrime, including stolen credit card numbers and compromised hosts.

In the wake of several highly publicized arrests and an increase in the ability of law enforcement to take down some markets, access to many of these black markets has become more restricted, with cybercriminals vetting potential partners before offering access to the upper levels. That said, once in, the barriers to participating and profiting are very low, according to the report.

RAND researchers conducted more than two dozen interviews with cybersecurity and related experts, including academics, security researchers, news reporters, security vendors and law enforcement officials. The study outlines the characteristics of the cybercrime black markets, with additional consideration given to botnets and their role in the black market, and “zero-day” vulnerabilities (software bugs that are unknown to vendors and without a software patch). Researchers also examine various projections and predictions for how the black market may evolve.

What makes these black markets notable is their resilience and sophistication, Ablon said. Even as consumers and businesses have fortified their activities in reaction to security threats, cybercriminals have adapted. An increase in law enforcement arrests has resulted in hackers going after bigger targets. More and more crimes have a digital component.

The RAND study says there will be more activity in “darknets,” more checking and vetting of participants, more use of crypto-currencies such as Bitcoin, greater anonymity capabilities in malware, and more attention to encrypting and protecting communications and transactions. Helped by such markets, the ability to attack will likely outpace the ability to defend.

Internet Map
Credit: Wikipedia

Hyper-connectivity will create more points of presence for attack and exploitation so that crime increasingly will have a networked or cyber component, creating a wider range of opportunities for black markets. Exploitations of social networks and mobile devices will continue to grow. There will be more hacking-for-hire, as-a-service offerings and cybercrime brokers.

However, experts disagree on who will be the most affected by the growth of the black market, what products will be on the rise and which types of attacks will be more prevalent, Ablon said.

These Cyber Black Markets Respond to Outside Forces:
  • As suspicion and "paranoia" spike because of an increase in recent takedowns, more transactions move to darknets; stronger vetting takes place; and greater encryption, obfuscation, and anonymization techniques are employed, restricting access to the most sophisticated parts of the black market.
  • The proliferation of as-a-service and point-and-click interfaces lowers the cost to enter the market.
  • Law enforcement efforts are improving as more individuals are technologically savvy; suspects are going after bigger targets, and thus are attracting more attention; and more crimes involve a digital component, giving law enforcement more opportunities to encounter crime in cyberspace.
  • Still, the cyber black market remains resilient and is growing at an accelerated pace, continually getting more creative and innovative as defenses get stronger, law enforcement gets more sophisticated, and new exploitable technologies and connections appear in the world.
  • Products can be highly customized, and players tend to be extremely specialized.
The study, “Markets for Cybercrime Tools and Stolen Data: Hackers' Bazaar,” can be found at www.rand.org. Other authors of the study are Martin Libicki and Andrea A. Golay.

Support for the study was provided by Juniper Networks as part of a multiphase study on the future cybersecurity environment.

The study was conducted within the Acquisition and Technology Policy Center of the RAND National Security Research Division. The division conducts research and analysis on defense and national security topics for the U.S. and allied defense, foreign policy, homeland security and intelligence communities and foundations and other nongovernmental organizations that support defense and national security analysis.



Contacts and sources:
Lisa M. Sodders
RAND Corporation