Unseen Is Free

Tuesday, April 30, 2013

Synthetic Biology Research Community Grows Significantly

The number of private and public entities conducting research in synthetic biology worldwide grew significantly between 2009 and 2013, according to the latest version of an interactive map produced by the Synthetic Biology Project at the Woodrow Wilson International Center for Scholars.

As research into synthetic biology increases, this map identifies companies, universities, research institutions, laboratories and other centers across the globe that are active in this emerging field. In 2009, the Synthetic Biology Project began mapping the increased research in the field of synthetic biology, finding pockets of work being done in California, Massachusetts, the United Kingdom and Germany. These areas see continued growth in 2013, while research also ramps up in China and Japan.
The map is available online at http://www.synbioproject.org/map.

Synthetic biology, an area of research focused on the design and construction of new biological parts and devices, or the re-design of existing biological systems, is an emerging field and the focus of labs and companies around the world. The map, which builds on work the project started in 2009, is populated with more than 500 companies, universities, research institutions and other entities working on synthetic biology, showing clusters of activity in California, Massachusetts, Western Europe and East Asia.

"Part of this new activity has been driven by continuing government investments in the science," said David Rejeski, who directs the Synthetic Biology Project. "Another important factor has been the rapidly declining costs of gene sequencing, which has supported more effective approaches to engineering biological systems."

The Synthetic Biology Project found that the number of companies conducting synthetic biology research has tripled since 2009. A plurality of these companies focus on developing bio-based specialty chemicals, fuels and/or medicines.

Entities Conducting Research in Synthetic Biology, Worldwide

Since 2009, the industry has also experienced moderate levels of consolidation and failure. Of the 61 companies included on the initial 2009 inventory, six were acquired by other companies, closed their doors or can no longer be identified. An additional 11 companies that were tracked between the release of the 2009 inventory and the 2013 update were also acquired, closed or cannot be identified.

Application Focus of Biological Systems Designers/Manufacturers Conducting Synthetic Biology Research

In addition to the expanded listings, the updated map features improved functionality, more detailed information and additional categories and subcategories. The updated map can also be accessed on Android and Apple mobile devices.

The map can be found here: http://www.synbioproject.org/map

The 2013 analysis can be found here: http://www.synbioproject.org/process/assets/files/6302/_draft/findings_2013.pdf

The map will be updated periodically. Users can submit additional entries to the map using this online form: http://www.synbioproject.org/sbmap/add-item/

Contacts and sources:
Aaron Lovell
Woodrow Wilson International Center for Scholars/Science and Technology Innovation Program

House Finance Chair Hensarling Goes on Ski Vacation with Wall Street

by Justin Elliott, ProPublica 

In January, Rep. Jeb Hensarling, R-Texas, ascended to the powerful chairmanship of the House Financial Services Committee. Six weeks later, campaign finance filings and interviews show, Hensarling was joined by representatives of the banking industry for a ski vacation fundraiser at a posh Park City, Utah, resort.

The congressman’s political action committee held the fundraiser at the St. Regis Deer Valley, the “Ritz-Carlton of ski resorts,” known for its “white-glove service” and for its restaurant by superstar chef Jean-Georges Vongerichten.

Rep. Jeb Hensarling, R-Texas

There’s no evidence the fundraiser broke any campaign finance rules. But a ski getaway with Hensarling, whose committee oversees both Wall Street and its regulators, is an invaluable opportunity for industry lobbyists.

Among those attending the weekend getaway was an official from the American Securitization Forum, a Wall Street industry group, a spokesman confirmed. It gave $2,500 in February to Hensarling’s political action committee, the Jobs, Economy, and Budget (JEB) Fund.

Len Wolfson, a lobbyist for the Mortgage Bankers Association, which gave the JEB Fund $5,000 that month, posted a picture on Instagram from the weekend of the fundraiser of the funicular at the St. Regis. (It was labeled, “Putting the #fun in #funicular. #stregis #deervalley #utah.”) Wolfson did not respond to requests for comment. (UPDATE, 1 p.m.: Wolfson has now set his account to private.)

This photo was posted to Instagram by Mortgage Bankers Association lobbyist Len Wolfson on Feb. 24.

Visa, which gave the JEB Fund $5,000, also sent an official. A Visa spokesman told ProPublica that in attendance were not just finance companies, but also big retailers and others.

Hensarling, a protégé of former Texas senator and famed deregulator Phil Gramm, has a mixed record regarding Wall Street. While he has been critical of “too big to fail” banks and voted against the 2008 bailout, Hensarling recently said he opposed downsizing big banks, according to Bloomberg. That stance matters now more than ever as a bipartisan duo in the Senate, David Vitter, R-La., and Sherrod Brown, D-Ohio, introduced a bill last week seeking to constrain the too-big-to-fail institutions. While the bill is considered a longshot, it has provoked intense opposition from the industry.

Meanwhile, Hensarling recently barred the head of the new Consumer Financial Protection Bureau from appearing before the House Financial Services Committee, citing a legal cloud over recess appointments made by President Obama.

Whatever his stance on the industry, Hensarling has been more than happy to court Wall Street’s money.

Donors working in various financial industries are Hensarling’s biggest supporters, giving him over $1 million in the last election cycle, according to the Center for Responsive Politics. The congressman’s office did not respond to requests for comment.

Others donating to Hensarling’s JEB Fund around the time of the Utah ski weekend: Capital One; Credit Suisse; PricewaterhouseCoopers; MasterCard; UBS; US Bank; the National Association of Federal Credit Unions; Koch Industries, which is involved in sundry financial trading; the National Pawnbrokers Association; and payday lenders Cash America International and CheckSmart Financial. All either declined to comment or did not respond to requests.

A spokeswoman for one large bank that donated $5,000, Alabama-based Regions Financial, told ProPublica the company doesn’t discuss events employees attend for “a number of reasons, including security.”

Also donating $5,000 to Hensarling’s political committee around the time of the ski weekend was Steve Clark, a lobbyist for JP Morgan and the industry group the Financial Services Roundtable. (In 2011, a memo written by Clark and his partners for the American Bankers Association proposed an $850,000 public-relations strategy to undermine Occupy Wall Street. It leaked to MSNBC; the plan had apparently never been executed.)

Clark didn’t respond to requests for comment.

The ski weekend was a large, apparently family-friendly affair. A Utah entertainment booker told ProPublica she had hired two caricature artists for a Feb. 23 event at the St. Regis for a group of 100, including 20 children. Hensarling’s JEB Fund paid the bill. The fund also reported spending about $1,000 on “gifts and mementos” at Deer Valley as well as charges at the upscale restaurant Talisker on Main.

Campaigns and political action committees of a few other GOP congressmen also show charges totaling more than $50,000 at the St. Regis around that time: House Rules Committee Chairman Pete Sessions of Texas; House Ways and Means Committee Chairman Dave Camp of Michigan; and National Republican Congressional Committee Chairman Greg Walden of Oregon. None responded to requests for comment.

This is at least the second consecutive year that Hensarling has attended a fundraiser at Deer Valley. During the same February congressional recess last year, the National Republican Congressional Committee hosted a “Park City Ski Weekend” for Hensarling along with Sessions and Walden. Hensarling’s JEB Fund also reported about $60,000 paid to the St. Regis Deer Valley in the last election cycle. (The NRCC said it did not sponsor this year’s event.)

The Texan congressman has long had a taste for mixing skiing and politics. On the same February weekend in 2009, for example, Hensarling’s political action committee invited donors “to the second annual ‘JEB Fund Takes Jackson’” ski weekend for a minimum contribution of $2,500. The setting was the Snake River Lodge and Spa in Jackson, Wyoming, which boasted “wintertime activities fun for the entire family” including dog sledding tours and sleigh rides, according to the invitation.

Reporting contributed by Al Shaw.
Contacts and sources:

One Step Closer To A Quantum Computer

Professor Weimin Chen and his colleagues at Linköping University, in cooperation with German and American researchers, have succeeded in both initializing and reading nuclear spins, relevant to qubits for quantum computers, at room temperature. The results have just been published in the renowned journal Nature Communications.

A quantum computer is controlled by the laws of quantum physics; it promises to perform complicated calculations, or to search large amounts of data, far faster than today’s fastest supercomputers.

Weimin Chen, professor
Credit: Linköping University

“You could say that a quantum computer can think several thoughts simultaneously, while a traditional computer thinks one thought at a time,” says Weimin Chen, professor in the Division of Functional Electronic Materials at the Department of Physics, Chemistry and Biology at LiU, and one of the main authors of the article in Nature Communications.

A traditional computer stores, processes and sends all information in the form of bits, which can have a value of 1 or 0. But in the world of quantum physics, at the nano- and atomic level, other rules prevail, and a bit in a quantum computer – a qubit – can be in a superposition of 1 and 0. A spin-based qubit makes use of the fact that electrons and atomic nuclei rotate around their own axes – they have a spin. They can rotate both clockwise and counterclockwise (equivalent to 1 and 0), and in both directions simultaneously (a mix of 1 and 0) – something that is completely unthinkable in the traditional, “classical” world.
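
The bit-versus-qubit contrast drawn above can be sketched in a few lines of code: a qubit state is a pair of amplitudes, and the squared amplitudes give the odds of reading 0 or 1. This is a minimal illustration of the general idea, not anything from the study itself.

```python
import math

# A qubit state is a pair of real amplitudes (a, b) with a^2 + b^2 = 1.
# (1, 0) and (0, 1) behave like classical bits; anything else is a superposition.
def measure_probabilities(a, b):
    """Return the probabilities of reading 0 and 1 from the state a|0> + b|1>."""
    return a * a, b * b

# Classical-like states: definitely 0, definitely 1.
print(measure_probabilities(1.0, 0.0))  # (1.0, 0.0)
print(measure_probabilities(0.0, 1.0))  # (0.0, 1.0)

# Equal superposition: a 50/50 "mix of 1 and 0".
s = 1 / math.sqrt(2)
p0, p1 = measure_probabilities(s, s)
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```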

An atomic nucleus consists of both protons and neutrons, and the advantage of using the nuclear spin as a qubit is that the nucleus is well protected, and nearly impervious to unwanted electromagnetic disturbance, which is a condition for keeping the sensitive information in the qubit intact.

The first step in building a quantum computer is to assign each qubit a well-defined value, either 1 or 0. Initialising the spin-based qubits then requires all the atomic nuclei to spin in the same direction, either ‘up’ or ‘down’ (clockwise or counterclockwise). The most common method for polarising nuclear spin is called dynamic nuclear polarisation; this means that the electrons’ spin simply influences the nucleus to spin in the same direction. The method requires strongly spin polarised electrons and functions superbly at lower temperatures. Dynamic nuclear polarisation via conduction electrons has, however, not yet been demonstrated at room temperature – which is crucial for the method to be useful in practice for the development of quantum computers. The main problem is that the spin orientation in the electrons can easily be lost at room temperature, since it is sensitive to disruptions from its surroundings.

Linköping University researchers Yuttapoom Puttisong, Xingjun Wang, Irina Buyanova and Weimin Chen, together with their German and American colleagues, have now discovered a way of getting around this problem.
Credit: Linköping University

Back in 2009, Chen and his research group presented a spin filter that works at room temperature; the filter lets through electrons that have the desired spin direction and screens out the others.

With the help of the spin filter, they have now succeeded in producing a flow of free electrons with a given spin in a material – in this case GaNAs (gallium nitrogen arsenide). The spin polarisation is so strong that it creates a strong polarisation of the nuclear spin in extra Ga atoms that are added as defects in the material – and this takes place at room temperature. This is the first time that strong nuclear spin polarisation of a defect atom in a solid is demonstrated at room temperature by spin-polarised conduction electrons.

“We prove experimentally that the measurable magnetic field from the nuclei, as well as the strong polarisation of the nuclear spins in the material at room temperature, comes from the dynamic polarisation of the nuclear spin in the extra added Ga atoms,” says Chen.

The researchers have also shown that the polarisation of the nuclear spin happens very quickly – potentially in less than a nanosecond (one-billionth of a second).

The method proposed also has the advantage of making use of free electrons. This makes it possible to control the polarisation of the spin in the nucleus electrically; in this way the information lying in the spin can both be initiated and read.

Article: Efficient room-temperature nuclear spin hyperpolarization of a defect atom in a semiconductor by Y. Puttisong, X. J. Wang, I.A. Buyanova, L. Geelhaar, H. Riechert, A.J. Ptak, C.W. Tu, and W.M. Chen. Nature Communications. 4: 1751 doi:10.1038/ncomms2776 (2013).

Contacts and sources:
Weimin Chen
Linköping University

Superlattice Unleashes Oxygen To Improve Fuel Cell Performance

‘Superlattice’ structure could give a huge boost to oxygen reaction in fuel cells, increasing their power potential.

New research at MIT could dramatically improve the efficiency of fuel cells, which are considered a promising alternative to batteries for powering everything from electronic devices to cars and homes.

Fuel cells make electricity by combining hydrogen, or hydrocarbon fuels, with oxygen. But the most efficient types, called solid oxide fuel cells (SOFC), have drawbacks that have limited their usefulness — including operating temperatures above 700 degrees Celsius (roughly 1300 degrees Fahrenheit). Now, MIT researchers have unraveled the properties of a promising alternative material structure for a key component of these devices.

The MIT team used a scanning tunneling microscope (STM) to study the electrical activity of a superlattice material composed of two different compounds of the elements strontium, lanthanum and cobalt. At bottom, a diagram of how they "sliced" the material on an angle to expose wider bands of the thin layers of material. The center two images show the resulting measurements of the surface topography of the material, and the activity of electrons moving through it. At top, a diagram of the molecular structures of the two compounds. 
Unleashing oxygen
Graphic courtesy of Chen et al.

The new structure, a “superlattice” of two compounds interleaved at a tiny scale, could serve as one of the two electrodes in the fuel cell. The complex material, discovered about six years ago and known as LSC113/214, is composed of two oxides of the elements lanthanum, strontium and cobalt. While one of the oxides was already known as an especially good material for such electrodes, the combination of the two is far more potent in promoting oxygen reduction than either oxide alone.

The interfaces between these two oxides were thought to be the key. But until now, no one had been able to observe the LSC113/214 interface properties in operation, at sufficiently high resolution, to figure out why it worked so well.

Oxygen reduction is one of two main reactions in a fuel cell, and the one that has limited their overall performance — so finding improved materials for that reaction could be a key advance for fuel cells, the researchers say. The new findings are published in the journal Advanced Energy Materials in a paper co-authored by graduate student Yan Chen, professors Harry Tuller and Bilge Yildiz, and three other researchers at MIT.

Yildiz, an associate professor of nuclear science and engineering, says LSC113/214 has been “a singular example” of a material with extremely high reactivity to oxygen reduction; the new results explaining why it works so well could lead to further optimization or the discovery of other materials that might perform even better.

The best of both

The key to the material’s performance, she explains, is the marriage of complementary qualities from its two constituents. One of the oxides allows superior conduction and transfer of electrons, while the other excels at holding onto oxygen atoms; to perform well as a fuel cell’s cathode — one of its two electrodes — a material needs to have both qualities.

The close proximity of the two materials in this superlattice causes them to “borrow” one another’s attributes, the MIT team found. The result is a material whose reactivity exceeds that of the best materials currently used in fuel cells, Yildiz says: “It’s the best of the two worlds.”

Now that the MIT team has analyzed LSC113/214, it may be possible to discover even better materials by conducting systematic searches, Yildiz says; the team is now working on that. “If we can crack this problem, then we can make great strides in improving the performance,” adds Tuller, a professor of ceramics and electronic materials in MIT’s Department of Materials Science and Engineering.

Unique tool enables observations

The finding was made possible by instrumentation developed in Yildiz’s laboratory at MIT for observation of electron-transfer properties on surfaces: The instrument, a modified scanning tunneling microscope (STM), can observe materials at high temperatures and in an oxygen-rich environment — “representative of the operating conditions of a fuel-cell cathode,” Yildiz says. This high-temperature phenomenon would not have been detectable with conventional methods.

Tuller describes the superlattice as a “layer cake” of the two different oxides. But these layers are vanishingly thin. To overcome this, the team “sliced” the layers on an extreme angle, exposing much wider surfaces of each. “That magnifies the layers by a hundredfold,” Tuller says.

That slicing is done using a focused ion beam, Chen explains, to expose the interface in a way that the STM can observe more easily at high temperature.

The researchers hope that with this new knowledge, it will be possible to make rapid progress in the search for better electrode materials, helping make fuel cells practical for a wide range of energy applications, from powering homes to powering mobile devices.

John Kilner, a professor of energy materials at Imperial College, London, who was not involved in this project, calls this “a very elegant set of experiments that contributes a great deal toward our understanding of the very complex problem of oxygen surface exchange.”

Kilner adds, “It waits to be seen if we can capitalize on this knowledge to aid in the construction of practical devices, but it opens up the possibility of engineering new structures with enhanced performance at low temperatures.”

The work was supported by the U.S. Department of Energy’s Basic Energy Sciences Program.

Contacts and sources:
David L. Chandler, MIT News Office

We Still Face Grave Nuclear Dangers

William J. Perry, a senior fellow at Stanford's Freeman Spogli Institute for International Studies, spoke about his efforts to reduce the arsenal of nuclear weapons around the globe. He talked with fellows at the Center for Advanced Study in the Behavioral Sciences.

Nuclear explosion
Credit: Wikipedia

Former Secretary of Defense William J. Perry says it is possible to dramatically reduce nuclear weapons and the dangers they pose, but the effort has stalled and even reversed, leaving the world at greater risk.

Perry, speaking recently to a group of fellows at the Center for Advanced Study in the Behavioral Sciences at Stanford University (CASBS), said that after years of progress toward nonproliferation, there are new indications that some nations, including the United States, are working toward building up their stockpiles.

"North Korea and Iran are both moving today to building nuclear arsenals. Russia and China have each started new nuclear programs, and the United States is considering following suit," said Perry, a senior fellow at Stanford's Freeman Spogli Institute for International Studies and the Hoover Institution. "It is imperative that we reverse this trend."

Perry has worked extensively on nonproliferation with Stanford colleagues Sidney Drell, a physicist, and former statesman George Shultz.

During the recent lunchtime discussion with CASBS fellows, Perry warned that a regional nuclear war or nuclear terrorism poses serious threats that can best be neutralized by ridding the world of the weapons and the materials to build them.

He expressed doubt, however, that Americans realize the risk.

"They believe nuclear dangers ended with the ending of the Cold War," he said. "Their children, thankfully, are no longer doing duck-and-cover drills at school, thus the danger must have passed."

But, he said, "While we no longer face an all-out nuclear attack from the Soviet Union or China, which most Americans understand, we still face grave nuclear dangers which most Americans do not understand."

He warned that without that understanding, Congress has no willingness to lead.

"In order for the world to make real progress, the United States must lead and the United States will not lead unless Americans understand the importance of doing so," he said.

Perry, who was the defense secretary under Bill Clinton, said that after years of diplomacy on the brink of nuclear war, his task now is to "try to influence other people's thinking to change."

"Fundamentally, what I'd say they need to understand is that nuclear weapons no longer provide for a national security as they did in the Cold War, but that today nuclear weapons are in fact endangering our security."

Perry plans to launch the "William J. Perry Project," which includes a memoir documenting his work both to procure nuclear weapons and to get rid of them.

Those experiences will help inform another component of the project, which is a series of educational programs about the topic, mostly directed at young people.

"I have sort of given up on my generation," he quipped.

Perry said he realizes his efforts may be "Mission Impossible," but "I do this because I believe that time is not on our side and because having helped to build our nuclear arsenal I know better than most how to dismantle it. And I believe I have a special responsibility to do so."

Contacts and sources:
Brooke Donald

Graphene's High-Speed Seesaw, Transistor With Trillions Of Switches Per Second

Writing in Nature Communications, researchers at The University of Manchester report the first graphene-based transistor with bistable characteristics, which means that the device can spontaneously switch between two electronic states. Such devices are in great demand as emitters of electromagnetic waves in the high-frequency range between radar and infra-red, relevant for applications such as security systems and medical imaging.

Bistability is a common phenomenon – a seesaw-like system has two equivalent states and small perturbations can trigger spontaneous switching between them. The way in which charge-carrying electrons in graphene transistors move makes this switching incredibly fast – trillions of switches per second.

Graphene transistors could be key in medical imaging and security devices
Credit: University of Manchester

Wonder material graphene is the world's thinnest, strongest and most conductive material, and has the potential to revolutionise a huge number of diverse applications; from smartphones and ultrafast broadband to drug delivery and computer chips. It was first isolated at The University of Manchester in 2004.

The device consists of two layers of graphene separated by an insulating layer of boron nitride just a few atomic layers thick. The electron clouds in each graphene layer can be tuned by applying a small voltage. This can induce the electrons into a state where they move spontaneously at high speed between the layers.

Because the insulating layer separating the two graphene sheets is ultra-thin, electrons are able to move through this barrier by 'quantum tunnelling'. This process induces a rapid motion of electrical charge which can lead to the emission of high-frequency electromagnetic waves.

These new transistors exhibit the essential signature of a quantum seesaw, called negative differential conductance, whereby the same electrical current flows at two different applied voltages. The next step for researchers is to learn how to optimise the transistor as a detector and emitter.
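
The negative-differential-conductance signature described above can be illustrated with a toy current–voltage curve. This is a deliberately simplified caricature, not the device model from the paper: over the NDC region the current falls as the voltage rises, so the same current appears at more than one voltage.

```python
import math

# Toy I-V characteristic with a negative-differential-conductance region:
# current rises, falls, then rises again with voltage (arbitrary units).
def current(v):
    return v**3 - 3 * v**2 + 2.5 * v

# Three distinct voltages that carry exactly the same current (0.5) -
# the hallmark of NDC described in the article.
voltages = [1 - math.sqrt(0.5), 1.0, 1 + math.sqrt(0.5)]
for v in voltages:
    print(f"V = {v:.3f}  ->  I = {current(v):.3f}")
```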

One of the researchers, Professor Laurence Eaves, said: "In addition to its potential in medical imaging and security screening, the graphene devices could also be integrated on a chip with conventional, or other graphene-based, electronic components to provide new architectures and functionality.

"For more than 40 years, technology has led to ever-smaller transistors; a tour de force of engineering that has provided us with today's state-of-the-art silicon chips which contain billions of transistors. Scientists are searching for an alternative to silicon-based technology, which is likely to hit the buffers in a few years' time, and graphene may be an answer."

"Graphene research is relatively mature but multi-layered devices made of different atomically-thin materials such as graphene were first reported only a year ago. This architecture can bring many more surprises", adds Dr Liam Britnell, University of Manchester, the first author of the paper.

Contacts and sources:

Is Antimatter Anti-Gravity?

First direct measurement of antimatter's weight compared to that of normal matter. 
Antimatter is strange stuff. It has the opposite electrical charge to normal matter and, when it meets its matter counterpart, the two annihilate in a flash of light.

UC Berkeley/LBNL physicists asked the question, does normal hydrogen (left, with a negatively charged electron orbiting a positively charged proton) weigh the same as antihydrogen (a positively charged positron orbiting a negatively charged antiproton)?
Credit: Image by Chukman So, UC Berkeley.

Four University of California, Berkeley, physicists are now asking whether matter and antimatter are also affected differently by gravity. Could antimatter fall upward – that is, exhibit anti-gravity – or fall downward at a different rate than normal matter?

Almost everyone, including the physicists, thinks that antimatter will likely fall at the same rate as normal matter, but no one has ever dropped antimatter to see if this is true, said Joel Fajans, UC Berkeley professor of physics. And while there are many indirect indications that matter and antimatter weigh the same, they all rely on assumptions that might not be correct. A few theorists have argued that some cosmological conundrums, such as why there is more matter than antimatter in the universe, could be explained if antimatter did fall upward.

In a new paper published online on April 30 in Nature Communications, the UC Berkeley physicists and their colleagues with the ALPHA experiment at CERN, the European Organization for Nuclear Research in Geneva, Switzerland, report the first direct measurement of gravity's effect on antimatter, specifically antihydrogen in free fall. Though far from definitive – the uncertainty is about 100 times the expected measurement – the UC Berkeley experiment points the way toward a definitive answer to the fundamental question of whether matter falls up or down.

“This is the first word, not the last,” said Fajans. “We’ve taken the first steps toward a direct experimental test of questions physicists and nonphysicists have been wondering about for more than 50 years. We certainly expect antimatter to fall down, but just maybe we will be surprised.”

Fajans and fellow physics professor Jonathan Wurtele employed data from the Antihydrogen Laser Physics Apparatus (ALPHA) at CERN. The experiment captures antiprotons and combines them with antielectrons (positrons) to make antihydrogen atoms, which are stored and studied for a few seconds in a magnetic trap. Afterward, however, the trap is turned off and the atoms fall out. The two researchers realized that by analyzing how antihydrogen fell out of the trap, they could determine if gravity pulled on antihydrogen differently than on hydrogen.

Antihydrogen did not behave weirdly, so they calculated that its gravitational mass can be no more than 110 times its inertial mass. If antimatter is anti-gravity – and they cannot rule that out – it doesn’t accelerate upward at more than 65 Gs.
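
The two bounds in the paragraph above amount to simple arithmetic on gravitational acceleration. This is a back-of-the-envelope restatement of the numbers quoted in this article, not the ALPHA analysis itself:

```python
g = 9.81  # m/s^2, ordinary gravitational acceleration at Earth's surface

# Bounds quoted above: antihydrogen falls with at most ~110 g, or, if
# gravity is repulsive for antimatter, rises with at most ~65 g.
max_fall = 110 * g
max_rise = 65 * g
print(f"falls at no more than {max_fall:.0f} m/s^2")  # 1079 m/s^2
print(f"rises at no more than {max_rise:.0f} m/s^2")  # 638 m/s^2
```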

"We need to do better, and we hope to do so in the next few years," Wurtele said. ALPHA is being upgraded and should provide more precise data once the experiment reopens in 2014.

The paper was coauthored by other members of the ALPHA team, including UC Berkeley postdoctoral fellow Andre Zhmoginov and lecturer Andrew Charman.

Contacts and sources: 
Robert Sanders
University of California - Berkeley

Monday, April 29, 2013

'Super-Resolution' Microscope Possible For Nanostructures

Researchers have found a way to see synthetic nanostructures and molecules using a new type of super-resolution optical microscopy that does not require fluorescent dyes, representing a practical tool for biomedical and nanotechnology research.

"Super-resolution optical microscopy has opened a new window into the nanoscopic world," said Ji-Xin Cheng, an associate professor of biomedical engineering and chemistry at Purdue University.

A new type of super-resolution optical microscopy takes a high-resolution image (at right) of graphite "nanoplatelets" about 100 nanometers wide. The imaging system, called saturated transient absorption microscopy, or STAM, uses a trio of laser beams and represents a practical tool for biomedical and nanotechnology research.
Credit: Weldon School of Biomedical Engineering, Purdue University

Conventional optical microscopes can resolve objects no smaller than about 300 nanometers, or billionths of a meter, a restriction known as the "diffraction limit," defined as half the wavelength of the light being used to view the specimen. However, researchers want to view molecules such as proteins and lipids, as well as synthetic nanostructures like nanotubes, which are a few nanometers in diameter.
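
The arithmetic behind the diffraction limit is straightforward; here is a sketch of the half-wavelength rule cited above, where the 600 nm illumination wavelength is an assumption chosen to match the article's ~300 nm figure:

```python
def diffraction_limit(wavelength_nm):
    """Smallest resolvable feature, taken here as half the illumination wavelength."""
    return wavelength_nm / 2

# Visible light around 600 nm gives the ~300 nm limit cited above.
print(diffraction_limit(600))  # 300.0

# A 10 nm target is ~30x below that limit, matching the article's goal.
print(diffraction_limit(600) / 10)  # 30.0
```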

Such a capability could bring advances in a diverse range of disciplines, from medicine to nanoelectronics, Cheng said.

"The diffraction limit represents the fundamental limit of optical imaging resolution," Cheng said. "Stefan Hell at the Max Planck Institute and others have developed super-resolution imaging methods that require fluorescent labels. Here, we demonstrate a new scheme for breaking the diffraction limit in optical imaging of non-fluorescent species. Because it is label-free, the signal is directly from the object so that we can learn more about the nanostructure."

Findings are detailed in a research paper that appeared online Sunday (April 28) in the journal Nature Photonics.

The imaging system, called saturated transient absorption microscopy, or STAM, uses a trio of laser beams, including a doughnut-shaped laser beam that selectively illuminates some molecules but not others. Electrons in the atoms of illuminated molecules are kicked temporarily into a higher energy level and are said to be excited, while the others remain in their "ground state." Images are generated using a laser called a probe to compare the contrast between the excited and ground-state molecules.

The researchers demonstrated the technique, taking images of graphite "nanoplatelets" about 100 nanometers wide.

"It's a proof of concept and has great potential for the study of nanomaterials, both natural and synthetic," Cheng said.

The doughnut-shaped laser excitation technique, invented by researcher Stefan Hell, makes it possible to focus on yet smaller objects. Researchers hope to improve the imaging system to see objects about 10 nanometers in diameter, or about 30 times smaller than possible using conventional optical microscopes.

"We are not there yet, but a few schemes can be applied to further increase the resolution of our system," Cheng said.

The paper was co-authored by biomedical engineering doctoral student Pu Wang; research scientist Mikhail N. Slipchenko; mechanical engineering doctoral student James Mitchell; Chen Yang, an assistant professor of physical chemistry at Purdue; Eric O. Potma, an associate professor of chemistry at the University of California, Irvine; Xianfan Xu, Purdue's James J. and Carol L. Shuttleworth Professor of Mechanical Engineering; and Cheng.

Future research may include work to use lasers with shorter wavelengths of light. Because the wavelengths are shorter, the doughnut hole is smaller, possibly allowing researchers to focus on smaller objects.

The work will be discussed during the third annual Spectroscopic Imaging: A New Window into the Unseen World workshop on May 23 and 24 at Purdue. The workshop is hosted by the university's Weldon School of Biomedical Engineering. More workshop information is available at http://www.conf.purdue.edu/cheng.

The research is funded by the National Institutes of Health, National Science Foundation and the Defense Advanced Research Projects Agency.

Contacts and sources:
Emil Venere
Purdue University

What Happened to Dinosaurs' Predecessors After Earth's Largest Extinction 252 Million Years Ago?

Predecessors to dinosaurs missed the race to fill habitats emptied when nine out of 10 species disappeared during Earth's largest mass extinction 252 million years ago.

Or did they?

That thinking was based on fossil records from sites in South Africa and southwest Russia.

After the ancient extinction, some animals, like Asilisaurus, had more restricted ranges.
Credit: Marlene Donnelly/Field Museum of Natural History

It turns out, however, that scientists may have been looking in the wrong places.

Newly discovered fossils from Tanzania and Zambia, dating to 10 million years after the mass extinction, reveal a lineage of animals thought to have led to dinosaurs.

That's still millions of years before dinosaur relatives were seen in the fossil record elsewhere on Earth.

"The fossil record from the Karoo of South Africa, for example, is a good representation of four-legged land animals across southern Pangea before the extinction," says Christian Sidor, a paleontologist at the University of Washington.

The extinction took out species like Dicynodon; other herbivores then moved in.
Credit: Marlene Donnelly/Field Museum of Natural History

Pangea was a landmass in which all the world's continents were once joined together. Southern Pangea was made up of what is today Africa, South America, Antarctica, Australia and India.

"After the extinction," says Sidor, "animals weren't as uniformly and widely distributed as before. We had to go looking in some fairly unorthodox places."

Sidor is the lead author of a paper reporting the findings; it appears in this week's issue of the journal Proceedings of the National Academy of Sciences.

The insights come from seven fossil-hunting expeditions in Tanzania, Zambia and Antarctica funded by the National Science Foundation (NSF). Additional work involved combing through existing fossil collections.

"These scientists have identified an outcome of mass extinctions--that species ecologically marginalized before the extinction may be 'freed up' to experience evolutionary bursts and then dominate after the extinction," says H. Richard Lane, program director in NSF's Division of Earth Sciences.

Fossils from South Africa, Zambia, Malawi, Tanzania and Antarctica were part of the research.
Credit: U of Texas at Austin/UW

The researchers created two "snapshots" of four-legged animals about five million years before, and again about 10 million years after, the extinction 252 million years ago.

Prior to the extinction, for example, the pig-sized Dicynodon--said to resemble a fat lizard with a short tail and turtle's head--was a dominant plant-eating species across southern Pangea.

After the mass extinction, Dicynodon disappeared. Related species were so greatly decreased in number that newly emerging herbivores could then compete with them.

"Groups that did well before the extinction didn't necessarily do well afterward," Sidor says.

The snapshot of life 10 million years after the extinction reveals that, among other things, archosaurs roamed in Tanzanian and Zambian basins, but weren't distributed across southern Pangea as had been the pattern for four-legged animals before the extinction.

Archosaurs, whose living relatives are birds and crocodilians, are of interest to scientists because it's thought that they led to animals like Asilisaurus, a dinosaur-like animal, and Nyasasaurus parringtoni, a dog-sized creature with a five-foot-long tail that could be the earliest dinosaur.

"Early archosaurs being found mainly in Tanzania is an example of how fragmented animal communities became after the extinction," Sidor says.

A new framework for analyzing biogeographic patterns from species distributions, developed by paper co-author Daril Vilhena of the University of Washington, provided a way to discern the complex recovery.

Fossilized remains of a new specimen of Asilisaurus, collected in Tanzania, being sorted.
Credit: Roger H. M. Smith

It revealed that before the extinction, 35 percent of four-legged species were found in two or more of the five areas studied.

Some species' ranges stretched 1,600 miles (2,600 kilometers), encompassing the Tanzanian and South African basins.

Ten million years after the extinction, there was clear geographic clustering. Just seven percent of species were found in two or more regions.

The technique--a new way to statistically consider how connected or isolated species are from each other--could be useful to other paleontologists and to modern-day biogeographers, Sidor says.
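A toy version of that statistic (the share of species recorded in two or more regions) can be sketched as follows; the species names and region assignments are invented for illustration, not taken from the study:

```python
# Toy sketch of the biogeographic connectedness measure described above:
# the fraction of species whose ranges span two or more of the studied basins.
# Species names and region assignments here are invented for illustration.

def fraction_widespread(species_regions: dict) -> float:
    """Fraction of species recorded in two or more regions."""
    widespread = sum(1 for regions in species_regions.values() if len(regions) >= 2)
    return widespread / len(species_regions)

# Hypothetical pre-extinction snapshot: ranges often span multiple basins.
before = {
    "herbivore_1": {"South Africa", "Tanzania"},
    "herbivore_2": {"South Africa", "Zambia", "Antarctica"},
    "carnivore_1": {"Tanzania"},
    "carnivore_2": {"South Africa"},
}
# Hypothetical post-extinction snapshot: ranges are far more restricted.
after = {
    "archosaur_1": {"Tanzania"},
    "archosaur_2": {"Zambia"},
    "herbivore_3": {"Malawi"},
    "herbivore_4": {"Tanzania"},
}
print(fraction_widespread(before))  # 0.5 -- half the species are widespread
print(fraction_widespread(after))   # 0.0 -- clear geographic clustering
```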

Beginning in the early 2000s, he and his co-authors conducted expeditions to collect fossils from sites in Tanzania that hadn't been visited since the 1960s, and in Zambia where there had been little work since the 1980s.

Two expeditions to Antarctica provided additional finds, as did efforts to look at museum fossils that had not been fully documented or named.

The fossils turned out to hold a treasure trove of information, the scientists say, on life some 250 million years ago.

Other co-authors of the paper are Adam Huttenlocker, Brandon Peecook, Sterling Nesbitt and Linda Tsuji from University of Washington; Kenneth Angielczyk of the Field Museum of Natural History in Chicago; Roger Smith of the Iziko South African Museum in Cape Town; and Sébastien Steyer from the National Museum of Natural History in Paris.

The project was also funded by the National Geographic Society, Evolving Earth Foundation, the Grainger Foundation, the Field Museum/IDP Inc. African Partners Program, and the National Research Council of South Africa.

Contacts and sources:
Cheryl Dybas
National Science Foundation 
Sandra Hines, UW

Revolutionary Shape-Changing Phone Curls Upon A Call, Nearly As Thin As Paper

Queen’s University’s Human Media Lab to unveil MorePhone at Paris conference

Researchers at Queen’s University’s Human Media Lab have developed a new smartphone – called MorePhone – which can morph its shape to give users a silent yet visual cue of an incoming phone call, text message or email.

“This is another step in the direction of radically new interaction techniques afforded by smartphones based on thin-film, flexible display technologies,” says Roel Vertegaal (School of Computing), director of the Human Media Lab at Queen’s University, who developed the flexible PaperPhone and PaperTab.

“Users are familiar with hearing their phone ring or feeling it vibrate in silent mode. One of the problems with current silent forms of notification is that users often miss notifications when not holding their phone. With MorePhone, they can leave their smartphone on the table and observe visual shape changes when someone is trying to contact them.”

MorePhone is not a traditional smartphone. It is made of a thin, flexible electrophoretic display manufactured by Plastic Logic – a British company and a world leader in plastic electronics. Sandwiched beneath the display are a number of shape-memory alloy wires that contract when the phone notifies the user. This allows the phone to curl either its entire body or up to three individual corners. Each corner can be tailored to convey a particular message. For example, users can set the top right corner of the MorePhone to bend when receiving a text message, and the bottom right corner when receiving an email. Corners can also repeatedly bend up and down to convey messages of greater urgency.
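The corner-per-notification scheme described above amounts to a simple lookup from message type to actuator. The sketch below illustrates that idea; the notification names, corner labels, and returned record are hypothetical, not details of the Queen's prototype:

```python
# Hypothetical sketch of mapping notification types to bending corners, as
# described above. All names and the returned actuation record are invented.

CORNER_MAP = {
    "text_message": "top_right",
    "email": "bottom_right",
}

def notify(kind: str, urgent: bool = False) -> dict:
    """Return which corner to bend; unknown kinds curl the entire body."""
    corner = CORNER_MAP.get(kind, "entire_body")
    # Urgent notifications bend the corner repeatedly rather than once.
    return {"corner": corner, "repeat": urgent}

print(notify("text_message"))        # {'corner': 'top_right', 'repeat': False}
print(notify("email", urgent=True))  # {'corner': 'bottom_right', 'repeat': True}
```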

Dr. Vertegaal thinks bendable, flexible cell phones are the future and that MorePhones could be in the hands of consumers within five to 10 years. Queen’s researchers will unveil the prototype at ACM CHI 2013 (Computer-Human Interaction) in Paris on April 29. The annual conference is the world’s premier conference on all aspects of human-computer interaction.

MorePhone was developed by Dr. Vertegaal and his School of Computing students Antonio Gomes and Andrea Nesbitt.

Contacts and sources:
Michael Onesi
Queen's University

Video: Spectacular Close-Up Views Of Giant Hurricane On Saturn

NASA's Cassini spacecraft has provided scientists the first close-up, visible-light views of a behemoth hurricane swirling around Saturn's north pole.

In high-resolution pictures and video, scientists see the hurricane's eye is about 1,250 miles (2,000 kilometers) wide, 20 times larger than the average hurricane eye on Earth. Thin, bright clouds at the outer edge of the hurricane are traveling 330 miles per hour (150 meters per second). The hurricane swirls inside a large, mysterious, six-sided weather pattern known as the hexagon.

"We did a double take when we saw this vortex because it looks so much like a hurricane on Earth," said Andrew Ingersoll, a Cassini imaging team member at the California Institute of Technology in Pasadena. "But there it is at Saturn, on a much larger scale, and it is somehow getting by on the small amounts of water vapor in Saturn's hydrogen atmosphere." 

The spinning vortex of Saturn's north polar storm resembles a deep red rose of giant proportions surrounded by green foliage in this false-color image from NASA's Cassini spacecraft. 
Image credit: NASA/JPL-Caltech/SSI

Scientists will be studying the hurricane to gain insight into hurricanes on Earth, which feed off warm ocean water. Although there is no body of water close to these clouds high in Saturn's atmosphere, learning how these Saturnian storms use water vapor could tell scientists more about how terrestrial hurricanes are generated and sustained.

Both a terrestrial hurricane and Saturn's north polar vortex have a central eye with no clouds or very low clouds. Other similar features include high clouds forming an eye wall, other high clouds spiraling around the eye, and a counter-clockwise spin in the northern hemisphere.

A major difference between the hurricanes is that the one on Saturn is much bigger than its counterparts on Earth and spins surprisingly fast. At Saturn, the wind in the eye wall blows more than four times faster than hurricane force winds on Earth. Unlike terrestrial hurricanes, which tend to move, the Saturnian hurricane is locked onto the planet's north pole. On Earth, hurricanes tend to drift northward because of the forces acting on the fast swirls of wind as the planet rotates. The one on Saturn does not drift and is already as far north as it can be. 

The north pole of Saturn, in the fresh light of spring, is revealed in this color image from NASA's Cassini spacecraft. 
Image credit: NASA/JPL-Caltech/SSI

"The polar hurricane has nowhere else to go, and that's likely why it's stuck at the pole," said Kunio Sayanagi, a Cassini imaging team associate at Hampton University in Hampton, Va.

Scientists believe the massive storm has been churning for years. When Cassini arrived in the Saturn system in 2004, Saturn's north pole was dark because the planet was in the middle of its north polar winter. During that time, Cassini's composite infrared spectrometer and visual and infrared mapping spectrometer detected a great vortex, but a visible-light view had to wait for the passing of the equinox in August 2009. Only then did sunlight begin flooding Saturn's northern hemisphere. The view required a change in the angle of Cassini's orbits around Saturn so the spacecraft could see the poles.

"Such a stunning and mesmerizing view of the hurricane-like storm at the north pole is only possible because Cassini is on a sportier course, with orbits tilted to loop the spacecraft above and below Saturn's equatorial plane," said Scott Edgington, Cassini deputy project scientist at NASA's Jet Propulsion Laboratory (JPL) in Pasadena, Calif. "You cannot see the polar regions very well from an equatorial orbit. Observing the planet from different vantage points reveals more about the cloud layers that cover the entirety of the planet."

This spectacular, vertigo-inducing, false-color image from NASA's Cassini mission highlights the storms at Saturn's north pole.
Image credit: NASA/JPL-Caltech/SSI

Cassini changes its orbital inclination for such an observing campaign only once every few years. Because the spacecraft uses flybys of Saturn's moon Titan to change the angle of its orbit, the inclined trajectories require attentive oversight from navigators. The path requires careful planning years in advance and sticking very precisely to the planned itinerary to ensure enough propellant is available for the spacecraft to reach future planned orbits and encounters.

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. JPL manages the Cassini-Huygens mission for NASA's Science Mission Directorate in Washington. The Cassini orbiter and its two onboard cameras were designed, developed and assembled at JPL. The imaging team consists of scientists from the United States, the United Kingdom, France and Germany. The imaging operations center is based at the Space Science Institute in Boulder, Colo.

 Contacts and sources:
Jia-Rui Cook
Jet Propulsion Laboratory

Volcanic Eruption Forecasting Improved

Forecasting volcanic eruptions with success is heavily dependent on recognizing well-established patterns of pre-eruption unrest in the monitoring data. But in order to develop better monitoring procedures, it is also crucial to understand volcanic eruptions that deviate from these patterns.

Cleveland Volcano in the Aleutian Islands of Alaska, photographed from the International Space Station.
Credit: Wikipedia

New research from a team led by Carnegie's Diana Roman retrospectively documented and analyzed the period immediately preceding the 2009 eruption of the Redoubt volcano in Alaska, which was characterized by an abnormally long period of pre-eruption seismic activity that's normally associated with short-term warnings of eruption. Their work is published today by Earth and Planetary Science Letters.

Well-established pre-eruption patterns can include a gradual increase in the rate of seismic activity, a progressive alteration in the type of seismic activity, or a change in ratios of gas released.

"But there are numerous cases of volcanic activity that in some way violated these common patterns of precursory unrest," Roman said. "That's why examining the unusual precursor behavior of the Redoubt eruption is so enlightening."

About six to seven months before the March 2009 eruption, Redoubt began to experience long-period seismic events, as well as shallow volcanic tremors, which intensified into a sustained tremor over the next several months. Immediately following this last development, shallow, short-period earthquakes were observed at an increased rate below the summit. In the 48 hours prior to eruption both deep and shallow earthquakes were recorded.

This behavior was unusual because precursor observations usually involve a transition from short-period to long-period seismic activity, not the other way around. What's more, seismic tremor is usually seen as a short-term warning, not something that happens months in advance. However, these same precursors were also observed during the 1989-90 Redoubt eruption, thus indicating that the unusual seismic pattern reflects some unique aspect of the volcano's magma system.

Advanced analysis of the seismic activity taking place under the volcano allowed Roman and her team to understand the changes taking place before, during, and after eruption. Their results show that the eruption was likely preceded by a protracted period of slow magma ascent, followed by a short period of rapidly increasing pressure beneath Redoubt.

Mount Redoubt.
Credit: Wikipedia

Elucidating the magma processes causing these unusual precursor events could help scientists to hone their seismic forecasting, rather than just relying on the same forecasting tools they're currently using, ones that are not able to detect anomalies.

For example, using current techniques, the forecasts prior to Redoubt's 2009 eruption wavered over a period of five months, back and forth between eruption being likely within a few weeks to within a few days. If the analytical techniques used by Roman and her team had been taken into consideration, the early risk escalations might not have been issued.

"Our work shows the importance of clarifying the underlying processes driving anomalous volcanic activity. This will allow us to respond to subtle signals and increase confidence in making our forecasts," Roman said.

Mount Rinjani eruption in 1994, in Lombok, Indonesia
This research was funded by a U.S. Geological Survey Volcano Hazards ARRA Award.

The Carnegie Institution for Science is a private, nonprofit organization headquartered in Washington, D.C., with six research departments throughout the U.S. Since its founding in 1902, the Carnegie Institution has been a pioneering force in basic scientific research. Carnegie scientists are leaders in plant biology, developmental biology, astronomy, materials science, global ecology, and Earth and planetary science.

Contacts and sources: 
Diana Roman
Carnegie Institution

From Which Ancestors Have Turtles Evolved? How Did They Get Their Shell?

From which ancestors have turtles evolved? How did they get their shell? New data provided by the Joint International Turtle Genome Consortium, led by researchers from RIKEN in Japan, BGI in China, and the Wellcome Trust Sanger Institute in the UK, provides evidence that turtles are not primitive reptiles but belong to a sister group of birds and crocodiles. The work also sheds light on the evolution of the turtle’s intriguing morphology and reveals that the turtle’s shell evolved by recruiting part of the genetic program used to build the limbs.

Green sea turtle
Credit: RIKEN

Turtles are often described as evolutionary monsters, with a unique body plan and a shell that is considered to be one of the most intriguing structures in the animal kingdom.

“Turtles are interesting because they offer an exceptional case to understand the big evolutionary changes that occurred in vertebrate history,” explains Dr. Naoki Irie, from the RIKEN Center for Developmental Biology, who led the study.

Turtle evolutionary tree
Credit: RIKEN

Using next-generation DNA sequencers, researchers from nine international institutions have decoded the genomes of the green sea turtle and the Chinese soft-shell turtle and studied the expression of genetic information in the developing turtle.

Their results, published in Nature Genetics, show that turtles are not primitive reptiles as previously thought, but are related to the group comprising birds and crocodilians, which also includes the extinct dinosaurs.

Based on genomic information, the researchers predict that turtles must have split from this group around 250 million years ago, during one of the largest extinction events ever to take place on this planet.

Turtle and chicken body plan during development

Credit: RIKEN

“We expect that this research will motivate further work to elucidate the possible causal connection between these events,” says Dr. Irie.

The study also reveals that despite their unique anatomy, turtles follow the basic embryonic pattern during development. Rather than developing directly into a turtle-specific body shape with a shell, they first establish the vertebrates’ basic body plan and then enter a turtle-specific development phase. During this late specialization phase, the group found traces of limb-related gene expression in the embryonic shell, which indicates that the turtle shell evolved by recruiting part of the genetic program used for the limbs. 

Soft-shell turtle
Credit: RIKEN

“The work not only provides insight into how turtles evolved, but also gives hints as to how the vertebrate developmental programs can be changed to produce major evolutionary novelties,” explains Dr. Irie.

Another unexpected finding of the study was that turtles possess a large number of olfactory receptors and must therefore have the ability to smell a wide variety of substances. The researchers identified more than 1000 olfactory receptors in the soft-shell turtle, which is one of the largest numbers ever to be found in a non-mammalian vertebrate.

Contacts and sources:

Citation: “The draft genomes of soft-shell turtle and green sea turtle yield insights into the development and evolution of the turtle-specific body plan.” Zhuo Wang et al. Nature Genetics, 2013. DOI: 10.1038/ng.2615

Demographic Transition Does Not Stop Human Evolution

In many places around the world, people are living longer and are having fewer children. But that’s not all. In a study of people living in rural Gambia, it appears that this modern-day “demographic transition” may lead women to be taller and slimmer, too. Researchers from the Leibniz Institute for Zoo and Wildlife Research (IZW), together with British, American and Gambian institutes and universities, just published their discovery in the Cell Press journal Current Biology.

The relationships between fitness and body measurement traits are likely to change, owing to the modification of the social, cultural, medical and economic environment. 
Photo: Felicia Webb

“This is a reminder that improvements in health do not necessarily mean that evolution stops, but that it changes,” says Alexandre Courtiol of the Leibniz Institute for Zoo and Wildlife Research in Germany. The researchers explored for the first time the two main evolutionary consequences of demographic transitions. On the one hand, by influencing mortality and fertility, a demographic transition can alter the intensity of, and scope for, Darwinian selection. On the other hand, the relationships between fitness and body measurement traits are likely to change too, owing to the modifications of the social, cultural, medical and economic environment that occurred in parallel to these demographic changes.

For their studies, Courtiol, Rickard, and their colleagues used data collected over a 55-year period (1956-2010) by the UK Medical Research Council on thousands of women from two rural villages in the West Kiang district of Gambia. Over this time period, these communities experienced significant demographic shifts – from high mortality and fertility rates, which are characteristic for preindustrial societies, to rapidly declining ones. The researchers also had thorough data on the height and weight of the women.

The researchers showed that the changes are likely related to improvements in medical care: a clinic providing free care opened there in 1974, which has changed the way natural selection acts on body size. The demographic transition influenced directional selection on women’s height and body mass index (BMI). Selection initially favoured short women with high BMI values but shifted over time to favour tall women with low BMI values. “That selection has shifted from shorter and stouter women to taller and thinner ones is partly because selection began acting less on mortality and more on fertility over time, but other environmental changes influenced which women were more or less likely to reproduce, too,” Courtiol says.

The findings in Gambia may have relevance around the globe. “Our results are important because the majority of human populations have either recently undergone, or are currently undergoing, a demographic transition from high to low fertility and mortality rates,” the researchers write. “Thus the temporal dynamics of the evolutionary processes revealed here may reflect the shifts in evolutionary pressures being experienced by human societies generally.” A change in opportunity for Darwinian selection across the transition has been documented for most populations studied, including the United States, Italy, Finland, Sweden, and India. How we humans respond to these pressures might tell us something about how we’ll continue to evolve in this ever-changing world we live in.

Contacts and sources:
Forschungsverbund Berlin e.V. (FVB)

Citation: Courtiol A, Rickard IJ, Lummaa V, Prentice AM, Fulford AJC, Stearns SC (2013): The demographic transition influences variance in fitness and selection on height and BMI in rural Gambia. CURR BIOL 23, 1–6. http://dx.doi.org/10.1016/j.cub.2013.04.006

Study Reveals: 30% Of Survey Participants Check Work-Related E-Mails Before Going To Bed And Upon Waking In The Morning

A research project at the Department of Media and Communications Science is investigating the “mediatisation of work”. One of the aspects receiving focused attention is the question of how new media and technologies are transforming the working environment.

From the spread of the personal computer in the 80s and 90s to smartphones and tablet PCs, our working life is increasingly shaped by modern information and communication technologies (ICT). This development goes hand in hand with broader societal changes in the nature of work. “As far as working individuals are concerned, this introduces new challenges as well as opportunities”, Caroline Roth-Ebner points out. She is currently conducting a study on “New Media and Work” at the Alpen-Adria-Universität.

Caroline Roth-Ebner 
 Foto: Maurer

First interim results already clearly illustrate that the workforce requires new skills in order to both seize the opportunities and meet the challenges of a mediatised working environment. In the course of her study, Caroline Roth-Ebner carried out an online survey and conducted 20 interviews with so-called “digicom workers”. These are individuals whose professional occupations mainly involve performing activities in the realm of communication and information, primarily through the use of digital technology. To a certain extent their tasks are organized virtually, which means that they can work independently of space and time.

“We have discovered that media and technology skills represent an essential prerequisite for many professions today”, Roth-Ebner comments. However, two additional skills are also gaining relevance: “Flexible working conditions without defined boundaries require competencies in the management of time and space, as well as the ability to manage borders.” The mobile office in particular, which may take the form of e-mail access via smart phone, laptop or tablet PC, poses a significant challenge, and requires a greater balancing effort in relation to the border between work and non-work. “Work follows you everywhere”, one digicom worker describes the availability trap during her interview.

Indeed, the online survey revealed that 30 per cent of the 445 participants check their e-mails before they go to bed and again upon waking. Only five per cent declared that they are unavailable to their companies or their customers during their leisure time. Companies deliberately provide their employees with prestigious smart phones, thus increasing the pressure to remain available, even outside of working hours.

Contacts and sources:

Sunday, April 28, 2013

Titan's Methane: Going, Going, Soon to Be Gone?

By tracking a part of the surface of Saturn's moon Titan over several years, NASA's Cassini mission has found a remarkable longevity to the hydrocarbon lakes on the moon's surface.

A team led by Christophe Sotin of NASA's Jet Propulsion Laboratory in Pasadena, Calif., fed these results into a model that suggests the supply of the hydrocarbon methane at Titan could be coming to an end soon (on geological timescales). The study of the lakes also led scientists to spot a few new ones in images from Cassini's visual and infrared mapping spectrometer data in June 2010. 

These images from NASA's Cassini spacecraft show one of the large seas and a bounty of smaller lakes on Saturn's moon Titan. Scientists saw these small lakes in data obtained by both Cassini's visual and infrared mapping spectrometer (left) and radar instrument (right). 
Image credit: NASA/JPL-Caltech/University of Arizona

Titan is the only other place in the solar system besides Earth that has stable liquid on its surface. Scientists think methane is at the heart of a cycle at Titan that is somewhat similar to the role of water in Earth's hydrological cycle - causing rain, carving channels and evaporating from lakes. However, the fact that the lakes seem remarkably consistent in size and shape over several years of data from Cassini's visual and infrared mapping spectrometer suggests that the lakes evaporate very slowly. Methane tends to evaporate quickly, so scientists think the lakes must be dominated by methane's sister hydrocarbon ethane, which evaporates more slowly.

The lakes are also not getting filled quickly, and scientists haven't seen more than the occasional outburst of hydrocarbon rain at the moon over the mission's eight-plus years in the Saturn system. This indicates that on Titan, the methane that is constantly being lost by breaking down to form ethane and other heavier molecules is not being replaced by fresh methane from the interior. The team suggests that the current load of methane at Titan may have come from some kind of gigantic outburst from the interior eons ago, possibly after a huge impact. They think Titan's methane could run out in tens of millions of years.

A dense network of small rivers or swampy areas appears to connect some of the seas on Titan, as seen in this comparison of data of the same area from two Cassini instruments. Images from the radar instrument are on the left and images from the visual and infrared mapping spectrometer (VIMS) are on the right.
Small rivers and swampy areas on Titan
Image credit: NASA/JPL-Caltech/University of Arizona

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. The Jet Propulsion Laboratory, a division of the California Institute of Technology, Pasadena, manages the mission for NASA's Science Mission Directorate, Washington, D.C. The visual and infrared mapping spectrometer team is based at the University of Arizona, Tucson.

Contacts and sources:
Jia-Rui Cook
Jet Propulsion Laboratory

For more information on this finding and the lakes, visit http://saturn.jpl.nasa.gov/news/cassiniscienceleague/science20130412/ . 

Where Are The Best Windows Into Europa's Interior?

The surface of Jupiter's moon Europa exposes material churned up from inside the moon as well as material produced by matter and energy arriving from above. If you want to learn about the deep saltwater ocean beneath this unusual world's icy shell -- as many people interested in possible extraterrestrial life do -- you might target your investigation of the surface somewhere with more of the up-from-below stuff and less of the down-from-above stuff.

This graphic of Jupiter's moon Europa maps a relationship between the amount of energy deposited onto the moon from charged-particle bombardment and the chemical contents of ice deposits on the surface in five areas of the moon (labeled A through E).
Energy From Above Affecting Surface of Europa
Credit: NASA/JPL-Caltech/Univ. of Ariz./JHUAPL/Univ. of Colo.

New analysis of observations made more than a decade ago by NASA's Galileo mission to Jupiter helps identify those places.

"We have found the regions where charged electrons and ions striking the surface would have done the most, and the least, chemical processing of materials emplaced at the surface from the interior ocean," said J. Brad Dalton of NASA's Jet Propulsion Laboratory, Pasadena, Calif., lead author of the report published recently in the journal Planetary and Space Science. "That tells us where to look for materials representing the most pristine ocean composition, which would be the best places to target with a lander or study with an orbiter."

Europa is about the size of Earth's moon and, like our moon, keeps the same side toward the planet it orbits. Picture a car driving in circles around a mountain with its left-side windows always facing the mountain.

Europa's orbit around Jupiter is filled with charged, energetic particles tied to Jupiter's powerful magnetic field. Besides electrons, these particles include ions of sulfur and oxygen originating from volcanic eruptions on Io, a neighboring moon.

The magnetic field carrying these energetic particles sweeps around Jupiter faster than Europa orbits Jupiter, in the same direction: about 10 hours per circuit for the magnetic field versus about 3.6 days for Europa's orbit. So, instead of our mountain-circling car getting bugs on the front windshield, the bugs are plastered on the back of the car by a "wind" from behind going nearly nine times faster than the car. Europa has a "leading hemisphere" in front and a "trailing hemisphere" in back.
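The "nearly nine times faster" figure follows directly from the two periods quoted above; a quick check:

```python
# Jupiter's magnetic field sweeps around once per ~10 hours, while
# Europa takes ~3.6 days per orbit. Both move in the same direction,
# so the field overtakes Europa at the ratio of the two angular speeds.
field_period_h = 10.0
orbit_period_h = 3.6 * 24.0          # 86.4 hours

ratio = orbit_period_h / field_period_h
print(round(ratio, 1))               # ~8.6, i.e. "nearly nine times faster"
```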

Earlier studies had found more sulfuric acid being produced toward the center of the trailing hemisphere than elsewhere on Europa's surface, interpreted as resulting from chemistry driven by sulfur ions bombarding the icy surface.

Dalton and his co-authors at JPL and at Johns Hopkins University Applied Physics Laboratory, Laurel, Md., examined data from observations by Galileo's near infrared mapping spectrometer of five widely distributed areas of Europa's surface. The spectra of reflected light from frozen material on the surface enabled them to distinguish between relatively pristine water and sulfate hydrates. These included magnesium and sodium sulfate salt hydrates, and hydrated sulfuric acid. They compared the distributions of these substances with models of how the influxes of energetic electrons and of sulfur and oxygen ions are distributed around the surface of Europa.

The concentration of frozen sulfuric acid on the surface varies greatly, they found. It ranges from undetectable levels near the center of the leading hemisphere, to more than half of the surface materials near the center of the heavily bombarded trailing hemisphere. The concentration was closely related to the amount of electrons and sulfur ions striking the surface.

"The close correlation of electron and ion fluxes with the sulfuric acid hydrate concentrations indicates that the surface chemistry is affected by these charged particles," says Dalton. "If you are interested in the composition and habitability of the interior ocean, the best places to study would be the parts of the leading hemisphere we have identified as receiving the fewest electrons and having the lowest sulfuric acid concentrations."

Surface deposits in these areas are most likely to preserve the original chemical compounds that erupted from the interior. Dalton suggests that any future spacecraft missions to Europa should target these deposits for study from orbit, or even attempt to land there.

Dalton said, "The darkest material, on the trailing hemisphere, is probably the result of externally-driven chemical processing, with little of the original oceanic material intact. While investigating the products of surface chemistry driven by charged particles is still interesting from a scientific standpoint, there is a strong push within the community to characterize the contents of the ocean and determine whether it could support life. These kinds of places just might be the windows that allow us to do that."

The study was funded by NASA's Outer Planets Research Program. NASA's Galileo mission, launched in 1989, orbited Jupiter, investigating the planet and its diverse moons from 1995 to 2003. JPL, a division of the California Institute of Technology in Pasadena, managed Galileo for NASA's Science Mission Directorate, Washington.

Contacts and sources:
Jet Propulsion Laboratory 

New Device Could Make Diagnosing Disease As Simple As Breathing

A range of diseases and conditions, from asthma to liver disease, could be diagnosed and monitored quickly and painlessly just by breathing, using gas sensing technology developed by a Cambridge spin-out.

Micrograph of a MEMS micro-heating element with integrated CMOS electronic driver and temperature-sensing circuits

Credit: Cambridge CMOS Sensors

The highly sensitive, low-power, low-cost infrared emitter developed by Cambridge CMOS Sensors (CCMOSS) can identify more than 35 biomarkers present in exhaled breath at concentrations as low as one part per million, and is being developed for non-invasive medical testing and other applications.

In addition to nitrogen, oxygen and carbon dioxide, we exhale thousands of chemical compounds with every breath: elevated acetone levels can indicate poorly controlled diabetes, asthmatics exhale higher-than-normal levels of nitric oxide, and glucose in the breath is a sign of kidney failure.

“Non-invasive breath analysis is an area of great potential for diagnosing and monitoring a wide range of medical conditions,” said Professor Florin Udrea of the Department of Engineering and CCMOSS’ CEO and co-founder. “Testing is easy and painless, and can be repeated as often as needed.”

A number of breath analysis tests are currently in research and development, most of which use mass spectrometry or lasers to analyse the breath for specific compounds. These tests can detect only a small range of compounds, however, meaning that different devices are needed to diagnose different conditions.

The technology developed by CCMOSS is different in that it uses broadband infrared radiation to make the detection of a wide range of biomarkers possible in a single device. The company’s miniature heaters, or microhotplates, can be heated from room temperature to 700°C in a fraction of a second, a temperature high enough to emit infrared radiation and allow the sensing material to react with gas molecules.
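Why 700°C is "high enough to emit infrared radiation" can be checked with Wien's displacement law (a back-of-envelope illustration, not a figure from CCMOSS):

```python
# Wien's displacement law: the peak emission wavelength (in microns)
# of a blackbody at absolute temperature T kelvin is ~2898 / T.
T_kelvin = 700 + 273.15              # microhotplate at 700 deg C
peak_um = 2898.0 / T_kelvin
print(round(peak_um, 2))             # ~2.98 microns: squarely mid-infrared
```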

Many gas molecules absorb infrared radiation. The amount of radiation absorbed allows the gas to be identified and its concentration calculated - this is the basic principle behind the roadside breathalyser test. CCMOSS’ technology, however, is far more sensitive. Using broadband infrared, the company’s gas sensing technology covers wavelengths between two and 14 microns, corresponding to a wide range of biomarkers. To select different wavelengths, a filter is applied on top of the detector so that only infrared radiation of a particular wavelength can get through.
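The identify-and-quantify step described above is essentially the Beer-Lambert law: absorbance at a gas's characteristic wavelength scales with its concentration and the optical path length. A minimal sketch with hypothetical numbers (the absorption coefficient and path length below are illustrative, not CCMOSS specifications):

```python
# Beer-Lambert: absorbance A = epsilon * c * L, so c = A / (epsilon * L).
epsilon = 2e-3      # hypothetical absorption coefficient, 1/(ppm * m)
path_m = 0.5        # hypothetical optical path length, m

def concentration_ppm(absorbance):
    """Recover gas concentration from measured absorbance."""
    return absorbance / (epsilon * path_m)

print(concentration_ppm(0.01))   # 0.01 / 0.001 = 10.0 ppm
```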

CCMOSS’ devices are based on complementary metal-oxide semiconductor (CMOS) technology, a low-power type of semiconductor widely used in microprocessors and battery-operated devices. Using CMOS processes yields miniaturised, ultra-low-power devices that can be produced at higher volume and lower cost than current state-of-the-art gas sensing devices.

Because the CMOS process is highly reproducible, all the parameters can be very tightly controlled. The manufacturing process is highly scalable and cost effective, with yields above 99 per cent.

In addition to medical applications, the company is developing its technology for use in consumer electronics, industrial security and automotive applications. It currently has a range of products on the market and is actively involved in leading-edge R&D projects for the next generation of micro- and nanosensors.

The company, which spun out from the Department of Engineering in 2009, was founded by Professors Florin Udrea and Bill Milne of Cambridge, along with Professor Julian Gardner of Warwick University. CCMOSS has been supported by seed funding from Cambridge Enterprise, the University’s commercialisation arm, and was recently named Cleantech Business of the Year at the 2013 Business Weekly awards.

Contacts and sources:

Movement Of Pyrrole Molecules Defies 'Classical' Physics

New research shows that movement of the ring-like molecule pyrrole over a metal surface runs counter to the centuries-old laws of 'classical' physics that govern our everyday world.

Representation of a pyrrole molecule
Credit: Marco Sacchi/ University of Cambridge

Using uniquely sensitive experimental techniques, scientists have found that the laws of quantum physics – usually thought to hold sway only at sub-atomic scales – can have measurable effects at the molecular level.

Researchers at Cambridge's Chemistry Department and Cavendish Laboratory say they have evidence that, in the case of pyrrole, quantum laws affecting the internal motions of the molecule change the "very nature of the energy landscape" – making this 'quantum motion' essential to understanding the distribution of the whole molecule.

The study, a collaboration between scientists from Cambridge and Rutgers universities, appeared in the German chemistry journal Angewandte Chemie earlier this month.

A pyrrole molecule's centre consists of a flat pentagon of five atoms, four carbon and one nitrogen. Each of these atoms has an additional hydrogen atom attached, sticking out like a spoke.

Following experiments performed by Barbara Lechner at the Cavendish Laboratory to determine the energy required for movement of pyrrole across a copper surface, the team discovered a discrepancy that led them down a 'quantum' road to an unusual discovery.

In previous work on simpler molecules, the scientists were able to accurately calculate the 'activation barrier' – the energy required to loosen a molecule's bond to a surface, allowing movement – using 'density functional theory', a method that treats the electrons which bind the atoms according to quantum mechanics but, crucially, deals with atomic nuclei using a 'classical' physics approach.

Surprisingly, with pyrrole the predicted 'activation barriers' were far off, with calculations "less than a third of the measured value". After much head-scratching, the puzzled scientists turned to a purely quantum phenomenon called 'zero-point energy'.

In classical physics, an object losing energy can continue to do so until it can be thought of as sitting perfectly still. In the quantum world, this is never the case: everything always retains some form of residual – even undetectable – energy, known as 'zero-point energy'.

While 'zero-point energy' is well known to be associated with motion of the atoms contained in molecules, it was previously believed that such tiny amounts of energy simply don't affect the molecule as a whole to any measurable extent, unless the molecule broke apart.

But now, the researchers have discovered that the "quantum nature" of the molecule's internal motion does affect the molecule as a whole as it moves across the surface, defying the 'classical' expectation that it is simply too big to feel quantum effects.

'Zero-point energy' moving within a pyrrole molecule is unexpectedly sensitive to the exact site occupied by the molecule on the surface. In moving from one site to another, the 'activation energy' must include a sizeable contribution due to the change in the quantum 'zero-point energy'.
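Schematically (a sketch of the idea, not the paper's exact formulation), the effective barrier the molecule must overcome is the classical energy difference between the starting site and the transition state, plus the change in zero-point energy between those two configurations:

```latex
E_a^{\mathrm{eff}}
  = \underbrace{E_{\mathrm{TS}} - E_{\mathrm{site}}}_{\text{classical barrier}}
  + \underbrace{\mathrm{ZPE}_{\mathrm{TS}} - \mathrm{ZPE}_{\mathrm{site}}}_{\Delta\mathrm{ZPE}}
```

With the classical term alone coming to less than a third of the measured value, the ΔZPE term must carry a sizeable share of the barrier.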

Scientists believe the effect is particularly noticeable in the case of pyrrole because the 'activation energy' needed for diffusion is particularly small, but many other similar molecules ought to show the same kind of behaviour.

"Understanding the nature of molecular diffusion on metal surfaces is of great current interest, due to efforts to manufacture two-dimensional networks of ring-like molecules for use in optical, electronic or spintronic devices," said Dr Stephen Jenkins, who heads up the Surface Science Group in Cambridge's Department of Chemistry.

"The balance between the activation energy and the energy barrier that sticks the molecules to the surface is critical in determining which networks are able to form under different conditions."

Learning Disabilities Affect Up To 10 Percent Of Children And Co-Occur At Higher Than Expected Rates

Up to 10 per cent of the population is affected by specific learning disabilities (SLDs), such as dyslexia, dyscalculia and autism, translating to two or three pupils in every classroom, a new study has found.

Credit: University of Melbourne

Led by Professor Brian Butterworth, a Professorial Fellow at the University of Melbourne’s School of Psychological Sciences and Emeritus Professor of cognitive neuropsychology at University College London, the study gives insight into the underlying causes of specific learning disabilities and into how to tailor teaching and learning for individual learners and education professionals.

The study found children are frequently affected by more than one learning disability and that specific learning disabilities co-occur more often than expected. For example, among children with attention-deficit/hyperactivity disorder, 33 to 45 per cent also suffer from dyslexia and 11 per cent from dyscalculia, a learning disability in mathematics.

Professor Butterworth said the results showed there were many neurological development disorders that result in learning disabilities, even in children of normal or even high intelligence.

Specific learning disabilities arise from atypical brain development with complicated genetic and environmental factors, causing such conditions as dyslexia, dyscalculia, attention-deficit/hyperactivity disorder, autism spectrum disorder and specific language impairment.

As part of the study, Professor Butterworth and colleague Yulia Kovas summarised what is known about the neural and genetic basis of SLDs to help clarify how these disabilities develop, improving teaching for individual learners as well as training for school psychologists, clinicians and teachers.

The study suggests SLDs stem from difficulties processing speech, language and numbers at the cognitive level. On the neurological side, evidence suggests each SLD is associated with an abnormality in a distinct neural network. A single neurophysiological cause may affect distinct regions in the brain, shaping an individual’s learning ability.

“We are also finally beginning to find effective ways to help learners with one or more SLDs, and although the majority of learners can usually adapt to the one-size-fits-all approach of whole class teaching, those with SLDs will need specialised support tailored to their unique combination of disabilities,” he said.

Contacts and sources:

Computer Scientists Suggest New Spin On Origins Of Evolvability

Scientists have long observed that species seem to have become increasingly capable of evolving in response to changes in the environment. But computer science researchers now say that the popular explanation of competition to survive in nature may not actually be necessary for evolvability to increase.

In a paper published this week in PLOS ONE, the researchers report that evolvability can increase over generations regardless of whether species are competing for food, habitat or other factors.

Using a simulated model they designed to mimic how organisms evolve, the researchers saw increasing evolvability even without competitive pressure.

"The explanation is that evolvable organisms separate themselves naturally from less evolvable organisms over time simply by becoming increasingly diverse," said Kenneth O. Stanley, an associate professor at the College of Engineering and Computer Science at the University of Central Florida. He co-wrote the paper about the study along with lead author Joel Lehman, a post-doctoral researcher at the University of Texas at Austin.

Kenneth Stanley's work has been cited more than 4,000 times.
 Credit: UCF

The finding could have implications for the origins of evolvability in many species.

"When new species appear in the future, they are most likely descendants of those that were evolvable in the past," Lehman said. "The result is that evolvable species accumulate over time even without selective pressure."

During the simulations, the team's simulated organisms became more evolvable without any pressure from other organisms out-competing them. The simulations were based on a conceptual algorithm.

"The algorithms used for the simulations are abstractly based on how organisms are evolved, but not on any particular real-life organism," explained Lehman.

Joel Lehman is a researcher at University of Texas at Austin. He earned his Ph.D. in computer science at UCF.

Credit: Joel Lehman

The team's hypothesis is novel and contrasts with the most popular theories of why evolvability increases.

"An important implication of this result is that traditional selective and adaptive explanations for phenomena such as increasing evolvability deserve more scrutiny and may turn out unnecessary in some cases," Stanley said.

Stanley is an associate professor at UCF. He has a bachelor's of science in engineering from the University of Pennsylvania and a doctorate in computer science from the University of Texas at Austin. He serves on the editorial boards of several journals. He has over 70 publications in competitive venues and has secured grants worth more than $1 million. His works in artificial intelligence and evolutionary computation have been cited more than 4,000 times.

Lehman has a bachelor's degree in computer science from Ohio State University and a Ph.D. in computer science from UCF. He continues his research at the University of Texas at Austin and is teaching an undergraduate course in artificial intelligence.


50 Years of Achievement: The University of Central Florida, the nation's second-largest university with nearly 60,000 students, is celebrating its 50th anniversary in 2013. UCF has grown in size, quality, diversity and reputation, and today the university offers more than 200 degree programs at its main campus in Orlando and more than a dozen other locations. Known as America's leading partnership university, UCF is an economic engine attracting and supporting industries vital to the region's success now and into the future. For more information, visit http://today.ucf.edu.

Contacts and sources:
Zenaida Gonzalez Kotala
University of Central Florida