Unseen Is Free
Thursday, September 30, 2010

NIH Scientists Describe How Salmonella Bacteria Spread In Humans

New findings by National Institutes of Health scientists could explain how Salmonella bacteria, a common cause of food poisoning, efficiently spread in people. In a study published this week in the Proceedings of the National Academy of Sciences, researchers describe finding a reservoir of rapidly replicating Salmonella inside epithelial cells.

These bacteria are primed to infect other cells and are pushed out of the epithelial layer by a newly described mechanism that frees the Salmonella to invade neighboring cells or be shed into the intestine.

VIDEO: See a time-lapse series showing hyper-replication of Salmonella bacteria (red) in epithelial cells from two to seven hours after infection.
 
Credit: NIAID VIDEO

The Centers for Disease Control and Prevention estimate that Salmonella infections sicken 40,000 people each year in the United States, though the actual number of infections is likely much higher because many cases are mild and go undiagnosed or unreported. Salmonella is also the focus of an ongoing U.S. public health investigation into contaminated chicken eggs.

"Unfortunately, far too many people have experienced the debilitating effects of Salmonella, which cause disease via largely unexplained processes, including overactive inflammatory responses," says Anthony S. Fauci, M.D., director of NIH's National Institute of Allergy and Infectious Diseases (NIAID). "This elegant study provides new insight into the origins of that inflammatory disease process."

While much is known about the human infectious cycle of Salmonella, scientists have yet to understand how the bacteria escape the gut to spread infection. Epithelial cells line the outer and inner surfaces of the body, such as the skin and gut, and form a continuous protective tissue against infection. But Salmonella have learned how to live inside epithelial cells and use them for their benefit. Salmonella protect themselves within special membrane-bound compartments, called vacuoles, inside gut epithelial cells.

Using special high-resolution microscopes to view laboratory-grown human intestinal epithelial cells and laboratory mice infected with Salmonella, an NIAID research group led by Olivia Steele-Mortimer, Ph.D., in collaboration with Bruce Vallance, Ph.D., of the University of British Columbia in Vancouver, discovered a secondary population of Salmonella not confined within a vacuole, but instead moving freely inside the epithelial cells.

This reservoir of Salmonella is distinct from vacuolar Salmonella. The bacteria multiply much faster; they have long tail-like projections, called flagella, used to move; and they exhibit a needle complex they use to pierce cells and inject their proteins. With these attributes, this population of Salmonella is genetically programmed to invade new cells.

The scientists observed that epithelial cells containing the hyper-replicating, invasive Salmonella are eventually pushed out of the intestinal tissue into the gut cavity, setting the Salmonella free. The mechanism used to push these Salmonella-infected cells into the body cavity resembles the natural mechanism humans use to shed dying or dead epithelial cells from their gut. The scientists believe that Salmonella have hijacked this mechanism to facilitate their own escape.

The human immune system, however, also senses that these are not normal, dying cells in the gut and triggers a response that includes release of interleukin-18, a small protein that sets off an inflammation cascade. Interleukin-18 also is prominent in chronic intestinal inflammation associated with autoimmune disorders, such as inflammatory bowel disease. The effects of interleukin-18 release provide an explanation for the acute intestinal inflammation associated with Salmonella infections.

The scientists hope their research leads to a treatment that prevents the spread of infection. They are focusing on how this specialized population of Salmonella escapes from its membrane-bound compartment to multiply and swim freely in the cell.

NIAID conducts and supports research—at NIH, throughout the United States, and worldwide—to study the causes of infectious and immune-mediated diseases, and to develop better means of preventing, diagnosing and treating these illnesses. News releases, fact sheets and other NIAID-related materials are available on the NIAID Web site at http://www.niaid.nih.gov.

The National Institutes of Health (NIH)—The Nation's Medical Research Agency—includes 27 Institutes and Centers and is a component of the U. S. Department of Health and Human Services. It is the primary federal agency for conducting and supporting basic, clinical and translational medical research, and it investigates the causes, treatments and cures for both common and rare diseases. For more information about NIH and its programs, visit http://www.nih.gov.

Reference: L. Knodler et al. Dissemination of invasive Salmonella via bacterial-induced extrusion of mucosal epithelia. Proceedings of the National Academy of Sciences, DOI: 10.1073/pnas.1006098107 (2010).

Turning Waste Heat Into Power By Taking Advantage Of Quantum Physics

Taking advantage of quantum physics, a new way of harvesting waste heat and turning it into electrical power holds great promise for making cars, power plants, factories and solar panels more efficient

A "forest" of molecules holds the promise of turning waste heat into electricity. UA physicists discovered that because of quantum effects, electron waves traveling along the backbone of each molecule...
Credit: University of Arizona

Unlike existing heat-conversion devices such as refrigerators and steam turbines, the devices envisioned by UA physicists Justin Bergfield and Charles Stafford require no moving parts and no ozone-depleting chemicals. Instead, a rubber-like polymer sandwiched between two metals acting as electrodes can do the trick.

Car or factory exhaust pipes could be coated with the material, less than 1 millionth of an inch thick, to harvest energy otherwise lost as heat and generate electricity.

The physicists take advantage of the laws of quantum physics, a realm not typically tapped into when engineering power-generating technology. To the uninitiated, the laws of quantum physics appear to fly in the face of how things are "supposed" to behave.

Charles Stafford (left) and Justin Bergfield discuss the flow of electron waves around a benzene ring -- the key to the quantum effects allowing for the conversion of heat into electricity.
 
Credit: University of Arizona

The key to the technology lies in a quantum law physicists call wave-particle duality: Tiny objects such as electrons can behave either as a wave or as a particle.

"In a sense, an electron is like a red sports car," Bergfield said. "The sports car is both a car and it's red, just as the electron is both a particle and a wave. The two are properties of the same thing. Electrons are just less obvious to us than sports cars."

Bergfield and Stafford discovered the potential for converting heat into electricity when they studied polyphenyl ethers, molecules that spontaneously aggregate into polymers, long chains of repeating units. The backbone of each polyphenyl ether molecule consists of a chain of benzene rings, which in turn are built from carbon atoms. The chain link structure of each molecule acts as a "molecular wire" through which electrons can travel.

"We had both worked with these molecules before and thought about using them for a thermoelectric device," Bergfield said, "but we hadn't really found anything special about them until Michelle Solis, an undergrad who worked on independent study in the lab, discovered that, lo and behold, these things had a special feature."

Using computer simulations, Bergfield then "grew" a forest of molecules sandwiched between two electrodes and exposed the array to a simulated heat source.

"As you increase the number of benzene rings in each molecule, you increase the power generated," Bergfield said.

The secret to the molecules' capability to turn heat into power lies in their structure: Like water reaching a fork in a river, the flow of electrons along the molecule is split in two once it encounters a benzene ring, with one flow of electrons following along each arm of the ring.

Bergfield designed the benzene ring circuit so that an electron is forced to travel a longer distance along one arm of the ring than along the other. This puts the two electron waves out of phase by the time they reunite on the far side of the benzene ring; when the waves meet, they cancel each other out in a process known as quantum interference. When a temperature difference is placed across the circuit, this interruption in the flow of electric charge leads to the buildup of an electric potential – a voltage – between the two electrodes.
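
The phase argument here is simple enough to verify numerically. The toy model below is an illustrative assumption, not the authors' actual Green's-function calculation: it adds two partial waves that travel different path lengths around the ring and shows the transmission collapsing when their phase difference hits an odd multiple of pi.

```python
import numpy as np

# Toy two-path interference model: an electron wave of wavenumber k
# splits at the benzene ring and recombines after traveling paths of
# lengths L1 and L2 (in bond lengths; a meta-connected ring gives 2 vs 4).
k = np.linspace(0.01, 5.0, 1000)              # wavenumber, arbitrary units
L1, L2 = 2.0, 4.0

amplitude = np.exp(1j * k * L1) + np.exp(1j * k * L2)
transmission = np.abs(amplitude) ** 2 / 4.0   # normalized so the max is 1

# Destructive interference: transmission -> 0 where k*(L2 - L1) is an
# odd multiple of pi. A temperature difference across such a node then
# drives the voltage buildup described above.
for target in (np.pi, 3 * np.pi):
    idx = np.argmin(np.abs(k * (L2 - L1) - target))
    print(f"k = {k[idx]:.2f}: transmission = {transmission[idx]:.4f}")
```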

Wave interference is a concept exploited by noise-cancelling headphones: Incoming sound waves are met with counter waves generated by the device, wiping out the offending noise.

"We are the first to harness the wave nature of the electron and develop a concept to turn it into usable energy," Stafford said.

Like solid-state computer memory as compared with a spinning hard drive, the UA-designed thermoelectric devices require no moving parts. By design, they are self-contained, and easier to manufacture and maintain than currently available technology.

"You could just take a pair of metal electrodes and paint them with a single layer of these molecules," Bergfield said. "That would give you a little sandwich that would act as your thermoelectric device. With a solid-state device you don't need cooling agents, you don't need liquid nitrogen shipments, and you don't need to do a lot of maintenance."

"You could say, instead of Freon gas, we use electron gas," Stafford added.

"The effects we see are not unique to the molecules we used in our simulation," Bergfield said. "Any quantum-scale device where you have a cancellation of electric charge will do the trick, as long as there is a temperature difference. The greater the temperature difference, the more power you can generate."

Molecular thermoelectric devices could help solve an issue currently plaguing photovoltaic cells harvesting energy from sunlight.

"Solar panels get very hot and their efficiency goes down," Stafford said. "You could harvest some of that heat and use it to generate additional electricity while simultaneously cooling the panel and making its own photovoltaic process more efficient."

"With a very efficient thermoelectric device based on our design, you could power about 200 100-Watt light bulbs using the waste heat of an automobile," he said. "Put another way, one could increase the car's efficiency by well over 25 percent, which would be ideal for a hybrid since it already uses an electrical motor."

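Stafford's numbers are easy to sanity-check with back-of-envelope arithmetic. The engine output below is an assumed round figure, not a value from the article:

```python
# Rough check of the "200 light bulbs" claim.
bulbs, watts_per_bulb = 200, 100
recovered_kw = bulbs * watts_per_bulb / 1000.0    # 20 kW from waste heat

engine_kw = 75.0   # assumed mechanical output of a typical passenger car
print(f"{recovered_kw:.0f} kW recovered ~ "
      f"{100 * recovered_kw / engine_kw:.0f}% of engine output")
# -> 20 kW, roughly 27%, consistent with "well over 25 percent"
```
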
So, next time you watch a red sports car zip by, think of the hidden power of the electron and how much more efficient that sports car could be with a thermoelectric device wrapped around its exhaust pipe.

Contacts and sources:

Reference: Justin Bergfield, Michelle Solis and Charles Stafford. Giant Thermoelectric Effect from Transmission Supernodes. ACS Nano, September 2010.

Funding for this research was provided by the University of Arizona physics department.

Different Tea Leaves Identified Using Neural Networks; Mathematical Algorithms Mimic Human Nervous System

A team of chemists from the University of Seville (US) has managed to distinguish between different kinds of tea leaves on the basis of their mineral content by using artificial neural networks. The technique makes it possible to differentiate between the five main varieties of tea – white, green, black, Oolong and red.

"This method makes it possible to clearly differentiate between the five types of tea – something that is often not easy to do by eye alone – by using analysis of the leaves' mineral content and then mathematically processing these data", José Marcos Jurado, co-author of the study and a researcher at the US, tells SINC.

Tea leaves are identified using neural networks.
 
Credit: J. Marcos Jurado et al.

The technique makes it possible to distinguish between the five main tea varieties (white, green, black, Oolong and red) using chemometrics, a branch of chemistry that uses mathematics to extract useful information from data obtained in the laboratory.

Firstly, the concentrations of the chemical elements in the leaves were determined using 'inductively-coupled plasma atomic emission spectroscopy', which showed the most abundant elements to be calcium, magnesium, potassium, aluminium, phosphorus and sulphur.

Other essential elements were also identified in the tea, such as copper, manganese, iron and zinc, according to this study, which has been published online in the journal Food Chemistry.

Once the mineral content of the leaves was established, probabilistic neural networks were used to find out which type of tea a sample belonged to. These networks are "mathematical algorithms that mimic the behaviour of the neurons in the human nervous system in order to process the information", the expert explains.

This generates a model that receives an input signal (the chemical data) and produces an output, making it possible to predict the type of tea in a sample with 97% probability.
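
A probabilistic neural network is compact enough to sketch in full. The version below is a generic Parzen-window classifier of the kind the paper describes; the mineral profiles are invented stand-ins, not the study's measurements.

```python
import numpy as np

def pnn_predict(X_train, y_train, x, sigma=0.5):
    """Minimal probabilistic neural network (Parzen-window classifier).
    Every training sample acts as a 'pattern neuron'; a class's score is
    the mean Gaussian kernel between x and that class's samples."""
    scores = {}
    for cls in np.unique(y_train):
        d2 = np.sum((X_train[y_train == cls] - x) ** 2, axis=1)
        scores[cls] = np.exp(-d2 / (2 * sigma ** 2)).mean()
    total = sum(scores.values())
    return {cls: s / total for cls, s in scores.items()}

# Hypothetical mineral profiles (e.g. Ca, Mg, K in mg/g) -- made-up
# stand-ins for the paper's measured concentrations.
X = np.array([[4.1, 1.9, 16.0], [4.3, 2.0, 15.5],    # "green"
              [6.0, 2.5, 12.0], [5.8, 2.6, 12.4]])   # "black"
y = np.array(["green", "green", "black", "black"])
print(pnn_predict(X, y, np.array([4.2, 1.95, 15.8])))
```

The smoothing width sigma is the network's one tunable parameter, typically chosen by cross-validation.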

The second most commonly drunk beverage in the world

Tea is the second most commonly drunk beverage in the world after water, and it has been consumed since around 2700 BCE. The infusion is prepared from the plant Camellia sinensis. The five tea varieties result from the different kinds of preparation process that the leaves are subjected to after being harvested.

White tea is a non-fermented tea made up of new buds and leaves that are protected from sunlight as they grow in order to limit chlorophyll production. Green tea is another unfermented tea, but it is made by using older green leaves.

The Oolong and black tea varieties are both made by fermenting the leaves: black tea is completely fermented, while Oolong undergoes an intermediate, controlled fermentation of between 10% and 70%.

Red, or Pu-erh, tea is a fermented product obtained from another variety of the plant, Camellia sinensis var assamica, which is cultivated in the Chinese region of Yunnan.

The health benefits of the leaves of this plant are well known. Aside from acting as an antioxidant and diuretic and helping to relieve hypertension, tea is also an important source of essential elements such as aluminium, copper, zinc, calcium and potassium.

Contacts and sources:
Citation: James S. McKenzie, José Marcos Jurado and Fernando de Pablos. "Characterisation of tea leaves according to their total mineral content by means of probabilistic neural networks". Food Chemistry 123 (3): 859, 2010. DOI: 10.1016/j.foodchem.2010.05.007.

Rice Researchers Find Metallacarboranes May Meet DOE Hydrogen Storage Goals

New research by Rice University scientists suggests that a class of material known as metallacarborane could store hydrogen at or better than benchmarks set by the United States Department of Energy (DOE) Hydrogen Program for 2015.

The work could receive wide attention as hydrogen comes into play as a fuel of the future for cars, in fuel cells and by industry.

Metallacarborane 
 

The new study by Rice theoretical physicist Boris Yakobson and his colleagues, which appears online in the Journal of the American Chemical Society, taps the power of the transition metals scandium and titanium to hold a load of hydrogen molecules -- but not so tightly that the hydrogen can't be extracted.

A matrix made of metallacarboranes would theoretically hold up to 8.8 percent of its weight in hydrogen atoms, which would at least meet and perhaps surpass DOE milestones issued a year ago for cars that would run on hydrogen fuel.

Yakobson, a professor in mechanical engineering and materials science and of chemistry at Rice, said inspiration for the new study came from the development of metallacarboranes, now well-known molecules that combine boron, carbon and metal atoms in a cage-like structure.

"A single metal atom can bind multiple hydrogen molecules," Yakobson said, "but metals also tend to aggregate. Without something to hold them, they clump into a blob and are useless."

Abhishek Singh, lead author of the study, a former postdoctoral researcher for Yakobson and now an assistant professor at the Indian Institute of Science in Bangalore, India, calculated that boron clusters would grip the titanium and scandium, which would in turn bind hydrogen. "The metals fit like a gem in a setting, so they don't aggregate," Yakobson said. Carbon would link the clusters to form a matrix called a metal organic framework (MOF), which would act like a sponge for hydrogen.

Investigation of various transition metals showed scandium and titanium to have the highest rate of adsorption (the adhesion of transient molecules -- like hydrogen -- to a surface). Both demonstrate an affinity for "Kubas" interaction, a trading of electrons that can bind atoms to one another in certain circumstances. "Kubas is a special interaction that you often see mentioned in hydrogen research, because it gives exactly the right binding strength," Yakobson said.

"If you remember basic chemistry, you know that covalent bonds are very strong. You can bind hydrogen, but you cannot take it out," he said. "And on the other extreme is weak physisorption. The molecules don't form chemical bonds. They're just exhibiting a weak attraction through the van der Waals force.

"Kubas interaction is in the middle and gives the right kind of binding so hydrogen can be stored and, if you change conditions -- heat it up a little or reduce pressure -- it can be taken out. You want the framework to be like a fuel tank."

Kubas allows for reversible storage of hydrogen in ambient conditions -- ranging from well above to well below room temperature -- and that would make metallacarborane materials highly attractive for everyday use, Yakobson said. Physisorption of hydrogen by the carbon matrix, already demonstrated, would also occur at a much lower percentage, which would be a bit of a bonus, he said.

Other studies have demonstrated how to make carborane-based MOFs. "That means they can already make three-dimensional frameworks of material that are still accessible to gas. This is very encouraging to us," Yakobson said. "There are many papers where people analyze a cluster and say, 'Oh, this will also absorb a hydrogen,' but that's not useful. One cluster is nothing.

"But if chemists can synthesize this particular framework with metallacarborane as an element, this may become a reality."

Arta Sadrzadeh, a graduate student in Yakobson's lab, is a co-author.

Contacts and sources:
Read the abstract here: http://pubs.acs.org/doi/abs/10.1021/ja104544s

Newly Discovered Planet Gliese 581g May Have Water On Its Surface

A team of astronomers that includes the University of Hawaiʻi at Mānoa's Nader Haghighipour has announced the discovery of a planet that could have liquid water on its surface.

The planet, which is probably 30 percent larger than Earth, was discovered using one of the telescopes of the W. M. Keck Observatory on Mauna Kea. It orbits a relatively small star, Gliese 581, that is 20 light-years from Earth in the constellation Libra.

"By determining the orbit of this planet, we can deduce that its surface temperature is similar to that of Earth," said Haghighipour. This means that at least some of any water on the surface of the planet and in its atmosphere will be in liquid form rather than ice or vapor. The discovery of liquid water in space is an important step in the search for extraterrestrial life.

The team estimates that the new planet, called Gliese 581g, has a mass three to four times that of Earth, and orbits its star in just under 37 Earth days. Its mass indicates that it is probably a rocky planet with enough gravity to hold on to its atmosphere. It is one of six known planets orbiting the star.

To discover the planet, the team looked for the tiny changes in the star's velocity that arise from the gravitational tugs of its planets. They used 238 separate observations of Gliese 581 taken over a period of 11 years.

Haghighipour said that the team is keeping tabs on many nearby stars using the Keck Observatory. "As we collect more and more data about how these stars are moving, we expect to find many more planets with potentially Earth-like conditions," he said. He noted that learning more about the conditions on these planets would take even bigger telescopes, such as the Thirty Meter Telescope planned for Mauna Kea.

The team that made the discovery is led by Steven Vogt of the University of California, Santa Cruz (UCSC) and Paul Butler of the Carnegie Institution of Washington. Other team members include UCSC associate research scientist Eugenio Rivera, and Gregory Henry and Michael Williamson of Tennessee State University.

This research was supported by grants from the National Science Foundation, NASA, and the NASA Astrobiology Institute.

Contacts and sources:

For a related press release, see http://news.ucsc.edu/2010/09/planet.html.

Underwater Robot Swims Free Thanks To York U-Designed Wireless Controller

A waterproof controller designed and built by York University researchers is allowing an underwater robot to go “wireless” in a unique way.

AQUA, an amphibious, otter-like robot, is small and nimble, with flippers rather than propellers, designed for intricate data collection from shipwrecks and reefs.
 
Credit: York University

The robot, a joint project of York, McGill and Dalhousie universities, can now be controlled wirelessly using a waterproof tablet built at York. While underwater, divers can program the tablet to display tags onscreen, similar to the barcodes read by smartphones. The robot’s on-board camera then scans these two-dimensional tags to receive and carry out commands.
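
The tag scheme amounts to a small visual command protocol: the tablet renders a tag, the robot's camera decodes it to an ID, and the ID indexes a command table. The sketch below illustrates only the dispatch step; the IDs and command names are hypothetical, since the article does not spell out AQUA's actual command set.

```python
# Hypothetical tag-ID -> command table; a real system would first decode
# the ID from the camera image (e.g. with a fiducial-marker library).
COMMANDS = {
    1: "surface",
    2: "descend",
    3: "hover",
    4: "start_video",
}

def handle_tag(tag_id: int) -> str:
    """Map a decoded tag ID to a robot command, ignoring unknown tags."""
    return COMMANDS.get(tag_id, "noop")

print(handle_tag(2))   # -> "descend"
```

Because the tablet can render any tag on demand, new commands can be composed underwater rather than fixed in advance, which is the advantage over the laminated flashcards described below.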

Cutting the cord on underwater robots has been a longstanding challenge for scientists; water interferes with radio signals, hindering traditional wireless communication via modem. Tethered communication is cumbersome and can create safety issues for divers.

“Having a robot tethered to a vehicle above water creates a scenario where communication between the diver, robot, and surface operator becomes quite complicated,” says Michael Jenkin, professor in York’s Faculty of Science & Engineering and co-author of the forthcoming paper, Swimming with Robots: Human Robot Communication at Depth.

“Investigating a shipwreck, for example, is a very delicate operation and the diver and robot need to be able to react quickly to changes in the environment. An error or a lag in communication could be dangerous,” Jenkin says.

Realizing there was no device on the market that fit the bill, Jenkin and his team at York’s Centre for Vision Research, including the paper’s lead author, MSc student Bart Verzijlenberg, set to work constructing a prototype. The resulting device, fittingly dubbed AQUATablet, is watertight to a depth of 60 feet. Aluminum housing with a clear acrylic cover protects the tablet computer, which can be controlled by a diver using toggle-switches and on-screen prompts.

“A diver at 60 feet can actually teleoperate AQUA 30-40 feet deeper. Needless to say this is much easier on the diver, physically, and much safer,” Jenkin says.

The tablet also allows divers to command the robot much as if they were using a video game joystick; turn the tablet right and AQUA turns right, too. In this mode, the robot is connected to the tablet by a slim length of optical cable, circumventing many of the issues of a robot-to-surface tether. The optical cable also allows AQUA to provide video feedback from its camera to the operator. In a totally wireless mode, the robot acknowledges prompts by flashing its on-board light. Its cameras can be used to build 3-D models of the environment which can then be used to guide the robot to particular tasks.

“This is a huge improvement on [a robot] having to travel to the surface to communicate with its operators,” Jenkin says.

In the past, divers have used laminated flashcards to communicate visually with robots underwater. However, these limit the diver to a pre-set sequence of commands.

“It’s impossible to anticipate everything you’re going to want the robot to do once you get underwater. We wanted to develop a system where we could create commands on the fly, in response to the environment,” he says.

Jenkin and Verzijlenberg’s paper will be presented at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) in Taiwan.

Jenkin and Verzijlenberg are two of the researchers based in York’s new state-of-the-art Sherman Health Science Research Centre, which officially opened on Sept. 14, 2010.  Jenkin leads the Canadian Centre for Field Robotics, which is based on the building’s main level. The centre is supported by a grant from the Canada Foundation for Innovation (CFI). The AQUA project is funded in part by the Natural Sciences and Engineering Research Council of Canada (NSERC). York’s Centre for Vision Research is part of the Faculty of Health.

A Step Closer to Big Bang Conditions at Large Hadron Collider

Since December, the Large Hadron Collider (LHC) has been smashing particles together at record-setting energy levels. Physicists hope that those high-energy collisions could replicate the conditions seen immediately after the Big Bang, shedding light on how our universe came to be. Now, data from collisions that took place in July suggest that the LHC may have taken a step toward that goal.

Proton-proton collisions at the Large Hadron Collider produce hundreds of particles. Some of those particles form pairs that display an unexpected correlation.
Image credit: CERN

The finding, which has been submitted to the Journal of High Energy Physics, comes from proton-proton collisions that occurred in the LHC in July, each of which produced 100 or more charged particles. One of the two large, general-purpose detectors at LHC, the Compact Muon Solenoid (CMS) experiment, measured the path that each of these particles took after the collision.

The CMS physicists observed a surprising new phenomenon in some pairs of those particles: they appeared to be associated with one another at the point of collision. That is, when certain pairs of particles fly away from each other after the collision, their respective directions appear to be correlated. Such correlations between particles moving apart at near the speed of light had not been seen before in proton collisions.
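
Concretely, such correlations are measured by histogramming angle differences over all particle pairs in an event. The sketch below shows only the bare pair-counting on uncorrelated toy tracks; the actual CMS analysis adds acceptance corrections and mixed-event normalization.

```python
import numpy as np

# Toy two-particle angular correlation: histogram the azimuthal-angle
# difference for all track pairs with a large pseudorapidity separation.
rng = np.random.default_rng(0)
n_tracks = 120                           # a high-multiplicity event
phi = rng.uniform(-np.pi, np.pi, n_tracks)
eta = rng.uniform(-2.4, 2.4, n_tracks)   # roughly the CMS tracker range

i, j = np.triu_indices(n_tracks, k=1)    # every distinct pair once
dphi = (phi[i] - phi[j] + np.pi) % (2 * np.pi) - np.pi
deta = np.abs(eta[i] - eta[j])

# The "ridge" CMS reported is an excess of pairs near dphi ~ 0 even at
# large |deta|; uncorrelated toy tracks give a flat histogram instead.
counts, _ = np.histogram(dphi[deta > 2.0], bins=12, range=(-np.pi, np.pi))
print(counts)
```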

“As soon as the measurement came out — first within the experiment and then presented publicly — there was a lot of debate about the possible explanation,” says Gunther Roland, a physicist in the MIT heavy-ion group who was one of the leaders of the analysis of the data along with MIT postdoctoral associate Wei Li.

Some of the proposed explanations are based on subtle effects in the scattering of the quarks that make up the colliding protons, which may not be described by current models of these interactions. Others assume that the effect is the result of the high density of particles in the early stages of the collision, says Roland, an associate professor of physics at MIT.

At the Relativistic Heavy Ion Collider at Brookhaven National Laboratory, physicists have observed similar phenomena following collisions of heavier particles such as copper and gold ions. One explanation for the observation at Brookhaven is that the quarks and gluons were forced together at such high densities that they were freed, becoming quark-gluon plasma — the hot soup of elementary particles that existed for a few millionths of a second after the Big Bang and that subsequently cooled and formed protons and neutrons, the building blocks of matter.

In the upcoming months, physicists plan to increase the intensity of LHC proton beams, providing at least 100 times more data that can be used to further study this phenomenon. They also plan to run beams of heavier ions, such as lead. Based on these studies, it will be possible to eliminate many of the proposed explanations and to study if the effects in proton-proton and heavy ion collisions are related.

Sources and contacts:  Anne Trafton, MIT News Office

Knot In The Ribbon At The Edge Of The Solar System 'Unties'

The unusual "knot" in the bright, narrow ribbon of neutral atoms emanating from the boundary between our solar system and interstellar space appears to have "untied," according to a paper published online in the Journal of Geophysical Research.

Researchers believe the ribbon, first revealed in maps produced by NASA's Interstellar Boundary Explorer (IBEX) spacecraft, forms in response to interactions between interstellar space and the heliosphere, the protective bubble in which the Earth and other planets reside. Sensitive neutral atom detectors aboard IBEX produce global maps of this region every six months.

One of the clear features visible in the IBEX maps is an apparent knot in the ribbon. Scientists were anxious to see how this structure would change with time. The second map showed that the knot in the ribbon somehow spread out. It is as if the knot in the ribbon was literally untangled over only six months. This visualization shows a close-up of the ribbon (green and red) superimposed on the stars and constellations in the nighttime sky. The animation begins by looking toward the nose of the heliosphere and then pans up and left to reveal the knot. The twisted structure superimposed on the map is an artist's conception of the tangled up ribbon. The animation then shows this structure untangling as we fade into the second map of the heliosphere. 
Credit: IBEX Science Team/Goddard Scientific Visualization Studio/ESA

Analyses of the first map, released last fall, suggest the ribbon is somehow ordered by the direction of the local interstellar magnetic field outside the heliosphere, influencing the structure of the heliosphere more than researchers had previously believed. The knot feature seen in the northern portion of the ribbon in the first map stood apart from the rest of the ribbon as the brightest feature at higher energies.

While the second map, released publicly with the just-published paper, shows the large-scale structure of the ribbon to be generally stable within the six-month period, changes are also apparent. The polar regions of the ribbon display lower emissions and the knot diminishes by as much as a third and appears to "untie" as it spreads out to both lower and higher latitudes.

"What we're seeing is the knot pull apart as it spreads across a region of the ribbon," says Dr. David J. McComas, IBEX principal investigator and an assistant vice president at Southwest Research Institute in San Antonio. "To this day the science team can't agree on exactly what causes the knot or the ribbon, but by comparing different sky maps we find the surprising result that the region is changing over relatively short time periods. Now we have to figure out why."

The IBEX science team compares the first and second maps to reveal whether there are time variations in the ribbon or the more distributed emissions around the ribbon. This animation fades between the first and second IBEX maps. We see that the first and second maps are relatively similar; however, there are significant time variations as well. These time variations are forcing scientists to try to understand how the heliosphere can be changing so rapidly.
Credit: IBEX Science Team/Goddard Space Flight Center

As the IBEX spacecraft gathers a wealth of new information about the dynamic interactions at the edge of the solar system — the region of space that shields our solar system from the majority of galactic cosmic ray radiation — the IBEX team continues to study numerous theories about the source of the ribbon and its unusual features.

The paper, "The evolving heliosphere: Large-scale stability and time variations observed by the Interstellar Boundary Explorer," by D.J. McComas, M. Bzowski, P. Frisch, G.B. Crew, M.A. Dayeh, R. DeMajistre, H.O. Funsten, S.A. Fuselier, M. Gruntman, P. Janzen, M.A. Kubiak, G. Livadiotis, E. Mobius, D.B. Reisenfeld, and N.A. Schwadron, was published online Sept. 29 in the American Geophysical Union's Journal of Geophysical Research.

IBEX is the latest in NASA's series of low-cost, rapidly developed Small Explorers space missions. Southwest Research Institute in San Antonio leads and developed the mission with a team of national and international partners. NASA's Goddard Space Flight Center in Greenbelt, Md., manages the Explorers Program for NASA's Science Mission Directorate in Washington.

Contacts and sources:

Searching for Dense Water Cascades in the Arctic Ocean

The Arctic is one of the regions of the planet most sensitive to the effects of global climate change. The RV Jan Mayen research cruise, joined by a team of experts from the GRC-CM Marine Geosciences group at the University of Barcelona, is aimed at studying the phenomenon of dense water cascading and its relationship with climate change in an area to the west of the Svalbard Islands in the Arctic Circle.

The initiative is part of the HERMIONE project (Hotspot Ecosystem Research and Man’s Impact on European Seas), organized under the Seventh EU Framework Programme, which analyses deep-sea ecosystems and the impact of human activity on the ocean floor.

Dense water cascades, formed as surface waters cool or evaporate, quickly transfer matter and energy to the ocean floor in a recurring process that carries oxygen and nutrients to deep-sea areas.
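
The physics driving cascading fits in a one-line equation of state: density rises as water cools or grows saltier. The linearized form below uses typical textbook coefficients, an assumption for illustration; ocean models use the full nonlinear equation.

```python
# Linearized seawater equation of state (typical coefficient values).
RHO0, T0, S0 = 1027.0, 10.0, 35.0   # reference kg/m^3, deg C, psu
ALPHA = 1.7e-4                      # thermal expansion coefficient (1/K)
BETA = 7.6e-4                       # haline contraction coefficient (1/psu)

def density(T, S):
    return RHO0 * (1.0 - ALPHA * (T - T0) + BETA * (S - S0))

# Cool surface water from 4 C to -1 C, as wind over a polynya can do:
print(f"{density(4.0, 35.0):.2f} -> {density(-1.0, 35.0):.2f} kg/m^3")
# The cooled water ends up denser than the water beneath it, so it
# sinks and cascades downslope, carrying oxygen and nutrients along.
```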

The Arctic Ocean is a strategic location for the study of cascading.
Credit: University of Barcelona

However, if surface waters do not cool sufficiently, due to global warming or other factors, the process may stall, affecting the fragile equilibrium of deep-sea ecosystems. "Thousands of metres below the surface, the cascading mechanism is yet more proof of the creeping influence of climate change", explains Miquel Canals, head of the UB's Marine Geosciences Research Group and lead author of an article that describes this phenomenon as observed in the north-western Mediterranean (Nature, 2006).

Mooring lines in the ocean depths

The Arctic Ocean is a strategic location for the study of cascading. Aboard the RV Jan Mayen, a specialist research ship operated by the University of Tromsø (Norway), the geologists Antoni Calafat, Anna Sànchez-Vidal and Ruth Duran from the Department of Stratigraphy, Paleontology and Marine Geosciences have installed a series of sophisticated instruments on the ocean floor to record data on dense water cascades and assess their overall impact on the marine ecosystem and deep-sea areas. "Our aim is to understand the dynamics of cascading in polar latitudes and to study the environmental changes that the phenomenon could bring about on the ocean floor. To obtain data, we have installed four mooring lines with current meters and sediment traps at depths of 1000, 1250, 1500 and 2000 metres," says oceanographer Anna Sànchez-Vidal.

The equipment will record oceanographic and geochemical data at regular intervals and is scheduled to be retrieved at the end of the summer of 2011. Sànchez-Vidal explains that "the data will give us a time series of measurements showing the properties of water masses at different times (speed and direction of current, temperature, salinity, turbidity, etc.) and the sediment transport profile". For the first time in an Arctic study, these data will be complemented by specific examination of microorganisms, which are reliable indicators of environmental changes in deep-sea ecosystems.

The Mediterranean versus the Arctic

The process of cascading has been widely observed by scientists in the Mediterranean. However, the Arctic presents a different series of conditions. As Antoni Calafat explains, "The surface of the Arctic Ocean is divided between one part that remains frozen throughout the year and another, much larger part that freezes during the winter, which leads to a different pattern of cascading. Ice is a good thermal insulator. In addition, in the Arctic we also find polynyas, which are areas of open water surrounded by surface ice, where the wind cools surface water masses and accelerates the formation of dense water. However, this process is dependent on seasonal conditions and can vary from year to year. The relief of the ocean floor is also different in the Arctic to that of the Mediterranean Basin, and the cascading process could drag large quantities of organic matter to deeper areas".

The relief of the ocean floor also affects current dynamics during cascading. As oceanographer Ruth Duran explains, "The morphological parameters of the Svalbard Islands are very different to those of the Mediterranean Basin. We know that morphology, as in the case of Cap de Creus, determines the intensity and direction of currents in the Mediterranean. So during the expedition we produced detailed maps of the ocean floor in the study area – covering some 2,600 km² – that had not been fully charted until then, and this enabled us to determine the precise locations to install the mooring lines".

Cascading is a process that has a particular impact in polar areas and high latitudes. Experts studying the Mediterranean have identified a relationship between cascading and the biological productivity of the red shrimp (Aristeus antennatus). Could cascading have a similar biological effect in other areas of the planet? Research carried out by the Marine Geosciences group, which has worked extensively in the Antarctic, will reveal new information about dense water cascades, climate change and its impact on deep-sea ecosystems. The oceanography study carried out on board the RV Jan Mayen during the summer of 2010 was directed by the expert Jurgen Mienert (University of Tromsø) and contributed to by teams led by Roberto Danovaro (CoNISMa, Polytechnic University of Marche, Italy), Serge Heussner (CEFREM-University of Perpignan, France), Joan Grimalt (IDAEA, CSIC, Spain) and Leonardo Langone (ISMAR-Bologna, Italy), as well as other noted experts.

Source: Universidad de Barcelona news release

Yummy Basalt, Tasty Granite: How Microbes Eat Rocks and Make Soil

This microscopic image reveals an example of a mycorrhiza in a mesquite tree. A mycorrhiza is a mutually beneficial association between plant roots and a fungus, seen here as ball- and thread-like structures, in which each partner makes nutrients available that the other can't produce on its own.
 Mycorrhiza
Photo credit: F. Solis-Dominguez

Biosphere 2 researchers received a $424,623 grant from the National Science Foundation to study how plants and microbes interact to chew away on minerals and make new soil.

The National Science Foundation has awarded a three-year, $424,623 grant to UA researchers to investigate how plants and microbes influence mineral weathering and leaching of mineral-forming chemical elements to make new soil.

"We will investigate how Ponderosa Pine and Buffalo Grass – two plant species common to the western U.S. – and microorganisms work together to mine nutrients from rock and make new soil," said project leader Katerina Dontsova, an assistant research professor at Biosphere 2 Earthscience.

Biosphere 2 Director Travis Huxman, a professor in the UA's department of ecology and evolutionary biology, said: "Biosphere 2 is ideally suited to serve as a center for this work because of its focus on interdisciplinary research that addresses Earth system processes." 

The researchers will be able to manipulate parameters such as rock type and presence or absence of microorganisms and/or plants independently and study the effects on weathering processes, nutrient uptake by plants and leaching of minerals into the soil.

In addition to Dontsova and Huxman, the team includes Raina Maier, associate director of the UA's Superfund Research Program; Jon Chorover, a professor in the UA's department of soil, water and environmental science; and Julia Perdrial, a researcher in the same department.

The team will be able to apply some of the techniques developed during the Superfund Research Program.

"We look at the effects of plant growth on geochemistry in mine tailings as an example of how biological weathering transforms the Earth's surface," Maier said.

The new project is expected to maintain a two-way information and methodological transfer that would benefit both the NSF-funded project and existing mine-tailing research. 

"Plants and microbes employ similar mechanisms in both natural and disturbed environments, such as mine-tailings, to make soils a better environment for life," Dontsova said.

Funded through NSF's Emerging Topics in Biogeochemistry program, which focuses on research that transcends Earth and biological science, the project will closely integrate with related interdisciplinary projects at the UA, including the Jemez River Basin – Santa Catalina Mountains Critical Zone Observatory (led by Chorover), the UA Superfund Research Program, the Biosphere 2 Landscape Evolution Observatory, and a grass mortality study soon to be featured by National Geographic TV.

Sources and contacts:
Hassan Hijazi, Biosphere 2

Fuel Cell Research Aims to Lighten Load Carried by Soldiers

A UC Riverside Bourns College of Engineering professor and a team of researchers nationwide were recently awarded a five-year, $6.25 million grant to develop a greener, lighter-weight and longer-lasting power source for armed service members increasingly reliant on electronic devices.

Yushan Yan, professor and chair of the UCR department of environmental and chemical engineering, and researchers from the Colorado School of Mines, the University of Massachusetts Amherst, and the University of Chicago received the grant to study the possibility of replacing batteries with fuel cells.

The research is funded by the Department of Defense under the Multidisciplinary University Research Initiative program. Yan’s portion of the grant is $875,000.

Currently, armed service members carry up to 30 pounds of batteries for a mission of 72 hours to power everything from night-vision goggles to GPS devices.

The research by Yan and the other scientists could lead to the development of fuel cells that would be up to 80 percent lighter than batteries. The fuel cells could also increase the life of devices in the field by up to five times, Yan said.

Small, portable methanol fuel cells exist today, but they require the use of expensive metal catalysts, such as platinum. The researchers aim to develop a new class of ion-conducting polymer membranes that would eliminate the need for expensive metal catalysts.

While the research is being funded by the military, it could provide the groundwork for fuel cell advances in other industries, particularly transportation, said Yan, who has been studying fuel cells since 1999, a year after he was hired by UCR.

In addition to fuel cells, Yan also studies zeolite, a microporous mineral widely used for water purification, as a catalyst in petroleum refining and in the production of laundry detergents.

Yan was recently honored for that research when he was presented with the 2010 Donald W. Breck Award by the International Zeolite Association at the 16th International Zeolite Conference, held in July in Sorrento, Italy.

The Breck award, which is given at the association’s meeting every three years, honors an individual or group who has made the most significant contribution to molecular sieve science and technology. Yan shared the award with Ryong Ryoo, a professor of chemistry at the Korea Advanced Institute of Science and Technology in Daejeon, South Korea.

Yan’s zeolite research focuses on zeolite thin films as insulators for computer chips, corrosion-resistant coatings for aircraft aluminum alloys, and hydrophilic and antimicrobial coatings for water separation in a space station.

Source: University of California Riverside news release

Global Tropical Forests Threatened by 2100

By 2100, only 18% to 45% of the plants and animals making up ecosystems in global humid tropical forests may remain as we know them today, according to a new study led by Greg Asner at the Carnegie Institution’s Department of Global Ecology. The research combined new deforestation and selective logging data with climate-change projections. It is the first study to consider these combined effects for all humid tropical forest ecosystems and can help conservationists pinpoint where their efforts will be most effective. The study was published in the August 5, 2010, issue of Conservation Letters.

Credit: Carnegie Institution’s Department of Global Ecology

“This is the first global compilation of projected ecosystem impacts for humid tropical forests affected by these combined forces,” remarked Asner. “For those areas of the globe projected to suffer most from climate change, land managers could focus their efforts on reducing the pressure from deforestation, thereby helping species adjust to climate change, or enhancing their ability to move in time to keep pace with it. On the flip side, regions of the world where deforestation is projected to have fewer effects from climate change could be targeted for restoration.”

Tropical forests hold more than half of all the plant and animal species on Earth. But the combined effect of climate change, forest clear-cutting and logging may force them to adapt, move or die. The scientists looked at land use and climate change by integrating global deforestation and logging maps from satellite imagery and high-resolution data with projected future vegetation changes from 16 different global climate models. They then ran scenarios on how different types of species could be geographically reshuffled by 2100. They used the reorganization of plant classes, such as tropical broadleaf evergreen trees and tropical drought-deciduous trees, plus different kinds of grasses, as surrogates for biodiversity changes.

For Central and South America, climate change could alter about two-thirds of the humid tropical forests’ biodiversity – the variety and abundance of plants and animals in an ecosystem. Combine that scenario with current patterns of land-use change, and the Amazon Basin alone could see changes in biodiversity over 80% of the region.

Most of the changes in the Congo are likely to come from selective logging and climate change, which could negatively affect between 35% and 74% of that region. At the continental scale, about 70% of Africa’s tropical forest biodiversity would likely be affected if current practices are not curtailed.

In Asia and the central and southern Pacific islands, deforestation and logging are the primary drivers of ecosystem changes. Model projections suggest that climate change might play a lesser role there than in Latin America or Africa. That said, the research showed that between 60% and 77% of the area is susceptible to biodiversity losses via the massive ongoing land-use changes in the region.

“This study is the strongest evidence yet that the world’s natural ecosystems will undergo profound changes – including severe alterations in their species composition – through the combined influence of climate change and land use,” remarked Daniel Nepstad, senior scientist at the Woods Hole Research Center. “Conservation of the world’s biota, as we know it, will depend upon rapid, steep declines in greenhouse gas emissions.”

NASA and NSF-Funded Research Finds First Potentially Habitable Exoplanet

A team of planet hunters from the University of California (UC) Santa Cruz and the Carnegie Institution of Washington has announced the discovery of a planet with three times the mass of Earth orbiting a nearby star at a distance that places it squarely in the middle of the star's "habitable zone."

Credit:   University of California Santa Cruz

This discovery was the result of more than a decade of observations using the W. M. Keck Observatory in Hawaii, one of the world's largest optical telescopes. The research, sponsored by NASA and the National Science Foundation, places the planet in an area where liquid water could exist on its surface. If confirmed, this would be the most Earth-like exoplanet yet discovered and the first strong case for a potentially habitable one.

To astronomers, a "potentially habitable" planet is one that could sustain life, not necessarily one where humans would thrive. Habitability depends on many factors, but having liquid water and an atmosphere are among the most important.

The new findings are based on 11 years of observations of the nearby red dwarf star Gliese 581 using the HIRES spectrometer on the Keck I Telescope. The spectrometer allows precise measurements of a star's radial velocity (its motion along the line of sight from Earth), which can reveal the presence of planets. The gravitational tug of an orbiting planet causes periodic changes in the radial velocity of the host star. Multiple planets induce complex wobbles in the star's motion, and astronomers use sophisticated analyses to detect planets and determine their orbits and masses.
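
For a single planet on a circular orbit, the radial-velocity signature is just a sinusoid, and detection reduces to period fitting. The sketch below recovers a ~37-day period from synthetic velocities; the amplitude and noise level are illustrative guesses, and the real analysis fits six planets jointly.

```python
import numpy as np

# Toy radial-velocity detection: generate a sinusoidal signal and
# recover its period with a brute-force least-squares scan.
def rv_model(t, K, period, phase=0.0):
    return K * np.sin(2 * np.pi * t / period + phase)

t = np.linspace(0, 4000, 238)             # days: 238 epochs over ~11 years
signal = rv_model(t, K=2.0, period=36.6)  # m/s; a ~37-day orbit
rng = np.random.default_rng(1)
observed = signal + rng.normal(0.0, 1.0, t.size)   # 1 m/s noise

periods = np.linspace(5, 100, 2000)
chi2 = [np.sum((observed - rv_model(t, 2.0, P)) ** 2) for P in periods]
best = periods[int(np.argmin(chi2))]
print(f"recovered period ~ {best:.1f} days")   # close to the injected 36.6
```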

"Keck's long-term observations of the wobble of nearby stars enabled the detection of this multi-planetary system," said Mario R. Perez, Keck program scientist at NASA Headquarters in Washington. "Keck is once again proving itself an amazing tool for scientific research."


Steven Vogt, professor of astronomy and astrophysics at UC Santa Cruz, and Paul Butler of the Carnegie Institution lead the Lick-Carnegie Exoplanet Survey. The team's new findings are reported in a paper published in the Astrophysical Journal and posted online at: http://arxiv.org


"Our findings offer a very compelling case for a potentially habitable planet," said Vogt. "The fact that we were able to detect this planet so quickly and so nearby tells us that planets like this must be really common."

The paper reports the discovery of two new planets around Gliese 581. This brings the total number of known planets around this star to six, the most yet discovered in a planetary system outside of our own. Like our solar system, the planets around Gliese 581 have nearly-circular orbits.

The new planet designated Gliese 581g has a mass three to four times that of Earth and orbits its star in just under 37 days. Its mass indicates that it is probably a rocky planet with a definite surface and enough gravity to hold on to an atmosphere.


Gliese 581, located 20 light years away from Earth in the constellation Libra, has two previously detected planets that lie at the edges of the habitable zone, one on the hot side (planet c) and one on the cold side (planet d). While some astronomers still think planet d may be habitable if it has a thick atmosphere with a strong greenhouse effect to warm it up, others are skeptical. The newly-discovered planet g, however, lies right in the middle of the habitable zone.

The planet is tidally locked to the star, meaning that one side is always facing the star and basking in perpetual daylight, while the side facing away from the star is in perpetual darkness. One effect of this is to stabilize the planet's surface climates, according to Vogt. The most habitable zone on the planet's surface would be the line between shadow and light (known as the "terminator").


Species Accumulate On Earth At Slower Rates Than In The Past, Penn Biologists Say

Computational biologists at the University of Pennsylvania say that species are still accumulating on Earth but at a slower rate than in the past.

In the study, published in the journal PLoS Biology, Penn researchers developed a novel computational approach to infer the dynamics of species diversification using the family trees of present-day species.  Using nine patterns of diversification as alternative models, they examined 289 phylogenies, or evolutionary trees, representing amphibians, arthropods, birds, mammals, mollusks and flowering plants.
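
Among such models, the simplest departure from constant-rate diversification is a pure-birth process whose speciation rate decays exponentially through time: lineages keep accumulating, but ever more slowly. The simulation below illustrates just that one scenario with toy parameter values; the paper's nine models also cover extinction and other rate functions.

```python
import numpy as np

# Pure-birth process with per-lineage speciation rate decaying as
# lambda(t) = LAM0 * exp(-K * t). Toy parameters, in 1/Myr and Myr.
rng = np.random.default_rng(7)
LAM0, K, T_MAX = 0.5, 0.05, 60.0

def next_speciation(t, n):
    # Thinning: the rate only decays, so the total rate at the current
    # time bounds all later rates (for fixed n). Propose with the bound,
    # accept with the true-to-bound ratio.
    while True:
        bound = n * LAM0 * np.exp(-K * t)
        t += rng.exponential(1.0 / bound)
        if rng.random() < np.exp(-K * t) * n * LAM0 / bound:
            return t

t, n = 0.0, 2
while True:
    t = next_speciation(t, n)
    if t >= T_MAX:
        break
    n += 1
print(f"lineages after {T_MAX:.0f} Myr: {n}")  # growth slows, never stops
```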

Flowering plants are still a diversifying group of species, says computational biologist Josh Plotkin.
Credit: Steve Minicola.

The study demonstrated that diversity is generally not at equilibrium. Nonetheless, speciation rates have typically decayed over time, suggesting that the diversification of species is somehow constrained, and that equilibrium may eventually be reached.

There are many competing theories for how species diversify and become extinct.  Some suggest that species continually accumulate in time, always finding new ecological niches.  Other theories suggest that the number of coexisting species is limited and that we will eventually have equilibrium.  In other words, a species will be born only when another goes extinct.

The question that intrigued the Penn researchers was whether species diversity on Earth is in equilibrium or is still expanding. They also wondered whether the world has an invisible stop sign on species diversity that would eventually limit the diversity on the planet.

“What we see is diversification rates that are declining but not yet to zero,” said Joshua Plotkin, assistant professor in the Department of Biology in the School of Arts and Sciences at Penn.

 “We are not yet in equilibrium.  Either there is a limit to the total species number and we haven’t reached it yet, or there is no such limit.  But the rates of diversification are typically falling; when we will hit zero is not yet obvious.”

While it is clear that Earth has recently lost species due to human impact, this study dealt with much longer, geologic time scales. Understanding these long-term dynamics is central to our understanding of what controls present-day biodiversity across groups and regions.

Even though the study did not deal with the current anthropogenic loss of biodiversity, researchers were surprised at how little extinction they actually saw in the evolutionary trees of species.  The fossil record shows that many species have gone extinct over geologic time.  For example, the diversity of whales has decreased during the last ~12 million years.  But extinction was rarely apparent in this analysis of evolutionary trees.

The study also shows how analyzing molecular phylogenies can shed light on patterns of speciation and extinction; future work may reconcile this approach with the fossil record. “By taking advantage of existing data from the flood of genomic research, we hope to combine efforts with paleontologists gathering fossil data,” Plotkin said.

The study was conducted by Hélène Morlon and Plotkin of the Department of Biology in Penn’s School of Arts and Sciences and Matthew D. Potts of the University of California, Berkeley.  

It was funded by the Burroughs Wellcome Fund, David and Lucile Packard Foundation, James S. McDonnell Foundation and Alfred P. Sloan Foundation.

Contacts and sources:
PLoS Biology
Burroughs Wellcome Fund
David and Lucile Packard Foundation
James S. McDonnell Foundation
Alfred P. Sloan Foundation

Universities Report $55 Billion in Science and Engineering R&D Spending for FY 2009; Redesigned Survey to Launch in 2010

University spending on research and development in science and engineering (S&E) increased 5.8% between FY 2008 and FY 2009 to $54.9 billion, according to FY 2009 data from the National Science Foundation (NSF) Survey of Research and Development Expenditures at Universities and Colleges (table 1). When adjusted for inflation, academic R&D rose by 4.2% in FY 2009.
TABLE 1. S&E R&D expenditures at universities and colleges: FY 2004–09.
FY 2009 federally funded academic R&D expenditures rose 4.2% in current dollars to $32.6 billion, a 2.6% increase in inflation-adjusted dollars (figure 1). The federal government is the largest source of academic S&E R&D funding, although its share of universities' total R&D funding has dropped by 5 percentage points in recent years, from 64% in FY 2005 to 59% in FY 2009.
FIGURE 1. S&E R&D expenditures at universities and colleges, by source of funds: FY 2000–09.
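
The current-dollar and constant-dollar growth rates quoted above imply the size of the deflator used. This is pure arithmetic on the figures already given:

```python
# Implied FY 2009 price deflator from each nominal/real growth pair.
pairs = {
    "total academic R&D": (0.058, 0.042),
    "federally funded R&D": (0.042, 0.026),
}
for label, (nominal, real) in pairs.items():
    deflator = (1 + nominal) / (1 + real) - 1
    print(f"{label}: implied inflation {deflator:.2%}")
# Both pairs come out near 1.5%, consistent with a single deflator.
```
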
Unless otherwise indicated, references to dollar amounts or percentages for the remainder of this InfoBrief are in current dollars.
The second largest source of funding, institutions' own funds (internal funds), increased by 7.6% in FY 2009 to $11.2 billion (table 1). This amount includes separately budgeted organized research funded solely by the institutions ($6.3 billion) and almost $5 billion in unrecovered indirect costs related to sponsored research and direct cost sharing. Academic R&D expenditures financed by state and local government funding grew by 5.7% in FY 2009, to $3.6 billion. Industry-funded academic R&D had the largest percentage increase from FY 2008 to FY 2009, growing 11.6% to $3.2 billion. Funding from all other sources combined (nonprofit organizations and other nongovernmental entities) increased 9.6% to $4.3 billion.
Academic institutions characterized 74.6% of their FY 2009 total R&D expenditures as basic research rather than applied research or development. This proportion has been fairly constant over the last decade.

S&E R&D Expenditures by Type of Institution

Almost 60% of the R&D-performing academic institutions are public universities (414 in FY 2009), and together they accounted for 68% ($37.5 billion) of the total FY 2009 academic R&D expenditures (table 2). Private institutions (297 in FY 2009) accounted for the remaining $17.4 billion. The relative contributions of the major sources of R&D funding differ substantially for public and private institutions. In FY 2009 the federal government provided 54% of the R&D funds spent by public institutions, compared with 71% for private institutions (figure 2). Internal funds accounted for a larger share of R&D funding at public institutions (24% in FY 2009) than at private institutions (12%).
TABLE 2. S&E R&D expenditures at universities and colleges, by type of control: FY 2009.
FIGURE 2. Sources of S&E R&D funding for public and private institutions: FY 2009.
S&E R&D Expenditures by Field
The majority of academic R&D historically has been concentrated in the life sciences ($32.8 billion in FY 2009) (table 3). Within the life sciences, the subfields of medical and biological sciences continue to account for over half of all R&D expenditures, with $18.2 billion and $10.2 billion, respectively. Engineering was the next largest field, with $8.7 billion in FY 2009 R&D expenditures. R&D spending in two subfields showed double-digit percentage increases between FY 2008 and FY 2009: physics, at 16.4%, and aeronautical/astronautical engineering, at 14.2%. Mathematics showed a double-digit percentage decline (-10.9%) over the same period.
TABLE 3. R&D expenditures at universities and colleges, by S&E field: FY 2008–09.

S&E R&D Spending by Federal Agency Sources

Corresponding to the dominance of life sciences R&D within academic institutions, the Department of Health and Human Services (HHS), including the National Institutes of Health, historically has been the largest provider of federal R&D funding to universities and colleges (55% or $18.1 billion in FY 2009) (table 4). NSF contributed the next largest amount in FY 2009 ($3.9 billion) and was the largest single-agency funder of five different fields. The Department of Defense (DOD) provided $3.4 billion, almost half in support of engineering R&D.
TABLE 4. Federally funded R&D expenditures at universities and colleges, by S&E field and agency: FY 2009.

S&E R&D Spending for Top 20 Performers

Of the 711 institutions surveyed, the top 20 in terms of total S&E R&D expenditures accounted for 30% of total academic R&D spending (table 5). The University of Colorado all campuses and the University of North Carolina Chapel Hill both reported one-year R&D spending increases of over $100 million, primarily within the life sciences, and moved into the top 20 in FY 2009, displacing the University of Pittsburgh all campuses and the University of Florida. The University of Michigan all campuses reported a one-year increase of over $130 million, making it the second institution in the survey's history to top $1 billion in annual R&D expenditures. The institutions constituting the top 5 have remained the same since FY 2004.
TABLE 5. Twenty institutions reporting the largest FY 2009 R&D expenditures in S&E fields, ranked by FY 2009 amount: FY 2007–09.
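
The concentration figure reported here is a simple ranked-share computation. The sketch below uses synthetic spending numbers, since the real per-institution values live in table 5, so the printed share is illustrative rather than the survey's 30%:

```python
# A minimal sketch of the top-20 concentration calculation; the input
# values here are synthetic stand-ins for the survey's per-institution data.
import random

def top_n_share(expenditures, n=20):
    """Fraction of total spending held by the n largest performers."""
    ranked = sorted(expenditures, reverse=True)
    return sum(ranked[:n]) / sum(ranked)

# Synthetic, heavily skewed spending figures for 711 institutions ($ millions).
random.seed(0)
spending = [random.paretovariate(1.2) * 10 for _ in range(711)]
print(f"top-20 share of total: {top_n_share(spending):.1%}")
```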

Non-S&E R&D Spending for Top 20 Performers

Academic institutions spent a total of $2.4 billion on R&D in non-S&E fields in FY 2009 (table 6).[5] This amount is in addition to the $54.9 billion expended on S&E R&D. The largest amounts reported for individual non-S&E fields were in education ($921 million), business and management ($341 million), and humanities ($253 million). The top 20 performers of non-S&E R&D accounted for 36% of the total non-S&E R&D expenditures in FY 2009. Purdue University all campuses, ranked 34th in S&E R&D expenditures, holds the number one spot in non-S&E R&D for the second year in a row with $70 million. The University of Michigan all campuses holds the number two spot in both non-S&E and S&E R&D spending, with $63 million in non-S&E R&D reported in FY 2009. Three institutions were newcomers to the top 20 in FY 2009: Arizona State University, the University of Southern Maine, and Ohio State University all campuses, displacing the University of California, Santa Cruz; Indiana University all campuses; and Brown University.
TABLE 6. Twenty institutions reporting the largest FY 2009 non-S&E R&D expenditures, ranked by FY 2009 amount: FY 2007–09.

Changes for FY 2010

Over the past 3 years NSF has been engaged in a large-scale redesign of the Survey of Research and Development Expenditures at Universities and Colleges. The goals of the redesign were to (1) update the survey instrument to reflect current accounting principles and obtain more valid and reliable measurements of academic R&D spending in the United States and (2) expand the survey items to collect the additional detail most often requested by data users. As part of the redesign effort, NSF held a data user workshop, assembled an expert panel for consultation, worked with accounting and survey methodology experts, visited more than 40 institutions, and conducted phone interviews with an additional 25 institutions to gather input on changes to the survey. A pilot test of the redesigned survey was administered to 40 institutions during the fall of 2009, and final changes were made based on feedback from the pilot participant debriefings.
The substantially revised and expanded survey, now the Higher Education R&D (HERD) Survey, will be fielded for the first time in the fall of 2010. Data are expected to be available to the public by late fall of 2011. The HERD Survey will continue to capture comparable information on R&D expenditures by sources of funding and field, which will allow for continued trend analysis. In addition, it will include the following changes:
  • Total R&D will be expanded to include R&D expenditures in both S&E and non-S&E fields
  • The definition of R&D will explicitly include research training grants and clinical trials
  • Each institution campus headed by its own administration (i.e., a campus-level president or chancellor) will be asked to report separately, enabling a more consistent unit of analysis
  • R&D expenditures funded by nonprofit institutions will be specifically tracked (previously included under "Other sources")
The HERD Survey will also request information never before collected (a hypothetical record layout sketching these items follows the list):
  • R&D expenditures funded by foreign sources
  • R&D expenditures by type of funding mechanism (contracts or grants)
  • R&D expenditures within an institution's medical school
  • Clinical trial expenditures
  • R&D expenditures by character of work (basic research, applied research, and development)
  • Detail by field (both S&E and non-S&E) for R&D expenditures from each source of funding (federal, state/local, institution, business, nonprofit, and other)
  • R&D expenditures funded by the American Recovery and Reinvestment Act (ARRA)
  • Total R&D expenditures by direct cost categories (salaries, software, equipment)
  • Headcounts of principal investigators and other personnel paid with R&D funds
  • Headcount of postdoctoral researchers paid with R&D funds
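
To make the expanded scope concrete, here is a minimal sketch of what a single campus's HERD submission might look like as a data structure. This is purely illustrative: NSF has not published a schema here, and every field name below is invented.

```python
# A hypothetical record layout (not NSF's actual schema) covering the new
# HERD Survey items listed above; all names and units are invented.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class HerdSubmission:
    institution: str                   # one record per separately administered campus
    total_rd: float                    # S&E plus non-S&E R&D, $ thousands
    foreign_funded: float = 0.0        # R&D funded by foreign sources
    by_mechanism: Dict[str, float] = field(default_factory=dict)      # {"contracts": ..., "grants": ...}
    medical_school_rd: float = 0.0     # R&D within the institution's medical school
    clinical_trials: float = 0.0       # clinical trial expenditures
    by_character: Dict[str, float] = field(default_factory=dict)      # basic / applied / development
    arra_funded: float = 0.0           # American Recovery and Reinvestment Act funds
    by_cost_category: Dict[str, float] = field(default_factory=dict)  # salaries / software / equipment
    pi_headcount: int = 0              # principal investigators paid with R&D funds
    postdoc_headcount: int = 0         # postdoctoral researchers paid with R&D funds
```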

Data Sources, Limitations, and Availability

The academic R&D expenditures data presented in this InfoBrief were obtained from 711 universities and colleges that grant degrees in the sciences or engineering and expended at least $150,000 in S&E R&D in the survey period. The amounts reported include all funds expended for S&E activities specifically organized to produce research outcomes and sponsored by an outside organization or separately budgeted using institution funds. R&D expenditures at university-administered federally funded research and development centers (FFRDCs) are collected in a separate survey. Data from the Survey of R&D Expenditures at FFRDCs are available at http://www.nsf.gov/statistics/ffrdc/.
Institutions meeting the S&E R&D threshold for inclusion are also asked to provide non-S&E R&D spending. Non-S&E R&D expenditures are reported separately in the survey and are not included in the overall R&D expenditure totals. For a complete listing of the fields included under the S&E and non-S&E categories, refer to the FY 2009 survey questionnaire, available at http://www.nsf.gov/statistics/question.cfm#12. Data reported on non-S&E R&D expenditures are lower-bound estimates for the national totals because NSF did not attempt to estimate for nonresponse on this item. Also, the activities of institutions that do not perform S&E R&D (but may conduct substantial amounts of non-S&E R&D) are not reflected here.
NSF makes available institutional profiles for institutions of higher education with S&E departments that grant master's degrees or higher (http://www.nsf.gov/statistics/profiles/). The profiles contain data from this survey as well as from three other NSF surveys: the Survey of Federal Science and Engineering Support to Universities, Colleges, and Nonprofit Institutions; the Survey of Graduate Students and Postdoctorates in Science and Engineering; and the Survey of Earned Doctorates. Data from the four surveys are available on the Web at http://www.nsf.gov/statistics/ and through the NSF WebCASPAR database system, a Web tool for retrieval and analysis of institutional data on academic S&E resources (http://webcaspar.nsf.gov/).
The full set of detailed tables from this survey will be available in the report Academic Research and Development Expenditures: Fiscal Year 2009 at http://www.nsf.gov/statistics/rdexpenditures/. Individual detailed tables from the 2009 survey may be available in advance of publication of the full report. For further information, please contact the author.

Notes

[1]  Ronda Britt, Research and Development Statistics Program, Division of Science Resources Statistics, National Science Foundation, 4201 Wilson Boulevard, Suite 965, Arlington, VA 22230 (rbritt@nsf.gov; 703-292-7765).
[2]  The fiscal year referred to throughout this report is the academic fiscal year; for most institutions FY 2009 represents the period 1 July 2008 through 30 June 2009. Most of the increase in R&D expenditures due to American Recovery and Reinvestment Act funding will not be seen until the FY 2010 data.
[3]  Separately budgeted organized research refers to funds designated solely for specific research projects. Unrecovered indirect costs are the portion of indirect costs incurred as a result of conducting sponsored research that are not reimbursed by the project sponsor. Direct cost sharing refers to the portion of direct project costs paid for by the institution on an externally funded project; this amount is negotiated and agreed upon with the sponsor at the time of the project award.
[4]  Figures reported for state and local government support of academic R&D exclude general-purpose funds that schools receive from these sources and devote to R&D activities. These funds are included in figures reported as institutional funds.
[5]  Only institutions reporting S&E R&D expenditures are surveyed for non-S&E R&D spending. See "Data Sources, Limitations, and Availability."