Wednesday, June 30, 2010

MagForce Nanotechnologies AG Receives EU Regulatory Approval for Nano-Cancer® Therapy

Following two decades of intensive research and development efforts, MagForce Nanotechnologies AG, a Berlin-based medical technology company founded in 1997, has received European regulatory approval for its Nano-Cancer® therapy. The official notice of regulatory approval signifies that the authorized testing centers in Germany responsible for conformity evaluation of medical devices have completed their examination of the application submitted for market approval of Nano-Cancer® therapy and that the approved medical devices fulfill all requirements with regard to quality, safety and medical efficacy.

"This is a historic milestone for MagForce," exclaimed Dr. Peter Heinrich, CEO of MagForce Nanotechnologies AG, on news of the regulatory approval. "Over the past twenty years, the company has worked closely with a number of academic partners, particularly the Charité University Medical Center in Berlin, to develop this highly innovative therapeutic approach for treating solid tumors. Along this long road, Dr. Andreas Jordan, founder of the company and my colleague on the executive board, always remained true to his vision of an entirely new concept in cancer treatment based on recent advances in nanotechnology. In recognition of his many years of dedication, I am thus particularly delighted that, with regulatory approval received, a revolutionary new product in nanomedicine may now be brought to market."

The regulatory approval covers the treatment of brain tumors throughout the European Union. With this approval in hand, the company is now accelerating and strengthening its sales and marketing activities to introduce its new therapeutic procedure into the major European markets, starting with Germany, as well as moving its business model into the next stage. A further focus of these activities in preparation for market launch is discussion with medical insurers, such as the German Krankenkassen, regarding coverage for this therapy. In addition, MagForce will in the coming months be defining a development, partnership and commercialization strategy for the North American and Asian markets.

Initial revenues from the commercialization of Nano-Cancer® therapy through the company’s own sales structures should be reflected in its 2011 financial results. With European regulatory approval now received, the company anticipates that its share listing on the Frankfurt Stock Exchange will be changed over the medium term from the Entry Standard segment to Prime Standard.


MagForce Nanotechnologies AG is a world-leading company in the area of nanotechnology-based cancer treatment. The proprietary procedure which it has developed, Nano-Cancer® therapy, enables the targeted treatment of solid tumors through the intratumoral release of heat from magnetic nanoparticles. Its products used in the therapy, NanoTherm® and NanoActivator®, have received EU-wide regulatory approval as medical devices for the treatment of brain tumors.

NOAA Sends Two Ships to Study Loop Current and Coastal Florida Waters

A NOAA research ship and a university-owned vessel left Miami this week to begin two complementary studies gathering data on the Loop Current and area ecosystems in response to the Deepwater Horizon / BP oil spill in the Gulf of Mexico.

NOAA Ship Nancy Foster begins today (June 30) a two-week survey in the eastern Gulf of Mexico and the Florida Straits. Nancy Foster is one of six NOAA-owned ships supporting the oil spill response effort. Scientists from NOAA’s Atlantic Oceanographic and Meteorological Laboratory in Miami and the NOAA Southeast Fisheries Science Center will lead the expedition to track where the oil has been and to determine where it may go. So far, oil from the Deepwater Horizon/BP oil spill has not entered the Loop Current.

NOAA Ship Nancy Foster.
Credit: NOAA

Scientists will examine the presence of oil, dispersants and tar balls in the water column and collect zooplankton samples in areas affected by the spill. Scientists will also identify and count types of fish larvae found at different depths of the upper ocean.

“Our historical data and newer information will help evaluate any impact in the future, particularly as the bimonthly sampling continues,” said Michelle Wood, director of the Ocean Chemistry Division of NOAA’s AOML.

In addition, the Nancy Foster scientific team plans to monitor connectivity between the Loop Current and the Loop Current “Eddy Franklin” during the first week, and to study surface and subsurface waters in the east and north parts of the eddy during the second week. The Loop Current is a stream of warm Caribbean water that enters the Yucatan Straits, meanders northward, sometimes extending to the Gulf Coast, and exits into the Florida Straits after a sharp turn around the Florida Keys where it becomes the Florida Current. The “Eddy Franklin” is a warm water current that appears to have detached from the Loop Current sometime last week.

“Floating material – plankton or tar balls or oil – all get collected into the eddy and travel together until the ring ultimately breaks down or reattaches to the Loop Current,” Wood said.

Three drifting buoys, like the one pictured above, equipped with satellite communication, will be deployed during the cruise of the R/V Savannah to help track currents.
Credit: NOAA

One of the goals of this mission is to provide an early warning to the Florida Keys National Marine Sanctuary and other resource managers and scientists should the oil spill arrive at sensitive ecosystems in the region.

The R/V Savannah, operated by the Skidaway Institute of Oceanography in Savannah, Ga., is sailing through the Florida Keys and western Florida shelf as part of a long-term bimonthly sampling effort for NOAA’s South Florida Ecosystem Restoration Program that has been modified to collect samples to check for the presence of oil in the region.

The Savannah scientific team, also led by AOML, will sample along the west Florida shelf, where early impacts from oil would be expected. During the cruise, scientists will collect samples to determine if oil has reached the area, as well as investigate a high sea surface temperature event around the Florida Keys. Three drifting buoys with satellite communication will be deployed to track currents to complement the research from the vessel.

NOAA’s mission is to understand and predict changes in the Earth's environment, from the depths of the ocean to the surface of the sun, and to conserve and manage our coastal and marine resources.

Mining a Mountain of Electronic Waste: 1 Million Cell Phones Contain 250 kg Silver, 24 kg Gold, 9 kg Palladium, and 9 Tons of Copper

Mobile phones, computers, TVs - we like them but where do they go when we are finished with them? In the worst case they can be dismantled by hand for scrap by children in developing countries. This can expose them to potentially fatal chemicals. New legislation aims to toughen existing rules on collection and treatment so that within six years 85% of all waste will be recovered and treated. Environment Committee MEPs backed the proposals on 22 June.

This type of waste is one of the fastest growing waste streams in the EU (over 8 million tonnes and growing) and poses a series of challenges such as health problems if the waste is not properly treated and a loss of raw materials if there is no recycling.

Rapporteur Karl-Heinz Florenz of the centre right European People's Party told us, "we lose a lot of raw material because a lot of electronic waste is illegally shipped out of Europe. For example, 1 million mobiles contain: 250 kg silver, 24 kg gold, 9 kg palladium, and 9 tons of copper".
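The headline figures are easier to picture per handset. A quick back-of-the-envelope conversion, using only the totals quoted above, gives the amount of each metal in a single phone:

```python
# Metal content quoted for one million mobile phones (in kg), per the
# figures cited by rapporteur Karl-Heinz Florenz above.
metals_kg_per_million_phones = {
    "silver": 250,
    "gold": 24,
    "palladium": 9,
    "copper": 9 * 1000,  # 9 tonnes expressed in kg
}

# Convert to grams per individual phone.
for metal, kg in metals_kg_per_million_phones.items():
    grams_per_phone = kg * 1000 / 1_000_000
    print(f"{metal}: {grams_per_phone:g} g per phone")
```

That works out to a quarter gram of silver, 24 milligrams of gold, 9 milligrams of palladium, and 9 grams of copper per phone, which is why collection volume matters so much: the value is only recoverable at scale.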

According to national reports, only 33% of the waste is currently collected and properly treated.

Setting ambitious and fair targets

The current collection target is 4 kg per person per year, but it doesn't reflect the different circumstances of each country: some states already exceed this amount, while others fall short of it.

Mr. Florenz said, "we suggested collecting 85% of the waste that arises in the Member State. It is a challenging but realistic and important target."

This target will take effect in 2016. In the meantime, an interim target (4 kg or the amount collected in 2010, whichever is greater) will apply, to allow gradual improvement towards the final target.

"Another change will be the establishment of European-wide standards for collection, treatment and recycling of waste. The current situation shows a quite different quality of these operations in Europe," said Mr Florenz.

Illegal shipment

Karl-Heinz Florenz was clear about the present situation: "At the moment a very large amount of waste is illegally shipped out of Europe. Every Member State, and specifically its customs officers, currently has to prove that an exported product is not functioning and is therefore not allowed to be shipped".

He added: "we will shift the burden of proof so that it rests on the exporter. Furthermore, we established clear criteria to distinguish between waste and used but functioning products. This will help the customs services to control the exporters".

Consumer responsibility

Consumers can already turn in their electronic waste at dedicated facilities, but things will become easier: "consumers will now be able to deposit very small appliances like mobile phones, shavers etc. at any retail shop, without the requirement to buy a new product. These small products often end up in the waste bin, because consumers are not willing to go to a collection point just for an MP3 player," Mr Florenz said.


Human Induced Global Warming Started With Cavemen Says New Study

Even before the dawn of agriculture, people may have caused the planet to warm up, a new study suggests.

Mammoths used to roam modern-day Russia and North America, but are now extinct--and there's evidence that around 15,000 years ago, early hunters had a hand in wiping them out. A new study, accepted for publication in Geophysical Research Letters, a journal of the American Geophysical Union (AGU), argues that this die-off had the side effect of heating up the planet.

"A lot of people still think that people are unable to affect the climate even now, even when there are more than 6 billion people," says the lead author of the study, Chris Doughty of the Carnegie Institution for Science in Stanford, California. The new results, however, "show that even when we had populations orders of magnitude smaller than we do now, we still had a big impact."

In the new study, Doughty, Adam Wolf, and Chris Field--all at Carnegie Institution for Science--propose a scenario to explain how hunters could have triggered global warming.

First, mammoth populations began to drop--both because of natural climate change as the planet emerged from the last ice age, and because of human hunting. Normally, mammoths would have grazed down any birch that grew, so the area stayed a grassland. But if the mammoths vanished, the birch could spread. In the cold of the far north, these trees would be dwarfs, only about 2 meters (6 feet) tall. Nonetheless, they would dominate the grasses.

The trees would change the color of the landscape, making it much darker so it would absorb more of the Sun's heat, in turn heating up the air. This process would have added to natural climate change, making it harder for mammoths to cope, and helping the birch spread further.

To test how big an effect this would have on climate, Field's team looked at ancient records of pollen, preserved in lake sediments from Alaska, Siberia, and the Yukon Territory, built up over thousands of years. They looked at pollen from birch trees (the genus Betula), since this is "a pioneer species that can rapidly colonize open ground following disturbance," the study says. The researchers found that around 15,000 years ago--the same time that mammoth populations dropped, and that hunters arrived in the area--the amount of birch pollen started to rise quickly.

To estimate how much additional area the birch might have covered, they started with the way modern-day elephants affect their environment by eating plants and uprooting trees. If mammoths had effects on vegetation similar to those of modern elephants, then the fall of mammoths would have allowed birch trees to spread over several centuries, expanding from very few trees to covering about one-quarter of Siberia and Beringia--the land bridge between Asia and Alaska. In those places where there was dense vegetation to start with and where mammoths had lived, the main reason for the spread of birch trees was the demise of mammoths, the model suggests.

Another study, published last year, shows that "the mammoths went extinct, and that was followed by a drastic change in the vegetation," rather than the other way around, Doughty says. "With the extinction of this keystone species, it would have some impact on the ecology and vegetation--and vegetation has a large impact on climate."

Doughty and colleagues then used a climate simulation to estimate that this spread of birch trees would have warmed the whole planet more than 0.1 degrees Celsius (0.18 degrees Fahrenheit) over the course of several centuries. (In comparison, the planet has warmed about six times more during the past 150 years, largely because of people's greenhouse gas emissions.)

Only some portion--about one-quarter--of the spread of the birch trees would have been due to the mammoth extinctions, the researchers estimate. Natural climate change would have been responsible for the rest of the expansion of birch trees. Nonetheless, this suggests that when hunters helped finish off the mammoth, they could have caused some global warming.

In Siberia, Doughty says, "about 0.2 degrees C (0.36 degrees F) of regional warming is the part that is likely due to humans."

Earlier research indicated that prehistoric farmers changed the climate by slashing and burning forests starting about 8,000 years ago, and by introducing rice paddy farming about 5,000 years ago. This would suggest that the start of the so-called "Anthropocene"--a term used by some scientists to refer to the geological age when mankind began shaping the entire planet--should be dated to several thousand years ago.

However, Field and colleagues argue, the evidence of an even earlier man-made global climate impact suggests the Anthropocene could have started much earlier. Their results, they write, "suggest the human influence on climate began even earlier than previously believed, and that the onset of the Anthropocene should be extended back many thousands of years."

This work was funded by the Carnegie Institution for Science and NASA.

Sources and contacts:
Christopher E. Doughty, Carnegie Institution for Science
"Biophysical feedbacks between the Pleistocene megafauna extinction and climate: The first human-induced global warming?" by Christopher E. Doughty, Adam Wolf, and Christopher B. Field, Department of Global Ecology, Carnegie Institution for Science, Stanford, California, USA

GE, U of Alberta and Alberta Innovate Initiate $4 Million CO2 Capture Project Using Nanotechnology

In the quest to develop more cost-effective ways to reduce carbon emissions from fossil fuels, GE (Niskayuna, NY) is partnering with the University of Alberta (UA) and Alberta Innovates Technology Futures (AITF) on a $4 million CO2 capture project supported by the Climate Change and Emissions Management (CCEMC) Corporation.

This team is leveraging cutting-edge research in nanotechnology to tackle two of the most pressing environmental challenges facing the Oil Sands -- reduction of CO2 emissions associated with the extraction and upgrading process, and treatment of produced water generated during the oil recovery.

With support from the Climate Change and Emissions Management Corporation (CCEMC), GE is working with the University of Alberta and Alberta Innovates Technology Futures to develop CO2 capture technology that could be used to reduce carbon emissions in the Oil Sands. In the future, this technology could support CO2 capture in power plants (e.g. the Integrated Gasification Combined Cycle (IGCC) plant pictured at right) and could also be used in water treatment processes. Pictured at left are the natural zeolite materials on which GE and its project partners are basing their membrane technology.

Photo: Business Wire

The technology is based on naturally occurring zeolites identified by UA. These materials are rocks with molecularly sized pores, which allow small molecules to enter while excluding larger molecules. Zeolites are widely used in the chemical industry as catalysts, and this project seeks to form these materials into membranes that can be used for high temperature gas separation. The materials also have the potential to be used as filters for contaminated water. The CCEMC is providing $2 million in support of this project, with an equal cost share from GE and its project partners.
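The separation principle is straightforward in outline: a molecule passes through the membrane only if it is narrower than the pore aperture. The sketch below illustrates the idea with commonly tabulated approximate kinetic diameters; the 3.4-angstrom aperture is a hypothetical value chosen for illustration, not a property of the zeolite used in this project.

```python
# Illustrative molecular sieving. Kinetic diameters in angstroms are
# approximate, commonly tabulated values.
KINETIC_DIAMETER_A = {
    "H2": 2.89,
    "CO2": 3.30,
    "N2": 3.64,
    "CH4": 3.80,
}

def permeating(molecules, pore_diameter_a):
    """Return the molecules small enough to pass through a pore of the
    given diameter (pure size exclusion; ignores adsorption effects)."""
    return [m for m in molecules if KINETIC_DIAMETER_A[m] <= pore_diameter_a]

# A hypothetical 3.4-angstrom aperture passes H2 and CO2 while
# excluding N2 and CH4.
print(permeating(["H2", "CO2", "N2", "CH4"], 3.4))
```

In a real membrane, selectivity also depends on adsorption and diffusion through the pore network, but size exclusion is the starting point for a molecular sieve.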

Anthony Ku, a chemical engineer and project leader for GE Global Research on the CO2 capture project, said, "This project is a great example of how partnership between academic research organizations and industry can lead to meaningful innovation. We're excited to be working with the CCEMC and some of Alberta's best and brightest research minds to take an interesting material identified in a university lab and figure out how to build a prototype that will be tested in the field."

Ku noted that the successful commercialization and widespread adoption of this technology could reduce CO2 emissions from the production of synthetic crude oil from the Oil Sands by up to 25%.

With fossil fuels like coal, oil and natural gas projected to be a large portion of our energy mix for decades to come, GE is committed to developing new, cost-effective technologies for the management of greenhouse gas emissions. This technology collaboration is supported in part through GE's ecomagination initiative. Ecomagination represents GE's commitment to deliver new clean products and technologies to market for its customers and society. Recently, the company pledged to double its investment in clean R&D over the next five years from $5 billion to $10 billion.

About the CCEMC

The CCEMC is a not-for-profit organization whose mandate is to establish or participate in funding for initiatives that reduce greenhouse gas emissions and support adaptation. The CCEMC invests in discovery, development, and operational deployment of clean technologies.

About GE Global Research

GE Global Research is the hub of technology development for all of GE's businesses. GE scientists and engineers redefine what's possible, drive growth for their businesses and find answers to some of the world's toughest problems.

Curiosity Rover to Test Chemistry and Mineralogy of Mars with X-Ray Diffraction; Late 2011 Launch Planned

NASA's Curiosity rover, coming together for a late 2011 launch to Mars, has a newly installed component: a key onboard X-ray instrument for helping the mission achieve its goals.

Researchers will use Curiosity in an intriguing area of Mars to search for modern or ancient habitable environments, including any that may have also been favorable for preserving clues about life and environment.

The team assembling and testing Curiosity at NASA's Jet Propulsion Laboratory, Pasadena, Calif., fastened the Chemistry and Mineralogy (CheMin) instrument inside the rover body on June 15. CheMin will identify the minerals in samples of powdered rock or soil that the rover's robotic arm will deliver to an input funnel. 

CheMin Principal Investigator David Blake, of the NASA Ames Research Center, Moffett Field, Calif., is seen here collecting data from a CheMin cousin called Terra. The scene is from a NASA field test of technology for producing water and oxygen from soil, using the Hawaiian site as an analog for the moon. In such an application, Terra could analyze the starting soils as well as products from the extraction process.
Image Credit: NASA

"Minerals give us a record of what the environment was like at the time they were formed," said the principal investigator for CheMin, David Blake of NASA's Ames Research Center, Moffett Field, Calif. Temperature, pressure, and the chemical ingredients present -- including water -- determine what minerals form and how they are altered.

The instrument uses X-ray diffraction, a first for a mission to Mars and a more definitive method for identifying minerals than any instrument on previous missions. It supplements the diffraction measurements with X-ray fluorescence capability to garner further details of composition.

X-ray diffraction works by directing an X-ray beam at a sample and recording how the X-rays are scattered by the sample's atoms. All minerals are crystalline, and in crystalline materials, atoms are arranged in an orderly, periodic structure, causing the X-rays to be scattered at predictable angles. From those angles, researchers can deduce the spacing between planes of atoms in the crystal.
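The relationship between scattering angle and lattice spacing is Bragg's law, n·λ = 2d·sin θ. As a rough sketch of the deduction described above (the peak positions below are made-up values; the cobalt K-alpha wavelength of roughly 0.179 nm corresponds to CheMin's cobalt X-ray source):

```python
import math

def d_spacing(two_theta_deg, wavelength_nm, order=1):
    """Bragg's law, n*lambda = 2*d*sin(theta): recover the lattice-plane
    spacing d from a diffraction peak at scattering angle 2-theta."""
    theta = math.radians(two_theta_deg / 2.0)
    return order * wavelength_nm / (2.0 * math.sin(theta))

# Cobalt K-alpha radiation has a wavelength of roughly 0.179 nm; the
# peak angles here are illustrative, not a real diffraction pattern.
for two_theta in (21.0, 31.0, 36.5):
    d = d_spacing(two_theta, 0.179)
    print(f"2-theta = {two_theta:5.1f} deg  ->  d = {d:.3f} nm")
```

Each mineral produces its own characteristic set of such d-spacings, which is why the pattern of peak positions and intensities identifies it.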

"You get a series of spacings and intensities for each mineral," Blake said. "It's more than a fingerprint because it not only provides definitive identification, but we know the reason for each pattern, right down to the atomic level."

NASA's Mars Science Laboratory mission will send Curiosity to a place on Mars where water-related minerals have been detected by Mars orbiters. The rover's 10 science instruments will examine the site's modern environment and geological clues to its past environments. NASA's multi-step strategy might include potential future missions for bringing Mars samples to Earth for detailed analysis. One key goal for the Mars Science Laboratory mission is to identify a good hunting ground for rocks that could hold biosignatures -- evidence of life -- though this mission itself will not seek evidence of life.

On Earth, life has thrived for more than 3 billion years, but preserving evidence of life from the geologically distant past requires specific, unusual conditions.

Fossil insects encased in amber or mastodon skeletons immersed in tar pits are examples of how specific environments can store a record of ancient life by isolating it from normal decomposition. But Mars won't have insects or mastodons; if Mars has had any life forms at all, they were likely microbes. Understanding what types of environments may have preserved evidence of microbial life from billions of years ago, even on Earth, is still an emerging field of study. Some factors good for life are bad for preserving biosignatures. For example, life needs water, but organic compounds, the carbon-chemical ingredients of life, generally oxidize to carbon dioxide gas if not protected from water.

Some minerals detectable by CheMin, such as phosphates, carbonates, sulfates and silica, can help preserve biosignatures. Clay minerals trap and preserve organic compounds under some conditions. Some minerals that form when salty water evaporates can encase and protect organics, too. Other minerals that CheMin could detect might also have implications about past conditions favorable to life and to preservation of biosignatures.

"We'll finally have the ability to conduct a wide-ranging inventory of the minerals for one part of Mars," said John Grotzinger of the California Institute of Technology in Pasadena, chief scientist for the Mars Science Laboratory. "This will be a big step forward. Whatever we learn about conditions for life, we'll also get a great benefit in learning about the early evolution of a planet."

Curiosity's 10 science instruments, with about 15 times more mass than the five-instrument science payload on either of the Mars rovers Spirit or Opportunity, provide complementary capabilities for meeting the mission's goals. Some will provide quicker evaluations of rocks when the rover drives to a new location, helping the science team choose which rocks to examine more thoroughly with CheMin and the Sample Analysis at Mars (SAM) experiment. SAM can identify organic compounds. Imaging information about the context and textures of rocks will augment information about the rocks' composition.

"CheMin will tell us the major minerals there without a lot of debate," said Jack Farmer of Arizona State University, Tempe, a member of the instrument's science team. "It won’t necessarily reveal anything definitive about biosignatures, but it will help us select the rocks to check for organics. X-ray diffraction is the gold standard for mineralogy. Anyone who wants to determine the minerals in a rock on Earth takes it to an X-ray diffraction lab."

Blake began working 21 years ago on a compact X-ray diffraction instrument for use in planetary missions. His work with colleagues has resulted in commercial portable instruments for use in geological field work on Earth, as well as the CheMin instrument. The spinoff instruments have found innovative applications in screening for counterfeit pharmaceuticals in developing nations and analyzing archaeological finds.

CheMin is roughly a cube 25 centimeters (10 inches) on each side, weighing about 10 kilograms (22 pounds). It generates X-rays by aiming high-energy electrons at a target of cobalt, then directing the X-rays into a narrow beam. The detector is a charge-coupled device like the ones in electronic cameras, but sensitive to X-ray wavelengths and cooled to minus 60 degrees Celsius (minus 76 degrees Fahrenheit).

A sample wheel mounted between the X-ray source and detector holds 32 disc-shaped sample cells, each about the diameter of a shirt button and thickness of a business card, with transparent plastic walls. Rotating the wheel can position any cell into the X-ray beam. Five cells hold reference samples from Earth to help calibrate the instrument. The other 27 are reusable holders for Martian samples. Samples of gritty powder delivered by the robotic arm to CheMin's inlet funnel will each contain about as much material as in a baby aspirin.

Each CheMin analysis of a sample requires up to 10 hours of accumulating data while X-rays are hitting the sample. The time may be split into two or more nights of operation.

Besides X-ray diffraction, CheMin records X-ray fluorescence data from the analyzed material. X-ray fluorescence works by recording the secondary X-rays generated when the atoms in the sample are excited by the primary X-ray source. Different elements, when excited, emit fluorescent X-rays at different and characteristic energies, so this information indicates which elements are present. This compositional information will supplement similar data collected by the Alpha Particle X-ray Spectrometer on Curiosity's arm.
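In outline, element identification reduces to matching each fluorescence peak against tabulated characteristic line energies. A minimal sketch with approximate K-alpha energies follows; the element list and matching tolerance are illustrative choices, not CheMin parameters.

```python
# Approximate K-alpha fluorescence line energies in keV for a few
# geologically common elements (tabulated values, rounded).
K_ALPHA_KEV = {
    "Mg": 1.25, "Al": 1.49, "Si": 1.74, "S": 2.31,
    "Ca": 3.69, "Ti": 4.51, "Fe": 6.40,
}

def identify(peak_kev, tolerance_kev=0.05):
    """Match an observed fluorescence peak to the nearest tabulated
    K-alpha line; return None if nothing falls within tolerance."""
    best = min(K_ALPHA_KEV, key=lambda el: abs(K_ALPHA_KEV[el] - peak_kev))
    if abs(K_ALPHA_KEV[best] - peak_kev) <= tolerance_kev:
        return best
    return None

print(identify(6.40))  # an iron K-alpha peak
print(identify(1.74))  # a silicon K-alpha peak
```

A real spectrometer must also handle overlapping lines, background subtraction, and L-series emission, but the energy-to-element lookup is the core of the method.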

CheMin's team of scientists combines expertise in mineralogy, petrology, materials science, astrobiology and soil science, with experience studying terrestrial, lunar and Martian rocks.

The launch period for the Mars Science Laboratory will begin on Nov. 25, 2011, for a landing on Mars in August 2012. Blake's wish for results from the Martian rock data he's already been anticipating for more than two decades: "I hope we find something unexpected, something surprising."

Sources and contacts:
Guy Webster, Jet Propulsion Laboratory, Pasadena, Calif.
Rachel Hoover, Ames Research Center, Moffett Field, Calif.

More Oil and Gas May Pool In Deep Waters and Sediments in Gulf Of Mexico and Alter Microbial Life Says Georgia Scientist

To examine the impacts of the Deepwater Horizon oil spill on microbes in the waters and sediments near the spill site, the National Science Foundation (NSF) awarded a rapid response grant to marine scientist Samantha Joye of the University of Georgia (UGA) and colleagues.

Oil in Gulf of Mexico
Photo of oil spill in Gulf waters.
 Image credit:  Samantha Joye

The team traveled aboard the research vessel F.G. Walton Smith in the Gulf of Mexico on an oceanographic research cruise in late May and early June. On June 9, 2010, Joye presented testimony about her research at a congressional hearing of the House Energy and Commerce Committee.

The release of oil from the Deepwater Horizon incident on April 20, 2010, is of greater magnitude and scope than any previous spill and is also unique because it has introduced both oil and methane gas into the deep, cold waters of the Gulf of Mexico. "This combination of oil and gas could stimulate a broader microbial population," said Joye, "as well as potentially alter the distribution of the leaking material, possibly leading to more oil and gas pooling in deep waters and sediments."

Joye and other researchers are collecting samples of sediments, deepwaters and surface waters at 20 sites in the spill area. The team is studying the factors regulating the activity of microbes in the water column, including nutrient availability, methane concentration, trace metals and vitamins, and the impact of oil on key microbial processes, including the oxidation of methane.

"This research is essential to assessing how massive amounts of oil will affect the health of the Gulf of Mexico in both the short- and long-term," said David Garrison, director of NSF's biological oceanography program. Read more about this RAPID Response project here, or see the research team's blog for further information on the scientists' observations.

To date, NSF has awarded 39 RAPID Response grants, totaling nearly $4 million, for scientific study of the gulf oil spill. For a regularly updated list of RAPID oil spill awards, see here.

EPA Releases Independent Toxicity Tests on Eight Oil Dispersants Including Corexit

The US Environmental Protection Agency released peer reviewed results from the first round of its own independent toxicity testing on eight oil dispersants. EPA conducted testing to ensure that decisions about ongoing dispersant use in the Gulf of Mexico continue to be grounded in the best available science.

EPA’s results indicated that none of the eight dispersants tested, including the product in use in the Gulf, displayed biologically significant endocrine disrupting activity. While the dispersant products alone – not mixed with oil – have roughly the same impact on aquatic life, JD-2000 and Corexit 9500 were generally less toxic to small fish, and JD-2000 and SAF-RON GOLD were the least toxic to mysid shrimp. While this is important information to have, additional testing is needed to further inform the use of dispersants.

"EPA is performing independent tests to determine the potential impacts of various dispersants. We will continue to conduct additional research before providing a final recommendation," said EPA Administrator Lisa P. Jackson. "We want to ensure that every tool is available to mitigate the impact of the BP spill and protect our fragile wetlands. But we continue to direct BP to use dispersants responsibly and in as limited an amount as possible."

EPA continues to carefully monitor BP’s use of dispersant in the Gulf. Dispersants are generally less toxic than oil and can prevent some oil from impacting sensitive areas along the Gulf Coast. EPA believes BP should use as little dispersant as necessary and, on May 23, Administrator Jackson and then-Federal On-Scene Coordinator Rear Admiral Mary Landry directed BP to reduce dispersant usage by 75 percent from peak usage. EPA and the Coast Guard formalized that order in a directive to BP on May 26. Over the next month BP reduced dispersant use 68 percent from that peak.

Before directing BP to ramp down dispersant use, EPA directed BP to analyze potential alternative dispersants for toxicity and effectiveness. BP reported to EPA that they were unable to find a dispersant that is less toxic than Corexit 9500, the product currently in use. Following that, EPA began its own scientific testing of eight dispersant products on the National Contingency Plan Product Schedule (NCP-PS). Those dispersant products are: Dispersit SPC 1000, Nokomis 3-F4, Nokomis 3-AA, ZI-400, SAF-RON Gold, Sea Brat #4, Corexit 9500 A and JD 2000. Today’s results represent the first stage of that effort.

EPA tested these eight products for endocrine disrupting activity and potential impacts on small fish and mysid shrimp. The testing found:

  • None of the eight dispersants tested displayed biologically significant endocrine disrupting activity.
  • While all eight dispersants alone – not mixed with oil – showed roughly the same effects, JD-2000 and Corexit 9500 proved to be the least toxic to small fish, and JD-2000 and SAF-RON GOLD were the least toxic to the mysid shrimp.

The next phase of EPA’s testing will assess the acute toxicity of multiple concentrations of Louisiana Sweet Crude Oil alone and combinations of Louisiana Sweet Crude Oil with each of the eight dispersants for two test species.

To view the first round of test results please visit:

$26-million Study Aims To Understand How Carbon Is Exchanged Between Soil and Atmosphere To Predict Climate Change

A new $26-million NASA project led by a University of Michigan researcher aims to help clarify how ecosystems exchange carbon with the atmosphere, an important piece of missing knowledge in the quest to understand, predict, and adapt to climate change. 

The project's goal is to help determine whether the North American continent is a net source or sink of carbon. Researchers from U-M, NASA's Jet Propulsion Laboratory, Harvard University, MIT, Oregon State University, NASA's Goddard Space Flight Center, the U.S. Department of Agriculture, and Purdue University are taking part.

The AirMOSS radar will be packaged in a small pod (bottom left) carried by a Gulfstream aircraft (top left). On the right, a possible pod layout is shown with the electronics bay and location of the electronics subsystems.
Credit: NASA Jet Propulsion Laboratory. Click image above for higher resolution.

Over the next five years, a radar instrument called the Airborne Microwave Observatory of Subcanopy and Subsurface (AirMOSS) will collect data in nine North American regions from aboard a Gulfstream-III aircraft. The radar data will be converted to measurements of soil moisture by using sophisticated computer simulations.

The radar, to be built during the first year and a half of the project, generates signals that can penetrate up to four feet beneath the ground surface. This state-of-the-art low-frequency radar will be the most compact and versatile radar of its kind built to date, says principal investigator Mahta Moghaddam, a professor in the Department of Electrical Engineering and Computer Science.

Root-zone soil moisture levels directly affect how well a plant is functioning.

"Even your houseplant has its own net exchange of carbon," Moghaddam said. "It takes carbon dioxide in during the day through photosynthesis, provided there is sunlight and it's warm enough. And breathes out some carbon dioxide at night. How much net carbon it sequesters, and therefore how much the plant grows, has to do with how much water is available to its roots: No water, no growth."

Scientists don't understand exactly when and where this net carbon exchange process is most efficient, or how much the net exchange differs across ecosystems. They might know it for a few selected locations across North America where they have manually sampled, but not on the large scale that AirMOSS will enable. The lack of knowledge about root-zone soil moisture is believed to contribute 60-80 percent of the uncertainty in estimates of how much ecosystems exchange carbon with the atmosphere.

Collaborating researchers will incorporate Moghaddam's root zone soil moisture measurements into hydrology and ecosystem models to produce a continental estimate of the net ecosystem exchange. The results, which will show whether the continent takes in or releases more carbon and by how much, are expected by May 2015.

Moghaddam will oversee the design and fabrication of the AirMOSS instrument, a table-top-sized, high-powered, low-frequency radar that NASA/JPL collaborators will build for the project. She has also developed computational techniques to analyze the signals it sends back. Moghaddam's research group is a leader in developing radar algorithms for subsurface characterization.

"This work will help us understand a piece of the carbon cycle puzzle," Moghaddam said. "We may know that different areas in North America act as sinks or sources of carbon, but we don't know how large the net carbon exchange is, how fast it's changing, or how big it's going to get. Today, we rely on model estimates and there is huge uncertainty."

Beyond this project, Moghaddam envisions other applications for this radar instrument, including surveillance and resource exploration.

Michigan Engineering:
The University of Michigan College of Engineering is ranked among the top engineering schools in the country. At $160 million annually, its engineering research budget is one of the largest of any public university. Michigan Engineering is home to 11 academic departments and a National Science Foundation Engineering Research Center. The college plays a leading role in the Michigan Memorial Phoenix Energy Institute and hosts the world-class Lurie Nanofabrication Facility. Find out more at


University At Buffalo Launches Clinical Trial Of New Multiple Sclerosis Treatment

Buffalo medical researchers, led by a team from the University at Buffalo Department of Neurosurgery, will embark on a landmark prospective, randomized, double-blinded study to test the safety and efficacy of interventional endovascular therapy -- dubbed "liberation treatment" -- on the symptoms and progression of multiple sclerosis (MS).

Recent research has strongly associated chronic cerebrospinal venous insufficiency (CCSVI) with MS.

In a series of original studies, Paolo Zamboni, MD, of the University of Ferrara, Italy, found blockage of major venous outflow from the brain and spinal cord in patients with MS. Researchers from many institutions, including the University at Buffalo, have confirmed the association.

It is hypothesized that the narrowing in the large veins in the neck and chest might cause improper drainage of blood from the brain, resulting in eventual injury to brain tissue. It is thought that angioplasty -- treatment commonly used by cardiologists and other endovascular surgeons to treat atherosclerosis -- may remedy the blockages.

Zamboni has conducted preliminary studies that suggest the efficacy of venous angioplasty – "liberation procedure" -- in the amelioration of MS symptoms.

Now, researchers at the University at Buffalo will launch PREMiSe (Prospective Randomized Endovascular therapy in Multiple Sclerosis), a study to determine if endovascular intervention via balloon angioplasty to correct the blockages improves MS symptoms or progression.

PREMiSe is believed to be the first rigorous prospective, randomized, double-blinded study of balloon angioplasty for MS performed in the US with Institutional Review Board approval, with significant safeguards in place to ensure careful determination of risks and benefits.

The study is led by principal investigator Adnan Siddiqui, MD, assistant professor of Neurosurgery, UB School of Medicine and Biomedical Sciences, with co-principal investigators Elad Levy, MD, associate professor, and L.N. Hopkins, MD, professor and chair of the UB Department of Neurosurgery.

Additional independent researchers from University at Buffalo will participate in the evaluation and follow-up of study patients. An independent Data Safety Monitoring Board (DSMB) will ensure the safety and effectiveness of the study on an ongoing basis.

In the first phase of the study, ten MS patients from the United States and Canada who exhibit venous insufficiency will undergo minimally invasive venous angioplasties to determine if the procedure can be performed safely. The procedures began June 29 and will continue today (June 30), performed by Siddiqui and Levy at Kaleida Health's Millard Fillmore Gates Hospital in Buffalo, New York.

In its second phase, the study will randomize 20 MS patients who will undergo either venous angioplasty or a "sham angioplasty" (i.e. a catheter will be inserted but there will be no inflation of the balloon). The treatment will be blinded in such a way that neither the patient undergoing the procedure nor the clinicians evaluating the patient will be aware which procedure was performed.

If results suggest an appropriate safety profile and preliminary effectiveness, researchers will approach the University at Buffalo IRB for an extension of the protocol to study a larger number of patients in order to convincingly prove or disprove a causal relationship between CCSVI and MS.

Multiple sclerosis is estimated to affect more than 400,000 people in the United States and more than two million people worldwide. It is typically a disease of young adults characterized by either a relapsing or progressive decline in neurologic function resulting in significant disability. It is an inflammatory neurological disease widely considered to be autoimmune in nature, though its exact origins remain elusive.

If angioplasty is proven effective at improving MS symptoms, the implications for the future of MS treatment could be monumental. The physicians conducting PREMiSe are cautious but optimistic that initial findings will be promising.

University at Buffalo Neurosurgery (UBNS) is an academic neurosurgical group and leading regional referral center for cerebrovascular disorders run by a distinguished team of neurosurgical specialists and subspecialists committed to superior patient care, resident education, and translational research. UBNS diagnoses and treats a wide range of conditions, including but not limited to aneurysms; stroke; back and neck pain; epilepsy; Parkinson's disease; hydrocephalus; and tumors of the brain, spine, and skull base. It is also the only neurosurgical group in Western New York with FDA approval to conduct device-related clinical trials for acute stroke.

The University at Buffalo is a premier research-intensive public university, a flagship institution in the State University of New York system and its largest and most comprehensive campus. UB's more than 28,000 students pursue their academic interests through more than 300 undergraduate, graduate and professional degree programs. Founded in 1846, the University at Buffalo is a member of the Association of American Universities.


Linguistics Professor Examines Advertising in Manufacturers' Prescription Drug Websites

Researchers Glinert of Dartmouth College and Schommer of the University of Minnesota have examined the corporate websites dedicated to the 100 best-selling prescription drugs. They found a startling lack of consistency in an industry whose advertising standards are regulated by the Food and Drug Administration.

"Communicating via a website is common practice today," says Glinert, "and consumers are very savvy about doing their own research on the Internet. The FDA has rules about direct-to-consumer print and television drug advertising, so we think it makes sense to also regulate websites and other marketing tools when it comes to prescription medicine. Consumers need consistent and balanced information."

Glinert presented their study, "Manufacturers' prescription drug web sites: A gray area of discourse and ethics," at the Communication, Medicine and Ethics (COMET) 2010 Conference at Boston University School of Public Health on June 28. Glinert and Schommer have previously published on the topic of direct-to-consumer drug advertising and Glinert has also presented their research at an FDA hearing.

In this paper, Glinert and Schommer found that the websites:
  • have no obvious linear narrative or 'next page' or conclusion; users move in a maze of text and navigation choices, some leading far away
  • lack a popular genre name (like infomercial), meaning that users come to them without a clear idea of how to perceive them
  • have an unpredictable mix of information and promotion, content, verbal style, visuals, and layout
  • often present safety and risk information in small font, in cumbersome un-bulleted blocks of text, detached from promotional text and videos, and below a page's scrolling 'fold'

Glinert notes that the Internet search engine Google has also been working to help consumers with their research on prescription drugs. A Google search of a prescription or generic drug name, for example Lipitor, will now display a summary and description at the top of the search results.

The new feature, developed in partnership with the National Institutes of Health, links to NIH content and risk data.

"Our research provides justification for Google's move," says Glinert. "Only time will tell if this is a major change for the better."

Contacts and sources:
Communications, Medicine and Ethics 2010

Conserving Nature And Dollars: Delivering Cost-Effective Biodiversity Protection

A more flexible approach to the expansion of protected area systems could ultimately protect much more biodiversity for the same budget, according to a new paper in the scientific journal Nature.

Lead author Dr Richard Fuller of the CSIRO Climate Adaptation Flagship and The University of Queensland said that without spending extra money "we could dramatically improve the performance of protected area systems by replacing a small number of poor performing areas with more cost-effective ones".

Protected areas are one of the most important tools in modern nature conservation, with over 100,000 sites covering about 12 per cent of the land and territorial waters of countries worldwide.

The paper examines how effectively different sites can conserve a range of vegetation types.

"Replacing the least cost-effective 1 per cent of Australia's 6990 strictly protected areas could more than double the number of vegetation types that have 15 per cent or more of their original extent protected," Dr Fuller said.

"We can do this if we reverse the protection status of the least cost-effective sites and use the resulting capital to establish and manage new protected areas."
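For scale, a quick back-of-envelope calculation (using only the figures quoted above) shows that the least cost-effective 1 per cent corresponds to roughly 70 of Australia's 6990 reserves:

```python
# Back-of-envelope scale check using the figures quoted in the article
total_reserves = 6990      # Australia's strictly protected areas
replace_fraction = 0.01    # the least cost-effective 1 per cent

print(round(total_reserves * replace_fraction))  # 70
```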

The authors of the paper, including colleagues from CSIRO and The University of Queensland, acknowledge that community values would need to be incorporated when considering changes to the protected status of selected reserves. However, the benefits of reducing management costs in low performing areas are also worth exploring.

Informed by this analysis, future investments in protected areas could better protect biodiversity from threats such as climate change.

"As the rate of investment in new protected areas has slowed globally in recent years, ensuring the best places are protected is more important than ever," Dr Fuller said.

The Climate Adaptation Flagship is enabling Australia to adapt more effectively to the impacts of climate change and variability. This includes developing adaptation options to protect Australia's marine and terrestrial species, ecosystems and the services they provide.

Contacts and sources:
 Commonwealth Scientific and Industrial Research Organization

ARS Releases New Line of Disease, Drought and Heat-Tolerant Beans

New bean germplasm lines containing heat, drought and disease tolerance are being released by Agricultural Research Service (ARS) scientists and cooperators.

ARS geneticist Tim Porch has recently released new more heat tolerant lines of kidney beans. 
 Photo: Kidney beans.
Photo courtesy of Grain Inspection, Packers and Stockyards Administration, USDA.

ARS geneticist Tim Porch, with the agency's Tropical Agricultural Research Station in Mayagüez, Puerto Rico, has recently released two new kidney bean germplasm lines, named TARS HT-1 and TARS HT-2, that are tolerant to high temperature conditions. These new releases are part of collaborative breeding efforts with Cornell University, the University of Tennessee and the University of Puerto Rico.

TARS HT-1 yields well under high day and high night temperature stress, and TARS HT-2 performs well under high day and moderate night temperature stress. These germplasm lines can improve yields under hot summer conditions for farmers in regions prone to high temperature stress. They can also be used to improve heat tolerance in other large-seeded beans through breeding and selection.

Porch and university colleagues are also developing new black bean germplasm lines with tolerance to heat and drought and resistance to root rot and common bacterial blight. Common bacterial blight disease is caused by the bacterium Xanthomonas axonopodis pv. phaseoli and thrives in hot, humid climates. It primarily attacks the leaves and pods of bean plants and causes significant losses in both yields and seed quality. Root rot is caused by a complex of fungal diseases and is present in most common bean production zones worldwide.

Porch crossed tropical black and red beans to produce the new black bean germplasm lines, which are adapted to temperate areas, helping to increase the diversity of U.S. bean germplasm. Field and greenhouse trials in Nebraska show the lines yield well in addition to possessing drought tolerance and disease resistance.

According to Porch, the beans he and his university collaborators are testing have broad adaptation. They do well in the short days common to Puerto Rico and the long days found in the continental United States.

Read more about this research in the May/June 2010 issue of Agricultural Research magazine.
ARS is the principal intramural scientific research agency of the U.S. Department of Agriculture (USDA). This research supports the USDA priorities of responding to climate change and promoting international food security.


How The Dead Can Improve The Environment, EU Entrepreneurs Develop New Methods of Body Disposal

People who care about improving the environment in life may soon be able to do so after death. Entrepreneurs in Europe have developed two new and unusual methods of body disposal -- including a low-heat cremation method and a corpse compost method that turns bodies into soil -- that could provide environmentally friendly alternatives to those now in use. That's the topic of an article in the current issue of Chemical & Engineering News, ACS' weekly news magazine.

C&EN Associate Editor Sarah Everts notes that environmentally minded individuals have several concerns about cremation and burial practices. The high temperature of cremation burns up lots of fuel and releases carbon dioxide, the major greenhouse gas, into the atmosphere. Cremation also releases mercury from dental fillings into the air. Some worry that formaldehyde and other toxic substances that undertakers use to prepare bodies for burial can leach into the environment.

Entrepreneurs have developed two green alternatives that are soon launching in North America or Europe. They include a new cremation method that breaks down a corpse using a highly corrosive alkaline substance rather than extremely high heat. Because the temperatures used in the new process are about 80 percent lower than standard cremation temperatures, the process uses less energy and produces lower carbon dioxide emissions.

A newly developed burial method allows corpses to be composted (decomposed) into soil instead of transforming to dust in a sealed casket. The unusual process involves freezing the body in liquid nitrogen, breaking it into smaller pieces, and freeze-drying the parts, which are then placed in a biodegradable coffin for burial. Over time, the body turns into soil instead of undergoing the standard decaying process. "No matter how you look at it, there's just no pretty way to go," said one of the entrepreneurs.

Contacts and sources:
 "Green for Eternity" Available for download

Psychological Research Conducted In WEIRD Nations May Not Apply To Global Populations

A new University of British Columbia study says that an overreliance on research subjects from the US and other Western nations can produce false claims about human psychology and behavior because their psychological tendencies are highly unusual compared to the global population. 

According to the study, the majority of psychological research is conducted on subjects from Western nations, primarily university students. Between 2003 and 2007, 96 per cent of psychological samples came from countries with only 12 per cent of the world's population. The U.S. alone provided nearly 70 per cent of these subjects.
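The scale of that sampling skew can be made concrete with a back-of-envelope calculation (an illustrative sketch using only the two percentages quoted above):

```python
# Figures quoted in the study
weird_sample_share = 0.96   # share of psychological samples from these countries
weird_pop_share = 0.12      # those countries' share of world population

# Per-capita sampling rate for residents of those countries vs. everyone else
rate_weird = weird_sample_share / weird_pop_share             # 8.0
rate_rest = (1 - weird_sample_share) / (1 - weird_pop_share)  # ~0.045

# A resident of a sampled country is about 176 times more likely to be a subject
print(round(rate_weird / rate_rest))  # 176
```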

However, the study finds significant psychological and behavioral differences between what the researchers call Western, Educated, Industrialized, Rich and Democratic (WEIRD) societies and their non-WEIRD counterparts across a spectrum of key areas, including visual perception, fairness, spatial and moral reasoning, memory and conformity.

The findings, published in Nature tomorrow and in Behavioral and Brain Sciences this week, raise questions about the practice of drawing universal claims about human psychology and behavior based on research samples from WEIRD societies.

"The foundations of human psychology and behavior have been built almost exclusively on research conducted on subjects from WEIRD societies," says UBC Psychology and Economics Prof. Joe Henrich, who led the study with UBC co-authors Prof. Steven Heine and Prof. Ara Norenzayan. "While students from Western nations are a convenient, low-cost data pool, our findings suggest that they are also among the least representative populations one could find for generalizing about humans."

The study, which reviews the comparative database of research from across the behavioural sciences, finds that subjects from WEIRD societies are more individualistic, analytic, concerned with fairness, existentially anxious and less conforming and attentive to context compared to those from non-WEIRD societies.

According to the study, significant psychological and behavioral differences also exist between population groups within WEIRD nations. For example, U.S. undergraduate students are typically more analytic and choosy and less conforming than U.S. adults without college educations.

"Researchers often implicitly assume that there is little variation across human populations or that these 'standard subjects' are as representative of the species as any other population," says Henrich. "Our study shows there is substantial variability in experimental results across populations. In fact, there is enough evidence that researchers cannot in good faith continue to make species-generalizing claims about Homo sapiens in the absence of comparative evidence."

The research team calls on universities, peer reviewers, funding agencies and journal editors to push researchers to explicitly support any generalizations to the species with evidence or potent inductive arguments. Additionally, they envision the creation of research partnerships with non-WEIRD institutions to expand and diversify the empirical base of the behavioral sciences.

Contacts and sources:
Prof. Joseph Henrich
UBC Psychology/Economics
*Currently in UK
Prof. Steven Heine 
UBC Dept. of Psychology
Prof. Ara Norenzayan 
UBC Dept. of Psychology
View the study, "The weirdest people in the world?," and comprehensive commentary by the authors and colleagues in the research community
An opinion piece by the authors, to appear in the journal Nature on July 1, is available 

Scientists Unpeel Atoms And Molecules From The Inside Out

The first published scientific results from the world's most powerful hard X-ray laser, located at SLAC National Accelerator Laboratory, show its unique ability to control the behaviors of individual electrons within simple atoms and molecules by stripping them away, one by one -- in some cases creating hollow atoms. 

These early results—one published today, the other last week—describe in great detail how the Linac Coherent Light Source's intense pulses of X-ray light change the very atoms and molecules they are designed to image. Controlling those changes will be critical to achieving the atomic-scale images of biological molecules and movies of chemical processes that the LCLS is designed to produce.

The world's first hard X-ray free-electron laser started operation with a bang. First experiments at SLAC National Accelerator Laboratory's Linac Coherent Light Source stripped electrons one by one from neon atoms (illustrated above) and nitrogen molecules, in some cases removing only the innermost electrons to create "hollow atoms." Understanding how the machine's ultra-bright X-ray pulses interact with matter will be critical for making clear, atomic-scale images of biological molecules and movies of chemical processes.
 (Artist's Conception)
Artwork by Gregory Stewart, SLAC.

In a report published in the July 1 issue of Nature, a team led by Argonne National Laboratory physicist Linda Young describes how they were able to tune LCLS pulses to selectively strip electrons, one by one, from atoms of neon gas. By varying the photon energies of the pulses, they could do it from the outside in or—a more difficult task—from the inside out, creating so-called "hollow atoms."

"Until very recently, few believed that a free-electron X-ray laser was even possible in principle, let alone capable of being used with this precision," said William Brinkman, director of DOE's Office of Science. "That's what makes these results so exciting."

Young, who led the first experiments in October with collaborators from SLAC and five other institutions, said, "No one has ever had access to X-rays of this intensity, so the way in which ultra-intense X-rays interact with matter was completely unknown. It was important to establish these basic interaction mechanisms."

SLAC's Joachim Stöhr, director of the LCLS, said, "When we thought of the first experiments with LCLS ten years ago, we envisioned that the LCLS beam may actually be powerful enough to create hollow atoms, but at that time it was only a dream. The dream has now become reality."

In another report, published June 22 in Physical Review Letters, a team led by physicist Nora Berrah of Western Michigan University—the third group to conduct experiments at the LCLS—describes the first experiments on molecules. Her group also created hollow atoms, in this case within molecules of nitrogen gas, and found surprising differences in the way short and long laser pulses of exactly the same energies stripped and damaged the nitrogen molecules.

"We just introduced molecules into the chamber and looked at what was coming out there, and we found surprising new science," said Matthias Hoener, a postdoctoral researcher in Berrah's group at WMU and visiting scientist at Lawrence Berkeley National Laboratory, who was first author of the paper. "Now we know that by reducing the pulse length, the interaction with the molecule becomes less violent."

While the first experiments were designed to see what the LCLS can do and how its ultra-fast, ultra-bright pulses interact with atoms and molecules, they also pave the way for more complex experiments to come. Its unique capabilities make the LCLS a powerful tool for research in a wide range of fields, including physics, chemistry, biology, materials and energy sciences.

The LCLS forms images by scattering X-ray light off an atom, molecule or larger sample of material. Yet when the LCLS X-rays are tightly focused by mirrors, each powerful laser pulse destroys any sample it hits. Since certain types of damage, like the melting of a solid, are not instantaneous and only develop with time, the trick is to minimize the damage during the pulse itself and record the X-ray snapshot with a camera before the sample disintegrates.

Both teams found that the shorter the laser pulse, the fewer electrons are stripped away from the atom or molecule and the less damage is done. And both delved into the detailed mechanisms behind that damage.

Atoms are a little like miniature solar systems, with their electrons orbiting at various distances from the nucleus in a sort of quantum fuzz. To make things simpler, scientists describe the electrons as orbiting in "shells" at specific distances from the nucleus. The innermost shell contains up to two electrons, the next one up to eight, the third one up to 18, and so on.
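The shell capacities quoted above (2, 8, 18, ...) follow the standard 2n² rule from basic quantum mechanics; a quick sketch (general textbook arithmetic, not from the article itself):

```python
def shell_capacity(n: int) -> int:
    """Maximum number of electrons in the n-th shell: 2 * n^2."""
    return 2 * n ** 2

# Reproduces the 2, 8, 18, ... pattern described above
print([shell_capacity(n) for n in range(1, 4)])  # [2, 8, 18]
```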

Since they're closest to the positively charged nucleus, the two innermost electrons are generally the hardest to wrest away. But they also most readily absorb photons of X-ray light, and so are the most vulnerable to getting stripped away by intense X-rays.

Although previous experiments with intense optical lasers had stripped neon atoms of most of their electrons, Young's was the first to discover how ultra-intense X-ray lasers do this. At low photon energies, the outer electrons are removed, leaving the inner electrons untouched. However, at higher photon energies, the inner electrons are the first to be ejected; then the outer electrons cascade into the empty inner core, only to be kicked out by later parts of the same X-ray pulse. Even within the span of a single pulse there may be times when both inner electrons are missing, creating a hollow atom that is transparent to X-rays, Young said.

"This transparency associated with hollow atoms could be a useful property for future imaging experiments, because it decreases the fraction of photons doing damage and allows a higher percentage of photons to scatter off the atom and create the image," Young said. She said application of this phenomenon will also allow researchers to control how deeply an intense X-ray pulse penetrates into a sample.

Berrah's team bombarded puffs of nitrogen gas with laser pulses that ranged in duration from about four femtoseconds, or quadrillionths of a second, to 280 femtoseconds. No matter how short or long it was, though, each pulse contained the same amount of energy in the form of X-ray light; so you might expect that they would have roughly the same effects on the nitrogen molecules.
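Though each pulse carried the same energy, its peak intensity scales inversely with duration, so the shortest and longest pulses differed sharply. A quick illustrative calculation (the pulse energy here is an assumed nominal value, since the article gives durations but not energies):

```python
# Same pulse energy delivered over different durations: peak power ~ energy / duration
pulse_energy_mj = 1.0            # assumed nominal pulse energy (millijoules)
short_fs, long_fs = 4.0, 280.0   # pulse durations from the article (femtoseconds)

peak_short = pulse_energy_mj / short_fs
peak_long = pulse_energy_mj / long_fs

# The 4 fs pulse packs ~70x the peak power of the 280 fs pulse
print(round(peak_short / peak_long))  # 70
```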

But to the team's surprise, that was not the case, Hoener said. The long pulses stripped every single electron from the nitrogen molecules, starting with the ones closest to the nucleus; the short ones stripped off only some of them.

Their report attributes this to the "frustrated absorption effect": Since the molecule's electrons are preferentially stripped from the innermost shells, there is simply not enough time during a short pulse for the molecule's outermost electrons to refill the innermost shells and get kicked out in turn.

With all this activity going on inside the atom, scientists have a new way to explore atomic structure and dynamics. Further experiments have investigated nanoclusters of atoms, protein nanocrystals and even individual viruses, with results expected to be published in coming months.

Young's research was primarily supported by the DOE Office of Science, with additional support from the Alexander von Humboldt Foundation. Berrah's research was supported by the DOE Office of Science.

The LCLS is a DOE Office of Science-funded project led by SLAC National Accelerator Laboratory in partnership with Argonne National Laboratory and Lawrence Livermore National Laboratory. Pacific Northwest National Laboratory provided initial project management support. Lawrence Berkeley National Laboratory and Cornell University contributed key subsystems. University of California, Los Angeles provided theoretical physics support throughout the project; Brookhaven and Los Alamos national laboratories were active in the early stages of LCLS research and development.

SLAC National Accelerator Laboratory is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. SLAC is located in Menlo Park, California, and is operated by Stanford University for the U.S. Department of Energy Office of Science.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation's first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America's scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC, for the U.S. Department of Energy's Office of Science.

Western Michigan University is a dynamic, student-centered research university with an enrollment of 25,000. WMU is focused on delivering high-quality undergraduate instruction, advancing its growing graduate division and fostering significant research activities.


Sweet Antibiotic: Scientists Identify A Secret Ingredient In Honey That Kills Bacteria

New research in the FASEB Journal shows that defensin-1, a protein added to honey by bees, possesses potent antibacterial properties and could be used against drug-resistant bacteria

Sweet news for those looking for new antibiotics: new research published in the July 2010 print edition of the FASEB Journal explains for the first time how honey kills bacteria. Specifically, the research shows that bees make a protein that they add to the honey, called defensin-1, which could one day be used to treat burns and skin infections and to develop new drugs that could combat antibiotic-resistant infections.

"We have completely elucidated the molecular basis of the antibacterial activity of a single medical-grade honey, which contributes to the applicability of honey in medicine," said Sebastian A.J. Zaat, Ph.D., a researcher involved in the work from the Department of Medical Microbiology at the Academic Medical Center in Amsterdam. "Honey or isolated honey-derived components might be of great value for prevention and treatment of infections caused by antibiotic-resistant bacteria."

To make the discovery, Zaat and colleagues investigated the antibacterial activity of medical-grade honey in test tubes against a panel of antibiotic-resistant, disease-causing bacteria. They developed a method to selectively neutralize the known antibacterial factors in honey and determine their individual antibacterial contributions. Ultimately, the researchers isolated the defensin-1 protein, which is part of the honey bee immune system and is added by bees to honey. After analysis, the scientists concluded that the vast majority of honey's antibacterial properties come from that protein. This information also sheds light on the inner workings of honey bee immune systems, which may one day help breeders create healthier and hardier honey bees.

"We've known for millennia that honey can be good for what ails us, but we haven't known how it works," said Gerald Weissmann, M.D., Editor-in-Chief of the FASEB Journal. "Now that we've extracted a potent antibacterial ingredient from honey, we can make it still more effective and take the sting out of bacterial infections."

The FASEB Journal is published by the Federation of American Societies for Experimental Biology (FASEB). The journal has been recognized by the Special Libraries Association as one of the top 100 most influential biomedical journals of the past century and is the most cited biology journal worldwide according to the Institute for Scientific Information.

FASEB comprises 23 societies with more than 100,000 members, making it the largest coalition of biomedical research associations in the United States. FASEB enhances the ability of scientists and engineers to improve—through their research—the health, well-being and productivity of all people. FASEB's mission is to advance health and welfare by promoting progress and education in biological and biomedical sciences through service to our member societies and collaborative advocacy.

Contacts and sources:
Details: Paulus H. S. Kwakman, Anje A. te Velde, Leonie de Boer, Dave Speijer, Christina M. J. E. Vandenbroucke-Grauls, and Sebastian A. J. Zaat. How honey kills bacteria. FASEB J. 2010; 24: 2576-2582. DOI: 10.1096/fj.09-150789