NASA’s Webb Telescope MIRI Instrument Takes One Step Closer To Space


A major instrument due to fly aboard NASA's James Webb Space Telescope is getting its first taste of space in the test facilities at the Rutherford Appleton Laboratory (RAL) in the United Kingdom. The Mid-InfraRed Instrument (MIRI) has been designed to contribute to areas of investigation as diverse as the first light in the early Universe and the formation of planets around other stars.

"The start of space simulation testing of the MIRI is the last major engineering activity needed to enable its delivery to NASA. It represents the culmination of 8 years of work by the MIRI consortium, and is a major progress milestone for the Webb telescope project," said Matt Greenhouse, NASA Project Scientist for the Webb telescope Integrated Science Instrument Module, at NASA's Goddard Space Flight Center, Greenbelt, Md.

The James Webb Space Telescope represents the next generation of space telescope and, unlike its predecessor Hubble, it will have to journey far from home. Its ultimate destination is L2, a gravitational pivot point located 1.5 million kilometers (930,000 miles) away, on the opposite side of the Earth from the Sun. There it is cool enough for the MIRI to obtain the exquisite measurements that astronomers will use to help decipher the Universe. "At L2 we are at an environmentally stable point where we can be permanently shaded from light from the Sun and Earth. That allows us to reach the very low temperatures - as low as 7 K (-447.1 degrees Fahrenheit) in the case of MIRI - that are necessary to measure in the mid-infrared," says Jose Lorenzo Alvarez, MIRI Instrument Manager for the European Space Agency (ESA).
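The Kelvin-to-Fahrenheit conversions quoted throughout these articles follow the standard formula (multiply by 9/5, then subtract 459.67); a quick sketch for readers who want to check the numbers:

```python
def kelvin_to_fahrenheit(kelvin):
    """Convert a temperature from Kelvin to degrees Fahrenheit."""
    return kelvin * 9.0 / 5.0 - 459.67

# MIRI's 7 K operating temperature works out to about -447.1 F,
# matching the figure quoted above.
print(round(kelvin_to_fahrenheit(7), 1))
```

The same formula reproduces the other figures in these articles, such as 27 Kelvin (-411 degrees Fahrenheit) and 39 Kelvin (-389.5 degrees Fahrenheit).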

The MIRI provides imaging, coronagraphy and integral field spectroscopy over the 5-28 micron wavelength range. It is being developed as a partnership between Europe and the U.S. The MIRI is one of four instruments flying aboard the Webb telescope. The other instruments include: NIRSpec (a near-infrared spectrograph), NIRCam (a near-infrared camera), and TFI (a tunable filter imager).

One of the jewels in the MIRI's crown is its potential to observe star formation triggered by interactions between galaxies. This phenomenon has been difficult to study with Hubble or ground-based telescopes, since the optical and near-infrared light from these newly formed stars is hidden from view by the clouds of dust that typically surround them. This will not be a problem for the MIRI, which is sensitive to longer wavelengths of light, in the range of 5 to 28 microns, that can penetrate the dust.

However, keeping the MIRI colder than the surface of Pluto for a sustained period of time was one of the biggest engineering challenges facing those charged with constructing the instrument. "A critical aspect to achieving the right sensitivity is to ensure stable operation at 7 Kelvin (-447.1 degrees Fahrenheit) that will last for the five years of the mission," explains Alvarez.

This past spring, the flight model of the MIRI began to take shape as the key sub-assemblies - the imager, the spectrometer optics, and the input-optics and calibration module - were delivered to RAL for integration. By that stage, each of the MIRI's optical sub-assemblies had already undergone exhaustive mechanical and thermal testing, separately, to make sure they can not only survive the rigors of a journey to L2 but also remain operational for the life of the mission. At RAL, the sub-assemblies were integrated into the flight model and are now being tested again, as a complete instrument, in a specially designed chamber developed at RAL to reproduce the environment at L2.

For the purposes of these environmental and calibration tests, the Webb telescope optics are simulated using the MIRI Telescope Simulator (MTS), built in Spain. Following completion of the tests, the MIRI will be shipped to NASA's Goddard Space Flight Center in Greenbelt, Md., next spring, where the instrument will be integrated with the Webb's Integrated Science Instrument Module.

When the MIRI eventually reaches its sheltered position, located four times further away from the Earth than the Moon, scientists can begin probing the Universe's secrets, including its earliest days. "We'd like to try and identify very young galaxies, containing some of the first stars that formed in the Universe," says Gillian Wright, European Principal Investigator for MIRI based at the U.K. Astronomy Technology Centre, Edinburgh, U.K.

With the current generation of space telescopes, distinguishing between a galaxy mature enough to have a central black hole and a young galaxy at high redshift is troublesome, as the two appear similar in the near-infrared. A key to the MIRI's potential success is its ability to see through cosmic dust. As stars live and die, they forge heavier elements and produce dust, which ends up in the interstellar medium of the galaxy. The re-radiated emission from this dust creates a spectrum markedly different from that of a galaxy with no dust; the emission is expected to be 5-10 times stronger in the mature galaxy. "MIRI provides a diagnostic of whether there has been a previous generation of stars that had gone supernova and created dust. In the first generation of stars there would be no dust or black holes because there hadn't been time to make any," explains Wright.

The astronomers who will use the MIRI and the James Webb Space Telescope are also particularly keen to explore the formation of planets around distant stars, another area where the ability to peer through the dust becomes important. "MIRI is absolutely essential for understanding planet formation because we know that it occurs in regions which are deeply embedded in dust," said Wright. MIRI's beam width of 0.1 arc seconds allows the instrument to image 30-35 Astronomical Units (AU) of a proto-planetary disc.

With most such discs thought to be hundreds of AU across, the MIRI can build up highly resolved mosaics of these planetary nurseries in unprecedented detail. With its spectrometer, the MIRI could even reveal the existence of water and/or hydrocarbons within the debris, paving the way for investigations into the habitability of other planetary systems.
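The quoted disc coverage follows from the small-angle relation: an angular resolution in arcseconds multiplied by a distance in parsecs gives the projected size in astronomical units. A minimal sketch; the ~300-350 parsec distance used here is an assumption chosen to reproduce the article's 30-35 AU figure, not a value from the article:

```python
def projected_size_au(angular_res_arcsec, distance_pc):
    # Small-angle approximation: 1 arcsecond viewed from 1 parsec
    # subtends 1 AU, so size (AU) = resolution (arcsec) * distance (pc).
    return angular_res_arcsec * distance_pc

# MIRI's 0.1 arcsecond beam, pointed at a disc an assumed ~300-350
# parsecs away, resolves structures of roughly 30-35 AU.
print(projected_size_au(0.1, 300))  # ~30 AU
print(projected_size_au(0.1, 350))  # ~35 AU
```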

The James Webb Space Telescope is a joint project of NASA, ESA and the Canadian Space Agency.

For more information visit http://www.nasa.gov/topics/technology/features/miri-test.html




How Warm Was This Summer?

An unparalleled heat wave in eastern Europe, coupled with intense droughts and fires around Moscow, put Earth’s temperatures in the headlines this summer. Likewise, a string of exceptionally warm days in July in the eastern United States strained power grids, forced nursing home evacuations, and slowed transit systems. Both high-profile events reinvigorated questions about humanity’s role in climate change.

But, from a global perspective, how warm was the summer exactly? How did the summer's temperatures compare with previous years? And was global warming the "cause" of the unusual heat waves? Scientists at NASA's Goddard Institute for Space Studies (GISS) in New York City, led by GISS's director, James Hansen, have analyzed summer temperatures and released an update on the GISS website that addresses all of these questions.

Globally, June through August, according to the GISS analysis, was the fourth-warmest summer period in GISS's 131-year temperature record. The same months during 2009, in contrast, were the second warmest on record. The slightly cooler summer temperatures of 2010 were primarily the result of a moderate La Niña (cooler than normal temperatures in the equatorial Pacific Ocean) replacing a moderate El Niño (warmer than normal temperatures in the equatorial Pacific Ocean).


As part of their analysis, Hansen and colleagues released a series of graphs that help explain why perceptions of global temperatures vary -- often erroneously -- from season to season and year to year. For example, unusually warm summer temperatures in the United States and eastern Europe created the impression of global warming run amok in those regions this summer, while last winter's unusually cool temperatures created the opposite impression. A more global view, as in the temperature anomaly maps for 2009 and 2010, makes clear that extrapolating global trends based on the experience of one or two regions can be misleading.

"Unfortunately, it is common for the public to take the most recent local seasonal temperature anomaly as indicative of long-term climate trends," Hansen notes. "[We hope] these global temperature anomaly maps may help people understand that the temperature anomaly in one place in one season has limited relevance to global trends."

Last winter, for example, unusually cool temperatures in much of the United States caused many Americans to wonder why temperatures seemed to be plummeting, and whether the Earth could actually be experiencing global warming in the face of such frigid weather. A more global view shows that warming trends had hardly abated: despite the cool temperatures in the United States, last winter was the second warmest on record globally.

Meanwhile, the global seasonal temperatures for the spring of 2010 -- March, April, and May -- were the warmest in GISS's record. Does that mean 2010 will shape up to be the warmest year on record? Since the warmest year in GISS's record -- 2005 -- experienced especially high temperatures during the last four calendar months of the year, it's not yet clear how 2010 will stack up.

"It is likely that the 2005 or 2010 calendar year means will turn out to be sufficiently close that it will be difficult to say which year was warmer, and results of our analysis may differ from those of other groups," Hansen notes. "What is clear, though, is that the warmest 12-month period in the GISS analysis was reached in mid-2010."

The Russian heat wave was highly unusual. Its intensity exceeded anything scientists have seen in the temperature record since widespread global temperature measurements became available in the 1880s. Indeed, a leading Russian meteorologist asserted that the country had not experienced such an intense heat wave in the last 1,000 years. And a prominent meteorologist with Weather Underground estimated such an event may occur as infrequently as once every 15,000 years.

In the face of such a rare event, there’s much debate and discussion about whether global warming can "cause" such extreme weather events. The answer -- both no and yes -- is not a simple one.

Weather in a given region occurs in such a complex and unstable environment, driven by such a multitude of factors, that no single weather event can be pinned solely on climate change. In that sense, it's correct to say that the Moscow heat wave was not caused by climate change.

However, if one frames the question slightly differently: "Would an event like the Moscow heat wave have occurred if carbon dioxide levels had remained at pre-industrial levels," the answer, Hansen asserts, is clear: "Almost certainly not."

The frequency of extreme warm anomalies increases disproportionately as global temperature rises. "Were global temperature not increasing, the chance of an extreme heat wave such as the one Moscow experienced, though not impossible, would be small," Hansen says.

For more information visit http://www.nasa.gov/topics/earth/features/summer-temps.html




NASA and NSF-Funded Research Finds First Potentially Habitable Exoplanet

A team of planet hunters from the University of California (UC) Santa Cruz and the Carnegie Institution of Washington has announced the discovery of a planet with three times the mass of Earth orbiting a nearby star at a distance that places it squarely in the middle of the star's "habitable zone."

This discovery was the result of more than a decade of observations using the W. M. Keck Observatory in Hawaii, one of the world's largest optical telescopes. The research, sponsored by NASA and the National Science Foundation, placed the planet in an area where liquid water could exist on the planet's surface. If confirmed, this would be the most Earth-like exoplanet yet discovered and the first strong case for a potentially habitable one.

To astronomers, a "potentially habitable" planet is one that could sustain life, not necessarily one where humans would thrive. Habitability depends on many factors, but having liquid water and an atmosphere are among the most important.

The new findings are based on 11 years of observations of the nearby red dwarf star Gliese 581 using the HIRES spectrometer on the Keck I Telescope. The spectrometer allows precise measurements of a star's radial velocity (its motion along the line of sight from Earth), which can reveal the presence of planets. The gravitational tug of an orbiting planet causes periodic changes in the radial velocity of the host star. Multiple planets induce complex wobbles in the star's motion, and astronomers use sophisticated analyses to detect planets and determine their orbits and masses.
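The radial-velocity method described above amounts to searching for periodic signals in a star's line-of-sight velocity. The sketch below is a toy illustration of that idea, not the actual HIRES analysis pipeline: it generates a synthetic, noiseless wobble with a 37-day period (similar to the planet reported here) and recovers the period with a brute-force least-squares periodogram.

```python
import math

def best_period(times, rvs, trial_periods):
    """Brute-force periodogram: for each trial period, least-squares fit
    rv = A*sin(w*t) + B*cos(w*t) and return the period with the
    smallest residual sum of squares."""
    best_p, best_rss = None, float("inf")
    for p in trial_periods:
        w = 2.0 * math.pi / p
        s = [math.sin(w * t) for t in times]
        c = [math.cos(w * t) for t in times]
        # Normal equations for the two-parameter linear fit.
        ss = sum(x * x for x in s); cc = sum(x * x for x in c)
        sc = sum(x * y for x, y in zip(s, c))
        sv = sum(x * v for x, v in zip(s, rvs))
        cv = sum(x * v for x, v in zip(c, rvs))
        det = ss * cc - sc * sc
        if abs(det) < 1e-12:
            continue
        a = (sv * cc - cv * sc) / det
        b = (cv * ss - sv * sc) / det
        rss = sum((v - a * si - b * ci) ** 2
                  for v, si, ci in zip(rvs, s, c))
        if rss < best_rss:
            best_p, best_rss = p, rss
    return best_p

# Synthetic star wobbling with a 37-day period and 1.3 m/s amplitude.
times = [i * 1.7 for i in range(200)]  # observation epochs (days)
rvs = [1.3 * math.sin(2 * math.pi * t / 37.0) for t in times]
trials = [10 + 0.1 * k for k in range(500)]  # trial periods, 10-60 days
print(best_period(times, rvs, trials))  # recovers ~37 days
```

Real surveys face noisy data, uneven sampling, and overlapping signals from multiple planets, which is why the article calls the required analyses "sophisticated"; this sketch only shows the core idea of period recovery.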

"Keck's long-term observations of the wobble of nearby stars enabled the detection of this multi-planetary system," said Mario R. Perez, Keck program scientist at NASA Headquarters in Washington. "Keck is once again proving itself an amazing tool for scientific research."

Steven Vogt, professor of astronomy and astrophysics at UC Santa Cruz, and Paul Butler of the Carnegie Institution lead the Lick-Carnegie Exoplanet Survey.

"Our findings offer a very compelling case for a potentially habitable planet," said Vogt. "The fact that we were able to detect this planet so quickly and so nearby tells us that planets like this must be really common."

The paper reports the discovery of two new planets around Gliese 581. This brings the total number of known planets around this star to six, the most yet discovered in a planetary system outside our own. Like the planets in our solar system, those around Gliese 581 have nearly circular orbits.

The new planet, designated Gliese 581g, has a mass three to four times that of Earth and orbits its star in just under 37 days. Its mass indicates that it is probably a rocky planet with a definite surface and enough gravity to hold on to an atmosphere.
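Kepler's third law turns the quoted ~37-day period into an orbital distance. A back-of-the-envelope sketch; the ~0.31 solar-mass figure for the red dwarf Gliese 581 is an assumed value for illustration, not one given in the article:

```python
def semi_major_axis_au(period_days, stellar_mass_msun):
    # Kepler's third law in solar units: a^3 = M * P^2,
    # with a in AU, P in years, and M in solar masses.
    period_years = period_days / 365.25
    return (stellar_mass_msun * period_years ** 2) ** (1.0 / 3.0)

# A ~37-day orbit around an assumed ~0.31 solar-mass red dwarf sits
# at roughly 0.15 AU -- far inside Mercury's orbit, yet potentially
# habitable because the star is so much fainter than the Sun.
a = semi_major_axis_au(37.0, 0.31)
print(round(a, 3))
```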

Gliese 581, located 20 light years away from Earth in the constellation Libra, has two previously detected planets that lie at the edges of the habitable zone, one on the hot side (planet c) and one on the cold side (planet d). While some astronomers still think planet d may be habitable if it has a thick atmosphere with a strong greenhouse effect to warm it up, others are skeptical. The newly-discovered planet g, however, lies right in the middle of the habitable zone.

The planet is tidally locked to the star, meaning that one side is always facing the star and basking in perpetual daylight, while the side facing away from the star is in perpetual darkness. One effect of this is to stabilize the planet's surface climates, according to Vogt. The most habitable zone on the planet's surface would be the line between shadow and light (known as the "terminator").

For more information visit http://www.nasa.gov/topics/universe/features/gliese_581_feature.html




Goddard Team Obtains the ‘Unobtainium’ for NASA’s Next Space Observatory

Imagine building a car chassis without a blueprint or even a list of recommended construction materials.

In a sense, that's precisely what a team of engineers at NASA's Goddard Space Flight Center in Greenbelt, Md., did when they designed a one-of-a-kind structure that is one of nine key new technology systems of the Integrated Science Instrument Module (ISIM). Just as a chassis supports the engine and other components in a car, the ISIM will hold four highly sensitive instruments, electronics, and other shared instrument systems flying on the James Webb Space Telescope, NASA's next flagship observatory.

From scratch — without past experience to help guide them — the engineers designed the ISIM from a never-before-manufactured composite material and proved through testing that it could withstand the super-cold temperatures it would encounter when the observatory reaches its orbit 1.5 million kilometers (930,000 miles) from Earth. In fact, the ISIM structure survived temperatures that plunged as low as 27 Kelvin (-411 degrees Fahrenheit), colder than the surface of Pluto.

"It is the first large, bonded composite spacecraft structure to be exposed to such a severe environment," said Jim Pontius, ISIM lead mechanical engineer.

The 26-day test was carried out specifically to check whether the car-sized structure contracted and distorted as predicted when it cooled from room temperature to frigid operating temperatures — critically important, since the science instruments must maintain a specific position on the structure to receive light gathered by the telescope's 6.5-meter (21.3-foot) primary mirror. If the structure shrank or distorted unpredictably in the cold, the instruments would no longer be in position to gather data about everything from the first luminous glows following the big bang to the formation of star systems capable of supporting life.

"The tolerances are much looser on the Hubble Space Telescope," said Ray Ohl, a Goddard optical engineer who leads ISIM's optical integration and test. "The optical requirements for Webb are even more difficult to meet than those on Hubble."

Despite repeated cycles of testing, the truss-like assembly designed by Goddard engineers did not crack. The structure shrank, as predicted, by only 170 microns — the width of a needle — when it reached 27 Kelvin (-411 degrees Fahrenheit), comfortably within the design requirement of about 500 microns. "We certainly wouldn't have been able to realign the instruments on orbit if the structure moved too much," said ISIM Structure Project Manager Eric Johnson. "That's why we needed to make sure we had designed the right structure."
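The measured contraction can be sanity-checked with the standard linear thermal-expansion relation: change in length equals the coefficient of thermal expansion times the length times the temperature change. In this sketch the ~2-meter structure size and the effective coefficient are assumptions chosen for illustration; only the ~170 micron result and the 27 Kelvin endpoint come from the article:

```python
def contraction_microns(length_m, cte_per_k, delta_t_k):
    # Linear thermal contraction: dL = alpha * L * dT,
    # returned in microns (1 m = 1e6 microns).
    return cte_per_k * length_m * delta_t_k * 1e6

# Cooling an assumed ~2 m composite structure from room temperature
# (~293 K) to 27 K (dT ~ 266 K), with an assumed effective CTE of
# ~0.32e-6 per Kelvin, gives a contraction on the order of the
# ~170 microns reported.
print(round(contraction_microns(2.0, 0.32e-6, 266), 0))
```

Seen this way, the engineering feat is the tiny effective expansion coefficient: ordinary aluminum, with a CTE roughly 70 times larger, would contract by more than a centimeter over the same temperature swing.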

Obtaining the Unobtainium

Achieving the milestone was just one of many firsts for the Goddard team. Almost on every level, "we pushed the technology envelope, from the type of material we would use to build ISIM to how we would test it once it was assembled," Pontius added. "The technology challenges are what attracted the people to the program."

One of the first challenges the team tackled after NASA had named Goddard as the lead center to design and develop ISIM was identifying a structural material that would assure the instruments' precise cryogenic alignment and stability, yet survive the extreme gravitational forces experienced during launch.

An exhaustive search of the technical literature for a candidate material yielded nothing, leaving the team with only one alternative — developing its own, never-before-manufactured material, which team members jokingly referred to as "unobtainium." Through mathematical modeling, the team discovered that by combining two composite materials, it could create a carbon fiber/cyanate-ester resin system ideal for fabricating the structure's square tubes, which measure 75 millimeters (3 inches) in diameter.

How then would engineers attach these tubes? Again through mathematical modeling, the team found it could bond the pieces together using a combination of nickel-alloy fittings, clips, and specially shaped composite plates joined with a novel adhesive process, smoothly distributing launch loads while holding the instruments in precise locations — a difficult engineering challenge because different materials react differently to changes in temperature.

"We engineered from the small pieces to the big pieces testing along the way to see if the failure theories were correct. We were looking to see where the design could go wrong," Pontius explained. "By incorporating the lessons learned into the final flight structure, we met the requirements and test validated our building-block approach."

Making Cold, Colder

The test inside Goddard's Space Environment Simulator — a three-story thermal-vacuum chamber that simulates the temperature and vacuum conditions found in space — presented its own set of technological hurdles. "We weren't sure we could get the simulator cold enough," said Paul Cleveland, a technical consultant at Goddard involved in the project. For most spacecraft, the simulator's ability to cool down to 100 Kelvin (-279.7 degrees Fahrenheit) is cold enough. Not so for the Webb telescope, which will endure a constant temperature of 39 Kelvin (-389.5 degrees Fahrenheit) when it reaches its deep-space orbit.

The group engineered a giant tuna fish can-like shroud, cooled by helium gas, and inserted it inside the 27-foot diameter chamber. "When you get down to these temperatures, the physics change," Cleveland said. Anything, including wires or small gaps in the chamber, can create an intractable heat source. "It's a totally different arena," he added. "One watt can raise the temperature by 20 degrees Kelvin. We had to meticulously close the gaps."

With the gaps closed and the ISIM safely lowered into the helium shroud, technicians began sucking air from the chamber to create a vacuum. They activated the simulator's nitrogen panels to cool the chamber to 100 Kelvin (-279.7 degrees Fahrenheit) and began injecting helium gas inside the shroud to chill the ISIM to the correct temperature.

To measure ISIM's reaction as it cooled to the sub-freezing temperatures, the team used a technique called photogrammetry, the science of making precise measurements by means of photography. However, using the technique wasn't so cut-and-dried when carried out in a frosty, airless environment, Ohl said. To protect two commercial-grade cameras from extreme frostbite, team members placed the equipment inside specially designed protective canisters and attached the camera assemblies to the ends of a motorized boom.

As the boom made nearly 360-degree sweeps inside the helium shroud, the cameras snapped photos through a gold-coated glass window of reflective, hockey puck-shaped targets bolted onto ISIM's composite tubes. From the photos, the team could precisely determine whether the targets moved, and if so, by how much.

"It passed with flying colors," Pontius said, referring to the negligible shrinkage. "This test was a huge success for us."

With the critical milestone test behind them, team members say their work likely will serve NASA in the future. Many future science missions will also operate in deep space, and therefore would have to be tested under extreme cryogenic conditions. In the meantime, though, the facility will be used to test other Webb telescope systems, including the backplane, the structure to which the Webb telescope’s 18 primary mirror segments are bolted when the observatory is assembled. "We need to characterize its bending at cryogenic temperatures," Ohl said.


For more information visit http://www.nasa.gov/topics/technology/features/jwst-unobtainium.html




Wildfires: A Symptom of Climate Change

This summer, wildfires swept across some 22 regions of Russia, blanketing the country with dense smoke and in some cases destroying entire villages. In the foothills of Boulder, Colo., this month, wildfires exacted a similar toll on a smaller scale.

That's just the tip of the iceberg. Thousands of wildfires large and small are underway at any given time across the globe. Beyond the obvious immediate health effects, this "biomass" burning is part of the equation for global warming. In northern latitudes, wildfires actually are a symptom of the Earth's warming.

'We already see the initial signs of climate change, and fires are part of it," said Dr. Amber Soja, a biomass burning expert at the National Institute of Aerospace (NIA) in Hampton, Va.

And research suggests that a hotter Earth resulting from global warming will lead to more frequent and larger fires.

The fires release "particulates" -- tiny particles that become airborne -- and greenhouse gases that warm the planet.

Human ignition

A common perception is that most wildfires are caused by acts of nature, such as lightning. The opposite is true, said Dr. Joel Levine, a biomass burning expert at NASA Langley Research Center in Hampton, Va.

"What we found is that 90 percent of biomass burning is human instigated," said Levine, who was the principal investigator for a NASA biomass burning program that ran from 1985 to 1999.

Levine and others in the Langley-led Biomass Burning Program travelled to wildfires in Canada, California, Russia, South Africa, Mexico and the wetlands of NASA's Kennedy Space Center in Florida.

Biomass burning accounts for some 30 percent of the carbon dioxide added to the atmosphere each year, a leading cause of global warming, Levine said.

Dr. Paul F. Crutzen, a pioneer of biomass burning research, was the first to document the gases, beyond carbon dioxide, that wildfires produce.

"Modern global estimates agree rather well with the initial values," said Crutzen, who shared the 1995 Nobel Prize in Chemistry with Mario J. Molina and F. Sherwood Rowland for their "work in atmospheric chemistry, particularly concerning the formation and decomposition of ozone."

Northern exposure

Whether biomass burning is on the rise globally is not clear. But it is definitely increasing in far northern latitudes, in "boreal" forests composed largely of coniferous trees and peatlands.

The reason is that, unlike the tropics, northern latitudes are warming, and experiencing less precipitation, making them more susceptible to fire. Coniferous trees shed needles, which are stored in deep organic layers over time, providing abundant fuel for fires, said Soja, whose work at the NIA supports NASA.

"That's one of the reasons northern latitudes are so important," she said, "and the smoldering peat causes horrible air quality that can affect human health and result in death."

Fires in different ecosystems burn at different temperatures due to the nature and structure of the biomass and its moisture content. Burning biomass varies from very thin, dry grasses in savannahs to the very dense and massive, moister trees of the boreal, temperate and tropical forests.

Fire combustion products vary over a range depending on the degree of combustion, said Levine, who authored a chapter on biomass burning for a book titled "Methane and Climate Change," published in August by Earthscan.

Flaming combustion like the kind in thin, small, dry grasses in savannahs results in near-complete combustion and produces mostly carbon dioxide. Smoldering combustion in moist, larger fuels like those in forest and peatlands results in incomplete combustion and dirtier emission products such as carbon monoxide.

Boreal fires burn the hottest and contribute more pollutants per unit area burned.

'Eerie experience'

Being near large wildfires is a unique experience, said Levine. "The smoke is so thick it looks like twilight. It blocks out the sun. It looks like another planet. It's a very eerie experience."

In Russia, the wildfires are believed to have been caused by a warming climate that made this summer the country's hottest on record. The hotter weather increases the incidence of lightning, the major cause of naturally occurring biomass burning.

Soja said she hopes the wildfires in Russia prompt the country to support efforts to mitigate climate change. In fact, Russia's president, Dmitri A. Medvedev, last month acknowledged the need to do something about it.

"What's happening with the planet's climate right now needs to be a wake-up call to all of us, meaning all heads of state, all heads of social organizations, in order to take a more energetic approach to countering the global changes to the climate," said Medvedev, in contrast to Russia's long-standing position that human-induced climate change is not occurring.

For more information visit http://www.nasa.gov/topics/earth/features/wildfires.html

Hello, Saturn Summer Solstice: Cassini’s New Chapter

Turning a midsummer night's dream into reality, NASA's Cassini spacecraft begins its new mission extension -- the Cassini Solstice Mission -- today. The mission extension will take Cassini a few months past Saturn's northern summer solstice (or midsummer) through September 2017. It will enable scientists to study seasonal changes and other long-term weather changes on Saturn and its moons.

Cassini arrived just after Saturn's northern winter solstice in 2004, and the extension continues a few months past the northern summer solstice in May 2017. A complete seasonal period on Saturn has never been studied at this level of detail.

Cassini has revealed a bounty of scientific discoveries since its launch in 1997, including previously unknown characteristics of the Earth-like world of Saturn's moon Titan, and the plume of water vapor and organic particles spewing from another moon, Enceladus.

The Cassini Solstice Mission will enable continued study of these intriguing worlds. It will also allow scientists to continue observations of Saturn's rings and the magnetic bubble around the planet, known as the magnetosphere. Near the end of the mission, the spacecraft will make repeated dives between Saturn and its rings to obtain in-depth knowledge of the gas giant. During these dives, the spacecraft will study the internal structure of Saturn, its magnetic fluctuations and ring mass.

Cassini entered orbit around Saturn in 2004. Mission managers had originally planned for a four-year tour of the Saturnian system. In 2008, Cassini received a mission extension through September 2010 to probe the planet and its moons through equinox, when the sun was directly over the equator. Equinox, which occurred in August 2009, marked the turn from southern fall to northern spring. The second mission extension, called the Cassini Solstice Mission, was announced earlier this year.

"After nearly seven years in transit and six years in Saturn orbit, this spacecraft still just hums along," said Bob Mitchell, Cassini program manager at NASA's Jet Propulsion Laboratory, Pasadena, Calif. "With seven more years to go, the science should be just as exciting as what we've seen so far."

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. JPL manages the project for NASA's Science Mission Directorate in Washington. The Cassini orbiter was designed, developed and assembled at JPL.

For more information visit http://www.jpl.nasa.gov/news/news.cfm?release=2010-314




Shining Starlight on the Dark Cocoons of Star Birth

This series of images from NASA's Spitzer Space Telescope shows a dark mass of gas and dust, called a core, where new stars and planets will likely spring up.

Astronomers have discovered a new cosmic phenomenon, termed "coreshine," which is revealing new information about how stars and planets come to be.

The scientists used data from NASA's Spitzer Space Telescope to measure infrared light deflecting off cores -- cold, dark cocoons where young stars and planetary systems are blossoming. This coreshine effect, which occurs when starlight from nearby stars bounces off the cores, reveals information about their age and consistency. In a new paper, to be published Friday, Sept. 24, in the journal Science, the team reports finding coreshine across dozens of dark cores.

"Dark clouds in our Milky Way galaxy, far from Earth, are huge places where new stars are born. But they are shy and hide themselves in a shroud of dust so that we cannot see what happens inside," said Laurent Pagani of the Observatoire de Paris and the Centre National de la Recherche Scientifique, both in France. "We have found a new way to peer into them. They are like ghosts because we see them but we also see through them."

Pagani and his team first observed one case of the coreshine phenomenon in 2009. They were surprised to see that starlight was scattering off a dark core in the form of infrared light that Spitzer could see. They had thought the grains of dust making up the core were too small to deflect the starlight; instead, they expected the starlight would travel straight through. Their finding told them that the dust grains were bigger than previously thought -- about 1 micron instead of 0.1 micron (a typical human hair is about 100 microns).

This particular core lies deep within a larger dark cloud called L183.

That might not sound like a big difference, but it can significantly change astronomers' models of star and planet formation. For one thing, the larger grain size means that planets -- which form as dust circling young stars sticks together -- might take shape more quickly. In other words, the tiny seeds for planet formation may be forming very early on, when a star is still in its pre-embryonic phase.

But this particular object observed in 2009 could have been a fluke. The researchers did not know if what they found was true of other dark clouds -- until now. In the new study, they examine 110 dark cores, and find that about half of them exhibit coreshine.

The finding amounts to a new tool for not only studying the dust making up the dark cores, but also for assessing their age. The more developed star-forming cores will have larger dust grains, so, using this tool, astronomers can better map their ages across our Milky Way galaxy. Coreshine can also help in constructing three-dimensional models of the cores -- the deflected starlight is scattered in a way that is dependent on the cloud structures.

Said Pagani, "We're opening a new window on the realm of dark, star-forming cores."

Other authors are Aurore Bacmann of the Astrophysics Laboratory of Grenoble, France, and Jürgen Steinacker, Amelia Stutz and Thomas Henning of the Max-Planck Institute for Astronomy, Germany. Steinacker is also with the Observatoire de Paris, and Stutz is also with the University of Arizona, Tucson.

The Spitzer measurements are based on data from the mission's public archive, taken before the telescope ran out of its liquid coolant in May 2009 and began its current warm mission.

NASA's Jet Propulsion Laboratory, Pasadena, Calif., manages the Spitzer Space Telescope mission for NASA's Science Mission Directorate, Washington. Science operations are conducted at the Spitzer Science Center at the California Institute of Technology, also in Pasadena. Caltech manages JPL for NASA.

For more information visit http://www.nasa.gov/mission_pages/spitzer/news/spitzer20100923.html

New Map Offers a Global View of Health-Sapping Air Pollution

In many developing countries, the absence of surface-based air pollution sensors makes it difficult, and in some cases impossible, to get even a rough estimate of the abundance of a subcategory of airborne particles that epidemiologists suspect contributes to millions of premature deaths each year. The problematic particles, called fine particulate matter (PM2.5), are 2.5 micrometers or less in diameter, about a tenth the width of a human hair. These small particles can get past the body’s normal defenses and penetrate deep into the lungs.

To fill in these gaps in surface-based PM2.5 measurements, experts look toward satellites to provide a global perspective. Yet, satellite instruments have generally struggled to achieve accurate measurements of the particles in near-surface air. The problem: Most satellite instruments can't distinguish particles close to the ground from those high in the atmosphere. In addition, clouds tend to obscure the view. And bright land surfaces, such as snow, desert sand, and those found in certain urban areas, can mar measurements.

However, the view got a bit clearer this summer with the publication of the first long-term global map of PM2.5 in a recent issue of Environmental Health Perspectives. Canadian researchers Aaron van Donkelaar and Randall Martin at Dalhousie University, Halifax, Nova Scotia, Canada, created the map by blending total-column aerosol amount measurements from two NASA satellite instruments with information about the vertical distribution of aerosols from a computer model.

Their map, which shows the average PM2.5 results between 2001 and 2006, offers the most comprehensive view of the health-sapping particles to date. Though the new blending technique has not necessarily produced more accurate pollution measurements over developed regions that have well-established surface-based monitoring networks, it has provided the first PM2.5 satellite estimates in a number of developing countries that have had no estimates of air pollution levels until now.

The map shows very high levels of PM2.5 in a broad swath stretching from the Sahara Desert in Northern Africa to Eastern Asia. When compared with maps of population density, it suggests that more than 80 percent of the world's population breathes polluted air that exceeds the World Health Organization's recommended level of 10 micrograms per cubic meter. Levels of PM2.5 are comparatively low in the United States, though noticeable pockets are clearly visible over urban areas in the Midwest and East.

"We still have plenty of work to do to refine this map, but it's a real step forward," said Martin, one of the atmospheric scientists who created the map."We hope this data will be useful in areas that don't have access to robust ground-based measurements."

Piecing Together the Health Impacts of PM2.5

Take a deep breath. Even if the air looks clear, it's nearly certain you've inhaled millions of PM2.5 particles. Though often invisible to humans, such particles are present everywhere in Earth's atmosphere, and they come from both natural and human sources. Researchers are still working to quantify the precise percentage of natural versus human-generated PM2.5, but it's clear that both types contribute to the hotspots that show up in the new map.

Wind, for example, lifts large amounts of mineral dust aloft in the Arabian and Saharan deserts. In many heavily urbanized areas, such as eastern China and northern India, power plants and factories that burn coal lack filters and produce a steady stream of sulfate and soot particles. Motor vehicle exhaust also creates significant amounts of nitrates and other particles. Both agricultural burning and diesel engines yield dark sooty particles scientists call black carbon.

Human-generated particles often predominate in urban air -- what most people actually breathe -- and these particles trouble medical experts the most, explained Arden Pope, an epidemiologist at Brigham Young University, Provo, Utah, and one of the world's leading experts on the health impacts of air pollution. That's because the smaller PM2.5 particles evade the body's defenses—small hair-like structures in the respiratory tract called cilia and hairs in our noses—that do a reasonably good job of clearing or filtering out the larger particles.

Small particles can make their way deep into human lungs and some ultrafine particles can even enter the bloodstream. Once there, they can spark a whole range of diseases including asthma, cardiovascular disease, and bronchitis. The American Heart Association estimates that in the United States alone, PM2.5 air pollution sparks some 60,000 deaths a year.

Though PM2.5 as a class of particle clearly poses health problems, researchers have had less success assigning blame to specific types of particles. "There are still big debates about which type of particle is the most toxic," said Pope. "We're not sure whether it's the sulfates, or the nitrates, or even fine dust that's the most problematic."

One of the big sticking points: PM2.5 particles frequently mix and create hybrid particles, making it difficult for both satellite and ground-based instruments to parse out the individual effects of the particles.

The Promise of Satellites and PM2.5

The new map, and research that builds upon it, will help guide researchers who attempt to address this and a number of other unresolved questions about PM2.5. The most basic: how much of a public health toll does air pollution take around the globe? "We can see clearly that a tremendous number of people are exposed to high levels of particulates," said Martin. "But, so far, nobody has looked at what that means in terms of mortality and disease. Most of the epidemiology has focused on developed countries in North America and Europe."

Now, with this map and dataset in hand, epidemiologists can start to look more closely at how long-term exposure to particulate matter in rarely studied parts of the world – such as Asia's fast-growing cities or areas in North Africa with large quantities of dust in the air – affects human health. The new information could even be useful in parts of the United States or Western Europe where surface monitors, still the gold standard for measuring air quality, are sparse.

In addition to using satellite data from the Multi-angle Imaging SpectroRadiometer (MISR), which flies on NASA's Terra satellite, and the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument, which flies on both NASA's Aqua and Terra satellites, the researchers used output from a chemical transport model called GEOS-Chem to create the new map.

However, the map does not represent the final word on the global distribution of PM2.5, the researchers who made it emphasize. Although the data blending technique van Donkelaar applied provides a clearer global view of fine particulates, the abundance of PM2.5 could still be off by 25 percent or more in some areas due to remaining uncertainties, explained Ralph Kahn, an expert in remote sensing from NASA's Goddard Space Flight Center in Greenbelt, Md. and one of the coauthors of the paper.

To improve understanding of airborne particles, NASA scientists have plans to participate in numerous upcoming field campaigns and satellite missions. NASA Goddard, for example, operates a global network of ground-based particle sensors called AERONET that site managers are currently working to enhance and expand. And, later next year, scientists from Goddard's Institute for Space Studies (GISS) in New York will begin to analyze the first data from Glory, a satellite that carries an innovative type of instrument—a polarimeter—that will measure particle properties in new ways and complement existing instruments capable of measuring aerosols from space.

"We still have some work to do in order to realize the full potential of satellite measurements of air pollution," said Raymond Hoff, the director of the Goddard Earth Science and Technology Center at the University of Maryland-Baltimore County and the author of a comprehensive review article on the topic published recently in the Journal of the Air & Waste Management Association. "But this is an important step forward."

For more information visit http://www.nasa.gov/topics/earth/features/health-sapping.html

NASA Study Shows Desert Dust Cuts Colorado River Flow

Dust-covered snow in the San Juan Mountains.
Snowmelt in the Colorado River basin is occurring earlier, reducing runoff and the amount of crucial water available downstream. A new study shows this is due to increased dust caused by human activities in the region during the past 150 years.

The study, led by a NASA scientist and funded by the agency and the National Science Foundation, showed peak spring runoff now comes three weeks earlier than before the region was settled and soils were disturbed. Annual runoff is lower by more than five percent on average compared to pre-settlement levels.

The findings have major implications for the 27 million people in the seven U.S. states and Mexico who rely on the Colorado River for drinking, agricultural and industrial water. The results were published in this week's Proceedings of the National Academy of Sciences.

The research team was led by Tom Painter, a snow hydrologist at both NASA's Jet Propulsion Laboratory in Pasadena, Calif., and UCLA. The team examined the impact of human-produced dust deposits on mountain snowpacks over the Upper Colorado River basin between 1915 and 2003. Studies of lake sediment cores showed the amount of dust falling in the Rocky Mountains increased by 500 to 600 percent since the mid-to-late 1800s, when grazing and agriculture began to disturb fragile but stable desert soils.

The team used an advanced hydrology model to simulate the balance of water flowing into and out of the river basin under current dusty conditions, and those that existed before soil was disturbed. Hydrologic data gathered from field studies funded by NASA and the National Science Foundation, and measurements of the absorption of sunlight by dust in snow, were combined with the modeling.

Snow pits like this one, dug in the mountains of the Upper Colorado River basin, were used to study the layering of dust in snowfields and its impact on the absorption of sunlight.

More than 80 percent of sunlight falling on fresh snow is typically reflected back into space. In the semi-arid regions of the Colorado Plateau and Great Basin, winds blow desert dust east, triggering dust-on-snow events. When dark dust particles fall on snow, they reduce its ability to reflect sunlight, and the snow absorbs more of the sun's energy. This darker snow cover melts earlier, with some water evaporating into the atmosphere.

Earlier melt seasons expose vegetation sooner, and plants lose water to the atmosphere through the exhalation of vapor. The study shows an annual average of approximately 35 billion cubic feet of water is lost from this exhalation and the overall evaporation that would otherwise feed the Colorado River. This is enough water to supply Los Angeles for 18 months.

"The compressed mountain runoff period makes water management more difficult than a slower runoff," Painter said. "With the more rapid runoff under dust-accelerated melt, costly errors are more likely to be made when water is released from and captured in Colorado River reservoirs."

Prior to the study, scientists and water managers had a poor understanding of dust-on-snow events. Scientists knew from theory and modeling studies that dust could be changing the way snowfields reflect and absorb sunlight, but no one had measured its full impact on snowmelt rates and runoff over the river basin. The team addressed these uncertainties by making systematic measurements of the sources, frequency and snowmelt impact of dust-on-snow events.

"These researchers brought together their collective expertise to provide a historical context for how the Colorado River and its runoff respond to dust deposition on snow," said Anjuli Bamzai, program director in the National Science Foundation's Division of Atmospheric and Geospace Sciences in Arlington, Va. "The work lays the foundation for future sound water resource management."

Painter believes steps can be taken to reduce the severity of dust-on-snow events in the Colorado River basin. He points to the impact of the Taylor Grazing Act of 1934 for potential guidance on how dust loads can be reduced. The act regulated grazing on public lands to improve rangeland conditions. Lake sediment studies show it decreased the amount of dust falling in the Rocky Mountains by about one quarter.

"Restoration of desert soils could increase the duration of snow cover, simplifying water management, increasing water supplies and reducing the need for additional reservoir storage of water. Peak runoff under cleaner conditions would then come later in summer, when agricultural and other water demands are greater," Painter said.

"It could also at least partially mitigate the expected regional impacts of climate change, which include reduced Colorado River flows, increased year-to-year variability in its flow rate, and more severe and longer droughts," he added. "Climate models project a seven to 20 percent reduction in Colorado River basin runoff in this century due to climate change."

Other institutions participating in the study include the National Snow and Ice Data Center in Boulder, Colo.; U.S. Geological Survey Southwest Biological Center in Moab, Utah; University of Washington in Seattle; Center for Snow and Avalanche Studies in Silverton, Colo.; and the University of Colorado-NOAA Western Water Assessment in Boulder.

For more information visit http://www.nasa.gov/topics/earth/features/colorado20100920.html

Spring on Titan brings sunshine and patchy clouds

The northern hemisphere of Saturn's moon Titan is set for mainly fine spring weather, with polar skies clearing since the equinox in August last year. The visual and infrared mapping spectrometer (VIMS) aboard NASA's Cassini spacecraft has been monitoring clouds on Titan regularly since the spacecraft entered orbit around Saturn in 2004. Now, a group led by Sébastien Rodriguez, a Cassini VIMS team collaborator based at Université Paris Diderot, France, has analyzed more than 2,000 VIMS images to create the first long-term study of Titan's weather using observational data that also includes the equinox. Equinox, when the sun shone directly over the equator, occurred in August 2009.

Rodriguez is presenting the results and new images at the European Planetary Science Congress in Rome on Sept. 22.

Though Titan's surface is far colder than Earth's and lacks liquid water, this moon is a kind of "sister world" to Earth because it has a surface covered with organic material and an atmosphere whose chemical composition harkens back to an early Earth. Titan has a hydrological cycle similar to Earth's, though Titan's cycle depends on methane and ethane rather than water.

A season on Titan lasts about seven Earth years. Rodriguez and colleagues observed significant atmospheric changes between July 2004 (early summer in Titan's southern hemisphere) and April 2010 (the very start of northern spring). The images showed that cloud activity has recently decreased near both of Titan's poles. These regions had been heavily overcast during the late southern summer until 2008, a few months before the equinox.

Over the past six years, the scientists found that clouds clustered in three distinct latitude regions of Titan: large clouds at the north pole, patchy clouds at the south pole and a narrow belt around 40 degrees south. "However, we are now seeing evidence of a seasonal circulation turnover on Titan – the clouds at the south pole completely disappeared just before the equinox and the clouds in the north are thinning out," Rodriguez said. "This agrees with predictions from models and we are expecting to see cloud activity reverse from one hemisphere to another in the coming decade as southern winter approaches."

The Cassini-Huygens mission is a cooperative project of NASA, the European Space Agency and the Italian Space Agency. JPL manages the mission for NASA's Science Mission Directorate, Washington, D.C. The visual and infrared mapping spectrometer team is based at the University of Arizona, Tucson.

For more information visit http://www.jpl.nasa.gov/news/news.cfm?release=2010-308

Shuttle Displays Convey Lasting Tribute

Throughout history, art always seems to inspire emotion. But in Shuttle Launch Director Mike Leinbach's office overlooking Firing Room 4 of the Launch Control Center, his emotions have inspired several beautiful works of art.

With the end of the Space Shuttle Program approaching, Leinbach came up with an idea in February to honor the final flight of Atlantis, Discovery and Endeavour.

"The original idea was to unveil something after the last launch of each vehicle in Firing Room 4," Leinbach said. "I personally wanted them in there because of the teams I've worked with as launch director throughout the last 10 years. It means a lot to me to have them there."

With the help of Management Support Assistant Amy Simpson, designing the first display began in April. The plan was to get Atlantis' tribute up before its final scheduled launch.

"The graphics people went from concept to design quickly," Leinbach said. "They put their heart and soul into it. It's so unique to each processing team."

The first design was a 2-foot-by-3-foot piece of work, but the size eventually grew to 5 by 7 feet. The displays couldn't be too heavy, so they are made of foam board and held up with Velcro. Each shuttle is a separate piece, which makes it three-dimensional. Atlantis' display was completed May 7; the spacecraft launched May 14 and landed May 26.

"We did Atlantis separately really well," Leinbach said.

Leinbach and Simpson decided to put the final four displays up at once so workers could enjoy them as the final launches take place. The art for Columbia, Challenger, Discovery and Endeavour went up the last week of July. The displays are on the wall above the shuttle countdown clock in the order of their delivery date.

"They represent everybody who has contributed to the Space Shuttle Program, especially here at Kennedy Space Center, throughout the last 30 years," Leinbach said. "I have a very, very close, personal relationship with those orbiters. And that may sound funny, but the people who work on the vehicles know exactly what I'm saying.

"We end up loving the vehicles as much as we love the crews. It's hard to explain to the public, but everyone who reads this will understand what I'm saying."

Five replica posters are on Firing Room 4's north wall. Each gives a brief description of the artwork and contains information that couldn't be represented in the display.

Leinbach asked the three orbiter flow directors what they wanted to do with their final launch and coordinated their ideas for each display with two graphic artists, Amy Lombardo and Lynda Brammer of Abacus Technology Corp. Lombardo designed four of the five displays.

Brammer handled the Challenger piece.

"These were the first displays I have ever created," Lombardo said. "I wanted to meet the standards and expectations set by my coworkers, particularly those of Lynda Brammer. Her beautiful work graces walls across the center and I hoped that my work would measure up."

The tributes already have received accolades. As recently as Sept. 7, the Endeavour image was NASA's Image of the Day. Excitement over the displays has come from around the world.

"We knew they were going to be popular, but we had no idea how well they were going to be received," Leinbach said. "I've gotten calls and e-mails from every center... friends I've made throughout my career. It's pretty neat.

"I hope they remain in Firing Room 4 forever. It will be a lasting tribute to the orbiters and the processing teams."

For more information visit http://www.nasa.gov/mission_pages/shuttle/flyout/art_tribute.html

Cosmic Ice Sculptures

Enjoying a frozen treat on a hot summer day can leave a sticky mess as it melts in the Sun and deforms. In the cold vacuum of space, there is no edible ice cream, but there is radiation from massive stars that is carving away at cold molecular clouds, creating bizarre, fantasy-like structures.

These one-light-year-tall pillars of cold hydrogen and dust, imaged by the Hubble Space Telescope, are located in the Carina Nebula. Violent stellar winds and powerful radiation from massive stars are sculpting the surrounding nebula. Inside the dense structures, new stars may be born.

This image of dust pillars in the Carina Nebula is a composite of 2005 observations taken of the region in hydrogen light (light emitted by hydrogen atoms) along with 2010 observations taken in oxygen light (light emitted by oxygen atoms), both times with Hubble's Advanced Camera for Surveys. The immense Carina Nebula is an estimated 7,500 light-years away in the southern constellation Carina.

The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA's Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute (STScI) conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy, Inc. in Washington, D.C.

For more information visit http://www.nasa.gov/mission_pages/hubble/science/ice-sculptures.html

Antarctic Ozone Hole 2010

The yearly depletion of stratospheric ozone over Antarctica – more commonly referred to as the “ozone hole” – started in early August 2010 and is now expanding toward its annual maximum. The hole in the ozone layer typically reaches its maximum area in late September or early October, though atmospheric scientists must wait a few weeks after the maximum to pinpoint when the trend of ozone depletion has slowed down and reversed.

The hole isn’t literal; no part of the stratosphere — the second layer of the atmosphere, between 8 and 50 km (5 and 31 miles) — is empty of ozone. Scientists use "hole" as a metaphor for the area in which ozone concentrations drop below the historical threshold of 220 Dobson Units. Historical levels of ozone were much higher than 220 Dobson Units, according to NASA atmospheric scientist Paul Newman, so this value shows a very large ozone loss.

Earth's ozone layer protects life by absorbing ultraviolet light, which damages DNA in plants and animals (including humans) and leads to skin cancer.

The Ozone Monitoring Instrument (OMI) on NASA’s Aura satellite acquired data for this map of ozone concentrations over Antarctica on September 12, 2010. OMI is a spectrometer that measures the amount of sunlight scattered by Earth’s atmosphere and surface, allowing scientists to assess how much ozone is present at various altitudes — particularly the stratosphere — and near the ground.

So far in 2010, the size and depth of the ozone hole have been slightly below the average for 1979 to 2009, likely because of warmer temperatures in the stratosphere over the far southern hemisphere. However, even slight changes in the meteorology of the region this month could affect the rate of depletion of ozone and how large an area the ozone hole might span. You can follow the progress of the ozone hole by visiting NASA’s Ozone Hole Watch page.

September 16 is the International Day for the Preservation of the Ozone Layer, a commemoration of the day in 1987 when nations commenced the signing of the Montreal Protocol to limit and eventually ban ozone-depleting substances such as chlorofluorocarbons (CFCs) and other chlorine and bromine-containing compounds. The ozone scientific assessment panel for the United Nations Environment Program, which monitors the effectiveness of the Montreal Protocol, is expected to release its latest review of the state of the world’s ozone layer by the end of 2010. (The last assessment was released in 2006.) Newman is one of the four co-chairs of the assessment panel.

For more information visit http://www.nasa.gov/topics/earth/features/ozone0910.html

A Growing La Niña Chills Out the Pacific

The tropical Pacific Ocean has transitioned from last winter's El Niño conditions to a cool La Niña, as shown by new data about sea surface heights, collected by the U.S.-French Ocean Surface Topography Mission (OSTM)/Jason-2 oceanography satellite.

This OSTM/Jason-2 image of the Pacific Ocean is based on the average of 10 days of data centered on Sept. 3, 2010. The image depicts places where the Pacific sea surface height is higher (warmer) than normal as yellow and red, with places where the sea surface is lower (cooler) than normal as blue and purple. Green indicates near-normal conditions. Sea surface height is an indicator of how much of the sun's heat is stored in the upper ocean.

La Niña ocean conditions often follow an El Niño episode and are essentially the opposite of El Niño conditions. During a La Niña episode, trade winds are stronger than normal, and the cold water that normally exists along the coast of South America extends to the central equatorial Pacific. La Niña episodes change global weather patterns and are associated with less moisture in the air over cooler ocean waters, resulting in less rain along the coasts of North and South America and the equator, and more rain in the far Western Pacific.

"This La Niña has strengthened for the past four months, is strong now and is still building," said Climatologist Bill Patzert of NASA's Jet Propulsion Laboratory, Pasadena, Calif. "It will surely impact this coming winter's weather and climate.

"After more than a decade of mostly dry years on the Colorado River watershed and in the American Southwest, and only one normal rain year in the past five years in Southern California, water supplies are dangerously low," Patzert added. "This La Niña could deepen the drought in the already parched Southwest and could also worsen conditions that have fueled Southern California's recent deadly wildfires."
NASA will continue to track this change in Pacific climate.

The comings and goings of El Niño and La Niña are part of a long-term, evolving state of global climate, for which measurements of sea surface height are a key indicator. JPL manages the U.S. portion of the OSTM/Jason-2 mission for NASA's Science Mission Directorate, Washington, D.C.

For more information visit http://www.jpl.nasa.gov/news/news.cfm?release=2010-300

Chandra Finds Evidence for Stellar Cannibalism

Evidence that a star has recently engulfed a companion star or a giant planet has been found using NASA's Chandra X-ray Observatory. The likely existence of such a "cannibal" star provides new insight into how stars and the planets around them may interact as they age.

The star in question, known as BP Piscium (BP Psc), appears to be a more evolved version of our Sun, but with a dusty and gaseous disk surrounding it. A pair of jets several light years long blasting out of the system in opposite directions has also been seen in optical data. While the disk and jets are characteristics of a very young star, several clues -- including the new results from Chandra -- suggest that BP Psc is not what it originally appeared to be.

Instead, astronomers have suggested that BP Psc is an old star in its so-called red giant phase. And, rather than being hallmarks of its youth, the disk and jets are, in fact, remnants of a recent and catastrophic interaction whereby a nearby star or giant planet was consumed by BP Psc.

When stars like the Sun begin to run out of nuclear fuel, they expand and shed their outer layers. Our Sun, for example, is expected to swell so that it nearly reaches or possibly engulfs Earth, as it becomes a red giant star.

"It appears that BP Psc represents a star-eat-star Universe, or maybe a star-eat-planet one," said Joel Kastner of the Rochester Institute of Technology, who led the Chandra study. "Either way, it just shows it's not always friendly out there."

Several pieces of information have led astronomers to rethink how old BP Psc might be. First, BP Psc is not located near any star-forming cloud, and there are no other known young stars in its immediate vicinity. Secondly, in common with most elderly stars, its atmosphere contains only a small amount of lithium. Thirdly, its surface gravity appears to be too weak for a young star and instead matches up with one of an old red giant.

Chandra adds to this story. Young, low-mass stars are brighter than most other stars in X-rays, and so X-ray observations can be used as a sign of how old a star may be. Chandra does detect X-rays from BP Psc, but at a rate that is too low to be from a young star. Instead, the X-ray emission rate measured for BP Psc is consistent with that of rapidly rotating giant stars.

The spectrum of the X-ray emission -- that is how the amount of X-rays changes with wavelength -- is consistent with flares occurring on the surface of the star, or with interactions between the star and the disk surrounding it. The magnetic activity of the star itself might be generated by a dynamo caused by its rapid rotation. This rapid rotation can be caused by the engulfment process.

"It seems that BP Psc has been energized by its meal," said co-author Rodolfo (Rudy) Montez Jr., also from the Rochester Institute of Technology.

The star's surface is obscured throughout the visible and near-infrared bands, so the Chandra observation represents the first detection at any wavelength of BP Psc itself.

"BP Psc shows us that stars like our Sun may live quietly for billions of years," said co-author David Rodriguez from UCLA, "but when they go, they just might take a star or planet or two with them."

Although any close-in planets were presumably devastated when BP Psc turned into a giant star, a second round of planet formation might be occurring in the surrounding disk, hundreds of millions of years after the first round. A new paper using observations with the Spitzer Space Telescope has reported possible evidence for a giant planet in the disk surrounding BP Psc. This might be a newly formed planet or one that was part of the original planetary system.

"Exactly how stars might engulf other stars or planets is a hot topic in astrophysics today," said Kastner. "We have many important details that we still need to work out, so objects like BP Psc are really exciting to find."

These results appeared in The Astrophysical Journal Letters. Other co-authors on the study were Nicolas Grosso of the University of Strasbourg, Ben Zuckerman from UCLA, Marshall Perrin from the Space Telescope Science Institute, Thierry Forveille of the Grenoble Astrophysics Laboratory in France and James Graham from the University of California, Berkeley.

NASA's Marshall Space Flight Center in Huntsville, Ala., manages the Chandra program for NASA's Science Mission Directorate in Washington. The Smithsonian Astrophysical Observatory controls Chandra's science and flight operations from Cambridge, Mass.

For more information visit http://www.nasa.gov/mission_pages/chandra/news/10-118.html

A Snapshot of Sea Ice

The Arctic Ocean is covered by a dynamic layer of sea ice that grows each winter and shrinks each summer, reaching its yearly minimum size each fall. While the 2010 minimum remains to be seen, NASA's Aqua satellite captured this snapshot on Sept. 3.

How does the Aqua satellite "see" sea ice? Microwaves. Everything on Earth’s surface -- including people -- emits microwave radiation, the properties of which vary with the emitter, thereby allowing the AMSR-E microwave sensor on Aqua to map the planet.

Ice emits more microwave radiation than water, making regions of the ocean with floating ice appear much brighter than the open ocean to the AMSR-E sensor. This difference allows the satellite to capture a sea ice record year-round, through cloud cover and the months of polar night. Continuous records are important because sea ice is dynamic. Besides melting and freezing, the ice moves with winds and currents, which can cause it to split or pile up.
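
The brightness contrast described above can be sketched with the simple relation T_B ≈ emissivity × physical temperature. The emissivity values and the single-channel threshold below are rough, illustrative assumptions; the operational sea ice algorithms combine several AMSR-E channels.

```python
# Toy illustration of why ice "glows" brighter than water in microwaves.
# Brightness temperature T_B ~ emissivity * physical temperature.
# The emissivity values below are assumed, ballpark numbers, not the
# calibrated figures real retrieval algorithms use.

def brightness_temp(emissivity, physical_temp_k):
    return emissivity * physical_temp_k

ICE_EMISSIVITY = 0.92     # assumed, typical of first-year sea ice
WATER_EMISSIVITY = 0.45   # assumed, typical of calm open ocean

tb_ice = brightness_temp(ICE_EMISSIVITY, 271.0)    # sea ice near freezing
tb_water = brightness_temp(WATER_EMISSIVITY, 275.0)

def classify(tb, threshold=200.0):
    """Crude single-channel classifier: bright pixels read as ice."""
    return "ice" if tb > threshold else "open water"

print(f"ice pixel:   T_B = {tb_ice:.0f} K -> {classify(tb_ice)}")
print(f"water pixel: T_B = {tb_water:.0f} K -> {classify(tb_water)}")
```

Even with these crude numbers the two surfaces differ by more than 100 K in brightness temperature, which is why the ice edge stands out so sharply to the sensor.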

"The data from AMSR-E and other NASA satellites are critical for understanding the coupling between sea ice and the ocean and atmosphere," said Tom Wagner, Cryosphere program manager at NASA Headquarters in Washington. "It’s important for us to understand these connections to improve our predictive models of how the planet will change."

The Arctic sea ice is a major factor in the global climate system. The ice cools the planet by reflecting sunlight back into space. It also helps drive ocean circulation by converting the warm Pacific water that flows into the Arctic into the cold, saltier water that empties into the Atlantic. The sea ice also fundamentally shapes the Arctic, defining the organisms that make up its ecosystem and keeping the ocean's heat from melting the frozen tundra.

In fall 2009, Arctic sea ice reached its minimum extent on about Sept. 12 -- the third lowest since satellite microwave measurements were first made in 1979. Year-to-year changes can be highly variable, so scientists need many years, even decades, of data to examine long-term trends. Notably, all of the lowest minimums have occurred in the last decade, consistent with other NASA research showing that January 2000 to December 2009 was the warmest decade on record.

As the sea ice nears the 2010 minimum later this month, look for images and analysis from NASA and the National Snow and Ice Data Center, in Boulder, Colo.

For more information visit http://www.nasa.gov/topics/earth/features/ice-min-approach.html

Deadly Tides Mean Early Exit for Hot Jupiters

Bad news for planet hunters: most of the "hot Jupiters" that astronomers have been searching for in star clusters were likely destroyed long ago by their stars. In a paper accepted for publication by the Astrophysical Journal, John Debes and Brian Jackson of NASA's Goddard Space Flight Center in Greenbelt, Md., offer this new explanation for why no transiting planets (planets that pass in front of their stars and temporarily block some of the light) have been found yet in star clusters. The researchers also predict that the planet hunting being done by the Kepler mission is more likely to succeed in younger star clusters than older ones.

"Planets are elusive creatures," says Jackson, a NASA Postdoctoral Program fellow at Goddard, "and we found another reason that they're elusive."

When astronomers began to search for planets in star-packed globular clusters about 10 years ago, they hoped to find many new worlds. One survey of the cluster called 47 Tucanae (47 Tuc), for example, was expected to find at least a dozen planets among the roughly 34,000 candidate stars. "They looked at so many stars, people thought for sure they would find some planets," says Debes, a NASA Postdoctoral Program fellow at Goddard. "But they didn't."

More than 450 exoplanets (short for "extrasolar planets," or planets outside our solar system) have been found, but "most of them have been detected around single stars," Debes notes.

"Globular clusters turn out to be rough neighborhoods for planets," explains Jackson, "because there are lots of stars around to beat up on them and not much for them to eat." The high density of stars in these clusters means that planets can be kicked out of their solar systems by nearby stars. In addition, the globular clusters surveyed so far have been rather poor in metals (elements heavier than hydrogen and helium), which are the raw materials for making planets; this is known as low metallicity.

Debes and Jackson propose that hot Jupiters—large planets that are at least 3 to 4 times closer to their host stars than Mercury is to our sun—are quickly destroyed. In these cramped orbits, the gravitational pull of the planet on the star can create a tide—that is, a bulge—on the star. As the planet orbits, the bulge on the star points a little bit behind the planet and essentially pulls against it; this drag reduces the energy of the planet's orbit, and the planet moves a little closer to the star. Then the bulge on the star gets bigger and saps even more energy from the planet's orbit. This continues for billions of years until the planet crashes into the star or is torn apart by the star's gravity, according to Jackson's model of tidal orbital decay.
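
The runaway character of this process -- a smaller orbit raises a bigger tide, which shrinks the orbit further -- can be sketched with the standard tidal-decay scaling da/dt ∝ -a^(-11/2) for a circular orbit. The constant and starting orbits below are arbitrary illustrative values, not numbers from Debes and Jackson's model.

```python
# Toy integration of tidal orbital decay, da/dt = -C * a**(-11/2),
# the standard scaling for a circular orbit raising a tide on its star.
# C, the orbits, and the disruption radius are illustrative, assumed values.

def decay_time_steps(a0, C, a_roche, dt):
    """Step the semi-major axis down until the planet reaches a_roche,
    where it would be torn apart; return the number of steps taken."""
    a, steps = a0, 0
    while a > a_roche:
        a += -C * a**-5.5 * dt
        steps += 1
    return steps

# Same constant, two starting orbits: the wider orbit survives far
# longer because the decay rate falls off steeply, as a**-5.5.
close = decay_time_steps(a0=1.0, C=1e-4, a_roche=0.3, dt=1.0)
wide = decay_time_steps(a0=1.5, C=1e-4, a_roche=0.3, dt=1.0)
print(close, wide)   # the wider orbit takes roughly 14x longer to decay
```

The steep dependence on orbital distance is why only the closest-in hot Jupiters spiral into their stars within a cluster's lifetime.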

"The last moments for these planets can be pretty dramatic, as their atmospheres are ripped away by their stars' gravity," says Jackson. "It has even been suggested recently the hot Jupiter called WASP-12B is close enough to its star that it is currently being destroyed."

Debes and Jackson modeled what would have happened in 47 Tuc if the tidal effect were unleashed on hot Jupiters. They recreated the range of masses and sizes of the stars in that cluster and simulated a likely arrangement of planets. Then they let the stars' tides go to work on the close-in planets. The model predicted that so many of these planets would be destroyed, the survey would come up empty-handed. "Our model shows that you don't need to consider metallicity to explain the survey results," says Debes, "though this and other effects will also reduce the number of planets."

Ron Gilliland, who is at the Space Telescope Science Institute in Baltimore and participated in the 47 Tuc survey, says, "This analysis of tidal interactions of planets and their host stars provides another potentially good explanation—in addition to the strong correlation between metallicity and the presence of planets—of why we failed to detect exoplanets in 47 Tuc."

In general, Debes and Jackson's model predicts that one-third of the hot Jupiters will be destroyed by the time a cluster is a billion years old, which is still juvenile compared to our solar system (about 4-1/2 billion years old). 47 Tuc has recently been estimated to be more than 11 billion years old. At that age, the researchers expect more than 96% of the hot Jupiters to be gone.
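
A quick back-of-envelope check shows these two quoted fractions hang together. Assuming a simple exponential survival law -- my simplification, not the paper's actual tidal model -- calibrated to one-third destroyed by 1 billion years:

```python
import math

# Calibrate an assumed exponential survival law, S(t) = exp(-t / tau),
# to "one-third of hot Jupiters destroyed by 1 Gyr", then extrapolate
# to 47 Tuc's estimated age of more than 11 billion years.

tau = 1.0 / math.log(1.0 / (2.0 / 3.0))   # e-folding time in Gyr

surviving_at_11 = math.exp(-11.0 / tau)
destroyed_at_11 = 1.0 - surviving_at_11

print(f"e-folding time: {tau:.2f} Gyr")
print(f"destroyed after 11 Gyr: {destroyed_at_11:.1%}")
```

This toy extrapolation gives roughly 99 percent destroyed, consistent with the paper's "more than 96%" for the older, more realistic calculation.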

The Kepler mission, which is searching for hot Jupiters and smaller, Earth-like planets, gives Debes and Jackson a good chance to test their model. Kepler will survey four open clusters—groups of stars that are not as dense as globular clusters—ranging from less than half a billion to nearly 8 billion years old, and all of the clusters have enough raw materials to form significant numbers of planets, Debes notes. If tidal orbital decay is occurring, Debes and Jackson predict, Kepler could find up to three times more Jupiter-sized planets in the youngest cluster than in the oldest one. (An exact number depends on the brightness of the stars, the planets' distance from the stars, and other conditions.)

"If we do find planets in those clusters with Kepler," says Gilliland, a Kepler co-investigator, "looking at the correlations with age and metallicity will be interesting for shaping our understanding of the formation of planets, as well as their continued existence after they are formed."

If the tidal orbital decay model proves right, Debes adds, planet hunting in clusters may become even harder. "The big, obvious planets may be gone, so we'll have to look for smaller, more distant planets," he explains. "That means we will have to look for a much longer time at large numbers of stars and use instruments that are sensitive enough to detect these fainter planets."

The Kepler mission is managed by NASA's Ames Research Center, Moffett Field, Calif., for the Science Mission Directorate at NASA Headquarters in Washington.

For more information visit http://www.nasa.gov/topics/universe/features/early-exit.html

Amateur astronomers are first to detect small objects impacting Jupiter

Amateur astronomers working with professional astronomers have spotted two fireballs lighting up Jupiter's atmosphere this summer, marking the first time Earth-based telescopes have captured relatively small objects burning up in the atmosphere of the giant planet. The two fireballs -- which produced bright freckles on Jupiter that were visible through backyard telescopes -- occurred on June 3, 2010, and August 20, 2010.

A new paper by professional and amateur astronomers, led by Ricardo Hueso of the Universidad del País Vasco, Bilbao, Spain, appears today in the Astrophysical Journal Letters. In the paper, astronomers estimate the object that caused the June 3 fireball was 8 to 13 meters (26 to 43 feet) in diameter. The object is comparable in size to the asteroid 2010 RF12 that flew by Earth on Wednesday, Sept. 8, and slightly larger than the asteroid 2008 TC3, which burned up above Sudan two years ago.

An impact of this kind on Earth would not be expected to cause damage on the ground. The energy released by the June 3 fireball as it collided with Jupiter's atmosphere was five to 10 times less than that of the 1908 Tunguska event on Earth, which knocked over tens of millions of trees in a remote part of Russia. Analysis is continuing on the Aug. 20 fireball, but scientists said it was comparable to the June 3 object.

"Jupiter is a big gravitational vacuum cleaner," said Glenn Orton, a co-author on the paper and an astronomer at NASA's Jet Propulsion Laboratory, Pasadena, Calif. "It is clear now that relatively small objects, remnants of the formation of the solar system 4.5 billion years ago, still hit Jupiter frequently. Scientists are trying to figure out just how frequently."

Orton and colleagues said this kind of discovery couldn't have been made without amateur astronomers around the world, whose observations of Jupiter provide near round-the-clock surveillance that would be impossible to do with the long lines of scientists waiting to use the large telescopes. Amateur astronomers, for example, were the first to see the dark spot that appeared on Jupiter in July 2009 as the result of an impact. Professional astronomers are still analyzing that impact.

Anthony Wesley, an amateur astronomer from Murrumbateman, Australia, who was also the first to take a picture of that dark spot on Jupiter in July 2009, was the first to see the tiny flash on June 3. Amateur astronomers had their telescopes trained on Jupiter that day because they were in the middle of "Jupiter season," when the planet is high in the sky and at its largest size, as seen by backyard telescopes.

Wesley was visiting an amateur astronomer friend about 1,000 kilometers (600 miles) away in Broken Hill, and he set a digital video camera to record images from his telescope at about 60 frames per second. He was watching the live video on a computer screen at his friend's house when he saw a two-and-a-half-second-long flash of light near the limb of the planet.

"It was clear to me straight away it had to be an event on Jupiter," he said. "I'm used to seeing other momentary flashes in the camera from cosmic ray impacts, but this was different. Cosmic ray strikes last only for one frame of video, whereas this flash gradually brightened and then faded over 133 frames."

Wesley sent a message out on his e-mail list of amateur and professional astronomers, which included Orton. After receiving Wesley's e-mail, Christopher Go of Cebu, Philippines -- who like Wesley, is an amateur astronomer -- checked his own recordings and confirmed that he had seen a flash, too.

Before Wesley's work, scientists didn't know these small-size impacts could be observed, Hueso explained. "The discovery of optical flashes produced by objects of this size helps scientists understand how many of these objects are out there and the role they played in the formation of our solar system," Hueso said.

For three days afterward, Hueso and colleagues looked for signs of the impact in high-resolution images from larger telescopes: NASA's Hubble Space Telescope, Gemini Observatory telescopes in Hawaii and Chile, the Keck telescope in Hawaii, the NASA Infrared Telescope Facility in Hawaii and the European Southern Observatory's Very Large Telescope in Chile. Scientists analyzed the images for thermal disruptions and chemical signatures seen in previous images of Jupiter impacts. In this case, they saw no signs of debris, which allowed them to limit the size of the impactor.

Based on all these images, and particularly those obtained by Wesley and Go, the astronomers were able to confirm the flash came from some kind of object – probably a small comet or asteroid – that burned up in Jupiter's atmosphere. The impactor likely had a mass of about 500 to 2,000 metric tons (1 million to 4 million pounds), probably about 100,000 times less massive than the object in July 2009.

Calculations also estimated this June 3 impact released about 1 to 4 quadrillion joules (300 million to 1 billion kilowatt-hours) of energy. The second fireball, on Aug. 20, was detected by Japanese amateur astronomer Masayuki Tachikawa and later confirmed by Aoki Kazuo and Masayuki Ishimaru. It flashed for about 1.5 seconds. The Keck telescope, observing less than a day later, also found no subsequent debris. Scientists are still analyzing this second flash.
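
The quoted mass and energy are mutually consistent: a 500-to-2,000-metric-ton body arriving near Jupiter's escape speed carries roughly 1 to 4 quadrillion joules of kinetic energy. Using escape speed (~60 km/s) as the impact speed is my assumption here; the real entry speed is at least that large.

```python
# Consistency check on the quoted figures: kinetic energy = 1/2 m v^2
# for the estimated 500-2,000 metric ton impactor.

V_IMPACT = 60e3  # m/s, assumed: roughly Jupiter's escape speed

def kinetic_energy_j(mass_kg, speed_m_s=V_IMPACT):
    return 0.5 * mass_kg * speed_m_s**2

low = kinetic_energy_j(500e3)     # 500 metric tons, in kg
high = kinetic_energy_j(2000e3)   # 2,000 metric tons, in kg

print(f"{low:.1e} J to {high:.1e} J")  # ~0.9e15 to 3.6e15 J
```

That range brackets the article's 1 to 4 quadrillion joules, so the mass and energy estimates tell the same story.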

Although collisions of this size had never before been detected on Jupiter, some previous models predicted around one collision of this kind per year. Others predicted up to 100 such collisions a year. Scientists now believe the frequency must be closer to the high end of that range.

"It is interesting to note that whereas Earth gets smacked by a 10-meter-sized object about every 10 years on average, it looks as though Jupiter gets hit with the same-sized object a few times each month," said Don Yeomans, manager of the Near-Earth Object Program Office at JPL, who was not involved in the paper. "The Jupiter impact rate is still being refined and studies like this one help to do just that."

For more information visit http://www.nasa.gov/topics/solarsystem/features/jupiter20100909.html

Tally-Ho! Deep Impact Spacecraft Eyes Comet Target

Sixty days before its flyby, a NASA spacecraft takes a picture of its quarry, comet Hartley 2. Image credit: NASA/JPL/UM

On Sunday, Sept. 5, NASA's Deep Impact spacecraft beamed down the first of more than 64,000 images it's expected to take of Comet Hartley 2. The spacecraft, now on an extended mission known as EPOXI, has an appointment with the comet on Nov. 4, 2010.

The mission will use all three of the spacecraft's instruments (two telescopes with digital color cameras and an infrared spectrometer) to scrutinize Hartley 2 for more than two months.

"Like any tourist who can't wait to get to a destination, we have already begun taking pictures of our comet -- Hartley 2," said Tim Larson, the project manager for EPOXI from NASA's Jet Propulsion Laboratory in Pasadena, Calif. "We have to wait for Nov. 4 to get the close-up pictures of the cometary nucleus, but these approach images should keep the science team busy for quite some time as well."

The imaging campaign, along with data from all the instruments aboard Deep Impact, will afford the mission's science team the best extended view of a comet in history during its pass through the inner solar system. With the exception of one six-day break to calibrate instruments and perform a trajectory correction maneuver, the spacecraft will continuously monitor Hartley 2's gas and dust output for the next 79 days.

This first image of comet Hartley 2 taken by Deep Impact was obtained by the spacecraft's Medium Resolution Imager on Sept. 5 when the spacecraft was 60 million kilometers (37.2 million miles) away from the comet.

EPOXI is an extended mission that utilizes the already "in flight" Deep Impact spacecraft to explore distinct celestial targets of opportunity. The name EPOXI itself is a combination of the names for the two extended mission components: the extrasolar planet observations, called Extrasolar Planet Observations and Characterization (EPOCh), and the flyby of comet Hartley 2, called the Deep Impact Extended Investigation (DIXI). The spacecraft will continue to be referred to as "Deep Impact."

NASA's Jet Propulsion Laboratory, Pasadena, Calif., manages the EPOXI mission for NASA's Science Mission Directorate, Washington. The University of Maryland, College Park, is home to the mission's principal investigator, Michael A'Hearn. Drake Deming of NASA's Goddard Space Flight Center, Greenbelt, Md., is the science lead for the mission's extrasolar planet observations. The spacecraft was built for NASA by Ball Aerospace & Technologies Corp., Boulder, Colo.

For more information visit http://www.nasa.gov/topics/solarsystem/features/epoxy20100908.html

NASA Thruster Test Aids Future Robotic Lander’s Ability to Land Safely

NASA's Marshall Space Flight Center in Huntsville, Ala., collaborated with NASA's White Sands Test Facility in Las Cruces, N.M., and Pratt & Whitney Rocketdyne in Canoga Park, Calif., to successfully complete a series of thruster tests at White Sands. The tests will aid in maneuvering and landing the next generation of robotic lunar landers, which could be used to explore the moon's surface and other airless celestial bodies.

The Robotic Lunar Lander Development Project at the Marshall Center performed a series of hot-fire tests on two high thrust-to-weight thrusters – a 100-pound-class for lunar descent and a 5-pound-class for attitude control. The team used a lunar mission profile during the test of the miniaturized thrusters to assess the capability of these thruster technologies for possible use on future NASA spacecraft.

The test program fully accomplished its objectives, including evaluation of combustion stability, engine efficiency, and the ability of the thruster to perform the mission profile and a long-duration, steady-state burn at full power. The test results will allow the Robotic Lander Project to move forward with robotic lander designs using advanced propulsion technology.

The test articles are part of the Divert Attitude Control System, or DACS, developed by the Missile Defense Agency of the U.S. Department of Defense. The control system provides two kinds of propulsion -- one for maneuvering and the other for attitude control. The Attitude Control System thrusters provide roll, pitch and yaw control. These small thrusters were chosen to meet the golf-cart-size lander's requirement for lightweight, compact propulsion components, reducing overall spacecraft mass and mission cost by leveraging an existing government resource.

"The Missile Defense Agency heritage thrusters were originally used for short-duration flights and had not been qualified for space missions, so our engineers tested them to assess their capability for long-duration burns and to evaluate their performance and combustion behavior," said Monica Hammond, lander propulsion task manager for the Robotic Lunar Lander Development Project at the Marshall Center. "The thrusters are a first step in reducing propulsion technology risks for a lander mission. The results will be instrumental in developing future plans associated with the lander's propulsion system design."

During tests of the 100-pound thruster, the Divert Attitude Control System thruster fired under vacuum conditions to simulate operation in a space environment. The tests mimicked the lander mission profile and operation scenarios, including several trajectory correction maneuvers during the cruise phase; nutation control burns to maintain spacecraft orientation; thrust vector correction during the solid motor braking burn; and a terminal descent burn on approach to the lunar surface.

The objective for the 5-pound-class thruster test was similar to that of the 100-pound thruster test, with additional emphasis on assessing thruster heating due to the long-duration mission profile and on operation with MMH/MON-25 -- monomethylhydrazine (MMH) fuel and an oxidizer of nitrogen tetroxide (75 percent) and nitric oxide (25 percent), known as MON-25.

A standard propellant system for spacecraft is MMH/MON-3, whose oxidizer contains 3 percent nitric oxide. An alternative, MMH/MON-25, contains 25 percent nitric oxide. That composition gives it a much lower freezing point than MON-3, making it an attractive alternative for spacecraft because of its thermal benefits and the resulting savings in heater power. Because the MMH/MON-25 propellant system has never been used in space, these tests allowed engineers to benchmark it against the MMH/MON-3 system.

"The lower freezing point could save considerable heater power for the spacecraft and increase thermal margin for the entire propulsion system," said Huu Trinh, lead propulsion engineer for the Robotic Lunar Lander Development Project at the Marshall Center. "These tests showed stable combustion in all scenarios and favorable temperature results."

The Robotic Lunar Lander Development Project is a partnership between the Marshall Center and the Johns Hopkins University Applied Physics Laboratory located in Laurel, Md.

For more information visit http://www.nasa.gov/mission_pages/lunarquest/news/thrust_test.html
