Thursday, June 2, 2011

Two digital color cameras riding high on the mast of NASA's next Mars rover will complement each other in showing the surface of Mars in exquisite detail.

They are the left and right eyes of the Mast Camera, or Mastcam, instrument on the Curiosity rover of NASA's Mars Science Laboratory mission, launching in late 2011.

The right-eye Mastcam looks through a telephoto lens, revealing details near or far with about three-fold better resolution than any previous landscape-viewing camera on the surface of Mars. The left-eye Mastcam provides broader context through a medium-angle lens. Each can acquire thousands of full-color images and store them in an eight-gigabyte flash memory. Both cameras are also capable of recording high-definition video at about eight frames per second. Combining information from the two eyes can yield 3-D views of the telephoto part of the scene.

Motivation to put telephoto capability in Curiosity's main science imaging instrument grew from experience with NASA's Mars Exploration Rover Opportunity and its studies of an arena-size crater in 2004. The science camera on that rover's mast, which resolves detail comparable to what a human eye sees at the same distance, showed intriguing patterns in the layers of Burns Cliff inside Endurance Crater.

"We tried to get over and study it, but the rover could not negotiate the steep slope," recalled Mastcam Principal Investigator Michael Malin, of Malin Space Science Systems, San Diego. "We all desperately coveted a telephoto lens." NASA selected his Mastcam proposal later that year for the Mars Science Laboratory rover.

The telephoto Mastcam, called "Mastcam 100" for its 100-millimeter focal-length lens, provides enough resolution to distinguish a basketball from a football at a distance of seven football fields, or to read "ONE CENT" on a penny on the ground beside the rover. Its images cover an area about six degrees wide by five degrees tall.

Its left-eye partner, called "Mastcam 34" for its 34-millimeter lens, catches a scene three times wider -- about 18 degrees wide and 15 degrees tall -- with each exposure.
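Both sets of figures follow from simple lens geometry: angular field of view is set by the focal length and the width of the detector. A quick sketch in Python -- the ~11 mm detector width here is an illustrative assumption, not a published Mastcam specification:

```python
import math

def fov_deg(focal_mm, sensor_mm):
    """Angular field of view of a simple lens, pinhole-camera model."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

SENSOR_W = 11.0  # assumed detector width in mm (illustrative only)

print(round(fov_deg(34, SENSOR_W), 1))   # wide camera: ~18.4 degrees
print(round(fov_deg(100, SENSOR_W), 1))  # telephoto: ~6.3 degrees
print(round(100 / 34, 2))                # focal-length ratio: 2.94, i.e. ~3x
```

With that assumed width, the 34-millimeter lens covers about 18 degrees and the 100-millimeter lens about 6 degrees, and the "three times wider" figure is simply the ratio of the two focal lengths.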

Researchers will use the Mastcams and nine other science instruments on Curiosity to study past and present environments in a carefully chosen area of Mars. They will assess whether conditions have been favorable for life and favorable for preserving evidence about whether life has existed there. Mastcam imaging of the shapes and colors of landscapes, rocks and soils will provide clues about the history of environmental processes that have formed them and modified them over time. Images and videos of the sky will document contemporary processes, such as movement of clouds and dust.

Previous color cameras on Mars have taken a sequence of exposures through different color filters to be combined on Earth into color views. The Mastcams record color the same way consumer digital cameras do: They have a grid of tiny red, green and blue squares (a "Bayer pattern" filter) fitted over the electronic light detector (the charge-coupled device, or CCD). This allows the Mastcams to get the three color components over the entire scene in a single exposure.
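The trade-off of a Bayer-pattern sensor is that each pixel records only one of the three color components, and the missing two must be interpolated ("demosaiced") from neighboring pixels. A toy nearest-neighbor sketch of that step -- real cameras, Mastcam included, use more sophisticated interpolation:

```python
import numpy as np

def demosaic_nearest(raw):
    """Nearest-neighbor demosaic of an RGGB Bayer mosaic (even dimensions)."""
    h, w = raw.shape
    rgb = np.zeros((h, w, 3), raw.dtype)
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            r = raw[y, x]                            # red sample
            g = (raw[y, x + 1] + raw[y + 1, x]) / 2  # average of two greens
            b = raw[y + 1, x + 1]                    # blue sample
            rgb[y:y + 2, x:x + 2] = (r, g, b)        # fill the 2x2 cell
    return rgb

# A uniform gray scene reconstructs to uniform gray pixels.
flat = np.full((4, 4), 128.0)
print(demosaic_nearest(flat)[0, 0])  # -> [128. 128. 128.]
```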

Mastcam's color-calibration target on the rover deck includes magnets to keep the highly magnetic Martian dust from accumulating on portions of color chips and white-gray-balance reference chips. Natural lighting on Mars tends to be redder than on Earth due to dust in Mars' atmosphere. "True color" images can be produced that incorporate that lighting effect -- comparable to the greenish look of color-film images taken under fluorescent lights on Earth without a white-balancing adjustment. A white-balance calculation can yield a more natural look by adjusting for the tint of the lighting, as the human eye tends to do and digital cameras can do. The Mastcams are capable of producing both true-color and white-balanced images.
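One simple way to do such a white-balance calculation is the "gray world" method, which scales each color channel so the scene average comes out neutral. This is an illustrative choice of algorithm, not necessarily the one used for Mastcam image products:

```python
import numpy as np

def gray_world(img):
    """White-balance an H x W x 3 image so its channel means become equal."""
    means = img.reshape(-1, 3).mean(axis=0)  # average R, G, B over the scene
    gains = means.mean() / means             # per-channel correction gains
    return img * gains

# A flat scene under reddish lighting comes back neutral gray.
tinted = np.full((2, 2, 3), [150.0, 100.0, 80.0])
print(gray_world(tinted)[0, 0])  # -> [110. 110. 110.]
```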

Besides the affixed red-green-blue filter grid, the Mastcams have wheels of other filters that can be rotated into place between the lens and the CCD. These include science spectral filters for examining the ground or sky in narrow bands of visible-light or near-infrared wavelengths. One filter on each camera allows it to look directly at the sun to measure the amount of dust in the atmosphere, a key part of Mars' weather.

"Something we're likely to do frequently is to look at rocks and features with the Mastcam 34 red-green-blue filter, and if we see something of interest, follow that up with the Mastcam 34 and Mastcam 100 science spectral filters," Malin said. "We can use the red-green-blue data for quick reconnaissance and the science filters for target selection."

When Curiosity drives to a new location, Mastcam 34 can record a full-color, full-circle panorama about 60 degrees tall by taking 150 images in about 25 minutes. Using Mastcam 100, the team will be able to broaden the swath of terrain evaluated on either side of the path Curiosity drives, compared to what has been possible with earlier Mars rovers. That will help with selection of the most interesting targets to approach for analysis by Curiosity's other instruments and will provide additional geological context for interpreting data about the chosen targets.
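The frame count is roughly what you would expect from tiling a 360-by-60-degree panorama with 18-by-15-degree fields of view plus some overlap between frames. A back-of-the-envelope estimate -- the 20 percent overlap is our assumption, and the actual pointing strategy will differ:

```python
import math

def frames_needed(pan_w, pan_h, fov_w, fov_h, overlap=0.2):
    """Frames to tile a pan_w x pan_h panorama with overlapping fields of view."""
    step_w = fov_w * (1 - overlap)  # horizontal step between frame centers
    step_h = fov_h * (1 - overlap)  # vertical step
    return math.ceil(pan_w / step_w) * math.ceil(pan_h / step_h)

print(frames_needed(360, 60, 18, 15, overlap=0.0))  # -> 80, no overlap
print(frames_needed(360, 60, 18, 15, overlap=0.2))  # -> 125, 20% overlap
```

With no overlap the minimum is 80 frames; modest overlap brings the count into the neighborhood of the roughly 150 images quoted above.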

The Mastcams will provide still images and video to study motions of the rover -- both for science, such as seeing how soils interact with wheels, and for engineering, such as aiding in use of the robotic arm. In other videos, the team may use cinematic techniques such as panning across a scene and using the rover's movement for "dolly" shots.

Each of the two-megapixel Mastcams can take and store thousands of images, though the amount received on Earth each day will depend on how the science team chooses priorities for the day's available data-transmission volume. Malin anticipates frequent use of Mastcam "thumbnail" frames -- compressed roughly 150-by-150-pixel versions of each image -- as an index of the full-scale images held in the onboard memory.

Malin Space Science Systems built the Mastcam instrument and will operate it. The company's founder, Michael Malin, participated in NASA's Viking missions to Mars in the 1970s, provided the Mars Orbiter Camera for NASA's Mars Global Surveyor mission, and is the principal investigator for both the Context Camera and the Mars Color Imager on NASA's Mars Reconnaissance Orbiter.

The science team for Mastcam and two other instruments the same company provided for Curiosity includes the lead scientist for the mast-mounted science cameras on Mars rovers Spirit and Opportunity (James Bell of Arizona State University); the lead scientist for the mast camera on NASA's Phoenix Mars Lander (Mark Lemmon of Texas A&M University); James Cameron, director of such popular movies as "Titanic" and "Avatar"; and 17 others with expertise in geology, soils, frost, atmosphere, imaging and other topics.

Mastcam 100 and Mastcam 34 were installed onto Curiosity in 2010. Until March 2011, the possibility remained that they might be replaced with a different design: two identical zoom cameras. A zoom camera has an adjustable focal length, changing from wide-angle to telephoto or vice versa. That design had been Malin's original proposal. NASA switched the plan to two fixed-focal-length cameras in 2007 as a cost-cutting measure that preserved the capability to meet the science goals of the mission and the instrument. The agency funded renewed work on the zoom-camera design in 2010, but its development presented challenges that could not be fully overcome in time for required testing on the rover.

Mastcam 34 took images for a mosaic showing Curiosity's upper deck during tests in March 2011 inside a chamber simulating Mars surface temperature and air pressure. Testing of the rover at NASA's Jet Propulsion Laboratory, Pasadena, Calif., will wrap up in time for shipping the rover to NASA Kennedy Space Center in June. Testing and other launch preparations will continue there. The launch period for the Mars Science Laboratory is Nov. 25 to Dec. 18, 2011, with landing on Mars in August 2012.

Wednesday, June 1, 2011

Oxygen-Rich Soil on Moon

The Moon's surface is covered with oxygen-rich soils, Hubble Space Telescope images show. Planetary scientists believe the oxygen could be tapped to power rockets and be a source of oxygen to breathe for future astronauts.

ORLANDO, Fla.--The moon's surface hasn't been stepped on since the Apollo missions in the 1970s. Now, for the first time in more than 30 years, NASA is going back to the moon.

When the last astronaut took the final step on the moon, many people thought we'd never visit it again. Jim Garvin, a planetary scientist at the NASA/Goddard Space Flight Center in Greenbelt, Md., says, "We went. We came. We saw. We conquered ... And we left."

Now, planetary scientists are going back, but this time through the eyes of the Hubble telescope. Brand new images show a side of our moon we've never seen.

"This is the first time we've looked at the moon with Hubble's spectacular vision to understand things about the moon that today we haven't fully understood. This is why exploration's so exciting," Garvin says.

The amazing pictures were captured using ultraviolet light reflected off the moon's surface. They reveal signs of oxygen-rich soils that scientists believe could be used to power rockets and provide breathable oxygen for future life on the moon.

Garvin says, "So, finding resources, learning where they are, and how much there are, and learning then how to use them for people and utilization of human beings on the moon -- women and men -- is really our long-term goal."

A goal that may seem like light years away -- but thanks to these helpful images, living on the moon may be a closer reality. "We're going to learn to live there, we're going to learn to put human exploration and robot exploration together," Garvin says.

The Hubble telescope is normally meant to look at objects light years away, and researchers found focusing Hubble on the moon -- a mere 250,000 miles away -- was more challenging than expected.

BACKGROUND: For years, the Hubble Space Telescope has given scientists spectacular photographs of the farthest reaches of space, but recently the telescope turned its attention a bit closer to home, taking images of the moon. These images -- the first taken with ultraviolet light -- reveal new information about the composition of the moon, with implications for future lunar exploration.

HOW HUBBLE WORKS: Hubble has a long tube that is open at one end, with mirrors to gather and focus light to its "eyes" -- various instruments that enable it to detect different types of light, such as ultraviolet and infrared. Light enters the telescope through the opening and bounces off a primary mirror to a secondary mirror, which reflects the light through a hole in the center of the primary mirror to a focal point behind the primary mirror.

Smaller mirrors distribute the light to the various scientific instruments, which analyze the different wavelengths. Each instrument uses the same kind of array of diodes that are used in digital cameras to capture light. The captured light is stored in on-board computers and relayed to Earth as digital signals, and this data is then transformed into images.

WHAT WE CAN LEARN: Astronomers can glean a lot of useful scientific information from these images. The colors, or spectrum, of light coming from a celestial object form a chemical fingerprint of that object, indicating which elements are present, while the intensity of each color tells us how much of that element is present. The spectrum can also tell astronomers how fast a celestial object is moving toward or away from us, through an effect called the Doppler shift. Light emitted by a moving object is perceived to increase in frequency (a blue shift) if it is moving toward the observer; if the object is moving away, its light is shifted toward the red end of the spectrum.
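In its non-relativistic form, the Doppler relation is a one-line calculation: the shift z is the fractional change in wavelength, and the radial velocity is approximately c times z. The hydrogen H-alpha line used below is just an illustrative choice:

```python
C_KM_S = 299_792.458  # speed of light in km/s

def radial_velocity(rest_nm, observed_nm):
    """Radial velocity from a wavelength shift (non-relativistic Doppler)."""
    z = (observed_nm - rest_nm) / rest_nm  # positive -> redshift
    return C_KM_S * z                      # positive -> receding

# H-alpha (656.28 nm at rest) observed at 656.50 nm: the source is receding.
v = radial_velocity(656.28, 656.50)
print(round(v, 1), "km/s")  # about 100 km/s away from us
```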

NEW INSIGHTS: Thanks to Hubble's high resolution and sensitivity to ultraviolet light, astronomers are able to search for minerals in the lunar crust that may be critical for establishing a sustained human presence on the moon. These include titanium and iron oxides, both of which are sources of oxygen. Since the moon lacks a breathable atmosphere (as well as water), the presence of such minerals is critical. This new data, along with other measurements, will help NASA scientists identify the most promising sites for future robotic and human missions.

Researchers Track Space Junk

A team of researchers from the Royal Institute and Observatory of the Navy (ROA) in Cádiz (Spain) has developed a method to track the movement of geostationary objects using the position of the stars, which could help to monitor space debris. The technique can be used with small telescopes and in places that are not very dark.

Objects or satellites in geostationary orbit (GEO) can always be found above the same point on the Equator, meaning that they appear immobile when observed from Earth. By night, the stars appear to move around them, a feature that scientists have taken advantage of for decades in order to work out the orbit of these objects, using images captured by telescopes, as long as these images contain stars to act as a reference point.
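That "immobile" appearance follows from Kepler's third law: a geostationary orbit is the one whose period, T = 2*pi*sqrt(a^3/GM), equals one sidereal day. A quick check of the implied orbital radius:

```python
import math

GM_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
SIDEREAL_DAY = 86164.1     # Earth's rotation period, seconds
R_EQUATOR = 6_378_137      # Earth's equatorial radius, m

# Kepler's third law solved for the semi-major axis: a = (GM (T/2pi)^2)^(1/3)
a_m = (GM_EARTH * (SIDEREAL_DAY / (2 * math.pi)) ** 2) ** (1 / 3)
print(round(a_m / 1000))                # orbital radius: ~42,164 km
print(round((a_m - R_EQUATOR) / 1000))  # altitude above the equator: ~35,786 km
```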

This method was abandoned when satellites started to incorporate transponders (devices that made it possible to locate them using the data from emitted and reflected signals). However, the classic astrometric techniques are now coming back into vogue due to the growing problem of space debris, which is partly made up of the remains of satellites and engines without active transponders.

"Against this backdrop, we developed optical techniques to precisely observe and position GEO satellites using small and cheap telescopes, and which could be used in places that are not particularly dark, such as cities," says Francisco Javier Montojo, a member of the ROA and lead author of a study published in the journal Advances in Space Research.

The method can be used for directly detecting and monitoring passive objects, such as the space junk in the geostationary ring, where nearly all communications satellites are located. At low orbits (up to around 10,000 km) these remains can be tracked by radar, but above this level the optical technique is more suitable.

Montojo explains that the technique could be of use for satellite monitoring agencies "to back up and calibrate their measurements, to check their manoeuvres, and even to improve the positioning of satellites or prevent them from colliding into other objects."

"The probability of collisions or interferences occurring between objects is no longer considered unappreciable since the first collision between two satellites on 10 February 2009 between America's Iridium33 and the Russians' Cosmos 2251," the researcher points out.

Image software and 'double channel'

The team has created software that can precisely locate the centre of the traces or lines that stars leave in images (due to the photographic exposure time). The main advantage of the programme is that it "globally reduces" the positions of the object to be followed with respect to the available stellar catalogues. To do this, it simultaneously uses all the stars and all the photographs taken by the telescope's CCD camera in one night. It does not matter if some shots lack sufficient reference stars, because they are all examined together as a whole.
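As a toy illustration of the first step in this kind of reduction, here is an intensity-weighted centroid of a star trail on a synthetic CCD frame. The real ROA software is far more elaborate -- it fits whole trails and reduces every frame of a night against the star catalogues at once -- but the basic measurement looks like this:

```python
import numpy as np

def centroid(image):
    """Intensity-weighted centroid (x, y) of a CCD frame."""
    ys, xs = np.indices(image.shape)
    total = image.sum()
    return (xs * image).sum() / total, (ys * image).sum() / total

# Synthetic frame: a short horizontal star trail centered at x=5, y=2.
frame = np.zeros((5, 10))
frame[2, 4:7] = [1.0, 2.0, 1.0]
cx, cy = centroid(frame)
print(cx, cy)  # -> 5.0 2.0
```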

Optical observation allows the object to be located at each moment. Using these data and another piece of (commercial) software, it is possible to determine the orbit of the GEO object, in other words to establish its position and speed, as well as to predict its future positions. The method was validated by tracking three Hispasat satellites (H1C, H1D and Spainsat) and checking the results against those of the Hispasat monitoring agency.

"As an additional original application, we have processed our optical observations along with the distances obtained using another technique known as 'double channel' (signals the travel simultaneously between two clocks or oscillators to adjust the time)," says Montojo. The Time Section of the ROA uses this methodology to remotely compare patterns and adjust the legal Spanish time to International Atomic Time.

Incorporating these other distance measurements leads to a "tremendous reduction" in uncertainty about the satellite's position, markedly improving the ability to determine its orbit.

Data from the ROA's veteran telescope in San Fernando (Cádiz) were used to carry out this study, but in 2010 the institution unveiled another, more modern one at the Montsec Astronomical Observatory in Lleida, co-managed by the Royal Academy of Sciences and Arts of Barcelona. This is the Fabra-ROA Telescope at Montsec (TFRM), which makes remote, robotic observations.

"The new telescope has features that are particularly well suited to detecting space junk, and we hope that in the near future it will play an active part in international programmes to produce catalogues of these kinds of orbital objects," the researcher concludes.

Sending Humans to Mars

What would it take to make a manned mission to Mars a reality? A team of aerospace and textile engineering students from North Carolina State University believe part of the solution may lie in advanced textile materials. The students joined forces to tackle life-support challenges that the aerospace industry has been grappling with for decades.

"One of the big issues, in terms of a manned mission to Mars, is creating living quarters that would protect astronauts from the elements -- from radiation to meteorites," says textile engineering student Brent Carter. "Currently, NASA uses solid materials like aluminum, fiberglass and carbon fibers, which while effective, are large, bulky and difficult to pack within a spacecraft."

Using advanced textile materials, which are flexible and can be treated with various coatings, students designed a 1,900-square-foot inflatable living space that could comfortably house four to six astronauts. This living space is made by layering radiation-shielding materials like Demron™ (used in the safety suits for nuclear workers cleaning up Japan's Fukushima plant) with a gas-tight material made from a polyurethane substrate to hold in air, as well as gold-metallized film that reflects UV rays -- among others. The space is dome-shaped, which will allow those pesky meteorites, prone to showering down on the red planet, to bounce off the astronauts' home away from home without causing significant damage.

"We're using novel applications of high-tech textile technology and applying them to aerospace problems," explains Alex Ray, a textile engineering student and team member. "Being able to work with classmates in aeronautical engineering allowed us to combine our knowledge from both disciplines to really think through some original solutions."

Students also tackled another major issue preventing a manned mission to Mars -- water supply. Currently, astronauts utilize something called a Sabatier reactor to produce water while in space. The Sabatier process involves the reaction of carbon dioxide and hydrogen, with the presence of nickel, at extremely high temperatures and pressure to produce water and methane.
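The reaction is CO2 + 4 H2 -> CH4 + 2 H2O, which makes the water yield easy to estimate from molar masses. A stoichiometric sketch, assuming complete conversion (real reactors fall short of this):

```python
M_H2, M_H2O = 2.016, 18.015  # molar masses in g/mol

def water_from_hydrogen(kg_h2):
    """kg of water from kg of hydrogen via CO2 + 4 H2 -> CH4 + 2 H2O."""
    mol_h2 = kg_h2 * 1000 / M_H2
    mol_h2o = mol_h2 / 2  # 4 mol of H2 yield 2 mol of H2O
    return mol_h2o * M_H2O / 1000

print(round(water_from_hydrogen(1.0), 2))  # each kg of H2 -> ~4.47 kg of water
```

This is why it pays to haul lightweight hydrogen and harvest carbon dioxide on site: each kilogram of hydrogen returns several kilograms of water.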

"We wanted to find a way to improve the current Sabatier reactor so we could still take advantage of the large quantities of carbon dioxide available on Mars, and the fact that it is relatively easy to bring large quantities of hydrogen on the spacecraft, since it is such a lightweight element," says recent aerospace engineering graduate Mark Kaufman, who was also on the design team.

Current Sabatier reactors, Kaufman explains, are long, heavy tubes filled with nickel pellets -- not ideal for bringing on a spacecraft. The student groups worked to develop a fiber material to which they applied nickel nanoparticles to create the same reaction without all the weight and volume. They believe their redesigned Sabatier reactor would be more feasible to carry along on a future space shuttle.

In addition to Carter, Ray and Kaufman, the team also included Kris Tesh, Grant Gilliam, Kasey Orrell, Daniel Page and Zack Hester. Textile engineering professor and former aerospace engineer, Dr. Warren Jasper, served as the faculty sponsor. The team also received valuable feedback from Fred Smith, an advanced life support systems engineer with NASA.

Jasper and the student team will present their project at the NASA-sponsored Revolutionary Aerospace Systems Concepts Academic Linkage (RASC-AL) competition, held June 6-8 in Cocoa Beach, Fla. The project will be judged by NASA and industry experts against other undergraduate groups from across the country. RASC-AL was formed to give university-level engineering students the opportunity to design projects based on NASA engineering challenges, as well as to offer NASA access to new research and design projects by students.

'Dead' Galaxies

University of Michigan astronomers examined old galaxies and were surprised to discover that they are still making new stars. The results provide insights into how galaxies evolve with time. U-M research fellow Alyson Ford and astronomy professor Joel Bregman presented their findings May 31 at a meeting of the Canadian Astronomical Society in London, Ontario.

Using the Wide Field Camera 3 on the Hubble Space Telescope, they saw individual young stars and star clusters in four galaxies that are about 40 million light years away. One light year is about 5.9 trillion miles.

"Scientists thought these were dead galaxies that had finished making stars a long time ago," Ford said. "But we've shown that they are still alive and are forming stars at a fairly low level."

Galaxies generally come in two types: spiral galaxies, like our own Milky Way, and elliptical galaxies. The stars in spiral galaxies lie in a disk that also contains cold, dense gas, from which new stars are regularly formed at a rate of about one sun per year.

Stars in elliptical galaxies, on the other hand, are nearly all billions of years old. These galaxies contain stars that orbit every which way, like bees around a beehive. Ellipticals have little, if any, cold gas, and no star formation was known to occur in them.

"Astronomers previously studied star formation by looking at all of the light from an elliptical galaxy at once, because we usually can't see individual stars," Ford said. "Our trick is to make sensitive ultraviolet images with the Hubble Space Telescope, which allows us to see individual stars."

The technique enabled the astronomers to observe star formation, even if it is as little as one sun every 100,000 years.

Ford and Bregman are working to understand the stellar birth rate and likelihood of stars forming in groups within ellipticals. In the Milky Way, stars usually form in associations containing from tens to 100,000 stars. In elliptical galaxies, conditions are different because there is no disk of cold material to form stars.

"We were confused by some of the colors of objects in our images until we realized that they must be star clusters, so most of the star formation happens in associations," Ford said.

The team's breakthrough came when they observed Messier 105, a normal elliptical galaxy 34 million light years away in the constellation Leo. Though there had been no previous indication of star formation in Messier 105, Ford and Bregman saw a few bright, very blue stars, each resembling a single star 10 to 20 times the mass of the sun.

They also saw objects that aren't blue enough to be single stars, but instead are clusters of many stars. When accounting for these clusters, stars are forming in Messier 105 at an average rate of one sun every 10,000 years, Ford and Bregman concluded. "This is not just a burst of star formation but a continuous process," Ford said.

These findings raise new mysteries, such as the origin of the gas that forms the stars.

"We're at the beginning of a new line of research, which is very exciting, but at times confusing," Bregman said. "We hope to follow up this discovery with new observations that will really give us insight into the process of star formation in these 'dead' galaxies."