Drive-By Photography


You’re never too busy to do science, even while rocketing down to the surface of the Moon.

Andy’s scientific mission is to explore a pit in the Lacus Mortis region of the Moon. This skylight, and others like it, are a new kind of pit that we’ve found on the Moon and on Mars. The ancient lava tubes that they open into may be the safe harbor we need to build colonies on other planets.

But on the way down, the lander passes over the pit before touching down near it. That short window would give us an unprecedented combination of perspective and resolution. The resulting view of the walls and tunnel entrance would complement, and maybe even inform, how we explore the pit with Andy.

To accomplish this task, a team at Carnegie Mellon spent two years developing a pit modeling sensor, based on Astrobotic’s landing sensor, and writing a whole new software suite for it.

The mapping sensor, doing early testing in our high bay.

But even before that, this story starts with Astrobotic’s landing sensors. Because of the communication latency to the Moon, pilots on Earth couldn’t control the Griffin lander safely; it has to land itself. With an eye to how crucial that step is, Astrobotic built an exceptional sensor package to fly it down.

That opened an opportunity: with a powerful camera already mapping the landing site as Griffin comes screaming in for a soft landing, why not map the pit as well?

A team of a half dozen undergraduates set out to do just that. With Professor William “Red” Whittaker as their PI and Astrobotic CTO Kevin Peterson as their mentor, the team built a prototype sensor, based on Astrobotic’s, to test their vision. They selected Masten Space Systems for their major verification test. Masten is the team to go to if you need to test a rocket flight system; rockets don’t fly or land at all like helicopters, and Masten keeps a small fleet of reliable ones.

Xombie over the Mojave.

To cover the costs of the prototype and the verification test, they applied for - and handily qualified for - two NASA grants, as well as two smaller grants from within CMU. NASA awarded them an Undergraduate Student Instrument Project grant for the development, and a Flight Opportunities Project grant for the testing. CMU awarded them two Small Undergraduate Research Grants.

With such an ambitious goal, the challenges they faced were considerable. All of the sensors have to be fast and accurate, and together they generate a wealth of data - six gigabytes per minute in total. They have to be carefully calibrated, both intrinsically and with respect to each other. And the moment of truth would be a single test flight, after Masten had finished their own testing of the installation.
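The article doesn’t describe the team’s calibration procedure, but as a rough sketch of what intrinsic calibration usually involves, here is the standard checkerboard method with OpenCV in Python. The filenames and pattern size below are placeholders, not the team’s actual setup:

```python
import cv2
import numpy as np

# Known 3D layout of the checkerboard's inner corners, in board units.
pattern = (9, 6)                                 # placeholder corner grid
grid = np.zeros((pattern[0] * pattern[1], 3), np.float32)
grid[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

object_pts, image_pts, size = [], [], None
for path in ["cal_00.png", "cal_01.png"]:        # hypothetical image files
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        continue
    size = gray.shape[::-1]                      # (width, height)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        object_pts.append(grid)
        image_pts.append(corners)

# Solve for the camera matrix and lens distortion - the "intrinsics".
# Calibrating the cameras against each other (the extrinsics) would use
# cv2.stereoCalibrate on shared views in much the same way.
rms, K, dist, _, _ = cv2.calibrateCamera(
    object_pts, image_pts, size, None, None)
print("reprojection RMS error (px):", rms)
```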

And each sensor is crucial. This mapping project is part of a family of problems in robotics called Simultaneous Localization and Mapping, or SLAM for short. If a robot knows where it is, or knows its surroundings, it can reason out the other; figuring out both at once is hard. The team chose to separate the problems: two of its three HD cameras infer the lander’s location from their binocular vision. Between frames, their data is supplemented by a high-quality Inertial Measurement Unit, or IMU, generously provided by KVH Industries. The third camera and a laser rangefinder (LIDAR) do the actual mapping based on that inferred location.
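To make that split concrete, here is a minimal sketch of the three pieces - stereo ranging, IMU propagation between frames, and LIDAR projection - in Python with NumPy. Every name and simplification here is mine for illustration; it is not the team’s code:

```python
import numpy as np

def stereo_depth(disparity_px, focal_px, baseline_m):
    """Binocular ranging: depth = focal * baseline / disparity,
    for a calibrated, rectified camera pair."""
    return focal_px * baseline_m / disparity_px

def integrate_imu(position, rotation, accel, gyro, dt):
    """Propagate the pose between camera frames from IMU readings.
    A flight pipeline also tracks velocity, gravity, and sensor biases;
    this keeps only enough to show the structure."""
    # First-order rotation update from the gyro's angular rates.
    wx, wy, wz = gyro * dt
    skew = np.array([[0.0, -wz,  wy],
                     [ wz, 0.0, -wx],
                     [-wy,  wx, 0.0]])
    rotation = rotation @ (np.eye(3) + skew)
    # Integrate body-frame acceleration, rotated into the world frame.
    position = position + 0.5 * (rotation @ accel) * dt ** 2
    return position, rotation

def map_lidar_returns(position, rotation, ranges, directions):
    """Place LIDAR returns into the world map at the inferred pose."""
    points_body = directions * ranges[:, None]   # sensor-frame hit points
    return position + points_body @ rotation.T   # world-frame map points
```

Decoupling localization from mapping this way avoids solving the full SLAM problem online, which suits a one-shot descent: the pose stream only has to be good enough to place each LIDAR return as it arrives.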

Half of the CMU team; left to right: Kerry Snyder (Software Lead), Professor Red Whittaker (PI), Neal Bhasin (Student Lead), and Rick Shanor (Mechanical Lead). Not pictured: Oliver Daids (Simulation Dev), Ashrith Balakumar (Mechanical Engineer), Edward Nolan (Electronic Lead), and Kevin Peterson (Astrobotic CTO and mentor).

Rick Shanor checks the sensor before a test.

I interviewed Ashrith Balakumar, one of the team’s two mechanical engineers, when he’d returned from the Masten test.

“The big challenge,” he explained, “is how robust the system is - it just has to work. And it has to handle gigabytes of data every minute; the cameras take nine hundred high resolution pictures a minute, and that doesn’t even count all the IMU data, for example.”
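Those figures hang together: treating the six-gigabyte-per-minute total as an upper bound on the image stream, a quick back-of-envelope puts each frame at under roughly seven megabytes, plausible for lightly compressed HD imagery:

```python
# Back-of-envelope from the figures quoted in the article.
frames_per_minute = 900      # across the three HD cameras
data_per_minute_gb = 6.0     # total sensor output, an upper bound on imagery

mb_per_frame = data_per_minute_gb * 1024 / frames_per_minute
print(f"~{mb_per_frame:.1f} MB per frame")   # ~6.8 MB per frame, at most
```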

For the test, a pit was simulated with a ring of six shipping crates. Xombie would fly over it, mapping it with the sensor package, and land nearby.

A preliminary, sparse version of the sensor's generated map. The simulation pit (hexagon) and landing site (circle) are both visible.

The test, Ashrith continued, was a resounding success. As Xombie rocketed over the Mojave, its sensor system rapidly and accurately gimbaled to keep a good view of every side of the simulated pit. With centimeter resolution, the resulting model is far ahead of any other imagery we have of our target.

The next step is to go back to Astrobotic and reintegrate the software and hardware upgrades with the original landing package. Thanks to the incredible work of this team, Griffin will be able to be both driver and tourist as it rockets down to the surface of the Moon.
