Welcome to Pittsburgh, and Carnegie Mellon University, home of Astrobotic, one of the teams competing for the $20 million Google Lunar XPRIZE.
And the last time we saw Astrobotic, we were in the middle of the Mojave Desert, where they were testing out their landing systems in preparation for landing on the Moon.
Today, we're gonna check out what happens next: see if they can actually broadcast high-definition video from the lunar surface all the way back here to us on Earth.
For the testing we're gonna have to leave the lab and go out in the field, to an active quarry.
That means safety gear is most definitely required, partner.
So we're at the Lafarge test site.
We're testing the image system on the rover.
We're looking at the quality of the images that come off the cameras, and then we're transmitting those over a link similar to what we'll use on the Moon.
From the Earth to the Moon, we have a signal propagation delay of 2.5 seconds.
So when we send a command from Earth, it takes 2.5 seconds to reach the Moon.
And from the Moon, the image feedback that gives the result of that action would be another 2.5 seconds.
So we press a button to make the robot do something, then we see the result 5 seconds later, so we have to plan for that in our driving strategy.
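As a rough sketch of the timing described above: the delay numbers come straight from the transcript (2.5 seconds each way), but the function names are illustrative, not Astrobotic's actual software.

```python
# Sketch of the command/feedback lag the team plans their driving around.
# Delay values are as quoted in the transcript; everything else is made up.
UPLINK_DELAY_S = 2.5    # command: Earth -> Moon
DOWNLINK_DELAY_S = 2.5  # image feedback: Moon -> Earth

def round_trip_delay() -> float:
    """Seconds between pressing a button and seeing the result on Earth."""
    return UPLINK_DELAY_S + DOWNLINK_DELAY_S

print(round_trip_delay())  # 5.0
```

That 5-second feedback loop is why the operators drive in short, planned moves rather than steering continuously.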
We need to prove that the resolution of the images is good enough to drive.
That the compression that's required to get all that data down doesn't destroy the images.
And that we can take nice high-color HD images of the beautiful things that we're gonna see on the [INAUDIBLE].
Yeah, so right now we're looking at the user interface.
For our prototype moonrover.
On it, we can see the images that are fed to it; those are used to control the rover. And the higher-definition images are streamed back to Earth for the viewing of the public.
The two cameras give us stereo video. On the Moon the sunlight's very bright, very grey, it's very flat, and so it's hard to tell distances.
We use the stereo cameras to be able to tell distance to things like rocks and craters, so we don't get stuck.
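The stereo ranging idea can be sketched with the standard pinhole relation, distance = focal length × baseline / disparity. The focal length, baseline, and disparity values below are purely illustrative assumptions, not the rover's real camera parameters.

```python
# Toy stereo-ranging sketch: Z = f * B / d.
# All numbers here are assumed for illustration only.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Distance to a feature (a rock, a crater rim) seen by both cameras."""
    if disparity_px <= 0:
        raise ValueError("feature must appear shifted between the two views")
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, 30 cm camera baseline, 42 px disparity
print(depth_from_disparity(700.0, 0.30, 42.0))  # ~5 m away
```

The key design point is the one the transcript raises: with flat, shadow-washed lunar lighting, a single camera gives almost no depth cues, so disparity between two views is what stands in for distance.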
One of the key requirements of the Google Lunar XPRIZE is to prove that we traveled 500 meters on the Moon.
And this screen actually updates our distance estimate in real time, so we know how far we've traveled. We're also able to see the exact number of rotations for each wheel, because wheel rotations, along with computer vision techniques, are the two key data points for our distance verification.
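The wheel-rotation half of that verification is basically odometry: rotations times wheel circumference gives distance. A minimal sketch, assuming a made-up wheel radius (the rover's real wheel size isn't given in the transcript):

```python
import math

# Wheel-odometry sketch: rotation counts -> distance toward the
# 500 m requirement. Wheel radius is an assumed illustrative value.
WHEEL_RADIUS_M = 0.10  # assumption, not the rover's actual wheel size

def distance_from_rotations(rotations_per_wheel: list[float]) -> float:
    """Average distance implied by each wheel's rotation count (meters)."""
    circumference = 2 * math.pi * WHEEL_RADIUS_M
    per_wheel = [r * circumference for r in rotations_per_wheel]
    return sum(per_wheel) / len(per_wheel)

# Four wheels, each having turned roughly 800 times
print(round(distance_from_rotations([800, 801, 799, 800]), 1))
```

Averaging across wheels smooths out slip on individual wheels, which is also why the team pairs this count with computer-vision distance estimates rather than trusting either source alone.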
The rover has this suspension that allows it to drive at low speed very capably.
It's a single pivot suspension.
One pivot up front, and then all of the rest of the suspension is fixed.
At this point, the big thing they have to do is really just test.
Hey, my name's Jay Kurtz.
I'm one of the judges.
And they're scheduled over the next couple weeks to really start going through, like I say, the thermal, the vacuum, the shock and vibe testing. And really the shock and vibe, the critical part, is to survive the launch.
I don't think they're gonna have a lot of risk there.
We've got a very strong, strong approach to that.
This lander looks very different from the one that we tested in the Mojave, but the engines are the same.
The propulsion fuel system is very similar.
The computing and landing systems are identical.
So what we need to do now is integrate all of those components, fly a system that's got a similar architecture, launch it into space, and land on the Moon.
This is Tim Stevens covering the Google Lunar XPRIZE for CNET.