-I'm Ina Fried with CNET.
I'm here with [unk], who's a member of the Windows Phone testing team, and we're looking at a couple of the robots that help out with the testing.
So, a lot of the tests are done automatically on servers with simulations, but one of the things Microsoft needs to test is how well the actual screens receive touch, and to do that, they use robots.
What do the robots help test?
-This is just referred to as the Adept Touch Robot.
-And what does that first robot do?
-This particular robot does touch accuracy testing. By using the robots, we can repeatedly and consistently cover the entire touch panel in a systematic fashion, touching every point on the panel and getting responses from it. Then we compare the reference positions to the actual responses,
and that tells us how accurate the touch panel is for the device under test.
What we expect to see from this test is a map of reference positions to actual positions. Take this panel, for example: what we're seeing here is that it's very accurate in the center, but there's a certain level of error radiating out toward the edges. That's the kind of information the OEM can then take back
and use to adjust their tuning so that we get a very accurate touch experience.
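As an aside, a minimal sketch of the comparison described above might look like the following. The function and data names are hypothetical, not Microsoft's actual test harness: it simply pairs the positions commanded to the robot with the positions the panel reports and computes the error at each point, which is the kind of error map mentioned here.

```python
import math

# Hypothetical sketch of a touch accuracy comparison (assumed names, not the
# actual test harness): pair each commanded reference position with the
# position reported by the touch panel and compute the per-point error.

def touch_error_map(reference_points, reported_points):
    """Return a list of (reference, reported, error_in_px) tuples."""
    errors = []
    for (rx, ry), (px, py) in zip(reference_points, reported_points):
        error = math.hypot(px - rx, py - ry)  # Euclidean distance in pixels
        errors.append(((rx, ry), (px, py), error))
    return errors

# Illustrative data: accurate in the center, growing error toward the edges.
reference = [(240, 400), (60, 60), (420, 740)]
reported  = [(240, 400), (63, 57), (426, 748)]
for ref, rep, err in touch_error_map(reference, reported):
    print(f"reference {ref} -> reported {rep}, error {err:.1f} px")
```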
In addition to that, the robots can simulate human swipes, and we can do closely spaced swipes across the entire panel to make sure we have that coverage. Of course, it also does the regular jitter testing, where we can hold down on the phone and see how it handles the response under that condition, so there's a myriad of test conditions you can pull in to simulate different user scenarios.
-And then there's a second robot which I believe you guys affectionately call Wally.
What does Wally do?
-Wally actually tests the various sensors on the device itself: the accelerometer, the proximity sensor, and the ambient light sensor. So with a single package, we can verify the performance of these sensors.
So what the robot is doing now is putting the device under test into various reference positions
while we measure the responses from the device. With the accelerometer in place, we get a reading for each of these reference positions, and then we can compare each reading to the reference numbers themselves, and that tells us how accurate the accelerometer is in this device.
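To make the accelerometer comparison concrete, here is a small hedged sketch under assumed names and thresholds (not the actual Wally harness): for each reference orientation, it compares the measured acceleration vector against the expected gravity vector and reports how far off the sensor is, in degrees.

```python
import math

# Hypothetical sketch of an accelerometer accuracy check (assumed names and
# values): compare readings at each reference orientation to the expected
# gravity vector for that orientation.

G = 9.81  # expected gravity magnitude, m/s^2

def accel_error_degrees(expected, measured):
    """Angle in degrees between the expected and measured acceleration vectors."""
    dot = sum(e * m for e, m in zip(expected, measured))
    norm_e = math.sqrt(sum(e * e for e in expected))
    norm_m = math.sqrt(sum(m * m for m in measured))
    cos_angle = max(-1.0, min(1.0, dot / (norm_e * norm_m)))
    return math.degrees(math.acos(cos_angle))

# Illustrative reference positions: device flat, then rotated onto its edge.
positions = {
    "flat":    ((0.0, 0.0, -G), (0.12, -0.05, -9.79)),
    "on_edge": ((0.0, -G, 0.0), (0.08, -9.74, 0.31)),
}
for name, (expected, measured) in positions.items():
    print(f"{name}: {accel_error_degrees(expected, measured):.2f} degrees off reference")
```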
-And all this is so that when people actually have the phone in their hand, it basically performs the way they'd expect it to.
-Absolutely.
It all ties back to the user experience and how the device actually performs in users' hands, so in testing, the robot is a very useful tool in that regard.
-Thanks.
For CNET, I'm Ina Fried.