The Next Interface: You

Keyboards and mice are being replaced by your voice, hands, and eyes. Explore the future of the interface with CNET's live panel.
-Welcome to the CNET stage, everybody, live here at 2013 CES. -I'm Lindsey Turrentine, editor in chief of CNET Reviews. You're watching us right now on a device that uses either a keyboard, a mouse, a touchscreen, or a remote control. But imagine a world where all of those become obsolete. -Instead, you might interface with your electronics or media in the fairly near future by gesture, by voice, by eye movement, even by brainwave. That's why we think the next interface is actually you. We have 3 guests who are making that a reality today. James Park is CEO of Fitbit, known for wearable tech that monitors your calories and your activity. -And right next to him in the middle of our panel is Matt Rogers. He is founder and VP of engineering at Nest. We are both owners of said product, makers of a connected smart thermostat that manages your home's HVAC behavior by monitoring yours. -And Michael Buckwald is CEO and cofounder of Leap Motion, which has created technology for operating electronics by just using your hands. And I also wanna point out we've got Brian Tong over here on the little dais right there. And he's gonna be showing us some other fascinating technologies from Vuzix, the Muse brainwave-sensing headband from InteraXon, and also Eye Tribe's eye-tracking technology. So, we've got a very interesting array of tech for the next hour. First off though, let's get things started. CNET's Kara Tsuboi will get our mental juices flowing on this remarkable trend. -The way we interact with technology is constantly changing, and the next interface is you. Google's Project Glass gave us a bird's eye view of the future of how humans will become one with technology. The glasses are outfitted with a camera and voice navigation to truly augment the wearer's reality. And when they become available to consumers in 2014, we have to ask, will we ever need or want traditional computer screens again? 
That's just one of several products transforming the human body into an interface. Someday, we won't need all those extras like a mouse, keyboard, and screen to operate technology. Our bodies and gestures will do that for us. The video game industry pounced on this trend several years ago. Sony PlayStation, Nintendo Wii, and Microsoft Xbox all have their own motion-sensing devices that allow gamers to play without a wired controller. And now, you can play games with no controllers and only your eyes, thanks to a company called Eye Tribe out of Denmark. A mobile device camera follows your eyes, instinctively knowing when to turn the page of a book or pop a balloon, making touchscreen technology look passe. When Leap Motion's Leap, which can read your fingers down to the tiniest wiggle, hits store shelves later this year, will the mouse and keyboard become completely obsolete? -It's about interacting with your computer in the same intuitive way you'd interact with the real world. -And what about gadgets that are so in tune with our biorhythms, devices like Fitbit's that can find out more about our bodies in an hour than we've discovered in a lifetime, or Nest's smart thermostat that learns your routine and adjusts temperature based on your daily needs? What are the limits of using the body as an interface with technology, and who's going to draw that line? Is wearable technology merely a passing fad, or can these gadgets and their gamification help change our habits? And how do you protect a wearer's privacy? These gadgets and more in the pipeline are all attempting to change the way we interact with technology and travel through the world. But as they become less like props in a science fiction movie and more of a reality, we'll be forced to answer these important questions. -So, let's ask you guys, just to kick this off. What are the benefits of connecting your body, and your life, and the hardware in your home to technology? 
-So, for health and fitness, you know, it's all about removing friction from logging your daily activities. So, the first step in understanding and improving your health is knowing where you are and where you wanna get to. And when we started Fitbit, a lot of the problem with existing devices was that it was very tedious to track things. You'd wear a device. You'd have to plug it into your computer. You'd have to remember to do a lot of things. With, you know, wearable technology, that truly disappears into your lifestyle. You know, tracking is something that you don't have to think about. It's just an invisible tool or friend that assists you with your goals. -Okay. How about you, Matt? -So, with interface design, you wanna basically boil it down to its most basic components. What's the easiest way of doing something? So, it used to be, you know, with the old technology we had, the touchscreen was the easiest way to interact with your phone, with your device. But that's not necessarily one size fits all. So, when we did Nest, we thought, well, what's the best way you wanna interact with your temperature at home? You just wanna turn it. You wanna turn it up and turn it down. That's the most natural gesture for that. So, we built an entire product around that gesture. -Interesting. Mike? -The philosophy behind Leap is very much letting people reach into their computers like we interact with things in the real world, in the same way that my reaching out and picking up this bottle is a very complex, but totally thoughtless action that's totally instant. That's the same experience that we want to bring to these unimaginably powerful computers that are literally everywhere we go. -Now, Michael, of all the technologies that we're seeing here, specifically up on the stage with you guys, you're freaking people out the most. It's like magic. 
I mean, all of this stuff is really impressive, but you were describing to me that hand gestures are not just very natural, but they're also high bandwidth in a sense. What does that mean when you're dealing with analogue human hands? -Yeah. So, it's all about letting people bring in the complexity, basically the fact that when I reach out and pick up this object, I am transmitting thousands of data points to my muscles in terms of how far forward I should move, how quickly I should move forward, when I should grab. And if we can let people bring that to a computer, and they can sit in front of something like a 3D virtual piece of clay and sculpt that piece of clay with their hands and fingers, suddenly, people who have never created anything on a computer ever can create a complex 3D model that's just as good as one made by a professional 3D modeler; and they can do that without training and in seconds. And that same accessibility extends to all sorts of applications. -What's interesting here is that all of these panelists are taking us back to being children. Aren't you all doing the simplest things and getting a lot out of them, in a sense? -Simple works. -Yeah. -Yeah. -One of the things that we thought about is not just children, but the elderly. We wanna build a product that works for everybody. And basically, you boil it down to its most basic components. I love it actually. -Yeah. And if you watch a child, like an infant, interact with a computer, or even a TV, or anything actually, their first instinct is to reach out and grab it. So, it very much is going back to that most basic instinct, I think. -You know, one of the things I think about a lot is voice, which is not something that is necessarily integrated into the technologies that you're working on, but I have children and I watch them laboriously learn how to write their thoughts down, how to learn how to type. 
Do you think that there's a future in which that kind of spending years learning the mechanics of communication becomes less meaningful? -So, language is a part of who we, you know, humans are. Language is never gonna go away. But the question is, can computers keep up? This is a question that people have asked for decades. I mean, speech recognition technology is not new and it's still not great. You know, it's getting better, but it still doesn't feel natural yet. You still feel like you're talking to a machine. -Yup. -Now, if language is one of the main touch points that we all know how to use, the other one is, you know, visual. Vision is such an important part of all the human senses. You know, it's called the super sense in many cases. It's the one that we really zero in on, relate to, and get so much richness out of, like with television. Brian Tong has got a visual technology. I wanna go to him here. He's got the Vuzix Smart Glasses, the M100 smart glasses, which are an augmentation of a smartphone's interface. BT, tell us what these are. We can kind of see how you're wearing them there. -Yeah. So, think of this as a heads-up display that's right in front of my eye, probably a couple of inches away. And what it does is it links to an app on your smartphone. It enables you to interact with your smartphone through a variety of apps. This is, again, the Vuzix Smart Glasses M100. This is going to be different than Google Glass though, right? You saw the demos of how that was like a transparent heads-up display-- -Yeah. -where this is transmitting the video information from your phone, whether it be e-mail, photos, or movies, and then showing that in here. And I'm just gonna take this off really quickly to see if you guys can get a little peek of the screen. Hold on one second real quick. -Now, there's a tiny micro display in there. That's what we're looking at. 
There it is right there. Very hard to see, but that's replicating the smartphone display, right? -Exactly. And we have Matt Hallett over here from Vuzix. And we just wanna ask-- Mike, sorry about that. We just wanna ask you really quickly. You know, what do you see as the future of this specific technology moving forward? -We ultimately see it going to-- obviously a binocular, see-through technology, and we see this as replacing the cellphone one day. You don't need to have a cellphone. You don't need that screen that's in your pocket all the time. You have information access in front of your eye. It's gonna be a natural ability to have that information on the go, and you can have all that technology right at your fingertips. -Alright. -Yeah. It's different than Google Glass. It looks like it might be similar, but it's actually a more understandable leap because we already know what we're gonna see in it. Augmented reality is like another big jump that's almost-- it's the next step in many ways. -So, I have a question about how the smartphone and mobile technology have made it easier for you guys to work on the innovations that you're working on. And, you know, is that a bridge to devices with a future interface, or is that how we're gonna do it for a long time? -So, for us, it depends. When we think about digital health, we want our devices to really disappear into people's lifestyles. And to do that, we have to make devices nonintrusive, small, etc. But, you know, that runs counter to having an interface directly on the device. So, we think of the smartphone as providing a really rich interface on top of tiny sensors, and I think that's a very good combination. -Yeah. We put a lot of thought into this. So, thermostats today have very cumbersome interfaces; they have lots of data and lots of buttons. But we have this great interface in our hands with our phone. 
So, let's put the simplest things, the things that you need to use every day, on the actual display of the thermostat, and have all the deeper kind of analytic features, like how much energy you're using and how to program it, on the phone. Like, let's use the right user interface for the right job. -It's interesting. And so, Michael, your technology skips all that altogether, it seems like. I mean, does it render the smaller, you know, tablet-style device not so meaningful? -Well, I think there are probably 2 steps. So, immediately, a tablet that has Leap inside it could still have the same screen it has, but suddenly the interaction area can be bigger. Not just bigger, but also 3 dimensional. So, it's not just about this small 2D interaction surface that is the same size as the small screen. I'm interacting with my phone or tablet in this space between me and it, which opens up a lot of opportunity for new types of interaction, which is exciting. But we're also excited about enabling new form factors that wouldn't have been possible, like perhaps a head-mounted display where the user is interacting with augmented reality in their field of view, in their vision, without strapping a keyboard or a touchscreen to their arm or something. -Let me ask you guys a little about your longer-range plans. Leap Motion, as we've mostly seen it right now, is a module that works with other devices. The Nest thermostat, see it right there, is a product unto itself. Fitbit products are also products unto themselves. They're all discrete. What's the integration plan? Don't all of you want to be out of the hardware business at some point and be licensing your IP? Matt, you're shaking your head no. -Never, actually. -The thermostat goes on my wall 25 years from now. -So-- I think so, actually. So, there's a natural progression of technology, but there are still some points in our lives where we have expectations of how things should work, and those kinds of shifts take a long time. 
The thermostat has been in people's walls for a hundred years. -Yeah. -And we love being in the hardware business, and we get a lot of value that way. We can build something that's beautiful, something that's a work of art. -You like your hardware business. -I like being in it. It's one of our key differentiators. One of the, you know, key parts of our DNA is building something that's beautiful and easy to use, that you wanna have on your wall as opposed to something that you just wanna hide. -Matt, are there other Nest products in the pipeline? -We're always working on new stuff. You know, with this team, there are many things that we could do. -Mike, I got the impression that you were moving toward something much more of an integration-with-other-products strategy, right? -Yes. Both are going to play a big role in the future. We really like building hardware as well, and we like being able to create markets that didn't exist before. And obviously, we can move faster than big OEMs. So, we're gonna continue to build products and create new markets, but we also don't think that users should have to carry around a peripheral to interact with every device. So,-- -Yeah. -the reality is, if we want users to use Leap technology to control their smartphone or tablet, and we don't want them to carry a peripheral around, we have to work with OEMs and we have to work to embed the technology. So, both are in our future, and I think we can deliver great experiences. -And it would seem to me that the Fitbit almost wants to be part of my mobile device. Isn't it usually on me, and couldn't you build that into every phone, or maybe every phone and media player, 'coz people still don't carry their phone to the gym all the time? -Well, I think the notion of interface is tightly linked with the issue of usability and wearability as well. So, if you think about the smartphone, there's a lot of times throughout the day that it's not actually on your body. 
You know, it might be on your coffee table when you go home-- -When you're sleeping, right. -when you're sleeping-- -Unless you're really weird and you sleep with your smartphone in your pyjamas. -A lot of women actually can't-- -With your purse. -Yeah. Your purse, etc. -Well, right, yeah. -People-- You know, people don't have-- -You don't have pockets. -etc., so-- When you think deeply about the problem, you'll see that there are actually subtle, but important usability issues. -Yeah. -And I think, you know, again, the perfect combination is actually combining the smartphone with tiny sensors, using the smartphone as a rich interface-- -Yeah. -and using dedicated sensors as, you know, the collection mechanism. -So, as sensors get smaller and smaller, and probably over time cheaper and cheaper, is there a future in which we could have sensors in our clothes or, you know, in any little thing you might have on your body while you're exercising or walking around? -I think that, you know, that's ultimately where things will end up. I think, you know, when you think about embedding things into clothes, there are a lot of additional engineering challenges that you have to think about, but you know it will be-- -Fashion challenges. -Fashion challenges, but I think, you know, it will be exciting for everybody to try to figure all that out in the future. -It seems to me you all deal to some degree with the issue of signal noise in the environment, because you're all working with an air gap to some degree, if I can kinda put it that way. With Leap Motion, you know, here you are making motions that have to be intercepted across space. Things can interfere. The thermostat-- it's got a motion sensor on it, if you folks haven't seen how this works. It detects when people move by to get a gauge on motion in the house. The Fitbit, you know, has to filter out errant movements that aren't actually physical-- -Uh-huh. -calorie expenditure. 
How do you deal with signal noise when you're not hardwired in the traditional, you know, motherboard fashion that everyone else is? -I mean, a lot of it is just, you know, smart algorithms and analytics. You know, with our motion sensor, we collected a lot of data and over time basically learned how to build the best algorithm on top of it. So, take Auto-Away, the feature we use to basically turn down your house when you're not at home. We launched the feature and over the year actually honed it so that we learned your most likely occupancy patterns based on your motion. So, in the morning when you leave, chances are you're gone. Either you went to school, or to work, or shopping. You're gone for a while, and that morning period is actually the best time to turn down the temperature because it's cold. So, by basically collecting lots of data and, you know, performing analytics, we could figure out the best patterns for that sensor. -Michael, how do you filter out someone next to me who is picking up a can of Coke while I'm trying to do my thing on my laptop? -Well, it goes back to the core philosophy about interaction. So, if Leap was about these binary gestures, where I make a sign with my hands and then something that's predetermined happens-- -This is kind of what the gesture-control TVs were doing, right? -Right. -Pretty simple. -Right. When people think about gesture, that's what they think about. And we don't like that as much. We think that people don't really want to learn sign language to control their computers. So, what we're pushing with Leap is the idea of interacting with something where there's context. There's an object and you're interacting with it in a physical way. So, we can ignore any noise that isn't related to that context. 
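As an aside, the context-based rejection Michael describes can be illustrated with a toy sketch. This is not Leap Motion's actual pipeline, which is proprietary; the interaction volume, units, and names below are all invented for illustration. The idea is simply that tracked points outside the volume an application cares about are discarded before they can trigger anything.

```python
from dataclasses import dataclass

@dataclass
class InteractionVolume:
    """Axis-aligned box (hypothetical units) that an app declares as
    its active context, e.g. the space where a virtual harp sits."""
    xmin: float; xmax: float
    ymin: float; ymax: float
    zmin: float; zmax: float

    def contains(self, point):
        x, y, z = point
        return (self.xmin <= x <= self.xmax and
                self.ymin <= y <= self.ymax and
                self.zmin <= z <= self.zmax)

def reject_out_of_context(points, volume):
    """Drop any tracked point outside the declared context, so a
    bystander's hand far from the 'harp' never reaches the app."""
    return [p for p in points if volume.contains(p)]

harp_zone = InteractionVolume(-100, 100, 0, 200, -50, 50)
tracked = [
    (0, 120, 10),      # player's fingertip, inside the harp zone
    (400, 300, -200),  # someone else's hand, well outside it
]
print(reject_out_of_context(tracked, harp_zone))  # [(0, 120, 10)]
```

Richer tracking data makes this kind of filtering easier, not harder: the more precisely each point is located, the more confidently out-of-context motion can be thrown away.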
So, if someone is sitting here and they're playing a harp in the air with their fingers, and you're moving your hand behind me and it's not anywhere near the harp, we can ignore you very easily. -Because the motions that you're reading, and the way that you're doing it, are so rich, it's easier to reject things that don't matter, right? I mean, if I'm doing a simple motion on a gesture TV to turn the volume up and down or something and the cat jumps down from a bookshelf, that could be mistaken. It's the same thing, right? -Exactly. A gesture-- -What you're doing is much richer and easier to distinguish. -Absolutely. All a gesture TV is looking for is: is there an up or down movement? Yes or no. We're looking at how you are interacting with this specific contextual situation, and it's actually counterintuitive, but because we have so much more data and more accuracy, we're actually much more resistant to noise. -Interesting. Let's go back to Brian Tong. I think you are gonna give us a look into your brain. This terrifies me, but let's do this. -We'll see how much activity we have there. I think it's a lot. -More than I wanna know. -Alright. So, I'm here with Ariel. She's from Muse with this brain-sensing headband. Tell us a little bit about this headband. -So, this headband has sensors on my forehead and behind the ears, and it allows me to interact with content on my smartphone or tablet. -So, this is an app right here, right, that you have. And when there is brain activity, you'll start seeing the peaks from the activity. Is that correct? -Yup. So, what you're seeing here is actually my brainwaves. This is what it looks like inside my head. I'll blink for you. Blink, blink, blink. So you see that it's live and it's me. -I tried that demo and it didn't work, for apparent reasons. So, where do you see the future of this technology moving for you? -So, brain-sensing technology has lots of advantages. 
The first one is that you can actually see what's going on in your own head and then do exercises to improve your cognitive functioning and reduce stress. So, this is the first time we've been able to, on a personal level, see inside our own mind and use that information in some way to make our lives better. Ultimately, this is gonna be another control interface, a way to control your home electronic systems, your devices, etc. That's in the future, but right now it's about being able to understand and improve the self. -And also, you know, it is a fashion statement. You guys were talking about wearable tech, right? Well-- -Okay. -That's how I roll. -Yeeha, oh! -Of course. -Bad ass. -You're not getting-- -You like that eyebrow action, all you tech lovers out there? -Yeah. I'm not getting any brainwaves, but I'm getting some great temple waves. -I should probably stop that. Shouldn't I, Cooley? -Yeah. Yeah. You're making me nuts. You guys have got a party. I'm not gonna invite everyone to InteraXon's party, but explain to us how the headband is gonna be used to operate the beer keg. Here's a great story. -We're at a CES VIP party tonight, and people are gonna be able to pour beer with their mind. So, if they focus on the beer tap, it pours the beer. When you're done, you just clench your teeth or sort of make a gesture like this. Stops pouring. Walk away. Drink. -That's what I do in the bar anyway. -Pretty impressive. -So, it works. -That's crazy. I love that. I was kind of wondering if any of you have ideas about really unique uses that you've heard from your users. You know, stories you've heard of, wild things that people have done using your products. -Yeah. 
We had a user last summer, I think it was in Arizona, who sent us an e-mail about this. There was a big forest fire in his region, and he didn't know if his house was okay, if it had burned down, or what was going on inside, but he was able to open up his app and see, oh, my house is fine, it's not too hot inside, my house isn't on fire, it's okay. -Interesting. Yeah. It didn't say 134 degrees or something, like it would if there was a fire in your house. -It would have been down or something-- You know, something would have been different. -Yeah. Yeah. Interesting. James? -For us, you know, our device allows people to track their sleep activity. And you know, we had one family where the parents actually used that. They gave Fitbits to their kids and used the sleep tracking as a mode of competition. So, the kids would compete to see who would get to sleep the quickest and who would wake up the fewest number of times throughout the night. -That is-- I can say as a parent, that is genius. That's amazing. I was like, I gotta-- -Yeah. -That's really great. -Michael, what have people done with Leap Motion that made you go, never thought of that? -Well, I don't wanna spoil any of our developers' activities, but there's a ton of things that people have expressed they've been doing, and it's been really amazing to see the diversity of it. I think the thing that's most exciting and surprising to me is just how every person is passionate about using this for something else. I think there is something that people feel they have been missing from their experiences with computers, and it's just very different from person to person. So, some people are saying, I'm really excited that I will be able to, for the first time, edit video, edit audio. 
Others are saying, I'm really excited that for the first time I am going to be able to interact with all of the complexity of my social networking data in this really intuitive, powerful way. And then there are, of course, tons of great gaming things in the works. -And I'd like you to also explain a little bit, or illustrate for us, the resolution of what you're capturing, 'coz we're all used to touchscreens where the resolution is not great. Trying to type on a touchscreen frustrates everyone. If you're a little bit off, it's too stupid to figure it out. And we're talking about coarse versus fine gestures. How highly resolved is your ability to know-- if I move my finger just the tiniest bit, you can hardly see it, can you pick that up? -We definitely can. Our accuracy is about a hundredth of a millimeter. And I know that that sounds excessive, but it's actually not. The goal with Leap is for you to feel like your arm is an extension of what's happening on the computer. And in order to actually feel that way, and to feel like this is a fundamentally better way of controlling and not just a cool gadget, it has to be extremely accurate, but also very responsive. The latency is almost as important as the accuracy. So-- -Right. It is, isn't it? -Yeah. -I'm sure with tablets a lot of the future innovation will be about the latency of the touch response, as opposed to even the sensing of the touch response or the pixels in the display. -Yes. That's one of the things we had to get right very early on with the iPhone. It had to be super, super fast. So,-- -Yeah. -it's one of the reasons why, when you swipe left to right, it goes left to right immediately. That latency is super critical. -Yeah. -The brain is very good at picking up that difference in latency. -Knowing that you're doing something synthetic and going, okay, this is lagging. -Yes. -Yeah. And you see touchscreen controller vendors competing on scan rates-- -Definitely. -basically. -Yes. 
-Well, that's one of the things we're feeling when we're using a touchscreen. It's latency even more than accuracy. Okay, the feedback loop is kinda off there. Let me ask you guys about getting the word out to customers. I know you're at various stages in your customer-facing life as companies. What works, or how do you get people to think differently about controlling things, when in their minds the keyboard, mouse, and touchscreen are very good, 'coz they've not known anything else and it works great? How do you get them to sample, to think differently about that? -So, for us, I mean, we're in very broad retail today. -Yeah. -So, you could walk into your local Lowe's, Best Buy, or Apple Store and see the product, and a lot of the time just try it. And the best sales tool for the product is actually using it. You know, using it in person. -Yeah. -And as we've grown and gotten more product out in the field, when people walk into someone's house and they see it on the wall, they ask, oh, what is that? -Yeah. -And that's the best sales tool. You know, seeing the product and using it in action. -Yeah. Let me see your thermostat. You've got one here. It's charged up. It has a little battery. I'm gonna get Lindsey's thoughts on this too. In case you guys haven't seen this, one of the reasons I bought the Nest thermostat, which is $249-- -249. -249, so, you know, not a cheap date as thermostats go. Of all the things I loved about it: being connected. You can operate the thing from your smartphone or computer. That's great, and of course all the intelligence and the great interface they've got in this thing. But being able to turn the temperature just like one of those ancient Honeywell, you know, dial thermostats that cost 19 bucks down at the hardware store, that was like, that's what I want, because I had one of those push-button thermostats for years. When you go from 62 to 69 degrees, it's 7 pushes, plus the 3 it doesn't register. 
-Going back to interface design, that's what we built it around. That's the most natural thing you wanna do when you control your temperature. It's in our brains. -So Lindsey, what got you into the Nest? Because this is our little mini focus group of what got us into different interfaces. -Well, I reviewed it for CNET. So, that got me. But I loved it. I mean, I became a convert, and people who review technology products are very cynical, right? And it was as easy to install as-- I mean, I did it on camera. Well, I like the way it looks. I like how easy it is, but I also really like the ability to change it remotely. Other thermostats have this functionality, but I like-- that's my house by the way. -Right. -That's my wall. -That's her hand. -That's my hand. I like the ability to program it from afar. And I don't do that often, but when I need to do it, it's really nice to be able to do it. So, you know, coming home from a ski trip when you're freezing and it's winter and you wanna turn on the thermostat, or also being able to turn it on from bed. That's nice. -Right. -Right. -So, we were talking about this earlier, and our question was, right now with Nest you can have 1 or 2 thermostats to work on different zones, right? But it seems like there should be a future in which there's a much more nuanced interaction than just when I walk in the front door. -Absolutely. And you start to see this with, you know, the car industry starting to be a bit more location aware. You know, did your car just enter the driveway? You know, look at the Ford booth and some of the interesting things they're doing there with the connected car. So, starting to think about, you know, how do all these things work together, and is there a possibility that your thermostat learns from the car? I mean, it's crazy, right? -Right. You're half a mile from home. You're headed in that direction. You're usually headed in that direction at that time of day; you're coming home from work. -Right. 
-Turn on the thermostat. -Exactly. -I'm gonna ask James a question here. Fitbit is arguably the nearest thing to a household word, I think, in this category of wearable fitness tech. I mean, this has been such an explosion here. You said what? Your area has doubled since last CES, you'd say? -Yes. -Yeah. This is a really big bump this year for the wearable fitness space. What is the next thing that you've gotta get done to really make it mainstream, not just part of a panel about the cutting edge? How do you make that tip over? -You know, I think it's a lot about consumer awareness. And there's a lot of different ways that we do that. So, one is, you know, there's a lot of retailers that actually, you know, carry products in this category. When we started the company in 2007, I don't think there was a major consumer electronics retailer that carried digital sports tracking devices, other than, you know, Garmins and Polars for the very high end. So, in the past 2 years, you know, retailers like Best Buy, Apple, Target, REI, these mainstream retailers have, you know, opened up this category, and I think that's really raised awareness of how these devices can help you. So, I think retailers helped, the media helped, you know. Consumer health is always top of mind for everybody, and especially with these tools, people can take control of their own health, which I think is pretty powerful. -Which one are you wearing right now? -I'm wearing the Fitbit Flex. -Okay. That's the new one. Hold that up and show the lights that indicate things to you, and then our camera can zoom in and get it. Hold it steady right there. We see the LEDs are moving. What do they tell us? -So, basically it's progress towards my goal. So, you can set a step goal. I set something pretty achievable for CES. It's about 10,000, and I think I smashed that by midday. -Yeah, it's easy to-- -Yeah. So, the display on the device gives you very quick feedback throughout the day. 
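The progress lights James describes amount to a simple quantizer from step count to lit LEDs. The Flex's real firmware logic isn't public, so the sketch below is only a guessed illustration: the 5-LED count, the flooring behavior, and the function name are all assumptions.

```python
def leds_lit(steps, goal=10000, num_leds=5):
    """Map progress toward a daily step goal onto a row of LEDs.

    Illustrative only; the actual Fitbit Flex behavior may differ.
    """
    if goal <= 0:
        raise ValueError("goal must be positive")
    fraction = min(steps / goal, 1.0)  # cap at 100% once the goal is hit
    return int(fraction * num_leds)    # floor: one light per full 20%

for steps in (0, 2500, 5000, 10000, 15000):
    print(steps, "steps ->", leds_lit(steps), "LEDs")
```

Glancing at a row of lights instead of a number is the same friction-removal idea James mentions: coarse feedback on the wrist, detail deferred to the phone app.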
And if you want something more, you actually fire up your smartphone, and the device has been syncing in the background, so you launch the app and the data is already there. You could see your past history, your graphs, etc. And on Android phones, there's actually an NFC tag embedded in the device. So, all you have to do with certain Android phones is just tap the phone to the band and the app will automatically launch. So, we're all about removing friction in the process. -So, this tracks your activity. -Uh-huh -And what else? -So, steps, calories, distance, your sleep quality, so how long it took you to fall asleep, how many times you woke up. There's a haptic motor in there for waking you up gently in the morning. It also has a Bluetooth 4.0 radio. So, it's always syncing constantly in the background. So, it's all about, you know, how do we create an ambient device that collects meaningful metrics about your daily lifestyle, and on mobile apps and the web provide tools to actually help you reach your goals. -We'll come back in a second. I wanna ask you, Michael, about what will be the breakthrough for most people for your technology. What do you think is gonna be the ah-ha that might be the buzzworthy "hey, have you tried this thing?" But first, this next demo from BT here; this is the Eye Tribe technology. We mentioned this briefly at the opening. Using eye scanning to operate the mouse or maybe other parts of the interface on a piece of technology. Tell us what you're doing-- not that you're ignoring us, watching a movie. -I'm not at all, Brian. So, what this is is the Eye Tribe tech [unk]. What it has here is initially you kind of calibrate. There's a camera on the bottom with LEDs that detect where my eyes are. We're gonna show you an image of the screen in a second so you can see what it looks like, okay. So, this is where we have-- you can see my 2 eyeballs on the right-hand side. -Yeah. -Okay. 
So, what I'm gonna do is I'm gonna swipe into an app. Once it calibrates that and knows where my eyes are, I'm gonna jump into a game like Fruit Ninja, because you know I'm a ninja, but this time I'm a ninja with my eyes. So, we're gonna get to there. And all I have to do is I actually have to look at the fruit. Look at that, Cooley. You like that? -Seriously? Wow. -Lindsey, you got some of that? Pineapple, kiwi, mango, lime, pineapple, kiwi, lemon, mango. So, you're actually using your eyes as the interface to move right now, right? This is set to move the cursor around as if it was your watermelon. And oh, there you go. -Okay. It's not voice technology? -No, no, no. I'm just saying it-- -You called fruit-- -I'm getting in the game, baby. I'm in. Ooh, look at that-- Look at that fruit. Uh-oh. Uh-oh. You're distracting me, but that-- You know, it's an example of what we can see here, and we're here with Martin Tall, which is a very appropriate name here, right Martin? -Yeah, and that's for probably 6 more. -And Martin, just tell us about the technology and where you guys see it going. -Okay. So, what we're showing here today is an add-on device for your tablet where you can actually control the interface directly, sort of replacing a touch or a mouse or something like that, but we see it being used all across the board, so just making the device a lot more aware of what your brain is actually focusing on. A lot of the cortex is actually dedicated to the visual system. It gives away a lot about your attention and what your motives are, where your interests lie. And I think this technology is enabling much more intelligent computers. They're not just sitting around and waiting for input anymore. They will know before you even know. -And then where do you expect to see this-- you know, this technology integrated? You're hoping to put it in another product in the future, right? -Right. 
So, it's gonna be desktops, laptops, tablets, smartphones, in cars to see if you're falling asleep, if you're paying attention and all that sort of stuff. So, it's gonna come into a range of products and it's gonna happen soon too. -Right. Excellent. Alright, back to you guys. -Alright. Now, BT, I have a question. When you were doing that, you were simply moving the cursor with your eyes, and because of the way Fruit Ninja works, you didn't need to have any click command through your eyes, right? There's no click enabled at this point. -That's correct. I mean, if you looked at where my eyes were pointing, it was kind of set within this game demo that kind of has a little-- a movement of a swipe already. It was doing little baby cuts. -Yeah. -So, whenever my eyes moved to that point, it was chopping up all that fruit. -Alright. -You're jealous, aren't you? -We're always jealous of you. Michael, where we left off was: what do you think-- can you envision a time in the fairly near future where people are going to find one thing they kind of congeal around first and start to buzz about around technology like what you're doing, and say, okay, now I get it because I've tried X through it? -So, I think there are 2 things. I know that violates the spirit of the question, but one is that the moment someone gets a Leap and they connect it to their computer, they can use it to control Windows 8 or Mac OS X like it was a touchscreen. And even for legacy 2D interfaces, Leap can be a better way of controlling them than a touchscreen, because you don't have to move one to one like you do with a touchscreen. If I wanna move from this corner to that corner, I don't have to drag and I don't have to lean forward, which is tiring. So, I can sit back comfortably, move my finger a small distance, cover the entire screen, but it's still a direct experience. So, we think that will have a lot of mass market appeal, but it's also-- it's about the developers. 
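[Editor's note: the non-one-to-one control Michael describes boils down to mapping a small physical "active box" in front of the sensor onto the entire screen, so a few centimeters of finger travel moves the cursor edge to edge. Here's a minimal sketch of that idea in Python; the function name, box dimensions, and screen size are illustrative assumptions, not Leap Motion's actual API.]

```python
def finger_to_cursor(x_mm, y_mm, box_w=120.0, box_h=80.0,
                     screen_w=1920, screen_h=1080):
    """Map a fingertip position (in mm, centered on the sensor) to pixels.

    A small physical box (box_w x box_h mm) covers the whole screen,
    unlike a touchscreen, where cursor travel equals finger travel.
    """
    # Normalize into [0, 1] within the active box, clamping at the edges.
    nx = min(max((x_mm + box_w / 2) / box_w, 0.0), 1.0)
    ny = min(max((y_mm + box_h / 2) / box_h, 0.0), 1.0)
    # Scale to screen coordinates (origin top-left, y growing downward).
    return int(nx * (screen_w - 1)), int(ny * (screen_h - 1))
```

With these assumed numbers, a finger at the sensor's center lands mid-screen, and moving just 6 cm to the right pins the cursor to the right edge; no drag, no reaching across a display.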
So, the developer community is the single most important thing for us. We have over 40,000 developers from 150 countries that have applied to be part of the Leap Developer Program, and the things that they create, a lot of which have a lot of mass market appeal, those are built to work great with Leap. And those are the first things where people are going to use it and say, wow, this is something that I never could have imagined doing before. -You know, I'm getting this vision of your army of editors, how they spend so much time hunched over, reaching, like Michael says, for a device they've gotta pull, and we're dealing with ergo all the time like every company, getting in the right position so your neck is not stiff and all that. If we could just kind of sit back-- I'm now making a pitch, so you go, you know, install 2,000 of these tomorrow, theoretically-- -I'm gonna join the developer program. That's it. -It's a regimen already. It's a huge ergo breakthrough, it would seem, just as a way to get your body in the right position and not worry about, am I also in the right position to access the interface-- -It also seems like-- -for one thing. -It's a potentially great replacement for something like a remote control, where you have to have a small but kind of not interesting device that sits around your living room, instead of sitting, you know, across the room and interacting with something on the other side of the room. -Yeah. Definitely. Definitely. We-- Obviously, you can use Leap near a computer to control that computer. You can also use it to control something that's far away. So, we imagine these cases like a conference room where everyone has a Leap in front of them and everyone is interacting with the same screen. -That's so-- There's potential collaboration as well. -Absolutely. -Now, let's talk about the data that comes out of what all you-- your companies are doing. 
We've got so much data from the way people interact with digital media today. It's been an amazing groundswell in the last, let's say, 15 years generously. But if we look at the frontiers that are still untilled in the user data we have in digital media, it's this stuff. It's body and gesture. We have no insight into that. All the data we have about users on the web, we don't know anything about where their arm was, or what their body position was, or what their sort of biorhythm state was when they saw something. This is a-- This is like unlocking a new area of data, isn't it? -Absolutely. I look forward to the day where I don't need to go to the doctor anymore, but instead I'm told by my iPhone or by my-- by a computer, hey, we noticed something wrong. You're walking differently than before. You might wanna go see somebody, and here's what you should tell them it is, and here's what you give them: all the data for the last 3 months, giving them a history. -Say you're putting an extra 8 pounds of pressure on your left hip. What's wrong? -I am afraid of being nagged. I-- You know, did you guys hear about the haptic feedback fork that-- -Yes. -on the-- It vibrates when you're eating too quickly. I think I would just be like, you know what, I'm putting you in the dishwasher. -So, technology, again, is there to bolster how humans already live. So, it should never be, you know, too obtrusive. Instead of having diagnostic medicine as it is today, where you do a bunch of tests when you aren't feeling well, you'd have diagnostic medicine happening through your entire life. -I think the key thing to it is, you know, if you watch the show House M.D., you know, House, his favorite line is, you know, people always lie. But if you are collecting this data and you do present it to your doctor, I mean, I think it's quite informative for the doctor to see how you have actually been living your life between doctor visits. 
I mean, what he's doing today is trying to make a quick judgment based on observing you for maybe 10, 15 minutes-- -And asking questions. -but-- asking you questions, but, you know, if he were supplemented by this data, and possibly automated intelligence on top of that data, I think that would be a huge breakthrough. -And we all have such huge biases anyway about what we think we've done that led to ouch or whatever. We don't really know how we are. I have a question for you, James, about the-- Fitbit fits into a space, I think. Tell me if I'm crazy. That is, we're moving toward the 401K-fication of health care, where we have to become partners and be co-responsible for our health, as we now are for our retirement over the last, you know, 30, 40 years since that kicked in. Is that a trend that you think is happening? -I think so. I mean, you know, we sell a lot of devices to employers as part of their corporate wellness programs and I think they're-- I hate to call it a partnership, but I think, you know, it's good for both the employee-- -Spying. -Yeah. It's good for both the employee and the company for you to be healthy. So, I would hate to see punitive measures ever taken, but I think right now it's all, you know, everyone working together often and, you know, let's get healthy. -Cool. Alright. Any final questions or anything that we've left out? What have we left out? What have you thought: why didn't they ask this, why didn't they ask that? -So, we get asked this a lot. Why don't you add this feature? Why doesn't it have a clock? All these kinds of things. And part of what makes building a great device and building a great interface tough is dealing with all the noise, and actually having the discipline to keep it simple. And that's something to keep in mind, you know, for all of us when looking at this-- -An attribute that serves you well. -Yes. Literally. Yeah. 
I mean, looking at all this next-generation technology, keeping it really simple is hard to do, and that's something we need to do in order to really-- to drive it to the mass market. -Yeah. Michael, final thoughts? -Well, I think that in our case it goes back to the importance of the developer community. So, we are very actively seeking new applications from developers of all kinds, and we will be continuing to send out thousands of Leap devices between now and when we actually start shipping the product. -James, last thoughts to leave us with? -For us, you know, I think what we're gonna see in digital health is, you know, smaller and smaller devices collecting more and more meaningful data types. And I think it's a really exciting time because, you know, it's really in the past few years that technology has gotten to the point where devices like the Fitbit have been possible. So, I'm really excited. -Uh-huh. Yeah. Okay. Good stuff. I wanna thank everybody for joining us for this panel. This has been-- You know, I know we like to think as an editorial organization that all of our children are beautiful, all of our projects, and panels, and pieces. This is my favorite so far at this show. So, thanks everyone for being here. Please join me in thanking our guests if you will. James Park from Fitbit, Matt Rogers from Nest, and Michael Buckwald from Leap Motion. Great future thinkers. -Thank you so much for coming. -And of course, thanks to BT and the companies that have joined him there: Vuzix; InteraXon with the Muse brainwave-sensing technology; and Eye Tribe, the eye tracking and scanning technology. More good stuff. How about a hand for them? Great innovators. Great innovators. On behalf of myself and editor-in-chief Lindsey Turrentine, thank you for being with us, and our live coverage of CES continues here at ces.cnet.com.