HERB, short for Home Exploring Robot Butler, is a joint project between Intel Labs and Carnegie Mellon University that has been going on since 2006. HERB is a robot fashioned out of a Segway that is intended to be used in the homes of the elderly or disabled.
The robot knows how to do things like pick up bottles and put them in the recycle bin or hand items to people, but not because it's been taught each of these tasks individually. Rather, HERB is a learning, autonomous robot.
HERB "sees" by building 3D models of objects with a laser scanner mounted on its front. It identifies objects--like humans, bottles, and recycle bins--and executes tasks based on what its onboard computer knows about those items.
Project Oasis uses a depth camera to interpret real world objects, like the steak and bell pepper pictured at left, and create 3D representations of them without the need for special sensors or bar codes.
The depth camera can pick up the steak on a regular kitchen counter, subtract the background, and project interactive "buttons" onto the counter next to the object it recognizes. Tapping a virtual button brings up menu options like recipes, shopping lists and more.
The idea is to create a "smart" space that doesn't require a lab or a setup of many sensors and can be translated to real-world situations, like cooking.
An up close shot of Project Oasis. The system can recognize the two objects--steak and bell pepper--and automatically generate recipes that contain the two items in a virtual menu projected on the kitchen counter.
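The background-subtraction step Oasis performs can be illustrated with a minimal sketch. This is not Intel's actual pipeline; it simply assumes depth frames arrive as arrays of millimeter distances, and flags pixels where the current frame differs from a stored image of the empty counter:

```python
import numpy as np

def detect_objects(depth_frame, background, threshold_mm=30):
    """Flag pixels where the current depth differs from the empty-counter
    background by more than the threshold, i.e., a new object is present.
    Cast to signed ints first so the subtraction can't underflow."""
    diff = np.abs(depth_frame.astype(np.int32) - background.astype(np.int32))
    return diff > threshold_mm

# Empty counter at a uniform 800 mm; a hypothetical "steak" region 40 mm closer.
background = np.full((4, 4), 800, dtype=np.uint16)
frame = background.copy()
frame[1:3, 1:3] = 760
mask = detect_objects(frame, background)  # True only over the steak region
```

A real system would then match the masked region against known 3D object models before deciding which button to project.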
How do you make a smart device that works regardless of location? That's what Intel is looking at with Project Portico. Research based in the Seattle lab is looking at useful applications of both physical objects and gesture recognition.
Small cameras attached to a tablet PC can pick up not only what a person is doing on the touch-sensitive screen of the tablet, but what he or she is doing near the tablet. The cameras are basically enlarging the surface area of the tablet.
The result is that "if your device is aware of the things around you, it could teach you how to julienne carrots or how to change your carburetor," said Anthony LaMarca, a principal engineer on the project.
The Holodeck Car project has been in the works for about six months as a joint research endeavor with the University of South Australia. The Holodeck Car creates a 3D projection of a full-size car, allowing engineers to fine-tune the vehicle's fuel efficiency and the placement of different parts before manufacturing a single model.
The researchers in Australia project the image of the car onto a life-size cardboard model and place it in a wind tunnel, where they can streamline the body and move side mirrors and headlights around to find optimal fuel efficiency, said Pat McCulley, a systems analyst at Intel. The aim is to cut costs and eliminate the waste of building real-world models.
Intel is working on technology that allows people to listen to text instead of reading it. It can take text from the Web and automatically summarize it on the fly. It is then converted from text to speech and can be listened to on any device.
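The summarize-then-speak pipeline described above can be sketched with a toy extractive summarizer. This is only an illustration under simple assumptions, not Intel's method: it scores each sentence by the frequency of its words across the whole text and keeps the top-scoring sentences, which a text-to-speech engine could then read aloud:

```python
import re
from collections import Counter

def summarize(text, n_sentences=2):
    """Naive extractive summary: score each sentence by the corpus-wide
    frequency of its words, keep the top scorers in original order."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    freq = Counter(re.findall(r'[a-z]+', text.lower()))
    ranked = sorted(
        range(len(sentences)),
        key=lambda i: -sum(freq[w] for w in re.findall(r'[a-z]+', sentences[i].lower())),
    )
    keep = sorted(ranked[:n_sentences])  # restore reading order
    return ' '.join(sentences[i] for i in keep)
```

Production summarizers weigh far more than raw word frequency, but the shape of the pipeline--summarize first, then hand the shortened text to any speech synthesizer on any device--is the same.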
Intel says it's working on establishing the infrastructure that will allow cars to be constantly connected to the cloud. That means your car could do things like send you a text when someone tries to break in, or take instant video that you can stream to your mobile device if the car gets into an accident, Vu Nguyen, a technology evangelist for Intel, explained Wednesday.
Having this infrastructure in place would also mean you could turn your car on remotely and set the temperature before you even leave your house or office.
The technology assumes your car is connected to a 3G or 4G network, which would require the same kind of subscription wireless service you use with your cell phone or laptop.
Intel demonstrated a similar idea last year at Intel Research Day, but with an emphasis on streaming entertainment content like music and photos to cars. This year the researchers emphasized vehicle safety and convenience.
Intel is currently in the process of patenting technology for transferring data between light sources. It's called visible light positioning and could be used in a variety of ways. At Research Day, Intel researchers showed how cars equipped with LED headlights and taillights, paired with tiny camera sensors, can use light to detect the distance between vehicles.
Car headlights, using light like the red LED shown at left, can determine, in inches, how far away another car is ahead or behind, and project that information onto a driver's display.
"It means you can see cars in your blindspot," said Vu Nguyen, Intel technology evangelist.
Right now few cars have LED (light-emitting diode) headlights, but more will come, he said. Incandescent bulb headlights won't work because their light can't be modulated at different frequencies the way LEDs can.
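The modulation that LEDs make possible is the key difference: an LED can switch on and off far faster than the eye can perceive, so its light can carry data. A minimal sketch of on-off keying, the simplest such scheme (an illustration of the general principle, not Intel's patented method):

```python
def ook_modulate(bits, samples_per_bit=4):
    """On-off keying: represent each bit as a burst of light (1)
    or darkness (0) lasting several samples of the LED's output."""
    return [level for bit in bits for level in [bit] * samples_per_bit]

def ook_demodulate(signal, samples_per_bit=4):
    """Recover bits from a camera sensor's samples by majority vote
    over each bit period, tolerating a few corrupted samples."""
    bits = []
    for i in range(0, len(signal), samples_per_bit):
        chunk = signal[i:i + samples_per_bit]
        bits.append(1 if sum(chunk) * 2 >= len(chunk) else 0)
    return bits
```

An incandescent filament heats and cools too slowly to toggle at these rates, which is why the technique depends on LEDs.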
The Computer Vision & Augmented Education project is similar to Project Portico. Using cameras, the system tracks objects near a Classmate PC--Intel's Netbook for kids--or other laptops. That lets young children's lessons involving physical objects, like counting change, be represented both in reality and virtually on screen. Teachers can then track and customize students' lessons on the Classmate PC.
SENS, or Socially ENabled Services, puts sensors into a new kind of portable device. The sensors track a person's location and context to feed information to a "shadow avatar." Intel calls it "social augmented reality."
The Meeting Diarist project is probably the dream of anyone who's ever had to record the minutes of a meeting.
The technology allows researchers to record a meeting and then search the recording afterward for any spoken word, or filter by who was speaking at the time. Faster multicore CPUs provide the computing power to do this quickly and efficiently, Intel says.
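Once speech recognition and speaker identification have turned the recording into labeled transcript segments, the search itself is straightforward. A minimal sketch, assuming a hypothetical segment format with speaker labels and timestamps (not the Meeting Diarist's actual data model):

```python
from dataclasses import dataclass

@dataclass
class Segment:
    speaker: str   # identified speaker for this stretch of audio
    start_s: float # offset into the recording, in seconds
    text: str      # recognized speech

def search(segments, word=None, speaker=None):
    """Return segments matching a spoken word and/or a speaker name;
    either filter may be omitted."""
    hits = []
    for seg in segments:
        if word and word.lower() not in seg.text.lower():
            continue
        if speaker and seg.speaker != speaker:
            continue
        hits.append(seg)
    return hits

transcript = [
    Segment("Alice", 0.0, "Let's review the budget first."),
    Segment("Bob", 12.5, "The budget looks fine to me."),
    Segment("Alice", 20.0, "Then we can move to hiring."),
]
```

The hard part, and where the multicore horsepower goes, is producing those labeled segments from raw audio in the first place; querying them is cheap.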