So let's take a look at how the portability of experiences enabled in Windows 10 can change the way makers like you interact with the physical things that you build.
And to show us how to mix holograms with IoT, please help me welcome on stage my very good friend, Nico.
I'd like you to meet B15.
This year, millions of people will use maker kits like this one to enter robot competitions and learn about electronics. Many of them will use the Raspberry Pi 2, which now supports Windows 10 IoT Core.
We were inspired to see what we could do with the same hardware and HoloLens.
So let me show you.
B-15, wake up.
So hello to the real B-15.
Because every Windows 10 device has APIs for human and environment understanding, we were able to overlay a holographic robot on top of a physical one.
And one thing we noticed while working on this is that you need your data to be easily accessible.
But adding displays to this frame increases your cost and there's no way to see all that information at once.
So let me show you how we check on our data.
B-15, control panel.
This spatial UI system is a Universal Windows app.
I'm pulling the data from B-15, making it as big as I need it to be, and placing it where it's relevant in my world.
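The transcript doesn't show how that data travels from the robot to the headset, but conceptually the headset just needs a serializable snapshot of the robot's state. Here is a minimal Python sketch, with a hypothetical `Telemetry` record and JSON as a stand-in wire format (the demo's real protocol isn't shown):

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Telemetry:
    """Hypothetical snapshot of the robot's state (field names are illustrative)."""
    battery_pct: float
    wheel_rpm: int
    heading_deg: float

def encode(t: Telemetry) -> bytes:
    # Serialize the snapshot for transport to the headset.
    return json.dumps(asdict(t)).encode("utf-8")

def decode(raw: bytes) -> Telemetry:
    # Rebuild the snapshot on the receiving side.
    return Telemetry(**json.loads(raw.decode("utf-8")))
```

Once the headset has a decoded snapshot, rendering it as a resizable holographic panel is purely a display concern.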
And I love that Nico didn't have to actually go back to her desk to do it.
This is a great example of how Windows 10 makes your digital life more powerful by connecting it with your real life.
That's right; you can waste a lot of time tweaking variables on another device before you test.
But if all I need to do is something simple, like change an LED color, I can use a control scheme like this.
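A holographic control like that ultimately just sends a small command message to the robot. A minimal sketch, with a hypothetical `set_led_command` helper and JSON payload standing in for whatever protocol the demo actually used:

```python
import json

def set_led_command(r: int, g: int, b: int) -> str:
    """Build a hypothetical 'set LED color' command message (illustrative protocol)."""
    for v in (r, g, b):
        if not 0 <= v <= 255:
            raise ValueError("RGB components must be in 0..255")
    return json.dumps({"cmd": "set_led", "rgb": [r, g, b]})
```

On the robot side, a small listener would parse the message and drive the LED's GPIO pins accordingly.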
So that's how we envision controls and displays evolving with holograms.
Now, robots also need to be able to find safe paths through their workspaces.
Let's take a look at how Nico can help B-15 navigate and how holograms can help.
So B-15's sensors aren't great at detecting obstacles.
But the cool thing is they don't have to be.
My HoloLens can communicate that information.
B-15, path-finding mode.
Now, with Windows Holographic, we can scan an environment.
We can queue up movement tasks and visualize the robot's path.
So all I have to do is air tap the ground.
My HoloLens interprets the 3D points and tells B-15 when to turn and when to move forward.
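The step just described, turning tapped ground points into turn-and-move commands, can be sketched by treating the taps as 2D floor waypoints and converting each leg into a relative turn angle plus a forward distance. This is an illustrative Python sketch, not the demo's actual code:

```python
import math

def waypoints_to_commands(points, start_heading_deg=0.0):
    """Convert tapped (x, y) floor points into (turn_deg, forward_dist) pairs.

    Positive turn_deg is counterclockwise; the first point is the robot's
    current position. All names here are illustrative.
    """
    commands = []
    heading = start_heading_deg
    x, y = points[0]
    for nx, ny in points[1:]:
        target = math.degrees(math.atan2(ny - y, nx - x))
        # Normalize to the shortest turn in (-180, 180].
        turn = (target - heading + 180) % 360 - 180
        dist = math.hypot(nx - x, ny - y)
        commands.append((round(turn, 1), round(dist, 2)))
        heading, x, y = target, nx, ny
    return commands
```

For example, tapping one meter ahead and then one meter to the left yields a straight move followed by a 90-degree turn and another move.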
Now this is a great example of how portable our Windows experiences can be.
B-15 didn't have to know its environment; the room understanding comes from Windows Holographic.
But, what happens when the environment actually changes?
That's no problem.
Because I'm wearing HoloLens, I'm always aware of the scene. I can see it update its route to go around you, and this is critical if you're working with bigger robots, like those in automotive manufacturing.
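Re-routing around a newly detected obstacle is, at heart, a re-plan on the updated spatial map. A minimal grid-based sketch using breadth-first search; the demo's actual planner isn't shown, and the grid representation here is an assumption for illustration:

```python
from collections import deque

def plan(grid, start, goal):
    """Breadth-first search for a shortest path on a grid.

    grid[y][x] == 1 marks a blocked cell; start and goal are (x, y) tuples.
    When the spatial map changes, calling plan() again yields the new route.
    """
    h, w = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        if cur == goal:
            # Walk the predecessor chain back to the start.
            path = []
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if 0 <= nx < w and 0 <= ny < h and not grid[ny][nx] and nxt not in prev:
                prev[nxt] = cur
                queue.append(nxt)
    return None  # no route exists
```

Blocking a cell on the old route and re-planning produces a detour around it, which is exactly what the hologram visualizes for the wearer.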
You need to always be aware of a robot's intentions.
HoloLens helps developers understand robots better, at any scale and in any scenario.
This is just amazing.
Thank you so much, Nico, and nice to meet you, B-15.
B-15, say goodbye.
See you later, Alex.