And I can see my hand.
I'm being instructed to poke the dragon's tummy; yellow vomit goes in the bucket.
I'm pointing at it, I can see my hands pointing at it.
And now I've got these giant dragon claws.
So my hands have transformed again: if I make a fist with them, giant Wolverine spikes pop out of my hands.
I've been looking at VR demos for years.
I remember looking at a lot of Oculus demos going back to CESes of the past, and the Oculus Rift.
Facebook's VR headset came out in 2016.
I remember using those Touch controllers and thinking it was amazing how they transformed spaces.
Well, this year Facebook is trying to dive into AR technologies, but at the moment they're looking at incorporating hands into VR.
It could be the stepping stone to what comes next.
I've done hand tracking before in VR and in AR.
I've used hand tracking in a lot of site-specific experiences that are almost like theater, at places like the Tribeca Film Festival.
Leap Motion had some hand tracking that you could use in VR headsets as well.
Hand tracking in AR is incorporated by both Magic Leap and Microsoft's HoloLens.
And earlier this year, I used my hands to manipulate holograms that were floating in front of me with a pull-down visor.
What's interesting about what Facebook is doing with the Oculus Quest is they're putting that hand tracking into a $400 standalone VR headset, which costs a lot less than one of those high-end VR headsets.
I tried a few demos here at Facebook's headquarters, and they worked.
They weren't quite as smooth as what I would get out of the Touch controllers.
But I was able to reach out my hands and it could recognize my finger motions.
I could give a thumbs-up, I could do this, I could do that.
I could basically do anything I wanted with my hands.
And there were two different demos that I tried.
One was set in a magical lab, and it felt like a game I could play by turning my hands into various monster hands.
And the other one was an enterprise demo, where I got to explore being an insurance evaluator checking what sort of water damage was in a kitchen.
The hand tracking uses the four cameras that the Oculus Quest already has, which aren't the most complicated camera setup; there's no depth-sensing apparatus here.
These are four black-and-white cameras that provide enough information for the computer vision processing to make this work.
You could imagine that down the road, more advanced cameras could do this in an even more enhanced way.
Facebook is going to incorporate hand tracking into the Oculus Quest early next year.
And it's opening up an SDK so that other people can play around with developing apps for it.
Maybe it'll come to the Oculus Rift as well, that's unclear.
But one thing that has to be cracked is inputs.
Not just feeling something, but also being able to connect with and control things without having to wear a lot of apparatus.
This hand tracking, Facebook has said, could be used not just in the Quest but, you can imagine, in other devices.
Maybe in something like Portal, where you're standing in front of a magic mirror and using your hands.
When I started trying hand tracking in the Oculus Quest, I started thinking about those AR headsets.
How do I do this and see stuff in the real world?
Could I use the pass-through cameras to potentially grab something and move it around my living room?
That's not here yet.
It could be.
But it looks like hand tracking is the beginning of that bridge from VR to AR.
And it looks like Facebook is trying to make that move here, starting at their Oculus Connect conference.