We’ve tested Hand Physics Lab for the Oculus Quest. What does it mean for you?

Hand-tracking technology keeps evolving, and it keeps opening the doors of our imagination. Holonautic has proved this by creating Hand Physics Lab for the Oculus Quest. Although still in alpha testing, this version already includes several examples of interaction in a virtual-reality scenario.

Compatible only with the Oculus Quest, the software uses the headset’s sensors to detect and track the position of our hands in real time, through a mix of computer vision and machine learning.

To install the application, you need the SidequestVR software and an Oculus Quest in developer mode. If you feel like experimenting, follow the steps indicated on the SidequestVR website.

To date, the application includes the following experiences:

  • “Interactive buttons and controllers”
  • “Tools including hammers, lever and ax”
  • “Building block playground!”
  • “Clipboard with pencil, eraser and finger paint”
  • “Egg painting station with your own fingers!”
  • “Paint your own hands!”
  • “Use toy guns to cover dummies with darts”
  • “Typing on a mechanical keyboard and using a mouse”
  • “Weightlifting”
  • “Interactive and animated tiny puppet”
  • “A clone of yourself”
  • “Experience in the chemistry lab with lasers and acid”
  • “Put candles on your cake and light them with a lighter”
  • “Interact with a dynamic door”
  • “Climb and move using only physics”

Here is a video of all the experiments we did:

At first it was a little difficult to adapt. Some options are enabled by default to help, such as the “Snapping” feature, which automatically snaps the object we want to grab into our hand. We tested with this feature turned off, and it became very difficult, or even impossible, to grab some objects. Hand detection is also not 100% responsive, and there were times when our virtual-reality hands took on strange positions 🙃.

Other than that, it was really fun and inspiring to see that so many interactions of this kind are already possible. I spent about 30 minutes testing all the experiments, and when I took the headset off my brain was still a little confused: when I grabbed a real water bottle, I felt like I was still in virtual reality 🤣.

But what is the purpose of this kind of interaction? What business could be built on it?

Well, there are several. The first thing that comes to mind is training. Training in virtual reality is already more than a trend; it is a necessity. Imagine a company that needs to train its workers to assemble a set of parts through a complex process. With this type of interaction, workers can enter a virtual-reality scenario and pick up and handle tools just as they would in reality.

I also see applications of this functionality in areas such as physiotherapy or psychomotor rehabilitation, in which the patient is guided to move their hands and arms to overcome challenges in a virtual-reality scenario, potentially making for a successful therapy. And with another advantage: the patient can do the therapy anywhere!

Do you have any idea where else this kind of interaction could be used?