Hand tracking experiences for virtual reality

Now that you have downloaded Interhaptics and followed all the instructions (if you haven’t, click here), you are ready to extend your reality. In this blog post, we will look at how Interhaptics implements hand tracking interactions for virtual reality, with the final goal of optimizing the user experience.

Hand interactions for virtual reality

3D hand interactions for virtual reality are usually inspired by what happens in the real world. For instance, when we see a button in VR, we intuitively know that we should push it.

Similarly, when we see a virtual object representing a real one, we respond with hand interactions that we are familiar with. We see a ball, we reach out and grab it.

Ball hand interaction

If we look at this hand interaction in more detail, we observe the following actions:

– Reach out and grab a ball
– Displace it
– Release it

From this segmentation of actions, we can create 3D hand interactions composed of three components (sketched in code below):

– The starting condition (grab)
– The spatial transformation (displace)
– The ending condition (release)

Hand interaction for VR
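
To make this decomposition concrete, here is a minimal sketch in Python (the class name, method names, and grab threshold are illustrative assumptions, not the Interhaptics API) of the ball interaction expressed as a starting condition, a transformation, and an ending condition:

```python
import math

GRAB_DISTANCE = 0.05  # assumed grab radius in meters (illustrative value)

class BallInteraction:
    """Sketch of a grab / displace / release interaction."""

    def __init__(self, ball_position):
        self.ball_position = ball_position  # (x, y, z) in world space

    def start_condition(self, hand_position, is_grabbing):
        # Grab: the hand performs a grab pose close enough to the ball.
        return is_grabbing and math.dist(hand_position, self.ball_position) < GRAB_DISTANCE

    def transformation(self, hand_position):
        # Displace: while grabbed, the ball simply follows the hand.
        self.ball_position = hand_position

    def end_condition(self, is_grabbing):
        # Release: the grab pose ends.
        return not is_grabbing
```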

This seems obvious! Why do we need to define and segment a simple concept?

More complex hand interactions

This framework becomes useful when we approach more complex hand interactions for virtual reality. It allows us to program the starting and ending conditions to meet a user experience objective.

On our website, we provide multiple ready-to-use demos to explain this concept. For example, in a professional training scenario, we want the user to wield a key in order to turn a bolt exactly 750 degrees, and then release it at the end of the interaction.

VR Wheel Hand interaction

If we segment this hand interaction with the framework explained above, we have (see the sketch after this list):

– Starting condition: Grab
– Transformation: a 750-degree rotation around the bolt axis
– Ending Condition: Reach the final position
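
Reusing the same sketch structure (again with illustrative names, not the Interhaptics API), only the transformation and the ending condition change:

```python
class BoltInteraction:
    """Sketch of the key-on-bolt interaction: grab, rotate 750 degrees, release."""

    TARGET_ANGLE = 750.0  # degrees of rotation required by the training scenario

    def __init__(self):
        self.angle = 0.0  # rotation accumulated around the bolt axis so far

    def start_condition(self, is_grabbing_key):
        # Starting condition: the user grabs the key.
        return is_grabbing_key

    def transformation(self, rotation_delta_degrees):
        # Transformation: the hand motion is constrained to a rotation around
        # the bolt axis, accumulated and clamped at the final position.
        self.angle = min(self.angle + rotation_delta_degrees, self.TARGET_ANGLE)

    def end_condition(self):
        # Ending condition: the bolt has reached its final position.
        return self.angle >= self.TARGET_ANGLE
```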

Hand interaction primitives

This hand interaction for virtual reality is more complex than the previous one. However, the segmentation into three main blocks allows us to visualize it as a simple logical chain. We call these items “interaction primitives”. An interaction primitive is defined by a starting condition, a transformation, and an ending condition.

If you think about your smartphone, you are already using these interaction primitives every day. For instance, drag and drop on the screen has a starting condition (tap), a transformation (drag), and an ending condition (reach the target or lift the finger from the screen). Apple has arguably been the best at optimizing these interaction primitives. With this method, the transformation doesn’t need to follow the exact movement of the finger, which allows inertial effects to be added to the graphical outcome.
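
That decoupling can be sketched in the same way: the transformation receives the raw finger position but is free to smooth it before applying it to the dragged object. The smoothing factor below is a purely illustrative assumption:

```python
class DragInteraction:
    """Sketch of a drag whose transformation does not mirror the finger exactly."""

    SMOOTHING = 0.2  # illustrative low-pass factor: lower values feel "heavier"

    def __init__(self, object_position):
        self.object_position = object_position  # (x, y) position of the dragged item

    def transformation(self, finger_position):
        # The dragged object eases toward the finger instead of snapping to it,
        # which is what produces the inertial feel of the graphical outcome.
        self.object_position = tuple(
            o + self.SMOOTHING * (f - o)
            for o, f in zip(self.object_position, finger_position)
        )
```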

One of the fascinating outcomes of this method is that we can represent an individual hand interaction as a single block and a set of hand interactions as a block scheme.

XR Hand Interactions block Scheme

What makes interaction primitives interesting

When you are creating interactive scenarios with several hand interactions in sequence, you need to implement and iterate on a set of hand interactions for virtual reality quickly. More specifically, you need the ability to modify the parameters of your interactive content on the fly.

Each interaction primitive provides ending information once completed, which can trigger animations or actions, or enable further hand interactions for virtual reality. As shown below, the result of these complex hand interactions is a user scene.

Virtual Reality Hand Interaction User Scene Block Scheme
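
Because each primitive reports its own completion, a user scene can be sketched as a simple chain in which the ending information of one primitive activates the next. The update and is_completed methods below stand in for whatever per-frame logic a primitive runs; they are assumptions for the sketch, not Interhaptics calls:

```python
class InteractionSequence:
    """Sketch of a user scene built as a chain of interaction primitives."""

    def __init__(self, primitives, on_scene_complete=None):
        self.primitives = primitives            # ordered list of interaction primitives
        self.current = 0                        # only the current primitive is active
        self.on_scene_complete = on_scene_complete

    def update(self, hand_input):
        # Called once per frame with the current hand tracking data.
        if self.current >= len(self.primitives):
            return
        primitive = self.primitives[self.current]
        primitive.update(hand_input)            # runs its start / transformation / end logic
        if primitive.is_completed():
            # The ending information of one primitive enables the next one
            # (or triggers animations, sounds, scene changes, ...).
            self.current += 1
            if self.current == len(self.primitives) and self.on_scene_complete:
                self.on_scene_complete()
```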

Hand tracking experiences for virtual reality with Interhaptics

Interhaptics developed a large set of interaction primitives that you can apply with just a few clicks inside your 3D engine to make your scenario interactive. These interaction primitives can also be applied via an API, which allows integration with an existing product to implement interactive content easily. You can find some examples in this video.

Interaction Demonstrator video, available on YouTube

A bonus of these interaction primitives is that during the transformation phase you can add a haptic material, enriching your interactions with tailored haptic feedback. This is just a few clicks away.

Interaction primitives are the best scriptwriters for modeling your interactive content in your XR environment. Check out our demo video to see the full potential of hand interactions for virtual reality.

Check out all our articles here to read about how haptics keep you immersed in your VR experiences. Develop hand tracking experiences by downloading our Hand Tracking for VR & MR package.