We recently went to Interaction 17 (the Interaction Design Association's annual conference) and presented our thinking on gestural input. If you haven't seen the talk, you can watch it here: https://vimeo.com/209794287. Eight minutes is not enough to explain the whole concept, so we would like to share the thinking and process behind it.
In recent years, mixed reality has become a popular term in design circles, but its basic interaction methods are still far from mature. We are used to interacting with machines through interfaces or controllers. When using a HoloLens, for example, its limited interaction model makes it difficult to use natural gestures. In this post, we'll describe how we've focused our research on physical interaction with natural gestures in mixed reality.
The limited interaction model makes it difficult for users to interact with natural gestures
Limb Interaction and Proprioception
We are born with the ability to perceive the position of our limbs, joints, and muscles without relying on our five senses. This is called proprioception. We believe proprioception should be used as an input method for mixed reality: no external hardware is needed at all, and we rely only on our own body to interact. This approach allows for more natural physical interactions, and we see considerable potential in it.
Using Proprioception for Gesture Interaction
Intangible and Tangible
For a long time, we have valued invisible design: design should be unobtrusive and easy to use. However, to approach natural gesture interaction, we felt the interaction had to change; it had to be obvious and easy for the user to understand. This means that when we design gesture input, we need to clearly define the space and sequence of interactions. Next, we will talk about tangible design.
Test prototypes to improve their fidelity
Since there is no better way to think and learn than by doing, we built a few prototypes, tested them with colleagues, and quickly improved the designs based on their feedback.
Exploring prototypes: controlling fidelity from a single variable to a set of 3D variables
We define tangible as having a clear benchmark for the boundaries of an interaction. For example, the distance between our thumb and index finger is a tangible proprioceptive interaction, because we clearly know the maximum and minimum values that distance can reach. This type of interaction is perfect for controlling transparency or any other property with well-defined upper and lower bounds.
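To make this concrete, the pinch-distance-to-transparency mapping above can be sketched as a simple clamp-and-normalize function. This is only an illustrative sketch, not our actual implementation: the calibration range (`PINCH_MIN_CM`, `PINCH_MAX_CM`) and the function names are hypothetical, and a real system would read the distance from the device's hand-tracking API.

```python
def normalize(value, lo, hi):
    """Clamp value to [lo, hi], then map it linearly to [0.0, 1.0]."""
    value = max(lo, min(hi, value))
    return (value - lo) / (hi - lo)

# Hypothetical calibration: the distance between thumb tip and index
# fingertip ranges from fully pinched (0 cm) to fully open (10 cm).
PINCH_MIN_CM = 0.0
PINCH_MAX_CM = 10.0

def pinch_to_opacity(distance_cm):
    """Map a tracked pinch distance to an opacity value in [0, 1].

    Because the finger span has a known maximum and minimum, the
    mapping has well-defined bounds -- the "tangible" property
    described above.
    """
    return normalize(distance_cm, PINCH_MIN_CM, PINCH_MAX_CM)
```

Because the output is clamped, tracking noise outside the calibrated range (a reading of 12 cm, say) simply saturates at full opacity instead of producing out-of-range values.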