There's an intuitive appeal to using controller-free hand-tracking input like Leap Motion's; there's nothing quite like seeing your virtual hands and fingers move just like your own hands and fingers without the need to pick up and learn how to use a controller. But reaching out to touch and interact in this way can be jarring because there's no physical feedback from the virtual world. When your expectation of feedback isn't met, it can be unclear how to best interact with this new non-physical world. In a series of experiments, Leap Motion has been exploring how they can apply visual design to make controller-free hand input more intuitive and immersive. Leap Motion's Barrett Fox and Martin Schubert explain:

Guest Article by Barrett Fox & Martin Schubert

Barrett is the Lead VR Interactive Engineer for Leap Motion. Through a mix of prototyping, tools and workflow building with a user-driven feedback loop, Barrett has been pushing, prodding, lunging, and poking at the boundaries of computer interaction.

Martin is Lead Virtual Reality Designer and Evangelist for Leap Motion. He has created multiple experiences such as Weightless, Geometric, and Mirrors, and is currently exploring how to make the virtual feel more tangible.

Barrett and Martin are part of the elite Leap Motion team presenting substantive work in VR/AR UX in innovative and engaging ways.

When you reach out and grab a virtual object or surface, there's nothing stopping your physical hand in the real world. To make physical interactions in VR feel compelling and natural, we have to play with some fundamental assumptions about how digital objects should behave. This is usually handled by having the virtual hand penetrate the geometry of that object or surface, resulting in visual clipping. But how can we take these interactions to the next level?

With interaction sprints at Leap Motion, our team sets out to identify areas of interaction that developers and users often encounter, and sets specific design challenges. For our latest sprint, we asked ourselves: could the penetration of virtual surfaces feel more coherent and create a greater sense of presence? To answer this question, we experimented with three approaches to the hand-object boundary. After prototyping possible solutions, we share our results to help developers tackle similar challenges in their own projects.

Experiment #1: Intersection and Depth Highlights for Any Mesh Penetration

Image courtesy Leap Motion

For our first experiment, we proposed that when a hand intersects some other mesh, the intersection should be visually acknowledged. A shallow portion of the occluded hand should still be visible, but with a changed color and a fade to transparency. This execution felt really good across the board. When the glow strength and depth were turned down to a minimum level, it seemed like an effect which could be universally applied across an application without being overpowering.

Experiment #2: Fingertip Gradients for Proximity to Interactive Objects and UI Elements

For our second experiment, we decided to make the fingertips change color to match an interactive object's surface the closer they are to touching it. This might make it easier to judge the distance between fingertip and surface, making us less likely to overshoot and penetrate the surface. Further, if we do penetrate the mesh, the intersection clipping will appear less abrupt, since the fingertip and surface will be the same color.
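Both treatments reduce to simple scalar mappings: a color blend driven by fingertip-to-surface distance, and an alpha fade driven by penetration depth. Here is a minimal, hypothetical Python sketch of that logic; the real effects would live in a Unity shader, and all function names and parameters here are illustrative, not Leap Motion's actual API.

```python
def proximity_blend(fingertip_color, surface_color, distance, max_distance):
    """Experiment #2 (sketch): blend the fingertip toward the surface color
    as it approaches, fully matching the surface at contact (distance == 0)."""
    t = max(0.0, min(1.0, 1.0 - distance / max_distance))
    return tuple(f + (s - f) * t for f, s in zip(fingertip_color, surface_color))


def penetration_alpha(depth, fade_depth):
    """Experiment #1 (sketch): a shallow occluded portion of the hand stays
    visible, fading to fully transparent as penetration depth grows."""
    return max(0.0, 1.0 - depth / fade_depth)


# A red fingertip halfway to a white surface ends up halfway between colors.
print(proximity_blend((1.0, 0.0, 0.0), (1.0, 1.0, 1.0), 0.05, 0.10))  # (1.0, 0.5, 0.5)
# A finger penetrating half of the fade depth is drawn at half opacity.
print(penetration_alpha(0.01, 0.02))  # 0.5
```

Clamping `t` and the alpha to [0, 1] keeps the effect stable when the hand is far away or deeply penetrating, which matches the observation above that the effect works best when kept subtle.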