At WWDC 2023, Apple unveiled its long-awaited mixed reality headset, the Apple Vision Pro. Introduced by Tim Cook as a spatial computer, Vision Pro welcomes you into a virtual world full of opportunities to work, interact, and relax.
To set the headset apart, Apple designed it with some genuinely novel features. Most notably, Apple ditched hardware controllers entirely, relying instead on hand gestures and eye tracking to interact with objects in the augmented world.
If you want to know what built-in hand and eye gestures are available on Apple Vision Pro, this article has you covered!
How to use hand gestures to control Vision Pro
Apple has provided 6 hand gestures that will let you control the Vision Pro software easily. Let’s check them out here:
- Tap: Tapping your index finger and thumb together tells the headset to select the virtual object you're looking at. You can also call it a pinch. It's analogous to tapping on your iPhone's screen.
- Double Tap: Tapping your index finger and thumb together twice signals a double-tap hand gesture.
- Pinch and Hold: This gesture is equivalent to a touch and hold on your iPhone. It can do tasks like highlighting text.
- Pinch and Drag: With Pinch and Drag, you can scroll (vertically or horizontally) the windows or move them across the virtual space. The faster you move your hand, the quicker the scrolling will be.
- Zoom: Zoom requires both hands. To zoom in, pinch your fingers together on each hand and pull your hands apart; bring them back together to zoom out. You can also resize open windows simply by dragging their corners.
- Rotate: To rotate virtual objects, pinch your fingers together on both hands and rotate your hands. Simple.
Important Vision Pro gesture details
After learning about the 6 hand gestures that control the Vision Pro headset, there are a few finer details worth understanding.
Role of eye movements
Hand gestures work in concert with your eye movements, facilitated by the powerful cameras built into Apple Vision Pro.
- Per Apple’s announcement, Vision Pro packs 12 cameras and five sensors to track hand gestures and eye movements and to map the surrounding environment in 3D in real time.
- Two high-resolution main cameras stream over a billion pixels per second to the displays to render a crisp virtual world, while the other cameras and sensors handle hand tracking, 3D mapping, and head tracking.
- Interestingly, your eyes act as the pointer for whatever you want to interact with. For instance, looking at a particular object on the virtual screen targets and highlights it; you can then use hand gestures to act on it.
Further, you don’t need to keep your hands always in midair while operating Vision Pro. Feel free to place them in your lap and avoid tiring your hands.
In fact, Apple encourages users to avoid grand gestures that could strain their hand muscles. The headset's camera and sensor array can accurately track even the lightest of your motions.
With Vision Pro, you can select and manipulate objects whether they are close to you or far away. For objects within arm's reach, though, Apple expects you may prefer interacting with them directly. For instance, if you want to scroll through your Safari window, you can reach out and flick it with your fingers rather than using the pinch gestures.
Though not yet fully demonstrated, Vision Pro is also expected to support hand movements like typing in the air. Everything works together to give the virtual space a realistic touch. Suppose you are drawing a cat virtually: you look at your preferred point on the display, pick a tool with your hand, and draw using hand gestures, moving the cursor with your eyes.
Room for creativity
Beyond the six gestures above, developers can create custom gestures of their own for their apps. However, they must ensure these custom gestures are distinct from the system ones and don't strain users' hands during extended use.
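For the curious, custom interactions like these are typically built with SwiftUI's gesture modifiers in a visionOS app. Below is a minimal, hypothetical sketch (the view and state names are invented for illustration): it layers a long-press gesture on a view, chosen to stay distinct from the system look-and-pinch tap so the two don't conflict.

```swift
import SwiftUI

// Hypothetical sketch: a SwiftUI view that adds a custom
// press-and-hold gesture alongside the built-in system gestures.
// "HighlightableCard" and "isHighlighted" are invented names.
struct HighlightableCard: View {
    @State private var isHighlighted = false

    var body: some View {
        RoundedRectangle(cornerRadius: 16)
            .fill(isHighlighted ? .blue : .gray)
            .frame(width: 200, height: 120)
            // A one-second hold, deliberately different from the
            // system tap (pinch) so it doesn't shadow it.
            .gesture(
                LongPressGesture(minimumDuration: 1.0)
                    .onEnded { _ in isHighlighted.toggle() }
            )
    }
}
```

The key design point Apple stresses is the last one in the paragraph above: a custom gesture should be low-effort and unambiguous next to the system pinch, tap, and drag.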
Apple Vision Pro will also support Bluetooth keyboards, trackpads, mice, and game controllers. In addition, the device has voice-centric search and dictation tools.
People lucky enough to try Vision Pro firsthand have unanimously given a big ‘thumbs-up’ to Apple’s creativity. The headset incorporates fluid touch controls using your fingers, much like iPhones and iPads. It’s intuitive, intelligent, and immersive like other Apple products, with some extra polish on top.
Srishti is an avid writer who loves exploring new things and letting the world know about them through her words. With a curious mind, she will let you move through the nooks and corners of the Apple ecosystem. When not writing, you can find her gushing over BTS like a true BTS Army would.