A Major Apple patent reveals next-gen hand & eye tracking systems for AR and VR environments that introduce a unique wrist-generated menu

Posted: June 24, 2021 at 11:17 pm

Today the US Patent & Trademark Office published a patent application from Apple that relates to a computer system with a display generation component that provides virtual reality and mixed reality experiences via a display. Apple's invention further covers a hand-tracking system for Macs and an eye-tracking system for a future mixed reality headset (HMD). When using a headset, a next-generation user menu with interactive options can be uniquely generated on a user's wrist and/or palm to assist the user in an AR or VR environment.

In Apple's patent background, they note that the development of computer systems for augmented reality has increased significantly in recent years. Example augmented reality environments include at least some virtual elements that replace or augment representations of the physical world. Input devices such as cameras, controllers, joysticks, touch-sensitive surfaces, and touch-screen displays for computer systems and other electronic computing devices are used to interact with virtual/augmented reality environments. Example virtual elements include virtual objects such as digital images, video, text, icons, and control elements such as buttons and other graphics.

But methods and interfaces for interacting with environments that include at least some virtual elements (e.g., applications, augmented reality environments, mixed reality environments, and virtual reality environments) are cumbersome, inefficient, and limited. For example, systems that provide insufficient feedback for performing actions associated with virtual objects, systems that require a series of inputs to achieve a desired outcome in an augmented reality environment, and systems in which manipulation of virtual objects is complex, tedious, and error-prone all create a significant cognitive burden on the user and detract from the experience with the virtual/augmented reality environment. In addition, these methods take longer than necessary, thereby wasting energy, a consideration that is particularly important in battery-operated devices. Apple's invention aims to remedy this.

Apple's invention covers computer systems with improved methods and interfaces for providing computer-generated experiences to users that make interaction with the computer systems more efficient and intuitive for a user. Such methods and interfaces optionally complement or replace conventional methods for providing computer-generated reality experiences to users. Such methods and interfaces reduce the number, extent, and/or nature of the inputs from a user by helping the user to understand the connection between provided inputs and device responses to the inputs, thereby creating a more efficient human-machine interface.

In some embodiments, the computer system is a desktop computer with an associated display. In some embodiments, the computer system is a portable device (e.g., a notebook computer, tablet computer, or handheld device). In some embodiments, the computer system is a personal electronic device (e.g., a wearable electronic device, such as a watch, or a head-mounted device). In some embodiments, the computer system has a touchpad. In some embodiments, the computer system has one or more cameras. In some embodiments, the computer system has a touch-sensitive display (also known as a "touch screen" or "touch-screen display").

In some embodiments, the computer system has one or more eye-tracking components. In some embodiments, the computer system has one or more hand-tracking components.

In accordance with some embodiments, a method is performed at a computer system that includes a display generation component and one or more input devices. The method includes detecting a wrist at a location that corresponds to a respective position within a view of a three-dimensional environment provided via the display generation component, without displaying representations of applications at that position; the claim then recites additional steps performed while the wrist is detected at that location.

Apple's patent FIG. 4 below is a block diagram illustrating a hand tracking unit of a computer system that is configured to capture gesture inputs of the user; FIG. 5 is a block diagram illustrating an eye tracking unit of a computer system that is configured to capture gaze inputs of the user.

Apple's patent FIG. 7A above and FIG. 7D below are block diagrams illustrating user interactions with a computer-generated three-dimensional environment (e.g., including interactions to display and/or move an application, content, or control in the computer-generated three-dimensional environment, and optionally changing the privacy modes thereof).

Apple's patent FIGS. 7F and 7G above are block diagrams illustrating privacy control in a shared computer-generated three-dimensional environment (e.g., including controlling privacy of an application in the shared computer-generated three-dimensional environment based on an owner's hand posture and/or display location of the application).

Hand-Tracking System with micro-gestures: In some embodiments, micro-gestures include:

- a micro-tap input (e.g., the fingertip of a first finger of a hand moves toward and touches down on a portion of another finger of the same hand, or the palm of the same hand, optionally followed by lift-off of the fingertip from the touch-down location);
- a micro-double-tap input (e.g., two consecutive micro-tap inputs performed by the same first finger on the same portion of the same hand, with the duration between the two micro-tap inputs less than a threshold amount of time);
- a micro-drag or micro-swipe input (e.g., movement of a first finger on the surface of a second finger of the same hand in a respective direction, such as along the side of the second finger, or across the second finger from the side of the palm toward the back of the hand); and
- a micro-flick input (e.g., movement of a first finger relative to a second finger of the same hand in a respective direction away from the second finger, such as an upward flick, a forward flick, or an inward flick).
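To make the tap/double-tap distinction concrete, here is a minimal sketch in Swift of how a hand-tracking pipeline might tell the two apart with a timing threshold. The type names and the 0.3-second window are our own illustrative assumptions, not anything specified in the patent.

```swift
import Foundation

/// Hypothetical micro-gesture events a hand-tracking unit might emit.
enum MicroGesture {
    case tap                              // fingertip touches down on another finger or the palm
    case doubleTap                        // two taps within the double-tap window
    case swipe(direction: SIMD3<Float>)   // fingertip drags along another finger
    case flick(direction: SIMD3<Float>)   // fingertip snaps away from another finger
}

/// Classifies raw fingertip touch-downs into taps and double-taps.
/// The 0.3 s window is an assumed threshold, not a value from the patent.
struct MicroTapClassifier {
    private var lastTapTime: TimeInterval?
    let doubleTapWindow: TimeInterval = 0.3

    mutating func classifyTouchDown(at time: TimeInterval) -> MicroGesture {
        if let last = lastTapTime, time - last < doubleTapWindow {
            lastTapTime = nil   // consume the pair so a third tap starts a new sequence
            return .doubleTap
        }
        lastTapTime = time
        return .tap             // may later be upgraded to part of a double-tap
    }
}
```

A real classifier would of course also need to debounce lift-off events and determine which finger and which surface region were touched, work the patent assigns to the hand-tracking unit of FIG. 4.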

Micro-gesturing was first covered in a patent report that we posted back on April 1st titled "Apple reveals more about Mixed Reality Headset GUI's, how to control menus with Eye Tracking & in-air Micro Gesturing."

In today's patent, Apple further notes that in-air gestures of the whole hand include:

- an open hand wave input (e.g., the whole hand moving upward, downward, toward the user, away from the user, or sideways in front of the user, with the palm open and fingers extended);
- a closed hand wave input (e.g., the whole hand in a fist moving upward, downward, away from the user, toward the user, or sideways in front of the user);
- a palm opening input (e.g., all fingers moving in unison from a retracted state to an extended state);
- a palm closing input (e.g., all fingers moving in unison from an extended state to a retracted state);
- a push input (e.g., the palm open and moving away from the user);
- a pull input (e.g., the palm open and moving toward the user); and
- a point input (e.g., the whole hand moving toward a target direction with the index finger raised).
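Modeled as a data type, that whole-hand vocabulary might look like the following Swift sketch; the enum and its case names are our own illustration, not Apple's API.

```swift
/// Hypothetical Swift model of the whole-hand gesture vocabulary above.
enum WholeHandGesture {
    enum Direction { case up, down, left, right, towardUser, awayFromUser }

    case openHandWave(Direction)    // palm open, fingers extended, whole hand moving
    case closedHandWave(Direction)  // whole hand in a fist, moving
    case palmOpen                   // all fingers move in unison from retracted to extended
    case palmClose                  // all fingers move in unison from extended to retracted
    case push                       // open palm moving away from the user
    case pull                       // open palm moving toward the user
    case point(Direction)           // whole hand moves toward a target, index finger raised
}
```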

In some embodiments, a gaze input is used to select the target of the input, and the in-air hand gesture is used to select the operation that is performed with respect to the target in the third form of the first application.

In some embodiments, characteristics such as the speed, duration, timing, direction, and amount of the movement of the hand are used to determine the characteristics (e.g., direction, amount, speed, etc.) of the manner by which the operation is performed.
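Putting the last two ideas together, a hypothetical dispatcher might use gaze to resolve the target while the hand gesture, scaled by its measured speed and duration, selects and sizes the operation. This minimal sketch builds on the WholeHandGesture enum above; the type names and the push/pull-to-scroll mapping are our own illustrative assumptions, not the patent's implementation.

```swift
import Foundation

/// Hypothetical scene object that gaze can select.
struct VirtualObject {
    let name: String
    var scrollOffset: Float = 0
}

/// Minimal sketch: gaze resolves the target, the gesture picks the
/// operation, and the hand's speed and duration scale its effect.
struct InteractionDispatcher {
    /// Most recently gazed-at object, as reported by the eye-tracking unit.
    var gazeTarget: VirtualObject?

    mutating func handle(gesture: WholeHandGesture,
                         handSpeed: Float,      // meters per second
                         duration: TimeInterval) {
        guard var target = gazeTarget else { return }
        switch gesture {
        case .push:
            // A faster or longer push moves the content further.
            target.scrollOffset -= handSpeed * Float(duration)
        case .pull:
            target.scrollOffset += handSpeed * Float(duration)
        default:
            break // other gestures would map to other operations
        }
        gazeTarget = target
    }
}
```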

Apple's Unique Wrist Menu

In respect to the next-gen wrist menu, Apple notes that in some embodiments, while displaying the menu of selectable options (see #7026, starting in FIG. 7A) at the position within the view of the three-dimensional environment that corresponds to the location on the inner side of the wrist (#7028), the computer system detects a gesture on or proximate to the inner side of the wrist at a location that corresponds to the position of the menu in the view of the three-dimensional environment (e.g., a flick gesture on the wrist #7028). In response, the computer system displays the menu of selectable options (or a three-dimensional version of it) at a position that is independent of the location of the wrist, e.g., in a dock in the center of the field of view of the display generation component, where the user can interact with the representations using air gestures, or gaze in conjunction with micro-gestures.

In some embodiments, a similar interaction is implemented for the controls and notifications displayed in the view of the three-dimensional environment at positions corresponding to the back of the user's hand and wrist when the back of the user's hand and wrist is facing toward the user (see FIG. 7D above).

In response to a predefined gesture input on or proximate to the outer side of the wrist or hand at a location that corresponds to the position of the controls and/or notifications in the view of the three-dimensional environment (e.g., a flick gesture on the back of the hand or wrist), the computer system displays the controls and/or notifications (or a three-dimensional version of them) at a position that is independent of the location of the wrist, e.g., in the center of the field of view of the display generation component, where the user can interact with them using air gestures, or gaze in conjunction with micro-gestures.
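The two behaviors above amount to a small state machine: wrist-anchored UI follows the hand while it is tracked, and a flick detaches it to a wrist-independent dock. A minimal Swift sketch, with all type and method names being our own hypothetical illustration rather than Apple's design, might look like this:

```swift
/// Where wrist-anchored UI is currently rendered; names are illustrative.
enum WristUIPlacement {
    case hidden
    case onInnerWrist     // menu of selectable options follows the inner wrist
    case onBackOfHand     // controls/notifications follow the back of the hand
    case worldFixedDock   // detached to the center of the field of view
}

/// Minimal sketch of the wrist-menu behavior the filing describes: the UI
/// tracks the wrist while it is in view, and a flick on the wrist detaches
/// it into a dock at a wrist-independent position.
struct WristMenuController {
    private(set) var placement: WristUIPlacement = .hidden

    /// Called whenever hand tracking reports the wrist's pose.
    mutating func wristDetected(innerSideFacingUser: Bool) {
        guard placement != .worldFixedDock else { return }  // the dock stays put
        placement = innerSideFacingUser ? .onInnerWrist : .onBackOfHand
    }

    /// Called when a flick gesture lands on or near the wrist.
    mutating func flickOnWrist() {
        if placement == .onInnerWrist || placement == .onBackOfHand {
            placement = .worldFixedDock
        }
    }

    /// Called when the wrist leaves the field of view.
    mutating func wristLost() {
        guard placement != .worldFixedDock else { return }
        placement = .hidden
    }
}
```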

Apple's patent appears to be a master-patent for their massive project relating to a next-generation user interface for AR and VR environments. The patent application is so long and detailed that it could take hours to read and digest. For those wanting to review Apple's patent application 20210191600, titled "Devices, Methods, and Graphical User Interfaces for Displaying Applications in Three-Dimensional Environments," click here.

Considering that this is a patent application, the timing of such a system coming to market is unknown.
