Another Dimension of Apple’s Eye Tracking Technology reveals the use of Biometrics and Machine Learning

Today the US Patent & Trademark Office published Apple's fourth eye tracking patent application of 2020. The filing covers yet another dimension of Apple's advanced eye tracking/eye gazing technology for their future Head Mounted Display (HMD) device. The other three patents covering this technology can be reviewed here: 01, 02 and 03. Today's patent introduces us to how an eye tracking system is able to obtain a user's biometrics using event camera data and then adjust the brightness of the imagery presented on the HMD display and more.

Apple's invention covers a head-mounted device that includes an eye tracking system for determining the gaze direction of the user. The eye tracking system often includes a camera that transmits images of the eyes of the user to a processor that performs eye tracking. Transmitting those images at a frame rate sufficient to enable eye tracking requires a communication link with substantial bandwidth.

Various implementations include devices, systems, and methods for determining an eye tracking characteristic using intensity-modulated light sources. The method includes emitting light with modulating intensity from a plurality of light sources towards an eye of a user. The method includes receiving light intensity data indicative of an intensity of the emitted light reflected by the eye of the user in the form of a plurality of glints. The method includes determining an eye tracking characteristic of the user based on the light intensity data.
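To make the idea concrete, here is a minimal sketch of how a receiver might decide which light source produced a given glint when each source is amplitude-modulated at its own frequency. The modulation scheme, sampling rate, and all names below are illustrative assumptions, not details from Apple's filing:

```swift
import Foundation

// Hypothetical sketch: identify which light source produced a glint by
// correlating the glint's sampled intensity against each source's
// assumed modulation frequency (simple single-bin correlation).
struct ModulatedSource {
    let id: Int
    let frequencyHz: Double
}

/// Returns the id of the source whose modulation frequency best matches
/// the glint's intensity samples.
func identifySource(samples: [Double],
                    sampleRateHz: Double,
                    sources: [ModulatedSource]) -> Int? {
    var bestID: Int? = nil
    var bestPower = -Double.infinity
    for source in sources {
        var re = 0.0, im = 0.0
        for (n, s) in samples.enumerated() {
            let phase = 2.0 * .pi * source.frequencyHz * Double(n) / sampleRateHz
            re += s * cos(phase)
            im += s * sin(phase)
        }
        let power = re * re + im * im  // energy at this source's frequency
        if power > bestPower {
            bestPower = power
            bestID = source.id
        }
    }
    return bestID
}
```

Tagging each source with a distinct frequency is one plausible way an eye tracker could tell overlapping glints apart from a single stream of intensity data.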

Apple's eye tracking system using intensity-modulated light sources relies on machine learning, and it can perform some unique functions. For instance, in one case, the one or more light sources modulate the intensity of emitted light according to user biometrics.

For instance, if the user is blinking more than normal, has an elevated heart rate, or is registered as a child, the one or more light sources decrease the intensity of the emitted light (or the total intensity of all light emitted by the plurality of light sources) to reduce stress upon the eye.

As another example, the one or more light sources modulate the intensity of emitted light based on an eye color of the user, as spectral reflectivity may differ for blue eyes as compared to brown eyes.
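The brightness policy described in the last two paragraphs maps naturally to a small rule set. The sketch below is illustrative only: the thresholds, scaling factors, and the Biometrics type are my assumptions, not values from the patent:

```swift
import Foundation

// Illustrative sketch of an intensity policy driven by user biometrics
// and eye color. All thresholds and factors are assumptions.
enum EyeColor { case blue, brown, other }

struct Biometrics {
    let blinksPerMinute: Double
    let heartRateBPM: Double
    let isChild: Bool
    let eyeColor: EyeColor
}

func emittedIntensity(base: Double, for user: Biometrics) -> Double {
    var intensity = base
    // Reduce intensity to lower stress on the eye when biometrics suggest it.
    if user.blinksPerMinute > 20 { intensity *= 0.7 }  // blinking more than normal
    if user.heartRateBPM > 100 { intensity *= 0.8 }    // elevated heart rate
    if user.isChild { intensity *= 0.5 }               // registered as a child
    // Compensate for spectral reflectivity differences between eye colors.
    switch user.eyeColor {
    case .blue:  intensity *= 0.9
    case .brown: intensity *= 1.0
    case .other: intensity *= 0.95
    }
    return intensity
}
```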

In various implementations, eye tracking, and particularly the determined gaze direction, is used to enable user interaction. For example, the user can gaze at a pop-up menu on the HMD display and then choose a specific option on that menu simply by holding their gaze on it, with the system using the tracked gaze position to perform the corresponding action.
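A common way to implement that kind of gaze-driven selection is dwell time: the option fires once the gaze has rested on it long enough. Here is a minimal sketch under that assumption; MenuItem, the dwell threshold, and the coordinate space are all hypothetical:

```swift
import Foundation

// Hypothetical dwell-based selector: map the gaze point onto menu item
// rectangles and trigger a selection after the gaze rests long enough.
struct MenuItem {
    let title: String
    let frame: (x: Double, y: Double, width: Double, height: Double)
}

final class DwellSelector {
    private var current: Int? = nil
    private var dwellStart: Date? = nil
    let dwellSeconds: TimeInterval = 0.8  // hold gaze this long to select

    /// Feed each new gaze sample; returns the selected item once dwell completes.
    func update(gazeX: Double, gazeY: Double,
                items: [MenuItem], now: Date = Date()) -> MenuItem? {
        let hit = items.firstIndex { item in
            gazeX >= item.frame.x && gazeX <= item.frame.x + item.frame.width &&
            gazeY >= item.frame.y && gazeY <= item.frame.y + item.frame.height
        }
        if hit != current {            // gaze moved to a different item (or off-menu)
            current = hit
            dwellStart = hit != nil ? now : nil
            return nil
        }
        if let start = dwellStart, let index = hit,
           now.timeIntervalSince(start) >= dwellSeconds {
            dwellStart = nil           // reset so the action fires only once
            return items[index]
        }
        return nil
    }
}
```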

Apple's patent FIG. 1 below is a block diagram of an example operating environment #100 wherein the controller (#110) is configured to manage and coordinate an augmented reality/virtual reality (AR/VR) experience for the user.

Apple's patent FIG. 4 illustrates a block diagram of a head-mounted device (#400). The housing (#401) also houses an eye tracking system including one or more light sources (#422), a camera (#424), and a controller (#480). The one or more light sources emit light onto the eye of the user (#10) that reflects as a light pattern (e.g., a circle of glints) that can be detected by the camera. Based on the light pattern, the controller can determine an eye tracking characteristic of the user. For example, the controller can determine a gaze direction and/or a blinking state (eyes open or eyes closed) of the user. As another example, the controller can determine a pupil center, a pupil size, or a point of regard. Thus, in various implementations, the light is emitted by the one or more light sources, reflects off the eye of the user, and is detected by the camera. In various implementations, the light from the eye of the user is reflected off a hot mirror or passed through an eyepiece before reaching the camera.
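One of the simplest decisions the controller can make from the glint pattern is the blinking state: when the eyelid covers the cornea, most of the reflected glints vanish. A toy sketch of that idea, with the Glint type and the count threshold as assumptions:

```swift
// Minimal sketch: infer blinking state from how many of the expected
// glints survive in the current frame. Threshold is an assumption.
struct Glint { let x: Double; let y: Double }

enum BlinkState { case open, closed }

func blinkState(detectedGlints: [Glint], expectedCount: Int) -> BlinkState {
    // When the eyelid covers the cornea, most reflected glints disappear.
    return detectedGlints.count >= expectedCount / 2 ? .open : .closed
}
```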

In patent FIG. 5A above we see an eye of a user having a first gaze direction; FIG. 5B illustrates the eye of the user having a second gaze direction.

In various implementations, the one or more light sources emit light towards the eye of the user that reflects in the form of a plurality of glints forming a pattern. Based on the reflected pattern (and, potentially, other features, such as the pupil size, pupil shape, and pupil center), an eye tracking characteristic of the user can be determined.

The eye includes a pupil surrounded by an iris, both covered by a cornea. The eye also includes a sclera (also known as the white of the eye).

Apple's patent FIG. 9A below illustrates a functional block diagram of an eye tracking system (#900) including an event camera (#910). The eye tracking system outputs a gaze direction of a user based on event messages received from the event camera.

The geometric analyzer (#970) receives data regarding detected glints from the glint detector (#940) and data regarding the pupil of the eye of the user from the pupil detector (#960). Based on this received information, the geometric analyzer determines an eye tracking characteristic of the user, such as a gaze direction and/or a blinking state.
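A classic geometric approach, which I use here only as an illustrative stand-in for the analyzer, treats the offset between the pupil center and the centroid of the glint pattern as a proxy for gaze angle. Everything below (types, the calibration gain) is an assumption:

```swift
import Foundation

// Hedged sketch of a geometric gaze estimate: the pupil-center minus
// glint-centroid vector, scaled by a per-user calibration gain.
struct Point { var x: Double; var y: Double }

func gazeDirection(glints: [Point], pupilCenter: Point,
                   gain: Double = 1.0) -> (horizontal: Double, vertical: Double)? {
    guard !glints.isEmpty else { return nil }  // e.g. eye closed, no glints
    // Centroid of the glint pattern approximates the corneal reflection center.
    let cx = glints.map(\.x).reduce(0, +) / Double(glints.count)
    let cy = glints.map(\.y).reduce(0, +) / Double(glints.count)
    return ((pupilCenter.x - cx) * gain, (pupilCenter.y - cy) * gain)
}
```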


Apple's patent FIG. 9B below illustrates a functional block diagram of an eye tracking system (#902) including a machine-learning regressor (#980). Here the roles of the glint detector (#940), pupil detector (#960), and geometric analyzer (#970) are taken over by a machine-learning regressor that determines the eye tracking characteristic based on the target feature and the off-target feature.
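Where the geometric analyzer uses explicit glint/pupil geometry, a learned regressor maps feature vectors directly to a gaze estimate. As a hedged illustration, assuming a simple linear model and a 2-D gaze output (the patent does not specify the model architecture):

```swift
import Foundation

// Illustrative stand-in for a machine-learning regressor: a linear model
// mapping concatenated target and off-target feature vectors to gaze.
// Weights would come from training; shapes and names are assumptions.
struct LinearGazeRegressor {
    let weights: [[Double]]  // 2 rows (gaze x, gaze y), one column per feature
    let bias: [Double]       // 2 entries

    func predict(targetFeature: [Double],
                 offTargetFeature: [Double]) -> (x: Double, y: Double) {
        let input = targetFeature + offTargetFeature  // concatenate features
        var out = bias
        for row in 0..<weights.count {
            for (j, v) in input.enumerated() {
                out[row] += weights[row][j] * v
            }
        }
        return (out[0], out[1])
    }
}
```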


Lastly, patent FIG. 9C below shows a functional block diagram of an eye tracking system (#904) including a gaze estimator (#990). The eye tracking system here includes an event camera (#910). The event messages are fed into a probability tagger (#925) that tags each event message with a probability that it is a target-frequency event message.

The probability-tagged event messages are fed into a feature generator (#935) that generates one or more features that are fed into a gaze estimator (#990) that determines an eye tracking characteristic (e.g., a gaze direction) based on the one or more features.
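As an end-to-end sketch of that pipeline under stated assumptions: each event is tagged with a probability of matching the target modulation period, the tagged events are binned into a coarse feature image, and a trivial estimator reads a gaze point from it. All types and the tagging heuristic are illustrative, not Apple's implementation:

```swift
import Foundation

// Hypothetical event-camera pipeline: probability tagger -> feature
// generator -> gaze estimator, per the FIG. 9C description above.
struct EventMessage {
    let x: Int, y: Int      // pixel location of the brightness event
    let timestamp: Double   // seconds
}

struct TaggedEvent {
    let event: EventMessage
    let targetProbability: Double
}

// Probability tagger: events whose per-pixel inter-event interval matches
// the target modulation period get a higher probability (crude heuristic).
func tagEvents(_ events: [EventMessage], targetPeriod: Double) -> [TaggedEvent] {
    var lastSeen: [Int: Double] = [:]     // pixel key -> last timestamp
    return events.map { e in
        let key = e.y * 10_000 + e.x      // assumes sensor width < 10,000 px
        let interval = e.timestamp - (lastSeen[key] ?? e.timestamp)
        lastSeen[key] = e.timestamp
        let p = max(0.0, 1.0 - abs(interval - targetPeriod) / targetPeriod)
        return TaggedEvent(event: e, targetProbability: p)
    }
}

// Feature generator: accumulate probabilities into a coarse "feature image".
func featureImage(_ tagged: [TaggedEvent], width: Int, height: Int,
                  gridSize: Int) -> [Double] {
    var grid = [Double](repeating: 0, count: gridSize * gridSize)
    for t in tagged {
        let gx = min(gridSize - 1, t.event.x * gridSize / width)
        let gy = min(gridSize - 1, t.event.y * gridSize / height)
        grid[gy * gridSize + gx] += t.targetProbability
    }
    return grid
}

// Gaze estimator: here, simply the probability-weighted centroid of the grid.
func estimateGaze(from grid: [Double], gridSize: Int) -> (x: Double, y: Double)? {
    let total = grid.reduce(0, +)
    guard total > 0 else { return nil }
    var cx = 0.0, cy = 0.0
    for gy in 0..<gridSize {
        for gx in 0..<gridSize {
            let w = grid[gy * gridSize + gx]
            cx += Double(gx) * w
            cy += Double(gy) * w
        }
    }
    return (cx / total, cy / total)
}
```

The appeal of the event-camera route is bandwidth: only pixels that change fire events, so the tracker processes a sparse stream rather than full frames.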


Apple's patent application 20200278539, published today by the U.S. Patent Office, was filed in Q1 2020, though some of the work dates back to a 2017 filing that has been incorporated into this latest filing.

Considering that this is a patent application, the timing of such a product to market is unknown at this time.

Apple Inventors

Daniel Kurz: Senior Engineering Manager (Computer Vision, Machine Learning) who came to Apple via the acquisition of Metaio. Some of the earlier work on this patent likely originated at Metaio and was later revised with Apple team members.

Li Jia: Computer Vision and Machine Learning Engineering Manager based in Beijing, China. Jia leads a team developing CVML algorithms for mobile camera applications and organizes collaboration with Tsinghua University on computer vision and machine learning research projects.

Raffi Bedikian: Computer Vision Engineer who previously spent five years at Leap Motion.

Branko Petljanski: Engineering Manager, Incubation (Cameras)
