Forget using your thumbs to control your iPhone: the next frontier may be your eyes.
Matt Moss was attending Apple’s WWDC as a scholarship student, reports Mashable, when he realized that there were many opportunities to be had with ARKit 2. He noticed one quality in particular. “I saw that ARKit 2 introduced eye tracking and quickly wondered if it’s precise enough to determine where on the screen a user is looking,” he explained to Mashable over a Twitter direct message. “Initially, I started to build the demo to see if this level of eye tracking was even possible.”
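The article doesn't include Moss's code, but the capability he describes rests on ARKit 2's face-tracking API, which exposes an estimated gaze point on `ARFaceAnchor`. As a rough, hypothetical sketch (the class name `GazeTracker` and the projection step are assumptions, not Moss's implementation), reading that data looks something like this:

```swift
import ARKit
import SceneKit

// Minimal sketch of reading ARKit 2's eye-gaze data.
// Assumes a device with a TrueDepth camera; this is not
// Moss's actual demo code.
class GazeTracker: NSObject, ARSCNViewDelegate {

    // Attach to an ARSCNView and begin face tracking.
    func startTracking(in sceneView: ARSCNView) {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.delegate = self
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Called each time the tracked face anchor updates.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode,
                  for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        // lookAtPoint (new in ARKit 2) estimates where the eyes
        // converge, in the face anchor's coordinate space (metres).
        let gaze = faceAnchor.lookAtPoint
        // Mapping this 3D point to a 2D location on the screen
        // (e.g. by projecting through the camera transform) is the
        // layer a demo like Moss's would build on top.
        print("Gaze point in face space: \(gaze)")
    }
}
```

Whether this raw signal is "precise enough" for on-screen targeting, as Moss wondered, depends on filtering and calibration added above the API.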
This technology could be highly beneficial to people who live with disabilities, Moss pointed out. “Once the demo started to work, I began to think of all the possible use cases, the most important of which being accessibility,” he told Mashable. “I think this kind of technology could really improve the lives of people with disabilities.”
Mashable noted some concern that advertisers could exploit eye-tracking tools, but the accessibility benefits may well outweigh those risks.