Developer Demonstrates Gaze Control of the iPhone in iOS 12

A developer named Matt Moss demonstrated on his Twitter account how to control an iPhone X with his gaze. He achieved this thanks to the new version of Apple's augmented reality framework, ARKit, which debuted with the release of iOS 12.

As you can see in the video, Matt controls the interface of an app he wrote exclusively with his eyes – his head stays still. The virtual pointer follows his eye movements, and a click is registered when he blinks. Judging by the published video, everything works quite accurately.

Most likely, controlling the smartphone's interface by gaze is only feasible with iOS 12 (which brings ARKit 2.0) and the iPhone X (with its TrueDepth camera system). The feature demonstrated by the developer is extremely promising: it could greatly simplify using a smartphone for people with disabilities. The potential of gaze control on the iPhone is probably not yet fully realized, since iOS 12, and therefore the ARKit 2.0 platform, is still in testing – by release, Apple's augmented reality will surely work even better.
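Matt has not published his source code in this article, but the general idea can be sketched with the public ARKit 2.0 face-tracking API: the `lookAtPoint` of an `ARFaceAnchor` estimates where the eyes converge, and the `eyeBlinkLeft`/`eyeBlinkRight` blend shapes report how closed each eye is. The gaze-to-screen mapping and the 0.8 blink threshold below are assumptions for illustration only, not the developer's actual implementation.

```swift
import ARKit
import UIKit

// A minimal sketch of gaze-plus-blink input using ARKit 2.0 face tracking.
// Requires iOS 12+, a device with a TrueDepth camera (iPhone X or later),
// and NSCameraUsageDescription in Info.plist.
final class GazeInputViewController: UIViewController, ARSessionDelegate {
    private let session = ARSession()
    private let pointer = UIView(frame: CGRect(x: 0, y: 0, width: 24, height: 24))

    override func viewDidLoad() {
        super.viewDidLoad()
        pointer.backgroundColor = .systemBlue
        pointer.layer.cornerRadius = 12
        view.addSubview(pointer)

        session.delegate = self
        // Face tracking is only supported on TrueDepth-equipped devices.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // lookAtPoint (new in ARKit 2.0) is the estimated gaze target in
        // face-anchor space; here it is crudely scaled onto the screen.
        let gaze = face.lookAtPoint
        let x = view.bounds.midX + CGFloat(gaze.x) * view.bounds.width
        let y = view.bounds.midY - CGFloat(gaze.y) * view.bounds.height

        // A both-eyes blink acts as a "click"; 0.8 is an arbitrary threshold.
        let leftBlink = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
        let rightBlink = face.blendShapes[.eyeBlinkRight]?.floatValue ?? 0
        let isBlinking = leftBlink > 0.8 && rightBlink > 0.8

        DispatchQueue.main.async {
            self.pointer.center = CGPoint(x: x, y: y)
            // A real implementation would dispatch a tap at the pointer's
            // position; here the pointer just changes color on a blink.
            self.pointer.backgroundColor = isBlinking ? .systemRed : .systemBlue
        }
    }
}
```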

Hopefully, Apple will notice Matt's work (he is, incidentally, still a student), and someday it will appear in iOS – at least as one of the features in the Accessibility settings (Android has an equivalent Accessibility section).

