The cameras on iPhones are getting (selectively) smarter. [Image credit: Samuel Axon]

Apple introduced several of the headlining features of its upcoming iOS 13 at WWDC, but people playing with the closed beta have uncovered some additional tools. One newly found addition is FaceTime Attention Correction, which adjusts the image during a FaceTime video call to make it look like a person is looking into the camera rather than at their device's screen.

In practice, that means that while both you and your contact are looking at each other's faces, you'll both appear to be making direct eye contact. Mike Rundle and Will Sigmon were the first to tweet about the find, describing it as uncanny, "next-century shit." Another beta tester, Dave Schukin, posited that the feature relies on ARKit to map a person's face and use that map to inform the image adjustments.

Guys – "FaceTime Attention Correction" in iOS 13 beta 3 is wild.

Here are some comparison photos featuring @flyosity: https://t.co/HxHhVONsi1 pic.twitter.com/jKK41L5ucI

— Will Sigmon (@WSig) July 2, 2019
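
Schukin's theory is plausible: on TrueDepth-equipped iPhones, ARKit's face tracking already exposes a live map of the face along with per-eye poses and a gaze vector. The Swift sketch below is a minimal, hypothetical illustration of reading that data; the GazeMonitor class is invented for this example and is not anything from Apple's implementation.

```swift
import ARKit

// Hypothetical sketch: reading the face data an attention-correction
// feature could plausibly build on. Illustrative only.
final class GazeMonitor: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // lookAtPoint: the point the eyes converge on, in face space.
            let gaze = face.lookAtPoint
            // Per-eye poses as 4x4 transforms relative to the face anchor.
            let leftEye = face.leftEyeTransform
            let rightEye = face.rightEyeTransform
            print("gaze:", gaze, "eyes at:", leftEye.columns.3, rightEye.columns.3)
        }
    }
}
```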

The feature appears to be rolling out only to the iPhone XS and iPhone XS Max in the current beta. It will presumably reach the general public when iOS 13 officially goes live, likely sometime this fall.

Apple has been introducing more and more features centered on automatically altering images. It has given its cameras tools like Smart HDR, which analyzes and composites multiple frames to produce the "best" shot, and automatic stabilization that reduces the effect of shaky hands. Usually, these tools are optional, although you may need to dig around in your device's settings to make sure they are off rather than on by default.
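
To make "composites multiple frames" concrete, here is a toy Swift sketch that merges bracketed exposures by favoring well-exposed pixels. It only illustrates the general idea; Apple's actual Smart HDR pipeline (frame alignment, scene segmentation, per-region fusion) is proprietary and far more sophisticated, and every name below is invented.

```swift
// Toy multi-frame merge: each frame is a flat array of normalized
// luminance values in 0...1. Pixels near clipping (0 or 1) get low
// weight, so well-exposed samples dominate the composite.
func mergeFrames(_ frames: [[Float]]) -> [Float] {
    guard let first = frames.first else { return [] }
    var sums = [Float](repeating: 0, count: first.count)
    var weights = [Float](repeating: 0, count: first.count)
    for frame in frames {
        for (i, value) in frame.enumerated() {
            // 1 at mid-gray, approaching 0 at pure black or white.
            let w = max(1e-3, 1 - abs(2 * value - 1))
            sums[i] += w * value
            weights[i] += w
        }
    }
    return zip(sums, weights).map { $0 / $1 }
}
```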

It's a slick application of Apple's augmented reality tools, which are admittedly impressive and powerful. But how many people are clamoring for this feature? Is there any real benefit to making it seem like we're staring into each other's windows to the soul when we FaceTime?

How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly.

Notice the warping of the line across both the eyes and nose. pic.twitter.com/U7PMa4oN
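
To make "adjusts the eyes accordingly" concrete, the hypothetical sketch below computes the rotation an eye-region warp would need so that eyes aimed at the screen appear aimed at the camera. The function and coordinate assumptions are invented for illustration; nothing here is confirmed about Apple's implementation.

```swift
import simd

// Hypothetical geometry sketch (names invented): given where the eyes
// actually point and where the camera sits, both in the face anchor's
// coordinate space, find the corrective rotation a warp would apply.
func gazeCorrection(lookAtPoint: simd_float3, cameraPosition: simd_float3) -> simd_quatf {
    let actual = simd_normalize(lookAtPoint)      // current gaze direction
    let desired = simd_normalize(cameraPosition)  // direction toward the lens
    return simd_quatf(from: actual, to: desired)  // shortest rotation between them
}
```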

Source: Ars Technica