Apple’s new iPhones launch this week, and unlike last year, every one of the new devices comes equipped with the TrueDepth sensor array originally found in the iPhone X. Most consumers interested in Apple’s products know that sensor array drives Face ID (an authentication method that lets you log into your phone just by showing it your face) and Animoji, the 3D animated characters in Messages that mirror your facial expressions.
But Apple and the developers who make apps for its platforms have more applications planned for the 3D-sensing tech, and consumers might not be aware of them. In this video, Ars Technica’s Valentina Palladino and iOS app developer Nathan Gitter talk about how TrueDepth works, what exciting things it might be used for in the future, and what privacy and security concerns users should watch out for.
Gitter made a game for iPhones called Rainbrow that you play by moving your eyebrows. He talks through which existing applications of the tech excite him and which ones he’s most looking forward to as more developers tap into the system. Specifically, that means accessibility features and sentiment analysis, either in apps or (and this is where users have to be cautious) in advertising.
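For the curious, a game like Rainbrow can read eyebrow movement through ARKit’s face-tracking blend shapes, which report values from 0 (relaxed) to 1 (fully expressed) for dozens of facial movements. The sketch below is illustrative, not Rainbrow’s actual code: the helper function and its thresholds are assumptions, showing only how a brow-raise value might map to one of three on-screen rows.

```swift
import Foundation

// In a real app, an ARSessionDelegate receives ARFaceAnchor updates, and
// faceAnchor.blendShapes[.browInnerUp] yields a value between 0 and 1.
// Here we simulate that value and map it to a row index. The thresholds
// are hypothetical, chosen only for this sketch.
func rowForBrowRaise(_ browInnerUp: Float) -> Int {
    switch browInnerUp {
    case ..<0.2: return 2  // brows relaxed: bottom row
    case ..<0.6: return 1  // neutral: middle row
    default:     return 0  // brows raised: top row
    }
}
```

A production app would read this value roughly 60 times per second from the session delegate and smooth it to avoid jitter.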
Apple’s App Store policies forbid developers from using the technology for ads, but in the long run, you’ll see tech like this in places besides Apple’s store. And developers can still ask you for access to your face data for their own use. TrueDepth is cool technology, but as always, you should be careful about what you opt into.
If you’re already well-versed in how this technology works and what its applications are, great. But if not, check out the video—Valentina and Nathan explain it succinctly for a wide audience.
Listing image by Valentina Palladino