The iPhone X’s front-facing TrueDepth sensor array could be used for more than just Face ID authentication, and it fits neatly into Apple’s broader push into augmented reality on the iPhone, but the iPhone X’s rear camera still uses a combination of motion sensors and two rear cameras for AR. That could change in next year’s iPhone; sources cited by Bloomberg claim that Apple plans to add 3D camera technology to the rear of next year’s iPhone in addition to the TrueDepth array already on the iPhone X’s front.
The rear camera might not use the same technology as the TrueDepth sensor array used for Face ID on the front of the iPhone X, however. Rather, the rear array might use time-of-flight sensors, which would map objects in 3D space by calculating how long it takes for light from a laser to bounce off of an object in its field of view. Bloomberg’s sources say that adoption of this technology is not certain, but it appears to be what Apple is testing right now. The technology is in development at Sony, Panasonic, Infineon Technologies, and STMicroelectronics.
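Bloomberg’s report doesn’t include technical specifics, but the time-of-flight principle it describes reduces to a simple calculation: distance is the round-trip travel time of a laser pulse multiplied by the speed of light, divided by two. A minimal sketch (the function name and timing value below are illustrative, not from the report):

```python
# Time-of-flight ranging: a sensor emits a laser pulse, measures how long
# it takes to bounce back, and converts that time into a distance.
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def tof_distance_m(round_trip_seconds: float) -> float:
    """Distance to an object, given the pulse's round-trip time in seconds."""
    # Divide by 2 because the pulse travels to the object and back.
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# A pulse returning after roughly 6.67 nanoseconds implies an object
# about one meter from the sensor.
print(round(tof_distance_m(6.67e-9), 2))
```

The nanosecond-scale timings involved are why this approach needs dedicated sensor hardware rather than an ordinary camera.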
In the iPhone X, Apple aligned the telephoto and wide-angle lens cameras on the back vertically (instead of horizontally, as on the iPhone 8 Plus) to make augmented reality applications more effective. But without a more advanced way to read and track 3D space, AR apps will remain limited. Unlike more robust hardware like Microsoft’s HoloLens, the current iPhones’ rear cameras can’t deal well with surfaces that aren’t flat. They can’t even track when an object is obstructing the camera’s view; current iPhone AR apps place a virtual object in space relative to a flat surface but can’t partially obscure it behind a real-world obstacle, for example.
The addition of 3D sensors to the rear of the iPhone would address those limitations, allowing for much more realistic (and in some cases, more useful) AR experiences.
Apple CEO Tim Cook has been aggressively promoting AR to both consumers and investors. In a recent interview with The Independent, Cook said that he expects the adoption and impact of AR to be as dramatic as that of mobile apps when the Apple App Store launched more than nine years ago. There are also reports that Apple is working on an AR headset in a company group called T288, which has already produced ARKit, Apple’s AR software toolset for app developers.
The AR app marketplace is nascent now, but Apple clearly wants AR to be more meaningful than Pokémon Go and a neat IKEA furniture shopping app. Even Warby Parker’s impressive glasses-fitting app is just a hint of what might come later.
But if next year’s iPhone adds this rear-facing technology, fragmentation of Apple’s installed base could be a challenge; between the 2018 iPhone, the iPhone X, the iPhone 8 series, and older ARKit-supported iPhones like the iPhone 6S and 7, Apple and third-party app developers will have to support four different AR hardware configurations. The prospects for AR are full of promise, but it’s going to be a bit messy realizing them.