iPhone 9 may come with a 3D camera function, ready for the AR era
According to foreign media reports, Apple will add a rear-mounted 3D sensor array to the new iPhone and plans to buy the laser elements for it from Lumentum, the California-based company that already supplies the front-facing TrueDepth lasers used in current iPhones. Apple engineers have reportedly been working on rear-mounted 3D cameras for two years and currently plan to release at least one product with such a camera in the second half of this year.
Of course, Apple isn't alone in considering this feature for its 2020 flagship phones. The Galaxy S20+ and S20 Ultra, which Samsung released just last month, carry rear-mounted time-of-flight (ToF) sensors. They power Live Focus (an optional background-blur effect in photos) and Quick Measure (which lets users measure objects in front of the camera).
However, Apple has made greater strides in developing application programming interfaces and tools for third-party developers and its own software teams to create new experiences and features.
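To give a concrete sense of what those developer tools look like, the sketch below shows how an app can opt into per-frame depth through ARKit's scene-depth frame semantics, the mechanism ARKit exposes on devices that carry a rear depth sensor (ARKit 4 / iOS 14 and later). It is an illustrative sketch, not a description of the rumored hardware; the class name DepthSessionDriver and the logging are placeholders.

```swift
import ARKit
import CoreVideo

/// Illustrative sketch: reading per-frame depth through ARKit's scene-depth
/// frame semantics on devices that carry a rear depth sensor.
final class DepthSessionDriver: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Not every device exposes scene depth; check before enabling it.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("Scene depth is not supported on this device.")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(configuration)
    }

    // Delivered once per camera frame; sceneDepth carries a depth map
    // aligned with the captured image.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depth = frame.sceneDepth else { return }
        let map = depth.depthMap  // CVPixelBuffer of 32-bit float distances in metres
        print("Depth map: \(CVPixelBufferGetWidth(map)) x \(CVPixelBufferGetHeight(map))")
    }
}
```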
In 2017, Apple introduced a similar front-facing TrueDepth array on the iPhone X. Its key feature is Face ID, the facial-recognition system that verifies a scan of the user's face before unlocking access to personalized files and services on the device.
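On the developer side, apps never see the TrueDepth scan itself; they ask the system to run the check through Apple's LocalAuthentication framework and get back only a pass-or-fail answer. A minimal sketch, assuming a hypothetical helper named unlockPersonalFiles:

```swift
import Foundation
import LocalAuthentication

/// Minimal sketch: gating access to on-device content behind the biometric
/// check. The app never sees the face scan, only a pass-or-fail result.
/// (A real app also declares NSFaceIDUsageDescription in its Info.plist.)
func unlockPersonalFiles(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Confirm the device can run a biometric policy (Face ID or Touch ID).
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)
        return
    }

    // The system presents and evaluates the scan; the closure gets the verdict.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your personal files") { success, _ in
        DispatchQueue.main.async { completion(success) }
    }
}
```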
However, this technology still has a lot of untapped potential. Apple may want to put these sensors on the back of the iPhone to inspire new and useful applications, while also preparing for the arrival of augmented reality (AR).
Apple CEO Tim Cook has previously said he believes augmented reality will, in some ways, be a watershed moment comparable to the launch of the App Store. There is plenty of evidence that Apple has been developing AR glasses in-house. Earlier this week, 9to5Mac (a site dedicated to Apple news) claimed to have found evidence of a new AR app in leaked iOS 14 code, along with signs that new iPad Pro models will also feature a rear ToF sensor. The new AR app would reportedly let users pull up information about products and other items in the surrounding space.
So far, augmented reality apps on the iPhone have estimated depth from the parallax between the multiple conventional cameras on the back of recent iPhones, and none of that is as accurate as a newer-generation depth-sensor array.
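The accuracy gap follows from the geometry of stereo depth estimation: with two cameras, depth comes out of the pinhole relation depth = focal length × baseline / disparity, so a small error in the measured disparity (over a phone's short camera baseline) becomes a large error in depth, whereas a ToF sensor times the emitted light directly. A rough sketch with hypothetical numbers:

```swift
import Foundation

/// Sketch of the pinhole stereo relation behind multi-camera depth estimation:
/// depth = (focal length × baseline) / disparity. Small disparity errors over a
/// phone's short baseline turn into large depth errors at range.
func stereoDepth(focalLengthPixels: Double,
                 baselineMeters: Double,
                 disparityPixels: Double) -> Double? {
    guard disparityPixels > 0 else { return nil }  // no parallax, no depth estimate
    return focalLengthPixels * baselineMeters / disparityPixels
}

// Hypothetical numbers: a 1,500-pixel focal length, a 1 cm lens baseline and a
// 10-pixel disparity place the subject roughly 1.5 m away; if the disparity is
// off by a single pixel, the estimate shifts by well over a decimetre.
if let depth = stereoDepth(focalLengthPixels: 1500,
                           baselineMeters: 0.01,
                           disparityPixels: 10) {
    print(String(format: "Estimated depth: %.2f m", depth))
}
```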
Introducing these more precise tools will certainly improve AR on iPhones, but it may not be enough to trigger the watershed moment Cook predicted. Using AR on a phone is still a bit awkward, and it may take a developer gold rush to make AR glasses a success.