Last week, media reports revealed that Apple is working with Sony, and we can expect the Japanese tech company’s camera sensors to appear in upcoming iPhones. Sony’s Satoshi Yoshihara, head of the sensor division, has now shed more light on the company’s 3D camera sensors and how they could be adopted in AR applications.
The face recognition technology used by most manufacturers projects a grid of invisible dots to map a user’s face in 3D. Sony’s technology differs drastically in that regard: it fires invisible laser pulses and measures how long they take to reflect back, building a depth map of the scene. The company revealed that the technique can work on faces up to 16 feet away.
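For readers curious about the underlying math, what the paragraph describes is essentially a time-of-flight measurement: depth is derived from how long a laser pulse takes to reach a surface and bounce back. The snippet below is a minimal illustrative sketch of that relationship; the function name and example numbers are our own and not part of Sony’s actual implementation.

```python
# Illustrative time-of-flight depth calculation (hypothetical example,
# not Sony's implementation).

SPEED_OF_LIGHT = 299_792_458  # metres per second

def depth_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to a surface, given the laser pulse's round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A surface roughly 16 feet (about 4.9 m) away returns the pulse
# in roughly 32.5 nanoseconds.
print(depth_from_round_trip(32.5e-9))  # ≈ 4.87 metres
```

A depth-sensing camera repeats this measurement for every pixel of its sensor, which is what turns the timing data into a full depth map of a face or scene.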
Phone cameras have come a long way in image and video quality, but accurate depth sensing is still lacking on current-generation smartphones. Sony’s new depth-sensing technology is far more advanced than anything we have seen so far, and it could find its way not only into facial recognition but also into drones, self-driving cars, robotics and more.
Sony’s new 3D depth-sensing technology could not only make facial recognition far more accurate than it is today, but also reduce the number of parts that go into a phone’s camera module, allowing handsets to become even sleeker. Both iOS and Android are heavily invested in augmented reality, so we can see Sony’s technology being adopted in AR apps as well.
In 2017, there were reports of Apple working on its own 3D camera sensors, but it now seems the tech giant is partnering with Sony instead. Apple has not yet confirmed a deal between the two companies, saying only that it is in negotiations with suppliers.
What do you think of Sony’s new depth-sensing camera technology? Let us know in the comments below, and don’t forget to follow us on Facebook and Twitter.