(iTers News) - Capturing objects with a 3D depth camera lens and sensor gives software a far better reference for building accurate digital representations of the real world, which is exactly what augmented reality, or AR, applications need. As 3D depth sensors become the norm on a growing number of mobile devices, AR developers stand a better chance of creating more powerful, life-like AR apps.


Metaio, a maker of augmented reality (AR) software, leads the pack, unveiling a version of the Metaio SDK that supports the new 3D depth-sensing cameras.


“Smartphones and tablets have historically made use of single, “2D” cameras primarily intended for image capture, but as smart devices have become more powerful, we are demanding more and more from the optics of these devices,” said Peter Meier, CTO and co-founder of Metaio.


“Recent announcements from the likes of Google and Intel indicate new devices are hitting the market that can “see” the world in 3D via what are known as RGB-D (red, green, blue + depth) sensors,” Meier continued. “With the ability to understand depth information, mobile devices will become significantly more powerful when it comes to Augmented Reality and computer vision tasks.”
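For context, the sketch below is purely illustrative Python, not code from Metaio, Google or Intel, and its array shapes and camera parameters are made-up assumptions. It shows the kind of data an RGB-D sensor adds: a per-pixel depth map aligned with the color image, which lets any pixel be back-projected to a 3D point in front of the camera.

```python
import numpy as np

# A minimal sketch of an RGB-D frame: a color image plus an aligned
# depth map giving the distance to the real scene at every pixel.
# Resolution, depth values and intrinsics below are placeholder assumptions.
height, width = 480, 640
rgb = np.zeros((height, width, 3), dtype=np.uint8)            # red, green, blue per pixel
depth = np.full((height, width), 2.0, dtype=np.float32)       # distance in meters per pixel

fx = fy = 525.0                # assumed focal lengths in pixels
cx, cy = width / 2, height / 2 # assumed principal point

def pixel_to_point(u, v):
    """Back-project pixel (u, v) to a 3D point in camera coordinates."""
    z = depth[v, u]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])
```

It is this extra per-pixel geometry, rather than the color image alone, that lets computer vision and AR software reason about the shape and scale of the surrounding scene.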


A video accompanying the announcement demonstrates how two devices use their built-in depth-sensing cameras to create better digital representations of the physical world in Metaio’s AR apps. The two devices are a Windows tablet PC and an iPad fitted with the new Structure Sensor from Occipital.


Occipital’s Structure Sensor is one of the first devices to be supported in the Metaio SDK.


Occipital CEO and co-founder Jeff Powers said, “We share a common goal with Metaio of allowing developers to create powerful and convincing 3D and AR experiences. It's why we created the Structure Sensor. Metaio’s support of the Structure Sensor and SDK will bring added realism to AR experiences with real world scale and occlusion."


The video illustrates how added 3D sensors make AR much more powerful by virtually eliminating the need for markers in many use cases and, even more importantly, by solving the “occlusion” problem, in which virtual content fails to render naturally behind real-world objects. Use cases include Augmented Reality gaming where digital objects react and interact with the physical environment, accurate indoor navigation that does not require a GPS signal, and scanners that can extract 3D models from the environment as easily as shooting traditional video.
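To make the occlusion point concrete, here is a minimal sketch of depth-based compositing, assuming a per-pixel depth map aligned with the camera image; the function name and array layout are hypothetical and are not part of the Metaio SDK.

```python
import numpy as np

def composite_with_occlusion(camera_rgb, sensor_depth, virtual_rgb, virtual_depth):
    """Overlay a rendered virtual object on a camera frame, hiding the parts
    that lie behind real-world surfaces.

    camera_rgb    -- (H, W, 3) color frame from the device camera
    sensor_depth  -- (H, W) measured distance to the real scene, in meters
    virtual_rgb   -- (H, W, 3) rendered color of the virtual object
    virtual_depth -- (H, W) distance of the virtual object per pixel;
                     np.inf where the object does not cover the pixel
    All names and shapes are illustrative assumptions, not an SDK interface.
    """
    # A virtual pixel is visible only where it is nearer than the real surface.
    visible = virtual_depth < sensor_depth
    output = camera_rgb.copy()
    output[visible] = virtual_rgb[visible]
    return output
```

Without the measured scene depth, the virtual object would always be pasted on top of the camera image, which is precisely the unnatural look that depth sensing is meant to eliminate.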


Metaio’s flagship SDK will be updated to version 6.0 this month and will support depth sensor input from devices including the Occipital Structure Sensor and other devices expected to hit the market in early 2015.


“Knowing that the likes of Google and Intel are heavily investing in depth-sensing camera devices, we made sure our SDK is prepared for the next big surge of innovation that this hardware provides developers,” says Peter Meier. Allowing for development on iOS, Android and Windows PC, the Metaio SDK is the most powerful Augmented Reality developer tool on the market, featuring the most advanced object tracking technology available.




 

