
Know how Apple iPhone 12 Pro, iPhone 12 Pro Max can help visually impaired users

ARKit 4 introduces a brand-new Depth API that helps seek detailed depth information.

In what is being regarded as an initiative to alert visually impaired users while they move around, reports indicate that the latest beta version of iOS 14.2 on the Apple iPhone 12 Pro and iPhone 12 Pro Max comes with a feature that can detect how far another person is from the user.

According to a TechCrunch report, this feature to help the visually impaired will be introduced on the iPhone 12 Pro as well as the iPhone 12 Pro Max running iOS 14.2. Moreover, it is ARKit, Apple’s augmented reality (AR) platform for iOS devices, that paves the way for this interesting feature.


ARKit 4 introduces a brand-new Depth API, creating a new way to access the detailed depth information gathered by the LiDAR Scanner on iPhone 12 Pro, iPhone 12 Pro Max, and iPad Pro. Location Anchors leverage the higher-resolution data in Apple Maps to place AR experiences at a specific point in the world in your iPhone and iPad apps. And support for face tracking extends to all devices with the Apple Neural Engine and a front-facing camera, so even more users can experience the joy of AR in photos and videos.
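In code, the Depth API is exposed through ARKit’s `frameSemantics` options. A minimal sketch of opting in to per-frame depth data, assuming a LiDAR-equipped device (the `startDepthSession` helper name is ours; the commented delegate method is the standard `ARSessionDelegate` callback):

```swift
import ARKit

// Hypothetical helper: configure a session to receive LiDAR depth data.
func startDepthSession(_ session: ARSession) {
    let config = ARWorldTrackingConfiguration()
    // .sceneDepth is only supported on LiDAR devices
    // (iPhone 12 Pro, iPhone 12 Pro Max, recent iPad Pro).
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }
    session.run(config)
}

// In an ARSessionDelegate, each frame then carries an ARDepthData value:
// func session(_ session: ARSession, didUpdate frame: ARFrame) {
//     if let depth = frame.sceneDepth {
//         let depthMap: CVPixelBuffer = depth.depthMap  // per-pixel distances in meters
//     }
// }
```

That per-pixel distance map is what makes a person-proximity feature like the one described above possible.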

In the sample code “Occluding Virtual Content with People”, under “Disable People Occlusion”, Apple revealed that “You might choose to disable people occlusion for performance reasons if, for example, no virtual content is present in the scene, or if the device has reached a serious or critical thermalState. To temporarily disable people occlusion, remove that option from your app’s frameSemantics. Then, rerun your session to effect the configuration change.”
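The quoted guidance can be sketched as follows; this is our illustration, not Apple’s verbatim sample, and the `updateOcclusion` helper name is an assumption:

```swift
import ARKit

// Hypothetical helper: drop people occlusion when the device is under
// thermal pressure, then re-run the session to apply the change.
func updateOcclusion(for session: ARSession,
                     configuration: ARWorldTrackingConfiguration) {
    let thermal = ProcessInfo.processInfo.thermalState
    if thermal == .serious || thermal == .critical {
        // Remove the option from frameSemantics to disable people occlusion.
        configuration.frameSemantics.remove(.personSegmentationWithDepth)
    }
    // Re-running the session effects the configuration change.
    session.run(configuration)
}
```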

The tech giant further added that, by default, virtual content covers anything in the camera feed. For example, when a person passes in front of a virtual object, the object is drawn on top of the person, which can break the illusion of the AR experience. To cover your app’s virtual content with people that ARKit perceives in the camera feed, you enable people occlusion; your app can then render a virtual object behind people who pass in front of the camera. ARKit accomplishes the occlusion by identifying regions in the camera feed where people reside and preventing virtual content from drawing into that region’s pixels.
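Enabling people occlusion is the mirror image of disabling it: insert the frame-semantic option before running the session. A minimal sketch, assuming a device that supports person segmentation with depth:

```swift
import ARKit

let session = ARSession()
let config = ARWorldTrackingConfiguration()

// Enable people occlusion: ARKit will keep virtual content from drawing
// over pixels where it detects a person, using depth to order them correctly.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    config.frameSemantics.insert(.personSegmentationWithDepth)
}
session.run(config)
```

With this set, the renderer (RealityKit or ARView-based apps handle it automatically) draws virtual objects behind people who pass in front of the camera.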

Furthermore, it was revealed that AR content realistically passes behind and in front of people in the real world, making AR experiences more immersive while also enabling green screen-style effects in almost any environment. Depth estimation improves on iPhone 12, iPhone 12 Pro, and iPad Pro in all apps built with ARKit, without any code changes.

ARKit 4 also helps detect up to 100 images at a time and get an automatic estimate of the physical size of the object in the image. 3D object detection is more robust, as objects are better recognized in complex environments. And now, machine learning is used to detect planes in the environment even faster.
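Image detection and automatic size estimation are configured on the same tracking configuration. A sketch, assuming a reference-image group named “AR Resources” exists in the app’s asset catalog (that group name is our assumption):

```swift
import ARKit

let session = ARSession()
let config = ARWorldTrackingConfiguration()

// Load reference images from the asset catalog; ARKit 4 supports
// detecting up to 100 images at a time.
if let images = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                 bundle: nil) {
    config.detectionImages = images
    // Ask ARKit to estimate the physical size of each detected image.
    config.automaticImageScaleEstimationEnabled = true
}
session.run(config)
// Detected images arrive as ARImageAnchor values in the session delegate,
// each carrying an estimatedScaleFactor when scale estimation is enabled.
```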

