APPLE HIGHLIGHTS AUGMENTED REALITY AND MORE AT WWDC 2020

Apple’s 2020 Worldwide Developers Conference wrapped up last Friday, and with it came the latest and greatest the tech giant has in store for us consumers. As futurists, we at Gravity Jack get fired up about any and every piece of emerging technology, so naturally the announcement of ARKit 4 caught our attention. Several new features add to the sophistication of augmented reality apps, further expanding use cases and delivering more realistic experiences as a whole. So without further ado, here are the augmented reality highlights from WWDC 2020.

LOCATION ANCHORS

AR models can now be affixed to specific points in the real world. And when we say specific, we’re talking exact latitude, longitude, and elevation. By giving a model precise coordinates, users get a new level of consistency: the AR content appears at the same spot every time. The new ARGeoTrackingConfiguration lets us scale AR experiences up to entire cities, encompassing landmarks, monuments, and other tourist attractions. Definitely a key feature for AR’s growth down the line.
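To give a feel for how this works in practice, here’s a minimal Swift sketch using ARKit 4’s geotracking APIs. The coordinates, the scene view, and the function name are placeholders for illustration, not part of any shipping app:

```swift
import ARKit
import CoreLocation

// A minimal sketch: run a geotracked session and pin an anchor to real-world
// coordinates. Geotracking only works on supported devices and in supported regions.
func startGeoTrackedSession(in sceneView: ARSCNView) {
    guard ARGeoTrackingConfiguration.isSupported else { return }

    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available, error == nil else { return }

        DispatchQueue.main.async {
            sceneView.session.run(ARGeoTrackingConfiguration())

            // Anchor content to an exact latitude, longitude, and altitude (placeholder values).
            let coordinate = CLLocationCoordinate2D(latitude: 47.6205, longitude: -122.3493)
            let geoAnchor = ARGeoAnchor(coordinate: coordinate, altitude: 50)
            sceneView.session.add(anchor: geoAnchor)
            // Attach the 3D model in the ARSCNViewDelegate's renderer(_:didAdd:for:) callback.
        }
    }
}
```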

DEPTH & LIDAR

The latest iPad Pro model truly earns its “Pro” status, most notably gaining a new LiDAR sensor for more accurate environment mapping. Similar to how sonar uses sound, a LiDAR (Light Detection and Ranging) sensor fires pulses of laser light at its surroundings and measures how quickly each pulse is reflected back. The farther away an object is, the longer the light takes to return. From those timings, LiDAR builds a detailed depth map of its environment. The technology has historically been used for large-scale mapping, such as when NASA measured Earth’s atmosphere and mapped Mars and other parts of the solar system.

So why is this such a breakthrough for augmented reality? Because previous mobile AR experiences had to infer depth from the device’s camera feed alone, working with a limited viewing angle and rough estimates of where surfaces sit. By generating an accurate depth map with LiDAR, the iPad Pro can precisely place AR models within its environment instead of relying on a rough camera snapshot. The end result: digitally generated content sits realistically in place and doesn’t shift as the user walks around.
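Here’s a rough sketch of how an app can tap into that depth data using ARKit 4’s scene depth frame semantics. The class name and session setup are illustrative assumptions rather than a complete app:

```swift
import ARKit

// A minimal sketch, assuming a LiDAR-equipped device (e.g. the 2020 iPad Pro):
// enable scene depth and read the per-frame depth map from the sensor.
class DepthReceiver: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only supported on devices with the LiDAR sensor.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }

        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)

        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // depthMap is a CVPixelBuffer of per-pixel distances (in meters) from the device.
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("Received a \(width)x\(height) depth map")
    }
}
```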

FACE & HAND TRACKING

With the introduction of ARKit 4 comes expanded support for face tracking technology. All devices with the A12 Bionic chip or later (like the 2nd generation iPhone SE) now have access to face tracking. Furthermore, Apple’s Vision framework now features tracking for fingers and hands, enabling recognition of complex gestures. Increased recognition means more possibilities for realistic expression, such as motion capture for 3D models.
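As a taste of the new hand tracking support, here’s a minimal Vision sketch that pulls the index fingertip out of a camera frame. The function name and the source of the pixel buffer (for example, an ARFrame’s captured image) are assumptions for illustration:

```swift
import Vision
import CoreVideo

// A minimal sketch: detect hand pose landmarks in a single frame with Vision's
// VNDetectHumanHandPoseRequest (new in iOS 14).
func detectHandPose(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 2

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, orientation: .up)
    do {
        try handler.perform([request])
        guard let observation = request.results?.first else { return }

        // Each observation exposes named joints; here we read the index fingertip.
        let indexTip = try observation.recognizedPoint(.indexTip)
        if indexTip.confidence > 0.5 {
            print("Index fingertip at normalized location \(indexTip.location)")
        }
    } catch {
        print("Hand pose detection failed: \(error)")
    }
}
```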

CLOSING THOUGHTS

AR continues its march forward, and with support from giants like Apple, the industry’s future looks as bright as ever. We at Gravity Jack welcome these new additions with open arms, as many of our projects will benefit from enhanced mapping and detection. As new players step in, who knows which barriers we’ll be breaking next? Want to start a conversation? Give us a call anytime!