Apple Wins Patent For Augmented Reality Mapping System For iPhone


We’ve known for quite a while that Apple has been working hard on bringing augmented reality features to its products. This morning, Apple was finally granted a patent for “Augmented reality maps”.

As published by the U.S. Patent and Trademark Office, Apple’s U.S. Patent No. 9,488,488 for “Augmented reality maps” describes a mapping app that taps into the iPhone’s advanced sensor suite to present users with real-time augmented views of their surrounding environment.

A related patent, titled “Visual-Based Inertial Navigation” and acquired from Flyby Media Inc. and the Regents of the University of Minnesota, covers indoor mapping services. The technology supplements and complements the indoor-positioning work Apple acquired from startup WiFiSLAM back in 2013. It describes indoor mapping delivered on a smartphone or headset using AR techniques, including the unique guidance Virtual Path Indicators shown in our cover graphic. The accuracy of visual-inertial navigation is down to mere centimeters.

Today’s patent notes that “Visual-based inertial navigation systems rely on information obtained from images and inertial measuring devices in order to achieve localization and/or mapping. Since visual-based inertial navigation systems do not require signals from GPS or cell towers, such systems may be used indoors where GPS and cell signals cannot reach or are unavailable due to interference.

“Furthermore, visual-based inertial navigation systems enable very high position accuracy, e.g., on the order of centimeters. However, visual-based inertial navigation systems are typically computationally intensive as they need to process large amounts of image data acquired from an image detector, such as a camera, and inertial readings generated by an inertial measurement unit (IMU), all in real-time. In addition, to achieve highly accurate measurements of position, a history of information related to previous poses (positions and orientations), inertial measurements and image features is typically stored, thus requiring devices to use a substantial amount of memory and consequently large computation time to process this information.”
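As an aside for developers: Apple’s own ARKit framework, which shipped after this patent was granted, exposes exactly this kind of camera-plus-IMU fusion as world tracking. The sketch below illustrates visual-inertial pose tracking with that public API; it is not code from the patent.

```swift
import ARKit

// Illustrative only: ARKit's world tracking fuses camera frames with IMU
// readings to produce 6-DoF poses, the same camera + inertial approach the
// patent describes. (ARKit postdates the patent and is not named in it.)
final class PoseTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        session.run(ARWorldTrackingConfiguration()) // camera + IMU fusion
    }

    // Called once per fused frame (camera image plus inertial estimate).
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let pose = frame.camera.transform   // 4x4 camera pose in world space
        let position = pose.columns.3       // translation, in meters
        print("x: \(position.x) y: \(position.y) z: \(position.z)")
    }
}
```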

Using this sensor information, the app can intelligently approximate what the live video scene shows, including the camera’s field of view. From there, the captured image, or live video feed, is visually augmented with mapping information downloaded from an offsite server.
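The patent doesn’t say which APIs perform this estimation, but the public frameworks expose the same inputs. A minimal sketch, assuming Core Location supplies position and heading while AVFoundation reports the camera’s field of view:

```swift
import AVFoundation
import CoreLocation

// Hypothetical pairing of public APIs with the patent's inputs:
// position + compass heading from Core Location, and the camera's
// horizontal field of view from AVFoundation.
let locationManager = CLLocationManager()
locationManager.requestWhenInUseAuthorization()
locationManager.startUpdatingLocation()   // where the device is
locationManager.startUpdatingHeading()    // which way it is pointing

if let camera = AVCaptureDevice.default(for: .video) {
    // Horizontal field of view of the active format, in degrees.
    let fov = camera.activeFormat.videoFieldOfView
    print("Horizontal FOV: \(fov)°")
}
```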

Certain embodiments offer simple POI identification, for example the names of buildings and monuments, though the system allows for more complex computations. In some examples, Apple’s invention generates interactive route guidance, complete with onscreen path indicators overlaid onto the live video stream. The method can also pull data from the cloud to alert users that, for example, they are heading down a one-way street.
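To make the overlay step concrete, here is a hypothetical sketch of the projection math such a feature needs: given the device’s location, compass heading, and camera field of view, place a geotagged POI label along the screen’s horizontal axis. All names and the formula choice are ours, not Apple’s:

```swift
import Foundation
import CoreLocation

// Illustrative POI type; not an Apple API.
struct POI {
    let name: String
    let coordinate: CLLocationCoordinate2D
}

/// Initial great-circle bearing from `from` to `to`, in degrees clockwise
/// from true north.
func bearing(from: CLLocationCoordinate2D, to: CLLocationCoordinate2D) -> Double {
    let lat1 = from.latitude * .pi / 180
    let lat2 = to.latitude * .pi / 180
    let dLon = (to.longitude - from.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    return (atan2(y, x) * 180 / .pi + 360).truncatingRemainder(dividingBy: 360)
}

/// Screen x-position (0...screenWidth) for a POI, or nil when it falls
/// outside the camera's horizontal field of view.
func screenX(for poi: POI, deviceLocation: CLLocationCoordinate2D,
             heading: Double, horizontalFOV: Double,
             screenWidth: Double) -> Double? {
    var offset = bearing(from: deviceLocation, to: poi.coordinate) - heading
    if offset > 180 { offset -= 360 }   // normalize to -180...180
    if offset < -180 { offset += 360 }
    guard abs(offset) <= horizontalFOV / 2 else { return nil }  // off-screen
    return (offset / horizontalFOV + 0.5) * screenWidth
}
```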

Depending on device orientation, the system can switch display formats from live video to a bird’s-eye view, making transitions into and out of AR mode seamless.
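The patent doesn’t specify the trigger, but a natural one is the device’s pitch: held flat for the top-down map, raised toward the horizon for live AR. A sketch using Core Motion, with the threshold as our own assumption:

```swift
import CoreMotion

// Assumption: raise the phone past ~45° of pitch to enter AR mode; lay it
// flat to fall back to the bird's-eye map. The threshold is illustrative.
let motion = CMMotionManager()
motion.deviceMotionUpdateInterval = 1.0 / 30.0
motion.startDeviceMotionUpdates(to: .main) { data, _ in
    guard let attitude = data?.attitude else { return }
    let pitchDegrees = attitude.pitch * 180 / .pi
    let showAR = pitchDegrees > 45   // raised: live camera with overlays
    print(showAR ? "AR mode" : "Bird's-eye map mode")
}
```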
