Motion tracking – where the phone is relative to the real world
Environmental understanding – detects the size and location of surfaces
Light estimation – what the current lighting conditions are
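Light estimation, for example, gives an app a per-frame brightness value it can use to shade virtual objects so they blend with the real scene. Here is a minimal, self-contained sketch of that idea in Java; the `applyAmbientIntensity` helper and the 0.0–1.0 intensity range are illustrative assumptions, not the ARCore API itself.

```java
// Illustrative sketch: dimming a virtual object's base color with a scalar
// ambient-light estimate, similar in spirit to what a light-estimation API
// reports each frame. This helper is hypothetical, not part of ARCore.
public class AmbientLightSketch {
    // Scale an RGB color (components in 0..255) by an ambient intensity in 0.0..1.0.
    static int[] applyAmbientIntensity(int[] rgb, float intensity) {
        int[] out = new int[3];
        for (int i = 0; i < 3; i++) {
            out[i] = Math.round(rgb[i] * intensity);
        }
        return out;
    }

    public static void main(String[] args) {
        int[] baseColor = {200, 180, 160}; // virtual asset's base color
        float ambientIntensity = 0.5f;     // e.g. a dim indoor scene
        int[] lit = applyAmbientIntensity(baseColor, ambientIntensity);
        System.out.println(lit[0] + "," + lit[1] + "," + lit[2]); // prints 100,90,80
    }
}
```

In a real AR app this scaling would happen per frame, since lighting conditions change as the user moves the phone.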
The phones initially supported are the Samsung Galaxy S7 onwards, the LG V30/V30+ (with Oreo), Google Pixel (all models) and the OnePlus 5. Other manufacturers including Huawei, Motorola, ASUS, Nokia, ZTE, Sony, vivo, and Xiaomi are on board too.
ARCore 1.0 features improved environmental understanding that enables users to place virtual assets on textured surfaces like posters, furniture, toy boxes, books, cans and more. Android Studio Beta now supports ARCore in the Emulator, so you can quickly test your app in a virtual environment right from your desktop.
Google has partnered with a few great developers to showcase how they’re planning to use AR in their apps.
Snapchat has created an immersive experience that invites you into a “portal”—in this case, FC Barcelona’s legendary Camp Nou stadium.
Visualize different room interiors inside your home with Sotheby’s International Realty.
See Porsche’s Mission E Concept vehicle right in your driveway and explore how it works.
With OTTO AR, choose pieces from an exclusive set of furniture and place them, true to scale, in a room.
In China, place furniture and over 100,000 other pieces with Easyhome Homestyler, or see items and place them in your home when you shop on JD.com.
Play games from NetEase, Wargaming and Game Insight.
And Lens, powered by AI and computer vision, makes it easier to search and take action on what you see.
With Lens in Google Photos, when you take a picture, you can get more information about what's in your photo. In the coming weeks, Lens will be available to all Google Photos English-language users who have the latest version of the app on Android and iOS. Also over the coming weeks, English-language users on compatible flagship devices will get the camera-based Lens experience within the Google Assistant. We'll add support for more devices over time.