June 28, 2018
A Peek Inside the Niantic Real World Platform

When we built Niantic, we structured the company’s mission around three core values: exploration, exercise, and real-world social interaction. But not in our wildest dreams did we imagine the kind of positive impact our Augmented Reality (AR) experiences would have on our players and the communities where they live. We feel lucky that we’ve been able to combine our passion for technology and gaming to create innovative experiences for all ages.

Today, we are offering a preview of the technology we have been developing: the Niantic Real World Platform. This is the first time we’ve given an update of this nature publicly, and I’m confident it will provide a sense of how committed we are to the future of AR, and to furthering the type of experiences we have pioneered.

Over the past year, we have made strategic investments in initiatives focused on AR mapping and computer vision. Recently, we announced the acquisition of Escher Reality, whose team is contributing to our “Planet Scale AR” efforts. And today, we are announcing that we have acquired the computer vision and machine learning company Matrix Mill and established our first London office. It’s through the coordination of these teams that we’ve been able to establish what the Niantic Real World Platform looks like today, and what it will be in the future.

We think of the Niantic Real World Platform as an operating system that bridges the digital and the physical worlds. Building on our collective experience to date, we are pushing the boundaries of geospatial technology, and creating a complementary, interactive real-world layer that consistently brings an engaging experience to users.

Modeling Reality
The Niantic Real World Platform advances the way computers see the world, moving from a model centered on roads and cars to one centered on people. Modeling this people-focused world of parks, trails, sidewalks, and other publicly accessible spaces requires significant computation. The technology must resolve minute details, digitize these places precisely, and model them in an interactive 3D space that a computer can read quickly and easily.

We are also tackling the challenge of bringing this kind of sophisticated technology to power-limited mobile devices. The highest quality gameplay requires a very accurate “live” model that adapts to the dynamics of the world. It must handle the difficult task of adjusting the model as the environment around the user changes, and as people and their phones move.

Knitting together a cohesive moving perspective is a challenge, but we are focused on solving it through a blend of machine learning and computer vision, all built on top of reliable and scalable infrastructure.

Understanding Reality
Advanced AR requires an understanding of not just how the world looks, but also what it means: what objects are present in a given space, what those objects are doing, and how they are related to each other, if at all. The Niantic Real World Platform is building towards contextual computer vision, where AR objects understand and interact with real-world objects in unique ways: stopping in front of them, running past them, or maybe even jumping into them.


In the GIF above, we illustrate how our computer vision algorithm identifies objects and assigns each a quantifiable confidence score. For example, when our model has been trained on tables and chairs, it can identify them and place them in context within a broader space. Ultimately, this allows those objects to be added to an AR vocabulary. The larger the vocabulary, the more understanding we have, and the richer AR on our Real World Platform can be.
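To make the idea concrete, here is a minimal sketch of this kind of pipeline. It is not Niantic’s detector: it uses an off-the-shelf torchvision model trained on COCO, and the confidence threshold and the small label mapping are assumptions chosen for the table-and-chairs example.

```python
# Illustrative sketch only; Niantic's detector is proprietary. This uses a
# pretrained torchvision model to show the general pattern: label objects,
# keep only high-confidence detections, and grow an "AR vocabulary" from them.
import torch
import torchvision
from PIL import Image
from torchvision import transforms

# Off-the-shelf COCO detector, standing in for Niantic's own model.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)
model.eval()

# A tiny, assumed subset of the COCO label mapping for this example.
LABEL_NAMES = {62: "chair", 63: "couch", 67: "dining table"}

def detect_ar_vocabulary(image_path, confidence_threshold=0.7):
    """Return the set of object names detected above the confidence threshold."""
    image = transforms.ToTensor()(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        detections = model([image])[0]  # dict with "boxes", "labels", "scores"
    vocabulary = set()
    for label, score in zip(detections["labels"], detections["scores"]):
        name = LABEL_NAMES.get(int(label))
        if name is not None and float(score) >= confidence_threshold:
            vocabulary.add(name)
    return vocabulary

# Example: detect_ar_vocabulary("living_room.jpg") might return
# {"chair", "dining table"}, each backed by a score the app can use.
```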

Once we understand the “meaning” of the world around us, the possibilities of what we can layer on top are limitless. We are in the very early days of exploring ideas, testing, and creating demos. Imagine, for example, that our platform can identify and contextualize the presence of flowers; it would then know to make a bumblebee appear. Or, if it can see and contextualize a lake, it would know to make a duck appear.

Recognizing objects means understanding not only what they are, but also where they are. One of the key limitations of AR today is that AR objects cannot interact meaningfully with the 3D space around them. Ideally, AR objects should blend into our reality, seamlessly moving behind and around real-world objects.

This is where our new team in London has focused its research. Using computer vision and deep learning, they are developing techniques to understand 3D space, enabling much more realistic AR interactions than are currently possible.

In the video above, you can see Pikachu weaving through and around different objects in the real world, dodging feet and hiding behind planters. This level of integration with the environment around us is a proof of concept that excites us about the future of AR.
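The exact techniques behind this demo haven’t been published, but the core idea of occlusion can be sketched simply: estimate a depth value for every pixel of the real scene, then only draw the AR pixels that are closer to the camera than the real surface behind them. The sketch below assumes such a depth map is already available (for example, from a monocular depth network), along with the rendered AR layer’s color, depth, and coverage.

```python
# Minimal occlusion-compositing sketch, assuming per-pixel depth is already
# estimated for the real scene. Niantic's actual pipeline is not public; this
# only illustrates the depth test that lets a character hide behind a planter.
import numpy as np

def composite_with_occlusion(camera_rgb, scene_depth, ar_rgb, ar_depth, ar_alpha):
    """Blend a rendered AR layer over the camera frame, hiding AR pixels that
    fall behind real geometry.

    camera_rgb  : (H, W, 3) live camera frame, floats in [0, 1]
    scene_depth : (H, W) estimated depth of the real scene, in meters
    ar_rgb      : (H, W, 3) rendered AR object colors
    ar_depth    : (H, W) depth of the rendered AR object (np.inf where empty)
    ar_alpha    : (H, W) coverage of the AR object, floats in [0, 1]
    """
    # An AR pixel is visible only where the virtual object is closer to the
    # camera than the estimated real surface at that pixel.
    visible = (ar_depth < scene_depth).astype(np.float32)
    alpha = (ar_alpha * visible)[..., None]            # (H, W, 1) blend weight
    return alpha * ar_rgb + (1.0 - alpha) * camera_rgb
```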

Sharing Reality
We’re particularly focused on innovations in AR that serve Niantic’s core mission and values. Today we are revealing new cross-platform AR technology that enables high-performance shared AR experiences.

Multiplayer gameplay requires the components of Niantic’s Real World Platform to work in concert across multiple users. And when you add more people, each with a different perspective, these components must work together precisely to create a visually compelling shared experience. In our research, we’ve found that one of the biggest obstacles is latency: it’s nearly impossible to create a shared reality experience if the timing isn’t perfect.

To address this challenge, we have developed proprietary, low-latency AR networking techniques. Thanks to these techniques, we’ve been able to build a unified, cross-platform solution that enables shared AR experiences from a single code base.
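The networking layer itself is proprietary, but one standard building block for hiding latency in shared AR is worth sketching: timestamp every pose update a client sends, and have receivers render slightly in the past, interpolating between the two most recent updates so remote objects move smoothly despite jitter. The class below is a toy illustration of that idea, not Niantic’s implementation; the 100 ms render delay is an assumed value, and real systems also need clock synchronization between devices.

```python
# Toy latency-hiding sketch for shared AR, not Niantic's implementation.
# Each client timestamps pose updates; receivers render a little in the past
# and interpolate between the last two updates to smooth over network jitter.
import time
from dataclasses import dataclass

@dataclass
class PoseUpdate:
    timestamp: float  # sender's clock, in seconds (assumed synchronized)
    x: float
    y: float
    z: float

class RemoteObject:
    """Tracks one shared AR object owned by another player."""

    def __init__(self, render_delay=0.1):
        self.render_delay = render_delay  # assumed 100 ms of buffering
        self.previous = None
        self.latest = None

    def on_update(self, update):
        """Record a newly received pose update."""
        self.previous, self.latest = self.latest, update

    def pose_at_render_time(self, now=None):
        """Interpolated (x, y, z) at `now - render_delay`, or None if unknown."""
        if self.latest is None:
            return None
        now = time.time() if now is None else now
        target = now - self.render_delay
        if self.previous is None or target >= self.latest.timestamp:
            return (self.latest.x, self.latest.y, self.latest.z)
        span = self.latest.timestamp - self.previous.timestamp
        t = 0.0 if span <= 0 else (target - self.previous.timestamp) / span
        t = min(max(t, 0.0), 1.0)
        lerp = lambda a, b: a + (b - a) * t
        return (lerp(self.previous.x, self.latest.x),
                lerp(self.previous.y, self.latest.y),
                lerp(self.previous.z, self.latest.z))
```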

You can see this in action in the videos below:

This is just a brief glimpse into what we have under the hood of the Niantic Real World Platform. While we are using this technology first for games, it is clear that it will be relevant to many kinds of applications in the future.

Our Real World Platform in the Real World
Because we are so excited about the opportunity in advanced AR, we want other people to be able to make use of the Niantic Real World Platform to build innovative experiences that connect the physical and the digital in ways that we haven’t yet imagined. We will be selecting a handful of third party developers to begin working with these tools later this year, and you can sign up to receive more information here.

Thank you to those who have been on this journey with us, chasing after the future and creating the Niantic Real World Platform. We promise that there is plenty more to come.


—jh

June 28, 2018
Welcoming Matrix Mill to Niantic: Redefining How Machines See and Understand the World


Hello everyone,

Today we are announcing the acquisition of London-based company Matrix Mill and its incredibly talented team, led by computer vision and machine learning experts Gabriel Brostow, Michael Firman, and Daniyar Turmukhambetov. Welcome to Niantic!

The Matrix Mill team comes from University College London, where they spent years building and perfecting deep neural networks that can infer information about the surrounding world from one or more cameras. This technology redefines how machines see and understand the 3D world, and more importantly, how digital objects can interact with the real elements of it.

At Niantic, we frequently talk about how, in order to augment reality, you first need to understand it. The Matrix Mill team has come up with novel ideas that push the boundaries of what machines can process, reasoning about occlusions and seeing the world closer to the way human eyes do. As a result of this work, AR experiences can feel more natural to the eye, a goal we have squarely in our sights.

We’re excited about Matrix Mill joining Niantic, as it significantly advances our computer vision and machine learning efforts. We are committed to investing aggressively in R&D that can enhance the experiences of our users, both today and in the decades to come. The addition of the Matrix Mill team also allows us to continue to deliver on planet-scale AR, and helps us advance the Niantic Real World Platform.

If computer vision, machine learning, and the development of neural networks are up your alley, and you want to apply that passion to applications that entertain hundreds of millions of people around the world, we’d love to hear from you.


—jh

June 8, 2018
Celebrating World Oceans Day with Impact

World Oceans Day, celebrated every year on June 8, was originally proposed as an international day of celebration and awareness in 1992 by the government of Canada, and by 2008 the United Nations had officially recognized it. Since then, people around the world have celebrated the day by focusing attention on the importance of taking care of our oceans.


Just eight weeks ago, Niantic celebrated Earth Day by working with NGOs around the world to host and run 68 cleanup events in 19 countries. These events brought together more than 4,200 dedicated players and collected 6.5 tons of garbage in aggregate (originally counted at almost 5 tons just 24 hours after the event; you can read the full details here). Our specific focus was to remove as much garbage, and in particular plastic, from local waterways as we could. These cleanup sites included not only oceans but also rivers, canals, and other waterways. This diversity of sites matters because even if you don’t live near the ocean, trash ultimately travels to and accumulates in our deepest waters. Everyone did an amazing job, and we’re so impressed with the impact that the players and NGOs had on the environment.

In California, Heirs to Our Oceans (H2OO) is a rising tide of young leaders around the globe who are taking the ocean crisis into their own hands by educating themselves and others, bringing hope and solutions to the surface, and creating waves of change that will ensure the health of our blue planet for their generation and for future generations. H2OO organized four cleanup sites for Earth Day 2018: one in Koror, Palau, and three in California, USA. Their events were led by incredible young people who ran the collections and held educational talks on subjects such as the importance of waterways, the harms of plastic, and the #breakfreefromplastic movement. At the cleanup event in the San Francisco Bay Area, the Pokémon GO community worked alongside the Heirs to collect plastic straws, filling a 5-gallon bucket! Several Heirs are planning to create a dress from the ongoing straw collection to highlight the fact that more than 500 million straws are used per day in the USA alone.


Across the USA and along the East Coast, Plastic Ocean Project works to educate participants through field research, implement progressive outreach initiatives, and incubate solutions to address the global plastic pollution problem, working with and for the next generation to create a more sustainable future. POP organized nine cleanup events for Earth Day 2018 across the USA, including one off the coast of North Carolina. Their cleanup events included plastic and waste collection as well as brand audits to understand which companies are responsible for the products that generate the most plastic waste. In addition, the program raised awareness for Hope Spot Hatteras, an initiative inspired by the work of Mission Blue and Sylvia Earle and focused on this critical offshore ecosystem.

In addition, Umisakura organized a cleanup event for 300 participants in Enoshima, Kanagawa, Japan. They filled up more than 120 bags of litter and also made an effort to collect recyclable cigarette filters. Additionally, the NGO brought together the Pokémon GO community, academics and researchers, environmental experts, fishermen from the Katase fishing port, and local Enoshima residents to plant eelgrass in the sea bed. This particular grass is known to prevent further loss and degradation of marine environments by providing a number of important ecosystem functions, including foraging areas and shelter to young fish and invertebrates, food for migratory waterfowl and sea turtles, and spawning surfaces for fish species. By trapping sediment, eelgrass beds also reduce coastal erosion.

As you can see from these stories, positive impact is greatest when communities come together. We celebrate World Oceans Day not only to show our admiration for our players and the larger community, but also to encourage continued efforts to reduce plastic production, consumption, and ultimately waste. If you’re part of a community that spends time working to reduce the negative impacts that plastic has on our oceans, share it with us by tagging our Niantic social channels and using the #GamesDoingGood hashtag.


—jh