Snapchat's augmented reality lenses can span whole city blocks

The company's latest tools look to open up collaborative, larger-scale AR and more.

Scott Stein, Editor at Large

Snapchat's Local Lenses can paint whole city blocks with AR.


Snapchat, like many companies, is pursuing augmented reality at a fever pitch, with AR-enabled smartglasses on the horizon. The company's newest AR tools, announced at its developer summit Thursday, go even further, bringing world-mapping, collaborative AR capabilities to city-block-sized experiences. They're called Local Lenses, and they promise to transform whole real-life neighborhoods into interactive art pieces.

Snapchat's previous Landmark Lenses worked at specific, iconic locations, such as New York's Flatiron Building, layering augmented reality visuals on top of them for shareable photos and videos. Some Snap Lenses already transform areas with extra effects, like lava or water. Local Lenses will build a map of a larger area in what's called a "point cloud," which will let many people walk around, see and create within the same visually altered version of reality.

Many companies are pursuing collaborative, world-spanning AR at the moment, including Microsoft, Apple, Google and Pokemon Go developer Niantic, and it's clearly the next step on the path toward future smartglasses.

Snapchat is making several other moves into AR and world-scanning: It's opening its face- and world-transforming Lens tools to other apps (MLB was announced as a partner), meaning Snap's wild effects could start appearing in places beyond Snapchat itself.

Additional tools will let Lens creators add their own neural nets and AI to their creations, opening up a whole range of possible reality-bending effects -- or games, shopping tools, world scanning, even works of art. The photo filter company Prisma is one of Snapchat's early machine learning Lens partners, seen in the video above. Snapchat is also adding some Google Lens-like reality scanning to recognize plants and dogs.

But the larger-scale mapping of reality into AR seems like the most surprising development, even though Snapchat hinted at these directions last year.

According to Snap's Camera Platform SVP Eitan Pilipski, Local Lenses are aiming first at privately owned, larger-scale locations, such as baseball stadiums and museums. But Snap's promo video shows people painting and walking through a transformed city street, which is exactly what Local Lenses will allow. Artists could use them for collaborative art, activists for raising awareness, and there could be historical tools, like a city guide.

There's also the question of how many people will have access to these tools, who will curate them... and how collaborations will be moderated.

"For the most part, when we are announcing a new location or new landmark, it's usually associated with a creative direction that we're trying to inspire -- we're working with an artist, we're working with a partner that we basically want to showcase, like a narrative," Pilipski says of how future larger-scale AR will work. "We are ultimately providing tools for our partners to be able to ultimately activate those places." 

Pilipski emphasizes that right now, point clouds of physical spaces will be created for areas that communities have publicly authorized. "At the same time, we want to make sure businesses can also activate their own Landmarkers, because they have the rights to do that."

There's a book I read last year by Tim Maughan called Infinite Detail, which imagined an age where cities were blanketed in layers of augmented reality. Neal Stephenson's 2019 book Fall also imagined worlds where different social bubbles end up subscribing to different AR filters that interpret reality differently. Blending AR with real-life places could raise hard questions: Who has the right to transform a place, and would the result be seen as inspiring or offensive?

"Every place is unique," says Pilipski. "For places we have launched [AR] that have a significant meaning, like, let's say, a holy place or a historical monument, we are going to make sure that we are basically covering all the bases and are really thoughtful in terms of activating those locations. 

"That's the approach we're taking for every Lens that we promote ... we're making sure we're channeling as many diversified opinions around doing something like that because we want to do something that hopefully doesn't harm our community. At the same time, we will make mistakes. And when that happens, and when someone tells us, hey, this is probably not the best idea, we will take the right necessary action to react on that."

Will this mean public spaces increasingly become areas where people live in layers of AR, and are there any spaces that won't be mapped? "As far as I know, regulation or policymaking in terms of preventing you from going and creating a point cloud or activation of AR against public places ... we are not aware of such guidance," says Pilipski, noting, "if it's a private place, it's a different story.

"At the same time, we're very thoughtful and very sensitive to that. If you look at the Landmarkers that we have launched -- I think now there are more than 20 -- we made sure that we're doing that in a way that empowers the community, but at the same time we're trying to be sensitive and not do something too crazy."