Have you ever seen a dog so adorable or a plant so lush out in the wild that you had to know what it was right then and there?
Snap announced new partnerships on Thursday with the apps Dog Scanner and PlantSnap that will allow Snapchat users to do just that. Snapchatters can identify dogs or plants they encounter in the real world by scanning them right in Snapchat.
When you press and hold on the camera screen in Snapchat, lenses that are relevant to what the camera is pointing at are unlocked. For example, if I point and hold the camera on my dog right now, lenses appear that put on sunglasses or heart eyes formatted specifically for the shape of a dog’s face.
Now, if you point the camera at a particularly Good Boy you see, you can access a lens that tells you what breed the dog is, using the data and A.I. of Dog Scanner, which recognizes nearly 400 dog breeds (my dog would get 100 percent purebred mutt). And if you focus your lens on a tree, bush or bud that catches your eye, you’ll be able to identify 90 percent of known plants and trees with the PlantSnap integration.
Snap announced the new features at the Snap Partner Summit, which it held virtually Thursday.
The ability to identify two of earth’s best things — dogs and plants — through your smartphone, of course, already exists; Dog Scanner and PlantSnap are standalone apps. But it’s helpful that the capability comes within Snapchat itself if you’re someone who already uses the app frequently, or who doesn’t want to download a new app for each object you want your smartphone to help identify.
Plus, more categories are coming soon. An upcoming integration with the food and cosmetics scanning app Yuka will let Snapchatters unlock nutrition facts when they point and hold the camera at a food item. Snap already lets you point and hold to identify a song through Shazam, solve math problems with Photomath, and identify (and shop for) products sold on Amazon.
The dog and plant integrations are the sort of playful, fun feature that Snapchat is known for. But the lens product also holds opportunity for further monetization, as Snap CEO Evan Spiegel pointed out during a Q&A with reporters. For example, Snap unveiled a partnership with Louis Vuitton that allows users to point and hold on the monogram logo, which then takes them to content about the brand’s new collection. It’s easy to see how — similar to the Amazon integration — this could lead not just to brand content and awareness, but to shopping.
Snap made some other announcements around lenses for both developers and users Thursday. It’s making more lens development templates available, such as ways to interact with — wait for it — feet (this could enable experiences like virtually trying on shoes).
On the user side, pointing and holding in a neighborhood will now unlock “local lenses,” which let users decorate buildings and other landmarks in AR. It’s kind of like a shared street art experience that anyone in the physical space can access, with users building on each other’s creations.
Snapchat’s innovation in AR has helped the company keep its creative edge, even as companies like Facebook continually try to copy it. The biggest trouble with Snapchat’s AR products is keeping track of everything the app can do in a sometimes hard-to-navigate lens ecosystem. But with a new voice search feature and a souped-up Activity Bar, also announced Thursday, Snap’s working on that, too.