Hands-on with Snap's AR Spectacles: an ambitious and impractical start


It doesn’t take long to understand why Snap’s first true AR glasses aren’t for sale. The overall design is the highest quality of any standalone AR glasses I’ve tried, and they make it easy to jump quickly into a variety of augmented reality experiences, from a multiplayer game to a virtual art installation. But the first pair I was handed in a recent demo overheated after about 10 minutes, and the screens are so small that I wouldn’t want to look at them for long even if the battery could keep up.

Snap is aware of the limitations. Instead of releasing these glasses publicly, it is treating this generation as a private beta. The company has distributed pairs to hundreds of its augmented reality creators since announcing the glasses in May and recently made a few notable software updates based on their feedback. “It was really about putting technology in the hands of real people and doing it in a way that would allow us to maximize our learning from their use experiences,” Bobby Murphy, Snap’s co-founder and CTO, said of the rollout.

After months of asking for a demo, Snap invited me and a handful of other journalists to try the glasses out in conjunction with Lens Fest, Snap’s annual AR creator conference taking place virtually this week. Guided by Snap employees in a backyard in Los Angeles, I tried a wide range of augmented reality experiences in the glasses, including a zombie chase, a game of pong, a solar system projection, and an interactive artwork that used basic hand tracking.

The demos showed me that Snap has an ambitious, long-term vision for where AR is heading. But the hardware also highlighted the technical limitations that are keeping mainstream AR glasses at bay.

Like previous versions, these AR glasses feature a bold design. The narrow, angular frame has an aesthetic similar to Tesla’s Cybertruck, something not lost on Snap’s product designers, and the glasses come with a sturdy, magnetized case that doubles as a charging stand.

The glasses are light to wear, with flexible arms that bend away from the head enough to accommodate prescription glasses underneath. (Corrective lenses are available for augmented reality creators who apply for and receive a pair.) They include stereo speakers, built-in Wi-Fi, a USB-C port for charging, and dual front cameras for capturing video and detecting surfaces.

The biggest limitation I noticed was the battery, which lasts only about 30 minutes of use. Snap didn’t try to hide this fact and had several backup pairs ready to swap in for me.

The AR effects, which Snap calls Lenses, are projected by a pair of dual waveguide displays that sync with Snapchat on a paired phone. Besides the battery, the main drawback of these glasses is the small size of the displays, which cover only about half of the physical lenses. Because of the small field of view, the AR effects I tried often looked better after the fact, at full size on a phone screen, than they did in the glasses themselves. Even so, the WaveOptics waveguides were surprisingly rich in color and clarity. The displays’ 2,000 nits of brightness means they are clearly visible in sunlight, a tradeoff that takes a serious toll on battery life.

Since Snap announced these Spectacles earlier this year, it has added new software enhancements. To stretch battery life, an endurance mode automatically turns off the displays when a Lens, such as a scavenger hunt game, is running but not actively in use. Lenses can now be tied to specific locations based on a GPS radius. And an upcoming feature called Custom Landmarkers will let people persistently overlay Lenses on local landmarks so that others wearing the glasses can see them.
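To make the GPS-radius idea concrete, here is a minimal sketch of how a location-gated Lens check could work, assuming the simple case of a haversine distance test against a configured center point and radius. The function names, parameters, and threshold are illustrative assumptions, not Snap's actual API.

```python
# Illustrative sketch only: Snap has not published how Lens geofencing works.
# It shows the general idea of activating a Lens inside a GPS radius.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS points, in meters."""
    r = 6_371_000  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def lens_is_active(device_lat, device_lon, center_lat, center_lon, radius_m):
    """Hypothetical check: enable the Lens only inside the configured radius."""
    return haversine_m(device_lat, device_lon, center_lat, center_lon) <= radius_m

# Example: a Lens pinned to a spot in a Los Angeles backyard with a 50 m radius.
print(lens_is_active(34.0522, -118.2437, 34.0523, -118.2436, radius_m=50))
```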

Another new software update brings Connected Lenses to the glasses, allowing multiple pairs to interact with the same Lens when they share a Wi-Fi network. I tried a few basic multiplayer games with a Snap AR creator named Aidan Wolf, including one he built that lets you shoot energy orbs at your opponent with the capture button on the side of the frame. The pairing system still needs some work; it took a few tries to sync our glasses before we could play.

None of the Lenses I tried blew me away. But a few showed me how convincing AR glasses could become once the hardware gets more advanced. Rudimentary hand tracking was limited to one Lens that let me manipulate different parts of a moving artwork with specific gestures. Assuming the hand tracking improves over time, I can see it becoming a key way to control the glasses. In another of the more impressive demos, I placed persistent location markers around the backyard and then ran through them.

Most of the Lenses I tried felt like the basic proofs of concept I’ve seen in other AR headsets over the years, not experiences that would compel me to buy these glasses if they were available for purchase. But for glasses that have existed for less than a year, it’s clear that creators will dream up some interesting Lenses as the software and hardware improve. I’ve seen a few compelling early concepts online, including exercise games, utility use cases like navigating a city you’re visiting, and AR food menus.


The glasses’ main visual interface is called the Lens Carousel. A touchpad on the side of the frame supports gestures for navigating in and out of Lenses, viewing recorded footage, and sending clips to Snapchat friends without taking the glasses off. You can also use your voice to call up a Lens. Ways to control future Spectacles will likely include eye tracking and more robust hand tracking, technologies Snap is already exploring.

A dedicated button on the side of the frame activates Scan, Snap’s visual search feature that was recently added to the main Snapchat app. I used it to scan a plant on a table, and the glasses recommended a few plant-related Lenses to try. As with Scan in Snapchat, its ability to recognize objects is quite limited at the moment. But if it keeps improving, I can see Scan becoming a core feature of Spectacles in the years to come.

Meanwhile, the technology that powers Lenses continues to advance. At Lens Fest this week, Snap is announcing a host of new tools to make Lenses smarter, including a music library from major labels and the ability to pull real-time information from external partners like the crypto trading platform FTX, AccuWeather, and iTranslate. A new real-world physics engine and software Snap calls World Mesh let Lenses interact more naturally with the world: obeying gravity, reacting to real surfaces, and understanding the depth of a scene.
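As a rough illustration of what “reacting to real surfaces” involves, here is a toy sketch, assuming the world mesh has already been reduced to a single detected surface height: a virtual object falls under gravity and comes to rest when it reaches that surface. This is not Snap’s World Mesh API; every name and value here is a stand-in for illustration.

```python
# Toy sketch: a virtual object falling under gravity and resting on a detected surface.
# Not Snap's World Mesh API; it only illustrates combining a physics step with
# real-world surface data from scene understanding.
GRAVITY = -9.81  # m/s^2

def step(height, velocity, ground_height, dt=1 / 60):
    """Advance one frame: integrate gravity, then clamp to the detected surface."""
    velocity += GRAVITY * dt
    height += velocity * dt
    if height <= ground_height:          # object reached the real-world surface
        height, velocity = ground_height, 0.0
    return height, velocity

# Drop a virtual ball from 1.5 m above a table detected at 0.7 m.
h, v = 1.5, 0.0
for _ in range(120):                      # simulate two seconds at 60 fps
    h, v = step(h, v, ground_height=0.7)
print(round(h, 2))                        # the ball ends up resting at 0.7
```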

Like Meta, Snap sees AR glasses as the future of computing. “We have been very interested and invested in AR for several years now because we see AR as the ability to perceive the world and render digital experiences in the same way that we naturally observe and interact with our environment as people,” Murphy tells me. “And I think that’s really in stark contrast to how we use a lot of technology today.”

Murphy won’t say when AR Spectacles will be ready to be sold publicly, but Meta and other tech companies have signaled that consumer-ready AR glasses aren’t coming anytime soon. “We fully understand that it will still take a number of years,” Murphy said, citing battery and display technology as the two main limitations.

While the technology to make quality AR glasses a reality is still under development, Snap is already betting its future on AR in the age of the mobile phone. According to Murphy, “Snap’s top priority now as a business is to truly support and empower our partners and our community to be as successful as possible through AR.”

Snap says it has over 250,000 Lens creators who have collectively made 2.5 million Lenses that have been viewed a staggering 3.5 trillion times. Three hundred creators have made a Lens that has been viewed over a billion times. “We’re building this really crazy augmented reality delivery system that doesn’t exist anywhere else,” says Sophia Dominguez, Snap’s head of AR platform partnerships.

Now the company is starting to focus on ways to help Lens creators make money, including a new marketplace that lets developers of apps built on Snap’s camera technology pay creators directly to use their Lenses. Viewers of a Lens can send its creator an in-app gift, which can be redeemed for real money. And for the first time, a Lens can include a link to a website, letting a creator point to something like an online store directly from AR.

What about letting users pay for Lenses directly? “It’s something we’ve definitely thought about,” Murphy says. He calls NFTs a “very, very fascinating space” and “a good example of digital assets [and] digital art having some kind of real, tangible value.”

Snap doesn’t like to talk about its future product roadmap, but Murphy is clear that “new updates to our hardware roadmap” will continue to come “quite often.” In the meantime, the company is working to woo AR creators long before its glasses are ready for prime time. While there is no guarantee that Snap will be a major player when the technology is finally ready, for now it has a head start.