I tried Snapchat’s Spectacles AR Glasses and it changed the way I look at smart glasses — here’s why

Snapchat’s new Spectacles AR glasses are the first pair of smart specs that gave me a glimpse of the future of eyewear.

How so? Well, aside from being able to play a virtual game of frisbee with a furry little critter in the Peridot Lens from Niantic (the developer behind Pokémon Go), I was amazed by how I could share a space with other Spectacles users and see what they see — and vice versa.

From creating AI-generated 3D images right on the AR glasses and drawing smiley faces in the air to dropping virtual sandwiches to Peridot creatures, everything I was doing in augmented reality could be seen in real time by others wearing Snapchat Spectacles.

Not only that, but I could also interact with other people’s creations, moving around and resizing 3D images and even petting others’ furry friends. It was all in a shared space, and most importantly, it all worked in a snap (pun very much intended).

After trying out the latest generation of Snapchat’s Spectacles during an exclusive demo in London, I realized two things: Snap is successfully making shared AR experiences a reality, and the future of smart glasses looks to be a lot of fun.

While this pair of Snapchat Spectacles is made for developers, who pay a subscription of $99 per month, it’s a taste of what’s to come, especially with Meta’s $1,000 smart glasses and Android XR coming down the line. If they are anything like what I’ve seen in the demo, the best smart glasses are in for a generational leap for consumers, and soon.

Now, onto my AR shenanigans.


Spec check

First, a quick look at what these AR glasses bring.

The Snapchat Spectacles are powered by two Snapdragon processors that distribute tasks between them. That brings a lot of power, but you’ll find that battery life only lasts around 45 minutes, and less when using demanding apps (one pair died while I was using it).

Otherwise, they boast a 46-degree diagonal field of view at 37 pixels per degree, with a see-through stereo display and Liquid Crystal on Silicon (LCoS) miniature projectors. Everything looks smooth thanks to the 120Hz refresh rate, and there’s even dynamic display brightness and automatically tinting lenses for when you’re outdoors.

(Image credit: Future / Tom’s Guide)

You can also expect crisp spatial audio through its stereo speakers, and six microphones that could easily pick up my voice. Oh, and there’s full hand tracking support, and Wi-Fi 6, Bluetooth and GPS connectivity.

That’s quite a lot to fit into a pair of smart glasses, and as you can see, it’s exactly why they look quite chunky. Again, these Spectacles aren’t aimed at consumers; they’re mainly there to help developers create apps and programs for a more commercial product likely to come down the line.

Despite being thick and weighing 226g (around 8 ounces), they fit quite comfortably on my face, as the weight of the glasses was distributed throughout the frame very well.

Shared vision

(Image credit: Future / Tom’s Guide)

During the demo, Snap’s Director of Computer Vision Engineering Qi Pan handily walked me through how Snapchat Spectacles work, and it was impressively intuitive.

I was able to use my hands to navigate around menus with ease, pinching to scroll and selecting apps like I would with the Meta Quest 3’s hand-tracking feature. But more impressively, I could use the palm of my hand to press menu buttons. It’s only a small thing, but having my hand act as its own form of haptic feedback felt natural, and it made it easy to get to grips with it all.
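To picture how a developer might wire up that kind of input, here’s a minimal TypeScript sketch. It is not Snap’s actual Spectacles or Lens Studio API; the GestureSource, onPinchDrag and onPalmTap names below are hypothetical stand-ins for whatever hand-tracking events Snap OS actually exposes.

```typescript
// Hypothetical hand-tracking hooks: illustrative stand-ins, not Snap's real Spectacles API.
interface PinchDragEvent { deltaY: number }   // vertical hand movement while pinching
interface PalmTapEvent { targetId: string }   // floating button "pressed" with the palm

interface GestureSource {
  onPinchDrag(handler: (e: PinchDragEvent) => void): void;
  onPalmTap(handler: (e: PalmTapEvent) => void): void;
}

// A menu that scrolls on pinch-and-drag and launches apps on a palm tap,
// roughly mirroring the interactions described above.
class MenuController {
  private scrollOffset = 0;

  constructor(source: GestureSource, private apps: string[]) {
    source.onPinchDrag((e) => {
      this.scrollOffset = Math.max(0, this.scrollOffset + e.deltaY);
      console.log(`Menu scrolled to ${this.scrollOffset.toFixed(2)}`);
    });

    source.onPalmTap((e) => {
      if (this.apps.includes(e.targetId)) {
        console.log(`Launching ${e.targetId}`);
      }
    });
  }
}

// Fake event source so the sketch runs on its own; real input would come from the glasses.
class FakeGestureSource implements GestureSource {
  private pinchHandlers: ((e: PinchDragEvent) => void)[] = [];
  private palmHandlers: ((e: PalmTapEvent) => void)[] = [];
  onPinchDrag(h: (e: PinchDragEvent) => void) { this.pinchHandlers.push(h); }
  onPalmTap(h: (e: PalmTapEvent) => void) { this.palmHandlers.push(h); }
  firePinchDrag(deltaY: number) { this.pinchHandlers.forEach((h) => h({ deltaY })); }
  firePalmTap(targetId: string) { this.palmHandlers.forEach((h) => h({ targetId })); }
}

const source = new FakeGestureSource();
new MenuController(source, ["Peridot", "My AI"]);
source.firePinchDrag(0.5);     // pinch and drag to scroll the menu
source.firePalmTap("Peridot"); // press the Peridot button with the palm
```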

Afterwards, I was thrown into the world of fuzzy creatures with Niantic’s Peridot Executive Producer Alicia Berry guiding me along the way. Peridot is an AR-first game for smartphones, but it flourishes even more on AR glasses.

Almost reminiscent of my time playing (and not taking very good care of pets in) Tamagotchi, a Peridot (or Dot, for short) appeared right before my eyes and started galloping across the real-world room. I was able to go over to pet it, have a sandwich in hand while it waited for me to feed it and even play a few rounds of fetch with a frisbee (I’m sure it was annoyed I threw it out of reach many a time).

While this was a fun experience, where it truly clicked was when Pan and Berry connected to my AR space and put their animals in my field of view. Here, I could do what I was already doing, but this time with their pets, and they could play with, pet or feed mine, too.

I could see this being a big draw for younger audiences, especially being able to interact with each other’s Dots. But this could be taken outdoors, too, and I can see the fun to be had walking the streets with a Peridot jumping into view.

Of course, there are limitations with the AR glasses right now, with battery life only lasting up to 45 minutes. That’s not enough time for taking the glasses outdoors, or even for longer sessions indoors, but for a next-gen set of smart glasses from Snap, there are fun times to be had using Spectacles — especially with others.

Ideas you can (literally) handle

(Image credit: Future / Tom’s Guide)

Powered by Snap OS, the Spectacles come packed with My AI, a multi-modal generative AI that you can speak to. While it’s able to pull up web results and more, I got to try creating 3D images and seeing them appear in the real-world space.

To have one whipped up by the onboard AI, I was asked to think of the craziest thing I could imagine. Showing my lack of improvisation, the “craziest” thing I could come up with was apparently a golden robot in a car. Imaginative, I know. Anyway, it only took a few seconds for a 3D image of a cartoon robot in a space car to pop up in the room.

Then, just like the Peridot multiplayer experience, Pan joined in and got to see my wild imagination in his view. But more impressively, he was able to interact with it, moving it around and resizing it to his liking.

That got me thinking about how a feature like this could be used in a collaborative workspace, whether it be at a school or an office. Sure, it’s fun to have AI-generated 3D images similar to Apple’s Image Playground for now, but at a larger, more complex scale, it can lead to more creative solutions that everyone can see at once and even interact with to make changes.

After the demo, I got talking to Pan about the features Snap expects to see from developers, and he told me that the possibilities are endless. For one, there’s apparently a developer looking into controlling smart appliances via the Spectacles, turning a light in a room on and off by looking at it and making a hand gesture. Now that is pretty cool.
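As a thought experiment, here’s how that light-switch idea could hang together, sketched in TypeScript. Everything in it is an assumption: the GlassesInput interface with its currentGaze and onPinch hooks stands in for whatever Snap OS actually provides, and the bridge URL is a made-up local smart-home endpoint rather than any real appliance API.

```typescript
// Illustrative sketch only: none of these names come from Snap's SDK.
interface GazeTarget { deviceId: string | null }  // the appliance the wearer is looking at, if any

interface GlassesInput {
  currentGaze(): GazeTarget;
  onPinch(handler: () => void): void;
}

// Hypothetical local smart-home bridge; a real Lens would talk to an actual appliance API.
const BRIDGE_URL = "http://192.168.1.50/api/lights";

async function toggleLight(deviceId: string): Promise<void> {
  // Generic HTTP call standing in for whatever protocol the appliance actually speaks.
  await fetch(`${BRIDGE_URL}/${encodeURIComponent(deviceId)}/toggle`, { method: "PUT" });
  console.log(`Toggled light ${deviceId}`);
}

// Look at a lamp, pinch, and the lamp the gaze is resting on gets toggled.
export function wireUpLightControl(input: GlassesInput): void {
  input.onPinch(() => {
    const gaze = input.currentGaze();
    if (gaze.deviceId) {
      toggleLight(gaze.deviceId).catch((err) => console.error("Toggle failed:", err));
    }
  });
}
```

The appeal of the pattern is that the glasses only have to answer two questions, what am I looking at and did the wearer just gesture, while the appliance side stays an ordinary network call.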

With those kinds of capabilities, along with sharing ideas thought up and manifested into AR via AI, there’s a lot I’m looking forward to in these smart glasses.

Outlook

(Image credit: Future / Tom’s Guide)

We’ve seen the latest Snapchat Spectacles in action before, but actually witnessing how they work myself changed my views on smart glasses in general. Down the line, I wouldn’t be surprised to find myself rocking a pair of these specs — albeit with a much less blocky design, akin to the Ray-Ban Meta smart glasses.

Collaboration and sharing a space with other users appear to be the aim of the AR game, and if this is the trajectory smart glasses are heading in, I’m more convinced than ever that they will take off. That’s what Snap’s Spectacles are made for, and it’s now in developers’ hands to come up with a plethora of engaging features that will make smart glasses a sure-fire win for consumers.

Anyway, I’ll be waiting for the next time I get the chance to show people more strange AI-generated images through AR glasses, perhaps even some action figures.
