The most important parts of Intel’s new Vaunt smart glasses are the pieces that were left out.
There is no camera to creep people out, no button to push, no gesture area to swipe, no glowing LCD screen, no weird arm floating in front of the lens, no speaker, and no microphone (for now).
From the outside, the Vaunt glasses look just like eyeglasses. When you’re wearing them, you see a stream of information on what looks like a screen — but it’s actually being projected onto your retina.
The prototypes I wore in December also felt virtually indistinguishable from regular glasses. They come in several styles, work with prescriptions, and can be worn comfortably all day. Apart from a tiny red glimmer that’s occasionally visible on the right lens, people around you might not even know you’re wearing smart glasses.
Like Google Glass did five years ago, Vaunt will launch an “early access program” for developers later this year. But Intel’s goals are different from Google’s. Instead of trying to convince us we could change our lives for a head-worn display, Intel is trying to change the head-worn display to fit our lives. (...)
One of the Vaunt team’s primary design goals was to create a pair of smart glasses you could wear all day. Vaunt’s codename inside Intel was “Superlite” for a reason: the glasses needed to weigh in under 50 grams, because anything more and they’d be uncomfortable. That’s still noticeably more than most eyeglasses, but consider that Google Glass added an extra 33 grams on top of whatever pair you were already wearing. The electronics and batteries had to be placed so they didn’t put too much weight on either your nose or your ears. They had to not just look like normal glasses; they had to feel like them.
That’s why all of the electronics in Vaunt sit inside two little modules built into the stems of the eyeglasses. More importantly, the electronics are located entirely up near the face of the frames so that the rest of the stems, and even the frame itself, can flex a little, just like a regular pair of glasses. Other smart glasses have batteries integrated along the entire stem, “so those become very rigid and do not deform to adjust to your head size,” says Mark Eastwood, industrial design director at Intel’s New Devices Group (NDG). “It’s very important when you look at eyewear that it deforms along its entire length to fit your head.” (...)
At its core, Vaunt is simply a system for putting a small heads-up-style display in your peripheral vision. It can show you simple messages like directions or notifications. It works over Bluetooth with either an Android phone or an iPhone, much the same way your smartwatch does, taking commands from an app that runs in the background on your phone. (...)
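Intel hasn’t published anything about Vaunt’s actual protocol, but the smartwatch-style pairing the team describes (a background phone app quietly pushing short messages to the glasses over Bluetooth LE) might look roughly like this minimal sketch. The device address, the GATT characteristic UUID, and the plain-text payload are all invented placeholders:

```python
# Hypothetical sketch only: Vaunt's real Bluetooth protocol is not public.
# The device address, characteristic UUID, and UTF-8 payload below are
# invented placeholders illustrating the companion-app model the article
# describes: a background phone app pushes short messages to the glasses.
import asyncio

from bleak import BleakClient  # cross-platform Bluetooth LE library

GLASSES_ADDRESS = "AA:BB:CC:DD:EE:FF"  # hypothetical BLE address
NOTIFY_CHAR_UUID = "0000feed-0000-1000-8000-00805f9b34fb"  # hypothetical

async def push_notification(text: str) -> None:
    """Write a short notification string to the glasses over BLE."""
    async with BleakClient(GLASSES_ADDRESS) as client:
        await client.write_gatt_char(NOTIFY_CHAR_UUID, text.encode("utf-8"))

asyncio.run(push_notification("Turn left on Main St"))
```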
Before we get into all that, let’s just lay down the hardware basics. On the right stem of the glasses sits a suite of electronics that drives a very low-power laser (technically a VCSEL, a vertical-cavity surface-emitting laser). That laser shines a red, monochrome image somewhere in the neighborhood of 400 x 150 pixels onto a holographic reflector on the glasses’ right lens. The image is then reflected into the back of your eyeball, directly onto the retina. The left stem also houses electronics, so the glasses are equally weighted on both sides.
So, yeah: lasers in your eye. Don’t worry, though, says Eastwood. “It is a class one laser. It’s such low power that we don’t [need it certified],” he says, “and in the case of [Vaunt], it is so low-power that it’s at the very bottom end of a class one laser.”
The hardware here is all custom, all the way down to the silicon that powers Vaunt, which is Intel-designed, of course. “We had to integrate very, very power-efficient light sources, MEMS devices for actually painting an image,” says Jerry Bautista, the lead for the team building wearable devices at Intel’s NDG. “We use a holographic grating embedded into the lens to reflect the correct wavelengths back to your eye. The image is called retinal projection, so the image is actually ‘painted’ into the back of your retina.”
Because the image is projected directly onto your retina, it’s always in focus. That also means the display works just as well with prescription lenses as with non-prescription ones. (...)
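To get a sense of how little fits in a roughly 400 x 150 monochrome frame, here’s a toy illustration (mine, not Intel’s rendering pipeline) that rasterizes a two-line notification at that resolution using the Pillow imaging library. The layout and text are assumptions:

```python
# Illustration only: renders a notification into a 1-bit, 400 x 150 frame,
# the approximate resolution the article cites for Vaunt's red monochrome
# display. Layout and text are invented; Intel's real pipeline is not public.
from PIL import Image, ImageDraw

WIDTH, HEIGHT = 400, 150  # approximate resolution reported for Vaunt

frame = Image.new("1", (WIDTH, HEIGHT), 0)  # 1 bit per pixel: laser on/off
draw = ImageDraw.Draw(frame)
draw.text((8, 8), "Meeting with Dana", fill=1)   # default bitmap font
draw.text((8, 28), "3:00 PM, Room 204", fill=1)
frame.save("vaunt_frame.png")  # inspect how little fits at this size
```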
Using a Vaunt display is unlike anything else I’ve tried. It projects a rectangle of red text and icons down in the lower right of your visual field. But when I wasn’t glancing down in that direction, the display wasn’t there. My first thought was that the frames were misaligned.
Turns out: that’s a feature, not a bug. The Vaunt display is meant to be nonintrusive. It’s there when you want it, and completely gone when you don’t. Without a speaker or vibrate mode to notify you, I couldn’t help but wonder if that would mean a bunch of missed information.
Not so, according to Intel’s engineers. Your eyes are very rarely just sitting still. They roam around, taking in your peripheral vision all the time; your brain just doesn’t bother to bring all that information into your conscious focus. But should new information appear over there, you’d be likely to notice it. (...)
Itai Vonshak, NDG’s head of products, was also especially clear about another point: the goal is to do more than just blast notifications into your eyeball. Instead, Intel aims to offer ambient, contextual information when you need it. But since the team couldn’t get into specifics just yet, all of the examples were very hypothetical. “You’re in the kitchen, you’re cooking. You can just go ‘Alexa, I need that recipe for cookies,’ and bam, it appears in your glasses,” Vonshak says.
How will you actually interact with Vaunt? That’s also a little unclear. Sometimes the hypotheticals involved voice. Other times it seemed like very subtle head gestures, tracked by the accelerometer, would be key. And at other times, it seemed like you’re not supposed to interact with it at all, but instead just trust the AI to show you what you need to know in the moment. One example I heard was getting relevant information about the person who’s calling you (a birthday or a reminder, say) while you’re on the phone with them.
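That caller scenario is still hypothetical on Intel’s part, but its shape is easy to sketch. Here is a purely illustrative, rule-based version of the idea; the contact fields and relevance rules are invented:

```python
# Purely hypothetical sketch of the caller-context example: when a call
# comes in, decide which short lines (if any) are worth painting into the
# wearer's peripheral vision. All fields and rules here are invented.
from dataclasses import dataclass
from datetime import date

@dataclass
class Contact:
    name: str
    birthday: date | None = None
    reminder: str | None = None

def caller_context(contact: Contact, today: date) -> list[str]:
    """Return short lines worth showing during a call, possibly none."""
    lines = []
    if contact.birthday and (contact.birthday.month, contact.birthday.day) == (today.month, today.day):
        lines.append(f"It's {contact.name}'s birthday")
    if contact.reminder:
        lines.append(contact.reminder)
    return lines  # an empty list means the display simply stays dark

ana = Contact("Ana", birthday=date(1985, 2, 5), reminder="Ask about the report")
print(caller_context(ana, today=date(2018, 2, 5)))
# ["It's Ana's birthday", 'Ask about the report']
```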
Whatever the final interaction model will be, it will be subtle and you shouldn’t expect to be doing a lot of pressing and swiping and tapping. “We really believe that it can’t have any social cost,” Vonshak insists again. “So if it’s weird, if you look geeky, if you’re tapping and fiddling — then we’ve lost.”
by Dieter Bohn, The Verge | Read more:
Image: Vjeran Pavic