Hands On: Intel's Project Alloy Headset

[Photo 1: Intel's Project Alloy headset]

In one corner of a guest room at the Hyatt Regency in Santa Clara, Calif., there's a spinning tower of molten metal that will spew across the bed if you put your finger in it. But you won't be able to see it unless you're wearing a "mixed reality" headset like Intel's Project Alloy.

First announced last August, Project Alloy is a reference design for an untethered headset that can display a virtual world with elements from your physical environment mixed in. At the Santa Clara Hyatt this week, Intel was showing off Alloy's latest iteration, a headset that looks vaguely like the Sony PlayStation VR but is completely wireless, so you can roam around the physical world with nothing to trip over.

When you first put the headset on, you're immediately transported into an intriguing hybrid of the physical and virtual worlds. While Microsoft's HoloLens or Qualcomm's augmented reality glasses superimpose computer-rendered objects on your normal field of view, with Alloy you initially feel as if you're wearing a traditional VR headset like the HTC Vive or Oculus Rift. You're completely immersed in the virtual world until you step close enough to a physical object—a hotel bed, for example—which then materializes in the virtual world as if by magic.

It isn't magic, of course. You can see the bed thanks to Alloy's RealSense cameras, which are capable of depth perception, 3D imaging, interior mapping, and object tracking. The data is fed not to a high-powered gaming PC or PlayStation, as is the case with the Oculus Rift or PlayStation VR, but to a CPU and graphics card housed inside the headset itself. Intel tuned the hotel room demo to filter out most of the physical world, so you don't see real objects unless they're within about two feet of you.
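
To make that concrete, here's a minimal sketch of how a depth-threshold pass-through like Alloy's could work. Intel hasn't published the actual pipeline, so the Python function, the cutoff value, and the frame shapes below are illustrative assumptions rather than Alloy's real code:

```python
import numpy as np

# Hypothetical sketch of depth-threshold pass-through compositing:
# pixels the depth camera reports as closer than a cutoff (roughly two
# feet) replace the virtual scene, so nearby physical objects "materialize".
PASSTHROUGH_CUTOFF_M = 0.6  # assumed cutoff, about two feet

def composite_frame(virtual_rgb, camera_rgb, camera_depth_m):
    """Overlay physical-camera pixels onto the virtual frame wherever the
    depth camera sees something closer than the cutoff distance."""
    near_mask = camera_depth_m < PASSTHROUGH_CUTOFF_M  # boolean mask per pixel
    frame = virtual_rgb.copy()
    frame[near_mask] = camera_rgb[near_mask]            # physical object shows through
    return frame

# Dummy 480x640 frames: most of the room is far away, but one patch
# (say, the corner of the hotel bed) is within reach and gets passed through.
virtual = np.zeros((480, 640, 3), dtype=np.uint8)
physical = np.full((480, 640, 3), 128, dtype=np.uint8)
depth = np.full((480, 640), 3.0)
depth[200:300, 250:400] = 0.4
out = composite_frame(virtual, physical, depth)
```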

[Photo 2: Intel's Project Alloy headset]

That distance perfectly demonstrates what is perhaps Alloy's most groundbreaking feature: you can see your hands and the rest of your body in the virtual world, and software developers can design interactive elements that respond not to electronic inputs from controllers like the Oculus Touch, but to your actual hands. The result was that you could touch the spinning tower of metal with an index finger and it would immediately deform, with molten droplets flying everywhere. (If you're less adventurous, you could also have held a pen or other physical object and used it to interact with the spinning tower.)
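
For a rough sense of how that hand-driven interaction might be wired up, here's another hypothetical Python sketch; the MoltenTower class, its deformation trigger, and the fingertip coordinates are invented for illustration and aren't from Intel's SDK:

```python
import numpy as np

# Illustrative only: a virtual object that deforms when a tracked fingertip
# (from hand tracking, not a controller button) enters its volume.
class MoltenTower:
    def __init__(self, center, radius=0.15):
        self.center = np.asarray(center, dtype=float)  # position in meters
        self.radius = radius                           # interaction radius in meters
        self.deformed = False

    def update(self, fingertip_position):
        """Deform the tower if the tracked fingertip is inside its radius."""
        if fingertip_position is None:                 # hand not currently tracked
            return
        distance = np.linalg.norm(np.asarray(fingertip_position) - self.center)
        if distance < self.radius:
            self.deformed = True                       # e.g., spawn molten-droplet particles

# One frame of input: the fingertip lands inside the tower, so it deforms.
tower = MoltenTower(center=[0.0, 1.2, 0.8])
tower.update(fingertip_position=[0.05, 1.25, 0.75])
assert tower.deformed
```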

In theory, this opens up augmented reality to a host of non-video game applications that would be awkward or impossible with glasses like the HoloLens or regular VR headsets like the HTC Vive. For instance, surgeons could fire up an immersive simulation of a patient in an operating theater, using their hands to interact with virtual scalpels and other tools.

[Photo 3: Intel's Project Alloy headset]

There are a few downsides, however. Perhaps the most obvious is the development time required to devise realistic-looking virtual worlds that respond to motion-tracking inputs. Creating virtual objects for a HoloLens app is remarkably simple using the Unity game engine, and you're not necessarily worried about their realism because users will expect them to look rudimentary compared to objects from the physical world that they can see through the lens. But with Alloy, hands, feet, and other physical objects are intruders in an otherwise virtual world, and the less realistic that world looks, the less time viewers will want to spend in it.

Unfortunately, the Alloy demo in the hotel room definitely looked cheap. Everything virtual, from the spinning metal tower to a deformable blob on the other side of the room, was pixelated. A static snow effect covered the display, likely a result of the RealSense cameras picking up light disturbances. And the borders of everything physical—hands, shoes, the bed—were jagged, as if they had been badly photoshopped.

[Photo 4: Intel's Project Alloy headset]

Intel can't be faulted for all of this mediocrity, since Project Alloy is simply a reference design for other companies to emulate. The content for Alloy-based headsets, which could go on sale by the end of the year, will almost certainly get better over time. And Intel has rapidly improved the hardware since Alloy was announced—there's now an Intel Core i7 inside instead of an M-series processor, and the number of RealSense cameras has been shaved from three to two in order to improve efficiency and battery life.

The fact remains, however, that the industry is a long way off from realizing the full potential of something like Project Alloy. Even Intel, which has pivoted away from desktop and server chips to become one of the VR industry's biggest cheerleaders, readily admits that there is a long road ahead.

"I think that we are on this verge of something exciting about to happen, and yet virtual reality has been there before, where it didn't live up to its promises," Achin Bhowmik, Intel's general manager of perceptive computing, said at the Augmented World Expo here in Santa Clara on Wednesday. "There's a lot of work for us to do."
