Project Alloy, Intel's new "merged reality" headset, is scheduled to go on sale in the Q4 2017 time window.  If you hope to have a large application ready for Alloy's launch window, you should start thinking now about how to set up a stopgap development environment before a proper Development Kit is available.

This approach was common in the videogame development industry in the past: developers set up PCs with a specification approximating what they expected a new game console's to be, then switched to actual development kits later in development, once the console platform-owner (e.g. Nintendo, Microsoft or Sony) could supply them with one.

In this guide, we will look at some useful guidelines for preparing your project idea for development.

1.  What specification of PC should I target for my development machine?

The official specification for the Alloy headset is due to be released to developers sometime around the middle of 2017.  Unlike headsets such as Oculus Rift and HTC Vive, the Alloy headset will not be tethered to a PC by a cable.  Instead, it will contain a full PC board inside the headset.

One of the few concrete details available is that the headset will use some form of the 7th generation Kaby Lake processor.  Because the specification of the GPU that will drive the headset's graphics is currently unknown, developers should aim relatively low in the graphics power that their application requires.  If the GPU turns out to be more powerful than expected, that will be a pleasant surprise.  But an application designed to need a high-end video card to run well is much harder to scale down to meet a lower specification.
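The "design low, scale up" principle can be sketched in code.  The following is a minimal, purely illustrative example (the preset names, values and benchmark-score threshold are all assumptions, not anything from the Alloy specification): the application is built around the "low" preset, so running on stronger hardware is just a matter of selecting larger numbers, rather than retro-fitting cuts into a high-end design.

```python
# Illustrative sketch only: choosing a graphics quality preset at runtime.
# All names and numbers here are hypothetical tuning values.

QUALITY_PRESETS = {
    "low":  {"shadow_resolution": 512,  "draw_distance": 50,  "particle_limit": 100},
    "high": {"shadow_resolution": 2048, "draw_distance": 200, "particle_limit": 1000},
}

def select_preset(gpu_score: int) -> dict:
    """Pick a preset from an assumed GPU benchmark score.

    The application is designed and tested against 'low' first;
    'high' simply raises the limits when the hardware allows it.
    """
    return QUALITY_PRESETS["high" if gpu_score >= 5000 else "low"]

# A modest integrated GPU gets the baseline preset the app was designed for.
print(select_preset(3000))
```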

This does not mean that you should lower your ambitions.  Instead, you should be aiming to extract maximum performance from the hardware available by creating highly optimized code, art and other project assets.  This is a principle practiced by videogame developers for decades, when their dreams did not quite match the capability of the target hardware.  Indeed, many useful lessons about optimizing for the biggest bang for your processing buck can be learned by looking to the games of the past.

My own Alloy development machine is a 6th generation Skylake laptop with 8 GB of memory and Intel HD 520 integrated graphics.  This was my machine of choice because I believe it is a reasonable approximation of the hardware that may be found in the final Alloy headset's PC board.  My previous development machine was a 4th generation Haswell i3 with 6 GB of memory and a 2013-era Nvidia GT 610 video card.

The video card, despite being four years old, was a key factor in my project's performance.  Once the project was transferred to the Skylake laptop with integrated graphics, it slowed down noticeably even though the processor is superior.  Rather than being discouraged by this, I view it as a positive challenge.  As highly optimized as my code already is, I know there is still more I can do to squeeze performance out of it.  And the better the performance I can achieve on this development machine, the better the project will run on final Alloy equipment if its specification exceeds that of my dev laptop.

As an example of how performance gains can come from thinking carefully about your design: in the Unity game creation engine that RealSense is compatible with, the amount of graphics processing required can be reduced by using a method called Static Batching.  You place a tick in a box labeled 'Static' for objects that are stationary and will never move.  Unity then groups static objects that share the same material into a 'batch' that can be drawn together, reducing the number of draw calls needed to render them and so helping the overall project run faster.
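The grouping principle behind static batching can be illustrated outside of Unity.  This is a conceptual sketch only (in Unity itself you just tick the 'Static' checkbox); the scene data is invented, but it shows why batching helps: static objects sharing a material cost one draw call per material rather than one per object.

```python
# Conceptual sketch of static batching: static objects that share a
# material are drawn together, while dynamic objects are drawn individually.
from collections import defaultdict

scene = [
    {"name": "rock1",  "material": "stone", "static": True},
    {"name": "rock2",  "material": "stone", "static": True},
    {"name": "wall",   "material": "brick", "static": True},
    {"name": "player", "material": "cloth", "static": False},
]

def count_draw_calls(objects):
    batches = defaultdict(list)
    dynamic_calls = 0
    for obj in objects:
        if obj["static"]:
            # All static objects with the same material share one batch.
            batches[obj["material"]].append(obj["name"])
        else:
            # Dynamic objects each need their own draw call.
            dynamic_calls += 1
    return len(batches) + dynamic_calls

print(count_draw_calls(scene))  # 3 calls instead of 4 without batching
```

Here the two stone rocks collapse into a single batch, so the four-object scene renders in three draw calls instead of four; in a real scene with hundreds of repeated static objects, the saving is far larger.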

2.  What control methods will Alloy support for my application?

Previous Intel demonstrations, and hands-on sessions by developers at events where Intel has presented the in-development headset, give us some idea of what to expect.  Via its "inside-out" tracking, Alloy can - like the RealSense Developer Kits - track hand movements / gestures and facial movements.  So if you have experience developing RealSense applications with the Developer Kit cameras, that knowledge should be relatively easy to adapt for Alloy applications.

Alloy has also been shown to be compatible with physical handheld controllers tracked with 'six degrees of freedom': movement along three axes (forward/back, up/down and left/right) plus rotation around them (pitch, yaw and roll).  Until final hardware is available, using an existing Bluetooth-enabled handheld motion controller such as PlayStation Move with your development PC is likely to be sufficient to prototype such controls.

In regard to locomotion, an Alloy application can update the user's position in the virtual environment as they walk through a room with their real-life feet.  If your application will be a sit-down experience though then you may find it easier to assign movement to a hand gesture via Alloy's five-finger detection capability, or to a button on the handheld controller.
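For a sit-down experience, mapping an input event to forward movement can be as simple as the following sketch.  The event names ("open_hand", "button_a") and the movement speed are hypothetical stand-ins, since the final Alloy SDK's gesture and controller APIs are not yet known.

```python
# Hedged sketch: advancing the user's virtual position when a movement
# input (a hand gesture or a controller button) is active.
MOVE_SPEED = 1.5  # metres per second; an assumed tuning value

def update_position(pos, event, dt):
    """Return a new (x, y, z) position, moving forward along z if a
    movement input was detected this frame."""
    x, y, z = pos
    if event in ("open_hand", "button_a"):  # hypothetical event names
        z += MOVE_SPEED * dt
    return (x, y, z)

pos = (0.0, 0.0, 0.0)
pos = update_position(pos, "open_hand", 1.0)
print(pos)  # (0.0, 0.0, 1.5)
```

The same `update_position` function serves both input styles, which makes it easy to prototype with a stand-in controller now and swap in the real gesture events later.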

3.  How can I truly take advantage of this new "Merged Reality" medium of bringing real-world elements into a virtual environment?

You may also be interested in reading my article on designing applications with "User Imagined Content" for the Alloy headset:

Advanced 'User Imagined Content' Design Principles For Intel Project Alloy Applications

In conclusion: develop smart and aim high!