
Intel® RealSense

20 Posts authored by: MartyG

Hi everyone,


Though this is not a RealSense product, you may be interested in Intel's new Alexa-compatible Speech Enabling Developer Kit as part of a project that you are developing that contains RealSense elements.  It is now listed for pre-order on the Intel Click store.  Please note that it currently ships only to the United States.


Intel® Speech Enabling Developer Kit


SDK 2.0 Build 2.81 Released

Posted by MartyG Nov 1, 2017

Hi everyone,


Build 2.81 of RealSense SDK 2.0 is now available for download.  The latest feature in this version is:


rs2_start_processing_queue - "a convenience function that lets the user target the output of a processing block (for example align) directly into a frame_queue.  This helps in languages where function pointers are not available, such as LabView".
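The idea (routing a processing block's output straight into a queue that a consumer can poll, instead of registering a function-pointer callback) can be sketched in plain Python. This is an illustration of the pattern only, not the librealsense API: ProcessingBlock and its doubling transform are hypothetical stand-ins.

```python
import queue

# A stand-in "processing block": receives frames, transforms them, and
# forwards its output to whatever callback it was started with.
class ProcessingBlock:
    def __init__(self, transform):
        self.transform = transform
        self.callback = None

    def start(self, callback):
        # In the same spirit as rs2_start_processing_queue: a frame queue's
        # put method becomes the block's output callback.
        self.callback = callback

    def invoke(self, frame):
        self.callback(self.transform(frame))

# Target the block's output directly into a queue, so a consumer that
# cannot use function pointers can simply poll the queue instead.
frame_queue = queue.Queue()
block = ProcessingBlock(transform=lambda f: f * 2)  # hypothetical transform
block.start(frame_queue.put)

for raw_frame in [1, 2, 3]:
    block.invoke(raw_frame)

results = [frame_queue.get() for _ in range(3)]
print(results)  # [2, 4, 6]
```

The consumer never touches a callback: it only drains the queue, which is why the pattern suits environments such as LabView where function pointers are unavailable.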


This build does not yet include full Mac OSX support for the complete range of SDK features, although the color, depth and IR streams are supported on OSX.


librealsense/ at master · IntelRealSense/librealsense · GitHub


The release notes and download links can be found here:


Releases · IntelRealSense/librealsense · GitHub

Hi everyone,


Recently there have been a couple of cases where users could not get the OpenCV example in SDK 2.0's Getting Started guide to work.  This was because the example code referenced a file called rsutil.hpp that did not exist.


Progress towards fixing this is being made.  The instructions have now been updated, replacing this file with a new include.


// include OpenCV header file

#include <opencv2/opencv.hpp>


The new include reference (the file opencv.hpp in a folder called opencv2) is apparently not in the file structure of version 2.80 of SDK 2.0, so will probably be introduced in the next release.


librealsense/ at development · IntelRealSense/librealsense · GitHub

Hi everyone,


I wanted to share the progress on my company's PC game 'My Father's Face', which can make use of RealSense cameras with hand and face tracking.  The video below showcases the opening natural controls tutorial and a small number of the interaction possibilities with those controls in the first section of the game (operating light switches, sink taps, shower, etc).


I apologize for the slowness, camera view glitches and apparent control awkwardness in some parts.  The game is normally super smooth even on low specification, non-gaming laptops with integrated graphics.  The video was recorded on such a laptop, but the video recording software was consuming significant processing resources, hence the slowdown.  I hope that you still enjoy it, and see the possibilities for similar advanced full-body controls in your own projects. 



Some of the game's RealSense mechanisms are documented in the extensive range of step by step guides I have published, listed at this link:


Index of Marty G's RealSense Unity How-To Guides


Whilst the fully realistic body controls shown in the video are possible to create using the standard capabilities supplied with the RealSense SDK, I created my own advanced custom animation system for the game called CamAnims, the principles of which are described here:


Using The 'CamAnims' Real-Time Game Animation Technique

Hi everyone,


A new article has been published about using RealSense cameras to build a mobile robot.


Build an Autonomous Mobile Robot with the Intel® RealSense™ Camera, ROS*, and SAWR | Intel® Software


The above project currently uses the R200 or ZR300 cameras but, according to the author, can be easily upgraded to the new D415 and D435 cameras.

Hi everyone,


The latest version (2.80) of the new RealSense SDK 2.0 is now available for download.  Among the new features are ROS support, preliminary Node.js integration, and a Depth Quality Tool that can be run as a pre-made executable.


Release Intel® RealSense™ SDK 2.0 (build 2.8.0) · IntelRealSense/librealsense · GitHub


Please also note that in 2.80 the names of some API commands have changed.  For example, rs2_stop_pipeline and rs2_start_pipeline are now rs2_pipeline_stop and rs2_pipeline_start (these two changes in particular make the commands fit neatly within the naming convention of the rs2_pipeline class).  See the 'API Changes' link in the release notes (below) for the full list of changes.


The release notes can be read here:


Release Notes · IntelRealSense/librealsense Wiki · GitHub


30 Oct Edit: the current status of Mac OSX support in SDK 2.0 has been updated.


librealsense/ at master · IntelRealSense/librealsense · GitHub

Hi everyone,


I had a faulty SR300 for months that began working perfectly again.  After going through everything I'd done today that might have changed its condition, I found the answer, and confirmed that it was what was making the camera work (as it would only stream when this solution was active).


Here's the tech details:


* Acer Aspire ES 15 laptop with USB 3.0

* An unpowered USB 2.0 4-port hub

* An Android tablet

* A Micro USB tablet charging cable

* RealSense SR300 camera plugged into the USB port on the back of the laptop (not the hub)


With the SR300 plugged into the USB port on the back of the PC, I connected the Android tablet to the USB 2.0 unpowered hub via the Micro USB cable with the tablet switched off, so that it entered battery charging mode.


For the duration that the tablet is in charging mode, the SR300 camera is fully functional in every way.  As soon as the tablet is disconnected from the USB 2.0 hub and its charging ceases, the SR300 stops working.


If people who are having problems with their SR300 are fortunate enough to also own a tablet (or some other USB-chargeable device), I recommend giving the above process a try.  There's nothing to lose!


Edit: the process seems to require that the device being charged has less than 100% charge on it.  If the device does not allow the charging process to proceed because the battery is full then the SR300 camera signal cuts out.


Edit 2: 24 hours after the initial posting of this article, the SR300 camera is still functioning exactly according to the rules above, and the point about connected devices needing to have less than 100% charge in order for the camera to work normally is verified.

Hi everyone,


The RealSense D415 and D435 cameras, and the countries that they can be shipped to, are now listed on the Intel Click online store, though the cameras are not yet listed as in stock.


Intel® RealSense™ Developer Kits

Hi everyone,


The RealSense ZR300 camera model is joining the R200 and SR300 in being discontinued.  The new RealSense D435 camera will be the ZR300's successor, and the focus of the RealSense range will be the D415 and D435 cameras and the new open-source, cross-platform RealSense SDK 2.0 software.


If you are thinking of purchasing a ZR300 Developer Kit camera, please do so whilst stocks last in the Intel Click online store.


Intel® RealSense™ Developer Kits


Thank you.

Hi everyone,


A number of people have wanted to use the Faceshift software with RealSense in the past, but it did not work with modern RealSense SDK setups.  However, ppaneter kindly provided a precise configuration that allowed Faceshift to work with the SR300 camera, and MrPMorris has confirmed that the method works for them.  Thanks guys!


Re: SR300 not working with Asus laptop

Hi everyone,


The virtual reality news site Road To VR reports - with a confirmation statement from Intel - that the Project Alloy wireless 'merged reality' headset that was due to be released by the end of this year has unfortunately been canceled.


Given that a number of manufacturers had recently debuted Windows VR mixed-reality headsets whilst there was no fresh news of Alloy, the possibility of the project having been canceled seemed strong.  It is sadly confirmed now.


RealSense SDK 2.0 documentation

Posted by MartyG Sep 13, 2017

Hi everyone,


Now that the new open-source, cross-platform RealSense SDK 2.0 for Windows and Linux has launched, I went through the documentation files and assembled some useful introductory information links.  SDK 2.0 is an upgraded version of Librealsense (which has also been referred to in recent times by Intel by the alternate name 'RealSense Cross Platform API').


It is distinguished from the previous version of Librealsense by being identified as a "development" branch (one that is still in development and may have issues), whilst the older Librealsense version is on the stable "master" branch.




GitHub - IntelRealSense/librealsense at development



librealsense/ at development · IntelRealSense/librealsense · GitHub



librealsense/ at master · IntelRealSense/librealsense · GitHub



librealsense/examples at development · IntelRealSense/librealsense · GitHub



librealsense/tools at development · IntelRealSense/librealsense · GitHub



librealsense/ at development · IntelRealSense/librealsense · GitHub



librealsense/ at development · IntelRealSense/librealsense · GitHub

Since Intel acquired the vision technology company Movidius a year ago, people have been wondering how that technology would manifest as an Intel product.  Although the new RealSense D415 and D435 cameras were undoubtedly influenced by the acquisition, the most visible manifestation of the purchase has been revealed today: the Movidius Myriad X vision chip.


Although not a RealSense product, it is worth highlighting the Myriad X on this forum because of the natural parallels of its possible applications with the kind of projects that developers use RealSense cameras for.


You can read more about the Myriad X in this Intel news release.


Introducing Myriad X: Unleashing AI at the Edge | Intel Newsroom


Edit: further details about Myriad X can be found on its information page.


Myriad™ X: Ultimate Performance at Ultra-Low Power | Machine Vision Technology | Movidius

Hi everyone,


As my company's RealSense-enabled full game 'My Father's Face' heads for its late 2017 release date on the target platforms of PC and the Intel Project Alloy headset, I thought it would be a good time to share some new preview images.  We hope they will give you creative inspiration for what you can achieve with the RealSense camera in your own projects when you really harness its true power!


Features of 'My Father's Face' include:


- A huge, fully explorable, physics-driven island in an exciting new universe where you can go anywhere that you can see.

- Choice of either male or female characters to play as, and local and online multiplayer in a shared world with mixed or same genders.  Walk together, run together, play together, work together and touch together!

- Live lip sync that replicates the player's mouth movements as they speak into their real-life microphone and fully animates the virtual face.

- Player-controlled characters that utilize RealSense technology to precisely mirror the player's body and arm / hand movements and facial expressions almost 1:1, thanks to technology two years in the making.

Hi everyone,


After creating many hundreds of pages of personal research material over the past seven years - part of a much vaster archive belonging to my company - I decided that it was time that I tried to put more of it into practical use for modern-day applications such as development of software for the forthcoming Intel Project Alloy merged-reality headset.  The natural place to start was at the beginning of the archive, in 2010.


I rediscovered an idea for creating living avatars for classroom teaching.  The teacher would wear an all-covering white bodysuit, whilst an image projector mounted on rails on the classroom ceiling would track the teacher's position and project an image onto the suit's surface that changed the teacher's appearance.  For example, if the teacher got down on their hands and knees, the projector could make them look like a bear by projecting a bear image onto the suit from above.



As the Project Alloy headset can scan large real-world objects such as furniture and convert them into virtual objects of a similar size, I realized that this could provide a new way to make living avatars a reality.  If the headset could scan furnishings and convert them into a virtual representation, I wondered, maybe it could do the same for living people observed by the headset, as to the camera they should be no different from moving furniture.  This would allow any person observed by the headset wearer to take on a virtual form of similar size and shape.


And as Alloy is constantly scanning the room (a feat made possible by its advanced vision processing chip), rather than just taking a single calibration scan at start-up (like Microsoft's Kinect camera did), in theory it ought to be able to update in real-time the virtual representation of the living person that the headset's camera is observing.


Perhaps, with Project Alloy, we will all have the opportunity to interact with friends and colleagues as animals, heroes, villains and creatures beyond imagination ...