5 Replies Latest reply on Jul 1, 2017 7:02 AM by MartyG

    Anyone ever figure out how to send video frames to the SDK?

    robertoschler

  I've been away from RealSense development for almost a year.  I have a few "update" questions:

       

      - Did anyone ever figure out how to send video frames to the SDK instead of getting them from the camera?  I'd like to use the background removal service on video files.  Several people have asked how to do this over time, but I've never seen a solution.

      - I have the F200 and the R200.  I've already seen the table comparing those cameras to the newer SR300.  But from a developer's perspective, is there any major feature or advantage to the SR300 that might compel me to upgrade to that camera?

       

      Finally, why are all the RealSense camera sub-forums "archived"?  Please tell me that the RealSense product line or SDK hasn't been scuttled like the IoT Galileo products.  I love the RealSense camera line.

        • 1. Re: Anyone ever figure out how to send video frames to the SDK?
          MartyG

          1.  You can record camera data to .rssdk files (based on the H.264 video format) and load them back into the SDK, which then treats the file as though it were receiving live input from the camera.

           

          How to Record and Playback Streaming Sequences in Intel® RealSense™ SDK | Intel® Software
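As a rough illustration, a playback loop in the 2016-era SDK looks something like the sketch below.  This is reconstructed from memory of the SDK's record / playback sample (the class and constant names such as PXCSenseManager and SetFileName are my recollection of that API), so treat it as approximate and check it against the linked article:

```cpp
#include "pxcsensemanager.h"  // Intel RealSense SDK 2016 header

int main() {
    PXCSenseManager *sm = PXCSenseManager::CreateInstance();

    // Second argument: false = play the .rssdk file back as if it were
    // a live camera; true = record the live camera into the file.
    sm->QueryCaptureManager()->SetFileName(L"capture.rssdk", false);

    sm->EnableStream(PXCCapture::STREAM_TYPE_COLOR, 640, 480, 30);
    sm->Init();

    while (sm->AcquireFrame(true) >= PXC_STATUS_NO_ERROR) {
        PXCCapture::Sample *sample = sm->QuerySample();
        // sample->color now holds a frame read from the file; SDK modules
        // (e.g. background segmentation) consume it as if it were live.
        sm->ReleaseFrame();
    }

    sm->Release();
    return 0;
}
```

The same loop records instead of playing back if the second SetFileName argument is true, which is why the SDK can feed any enabled module from a file rather than the camera.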

           

          2.  The SR300 has a slightly longer default maximum range (1 m for the F200, 1.5 m for the SR300).  The SR300 is newer camera technology, but you may not see a massive difference between the two cameras.  The real generational leap will be the forthcoming RealSense 400, which can be used both indoors and outdoors, tracks double the number of depth points, and has twice the operating range of previous generations.  The R400 is due sometime this year.

 

          The old Intel Developer Zone RealSense forums are archived, as focus was switched in January 2017 to this forum on the Support section of the Intel website, which is on a different forum system. 

           

          Despite the ending of Joule, Edison and Galileo, the RealSense camera range is continuing.  Some RealSense products may be retired over time, but new RealSense products such as the RealSense 400 will be introduced into the range.

          • 2. Re: Anyone ever figure out how to send video frames to the SDK?
            robertoschler

            Hi Marty,

             

            Thanks a ton.  Nice to see someone from the Perceptual and RealSense challenge days.

             

            The R400 sounds really cool.  Can't wait to get my hands on one.  What kind of projects have you been up to lately?

             

            Feel free to share any other similar RealSense threads that have hugely important information like that streaming sequences thread.  I fear I may have missed something important.

            • 3. Re: Anyone ever figure out how to send video frames to the SDK?
              MartyG

              This link provides scripts for each aspect of raw data streaming, including more on record / playback.

               

              Intel® RealSense™ SDK 2016 R2 Documentation

               

              As for me, thanks for asking.  I have been on the same PC game project for the past three years.  It started off as a solely RealSense game.  I realized over time, though, that a game that could only be played with a developer kit camera would not sell many copies!

               

              So I engineered custom systems that allow control of a game character's full body and arms / hands with joypad or mouse, as natural and precise as the RealSense controls, using a real-time animation system I call CamAnims (named for its origins as a RealSense animation system).  This approach should allow easy full-body control in VR by letting players bind the traditional controls to headset motion sensors and VR 'six degrees of freedom' hand controller inputs.

               

              RealSense still handles avatar facial animation by reading facial features such as the eyes, lips and eyebrows, though there is automated animation too for those who do not have a camera or a headset with a camera.

               

              The level of real-time body control possible with CamAnims is similar to CG animation like this scene from 'Sing'.

               

              https://m.youtube.com/watch?v=enuprdVo7GA

               

              The details for building a CamAnim system will be released to the RealSense community once I have time to write proper public documentation for it.  In the meantime, I welcome individual questions about it, and would gladly point you to a forum post from March 2017 where I discuss the CamAnims tech in more detail.

               

              'My Father's Face' for PC and Project Alloy - Tech Trailer 8

               

              Apologies for the jerky frame rate on the test video in that post - the video recording software was having a bad day.  The true speed is equal to human movement, like in the 'Sing' video. 

               

              The target platforms for the game's release this year are PC and the forthcoming RealSense-equipped Intel Project Alloy 'merged reality' headset, though it should be playable on Vive and Oculus if the controls are bound as for a non-VR game played on the headset.

              • 4. Re: Anyone ever figure out how to send video frames to the SDK?
                robertoschler

                Thanks for the streaming data link, and good luck with your game project.  CamAnims sounds like a very cool system, and I enjoyed your demo video.

                 

                If you happen to find any web pages that talk about how to scan a 3D object into Adobe After Effects, please let me know.  I picked up an inexpensive "storefront spinner" from a camera shop that I can spin an object on while scanning it with my F200 (I believe the F200 is a better choice for 3D scanning than my R200?).  But the tutorial I found only shows how to import OBJ files converted to PLY into Blender, not After Effects:

 

                Intel RealSense 3D scanning: How To Scan then Prep for Blender & Unity | Intel® Software

                • 5. Re: Anyone ever figure out how to send video frames to the SDK?
                  MartyG

                  Thanks for your kind words.  I believe you only need to convert an OBJ to PLY if you want to import textures from a model that originated in RealSense into a modeling package such as Blender.  Otherwise, if you just want to import a plain gray scan, OBJ should suffice.

                   

                  Apparently, though, importing .OBJ files into modern versions of After Effects can be problematic, because they lack built-in OBJ support and need a plugin installed into the After Effects installation.  The plugin is called Trapcode Form and has to be purchased for $200.

                   

                  Buy Red Giant Trapcode Form | Download the free trial

                   

                  A more affordable alternative for using OBJ files is Maxon Cinema 4D Lite, which comes packaged with After Effects.

                   

                  Insert 3D objects into your compositions