The R200 does not have the hand tracking feature that the F200 and SR300 do. As an alternative, though, you could try using Blob Tracking. This is a simpler form of tracking that recognizes a general flat-ish area of skin, such as the palm, instead of specific hand joints.
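As a rough illustration, a blob-tracking session in the 2016 R2 C# SDK follows the usual SenseManager pattern (CreateInstance / EnableBlob / Init / AcquireFrame). The overall flow below is the documented one, but the exact PXCMBlobData method names are from memory, so check them against the SDK reference before relying on this sketch:

```csharp
// Sketch only: assumes the RSSDK 2016 R2 C# assemblies (libpxcclr.cs) are referenced.
PXCMSenseManager sm = PXCMSenseManager.CreateInstance();
sm.EnableBlob();                                  // turn on the blob module
PXCMBlobModule blobModule = sm.QueryBlob();
PXCMBlobData blobData = blobModule.CreateOutput();
sm.Init();

while (sm.AcquireFrame(true) >= pxcmStatus.PXCM_STATUS_NO_ERROR)
{
    blobData.Update();                            // refresh blob results for this frame
    int nBlobs = blobData.QueryNumberOfBlobs();
    // ... inspect each blob (contour, extremity points) here ...
    sm.ReleaseFrame();
}
sm.Dispose();
```

The per-blob queries (contours, extremity points) vary between SDK releases, which is why the loop body is left as a comment here.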
Is it a platform-specific hand tracking algorithm? Why can't I use PXCMHandModule or PXCMHandCursorModule with a pipeline from the R200 camera?
The R200 was designed originally to be installed on the rear of mobile devices such as tablets and face outwards (i.e. be "world-facing"), unlike the F200 and SR300, which were designed to face towards the user. Therefore there was no compelling need for hand tracking in the R200, since it would only be of use if you held the device with one hand and reached your other hand around the back of the device so that the camera could see it.
I understand that the R200 is a rear-facing camera. I have an R200 development kit and I would like to try the cursor module to see how it works. Is it possible to somehow re-initialize the R200 so that the cursor module works with the video from the camera, or to create an independent video file that can be fed as input to the cursor module? I work with SDK 2016 R2 on Visual Studio 2015 with C#.
There is certainly nothing to stop you from using the R200 like a user-facing camera. Its rear-facing origins mean, though, that you will have to find workarounds for the features it lacks because of that mobile heritage.
The .RSSDK video format that RealSense uses is based on the H.264 video format, also known as MPEG-4 Part 10 (not to be confused with the different MP4 video format). So any independent video file that you create would likely have to be in the H.264 format in order to be compatible with RealSense. That is, unless you mean that you aim to use RealSense to record an RSSDK video directly from the camera.
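If you do mean recording straight from the camera, the SDK supports that through PXCMCaptureManager.SetFileName with the record flag set to true. A minimal C# sketch, where the output path, stream resolutions, and frame count are just placeholder choices of mine:

```csharp
// Sketch only: assumes the RSSDK 2016 R2 C# assemblies are referenced
// and an R200 is connected.
PXCMSenseManager sm = PXCMSenseManager.CreateInstance();

// true = record everything flowing through the pipeline into an .rssdk clip.
sm.QueryCaptureManager().SetFileName("C:\\clips\\capture.rssdk", true);
sm.EnableStream(PXCMCapture.StreamType.STREAM_TYPE_COLOR, 640, 480, 30);
sm.EnableStream(PXCMCapture.StreamType.STREAM_TYPE_DEPTH, 0, 0);  // 0,0 = any depth profile

sm.Init();
for (int i = 0; i < 300; i++)   // roughly 10 seconds at 30 fps
{
    if (sm.AcquireFrame(true) < pxcmStatus.PXCM_STATUS_NO_ERROR) break;
    sm.ReleaseFrame();          // frames are written to the clip as the pipeline runs
}
sm.Dispose();
```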
How can I pack recorded H.264 video from another camera into an .rssdk file so that I can then play it in the Clip Editor?
I haven't seen any documentation on playing in RealSense an H.264 format video that was recorded by a non-RealSense camera, so I can only make some guesses about how to experiment with this.
If you haven't got the playback code already, you can find it in the 'Steps to play the streaming sequences file' section at the very bottom of this link:
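In case that page is unavailable, playback in the 2016 R2 C# SDK goes through the same SetFileName call as recording, with the record flag set to false. A minimal sketch, with the clip path a placeholder:

```csharp
// Sketch only: assumes the RSSDK 2016 R2 C# assemblies are referenced.
PXCMSenseManager sm = PXCMSenseManager.CreateInstance();
PXCMCaptureManager cm = sm.QueryCaptureManager();
cm.SetFileName("C:\\clips\\capture.rssdk", false);  // false = play back, don't record
cm.SetRealtime(false);  // step through frames as fast as they can be processed

sm.Init();
while (sm.AcquireFrame(true) >= pxcmStatus.PXCM_STATUS_NO_ERROR)
{
    PXCMCapture.Sample sample = sm.QuerySample();
    // ... process sample.color / sample.depth here ...
    sm.ReleaseFrame();
}
sm.Dispose();
```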
Method 1: try loading the H.264 file directly into the SDK and hope that it recognizes it as an RSSDK file without needing to change it.
Method 2: if the SDK objects to the file format, try renaming the file so it has an .rssdk extension and see whether the SDK accepts it then.
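For Method 2, the rename itself is just a plain .NET file operation; a hypothetical helper (the name RenameToRssdk is my own) could look like this. Whether the SDK then accepts the renamed file is exactly the open question the experiment would answer:

```csharp
using System;
using System.IO;

// Hypothetical helper: gives an existing recording the .rssdk extension
// and returns the new path. This is purely a file-system rename; it does
// not convert the contents in any way.
string RenameToRssdk(string path)
{
    string newPath = Path.ChangeExtension(path, ".rssdk");
    if (!string.Equals(path, newPath, StringComparison.OrdinalIgnoreCase))
        File.Move(path, newPath);
    return newPath;
}
```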
I'm sorry I can't be any more definite than that with my suggestions - there always has to be someone who tries something for the first time before the rest of us learn whether it is possible or not.