Why can't gesture and cursor information be displayed simultaneously in cursor mode?
I ran tests with the Touchless Controller example in the RealSense SDK '2016 R2', which uses the Cursor Mode's tracking and gesture principles. The cursor seemed to have a very limited 'sweet spot' range close to the camera in which tracking would be activated or stall, depending on how far the hand is from the camera.
The requirement to have the hand so near the camera, and the ease with which the tracking can be stalled when the hand moves further away, makes me think that the mode is probably using crude 'blob tracking' of the hand palm, rather than more advanced hand bone-joint tracking. Blob tracking is a mode where the camera just looks for a large flat-ish surface area, such as the palm, knee or sole of the foot (the SR300 can track knees, soles and toes too, as it mistakes them for palms and fingers).
Based on my tests, if you are losing the cursor coordinate display when doing gestures then I wonder if your hand is going outside of the limited tracking range when you are making the gesture and therefore stalling tracking. In the Touchless Controller sample program, when this happens a 'Cursor not visible' message appears at the top of the screen.
If you have not used the Touchless Controller sample before, you can find it by going to a folder that the '2016 R2' and '2016 R3' SDKs place on your desktop called 'Intel RealSense SDK Gold'. Inside this folder is a program called Sample Browser.
In the search box at the side of the Browser, type in 'touchless' to easily find the Touchless Controller. You can click the Run button to run the sample from the Browser, or click 'Source' to access the source code for the sample.
When g_gestures=true and g_Info=true, only g_Info works and no gestures display. Why is that?
Part of the code is as follows:
if(g_cursor)
{
    // Get current cursor-mode outputs
    if(g_cursorDataOutput->Update() == PXC_STATUS_NO_ERROR)
    {
        // Display alerts
        if(g_alerts)
        {
            PXCCursorData::AlertData alertData;
            for(int i = 0; i < g_cursorDataOutput->QueryFiredAlertsNumber(); ++i)
            {
                if(g_cursorDataOutput->QueryFiredAlertData(i, alertData) == PXC_STATUS_NO_ERROR)
                {
                    std::printf("%s was fired at frame %d \n", Definitions::CursorAlertToString(alertData.label).c_str(), alertData.frameNumber);
                }
            }
        }
        // Display gestures
        if(g_gestures)
        {
            PXCCursorData::GestureData gestureData;
            for(int i = 0; i < g_cursorDataOutput->QueryFiredGesturesNumber(); ++i)
            {
                if(g_cursorDataOutput->QueryFiredGestureData(i, gestureData) == PXC_STATUS_NO_ERROR)
                {
                    std::wprintf(L"Gesture: %s was fired at frame %d \n", Definitions::GestureTypeToString(gestureData.label), gestureData.frameNumber);
                    const pxcCHAR* gestureName = Definitions::GestureTypeToString(gestureData.label);
                    const pxcCHAR* click = L"CURSOR_CLICK";
                    // Compare string contents with wcscmp; == would only compare pointers
                    if (wcscmp(gestureName, click) == 0)
                    {
                        std::string path = "C:\\Program Files (x86)\\iebook\\release\\iebook.exe";
                        int a = WinExec(path.c_str(), SW_SHOW);
                    }
                }
            }
        }
        // Display cursor information
        if(g_Info)
        {
            PXCCursorData::ICursor *cursor;
            for(int i = 0; i < g_cursorDataOutput->QueryNumberOfCursors(); ++i)
            {
                // Check the status before using the cursor pointer
                if(g_cursorDataOutput->QueryCursorData(PXCCursorData::ACCESS_ORDER_BY_TIME, i, cursor) == PXC_STATUS_NO_ERROR)
                {
                    std::string handSide = cursor->QueryBodySide() == PXCHandData::BODY_SIDE_LEFT ? "Left Hand Cursor" : "Right Hand Cursor";
                    std::printf("%s\n==============\n", handSide.c_str());
                    std::printf("Cursor Image Point: X: %f, Y: %f \n", cursor->QueryCursorImagePoint().x, cursor->QueryCursorImagePoint().y);
                    std::printf("Cursor World Point: X: %f, Y: %f, Z: %f \n", cursor->QueryCursorWorldPoint().x, cursor->QueryCursorWorldPoint().y, cursor->QueryCursorWorldPoint().z);
                    std::printf("Cursor Engagement status: %d%c \n\n", cursor->QueryEngagementPercent(), '%');
                }
            }
        }
    }
}
I have never programmed the cursor mode, but from a general programming-principles point of view: the g_gestures and g_Info sections should be independent sibling if-blocks, so if g_gestures never triggers while g_Info works, I would recommend checking the closing brackets.
If there is a mistake in the number of brackets or their position, the g_gestures section may end up nested inside another block, and so only run when that block's condition is also true, instead of being evaluated on its own.