Hi,
I would like to trigger a script in Unity based on input received from the SR300 camera. I came across this video (https://www.youtube.com/watch?v=D_8YvdvRZKI Alternative Hand Tracking Modes with the Intel® RealSense™ SDK and Unity* 5 - YouTube), where the hand blob is shown on screen as it is tracked by the SR300.
I would like to view the hand blob image in the same way and, at the same time, trigger an event via a script. I have created a few box collider objects but am not sure how to trigger events from them.
Thank you.
Given how simple blob tracking is, using the gestures Blob Detected or Blob Lost with a SendMessageAction may be your best option. Blob tracking requires you to place your hand much closer to the camera than joint tracking does, so it is very hard to trigger it accidentally.
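As a rough sketch of the receiving side: in the RealSense Unity toolkit, a SendMessageAction configured with a gesture trigger calls a named method on a target GameObject when the gesture fires. Assuming the action is set up with the Blob Detected / Blob Lost rules and pointed at the function names below (the class and method names here are just illustrative, not part of the SDK), the handler script could look something like this:

```csharp
using UnityEngine;

// Attach this to the GameObject that the SendMessageAction targets.
// The SendMessageAction's gesture rules would be configured to send
// "OnBlobDetected" and "OnBlobLost" respectively (hypothetical names,
// chosen here for illustration).
public class BlobEventHandler : MonoBehaviour
{
    // Called by SendMessageAction when the Blob Detected gesture fires.
    public void OnBlobDetected()
    {
        Debug.Log("Blob detected - trigger your game logic here");
    }

    // Called by SendMessageAction when the Blob Lost gesture fires.
    public void OnBlobLost()
    {
        Debug.Log("Blob lost");
    }
}
```

Unity's SendMessage mechanism matches on the method name, so the names in the script must exactly match the strings configured on the SendMessageAction.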