Hi, I'm from Rome, and I'm interested too.
We got one for our fablab as well.
Come visit us at the fablab in Ostiense: www.romamakers.org
Hi, I'm in the province of Pisa; I know who you are and I even spotted you at the Maker Faire. Anyway, I'd be happy to collaborate; I'll contact you via Facebook and we'll see what we can set up.
Hi, I'm Italian too; I'm writing in English for the other users.
Your project is interesting, though perhaps beyond my programming knowledge, so (for now) I can't participate actively, but I'll try to take a look at that SDK.
In general, I think that Galileo development could expand along two main lines:
- "classic" Arduino programming: C++ libraries plus sketch code in the Arduino IDE
- Linux-side programming via the SDK, running natively on the board
Both are useful, but the latter requires more architecture-level knowledge.
I think the project needs both approaches:
- A C++ module running on Linux for the actual feature detection, i.e. acquiring the signal from the webcam(s), converting it into usable features that can be fed to a classifier for training, and then performing detection using the training data
- A set of Arduino libraries which could be used, for example, to move the camera to follow hands and, when gestures are detected, to drive some outputs, e.g. switching on a lamp, opening a cabinet door, and so on.
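The detection half of the pipeline above could be sketched roughly as follows. This is only an illustrative toy, not the project's actual code: the feature vectors, gesture labels, and the nearest-centroid classifier are all assumptions standing in for whatever OpenCV would actually extract from the webcam frames.

```python
import math

# Hypothetical training data: each gesture label maps to a few example
# feature vectors (in reality these might be hand-contour moments
# extracted by OpenCV from webcam frames).
TRAINING = {
    "swipe_left":  [[0.9, 0.1], [0.8, 0.2]],
    "swipe_right": [[0.1, 0.9], [0.2, 0.8]],
}

def centroid(vectors):
    # Mean feature vector of one gesture class.
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

# "Training" here is just computing one centroid per gesture class.
CENTROIDS = {label: centroid(vs) for label, vs in TRAINING.items()}

def classify(features):
    # Detection: assign the incoming feature vector to the nearest centroid.
    return min(CENTROIDS, key=lambda label: math.dist(features, CENTROIDS[label]))

print(classify([0.85, 0.15]))  # → swipe_left
```

A real implementation would of course replace the toy vectors with features computed from camera frames, and likely a stronger classifier, but the train-then-detect flow would follow this shape.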
I'm pretty sure that if we build a diversified group around the project we can use everybody's help, e.g. for testing or even for providing requirements and use cases.
The specs say that OpenCV can be used; could that be a viable way?
@fbasile: In these environments I'm more comfortable on the low-level side; the main problem for me is time availability. Let's see what the possibilities are.
@stefanosky: Where did you see the OpenCV specification?
P.S. Your fablab is a nice project!
Yes, I know we can use OpenCV, but it's simply not enough for a gesture-tracking appliance.
It comes pre-compiled on the mini-SD Linux image shipped with the Galileo, and it's also accessible via the Python cv module. The current OpenCV gesture tracking works with a webcam, identifying the hand position by extracting it from the background of the picture.
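The background-extraction idea mentioned here can be illustrated with a toy frame-differencing sketch: subtract a background frame from the current frame, threshold the difference, and take the centroid of the changed pixels as the hand position. This is a plain-Python stand-in for what OpenCV does with real images; the 4x4 "frames" and the threshold value are made up for the example.

```python
def hand_position(background, frame, threshold=30):
    """Return the (x, y) centroid of pixels that differ from the
    background by more than `threshold`, or None if nothing moved."""
    xs, ys = [], []
    for y, (brow, frow) in enumerate(zip(background, frame)):
        for x, (b, f) in enumerate(zip(brow, frow)):
            if abs(f - b) > threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Toy 4x4 grayscale frames: empty background, bright "hand" at lower right.
bg = [[0] * 4 for _ in range(4)]
fr = [[0] * 4 for _ in range(4)]
fr[2][3] = 200
fr[3][3] = 200

print(hand_position(bg, fr))  # → (3.0, 2.5)
```

In OpenCV proper the same steps would typically use a background subtractor, a threshold, and contour or moment extraction on full camera frames, but the principle is the same.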
On the other hand, I know for sure that Intel has much more powerful software and hardware in their perceptual toolkit; that's why I'm asking for support here.
The point of my project is that I would like to build a "Galileo box" you can put in front of a keyboard, or on top of a cabinet or robot, able to track and follow the movement of your hands from below (looking up vertically), rather than facing a camera or screen.
In my mind it would look like a Leap Motion sensor, except that it should work while you stand in front of the "object", not while you sit in front of your computer screen.
I hope I've explained my idea better!
I think the main question is how to port a standard Linux distribution to the Galileo board.
I checked via SSH and it seems to be a Yocto distribution; I don't know how useful that could be.