[Gesture Lab] - Project Goal


Gesture Lab | VX1/Voxon | Gesture-oriented program + holography

Description:
Designing and writing a fully functional gesture input program to serve as an input
control system for the VX1 machine.


Motivation:
The motivation behind this project is to close the gap between the user and the
VX1 machine by letting the user operate the machine without pressing any physical
button.


External Links:
A video about VX1:
Hand gesture UI examples:


We are going to build this project on top of Voxon's source code. You can download their
SDK through their website: https://voxon.co/. It includes 6 files written in C/C++ that cover
almost everything about how its OS (simulator) and other applications work, around
120,000 lines of code in total. For our project we only need to focus on voxiesimp.c and
voxieleap.c, but we are also going to explore some useful functions in voxiebox.h and
voxiedemo.c. (The draw functions work much like Processing: we initialize a voxiebox
object and use it to draw a line, a cube, or something else.) Because we are going to use
Leap Motion technology, we need the LeapSDK too.
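
For orientation, the simple demos are organized as a load/init step followed by a per-frame
loop (breath, frame start, draw calls, frame end). The skeleton below is only a sketch
reconstructed from how the demo code is laid out; the function names and parameter lists
are assumptions and should be checked against voxiebox.h before relying on them.

/* Sketch of the Voxon draw loop as laid out in the demo programs.
 * All function names and signatures here are assumptions -- verify
 * against voxiebox.h in the SDK before use. */
#include "voxiebox.h"

static voxie_wind_t vw;

int main(int argc, char **argv)
{
    voxie_frame_t vf;
    voxie_inputs_t in;

    if (voxie_load(&vw) < 0) return -1;   /* load the DLL and settings (assumed name) */
    if (voxie_init(&vw) < 0) return -1;   /* start the display or the simulator       */

    while (!voxie_breath(&in))            /* one pass per volume refresh              */
    {
        voxie_frame_start(&vf);
        /* Processing-style immediate drawing: a single voxel at the center.
         * The header also exposes line/box/sphere helpers for richer scenes. */
        voxie_drawvox(&vf, 0.0f, 0.0f, 0.0f, 0xffffff);
        voxie_frame_end();
    }

    voxie_uninit(0);
    return 0;
}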


Documentation for LeapSDK 4.0.0 (the latest version):
LeapC: https://developer.leapmotion.com/documentation/v4/index.html
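
Before wiring gestures into the VX1 side, we can verify tracking on its own. Below is a
minimal sketch of the LeapC polling pattern (LeapSDK 4.x): open a connection, poll for
tracking events, and print palm position plus pinch/grab strength. Error handling and
connection-loss recovery are left out; the 500-frame limit just keeps the sketch finite.

/* Minimal LeapC polling sketch (LeapSDK 4.x). Link against LeapC.lib. */
#include <stdio.h>
#include "LeapC.h"

int main(void)
{
    LEAP_CONNECTION conn;
    if (LeapCreateConnection(NULL, &conn) != eLeapRS_Success) return 1;
    if (LeapOpenConnection(conn) != eLeapRS_Success) return 1;

    for (int i = 0; i < 500; i++)
    {
        LEAP_CONNECTION_MESSAGE msg;
        /* Wait up to 1000 ms for the next event from the tracking service. */
        if (LeapPollConnection(conn, 1000, &msg) != eLeapRS_Success) continue;
        if (msg.type != eLeapEventType_Tracking) continue;

        const LEAP_TRACKING_EVENT *frame = msg.tracking_event;
        for (uint32_t h = 0; h < frame->nHands; h++)
        {
            const LEAP_HAND *hand = &frame->pHands[h];
            printf("%s hand: palm (%.0f, %.0f, %.0f) mm, pinch %.2f, grab %.2f\n",
                   (hand->type == eLeapHandType_Left) ? "left" : "right",
                   hand->palm.position.x, hand->palm.position.y, hand->palm.position.z,
                   hand->pinch_strength, hand->grab_strength);
        }
    }

    LeapCloseConnection(conn);
    LeapDestroyConnection(conn);
    return 0;
}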


Project tasks:
1.)  Read the documentation and code of the Voxon SDK and LeapSDK. (done)
2.)  Set up the environment and ran some simple tests. (done)
3.)  Design and implement the code. We will focus on the interface first. The idea is to
map the hand gesture functionality onto their original input system (see the sketch after
this list). We may need to design scenarios for how the user will interact with the
interface (more UX-based) and only change their interface if necessary.
4.)  Write good documentation as well as a website (this can be a wiki page on GitHub)
that briefly describes what we are doing for this project.
5.)  Make a video that shows the features of our program. (To attract people.)
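
For task 3, one low-risk approach is a thin mapping layer that translates each recognized
gesture into the same abstract action the original input system already performs, so the
existing interface stays untouched. In the sketch below the action names are placeholders
(nothing here is taken from voxiesimp.c); only the dispatch pattern is the point.

/* Hypothetical gesture-to-action mapping layer. The action names stand in
 * for whatever the original input handlers in voxiesimp.c actually trigger. */
typedef enum { GESTURE_NONE, GESTURE_GRAB, GESTURE_PINCH,
               GESTURE_SWIPE_LEFT, GESTURE_SWIPE_RIGHT } gesture_t;

typedef enum { ACTION_NONE, ACTION_DRAG_OBJECT, ACTION_ZOOM,
               ACTION_ROTATE_LEFT, ACTION_ROTATE_RIGHT } action_t;

static action_t map_gesture_to_action(gesture_t g)
{
    switch (g)
    {
        case GESTURE_GRAB:        return ACTION_DRAG_OBJECT;  /* grab and move the object */
        case GESTURE_PINCH:       return ACTION_ZOOM;         /* pinch to zoom in or out  */
        case GESTURE_SWIPE_LEFT:  return ACTION_ROTATE_LEFT;  /* swipe to turn the object */
        case GESTURE_SWIPE_RIGHT: return ACTION_ROTATE_RIGHT;
        default:                  return ACTION_NONE;
    }
}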

The best thing is that I already have experience with Leap Motion and know how to
implement the Leap Motion code. It is not a hard task, but it takes time to learn, and it
requires a good design with sufficient testing to make the program work well. Voxon also
ships a simple demo program in voxieleap.c, so we can start from that template (it is only
about 500 lines). We are not going to aim for a complicated gesture design, such as more
than 5 gestures or combo gestures. We want the program to be so easy to use that anyone
can learn it in a few seconds: grabbing an object, moving the object around, pinching two
fingers to zoom in or out, swiping a hand to turn the object around, and so on.
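
To make that concrete, here is a rough sketch of how this small gesture set could be
detected from LeapC hand data, using the same gesture_t as in the mapping sketch above.
The 0.8 strength thresholds and the 300 mm/s swipe speed are made-up starting values
that would need tuning on the real device.

/* Rough gesture classifier over LeapC hand data. Thresholds are guesses
 * to be tuned during testing on the VX1. */
#include "LeapC.h"

typedef enum { GESTURE_NONE, GESTURE_GRAB, GESTURE_PINCH,
               GESTURE_SWIPE_LEFT, GESTURE_SWIPE_RIGHT } gesture_t;

static gesture_t classify_hand(const LEAP_HAND *h)
{
    if (h->grab_strength  > 0.8f) return GESTURE_GRAB;   /* closed fist              */
    if (h->pinch_strength > 0.8f) return GESTURE_PINCH;  /* thumb and index together */

    /* Fast sideways palm motion; LeapC reports velocity in mm/s. */
    if (h->palm.velocity.x >  300.0f) return GESTURE_SWIPE_RIGHT;
    if (h->palm.velocity.x < -300.0f) return GESTURE_SWIPE_LEFT;

    return GESTURE_NONE;
}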
