Together with the Institute for Visualization and Interactive Systems (VIS) at the University of Stuttgart, we are developing the prototype of a new kind of assistive system in the motionEAP project. The prototype combines the 3D sensing spaces of two depth sensors, Kinect and Leap Motion. It detects the individual fingers of both hands and lets users direct processes with simple gestures, recognizing both gestures in mid-air and touch events on the surface of the workspace. At the same time, the system can project videos or interactive 3D scenes onto any kind of surface.
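One way to distinguish the two event types described above is by a fingertip's distance to the workbench plane as seen by the depth sensor. The following is only an illustrative sketch; the function names, coordinate convention, and threshold are our assumptions, not the actual motionEAP implementation:

```python
# Illustrative sketch: classify a detected fingertip as a surface touch or
# an in-air gesture by its distance to the workspace surface plane.
# The threshold and all names are hypothetical.

TOUCH_THRESHOLD_MM = 10.0  # fingertip within ~10 mm of the surface counts as a touch

def classify_event(fingertip_z_mm: float, surface_z_mm: float) -> str:
    """Return 'touch' if the fingertip is close enough to the surface,
    otherwise 'air' for an in-air gesture."""
    distance = abs(fingertip_z_mm - surface_z_mm)
    return "touch" if distance <= TOUCH_THRESHOLD_MM else "air"
```

In practice such a threshold would have to be calibrated per setup, since depth noise varies with sensor distance and surface material.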
For example, a workpiece can be zoomed or rotated with simple gestures. In future development iterations we will integrate object detection, enabling context- or product-specific feedback on processes, e.g. in manual assembly.
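The zoom and rotate interactions can be derived from the tracked positions of two fingertips: the ratio of the current to the initial fingertip distance gives a zoom factor, and the change in the angle of the line between the fingertips gives a rotation. A minimal sketch, assuming 2D fingertip coordinates and hypothetical function names:

```python
import math

def _dist(a, b):
    """Euclidean distance between two (x, y) fingertip positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def zoom_factor(p1_start, p2_start, p1_now, p2_now):
    """Pinch-to-zoom: ratio of current to initial fingertip distance.
    > 1.0 means zoom in, < 1.0 means zoom out."""
    return _dist(p1_now, p2_now) / _dist(p1_start, p2_start)

def rotation_angle(p1_start, p2_start, p1_now, p2_now):
    """Rotation in radians of the line between the two fingertips."""
    a0 = math.atan2(p2_start[1] - p1_start[1], p2_start[0] - p1_start[0])
    a1 = math.atan2(p2_now[1] - p1_now[1], p2_now[0] - p1_now[0])
    return a1 - a0
```

For example, if the fingers move apart from 1 unit to 2 units, `zoom_factor` returns 2.0.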
This context-aware feedback is a prerequisite for implementing gamification components, which allow feedback to be integrated smoothly and with minimal disruption while motivating the assistive system's users.
This video illustrates the prototype’s current features:
[fvplayer src="http://www.rel14.korion.de/wp-content/uploads/2014/11/motionEAP_Prototyp-Demo.mp4" width="854" height="480"]