Glad to say that our work on designing multitouch techniques for navigating in 3D virtual environments has been accepted for publication at Interact 2013.
You can find more details on the dedicated web page: MoveAndLook
This year some of the students of the Master IvI worked on indoor 3D reconstruction. Their goal was to improve the 3D reconstruction produced by Kinect+PCL. To that end, they studied how much ICP drifts during reconstruction and how to improve the texture mapping.
Clément Moerman released a video of his work on easy 3D virtual navigation with multi-touch devices.
Some code is also available on his web site: http://www.lifl.fr/~moerman/download/download.html
BarCamp on gestural interaction in Lille (France)
The third Forum on Interaction Tactile et Gestuelle will be held in Lille: fitg.lille.inria.fr/
We are organizing a multi-touch and gestural interaction event in Lille. We will probably show some work on 3D object manipulation using 2D touch devices. More info about the event is available at the following address: http://interaction.lille.inria.fr/TG10/
I made a fresh release of blenderTUIO. You can find it on the usual website: http://forge.lifl.fr/PIRVI/wiki/MTUtils/blenderTUIO. In addition to being synchronized with the latest Blender 2.5 alpha 1 version, it also fixes some bugs related to the TUIO sensors. I have also released new example files on the same page, as the old ones no longer worked: Blender 2.5 is based on Python 3 (with its syntax changes) and also ships yet another new game engine API.
Have Fun, D.
I have just released an alpha version of blenderTUIO based on the upcoming 2.5 major release of Blender. This version is available at http://forge.lifl.fr/PIRVI/wiki/MTUtils/blenderTUIO . An interesting aspect of this version is that, in addition to the multi-touch/TUIO sensor, it provides a new raw OSC sensor. Such an OSC sensor can then be used to control Blender objects using a wide range of input devices (Wii Remote, ART-Track).
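To drive the raw OSC sensor, any device just needs to emit OSC packets over UDP. As a rough illustration (not the blenderTUIO code itself), here is a minimal pure-Python OSC message encoder following the OSC 1.0 binary layout; the example address and argument layout are only an assumption of what such a sensor might receive:

```python
import socket
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode a basic OSC message supporting int32, float32 and string args."""
    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # big-endian int32
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # big-endian float32
        elif isinstance(a, str):
            tags += "s"
            payload += _pad(a.encode())
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return _pad(address.encode()) + _pad(tags.encode()) + payload

# Hypothetical TUIO-style cursor update: session id 1 at (0.25, 0.75)
msg = osc_message("/tuio/2dcur", "set", 1, 0.25, 0.75)
# To actually send it to a listener, e.g. on the default TUIO port 3333:
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, ("127.0.0.1", 3333))
```

The same encoder works for any OSC address, so the non-TUIO devices mentioned above could send their own messages to the raw OSC sensor.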
Happy new year,
Benjamin Walther-Franks, working in the Human Computer Interaction group at the Universität Bremen (http://dm.tzi.de/en/research/hci/), recently published a paper about using a multi-touch table for animation. Even if it is not obvious, the prototype visible in the video was implemented using BlenderTUIO.
You can find good examples of such 3D applications on the BSContact web site: http://www.bitmanagement.de/ or http://www.idees-3com.com/accueil/index/presentation. I thought it could be fun to build multi-touch applications in such a framework, so I made the device sensor.
Below are the protos and a basic example of VRML code that makes use of it. Currently it just prints the received multi-touch events in the BSContact console, but with a bit more work you should be able to interact with a complete 3D world or a 2D GUI.
For GPL/proprietary licensing reasons I cannot distribute the plugin; I am looking into this issue. Don't hesitate to mail me for more information, or to let me know you are interested in having this issue solved.
At IRCICA we have a great room equipped with a large screen, 7.1 sound and hand-tracking devices. You can find more info about the room at: http://www2.lifl.fr/pirvi/pirvi_salleRV.php. The tracking devices, named ART-Track, use their own format to send data to client applications. To make this compatible with a larger codebase, Damien Marchal has implemented a gateway that converts the proprietary format to TUIO/OSC-like messages. It is thus now possible to control applications made with BlenderTUIO or with PyMT in this room. Two TUIO/OSC message types are supported. The first one is the standard TUIO message /tuio/2dcur: the cursors given to the application are computed by intersecting each finger's pointing direction with the screen. The second type of message, /tuio/3dhand, exports the complete hand locations and orientations, as well as all the finger positions, in absolute space.
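The /tuio/2dcur cursor computation described above boils down to a ray–plane intersection. The following sketch illustrates the idea (it is not the gateway's actual code) and assumes the screen is a rectangle described by one corner and two orthogonal edge vectors:

```python
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def ray_screen_intersection(origin, direction, screen_origin, screen_u, screen_v):
    """Intersect a pointing ray with a screen plane.

    The screen is a corner point plus two orthogonal edge vectors
    (u = width direction, v = height direction).  Returns normalized
    (x, y) screen coordinates, or None if the ray is parallel to the
    screen or points away from it.
    """
    n = cross(screen_u, screen_v)          # plane normal
    denom = dot(direction, n)
    if abs(denom) < 1e-9:
        return None                        # ray parallel to the screen
    t = dot(sub(screen_origin, origin), n) / denom
    if t < 0:
        return None                        # screen is behind the finger
    hit = add(origin, scale(direction, t)) # 3D intersection point
    rel = sub(hit, screen_origin)
    x = dot(rel, screen_u) / dot(screen_u, screen_u)
    y = dot(rel, screen_v) / dot(screen_v, screen_v)
    return x, y

# A finger 1 m in front of a 2 m x 1 m screen, pointing straight at its middle:
x, y = ray_screen_intersection(
    origin=(1.0, 0.5, 1.0), direction=(0.0, 0.0, -1.0),
    screen_origin=(0.0, 0.0, 0.0),
    screen_u=(2.0, 0.0, 0.0), screen_v=(0.0, 1.0, 0.0))
# (x, y) is (0.5, 0.5), i.e. the centre of the screen
```

A gateway would run this per tracked finger and emit the resulting (x, y) pairs as /tuio/2dcur cursors, dropping fingers whose rays miss the screen.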
But stay tuned, a video of all this is currently in progress…
I just added a new video in which I'm playing with PyMT. PyMT is a good Python framework for prototyping multi-touch applications. The rendering is done using OpenGL, and the multi-touch interface is based on TUIO. The video is not based on the official version of PyMT: you can see that the window borders change color according to their status: unselected, translating, rotating. I still have to commit this for version 0.3 of PyMT.
I have finished the implementation of an orientation-less gesture system. The initial implementation is based on PyMT. In the attached picture you can see how the gesture detection system is used to extract a "default" rotation, which is then used to initialize the user interface components. Such an approach takes into account where the users are located around the table and ensures that the windows they pop up are always readable.
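One simple way to derive such a "default" rotation, sketched below under the assumption that a user stands at the table edge just behind the point they touch (this is an illustration, not necessarily what the PyMT implementation does), is to orient the widget's up axis toward the table centre:

```python
import math

def default_rotation(touch_x, touch_y, center_x=0.5, center_y=0.5):
    """Rotation (radians) for a widget popped up at a touch point so that
    it reads correctly for a user standing behind that point.

    Coordinates are normalized table coordinates in [0, 1].  The widget's
    "up" axis is pointed from the touch toward the table centre, which
    matches the viewing direction of a user at the nearby table edge.
    """
    dx = center_x - touch_x
    dy = center_y - touch_y
    # atan2 gives the angle of the inward direction; subtract pi/2 so that
    # a touch at the "near" edge (dy > 0, dx = 0) yields zero rotation
    return math.atan2(dy, dx) - math.pi / 2
```

For example, a touch at the near edge leaves the widget unrotated, while a touch at the opposite edge flips it by 180 degrees so a user facing us can read it.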
In the short term I plan to test this feature in our multi-touch Blender implementation: http://forge.lifl.fr/PIRVI/wiki/MTUtils/blenderTUIO. In the longer term I hope this feature will be integrated into the main branch for the 0.3 release of PyMT.
In this blog we will share news and progress about the different virtual reality and multi-touch projects we have at the University of Lille. Stay tuned.