-= PIRVI =-
Recent posts

Intuitive Navigation on Multitouch devices

We are glad to say that our work on designing multitouch techniques for navigating in 3D virtual environments has been accepted for publication at Interact 2013.

You can find more details on the dedicated web page: MoveAndLook

  • Posted: 2013-05-23 11:12 (Updated: 2013-05-23 11:13)
  • Author: marchal
  • Categories: (none)
  • Comments (0)

Students working on Reconstruction using Kinect and PCL

This year some of the students of the Master IVI worked on interior 3D reconstruction. Their goal was to improve the 3D reconstruction produced by Kinect+PCL. For that they studied how much the ICP deviates during reconstruction and how to improve the texture mapping.
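As an illustrative sketch of the kind of drift measurement involved (hypothetical camera positions, not the students' actual code), ICP deviation can be quantified per frame by comparing estimated camera positions against ground truth:

```python
import math

# Hypothetical per-frame camera positions (x, y, z): ground truth vs. ICP estimate.
ground_truth = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.2, 0.0, 0.0)]
estimated    = [(0.0, 0.0, 0.0), (0.11, 0.01, 0.0), (0.23, 0.02, 0.0)]

def drift(gt, est):
    """Euclidean distance between ground-truth and estimated positions, per frame."""
    return [math.dist(g, e) for g, e in zip(gt, est)]

errors = drift(ground_truth, estimated)
# Frame-to-frame registration errors accumulate, so drift typically grows over time.
```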

Navigating in Virtual Environments

Clément Moerman released a video of his work on easy 3D virtual navigation with a multi-touch device.

Some code is also available on his web site: http://www.lifl.fr/~moerman/download/download.html

FITG 2012

BarCamp on gestural interaction in Lille (France)

The third Forum on Interaction Tactile et Gestuelle will be held in Lille: fitg.lille.inria.fr/

Regards, D.

BarCamp on gestural interaction in Lille (France)

We are organizing a multitouch and gestural event in Lille. We will probably show some work on 3D object manipulation using 2D touch devices. More info about the event is available at the following address: http://interaction.lille.inria.fr/TG10/

Regards, D.

Fresh release of blender TUIO

Hi all,

I made a fresh release of blenderTUIO. You can find it on the usual website: http://forge.lifl.fr/PIRVI/wiki/MTUtils/blenderTUIO. In addition to being synchronized with the latest Blender 2.5 alpha 1, it also fixes some bugs related to the TUIO sensors. I have also released new example files on the same page, as the old ones no longer worked: Blender 2.5 is based on Python 3 (which brings syntax changes) and also has yet another new game engine API.
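As an aside (an illustrative snippet, not code from the release itself), the Python 3 switch alone is enough to break most old Blender 2.4x game-engine scripts:

```python
# Two Python 2 -> 3 changes that typically break old scripts.

# 1. print is now a function, no longer a statement:
# print "touch", x, y          # Python 2 -> SyntaxError under Python 3
x, y = 0.25, 0.75
print("touch", x, y)           # Python 3

# 2. / is now true division (old scripts often relied on integer division):
half = 3 / 2                   # 1.5 in Python 3 (was 1 in Python 2)
half_int = 3 // 2              # 1: explicit floor division
```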

Have Fun, D.

New Year alpha release of blenderTUIO

I have just released an alpha version of blenderTUIO based on the upcoming 2.5 major release of Blender. This version is available at http://forge.lifl.fr/PIRVI/wiki/MTUtils/blenderTUIO . An interesting aspect of this version is that, in addition to the multitouch/TUIO sensor, it provides a new raw OSC sensor. Such an OSC sensor can then be used to control Blender objects using a wide range of input devices (Wii Remote, art-track).

Happy new year,

D.

Cool usage of BlenderTUIO

Sven Widderich, a master's student at the "h_da" university in Darmstadt, is working on a sphere-like multi-touch surface. He has also ported BlenderTUIO to Linux and is now using it to develop his applications.

To be honest, I'm fond of his half-sphere multitouch surface displaying the Earth.

Cool usage of BlenderTUIO

Benjamin Walther-Franks, working in the Human-Computer Interaction group at the Universität Bremen (http://dm.tzi.de/en/research/hci/), recently published a paper about using a multi-touch table for animation. Even if it is not obvious, the prototype visible in the video was implemented using BlenderTUIO.

On the road to making multitouch VRML applications with BSContact

Hi all,

Last week I made a Device Sensor plug-in to bring TUIO to BSContact. BSContact is a VRML/X3D browser that lets you build cool 3D applications using only JavaScript and XML. Making a 3D application in X3D/VRML+JavaScript gives it a taste of Dynamic HTML for 3D.

You can find good examples of such 3D applications on the BSContact web site: http://www.bitmanagement.de/ or http://www.idees-3com.com/accueil/index/presentation. I was thinking that it could be fun to build multi-touch applications in such a framework; thus I made the device sensor.

Below are the PROTOs and a basic example of VRML code that makes use of them. Currently it just prints the received multi-touch events to the BSContact console, but with a bit more work you should be able to interact with a complete 3D world or 2D GUI.

#VRML V2.0 utf8
# define the data we want to get from the multitouch device
PROTO MultiTouch
[
    eventOut SFString info
    eventOut MFInt32  touch
]
{}


DEF TouchDownSensor DeviceSensor
{
    device "MULTITOUCH"
    eventType "on_touch_down"
    event DEF on_touch_down MultiTouch {}
}

DEF TouchMoveSensor DeviceSensor
{
    device "MULTITOUCH"
    eventType "on_touch_move"
    event DEF on_touch_move MultiTouch {}
}

DEF TouchUpSensor DeviceSensor
{
    device "MULTITOUCH"
    eventType "on_touch_up"
    event DEF on_touch_up MultiTouch {}
}


DEF EventPrinter Script
{
    eventIn SFString debug
    eventIn SFString info
    eventIn MFInt32  on_touch_move
    eventIn MFInt32  on_touch_up
    eventIn MFInt32  on_touch_down

    url "javascript:
    function debug(a)
    {
        print('MultiTouchSensor[DEBUG]: ' + a);
    }

    function info(a)
    {
        print('MultiTouchSensor[INFO]: ' + a);
    }

    function on_touch_move(a)
    {
        print('MultiTouchSensor[INFO]: on_touch_move: ' + a[0] + ', ' + a[1] + ' ' + a[2]);
    }

    function on_touch_up(a)
    {
        print('MultiTouchSensor[INFO]: on_touch_up: ' + a[0] + ', ' + a[1] + ' ' + a[2]);
    }

    function on_touch_down(a)
    {
        print('MultiTouchSensor[INFO]: on_touch_down: ' + a[0] + ', ' + a[1] + ' ' + a[2]);
    }
    "
}

ROUTE on_touch_down.touch TO EventPrinter.on_touch_down
ROUTE on_touch_move.touch TO EventPrinter.on_touch_move		 
ROUTE on_touch_up.touch	  TO EventPrinter.on_touch_up

ROUTE on_touch_down.info TO EventPrinter.info
ROUTE on_touch_move.info TO EventPrinter.info
ROUTE on_touch_up.info	 TO EventPrinter.info

For GPL/proprietary licensing reasons I cannot distribute the plug-in; I'm digging into this issue. Don't hesitate to mail me for info and to let me know you are interested in having it solved.

Art-track, BlenderTUIO and our Virtual Reality Room

At IRCICA we have a great room equipped with a large screen, 7.1 sound and hand-tracking devices. More info about the room is available at: http://www2.lifl.fr/pirvi/pirvi_salleRV.php. The tracking devices, named art-track, use their own format to send data to client applications. To make this compatible with a larger codebase, Damien Marchal has implemented a gateway that converts the proprietary format into TUIO/OSC-like messages. It is thus now possible to control applications made with BlenderTUIO or with PyMT in this room. Two TUIO/OSC message types are supported. The first one is the standard TUIO message /tuio/2dcur: the cursors given to the application are computed by intersecting the finger pointing direction with the screen. The second type, /tuio/3dhand, exports the complete hand locations and orientations as well as all finger positions in absolute space.
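As a rough sketch of what such a gateway emits (assuming the standard TUIO /tuio/2dcur "set" profile; the encoder below is illustrative, not the actual gateway code), an OSC message can be built by hand with only the standard library:

```python
import struct

def _pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as OSC requires."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode a minimal OSC message supporting int, float and string arguments."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # big-endian float32
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # big-endian int32
        else:
            tags += "s"
            payload += _pad(a.encode())
    return _pad(address.encode()) + _pad(tags.encode()) + payload

# A /tuio/2dcur "set" message: session id, normalized x/y position,
# x/y velocity, motion acceleration.
msg = osc_message("/tuio/2dcur", "set", 1, 0.5, 0.5, 0.0, 0.0, 0.0)
```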

But stay tuned, a video of all this is currently in progress…

Testing PyMT

I have just added a new video in which I'm playing with PyMT. PyMT is a good Python framework for prototyping multitouch applications. The rendering is done with OpenGL and the multitouch interface is based on TUIO. The video is not based on the official version of PyMT: you can see that the window borders change color according to their status (unselected, translating, rotating). I still have to commit this for version 0.3 of PyMT.

Orientationless gesture support in PyMT

Hi all,

I have finished the implementation of an orientationless gesture system. The initial implementation is based on PyMT. In the attached picture you can see how the gesture detection system is used to extract the "default" rotation, which is then used to initialize the orientation of the user interface components. Such an approach takes into account where the users are around the table and ensures that the windows they pop up are always readable.
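The idea can be sketched as follows (an illustrative toy version, not the PyMT implementation): estimate the "default" rotation from the direction of the gesture stroke, then use that angle to orient the pop-up toward the user who drew it.

```python
import math

def default_rotation(stroke):
    """Estimate a gesture's orientation from its first and last (x, y) points.

    Returns an angle in degrees that a pop-up window can be rotated by,
    so that its 'up' direction faces the user who drew the stroke.
    """
    (x0, y0), (x1, y1) = stroke[0], stroke[-1]
    return math.degrees(math.atan2(y1 - y0, x1 - x0))

# A user on the south side of the table swipes away from themselves (upward):
angle = default_rotation([(0.0, 0.0), (0.0, 0.5), (0.0, 1.0)])
# angle == 90.0: the window pops up already oriented toward that user.
```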

In the short term I plan to test this feature in our multitouch Blender implementation: http://forge.lifl.fr/PIRVI/wiki/MTUtils/blenderTUIO. In the longer term I hope this feature will be integrated into the main branch for the 0.3 release of PyMT.

Opening this blog

In this blog we will post news and progress reports about the different virtual reality and multi-touch projects we have at the University of Lille. Stay tuned.