Are you interested in multi-touch gestures for multimedia tabletop applications, tangible or surface computing, and interaction design? Take a look at this video of Grafiti:
Alessandro De Nardi, a computer science student at the University of Pisa, Italy, worked on Grafiti during Google's Summer of Code and is still involved with the project. Alessandro, a member of the NUI group, is also part of the Music Technology Group at Barcelona's UPF University. He was supervised by Martin Kaltenbrunner and Sergi Jorda of the reactable project.
According to De Nardi's Google Code website:
"Grafiti is a C# framework built on top of the Tuio client that manages multi-touch interactions in table-top interfaces. The possible use of tangible objects is particularly contemplated. It is designed to support the use of third party modules for (specialized) gesture recognition algorithms. However a set of modules for the recognition of some basic gestures is included in this project.
The development is in the alpha state, going to beta soon.
The goals I've been aiming at are: generality, versatility, speed of execution, extensibility, ease of programming (integration) of external modules (simple APIs and protocols) and effective interface design with regards to HCI aspects." -Alessandro De Nardi
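To make the idea of "third party modules for gesture recognition" concrete, here is a minimal sketch of how such a pluggable framework might be organized: touch events flow through a dispatcher that feeds every registered recognizer module. All of the names below (TouchEvent, TapRecognizer, GestureDispatcher) are invented for illustration; this is not Grafiti's actual C# API.

```python
# Hypothetical sketch of a pluggable gesture-recognition framework.
# None of these names come from Grafiti itself.
from dataclasses import dataclass

@dataclass
class TouchEvent:
    kind: str      # "down", "move", or "up"
    x: float       # normalized surface coordinates in [0, 1]
    y: float
    t: float       # timestamp in seconds

class GestureRecognizer:
    """Base class a third-party recognizer module would extend."""
    def feed(self, event):
        raise NotImplementedError

class TapRecognizer(GestureRecognizer):
    """Recognizes a short down-up sequence with little movement."""
    def __init__(self, max_duration=0.3, max_travel=0.02):
        self.max_duration = max_duration
        self.max_travel = max_travel
        self._down = None
        self.taps = []

    def feed(self, event):
        if event.kind == "down":
            self._down = event
        elif event.kind == "up" and self._down is not None:
            dt = event.t - self._down.t
            travel = abs(event.x - self._down.x) + abs(event.y - self._down.y)
            if dt <= self.max_duration and travel <= self.max_travel:
                self.taps.append((event.x, event.y))
            self._down = None

class GestureDispatcher:
    """Fans incoming touch events out to all registered modules."""
    def __init__(self):
        self.recognizers = []

    def register(self, recognizer):
        self.recognizers.append(recognizer)

    def feed(self, event):
        for r in self.recognizers:
            r.feed(event)

dispatcher = GestureDispatcher()
tap = TapRecognizer()
dispatcher.register(tap)
dispatcher.feed(TouchEvent("down", 0.5, 0.5, 0.00))
dispatcher.feed(TouchEvent("up",   0.5, 0.5, 0.10))
print(len(tap.taps))  # 1
```

The point of the dispatcher/module split is exactly the extensibility goal De Nardi mentions: specialized recognizers can be added without touching the event plumbing.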
Demos in binary form are available for download for Mac OS X and Windows on the Google Grafiti site. There is also a Grafiti site on SourceForge. Apparently the alpha version will work on all 32-bit MS Windows, BSD, and POSIX (Linux) platforms.
(Note: This post was updated on 12/14/08 and includes corrections.)
Focused on interactive multimedia and emerging technologies to enhance the lives of people as they collaborate, create, learn, work, and play.
Showing posts sorted by relevance for query reactable.
Oct 26, 2008
Grafiti - a multi-touch, table-top, surface computing application, from a member of the NUI group.
Posted by Lynn Marentette
Jun 18, 2008
Oculusia, an Interactive Multi-Touch Table Game for Four People
The following video is a nice demonstration of a multi-touch table game, Oculusia. The game was created by four students from Aalborg University, Copenhagen, for Medialogy 2008. The students, Jens Fursund, Thomas Frohlich, Thomas Miksa, and Tobias Brogaard, also built the table.
More Information:
Students Explore Multi-touch with Oculusia
Related:
The "Workbench" used in the Medialogy Lab at Aalborg University, Copenhagen:

"The Workbench displays 3D stereo images and the tracking system allows for interaction with virtual objects on the display surface. The tracking system uses infrared light to illuminate the display surface and the camera is only sensitive to infrared light. Basic image processing techniques are then applied to acquire position of the hand."
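The basic image-processing step the quote describes — illuminate the surface with infrared light, keep only the bright IR-lit pixels, and take their center as the hand position — can be sketched in a few lines. This is a hedged, dependency-free illustration of the general technique, not the Medialogy Lab's actual code.

```python
# Threshold an "infrared frame" and take the centroid of the bright
# pixels as the hand position. Illustrative only.
def hand_position(frame, threshold=200):
    """frame: 2D list of grayscale values 0-255. Returns the (row, col)
    centroid of above-threshold pixels, or None if nothing is bright."""
    bright = [(r, c)
              for r, row in enumerate(frame)
              for c, v in enumerate(row)
              if v >= threshold]
    if not bright:
        return None
    n = len(bright)
    return (sum(r for r, _ in bright) / n, sum(c for _, c in bright) / n)

# Tiny 4x4 "IR frame" with a bright 2x2 blob in the lower-right corner.
frame = [
    [0,   0,   0,   0],
    [0,   0,   0,   0],
    [0,   0, 250, 255],
    [0,   0, 240, 245],
]
print(hand_position(frame))  # (2.5, 2.5)
```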
A Multi-Touch Digital Painting Application (pdf) (Søren Støvlbæk Larsen, Gavin Francis Murray & Jesper Højgaard Pedersen, Institute of Electronic Systems, Computer Vision and Graphics, Aalborg University, 2007)
The ReacTable: A Tangible Musical Instrument (pdf) (Martin Kaltenbrunner, Music Technology Group, Universitat Pompeu Fabra, 2006)
Posted by Lynn Marentette
May 19, 2008
More Multi-Touch from members of the NUI group!
It is always exciting to see what members of the NUI group are doing!
Here is a new video of a multi-touch creation by some of the members of the NUI group. Although this is a proof-of-concept example, it is fun to see how it is played out, using the little iPhone-like touch-pad widgets as a navigation tool for the large screen.
Read the "Multi-touch Goodness" article on Gizmodo, an interview with Christian Moore about this demo and his open-source Lux framework. (Christian is a colleague of Harry van der Veen; both are members of the NUI group.)
Here is an excerpt from the interview:
"JD: Why Flash?
CM: Because it's fast to prototype in. However, the software is broken into several segments. One C++ application that tracks hands that talks to Flash... WPF... or another C++ app... and basically everything you can imagine. You can enable multitouch in any environment, like Cocoa."
High-resolution screen shots and additional information can be found on the nuiman website.
For my tech-minded readers:
I'm pretty sure that the C++ application that tracks hands and fingers in the video demo uses Touchlib, a library for creating multi-touch interactions. Touchlib can work with TUIO, a protocol for tabletop tangible user interfaces. Applications such as Flash and Processing support TUIO. For more information, read "TUIO: A Protocol for Table-Top Tangible User Interfaces".
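TUIO rides on top of OSC (Open Sound Control), so what actually goes over the wire for a cursor update is a binary OSC message. As a rough illustration, here is a minimal from-scratch encoder/decoder for a TUIO 1.1 cursor "set" message (/tuio/2Dcur set s x y X Y m: session id, normalized position, velocity, and motion acceleration). This is a sketch for readers curious about the format, not the reference TUIO client library.

```python
# Minimal OSC encode/decode for one TUIO 1.1 /tuio/2Dcur "set" message.
# OSC strings are null-terminated and padded to 4-byte boundaries;
# numbers are big-endian.
import struct

def _pad(b):
    """Null-terminate and pad a byte string to a 4-byte boundary."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def encode_2dcur_set(session_id, x, y, vx=0.0, vy=0.0, accel=0.0):
    msg = _pad(b"/tuio/2Dcur")          # OSC address pattern
    msg += _pad(b",sifffff")            # type tags: string, int32, 5 floats
    msg += _pad(b"set")                 # TUIO command
    msg += struct.pack(">i", session_id)
    msg += struct.pack(">fffff", x, y, vx, vy, accel)
    return msg

def decode_2dcur_set(data):
    def read_str(buf, off):
        end = buf.index(b"\x00", off)
        s = buf[off:end].decode()
        off = end + 1
        off += -off % 4                 # skip padding
        return s, off
    addr, off = read_str(data, 0)
    tags, off = read_str(data, off)
    cmd, off = read_str(data, off)
    sid, = struct.unpack_from(">i", data, off); off += 4
    floats = struct.unpack_from(">fffff", data, off)
    return addr, cmd, sid, floats

packet = encode_2dcur_set(7, 0.25, 0.75)
print(decode_2dcur_set(packet))
```

A real TUIO bundle also carries "alive" and "fseq" messages so clients can detect cursors that disappeared, but the "set" message above is the core of a position update.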
(Information from the NUI group website mentions that OpenCV, the Open Source Computer Vision Library, found on SourceForge, can support blob detection and tracking.)
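OpenCV provides ready-made blob detection, but the underlying idea is simple enough to sketch without any dependencies: after thresholding the camera frame, each "blob" (fingertip) is a connected group of foreground pixels. Here is a hedged, pure-Python illustration using a 4-connected flood fill — the concept, not OpenCV's implementation.

```python
# Find blobs (connected groups of foreground pixels) in a binary image
# with an iterative 4-connected flood fill. Illustrative only.
def find_blobs(image):
    """image: 2D list of 0/1 values. Returns a list of blobs, each a
    set of (row, col) pixel coordinates."""
    rows, cols = len(image), len(image[0])
    seen = set()
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] and (r, c) not in seen:
                blob, stack = set(), [(r, c)]
                while stack:
                    pr, pc = stack.pop()
                    if (pr, pc) in seen or not (0 <= pr < rows and 0 <= pc < cols):
                        continue
                    if not image[pr][pc]:
                        continue
                    seen.add((pr, pc))
                    blob.add((pr, pc))
                    stack += [(pr - 1, pc), (pr + 1, pc),
                              (pr, pc - 1), (pr, pc + 1)]
                blobs.append(blob)
    return blobs

# Two separate fingertip "blobs" on a small binary frame.
frame = [
    [1, 1, 0, 0],
    [1, 0, 0, 1],
    [0, 0, 1, 1],
]
print(len(find_blobs(frame)))  # 2
```

A tracker then matches each blob's centroid across frames to follow individual fingers over time.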
The people behind TUIO are from the Reactable project, of the Music Technology Group at Pompeu Fabra University in Barcelona:

Here is my plug for the NUI group, once again!
"The NUI group, or Natural User Interface Group, is an interactive media group researching and creating open source machine sensing techniques to benefit artistic and educational applications.
We offer a collaborative environment for developers that are interested in learning and sharing new HCI (Human Computer Interaction) methods and concepts. This may include topics such as: voice/handwriting/gesture recognition, touch computing, computer vision, and information visualization."
FYI
I came across Harry van der Veen of the NUI group in early 2007 when I was working on touch-screen projects for my HCI and Ubicomp classes, and I'm inspired by all of the creativity I've seen coming from this group.
If you'd like to see more demos, visit the Natural User Interface website, a commercial outgrowth of Harry and his colleagues' work, where you can view a reel that includes a few touch-screen games. I love the vision statement on this site:
"Technology should enable us to interact with computers, in the same way we interact with the real world; in a way which is natural to us, namely through gestures, expressions, movements, and manipulations. Our vision is to change the way people interact with computers."
Posted by Lynn Marentette