
Sep 15, 2009

Applications for touch, multi-touch, and gesture interaction: What's Marching to Market?

Every week or so, I hear about a new multi-touch enabled laptop. Today, I came across a review on Gizmodo of the Lenovo ThinkPad T400s, a 14.1-inch capacitive touchscreen laptop that can handle four touches at once:

Lenovo ThinkPad T400s with Multitouch Review: Finger Flicking Fun


The Multitouch ThinkPad in Action

Lenovo ThinkPad T400s with Multitouch from Gizmodo on Vimeo.

Screenshot: http://cache.gawker.com/assets/images/gizmodo/2009/09/SimpleTab.jpg
"The screen is pure beauty. Like a capacitive touchscreen phone, light taps on the display are responsive and you don't have to think about positioning your finger in a certain way to open apps or rearrange windows." - Joanna Stern, Gizmodo, 9/15/09


OTHER EXAMPLES
 
THE FUJITSU LIFEBOOK T5010

Add the Fujitsu LifeBook T5010 to the Windows 7 Multitouch Laptop List
Gizmodo, Joanna Stern, 9/15/09
http://cache.gawker.com/assets/images/gizmodo/2009/09/LifeBook_T5010_multitouch.jpg


DELL LATITUDE XT2
 Dell Latitude XT2 Tablet PC


http://i.dell.com/images/global/products/latit/latit_highlights/laptop-latitude-xt2-overview3.jpg

TOUCH DEMO - HP TOUCHSMART TX2 & WINDOWS 7


AIR HOCKEY FOR WINDOWS 7: IDENTITY MINE


FINGERTAPPS





Want to create something multi-touch?

It looks like there will be a growing need for Windows 7 application developers. Here are some code examples using Windows Presentation Foundation and the Windows 7 multi-touch API:

Windows 7 Multitouch Application Development, Part I (The Code Project: Kunal Chowdhury)
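Frameworks differ in their APIs, but the geometry behind a common multi-touch gesture is simple. As a hedged illustration (this is not WPF or Windows 7 API code; the function names are my own), here is the math a pinch-to-zoom handler typically computes from two tracked touch points:

```python
# Illustrative sketch only: the arithmetic behind two-finger
# pinch-to-zoom, independent of any particular multi-touch framework.
import math

def distance(p1, p2):
    """Euclidean distance between two touch points (x, y)."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def pinch_scale(start_a, start_b, now_a, now_b):
    """Scale factor implied by two touches moving from their starting
    positions to their current positions."""
    d0 = distance(start_a, start_b)
    d1 = distance(now_a, now_b)
    # Guard against a degenerate start (both fingers at one point).
    return d1 / d0 if d0 else 1.0
```

A gesture recognizer would apply this scale factor to whatever object lies under the midpoint of the two touches; frameworks such as WPF report comparable per-frame scale deltas through their manipulation events.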


Other Developer Resources

Touchlib Multi-touch Development Kit  (NUI GROUP)

MSDN Windows Touch: Developer Resources

nTrig's Multi-Touch Download for Windows 7

NUITech's Snowflake Suite for NextWindow

TouchKit Multi-touch Development Kit


GestureWorks Flash Multitouch SDK

3M Multi-touch Developer Kit


RELATED
Touch-less Interaction from lm3labs

Create the future you want. with ubiq'window from Nicolas Loeillot on Vimeo.

More about gesture interaction and multi-touch can be found on this blog - do a search!

Aug 8, 2009

More about Project Natal: Ricochet - Great Gaming for Fitness, Johnny Chung Lee's Contribution


(Credit: CNET News)
Ina Fried, in a recent CNET Beyond Binary post, reviewed her experience playing Ricochet, a 3D game developed by Microsoft for Natal, the company's new gesture-recognition, controller-less Xbox gaming system. Above is a screenshot from Fried's article, Exclusive: Getting up close and personal with Natal:

Here is the video:


Ina Fried had a chance to spend some time in Redmond, Washington to explore the games in development at Microsoft, and hang out with the people responsible for Project Natal.

In her Beyond Binary article, Fried notes that the Ricochet game provides quite a workout, and this has had a positive effect on the Natal team:


"Since I started working on this project, I've lost almost like 10 pounds," said Kudo Tsunoda, general manager of Microsoft Game Studios and the creative director for Project Natal. "We're going to have the most in-shape development team you've ever seen."


Fans of Johnny Chung Lee will be happy to know, if they haven't heard already, that his work at Microsoft contributed to this game in some way!

Who is Johnny Chung Lee? Read my post, "I wish I could be Johnny Chung Lee for a Day!".


RELATED

Speaking of Natal, it should be out next year (Ina Fried, CNET)

Gates: Natal to bring gesture recognition to Windows too



May 26, 2009

GestureTek's "Cube": A compact Interactive Gesture-based Display System

GestureTek's "Cube"




From the GestureTek website:

"Introducing The Cube - a compact, turnkey, 'plug and play' interactive display unit that brings the power of gesture control to a variety of display spaces. Project the interactive 80” diagonal display onto almost any floor, wall, table or counter for branding, advertising, entertainment and product promotion. The Cube will engage customers, turn heads and drive business results."

Brochure
(pdf)

May 10, 2009

Future of Interaction: Microsoft Lab's Vision 2019 Montage; Future Healthcare Montage

The following video is from Microsoft Office Vision 2019. If you visit the YouTube source, you'll find an interesting discussion about this five-minute+ clip.



Video Presentation from MIX09: Untold Stories of Touch, Gesture, & NUI
Joseph Fletcher, Design Manager, Microsoft Surface


"Learn about new patterns, behaviors, and design approaches for touch and gesture interfaces from a practitioner's point of view. Learn early lessons from applied knowledge of touch applications, devices, and design methods."




LukeW's notes from Joseph Fletcher's presentation
LukeW's post: Design Principles at Microsoft

LukeW is the Senior Director of Product Ideation & Design at Yahoo! Inc.

Apr 25, 2009

How soon will we see interactive information visualization for multi-touch & gesture systems?

The field of information visualization is growing. Until recently, most visualizations were created for use on a single PC or larger screen, allowing for interaction by only one user at a time. I have a feeling that this will be changing in the very near future.

Why? Interactive duo- and multi-touch interfaces are becoming more common, and now come in screens of all sizes, from the iPhone to the Surface to CNN's multi-touch "Magic Wall". People of all ages play interactive games on the Wii, often on large flat-panel displays. In my opinion, the time is right for those developing applications for the InfoViz world to think about harnessing the power of multi-touch.


Below is a picture of the front page of the Visual Complexity website. If you go to the site, you can select a visualization, and then explore it more in detail, as each picture links to a web-page that provides background information about the visualization project, the artist or team behind the project, and links to the project's website.

I took a look at a variety of the examples posted on the Visual Complexity website, and think many would be enhanced by a transformation to a multi-touch, gesture, bi-manual, or duo/multi-user system. I'm interested in learning what others think about this. If you are working on a collaborative information visualization project, feel free to add a comment and post a link.

Here is a nice quote from the website:

"Functional visualizations are more than innovative statistical analyses and computational algorithms. They must make sense to the user and require a visual language system that uses colour, shape, line, hierarchy and composition to communicate clearly and appropriately, much like the alphabetic and character-based languages used worldwide between humans."

Matt Woolman
Digital Information Graphics




Update: I did a search for "multi-touch" on the Visual Complexity website and found a couple of interactive applications:

Reactable (I've posted about this system a few times!)
(Reactable website)

Prototouch

(Wirmachenbunt Website)

Nov 13, 2008

RENCI at Duke University: Multi-Touch Collaborative Wall and Table utilizing TouchLib; More about UNC-C's Viz lab...

RENCI is a multi-disciplinary collaboration between several universities in North Carolina, with centers located at the Europa Center, Duke University, N.C. State, UNC Chapel Hill, East Carolina University, UNC-Asheville, UNC-Charlotte, and the Health Sciences Library at UNC-Chapel Hill. Many of the centers focus on visualization and collaborative technologies, and have been involved in multi-touch "surface" computing.

The pictures below are from the RENCI center at Duke University:

http://vis.renci.org/multitouch/wp-content/themes/daleri-dark-10/img/front.jpg

Duke Multi-Touch Collaborative Wall

The multi-touch wall is 13 x 5 feet, utilizes six high-definition projectors for a combined resolution of 5760 x 2160, and supports multiple users. According to information on the RENCI website, the design is scalable and applicable to non-flat surfaces. The wall system runs on Windows and Linux.
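As a sanity check on those numbers (the 3-wide by 2-high grid of 1080p projectors is my assumption, inferred from "six high-definition projectors" and the quoted combined resolution), here is the tiling arithmetic:

```python
# Hypothetical sketch of the wall's projector layout: six 1920x1080
# projectors in a 3 x 2 grid yield the quoted 5760 x 2160 resolution.

def projector_tiles(cols, rows, tile_w, tile_h):
    """Pixel rectangle (x, y, width, height) that each projector
    covers inside the combined display, listed row by row."""
    return [(c * tile_w, r * tile_h, tile_w, tile_h)
            for r in range(rows) for c in range(cols)]

tiles = projector_tiles(3, 2, 1920, 1080)   # six tiles
combined = (3 * 1920, 2 * 1080)             # (5760, 2160)
```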

Duke Multitouch Wall. (Photo credit: Josh Coyle)


The Wall is positioned at the end of the primary collaboration space. (Photo credit: Josh Coyle)


DI, or Direct Illumination, is used for touch detection in both the wall and the table. A separate instance of Touchlib runs for each of the eight cameras used to detect touch, with each camera handled separately for image processing and blob tracking. A gesture engine then interprets the information about touches on the screen as gesture events.
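Touchlib's blob tracking is C++ and camera-specific, but the core idea (threshold the infrared camera image, then group adjacent bright pixels into "blobs" that become touch candidates) can be sketched in a few lines. This is illustrative only, not RENCI's or Touchlib's code:

```python
# Minimal blob detection: flood-fill connected bright pixels in a
# thresholded grayscale frame. Each blob is a touch candidate.

def find_blobs(frame, threshold=128):
    """frame: 2D list of brightness values. Returns a list of blobs,
    each a list of (row, col) pixel coordinates."""
    rows, cols = len(frame), len(frame[0])
    seen = set()
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and (r, c) not in seen:
                # Flood-fill to collect this blob's pixels.
                stack, blob = [(r, c)], []
                seen.add((r, c))
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                blobs.append(blob)
    return blobs
```

In a real system, each blob's centroid would be tracked from frame to frame to produce the finger-down/moved/released events the gesture engine consumes.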

Direct Illumination (DI)

Graphics from the RENCI Vis Group Multi-Touch Blog

The Duke Multi-Touch Wall System

Here is a cool picture of the "Multi-touch Calibration Device", which uses a built-in TouchLib utility.

Calibrating using the utility built into TouchLib.
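What a calibration step like this computes, conceptually, is a transform from camera coordinates to screen coordinates. A minimal sketch (my own, not TouchLib's implementation) fits an affine transform from three touched calibration points using Cramer's rule:

```python
# Hypothetical calibration sketch: given three camera-space points and
# the screen-space points they should map to, solve for the affine
# transform screen = A * camera + b and return it as a function.

def solve_affine(cam_pts, scr_pts):
    """cam_pts, scr_pts: three (x, y) pairs each. Returns a function
    mapping raw camera coordinates to screen coordinates."""
    (x0, y0), (x1, y1), (x2, y2) = cam_pts
    # Determinant of the 3x3 system [[x, y, 1], ...].
    det = x0 * (y1 - y2) - y0 * (x1 - x2) + (x1 * y2 - x2 * y1)

    def solve_row(v0, v1, v2):
        # Cramer's rule for a*x + b*y + c = v at the three points.
        a = (v0 * (y1 - y2) - y0 * (v1 - v2) + (v1 * y2 - v2 * y1)) / det
        b = (x0 * (v1 - v2) - v0 * (x1 - x2) + (x1 * v2 - x2 * v1)) / det
        c = (x0 * (y1 * v2 - y2 * v1) - y0 * (x1 * v2 - x2 * v1)
             + v0 * (x1 * y2 - x2 * y1)) / det
        return a, b, c

    ax, bx, cx = solve_row(*(p[0] for p in scr_pts))
    ay, by, cy = solve_row(*(p[1] for p in scr_pts))
    return lambda x, y: (ax * x + bx * y + cx, ay * x + by * y + cy)
```

Real calibration utilities typically use more than three points (often a grid) and a least-squares fit to absorb camera distortion, but the principle is the same.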

Additional information can be found on the RENCI Vis Group Multi-Touch Blog.

FYI

Touchlib is a multi-touch development kit that can be found on the NUI-Group website.

"Touchlib is a library for creating multi-touch interaction surfaces. It handles tracking blobs of infrared light, and sends your programs these multi-touch events, such as 'finger down', 'finger moved', and 'finger released'. It includes a configuration app and a few demos to get you started, and will interface with most types of webcams and video capture devices. It currently works only under Windows but efforts are being made to port it to other platforms."
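The event model that description outlines can be mirrored by a tiny dispatcher: client code registers callbacks for the three finger events, and the tracking layer fires them as blobs appear, move, and disappear. This is a hypothetical Python sketch, not Touchlib's actual C++ API:

```python
# Hypothetical event dispatcher in the style Touchlib's description
# suggests; class and method names are my own.

class TouchSurface:
    EVENTS = ("finger_down", "finger_moved", "finger_released")

    def __init__(self):
        self._listeners = {event: [] for event in self.EVENTS}

    def on(self, event, callback):
        """Register a callback for one of the three finger events."""
        if event not in self._listeners:
            raise ValueError(f"unknown event: {event}")
        self._listeners[event].append(callback)

    def fire(self, event, finger_id, x, y):
        """Deliver an event (e.g. from the blob tracker) to listeners."""
        for callback in self._listeners[event]:
            callback(finger_id, x, y)
```

An application might register a handler per event and keep one cursor per finger ID; the same register-and-dispatch pattern underlies most multi-touch toolkits.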

If you are interested in creating your own multi-touch table, the NUI-Group website and forums are a great place to start.

Related:

If you follow my blog, you probably know that I've taken several graduate courses at UNC-Charlotte. Some of my professors and a classmate or two have been involved in some exciting visualization research over the past year. (If you are serious about multi-touch and other visually-based applications, it is worth taking some time to familiarize yourself with visualization and interaction research.)

News from the UNC-Charlotte Vis Center:

At the University of North Carolina at Charlotte, RENCI is a collaboration between the UNC Charlotte Urban Institute, the Center for Applied Geographic Information Science, and the Charlotte Visualization Center.

11/06/2008
Robert Kosara's group wins two awards at IEEE VisWeek
Caroline Ziemkiewicz and Robert Kosara won an Honorable Mention (the second-highest award) at the IEEE InfoVis Conference for their paper, "The Shaping of Information by Visual Metaphors". Also, Alex Godwin, Kosara's student, won Best Poster for his submission, "Visual Data Mining of Unevenly-Spaced Event Sequences".

The Vis Center is pretty fascinating, as you can see by the group of visitors at an open house.

If you are just as fascinated by this stuff as the guys in the picture, here are links to some recent papers by UNC-Charlotte faculty affiliated with the Vis Center:

The Shaping of Information by Visual Metaphors (Caroline Ziemkiewicz and Robert Kosara)

Evaluating the Relationship Between User Interaction and Financial Visual Analysis (Dong Hyun Jeong, Wenwen Dou, Felesia Stukes, William Ribarsky, Heather Richter Lipford, Remco Chang)

Visual Analytics for Complex Concepts Using a Human Cognition Model (Tera Marie Green, William Ribarsky, and Brian Fisher)

Jun 6, 2008

Emerging Interactive Technologies, Emerging Interactions, and Emerging Integrated Form Factors

Every so often I come across a blog that has been around for a while, but is a new discovery to me. If you are interested in interactive multimedia and emerging technology, and you haven't heard of Richard Banks, you will probably be pleased with his vast array of interesting posts!

After browsing around Richard Banks's blogs, I found that he works for Microsoft Research. Like me, he maintains more than one blog, and he blogs about similar topics.

Here is a view into his world:

rb.trends: tracking future technology experience
rb.work: technology, design and research
rb.log: family, photography, architecture etc.

One of Richard's recent posts brought me back to my musings about interactive surfaces:

Via Richard Banks and Gizmodo: A Touch Screen Poker Table from Hard Gaming

If it is possible to program an application to handle the metrics that support poker, then it is possible to program a multi-user application for work or education, and have this application integrated into any type display, according to need.

The form factor above would be useful for team meetings or collaborative project planning, especially during group decision-making or policy planning activities that involve data analysis or information visualization.

Could this concept be modified to fit the form of a kidney table for use in education? Think about it. We could combine the best features of surface computing with the best features of multi-user video games.
http://www.qualitychurchfurniture.com/images/activitytablecolors/lrg_48KID72LOset.jpg
Rationale? Video-game applications can handle the metrics of multi-user interaction, which would be ideal for tracking student progress and interaction. Touch interaction would give young children an easy way to access the application, or all of the information that will be available on the multi-touch web of the future, without first learning to type.

(Touch and gesture interaction could help us move towards the implementation of the Internet of Things)

Could this concept be transformed for use by two people collaborating on an electronic drafting table?


http://www.dwvartengine.com/media_downloads/dw_draftingTable_lo.jpg

If you are a NUI Group member, think about this!


Note: I came up with the phrase "multi-touch web of the future" as I was typing this post. I guess I didn't invent it - I looked it up and found Andrew Foote's post, "The Multi-Touch Web".

I'm in.


Somewhat Related.......


More for the tech-curious:

Interactive data visualization in Second Life

Idle together "Technology blog oriented towards good design and impressive web applications"

Apple Developing Full Multi-Touch Macs - Apple Insider


More for the tech-savvy:

Multi-touch gestures in the Factor UI (Apple)- by Pestov. Includes code.

NSResponder Modifications: Swipe, Rotate, and Magnify, by Elliott Harris, a member of the NUI Group, I think.

Web Development Guidelines for iPhone
(lots of good info, with video clips and code)

Web 3D Consortium

Unity 3D kit for the

Google's Android


Multi-touch Web Consortium (in my dreams)

Gesture Web Consortium (in my dreams)

Apr 3, 2011

INTERACTIVE TABLETOPS AND SURFACES 2011 CALL FOR PAPERS AND NOTES (6th Annual ACM ITS 2011 - Nov. 13-16)

Thanks to Johannes Schöning for sharing this information!

CALL FOR PAPERS AND NOTES
6th Annual ACM Conference on Interactive Tabletops and Surfaces 2011
ITS 2011
November 13-16, 2011
Portopia Hotel, Kobe, Japan
The Interactive Tabletops and Surfaces 2011 Conference (ITS) is a premier venue for presenting research in the design and use of new and emerging tabletop and interactive surface technologies. As a new community, we embrace the growth of the discipline in a wide variety of areas, including innovations in ITS hardware, software, design, and projects expanding our understanding of design considerations of ITS technologies and of their applications.

Building on its success in previous years, ITS again features Papers and Notes presentations, as well as tutorial, poster, demonstration, and doctoral symposium tracks. ITS 2011 will also include workshops.

ITS 2011 will bring together top researchers and practitioners who are interested in both the technical and human aspects of ITS technology. On behalf of the conference organizing committee, we invite you to begin planning your submissions and participation for this year's conference!

CALL FOR PARTICIPATION: PAPERS AND NOTES
The use of interactive surfaces is an exciting and emerging research area. Display technologies, such as projectors, LCD and OLED flat panels, and even flexible display substrates, coupled with input sensors capable of enabling direct interaction, make it reasonable to envision a not-so-distant future in which many of the common surfaces in our environment will function as digital interactive displays. ITS brings together researchers and practitioners from a variety of backgrounds and interests, such as camera and projector based systems, new display technologies, multi-touch sensing, user interface technologies, augmented reality, computer vision, multimodal interaction, novel input and sensing technologies, computer supported cooperative work (CSCW), and information visualization.

The intimate size of this single-track symposium provides an ideal venue for leading researchers and practitioners to exchange research results and experiences. We encourage submissions on (but not limited to) the following topic areas as they relate to interactive tabletops and surfaces:

* Gesture-based interfaces
* Multi-modal interfaces
* Tangible interfaces
* Novel interaction techniques
* Data handling/exchange on large interactive surfaces
* Data presentation on large interactive surfaces
* Software engineering methods
* Computer supported collaborative work
* Middleware and network support
* Virtual reality and augmented reality
* Social protocols
* Information visualizations
* Hardware, including sensing and input technologies with novel capabilities
* Human-centered design & methodologies
* Applications

CONFERENCE COMMITTEE
Conference Co-Chairs
Jun Rekimoto, The University of Tokyo / SonyCSL, Japan
Hideki Koike, University of Electro-Communications, Japan
Kentaro Fukuchi, Meiji University, Japan

Program Co-Chairs
Yoshifumi Kitamura, Tohoku University, Japan
Daniel Wigdor, University of Toronto, Canada

SUBMISSIONS
Papers/Notes: We invite paper submissions of two kinds: Papers (10 pages) and Notes (4 pages). Papers must present original, highly innovative, prospective and forward-looking research, possibly in one or more of the themes given above. Notes must also report novel and complete research, but where the scope and scale of the contribution is more focused and succinct than in papers. Submissions must be made as a single PDF file in the ACM format through the submission system. A template for submissions can be found on the ITS 2011 website.

All accepted submissions will be presented at ITS 2011, appear in the ITS digital proceedings, and be archived in the ACM Digital Library.

IMPORTANT DATES
Paper/Note Submissions: June 30, 2011
Paper/Note Author Notifications: September 2, 2011
Paper/Note Camera-Ready Deadline: October 21, 2011

The calls for posters, demos, tutorials, workshops, and the doctoral symposium will be published soon; please check our website for further information.

RELATED
ITS 2010 Awards

Best Paper: Proxemic Interaction: Designing for a Proximity and Orientation-Aware Environment, by Till Ballendat, Nicolai Marquardt, Saul Greenberg
Best Note: MudPad: Tactile Feedback and Haptic Texture Overlay for Touch Surfaces, by Yvonne Jansen, Thorsten Karrer, Jan Borchers
Best Poster: A Multi-Touch Alignment Guide for Interactive Displays, by Mathias Frisch, Ricardo Langner, Sebastian Kleinau, Raimund Dachselt
Best Demo: MudPad - A Tactile Memory Game, by Yvonne Jansen, Thorsten Karrer, Jan Borchers

Who's Who
ITS 2010 Conference General Chairs
Johannes Schöning, DFKI GmbH
Antonio Krüger, DFKI GmbH 

ITS 2010 Organization Committee

Daniel Wigdor, one of the ITS 2011 Program Co-chairs, co-authored the following book with Dennis Wixon. The book has not yet been released.
 
Brave NUI World: Designing User Interfaces for Touch and Gesture



Note: This post includes many links; please let me know of anything I should correct.

Aug 15, 2009

The Interdisciplinary World of Dance and Interactive Technology

After I finished my post "What happens when engineers and musicians get together? They get Calvin Harris "Ready for the Weekend" with the Humanthesizer and Bare Ink", I thought I'd catch up on the world of dance and technology.

I took a look at an on-line community, Dance-Tech.NET which focuses on the "interdisciplinary explorations on the performance of motion". I was happy to find that two people I know are members of this vibrant group of people:

Celine Latulipe

Dr. Celine Latulipe is a Human-Computer Interaction researcher at the University of North Carolina-Charlotte. She is the lead in the Dance.Draw project, a collaboration between the Software and Information Systems department, the Department of Dance, and a digital artist.

"A Mischief of Mus Musculus"

As Celine puts it, "You are more than your eyes and one hand. Why should you have to be less than you are when confronted with a digital device? Exquisite interaction is a collection of research projects...that aim to enrich your expressive creativity in the digital realm by allowing you and your collaborators to use more of your body in that digital interaction"


Visualization code was designed by Mike Wirth, using Processing, to create the art displayed on the Art from Dance page of the Dance.Draw project.

Doug Fox

Doug Fox created and maintains the Great Dance and Kinetic Interface blogs. When he was 42, he decided to study dance - modern, ballet, and jazz. He is interested in the intersection of dance and technology, and more recently became involved in the study of animation related to dance and movement.

The following is from Doug's "About" page - the videos and links are worth exploring:

"As a starting point, I'd like to encourage readers to visit the
Movement Is at the Heart of Scientific and Technological Change background page. Here you will find 16 videos (plus links to more videos) that show in very concrete terms how new computer interfaces and digital devices being used in a range of fields are, in essence, body-centric and movement-centric..."

Here is a sample of Doug's topics:

Dance Theater Workshop's Twitter Community Choreography
Dance Vlogging, Will this Video Genre Increase in Popularity?
Shoot Dance Videos with the new iPhone 3GS
Prodigy Warrior's Dance Combines Stop-Motion Animation and Puppetry
Choreographing Gesture Controls for Interactive Devices (be sure to read the comments and link to Arizona State University: School of Arts, Media and Engineering)

Doug Fox writes about topics that are truly interdisciplinary:




Dance Your Ph.D. Contest - A Wonderful Merging of Dance and Science
This contest was funded by the American Association for the Advancement of Science. The objective of the contest? "Using no words or images, interpret your Ph.D. thesis in dance form".

Contest winners were paired with choreographers, who created a new dance work based on a peer-reviewed article in a scientific journal. (The Science Dance Match-Up Challenge)

I just had to post the videos here - you can read detailed descriptions about each video on YouTube:

The role of Vitamin D in beta-cell function


Cerebral activation patterns induced by inflection of regular and irregular verbs with positron emission tomography: A comparison between single subject and group analysis


A Molecular Dance in the Blood, Observed


Popular Choice: Physics Tango "Single Molecule Measurements of Protelomerase TelK-DNA Complexes."


For a look at the 2008 "Dance Your Ph.D." videos, and the videos of the 2009 contest entries, visit the 2009 AAAS/Science Dance Contest web page.

The four videos generated by the professional choreographers can be accessed on Vimeo.

Links to the scientific articles, the bios of the choreographers and scientists, and videos of the choreographers' renditions of the scientists' work can be found on the Science/AAAS website:

The Gonzo Scientist: The Science Dance Match-Up Challenge
John Bohannon, Science, 4/17/09

SOMEWHAT RELATED
Usually I reserve this section for links and information from external sources, but this time, I thought I'd share a few of my opinions that are somewhat related to this topic.

Why do I think interdisciplinary pursuits are important?

In my opinion, to move forward, the arts and other disciplines need to embrace interdisciplinary ways of thinking. Much is mixing and converging as I type these words, and there is less pressure on young people to pick one little corner of a field of study and make it their life's obsession. I have had an "interdisciplinary" approach to life for as long as I can remember. I attribute this to my parents, who nurtured me as a musical, dancing, artistic child to pursue my talents at a young age and, when I faced choosing a college major, to go for a double major.

My interdisciplinary nature has fueled my journey into the world of technology, and my early background in the arts probably explains why I'm excited about interactive multimedia, extending into the realms of immersive games, multi-touch and gesture interaction, and technology-supported interaction that takes place in larger public spaces.

Note:

As many of my readers know, I work as a school psychologist in my "day job". I DO miss the time when I was working part-time and taking graduate classes at UNC-Charlotte, but when the economy went downhill, it was necessary for me to return to work full-time.

The upcoming academic year will be busy! With the recent budget cuts to school districts in the state of North Carolina, I will have another school added to my schedule. I'm excited that it is a high school for technology and the arts, and that the school has a strong dance program!

I've posted quite a bit recently, since I have plenty to share. Soon I will only have time to post about 3 times a week.

Aug 18, 2009

CRISTAL: One Giant Remote Control Multi-Touch Coffee Table; ACM Interactive Tabletops and Surfaces 2009 in Banff, Canada


Via Wired Gadget Lab Priya Ganapati 8/14/09

What is CRISTAL? It stands for Control of Remotely Interfaced Systems using Touch-based Actions in Living Spaces, and it is the name of a project at the Media Interaction Lab at the Upper Austria University of Applied Sciences, Digital Media.

Watch the videos:




"CRISTAL simplifies the control of our digital devices in and around the living room. The system provides a novel experience for controlling devices in a home environment by enabling users to directly interact with those devices on a live video image of their living room using multi-touch gestures on a digital tabletop." -mediainteractionlab, YouTube
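Stripped to its essence, the interaction the quote describes is hit-testing: each controllable device occupies a region of the live video image on the tabletop, and a touch is routed to whichever device's region contains it. A toy sketch follows; the device names and coordinates are invented for illustration and are not from the CRISTAL system:

```python
# Hypothetical mapping from screen-space rectangles on the live video
# image to the devices they represent: (x_min, y_min, x_max, y_max).
DEVICE_REGIONS = {
    "tv":   (100, 50, 300, 200),
    "lamp": (400, 80, 460, 160),
}

def device_at(x, y, regions=DEVICE_REGIONS):
    """Return the name of the device whose on-screen region contains
    the touch point, or None if the touch misses every device."""
    for name, (x0, y0, x1, y1) in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None
```

Once the touched device is known, subsequent gestures in that region (a drag on the lamp, say) can be translated into commands for it.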

The CRISTAL project is a collaboration among several people across a few universities, according to the Media Interaction Lab website:

Christian Rendl, Media Interaction Lab
Florian Perteneder, Media Interaction Lab
Thomas Seifried, Media Interaction Lab
Michael Haller, Media Interaction Lab
Daisuke Sakamoto, University of Tokyo
Jun Kato, University of Tokyo
Masahiko Inami, Keio University
Stacey D. Scott, University of Waterloo
CRISTAL received the Best Emerging Technology Award at the 36th International Conference and Exhibition on Computer Graphics and Interactive Techniques (SIGGRAPH 2009).

Below is a sample of the Interactive Media Lab's publications:

M. Haller, P. Brandl, C. Richter, T. Seifried, J. Leitner, and A. Gokcezade, 2009.
"Interactive Displays and Next-Generation Interfaces." Springer, 2009. [bibtex]

C. Köffel, W. Hochleitner, J. Leitner, M. Haller, A. Geven, and M. Tscheligi, 2009.
"Using Heuristics to Evaluate the Overall User Experience of Video Games and Advanced Interaction Games." Springer, 2009. [in press] [bibtex]

M. Haller, C. Forlines, C. Koeffel, J. Leitner, and C. Shen, 2009.
"Tabletop Games: Platforms, Experimental Games and Design Recommendations." Springer, 2009. [in press] [bibtex]

J. Leitner, C. Köffel, and M. Haller, 2009.
"Bridging the gap between real and virtual objects for tabletop games," International Journal of Virtual Reality, vol. 7, pp. 33-40, 2009. [in press] [bibtex]

J. Leitner, M. Haller, K. Yun, W. Woo, M. Sugimoto, M. Inami, A. D. Cheok, and H. D. Been-Lirn, 2009.
"Physical Interfaces For Tabletop Games," Computer Entertainment, vol. XX, p. XX, 2009. [bibtex]

M. Haller and M. Billinghurst, 2008.
"Interactive Tables: Requirements, Design Recommendations, and Implementation."

D. Leithinger and M. Haller, 2007.
"Improving Menu Interaction for Cluttered Tabletop Setups with User-Drawn Path Menus," Horizontal Interactive Human-Computer Systems, 2007. TABLETOP 07. Second Annual IEEE International Workshop on, pp. 121-128, 2007. [bibtex]

D. Regenbrecht, M. Haller, J. Hauber, and M. Billinghurst, 2006.
"Carpeno: interfacing remote collaborative virtual environments with table-top interaction," Virtual Reality, vol. 10, iss. 2, pp. 95-107, 2006. [bibtex]

One of the people involved in the CRISTAL project is Stacey D. Scott, Ph.D., an assistant professor of systems design engineering at the University of Waterloo and director of the Collaborative Systems Laboratory. The lab focuses on fundamental interfaces and interaction techniques for shared large-screen displays, such as multi-display environments and social-supporting digital tabletop interfaces, as well as collaborative and decision-support interfaces for complex, time-critical team environments.

Dr. Scott is also one of the program co-chairs of the upcoming ACM Interactive Tabletops and Surfaces 2009 Conference, which will be held November 23-25 in Banff, Canada.

Mark your calendars!

The following topics, as they relate to interactive tabletops and surfaces, will be presented:

  • Applications
  • Gesture-based interfaces
  • Multi-modal interfaces
  • Tangible interfaces
  • Novel interaction techniques
  • Data handling/exchange on large interactive surfaces
  • Data presentation on large interactive surfaces
  • User-interface technology
  • Computer supported collaborative systems
  • Middleware and network support
  • Augmented reality
  • Social protocols
  • Information visualizations
  • Sensing and input technologies
  • Human-centered design & methodologies
Here is the "who's who" of interactive tabletops and surfaces - the Interactive Tabletops and Surfaces program committee:

Patrick Baudisch, Hasso Plattner Institute Potsdam, Germany
Francois Berard, University of Grenoble, France
Peter Brandl, Media Interaction Lab, Upper Austria University of Applied Sciences, Austria
Andreas Butz, University of Munich, Germany
Francois Coldefy, Orange Labs, France
Morten Fjeld, Chalmers University of Technology, Sweden
Kentaro Fukuchi, University of Electro-Communications, Japan
Tovi Grossman, Autodesk Research, Canada
Mark Hancock, University of Calgary, Canada
Petra Isenberg, University of Calgary, Canada
Yuichi Itoh, Osaka University, Japan
Karrie Karahalios, University of Illinois, USA
Hiro Kato, Osaka University, Japan
Hideki Koike, University of Electro-Communications, Japan
Frank Maurer, University of Calgary, Canada
Max Mühlhäuser, TU Darmstadt, Germany
Christian Muller-Tomfelde, CSIRO-ICT Centre, Australia
Miguel Nacenta, University of Saskatchewan, Canada
Patrick Olivier, Newcastle University, UK
Jun Rekimoto, Sony / University of Tokyo, Japan
Meredith Ringel Morris, Microsoft Research, USA
Daisuke Sakamoto, Tokyo University, Japan
Yoichi Sato, University of Tokyo, Japan
Chia Shen, Harvard University, USA
Masahiro Takatsuka, University of Sydney, Australia
Lucia Terrenghi, Vodafone Group R&D, Germany
Bruce Thomas, University of Southern Australia, Australia
Melanie Tory, University of Victoria, Canada
Edward Tse, SMART Technologies, Canada
Fred Vernier, South-Paris University, France
Andy Wilson, Microsoft Research, USA
Massimo Zancanaro, Bruno Kessler Foundation (formerly ITC), Italy



If you are a university student researching interactive tabletops, multi-touch surfaces, and/or gesture interaction, I hope this post helps!

Oct 13, 2013

UPDATE: NUITEQ's Snowflake ABC has been updated, improved, and retitled: Snowflake MultiTeach®

August 24, 2016 Update

This post was originally published on October 13, 2013, and focused on Snowflake ABC, a software suite of entertaining educational apps developed by NUITEQ®.

Fast forward to 2016, and you'll see many great improvements made by the NUITEQ® team. Along with the changes came a new name: Snowflake MultiTeach®.


Snowflake MultiTeach® was designed for multi-touch, multi-user displays and tables, but can be used on single-touch displays or traditional computers/laptops with a mouse. It works well for "front-of-class" presentations and has many features that support collaborative group work.



So what is new?

The Lessons app was added to the suite of applications, along with Nodes, a concept-mapping and presentation application that supports project- and problem-based learning, STEM, and STEAM.

The Lessons app is a powerful addition to Snowflake MultiTeach®. Within this app, teachers (and students) can build a variety of interactive, engaging lessons, learning games, and quizzes with little effort.

Using the Zones feature, Snowflake MultiTeach® can run several lessons and apps at the same time, which makes it ideal for students working in pairs or small groups.
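For the developers in the audience, the basic idea behind a split-screen "zones" layout is easy to sketch. The snippet below is a generic, hypothetical illustration (it is not NUITEQ's actual code or API): it divides a display into equal-width vertical zones and routes each touch point to the zone that contains it, so each group's touches drive its own app.

```python
# Hypothetical sketch of a split-screen "zones" layout: divide a display
# into equal-width vertical zones and route touches to the containing zone.
# This is a generic illustration, not NUITEQ's implementation.

from dataclasses import dataclass


@dataclass(frozen=True)
class Zone:
    name: str
    x: int      # left edge, in pixels
    width: int  # zone width, in pixels


def make_zones(screen_width: int, names: list) -> list:
    """Divide the screen into equal-width vertical zones, one per name."""
    width = screen_width // len(names)
    return [Zone(name, i * width, width) for i, name in enumerate(names)]


def route_touch(zones: list, x: int) -> str:
    """Return the name of the zone containing a touch at horizontal position x."""
    for zone in zones:
        if zone.x <= x < zone.x + zone.width:
            return zone.name
    return zones[-1].name  # clamp touches at the far right edge


zones = make_zones(1920, ["Lessons", "Nodes"])
print(route_touch(zones, 400))    # prints "Lessons" (left half)
print(route_touch(zones, 1500))   # prints "Nodes" (right half)
```

A real multi-user tabletop would track each touch's full lifecycle (down/move/up) and both axes, but the core dispatch step is the same: hit-test each contact point against the active zone rectangles.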

To complement the Lessons app, the online MultiTeach® Lessons Community was developed to give teachers a way to share lessons they've created. Lessons posted to the community can be accessed from within the Lessons home screen, searched by subject and level, and imported directly.

Lessons created with Snowflake MultiTeach® can also be exported directly from the Lessons home screen to the MultiTeach® Lessons Community to share with others. Access requires Snowflake MultiTeach® and a MultiTeach® Lessons Community account.

To support teachers as they make the transition from single-touch whiteboards to multi-user displays, the MultiTeach® Pinterest site was created.  There you can find curated and regularly updated information about how to use Snowflake MultiTeach®, 21st century teaching and learning, free digital resources for the classroom, and much more.

For information about how teachers can integrate Snowflake MultiTeach® into their classrooms, take some time to read the following blog posts. Share with your peers!

Infuse Arts into STEM Projects Using Collaborative Touch Screen Technology

Collaborative STEM Education and Careers Exploration

How to MultiTeach®: Introduction to Snowflake MultiTeach® Nodes App

Differentiated instruction through Snowflake MultiTeach®


Like what you see?

If you are interested in replacing your current single-touch interactive whiteboard with a multi-touch display, contact a representative from NUITEQ® via the email link on the website; you will be connected to someone in your region who can assist you.

Here is the old post for reference.  

FYI: Most of the ABC apps shown below can be found within the current version of Snowflake MultiTeach®.

NUITEQ is a multi-touch & gesture tech company based in Sweden focused on collaborative natural user interfaces and interaction. Instead of using a traditional mouse and keyboard, NUITEQ's products support the effective use of fingers and hands to interact and explore digital content. I've been following this company since it was launched by Harry van der Veen in 2007 and have been happy to share NUITEQ's updates on this blog over the years.

NUITEQ's products can be found in public spaces, such as museums, retail centers, and transportation hubs; at special events, such as trade shows and exhibitions; in collaborative work settings, such as boardrooms and design studios; and in educational settings.

One of the latest innovations from NUITEQ is Snowflake ABC, multi-touch software for children and teens. It runs on a range of multi-touch hardware surfaces, including those from Lenovo, 3M Touch Systems, Zytronic, FlatFrog, PQ Labs, Touch International, Lumio, and others.



Multitouch software Snowflake ABC for kids (K-12) from multitouch software on Vimeo.

In my opinion, Snowflake ABC would be great for use in homes, libraries, and classrooms. Although the video shows children using the software on large displays and tables, it also works well on smaller 27-inch touch screens, preferably those that support multi-touch interaction. A free trial version is available. For parents considering a home computer purchase, the Lenovo IdeaCentre A520 and other multi-touch "all-in-ones" come loaded with a variety of Snowflake apps.

Below is a screenshot of various Snowflake ABC educational apps, from the Snowflake Suite website.


Credit: NUITEQ

RELATED
Multi-touch software Snowflake ABC 1.3 for kids released (K-13) - Now with 18 languages out of the box!  NUITEQ Blog, 9/26/13
Snowflake ABC Brochure (pdf)
NUITEQ Signs OEM Agreement with Lenovo for its Multitouch Software Platform Snowflake Suite PRWEB, 8/31/12
Review: Lenovo IdeaCentre A520, Jason Palaszewski, Lenovo Blog, 8/22/13

Snowflake Interactive Display at Skellefteå Airport (Snowflake Suite)