
Nov 13, 2008

RENCI at Duke University: Multi-Touch Collaborative Wall and Table utilizing TouchLib; More about UNC-C's Viz lab...

RENCI is a multi-disciplinary collaboration between several universities in North Carolina, with centers located at the Europa Center, Duke University, N.C. State, UNC Chapel Hill, East Carolina University, UNC-Asheville, UNC-Charlotte, and the Health Sciences Library at UNC-Chapel Hill. Many of the centers focus on visualization and collaborative technologies, and have been involved in multi-touch "surface" computing.

The pictures below are from the RENCI center at Duke University:

http://vis.renci.org/multitouch/wp-content/themes/daleri-dark-10/img/front.jpg

Duke Multi-Touch Collaborative Wall

The multi-touch wall is 13 x 5 feet and uses six high-definition projectors, resulting in a combined resolution of 5760 × 2160, and supports multiple users. According to the RENCI website, the design is scalable and applicable to non-flat surfaces. The wall system runs on both Windows and Linux.
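A quick arithmetic check on those numbers: six 1080p projectors in a 3 × 2 grid account for the stated combined resolution. The grid layout is my assumption; RENCI does not spell out the arrangement here.

```python
# Sketch: how six HD projectors could tile to the wall's stated
# combined resolution of 5760 x 2160.

PROJECTOR_W, PROJECTOR_H = 1920, 1080   # one HD projector
GRID_COLS, GRID_ROWS = 3, 2             # assumed tiling of the six projectors

combined_w = PROJECTOR_W * GRID_COLS
combined_h = PROJECTOR_H * GRID_ROWS
print(combined_w, combined_h)  # 5760 2160
```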

Duke Multitouch Wall. (Photo credit: Josh Coyle)


The Wall is positioned at the end of the primary collaboration space. (Photo credit: Josh Coyle)


DI, or Direct Illumination, is used for touch detection in both the wall and the table. A separate instance of Touchlib runs for each of the eight cameras used to detect touch, so image processing and blob tracking are handled per camera. A gesture engine then interprets the information about touches on the screen as gesture events.
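As a rough illustration of how per-camera blob data could be merged into one coordinate space before gesture interpretation: each camera reports blobs in its own image space, and an offset maps them onto the shared wall. This is a sketch of the general idea only; the 4 × 2 camera layout and normalized coordinates are my assumptions, not details from RENCI.

```python
# Illustrative sketch (not RENCI's actual code): blobs from each camera
# instance are offset into one shared, normalized wall coordinate space.
from dataclasses import dataclass

@dataclass
class Blob:
    camera_id: int
    x: float  # normalized 0..1 within that camera's view
    y: float

# Assumed layout: 8 cameras tiled 4 wide x 2 high across the wall
CAM_COLS, CAM_ROWS = 4, 2

def to_wall_coords(blob: Blob) -> tuple[float, float]:
    """Map a per-camera blob into normalized wall coordinates."""
    col = blob.camera_id % CAM_COLS
    row = blob.camera_id // CAM_COLS
    wall_x = (col + blob.x) / CAM_COLS
    wall_y = (row + blob.y) / CAM_ROWS
    return wall_x, wall_y

# A blob at the center of camera 5 (second row, second column)
print(to_wall_coords(Blob(camera_id=5, x=0.5, y=0.5)))  # (0.375, 0.75)
```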

Direct Illumination (DI)

Graphics from the RENCI Vis Group Multi-Touch Blog

The Duke Multi-Touch Wall System

Here is a cool picture of the "Multi-touch Calibration Device", which works with a built-in TouchLib utility.

Calibrating using the utility built into TouchLib.

Additional information can be found on the RENCI Vis Group Multi-Touch Blog.

FYI

Touchlib is a multi-touch development kit that can be found on the NUI-Group website.

"Touchlib is a library for creating multi-touch interaction surfaces. It handles tracking blobs of infrared light, and sends your programs these multi-touch events, such as 'finger down', 'finger moved', and 'finger released'. It includes a configuration app and a few demos to get you started, and will interface with most types of webcams and video capture devices. It currently works only under Windows but efforts are being made to port it to other platforms."
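The event model in that quote can be sketched in a few lines. TouchLib itself is a C++ library, so the Python below is only an illustration of how an application might track the 'finger down', 'finger moved', and 'finger released' events it describes; all names here are invented for illustration.

```python
# Minimal sketch of a touch-event listener in the style described above.
class TouchListener:
    def __init__(self):
        self.active = {}  # finger id -> (x, y)

    def finger_down(self, fid, x, y):
        self.active[fid] = (x, y)

    def finger_moved(self, fid, x, y):
        old_x, old_y = self.active[fid]
        dx, dy = x - old_x, y - old_y  # drag delta, e.g. to pan an image
        self.active[fid] = (x, y)
        return dx, dy

    def finger_released(self, fid):
        self.active.pop(fid, None)

listener = TouchListener()
listener.finger_down(1, 100, 200)
print(listener.finger_moved(1, 105, 200))  # (5, 0)
listener.finger_released(1)
```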

If you are interested in creating your own multi-touch table, the NUI-Group website and forums are a great place to start.

Related:

If you follow my blog, you probably know that I've taken several graduate courses at UNC-Charlotte. Some of my professors and a classmate or two have been involved in some exciting visualization research over the past year. (If you are serious about multi-touch and other visually-based applications, it is worth taking some time to familiarize yourself with visualization and interaction research.)

News from the UNC-Charlotte Vis Center:

At the University of North Carolina at Charlotte, RENCI is a collaboration between the UNC Charlotte Urban Institute, the Center for Applied Geographic Information Science, and the Charlotte Visualization Center.

11/06/2008
Robert Kosara’s group wins two awards at IEEE VisWeek
Caroline Ziemkiewicz and Robert Kosara won Honorable Mention (the second-highest award) at the IEEE InfoVis Conference for their paper, “The Shaping of Information by Visual Metaphors”. Also, Alex Godwin, Kosara’s student, won Best Poster for his submission, “Visual Data Mining of Unevenly-Spaced Event Sequences”.

The Vis Center is pretty fascinating, as you can see by the group of visitors at an open house.

If you are just as fascinated by this stuff as the guys in the picture, here are links to some recent papers by UNC-Charlotte faculty affiliated with the Vis Center:

The Shaping of Information by Visual Metaphors (Caroline Ziemkiewicz and Robert Kosara)

Evaluating the Relationship Between User Interaction and Financial Visual Analysis (Dong Hyun Jeong, Wenwen Dou, Felesia Stukes, William Ribarsky, Heather Richter Lipford, Remco Chang)

Visual Analytics for Complex Concepts Using a Human Cognition Model (Tera Marie Green, William Ribarsky, and Brian Fisher)

Jun 6, 2008

Emerging Interactive Technologies, Emerging Interactions, and Emerging Integrated Form Factors

Every so often I come across a blog that has been around for a while, but is a new discovery to me. If you are interested in interactive multimedia and emerging technology, and you haven't heard of Richard Banks, you will probably be pleased with his vast array of interesting posts!

After browsing around Richard Banks's blogs, I found that he works for Microsoft Research. Like me, he maintains more than one blog, and he covers similar topics.

Here is a view into his world:

rb.trends: tracking future technology experience
rb.work: technology, design and research
rb.log: family, photography, architecture etc.

One of Richard's recent posts brought me back to my musings about interactive surfaces:

Via Richard Banks and Gizmodo: A Touch Screen Poker Table from Hard Gaming

If it is possible to program an application to handle the metrics that support poker, then it is possible to program a multi-user application for work or education and integrate it into any type of display, according to need.

The form factor above would be useful for team meetings or collaborative project planning, especially during group decision-making or policy planning activities that involve data analysis or information visualization.

Could this concept be modified to fit the form of a kidney table for use in education? Think about it. We could combine the best features of surface computing with the best features of multi-user video games.
http://www.qualitychurchfurniture.com/images/activitytablecolors/lrg_48KID72LOset.jpg
Rationale? Video-game applications can handle the metrics of multi-user interaction, which would be ideal for tracking student progress and interaction. Touch interaction would give young children an easy way to access the application, or all of the information that will be available on the multi-touch web of the future, without first learning to type.

(Touch and gesture interaction could help us move towards the implementation of the Internet of Things)

Could this concept be transformed for use by two people collaborating on an electronic drafting table?


http://www.dwvartengine.com/media_downloads/dw_draftingTable_lo.jpg

If you are a NUI Group member, think about this!


Note: I came up with the phrase "multi-touch web of the future" as I was typing this post. I guess I didn't invent it - I looked it up and found Andrew Foote's post, "The Multi-Touch Web".

I'm in.


Somewhat Related.......


More for the tech-curious:

Interactive data visualization in Second Life

Idle together "Technology blog oriented towards good design and impressive web applications"

Apple Developing Full Multi-Touch Macs - Apple Insider


More for the tech-savvy:

Multi-touch gestures in the Factor UI (Apple)- by Pestov. Includes code.

NSResponder Modifications: Swipe, Rotate, and Magnify, by Elliott Harris, a member of the NUI Group, I think.

Web Development Guidelines for iPhone (lots of good info, with video clips and code)

Web 3D Consortium

Unity 3D kit for Google's Android


Multi-touch Web Consortium (in my dreams)

Gesture Web Consortium (in my dreams)

Apr 3, 2011

INTERACTIVE TABLETOPS AND SURFACES 2011 CALL FOR PAPERS AND NOTES (6th Annual ACM ITS 2011 - Nov. 13-16)

Thanks to Johannes Schöning for sharing this information!

CALL FOR PAPERS AND NOTES
6th Annual ACM Conference on Interactive Tabletops and Surfaces 2011
ITS 2011
November 13-16, 2011
Portopia Hotel, Kobe, Japan
The Interactive Tabletops and Surfaces 2011 Conference (ITS) is a premiere venue for presenting research in the design and use of new and emerging tabletop and interactive surface technologies. As a new community, we embrace the growth of the discipline in a wide variety of areas, including innovations in ITS hardware, software, design, and projects expanding our understanding of design considerations of ITS technologies and of their applications.

Building on its success in previous years, ITS again features Papers and Notes presentations, as well as tutorials, posters, demonstration tracks, and a doctoral symposium. ITS 2011 will also include workshops.

ITS 2011 will bring together top researchers and practitioners who are interested in both the technical and human aspects of ITS technology. On behalf of the conference organizing committee, we invite you to begin planning your submissions and participation for this year's conference!

CALL FOR PARTICIPATION: PAPERS AND NOTES
The use of interactive surfaces is an exciting and emerging research area. Display technologies, such as projectors, LCD and OLED flat panels, and even flexible display substrates, coupled with input sensors capable of enabling direct interaction, make it reasonable to envision a not-so-distant future in which many of the common surfaces in our environment will function as digital interactive displays. ITS brings together researchers and practitioners from a variety of backgrounds and interests, such as camera and projector based systems, new display technologies, multi-touch sensing, user interface technologies, augmented reality, computer vision, multimodal interaction, novel input and sensing technologies, computer supported cooperative work (CSCW), and information visualization.

The intimate size of this single-track symposium provides an ideal venue for leading researchers and practitioners to exchange research results and experiences. We encourage submissions on (but not limited to) the following topic areas as they relate to interactive tabletops and surfaces:

* Gesture-based interfaces
* Multi-modal interfaces
* Tangible interfaces
* Novel interaction techniques
* Data handling/exchange on large interactive surfaces
* Data presentation on large interactive surfaces
* Software engineering methods
* Computer supported collaborative work
* Middleware and network support
* Virtual reality and augmented reality
* Social protocols
* Information visualizations
* Hardware, including sensing and input technologies with novel capabilities
* Human-centered design & methodologies
* Applications

CONFERENCE COMMITTEE
Conference Co-Chairs
Jun Rekimoto, The University of Tokyo / SonyCSL, Japan
Hideki Koike, University of Electro-Communications, Japan
Kentaro Fukuchi, Meiji University, Japan

Program Co-Chairs
Yoshifumi Kitamura, Tohoku University, Japan
Daniel Wigdor, University of Toronto, Canada

SUBMISSIONS
Papers/Notes: We invite paper submissions of two kinds: Papers (10 pages) and Notes (4 pages). Papers must present original, highly innovative, prospective and forward-looking research, possibly in one or more of the themes given above. Notes must also report novel and complete research, but where the scope and scale of the contribution is more focused and succinct than for Papers. Submissions must be made as a single PDF file in the ACM format through the submission system. A template for submissions can be found on the ITS website.

All accepted submissions will be presented at ITS 2011, appear in the ITS digital proceedings, and be archived in the ACM Digital Library.

IMPORTANT DATES
Paper/Note Submissions: June 30, 2011
Paper/Note Author Notifications: September 2, 2011
Paper/Note Camera-Ready Deadline: October 21, 2011

The calls for the posters, demos, tutorials, workshops and the doctoral symposium will be published soon, please check our website for further information.

RELATED
ITS 2010 Awards

Best Paper: Proxemic Interaction: Designing for a Proximity and Orientation-Aware Environment by Till Ballendat, Nicolai Marquardt, Saul Greenberg
Best Note: MudPad: Tactile Feedback and Haptic Texture Overlay for Touch Surfaces by Yvonne Jansen, Thorsten Karrer, Jan Borchers
Best Poster: A Multi-Touch Alignment Guide for Interactive Displays by Mathias Frisch, Ricardo Langner, Sebastian Kleinau, Raimund Dachselt
Best Demo: MudPad - A Tactile Memory Game by Yvonne Jansen, Thorsten Karrer, Jan Borchers




Who's Who
ITS 2010 Conference General Chairs
Johannes Schöning, DFKI GmbH
Antonio Krüger, DFKI GmbH 

ITS 2010 Organization Committee

Daniel Wigdor, one of the ITS 2011 Program Co-chairs, co-authored the following book with Dennis Wixon. The book has not yet been released.
 
Brave NUI World: Designing User Interfaces for Touch and Gesture



Note: This post includes many links; please let me know of anything I should correct.

Aug 15, 2009

The Interdisciplinary World of Dance and Interactive Technology:

After I finished my post "What happens when engineers and musicians get together? They get Calvin Harris "Ready for the Weekend" with the Humanthesizer and Bare Ink", I thought I'd catch up on the world of dance and technology.

I took a look at an on-line community, Dance-Tech.NET which focuses on the "interdisciplinary explorations on the performance of motion". I was happy to find that two people I know are members of this vibrant group of people:

Celine Latulipe

Dr. Celine Latulipe is a Human-Computer Interaction researcher at the University of North Carolina-Charlotte. She is the lead in the Dance.Draw project, a collaboration between the Software and Information Systems department, the Department of Dance, and a digital artist.

“A Mischief of Mus Musculus”

As Celine puts it, "You are more than your eyes and one hand. Why should you have to be less than you are when confronted with a digital device? Exquisite interaction is a collection of research projects...that aim to enrich your expressive creativity in the digital realm by allowing you and your collaborators to use more of your body in that digital interaction"


Visualization code was designed by Mike Wirth, using Processing, to create the art displayed on the Art from Dance page of the Dance.Draw project.

Doug Fox

Doug Fox created and maintains the Great Dance and Kinetic Interface blogs. When he was 42, he decided to study dance - modern, ballet, and jazz. He is interested in the intersection of dance and technology, and more recently has become involved in the study of animation related to dance and movement.

The following is from Doug's "About" page - the videos and links are worth exploring:

"As a starting point, I'd like to encourage readers to visit the
Movement Is at the Heart of Scientific and Technological Change background page. Here you will find 16 videos (plus links to more videos) that show in very concrete terms how new computer interfaces and digital devices being used in a range of fields are, in essence, body-centric and movement-centric..."

Here is a sample of Doug's topics:

Dance Theater Workshop's Twitter Community Choreography
Dance Vlogging, Will this Video Genre Increase in Popularity?
Shoot Dance Videos with the new iPhone 3GS
Prodigy Warrior's Dance Combines Stop-Motion Animation and Puppetry
Choreographing Gesture Controls for Interactive Devices (be sure to read the comments and link to Arizona State University: School of Arts, Media and Engineering)

Doug Fox writes about topics that are truly interdisciplinary:




Dance Your Ph.D. Contest - A Wonderful Merging of Dance and Science
This contest was funded by the American Association for the Advancement of Science. The objective of the contest? "Using no words or images, interpret your Ph.D. thesis in dance form".

Contest winners were paired with choreographers, who created a new dance work based on a peer-reviewed article in a scientific journal. (The Science Dance Match-Up Challenge)

I just had to post the videos here - you can read detailed descriptions about each video on YouTube:

The role of Vitamin D in beta-cell function


Cerebral activation patterns induced by inflection of regular and irregular verbs with positron emission tomography: A comparison between single subject and group analysis


A Molecular Dance in the Blood, Observed


Popular Choice: Physics Tango "Single Molecule Measurements of Protelomerase TelK-DNA Complexes."


For a look at the 2008 "Dance Your Ph.D." videos, and the videos of the 2009 contest entries, visit the 2009 AAAS/Science Dance Contest web page.

The four videos generated by the professional choreographers can be accessed on Vimeo.

Links to the scientific articles, the bios of the choreographers and scientists, and videos of the choreographers' renditions of the scientists' work can be found on the Science/AAAS website:

The Gonzo Scientist: The Science Dance Match-Up Challenge
John Bohannon, Science, 4/17/09

SOMEWHAT RELATED
Usually I reserve this section for links and information from external sources, but this time, I thought I'd share a few of my opinions that are somewhat related to this topic.

Why do I think interdisciplinary pursuits are important?

In my opinion, to move forward, the arts and other disciplines need to embrace interdisciplinary thinking. Much is mixing and converging as I type these words. There is less pressure on young people now to pick one little corner of a field of study and make it their life's obsession. I have had an "interdisciplinary" approach to life for as long as I can remember. I attribute this to my parents, who nurtured me as a musical, dancing, artistic child, encouraged me to pursue my talents at a young age, and, when I faced choosing a college major, urged me to go for a double major.

My interdisciplinary nature has fueled my journey into the world of technology, and my early background in the arts probably explains why I'm excited about interactive multimedia, extending into the realms of immersive games, multi-touch and gesture interaction, and technology-supported interaction in larger public spaces.

Note:

As many of my readers know, I work as a school psychologist in my "day job". I DO miss the time when I was working part-time and taking graduate classes at UNC-Charlotte, but when the economy went downhill, it was necessary for me to return to work full-time.

The upcoming academic year will be busy! With the recent budget cuts to school districts in the state of North Carolina, I will have another school added to my schedule. I'm excited that it is a high school for technology and the arts, and that the school has a strong dance program!

I've posted quite a bit recently, since I have plenty to share. Soon I will only have time to post about 3 times a week.

Aug 18, 2009

CRISTAL: One Giant Remote Control Multi-Touch Coffee Table; ACM Interactive Tabletops and Surfaces 2009 in Banff, Canada


Via Wired Gadget Lab Priya Ganapati 8/14/09

What is CRISTAL? The name stands for Control of Remotely Interfaced Systems using Touch-based Actions in Living Spaces, and is the acronym for a project at the Media Interaction Lab at the Upper Austria University of Applied Sciences, Digital Media.

Watch the videos:




"CRISTAL simplifies the control of our digital devices in and around the living room. The system provides a novel experience for controlling devices in a home environment by enabling users to directly interact with those devices on a live video image of their living room using multi-touch gestures on a digital tabletop." -mediainteractionlab, YouTube
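The core idea in that description can be sketched as simple hit-testing: devices are registered as regions on the live video image of the room, and a touch is routed to whichever device's region contains it. This is a guess at the concept, not the Media Interaction Lab's implementation; the device names and coordinates below are invented for illustration.

```python
# Illustrative sketch: route a touch on the video image to a device
# whose registered on-screen region contains it.
devices = {
    "lamp": (0.10, 0.30, 0.20, 0.55),  # (x_min, x_max, y_min, y_max)
    "tv":   (0.40, 0.70, 0.20, 0.50),
}

def device_at(x: float, y: float):
    """Return the device whose on-screen region contains the touch."""
    for name, (x0, x1, y0, y1) in devices.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

print(device_at(0.55, 0.35))  # tv
```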

The CRISTAL project is a collaboration between several people, spanning across a few universities, according to the Media Interaction Lab website:
Christian Rendl
Media Interaction Lab
Florian Perteneder
Media Interaction Lab
Thomas Seifried
Media Interaction Lab
Michael Haller
Media Interaction Lab
Daisuke Sakamoto
University of Tokyo
Jun Kato
University of Tokyo
Masahiko Inami
Keio University
Stacey D. Scott
University of Waterloo
CRISTAL received the Best Emerging Technology Award at the 36th International Conference and Exhibition on Computer Graphics and Interactive Techniques (SIGGRAPH 2009).

Below is a sample of the Media Interaction Lab's publications:

M. Haller, P. Brandl, C. Richter, T. Seifried, J. Leitner, and A. Gokcezade, 2009.
"Interactive Displays and Next-Generation Interfaces." Springer, 2009. [bibtex]

C. Köffel, W. Hochleitner, J. Leitner, M. Haller, A. Geven, and M. Tscheligi, 2009.
"Using Heuristics to Evaluate the Overall User Experience of Video Games and Advanced Interaction Games." Springer, 2009. [in press] [bibtex]

M. Haller, C. Forlines, C. Koeffel, J. Leitner, and C. Shen, 2009.
"Tabletop Games: Platforms, Experimental Games and Design Recommendations." Springer, 2009. in press [bibtex]

J. Leitner, C. Köffel, and M. Haller, 2009.
"Bridging the gap between real and virtual objects for tabletop games," International Journal of Virtual Reality, vol. 7, pp. 33-40, 2009. in press [bibtex]

J. Leitner, M. Haller, K. Yun, W. Woo, M. Sugimoto, M. Inami, A. D. Cheok, and H. D. Been-Lirn, 2009.
"Physical Interfaces For Tabletop Games," Computer Entertainment, vol. XX, p. XX, 2009. [bibtex]

M. Haller and M. Billinghurst, 2008.
"Interactive Tables: Requirements, Design Recommendations, and Implementation."

D. Leithinger and M. Haller, 2007.
"Improving Menu Interaction for Cluttered Tabletop Setups with User-Drawn Path Menus," Horizontal Interactive Human-Computer Systems, 2007. TABLETOP 07. Second Annual IEEE International Workshop on, pp. 121-128, 2007. [bibtex]

D. Regenbrecht, M. Haller, J. Hauber, and M. Billinghurst, 2006.
"Carpeno: interfacing remote collaborative virtual environments with table-top interaction," Virtual Reality, vol. 10, iss. 2, pp. 95-107, 2006. [bibtex]

One of the people involved in the CRISTAL project is Stacey D. Scott, Ph.D., an assistant professor of systems design engineering at the University of Waterloo and director of the Collaborative Systems Laboratory. The lab focuses on fundamental interfaces and interaction techniques for shared large-screen displays, such as multi-display environments and socially supportive digital tabletop interfaces, as well as collaborative and decision-support interfaces for complex, time-critical team environments.

Dr. Scott is also one of the program co-chairs of the upcoming ACM Interactive Tabletops and Surfaces 2009 Conference, which will be held November 23-25 in Banff, Canada.

Mark your calendars!

The following topics, as they relate to interactive tabletops and surfaces, will be presented:

  • Applications
  • Gesture-based interfaces
  • Multi-modal interfaces
  • Tangible interfaces
  • Novel interaction techniques
  • Data handling/exchange on large interactive surfaces
  • Data presentation on large interactive surfaces
  • User-interface technology
  • Computer supported collaborative systems
  • Middleware and network support
  • Augmented reality
  • Social protocols
  • Information visualizations
  • Sensing and input technologies
  • Human-centered design & methodologies
Here is the "who's who" of interactive tabletops and surfaces- the Interactive Tabletops and Surfaces program committee:

Patrick Baudisch Hasso Plattner Institute Potsdam, Germany
Francois Berard University of Grenoble, France
Peter Brandl Media Interaction Lab, Upper Austria University of Applied Sciences, Austria
Andreas Butz University of Munich, Germany
Francois Coldefy Orange Labs, France
Morten Fjeld Chalmers University of Technology, Sweden
Kentaro Fukuchi University of Electro-Communications, Japan
Tovi Grossman Autodesk Research, Canada
Mark Hancock University of Calgary, Canada
Petra Isenberg University of Calgary, Canada
Yuichi Itoh Osaka University, Japan
Karrie Karahalios University of Illinois, USA
Hiro Kato Osaka University, Japan
Hideki Koike University of Electro-Communications, Japan
Frank Maurer University of Calgary, Canada
Max Mühlhäuser TU Darmstadt, Germany
Christian Muller-Tomfelde CSIRO-ICT Centre, Australia
Miguel Nacenta University of Saskatchewan, Canada
Patrick Olivier Newcastle University, UK
Jun Rekimoto Sony / University of Tokyo, Japan
Meredith Ringel Morris Microsoft Research, USA
Daisuke Sakamoto University of Tokyo, Japan
Yoichi Sato University of Tokyo, Japan
Chia Shen Harvard University, USA
Masahiro Takatsuka University of Sydney, Australia
Lucia Terrenghi Vodafone Group R&D, Germany
Bruce Thomas University of Southern Australia, Australia
Melanie Tory University of Victoria, Canada
Edward Tse SMART Technologies, Canada
Fred Vernier South-Paris University, France
Andy Wilson Microsoft Research, USA
Massimo Zancanaro Bruno Kessler Foundation (formerly ITC), Italy



If you are a university student researching interactive tabletops, multi-touch surfaces, and/or gesture interaction, I hope this post helps!

Oct 13, 2013

UPDATE: NUITEQ's Snowflake ABC has been updated, improved, and renamed: Snowflake MultiTeach®

August 24, 2016 Update

This post was originally published on October 13, 2013, and focused on Snowflake ABC, a software suite of entertaining educational apps developed by NUITEQ®.

Fast forward to 2016, and you'll see that the NUITEQ® team has made many great improvements. Along with the changes came a new name: Snowflake MultiTeach®.


Snowflake MultiTeach® was designed for multi-touch, multi-user displays and tables, but can be used on single-touch displays or traditional computers/laptops with a mouse. It works well for "front-of-class" presentations and has many features that support collaborative group work.



So what is new?

The Lessons app was added to the suite of applications, along with Nodes, a concept-mapping and presentation application that supports project- and problem-based learning, STEM, and STEAM.

The Lessons app is a powerful addition to Snowflake MultiTeach®.  Within this app,  teachers (and students) can build a variety of interactive, engaging lessons, learning games, and quizzes, with little effort. 

Snowflake MultiTeach® can support several lessons and apps running at the same time, using the Zones feature, which makes it ideal for students working in pairs or small groups.

To complement the Lessons app, the online MultiTeach® Lessons Community was developed to give teachers a way to share lessons they've created. The lessons posted on the MultiTeach® Lessons Community can be accessed from within the Lessons home screen, searched by subject and level, and imported directly into the Lessons home screen.

Lessons created with Snowflake MultiTeach® can also be exported directly from the Lessons home screen to the MultiTeach® Lessons Community so they can be shared with others. Snowflake MultiTeach® and sign-up for the MultiTeach® Lessons Community are required for access.

To support teachers as they make the transition from single-touch whiteboards to multi-user displays, the MultiTeach® Pinterest site was created.  There you can find curated and regularly updated information about how to use Snowflake MultiTeach®, 21st century teaching and learning, free digital resources for the classroom, and much more.

For information about how teachers can integrate Snowflake MultiTeach® into their classrooms, take some time to read the following blog posts. Share with your peers!

Infuse Arts into STEM Projects Using Collaborative Touch Screen Technology

Collaborative STEM Education and Careers Exploration

How to MultiTeach®: Introduction to Snowflake MultiTeach® Nodes App

Differentiated instruction through Snowflake MultiTeach®


Like what you see?

If you are interested in replacing your current single-touch interactive whiteboard with a multi-touch display, contact a representative from NUITEQ® by following the email link on the website, and you will be connected to someone in your region who can assist you. 

Here is the old post for reference.  

FYI: Most of the ABC apps shown below can be found within the current version of Snowflake MultiTeach®.

NUITEQ is a multi-touch & gesture tech company based in Sweden focused on collaborative natural user interfaces and interaction. Instead of using a traditional mouse and keyboard, NUITEQ's products support the effective use of fingers and hands to interact and explore digital content. I've been following this company since it was launched by Harry van der Veen in 2007 and have been happy to share NUITEQ's updates on this blog over the years.

NUITEQ's products can be found in public spaces, such as museums, retail centers, and transportation hubs,  at special events, such as trade shows and exhibitions, in collaborative work settings, such as board rooms and design studios, and in educational settings.   

One of the latest innovations from NUITEQ is Snowflake ABC, multi-touch software for children and teens. It runs on a range of multi-touch hardware surfaces, including Lenovo, 3M Touch Systems,  Zytronic, Flatfrog, PQ Labs, Touch International, Lumio, and others.  



Multitouch software Snowflake ABC for kids (K-12) from multitouch software on Vimeo.

In my opinion, Snowflake ABC would be great for use in homes, libraries, and classrooms. Although the video shows children using the software on large displays and tables, it works well on smaller 27-inch touch screens, preferably those that support multi-touch interaction. A free trial version is available. For parents considering purchasing a home computer, the Lenovo IdeaCentre A520 and other multi-touch "all-in-ones" come loaded with a variety of Snowflake apps.

Below is a screenshot of various Snowflake ABC educational apps, from the Snowflake Suite website.


Credit: NUITEQ

RELATED
Multi-touch software Snowflake ABC 1.3 for kids released (K-13) - Now with 18 languages out of the box!  NUITEQ Blog, 9/26/13
Snowflake ABC Brochure (pdf)
NUITEQ Signs OEM Agreement with Lenovo for its Multitouch Software Platform Snowflake Suite PRWEB, 8/31/12
Review: Lenovo IdeaCentre A520
Jason Palaszewski, 8/22/13, Lenovo Blog

Snowflake Interactive Display at Skellefteå Airport (Snowflake Suite)

Aug 10, 2013

Kinect Interaction to support people with disabilities: DoubleFine's Happy Action Theater/Kinect Party; OAK Air Switch and Face Switch, resources

I've been experimenting with gesture and touch-based applications for many years and I'm excited to see how things have unfolded over the past couple of years, especially in the field of special education.

Last week I downloaded DoubleFine's Kinect-based Happy Action Theater/Kinect Party to use during group activities with students I work with who have significant disabilities (including severe autism). I wish I had discovered this suite of games sooner!

I had loads of fun with students and colleagues as we explored some of the 36 creative, and sometimes zany, minigames.  I had heard that DoubleFine had launched something special, but didn't realize how awesome it was until I spent some serious playful time with it at home last weekend. I then tried it out at work this past week.  

If you are planning to explore Happy Action Theater/Kinect Party, keep in mind that it plays best when there are at least two people and an audience to cheer everything along. Through the use of blob detection algorithms, the games can handle up to six players at a time, which is perfect for small-group special-education classes.
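The blob-detection idea can be sketched roughly: silhouette blobs that stand close together are grouped into one player, and the count is capped at six. This is only an illustration of the concept, not DoubleFine's code; the gap threshold and coordinates are my assumptions.

```python
# Illustrative sketch: count players from silhouette blob positions,
# grouping nearby blobs and capping the result at six players.
MAX_PLAYERS = 6
GAP = 0.15  # blobs closer than this (normalized x) count as one player

def count_players(blob_xs):
    xs = sorted(blob_xs)
    if not xs:
        return 0
    players = 1
    for prev, cur in zip(xs, xs[1:]):
        if cur - prev > GAP:  # a large gap starts a new player group
            players += 1
    return min(players, MAX_PLAYERS)

print(count_players([0.1, 0.12, 0.5, 0.9]))  # 3
```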

The following trailer gives just a little hint of what this suite of mini-games is all about!


I noted that many of the games were effective in helping students become more aware of their peers. They began to play and interact with one another in ways I hadn't previously imagined. I especially liked the fact that many of the mini-games made it possible for students in wheelchairs to participate.

I look forward to exploring more of the games over the next few months and will follow up with a future post after I get more input from my colleagues (and students).

I learned about Kinect Party through my contact with people involved with the GestureSEN wiki. The wiki was created as part of a Professional Learning Community (PLC) for people who work with students in specialized schools, similar to the school where I work. It contains a wealth of information about the use of newer and emerging technologies, such as the iPad, Leap Motion, the Kinect, and eye-gaze systems, to support young people with significant disabilities, including autism. Some members of the GestureSEN wiki have learned to code or are in the process of doing so, motivated by what they've experienced so far with their students. (More information and links are listed in the "RELATED" section of this post.)

OAK

OAK was developed by RCAST at the University of Tokyo in collaboration with Microsoft Japan Co., Ltd. It uses the motion-tracking capabilities of Microsoft's Kinect sensor to create non-contact switches for people with limited mobility, giving them access to computers and other electronic devices and systems. The video below provides a nice overview of the OAK system.

The OAK Pro bundle includes the following applications:

The Air Switch software uses the distance/depth capabilities of the Kinect sensor to detect gestures of the head, hands, or other larger body parts to turn things on or off. Because the Kinect senses depth with infrared, the Air Switch also works in the dark. The color mode function captures movements of smaller parts of the body, such as a fingertip.

The Face Switch software uses facial-tracking technology to follow movements of the face, mouth, tongue, and eyes. It can identify facial parts that have moved significantly and records the motion data.

The Motion History software observes the movement of a person's body using the video component of the Kinect sensor. Movements are color-coded, giving the person setting up the system a way to fit it to the specific capabilities and needs of the user. This customizes the system to the individual and ensures the accuracy of the switch.
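The core idea behind a depth-based "air switch" can be sketched very simply: calibrate a baseline depth for a small target zone, then fire the switch whenever enough of that zone moves closer to the sensor than a set threshold. The sketch below illustrates that idea only; all names and thresholds are hypothetical and this is not the actual OAK software:

```python
# Sketch of the general idea behind a depth-based "air switch":
# compare live depth readings in a small target zone against a
# calibrated baseline, and fire the switch when enough of the zone
# moves toward the sensor.  Hypothetical names -- not the OAK API.

SWITCH_THRESHOLD_MM = 150   # how far (mm) a movement must reach toward the sensor
MIN_TRIGGER_FRACTION = 0.3  # fraction of zone pixels that must move

def air_switch_triggered(baseline_zone, live_zone):
    """baseline_zone / live_zone: depth readings (mm) for the same
    pixels of the target zone, at calibration time vs. right now."""
    moved = sum(
        1 for base, live in zip(baseline_zone, live_zone)
        if base - live > SWITCH_THRESHOLD_MM  # pixel is now closer to sensor
    )
    return moved >= MIN_TRIGGER_FRACTION * len(baseline_zone)

# Calibration: the zone is empty space about 2 m from the sensor.
baseline = [2000] * 10
# A hand reaches into the zone, covering half the pixels at ~1.2 m.
live = [1200] * 5 + [2000] * 5
print(air_switch_triggered(baseline, live))  # True -> switch fires
```

Tuning the zone, threshold, and trigger fraction to each individual is essentially what the calibration step in a system like this is doing: a user with only slight head movement needs a small zone and a low threshold, while a user with gross arm movement can use a larger, more forgiving one.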

The OAK system can be enhanced by the use of peripherals, such as a USB 4-channel relay box, an IR remote control device or outlet, or other on/off switches/outlets.

The Assist-i corporation has made the OAK system and peripherals available on Amazon Japan. From what I can tell from the company's website, the OAK software can be downloaded for a free 30-day trial. I'd love to see how it would work with some of the students I work with who have difficulty accessing conventional switches! It would be wonderful to come up with ways for these students to access a wider range of digital media activities and games.

RELATED
University adapting videogame technology to help physically disabled computer users
Philip Kendall, Japan Today, 10/10/12
OAK Air Switch (PC Kinect)
OneSwitch.org.uk, 4/30/13
OAK Air Switch, Face Switch, Motion History Pro Bundle (pdf)
Assist-i Corporation
Amazon Ai store: Assist-i Corporation (Prices are in Yen.)

Below is a partial list of links to resources related to using or creating engaging interactive applications and games for people with special needs: 

Using Kinect in Special Ed Classrooms: Advice from Loudoun County, Virginia Teachers
Microsoft in Education Team, Microsoft in Education Blog, 6/1/12

KinectSEN - Kinect and Special Educational Needs round-up
Greg Duncan, Microsoft Coding4Fun, 10/11/12

Monkeying Around with Autism Assessments: Kinect-based game by Vectorform and Kaiser Permanente therapists offers a barrel of possibilities!
Lynn Marentette, Interactive Multimedia Technology, 7/23/13

Behind the Scenes: Creating Marty the Monkey (The character from Vectorform's autism assessment app)
John Einselen, Vectorform Blog, 7/24/13


Kinect Party Review: More Fun from the Fun Kings
Casey Lynch, IGN, 12/20/12


The Power of Kinect in Special Needs Education
Willemijn de Lint, Hans Smeele, mytylschool De Ruimte

Sign Language Recognition and Translation with Kinect (pdf)
Ming Zhou et al.

Cool Kinect move: Reading sign language in real time
Christopher MacManus, CNET, 7/18/13

Anthony Rhys, Trinity Fields ICT

James Winchester, SENClassroom blog

PMLD Eyegaze Project at Trinity Fields

Kinect hacking using Processing

Kinect SEN and Processing Resources
Keith Manville, Oak Grove College OpenSEN

Mat's Classroom Blog

GestureSEN Wiki
KinectSEN Wiki; KinectSEN News
ProcessingSEN wiki
LeapSEN Wiki
EyegazeSEN Wiki

SEN Students and Coding
OpenSEN, 3/5/13

Processing2

Kinect for Windows Blog

Kinect For Windows
DoubleFine

Understanding Engagement, Module 3.2
Training materials for teachers of learners with severe, profound and complex learning difficulties, UK Dept. for Education

