Showing posts sorted by date for query gesture.

Oct 13, 2013

UPDATE: NUITEQ's Snowflake ABC has been updated, improved, and renamed: it is now Snowflake MultiTeach®.

August 24, 2016 Update

This post was originally published on October 13, 2013, and focused on Snowflake ABC, a software suite of entertaining educational apps developed by NUITEQ®.

Fast forward to 2016, and you'll see that the NUITEQ® team has made many great improvements. Along with the changes came a new name: Snowflake MultiTeach®.


Snowflake MultiTeach® was designed for multi-touch, multi-user displays and tables, but can be used on single-touch displays or traditional computers/laptops with a mouse. It works well for "front-of-class" presentations and has many features that support collaborative group work.



So what is new?

The Lessons app was added to the suite of applications, along with Nodes, a concept-mapping and presentation application that supports project- and problem-based learning, STEM, and STEAM.

The Lessons app is a powerful addition to Snowflake MultiTeach®. Within this app, teachers (and students) can build a variety of interactive, engaging lessons, learning games, and quizzes with little effort.

Snowflake MultiTeach® can support several lessons and apps running at the same time, using the Zones feature, which makes it ideal for students working in pairs or small groups.

To complement the Lessons app, the online MultiTeach® Lessons Community was developed to provide a way for teachers to share lessons they've created. The lessons posted on the MultiTeach® Lessons Community can be accessed from within the Lessons home screen, searched by subject and level, and imported directly.

Lessons created with Snowflake MultiTeach® can also be exported directly from the Lessons home screen to the MultiTeach® Lessons Community so they can be shared with others. Access requires Snowflake MultiTeach® and a MultiTeach® Lessons Community account.

To support teachers as they make the transition from single-touch whiteboards to multi-user displays, the MultiTeach® Pinterest site was created.  There you can find curated and regularly updated information about how to use Snowflake MultiTeach®, 21st century teaching and learning, free digital resources for the classroom, and much more.

For information about how teachers can integrate Snowflake MultiTeach® into their classrooms, take some time to read the following blog posts. Share with your peers!

Infuse Arts into STEM Projects Using Collaborative Touch Screen Technology

Collaborative STEM Education and Careers Exploration

How to MultiTeach®: Introduction to Snowflake MultiTeach® Nodes App

Differentiated instruction through Snowflake MultiTeach®


Like what you see?

If you are interested in replacing your current single-touch interactive whiteboard with a multi-touch display, contact a representative from NUITEQ® via the email link on the website; you will be connected to someone in your region who can assist you.

Here is the old post for reference.  

FYI: Most of the ABC apps shown below can be found within the current version of Snowflake MultiTeach®.

NUITEQ is a multi-touch & gesture tech company based in Sweden focused on collaborative natural user interfaces and interaction. Instead of using a traditional mouse and keyboard, NUITEQ's products support the effective use of fingers and hands to interact and explore digital content. I've been following this company since it was launched by Harry van der Veen in 2007 and have been happy to share NUITEQ's updates on this blog over the years.

NUITEQ's products can be found in public spaces, such as museums, retail centers, and transportation hubs; at special events, such as trade shows and exhibitions; in collaborative work settings, such as board rooms and design studios; and in educational settings.

One of the latest innovations from NUITEQ is Snowflake ABC, multi-touch software for children and teens. It runs on a range of multi-touch hardware surfaces, including Lenovo, 3M Touch Systems, Zytronic, FlatFrog, PQ Labs, Touch International, Lumio, and others.



Multitouch software Snowflake ABC for kids (K-12) from multitouch software on Vimeo.

In my opinion, Snowflake ABC would be great for use in homes, libraries, and classrooms. Although the video shows children using the software on large displays and tables, it works well on smaller 27-inch touch screens, preferably those that support multi-touch interaction. A free trial version is available.  For parents considering purchasing a home computer, the Lenovo IdeaCenter A520 and other multi-touch "all-in-ones" come loaded with a variety of Snowflake apps.

Below is a screenshot of various Snowflake ABC educational apps, from the Snowflake Suite website.


Credit: NUITEQ

RELATED
Multi-touch software Snowflake ABC 1.3 for kids released (K-13) - Now with 18 languages out of the box!  NUITEQ Blog, 9/26/13
Snowflake ABC Brochure (pdf)
NUITEQ Signs OEM Agreement with Lenovo for its Multitouch Software Platform Snowflake Suite PRWEB, 8/31/12
Review: Lenovo IdeaCentre A520
Jason Palaszewski, 8/22/13, Lenovo Blog

Snowflake Interactive Display at Skellefteå Airport (Snowflake Suite)

Aug 10, 2013

Kinect Interaction to support people with disabilities: DoubleFine's Happy Action Theater/Kinect Party; OAK Air Switch and Face Switch, resources

I've been experimenting with gesture and touch-based applications for many years and I'm excited to see how things have unfolded over the past couple of years, especially in the field of special education.

Last week I downloaded Double Fine's Kinect-based Happy Action Theater/Kinect Party to use during some group activities with students I work with who have significant disabilities (including severe autism). I wish I had discovered this suite of games sooner!

I had loads of fun with students and colleagues as we explored some of the 36 creative, and sometimes zany, minigames.  I had heard that DoubleFine had launched something special, but didn't realize how awesome it was until I spent some serious playful time with it at home last weekend. I then tried it out at work this past week.  

If you are planning to explore Happy Action Theater/Kinect Party, keep in mind that it plays best when there are at least two people and an audience to cheer everything along. Through the use of blob-detection algorithms, the games can handle up to six players at a time, which is perfect for small-group special classes.

The following trailer gives just a little hint of what this suite of mini-games is all about!


I noted that many of the games were effective in helping students become more aware of their peers. They began to play and interact with one another in ways I hadn't previously imagined. I especially liked the fact that many of the mini-games made it possible for students in wheelchairs to participate.

I look forward to exploring more of the games over the next few months and will follow up with a future post after I get more input from my colleagues (and students).

I learned about Kinect Party through my contact with people involved with the GestureSEN wiki. The wiki was created as part of a Professional Learning Community (PLC) for people who work with students in specialized schools, similar to the school where I work. It contains a wealth of information about the use of newer and emerging technologies, such as the iPad, Leap Motion, the Kinect, and eye-gaze systems, to support young people with significant disabilities, including autism. Some members of the GestureSEN wiki have learned to code or are in the process of doing so, motivated by what they've experienced so far with their students. (More information and links are listed in the "RELATED" section of this post.)

OAK

OAK was developed by RCAST at the University of Tokyo in collaboration with Microsoft Japan Co., Ltd. It uses the motion-tracking capabilities of Microsoft's Kinect sensor to create non-contact switches for people with limited mobility, enabling them to access computers and other electronic devices and systems. The video below provides a nice overview of the OAK system.

The OAK Pro bundle includes the following applications:

The Air Switch software uses the distance/depth capabilities of the Kinect sensor so that gestures of the head, hands, or larger body parts can turn things on or off. Because the Kinect senses with infrared, the Air Switch also works in the dark. The color mode function captures movements from smaller parts of the body, such as a fingertip.

The Face Switch software uses facial recognition to track the movements of the face, mouth, tongue, and eyes. It can identify facial parts that have moved significantly and records motion data.

The Motion History software observes the movement of a person's body using the video component of the Kinect sensor. Movements are color-coded, which gives the person setting up the system a way to fit it to the specific capabilities and needs of the user; this customizes the system to the individual and helps ensure the accuracy of the switch.
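OAK's actual implementation isn't published here, but the core idea of a depth-based, non-contact switch like the Air Switch can be sketched roughly as follows. The class and parameter names are my own illustration, not OAK's API: the switch toggles when something (a hand, a head) moves closer to the sensor than a calibrated threshold inside a region of interest.

```python
# Illustrative sketch only (not OAK's code): a depth-camera "air switch"
# that toggles when anything in a region of interest comes closer to the
# sensor than a calibrated distance, e.g. a hand raised toward the Kinect.

class AirSwitch:
    def __init__(self, threshold_mm, roi):
        """roi is (top, left, bottom, right) in pixel coordinates."""
        self.threshold_mm = threshold_mm
        self.roi = roi
        self.on = False
        self._was_triggered = False

    def update(self, depth_frame):
        """depth_frame: 2D list of depth readings in millimeters.
        Returns True if the switch toggled on this frame."""
        top, left, bottom, right = self.roi
        # Nearest depth reading inside the region of interest.
        nearest = min(
            depth_frame[r][c]
            for r in range(top, bottom)
            for c in range(left, right)
        )
        triggered = nearest < self.threshold_mm
        toggled = triggered and not self._was_triggered  # rising edge only
        if toggled:
            self.on = not self.on
        self._was_triggered = triggered
        return toggled
```

The rising-edge check matters for accessibility: a user who holds a gesture shouldn't fire the switch repeatedly, which is presumably why systems like OAK let you tune the fit to each user's movement range.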

The OAK system can be enhanced by the use of peripherals, such as a USB four-channel relay box, an IR remote-control device or outlet, or other on/off switches/outlets.

The Assist-i corporation has made the OAK system and peripherals available on Amazon Japan.  From what I can tell from the company's website, the OAK software can be downloaded free for a 30-day trial.   I'd love to see how it would work with some of the students I work with who have difficulty accessing conventional switches!  It would be wonderful to come up with ways for these students to access a wider range of digital media activities and games.


RELATED
University adapting videogame technology to help physically disabled computer users
Philip Kendall, Japan Today, 10/10/12
OAK Air Switch (PC Kinect)
OneSwitch.org.uk 4/30/13
OAK Air Switch, Face Switch, Motion History Pro Bundle (pdf)
Assist-i Corporation
Amazon Ai store: Assist-i Corporation (Prices are in Yen.)

Below is a partial list of links to resources related to using or creating engaging interactive applications and games for people with special needs: 

Using Kinect in Special Ed Classrooms: Advice from Loudoun County, Virginia Teachers
Microsoft in Education Team, Microsoft in Education Blog, 6/1/12

KinectSEN-Kinect and Special Educational Needs round-up
Greg Duncan, Microsoft Coding4Fun, 10/11/12

Monkeying Around with Autism Assessments: Kinect-based game by Vectorform and Kaiser Permanente therapists offers a barrel of possibilities!
Lynn Marentette, Interactive Multimedia Technology, 7/23/13

Behind the Scenes: Creating Marty the Monkey (The character from Vectorform's autism assessment app) John Einselen, Vectorform Blog, 7/24/13


Kinect Party Review: More Fun from the Fun Kings
Casey Lynch, IGN, 12/20/12


The Power of Kinect in Special Needs Education
Willemijn de Lint,  Hans Smeele, mytylschool De Ruimte

Sign Language Recognition and Translation with Kinect (pdf)
Ming Zhou, et. al.

Cool Kinect move: Reading sign language in real time
Christopher MacManus, CNET, 7/18/13

Anthony Rhys, Trinity Fields ICT

James Winchester, SENClassroom blog

PMLD Eyegaze Project at Trinity Fields

Kinect hacking using Processing

Kinect SEN and Processing Resources
Keith Manville, Oak Grove College OpenSEN

Mat's Classroom Blog

GestureSEN Wiki
KinectSEN Wiki; KinectSEN News
ProcessingSEN wiki
LeapSEN Wiki
EyegazeSEN Wiki

SEN Students and Coding
OpenSEN, 3/5/13

Processing2

Kinect for Windows Blog

Kinect For Windows
DoubleFine

Understanding Engagement, Module 3.2: Training materials for teachers of learners with severe, profound and complex learning difficulties, UK Dept. for Education



May 27, 2013

Leap Motion and Google Earth Experiment: Cute Doggie Photo-globe Mashup


I finally experimented with my Leap Motion controller and Google Earth, using a mashup I created a few years ago with pictures of cute dogs from my Flickr photo-stream.  In the video below, you can see that my gesture navigation skills still need some practice!

I should have watched the following video of Leap Motion in action with Google Earth before trying this experiment at home : )  

I am pretty sure that developers will be able to tweak Leap Motion + Google Earth interaction in the near future.  I'd like to adapt it for use with kids as well as adults who have mild motor impairments.





















Cute Doggies Photo-Globe Mash-up using Google Earth and a Flickr Set (How-to)

If you'd like to make your very own photo-globe using Google Earth and Flickr photos, here are the directions, ported and updated from a previous post:


This photo is a screenshot showing just about every dog I know, and some that happened to cross my path. In this post, I'll share some information about how to create a photo-globe in Google Earth.

The first step is to make sure you have lots of pictures related to your theme uploaded to a site such as Flickr.  (You can also create a photo-globe using pictures from your computer's hard drive.)

To get the pictures into Google Earth, I used the Image Overlay feature, and in the "link" textbox, I entered the image URL for each picture that I'd previously loaded as a set in Flickr.



To prepare for this, go to the "View" tab in the upper left-hand section of your screen and make sure that "Toolbar" is checked. Also make sure that "Grid" is selected, as this will help make it easier to arrange and align your pictures. (You can turn off this feature later.) Near the top of the screen, click on the Image Overlay icon. (I've highlighted it in the picture.)



You'll have to enter the URL of the image you'd like to add to the globe in the "Link" textbox, which I've highlighted in the above picture.  In this case, I've used a link to one of my pictures in a Flickr set I created for this project.

One thing to keep in mind is that the picture will take up a much larger space than you might prefer, so you'll have to adjust the size using the green markers:

Positioning the Overlay in the Viewer
The following directions are from the "Positioning the Imagery in the Viewer" section in the help section:


  1. Use the center cross-hair marker to slide the entire overlay on the globe and position it from the center. (Tip: do this first.)
  2. Use the triangle marker to rotate the image for better placement.
  3. Use any of the corner cross-hair markers to stretch or skew the selected corner. If you press the Shift key when selecting this marker, the image is scaled from the center.
  4. Use any of the four side anchors to stretch the image in or out from the selected side. If you press the Shift key when doing this, the image is scaled from the center.

TIP:  Try positioning the center of the image as a reference point first, and then use the Shift key in combination with one of the anchors to scale the image for best positioning.

Directions updated to reflect latest version of Flickr, as of 5/27/13:

To find the image URL for a photo in Flickr that you wish to link on your photo-globe, select the desired photo, right-click it, and choose "Copy Image URL".

Put your cursor in the Link section of the "New Image Overlay" dialog box in Google Earth, and right-click to select "Paste" from the drop-down menu.

Then repeat the process.  It helps to name each picture so that you can find it easily in Google Earth.

To enhance your mash-up, you can add place-marks that contain URLs that link to additional information about the subject of a picture, such as blog posts with embedded videos and/or text related to a picture, and so forth. Directions can be found in Google Earth's help section.

The process of building a photo-globe in Google Earth is a bit tedious.  If someone has a short-cut to share, please let me know!
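If you're comfortable with a little scripting, one possible shortcut (my own sketch, not an official Google Earth feature) is to generate a KML file of GroundOverlay elements, one per photo URL, and open it in Google Earth. GroundOverlay, Icon, and LatLonBox are standard KML; the tile placements below are arbitrary starting points you would still fine-tune with the green markers.

```python
# Sketch of a shortcut for the manual Image Overlay steps: generate a KML
# file that Google Earth can open directly. Each photo URL becomes one
# GroundOverlay tile laid out in rows around the globe.

def photo_globe_kml(photo_urls, start_lon=-180.0, tile_deg=30.0):
    overlays = []
    lon = start_lon
    lat = 30.0
    for i, url in enumerate(photo_urls):
        overlays.append(f"""
  <GroundOverlay>
    <name>photo-{i}</name>
    <Icon><href>{url}</href></Icon>
    <LatLonBox>
      <north>{lat + tile_deg / 2}</north>
      <south>{lat - tile_deg / 2}</south>
      <east>{lon + tile_deg}</east>
      <west>{lon}</west>
    </LatLonBox>
  </GroundOverlay>""")
        lon += tile_deg
        if lon >= 180.0:          # wrap to the next row of tiles
            lon = start_lon
            lat -= tile_deg
    body = "".join(overlays)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
            f'<Document>{body}\n</Document>\n</kml>\n')

# Write the result to a .kml file, then open it in Google Earth:
# open("doggies.kml", "w").write(photo_globe_kml(my_flickr_urls))
```

Naming each overlay (photo-0, photo-1, ...) mirrors the tip above about naming pictures so they're easy to find in Google Earth's Places panel.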


RESOURCES
Google Earth
Flickr
Programmable Web (My hunch is that this site might provide some information about shortcuts for creating a photo-globe in Google Earth.)
LEAP Motion

Mar 16, 2013

UPDATE: What's New for Kinect? Fusion, real-time 3D digitizing, design considerations, and more.

The Evolution of Microsoft Kinect

I've been following the evolution of Microsoft's Kinect, and recently discovered a few interesting videos that show how far the system has come. According to Josh Blake, the founder of the OpenKinect community and author of the Deconstructing the NUI blog,  the Kinect for Windows SDK v1.7 will be released on Monday, March 18th, from http://www.kinectforwindows.com.  More details about this version can be found on Josh's blog as well as the official Kinect for Windows blog.


It is possible to create applications for desktop systems that work with the Kinect in interesting ways, as you'll see in the following videos. I think there is potential here for use in education/edutainment!

Below is a video of Toby Sharp, of Microsoft Research, Cambridge, demonstrating Kinect Fusion.  The software allows you to use a regular Kinect camera to reconstruct the world in 3D.



KinEtre: A Novel Way to Bring Computer Animation to Life
According to information from the YouTube description, "KinÊtre is a research project from Microsoft Research Cambridge that allows novice users to scan physical objects and bring them to life in seconds by using their own bodies to animate them. This system has a multitude of potential uses for interactive storytelling, physical gaming, or more immersive communications."




The following videos are quite long, so feel free to re-visit this post when you have time to relax and take it all in!

Kinect Design Considerations
This video covers Microsoft's Human Interface Guidelines, scenarios for interaction and use, and best practices for user interactions.  It also includes a preview of the next major version of the Kinect SDK. 


Kinect for Windows Programming Deep Dive
This video discusses how to build Windows Desktop apps and experiences with the Kinect, and also previews some future work.




RELATED
Kinect for Windows Developer Downloads
Kinect for Windows Blog
Deconstructing the NUI Blog (Josh Blake)
Microsoft Kinect Learns to Read Hand Gestures, Minority Report-Style Interface Now Possible
Celia Gorman, IEEE Spectrum, 3/13/13
Kinect hand recognition due soon, supports pinch-to-zoom and mouse click gestures.
Tom Warren, The Verge, 3/6/13
Microsoft's KinEtre Animates Household Objects
Samuel K. Moore, IEEE Spectrum, 8/8/12
Kinect Fusion Lets You Build 3-D Models of Anything
Celia Gorman, IEEE Spectrum, 3/6/13
Description of Kinect sessions at Build 2012
Kinect for every developer!
Tom Kerkhove, Kinecting for Windows, 2/15/13
Kinect in the Classroom
Kinect Education

Note: Although I recently received my developer kit for Leap Motion, another gesture-based interface, I haven't lost interest in following news for Kinect.

Mar 11, 2013

Leap Motion: My Dev Kit Arrived - Now What?! Thoughts About "NUI" Child-Computer-Tech-Interaction - and More



My Leap Motion developer kit arrived last week. I carefully unboxed the small device and tried out the demo apps that came with the SDK.  I'm doing more looking than leaping at this point.

I'd like to create a simple cause-and-effect music, art and movement application for my 2-year-old grandson, knowing that he'll be turning three near the end of this year.  It would be nice if my app could provide young children with enough scaffolding to support gameplay and learning over a few years of development.

Now that I'm a grandmother, I've spent some time thinking about what the evolution of NUI will mean for young children like my grandson. Family and friends captured his first moments after birth with iPhones and shared them across the Internet. Born into the iWorld, he knows how to use an iPad or smartphone to view his earlier digital self on YouTube, without ever touching a mouse or a physical keyboard.

The little guy is pretty creative in his method of interacting with technology, as I've informally documented on video.   He was seven months old when he first encountered my first iPad.  It was fingers-and-toes interaction from the start.  

In the first picture below,  he's playing with NodeBeat.  In the second picture, he's 27 months old, experimenting with hand and foot interaction, on a variety of apps.

My grandson is new to motion-control applications, so I'm just beginning to learn what he likes and what he is capable of doing. A couple of weeks ago, we played River Rush, from the Kinect Adventures game. He loved jumping up and down as he tried to hit the adventure pins. Most of the time, he kept jumping right out of the raft! (I think next time we'll try Kinect Sesame Street TV or revisit Kinectimals.)


One of the steps I'm taking to prepare for my Leap Motion adventure is to take a look at what people have done with it so far. At least 12,000 developer kits have been released, so hopefully there will be some interesting apps to go along with the retail version of Leap Motion when it arrives at Best Buy on May 19th of this year.

One app I really like is Adam Somers' AirHarp, featured in the video clip below:


I also like the idea behind the following app, developed by undergraduate students:

Social Sign: Multi-User sign language gesture translator using the Leap Motion Controller (git.to/socialSign)
 
"Built at the PennApps Spring 2013 hackathon, Social Sign is a friendly tool for learning sign language! By using the Leap Motion device, the BadApples team implemented a rudimentary machine learning algorithm to track and identify American Sign Language from a user's hand gestures."

"Social Sign visualizes these hand gestures and broadcasts them in textual and visual representations to other signers in a signing room. In a standard chat room fashion, the interface permits written communication but with the benefit of enhanced learning in mind. It's all about learning a new way to communicate."-BadApples Team
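The BadApples team doesn't detail its algorithm beyond calling it rudimentary machine learning. Purely as an illustration of one common approach to this kind of problem, here is a minimal nearest-neighbor sketch over hand-pose feature vectors (for example, fingertip positions reported by the Leap Motion SDK); every name in it is my own, not Social Sign's code.

```python
# Illustrative sketch: nearest-neighbor gesture recognition over hand-pose
# feature vectors. Train with labeled examples, then classify a new pose
# by finding the closest stored example in feature space.
import math

def distance(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class GestureClassifier:
    def __init__(self):
        self.examples = []   # list of (feature_vector, label) pairs

    def train(self, features, label):
        self.examples.append((features, label))

    def classify(self, features):
        """Return the label of the closest stored example."""
        _, label = min(
            ((distance(features, f), lbl) for f, lbl in self.examples),
            key=lambda pair: pair[0],
        )
        return label
```

For a signing app, the feature vector would come from the live tracking stream, and each letter of the manual alphabet would contribute several training examples to absorb hand-size and orientation differences between users.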



There are a few NUI-focused tech companies that have experimented with Leap Motion. Today, I received a link from Joanna Taccone of IntuiLab to the following video clip, featuring their most recent work:
Gesture recognition with Leap Motion using IntuiFace Presentation

"Preview of our work with the Leap Motion controller. In the same spirit as our support for Microsoft Kinect, we have encoded true gesture support, not just mouse emulation, for the creation of interactive applications by non-programmers. The goal is to hide complexity from designers using our product, IntuiFace Presentation (IP). Through the use of IP's trigger/action syntax, designers simply select a gesture as a trigger - Swipe Left, Swipe Right, Point, etc. - and associate that gesture with an action like "turn the page" or "rotate the carousel". As you can see in this video, it works quite well. :-) We will offer Leap support as soon as it ships." -IntuiLab



Below is a demonstration of guys playing Drop Chord, a collaboration between Leap Motion and Double Fine. From the video, you can tell that they had a blast!

Here is an excerpt from the chatter: "The thing is that everyone just looks cool... Yeah, I know, it doesn't matter what you are doing... it's got the right amount of speed-up-slow-down stutter-y stuff... it is like a blend of art and science."

According to the website, Drop Chord is "a music-driven score challenge game for the Leap Motion controller, coming soon for PC, Mac, & iOS from the creators of Kinect Party."

The following video is a demonstration of the use of Leap Motion to control an avatar and other interaction in Second Life:



Below are a few more videos featuring Leap Motion:


Control Your Computer With a Chopstick: Leap Motion Hands On (Mashable)


The Leap Motion Experience at SXSW 2013


LEAP Motion demo: Visualizer, Windows 8, Fruit Ninja, and More...



RELATED
Air Harp for Leap Motion, Responsive Interaction
Leap Motion and Double Fine team on Dropchord, give air guitar skills an outlet
John Fingas, Engadget, 3/7/13
Leap Motion Controller Set To Ship May 13 for Global Pre-Orders, In Best Buy Stores May 19.
Hands on With Leap Motion's Controller
Lance Ulanoff, Mashable, 3/10/13
Leap Motion website
Social Sign
IntuiLab
Leap Motion: Low Cost Gesture Control for Your Computer Display

SOMEWHAT RELATED
Kinect for Windows Academic: Kaplan Early Learning
"3 years & up. Hands-on play with a purpose -- the next generation way. This unique learning tool uses your body as the game controller making it a great opportunity to combine active play and learning all in one. Use any surface to actively engage kinesthetic, visual, and audio learners. Bundle includes the following software: Word Pop, Directions, Patterns, and Shapes."

Comment:
I've been an enthusiastic supporter of natural user interfaces and interaction for years; back in 2007 I worked on touch-screen applications for large displays as a graduate student and became an early member of the NUI Group. I'm also a school psychologist, and from my experience, I understand how NUI-based applications and technologies, such as interactive whiteboards and touch tablets like the iPad, can support the learning, communication, and leisure needs of students who have significant special needs. It looks like Leap Motion and similar technologies have the potential to support a wide range of applications that target special populations of all ages.

Feb 20, 2013

Disney Research: Touché, Touch and Gesture Sensing

The following video is a demonstration of something called Swept Frequency Capacitive Sensing (SFCS). It recognizes various configurations of the hands and body during interactions. This system differs from conventional capacitive touch sensing in that it senses a range of frequencies to develop a capacitive profile, which provides a significant amount of data that can be analyzed and used in an application.

At 1:23, SFCS is demonstrated on a table, sensing body posture or body configuration. It is a wireless system, can be used on smaller touch screens such as those on mobile devices, and can even recognize interactions in liquids.


Touché was awarded Best Paper at ACM CHI 2012:

RELATED
Touche: Touch and Gesture Sensing for the Real World
Disney Research
Sato, M., Poupyrev, I, and Harrison, C. Touché: Enhancing Touch Interaction on Humans, Screens, Liquids, and Everyday Objects. In Proceedings of CHI’12. 2012. ACM.
Paper [PDF, 10Mb]
Touche with Arduino
Swept Frequency Capacitive Sensing (SFCS)
Audrey Cropp, Responsive Landscapes, 2/18/13

SOMEWHAT RELATED
Synthetic Ecologies Course Reading List
Responsive Environments Course
Allen Sayegh, Harvard Graduate School of Design



AirHarp for Leap Motion, a Responsive Musical Natural User Interface

I like this demonstration of Adam Somers' AirHarp music application for use with the Leap Motion 3D controller:


AirHarp is being developed in C++ using Adam Somers' audio-processing toolkit, MusKit. This looks interesting! Things have changed since I last took a computer music technology course (back in 2003).

Adam Somers is a senior software engineer at Universal Audio.  He has a graduate degree in music technology from Stanford, and a background in computer science, electronics, human-computer interaction, and signal processing.

Leap Motion is a motion-control software and hardware start-up company located in San Francisco, California. According to promotional information from the website, the company's first product, the Leap Motion controller, is 200 times more sensitive than existing technologies.  It will be interesting to see how this plays out.  (I'm still waiting for my pre-order.)

RELATED
AirHarp (links to GitHub)
Leap FAQs
Leap Motion Website
Leap Motion Developer Portal
Leap Motion Leadership Team
Leap Motion goes retail: Motion controller sold exclusively at Best Buy
Michael Gorman, engadget, 1/16/13

Leap Motion: Low Cost Gesture Control for your Computer Display
Asus partners up with Leap Motion, PCs with 3D motion control to debut in 2013
Michael Gorman, engadget, 1/3/13
Stanford Center for Computer Research in Music and Acoustics


Feb 15, 2013

Designing for Touch & Gesture: Tips for Apps and the Web (Updated)

In the past, our fingers did the walking, sifting through files, papers, pamphlets, and phonebooks, and later pointing and clicking with a mouse to interact with images and text: in essence, electronic imitations of the paper-based world. Traditional forms, brochures, ad inserts, and posters informed much of the design.

How much have things changed? It is 2013, but you'd think it was 1997 from the PowerPoint look and feel of many apps and websites! Touch is everywhere, but from what I can tell, not enough designers and developers have stepped up to think more deeply about ways their applications can support human endeavors through touch and gesture interactions.

For an overview of this topic, take a look at my 2011 post, written after a number of ugly encounters with user-unfriendly applications:  Why bother switching from GUI to NUI?  

For an in-depth look into the history of multi-touch, the wisdom of Bill Buxton is well worth absorbing. He's worked with all sorts of interfaces and has been curating the history of multi-touch and gesture systems since 2007:


Multi-Touch Systems that I have Known and Loved
Bill Buxton, Microsoft Research, Updated 8/30/12



Even if you are not a designer or developer, I encourage you to explore some of the links below:

Touch Gestures for Application Design
Luke Wroblewski, 10/9/12

Common Misconceptions About Touch
Steven Hoober, 3/18/13

Designing With Tablets in Mind:  Six Tips to Remember
Connor Turnbull, Webdesign tuts+, 9/27/11

Finger-Friendly Design: Ideal Mobile Touchscreen Target Sizes
Anthony T, Smashing Magazine, 2/21/12

Best Practices: Designing Touch Tablet Experiences for Preschoolers (pdf)
Sesame Street Workshop


Are Touch Screens Accessible?
AccessIT, National Center on Accessible Information Technology in Education

iOS Human Interface Guidelines
Apple

Android User Interface Guidelines
Using Touch Gestures
Handling Multi-Touch Gestures
Android

Designing for Tablets?  We're Here to Help!
Roman Nurik, Android Developers Blog 11/26/12

Touch interaction design (Windows Store apps)
Microsoft - MSDN



Jan 17, 2013

XBox Kinect in the OR: Kinect supports gesture interaction with 3D imaging of the patient, while operating.

Here's an interesting use of technology for health - the Xbox Kinect in the OR!

Thanks to Harry van der Veen for the link!


RELATED
Kinect sensor poised to leap into everyday life
Niall Firth, NewScientist, 1/17/13

For the tech-curious:
PrimeSense (Company that developed the 3D depth sensor that powers the Kinect, the sensor in Ava, a healthcare robot by iRobot, and more.)

OpenNI (Framework for the development of 3D sensing middleware libraries and applications.)

NiTE: Natural Interface Technology for End User (Perception algorithms layer for 3D computer vision, allows for hand locating, tracking, analyzing scenes, and tracking skeleton joints.)

Jan 10, 2013

Gesture Markup Language (GML) for Natural User Interaction and Interfaces

Quick post:
"GML is an extensible markup language used to define gestures that describe interactive object behavior and the relationships between objects in an application.  Gesture Markup Language has been designed to enhance the development of multiuser multi-touch and other HCI device driven applications." -Gesture ML Wiki

GestureML was created and is maintained by Ideum.

More information to come!
The Pano

Photo credit: Ideum

RELATED
Ideum Blog

OpenExhibits: Free multitouch and multiuser software initiative for museums, education, nonprofits, and students

GestureWorks: Multi-touch authoring for Windows 8 & Windows 7