I'd like to create a simple cause-and-effect music, art, and movement application for my 2-year-old grandson, who will be turning three near the end of this year. It would be nice if my app could provide young children with enough scaffolding to support gameplay and learning over a few years of development.
Now that I'm a grandmother, I've spent some time thinking about what the evolution of NUI will mean for young children like my grandson. Family and friends captured his first moments after birth with iPhones and shared them across the Internet. Born into the iWorld, he knows how to use an iPad or smartphone to view his earlier digital self on YouTube, without ever touching a mouse or a physical keyboard.
The little guy is pretty creative in his method of interacting with technology, as I've informally documented on video. His first encounter with my iPad came when he was seven months old. It was fingers-and-toes interaction from the start.
In the first picture below, he's playing with NodeBeat. In the second picture, he's 27 months old, experimenting with hand and foot interaction, on a variety of apps.
My grandson is new to motion control applications, so I'm just beginning to learn what he likes, and what he is capable of doing. A couple of weeks ago, we played River Rush, from the Kinect Adventures game. He loved jumping up and down as he tried to hit the adventure pins. Most of the time, he kept jumping right out of the raft! (I think next time we'll try Kinect Sesame Street TV, or revisit Kinectimals.)
One of the steps I'm taking to prepare for my Leap Motion adventure is to take a look at what people have done with it so far. At least 12,000 developer kits have been released, so hopefully there will be some interesting apps to go along with the retail version of Leap Motion when it arrives at Best Buy on May 19th of this year.
One app I really like is Adam Somers' AirHarp, featured in the video clip below:
I also like the idea behind the following app, developed by undergraduate students:
Social Sign: Multi-User sign language gesture translator using the Leap Motion Controller (git.to/socialSign)
"Built at the PennApps Spring 2013 hackathon, Social Sign is a friendly tool for learning sign language! By using the Leap Motion device, the BadApples team implemented a rudimentary machine learning algorithm to track and identify American Sign Language from a user's hand gestures."
"Social Sign visualizes these hand gestures and broadcasts them in textual and visual representations to other signers in a signing room. In a standard chat room fashion, the interface permits written communication but with the benefit of enhanced learning in mind. It's all about learning a new way to communicate."-BadApples Team
There are a few NUI-focused tech companies that have experimented with Leap Motion. Today, I received a link to the following video clip from Joanna Taccone of IntuiLab, featuring their most recent work:
Gesture recognition with Leap Motion using IntuiFace Presentation
"Preview of our work with the Leap Motion controller. In the same spirit as our support for Microsoft Kinect, we have encoded true gesture support, not just mouse emulation, for the creation of interactive applications by non-programmers. The goal is to hide complexity from designers using our product, IntuiFace Presentation (IP). Through the use of IP's trigger/action syntax, designers simply select a gesture as a trigger - Swipe Left, Swipe Right, Point, etc. - and associate that gesture with an action like "turn the page" or "rotate the carousel". As you can see in this video, it works quite well. :-) We will offer Leap support as soon as it ships." -IntuiLab
Below is a demonstration of a few guys playing Drop Chord, a collaboration between Leap Motion and Double Fine. From the video, you can tell that they had a blast!
Here is an excerpt from the chatter: "The thing is that everyone just looks cool... Yeah, I know, it doesn't matter what you are doing... it's got the right amount of speed-up-slow-down stutter-y stuff... it is like a blend of art and science..."
According to the website, Drop Chord is "a music-driven score challenge game for the Leap Motion controller, coming soon for PC, Mac, & iOS from the creators of Kinect Party."
The following video is a demonstration of the use of Leap Motion to control an avatar and other interaction in Second Life:
Below are a few more videos featuring Leap Motion:
Control Your Computer With a Chopstick: Leap Motion Hands On (Mashable)
The Leap Motion Experience at SXSW 2013
LEAP Motion demo: Visualizer, Windows 8, Fruit Ninja, and More...
Air Harp for Leap Motion, Responsive Interaction
Leap Motion and Double Fine team on Dropchord, give air guitar skills an outlet
John Fingas, Engadget, 3/7/13
Leap Motion Controller Set To Ship May 13 for Global Pre-Orders, In Best Buy Stores May 19.
Hands on With Leap Motion's Controller
Lance Ulanoff, Mashable, 3/10/13
Leap Motion website
Leap Motion: Low Cost Gesture Control for Your Computer Display
Kinect for Windows Academic: Kaplan Early Learning
"3 years & up. Hands-on play with a purpose -- the next generation way. This unique learning tool uses your body as the game controller making it a great opportunity to combine active play and learning all in one. Use any surface to actively engage kinesthetic, visual, and audio learners. Bundle includes the following software: Word Pop, Directions, Patterns, and Shapes."
I've been an enthusiastic supporter of natural user interfaces and interaction for years - back in 2007, I worked on touch-screen applications for large displays as a graduate student and became an early member of the NUI Group. I'm also a school psychologist, and from my experience, I understand how NUI-based applications and technologies, such as interactive whiteboards and touch tablets like the iPad, can support the learning, communication, and leisure needs of students who have significant special needs. It looks like Leap Motion and similar technologies have the potential to support a wide range of applications that target special populations of all ages.