
Mar 18, 2009

More for Multi-touch: NextWindow Plug-in for Natural User Interface's Snowflake Multi-touch Software - and more.



Those of you who have an HP TouchSmart, a Dell Studio One PC, or a NextWindow display might be interested in the new NUI plug-in that supports the NUI Suite Snowflake software. Here are the features of the plug-in, according to information from the Natural User Interface website:
  • Detailed user manual included with FAQ
  • Developed on fast and reliable C++ platform
  • Intuitive
  • Customizable
  • Gesture recognition library
  • TUIO/OSC (Open Sound Control) support (sending and receiving events)
  • Low level API
  • Hardware accelerated rendering
  • Support for wide variety of media types
  • Advanced window handler that supports scaling and rotation
  • Suitable for Windows® XP and Windows® Vista (Mac OS X and Linux versions can be developed on request)
  • Audio support
  • Single- and dual-touch support
  • Multi-threaded resource handler (For fast data visualization)

"NUI has partnered up with NextWindow™, an international leader in the development of optical multi-touch technology and the manufacturer of optical multi-touch screens, overlays and OEM touch components."

"NextWindow™'s integrated technology allows for natural and intuitive interaction of digital content on flat TFT, LCD and Plasma solutions."

"The NUI NextWindow™ plug-in can be used with any programming language that supports TUIO, i.e. C/C++/C#, Java, Flash, Python, VVVV etc, meaning that software developers can run their own applications on NextWindow™, utilizing the NUI NextWindow™ plug-in."
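To give a feel for what "supports TUIO" means in practice, here is a minimal sketch in plain Python that decodes the argument list of a TUIO 1.1 /tuio/2Dcur "set" message into a cursor record and maps it onto screen pixels. This is illustrative only, based on the published TUIO 1.1 cursor profile; it is not code from the NUI plug-in, and a real client would receive these messages over OSC/UDP.

```python
def parse_2dcur_set(args):
    """Decode a TUIO 1.1 /tuio/2Dcur 'set' message argument list.

    Layout (per the TUIO 1.1 spec): ["set", session_id, x, y, X, Y, m]
    where x, y are normalized positions in [0, 1], X, Y are velocities,
    and m is motion acceleration.
    """
    if not args or args[0] != "set":
        raise ValueError("not a 2Dcur 'set' message")
    _, sid, x, y, vx, vy, m = args
    return {"id": int(sid), "pos": (float(x), float(y)),
            "vel": (float(vx), float(vy)), "accel": float(m)}

def to_pixels(cursor, width, height):
    """Map a normalized TUIO position onto a concrete screen resolution."""
    x, y = cursor["pos"]
    return (round(x * width), round(y * height))

# Example: a touch at the center of a hypothetical 1920x1080 display.
cursor = parse_2dcur_set(["set", 12, 0.5, 0.5, 0.0, 0.0, 0.0])
print(to_pixels(cursor, 1920, 1080))  # (960, 540)
```

Because TUIO positions are normalized, the same message works unchanged whether the listening application runs on a 22-inch overlay or a wall-sized display.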

Comment:
I became a fan of NextWindow touch-screen displays in early 2007 when I worked on a couple of touch-screen projects in my HCI and Ubicomp classes at UNC-Charlotte.


I've been using my HP TouchSmart PC at work with students with disabilities. I'm experimenting with the NUI Suite Snowflake on my TouchSmart, and found that interacting with the Particles application delighted students with severe autism. The activities provided opportunities to establish joint attention. I also noticed an increase in the number of vocalizations and/or verbalizations among the students. Of course, this was NOT a scientific study.

RELATED
Definition of Joint Attention from UConn:

"Joint Attention is the process of sharing one’s experience of observing an object or event, by following gaze or pointing gestures. It is critical for social development, language acquisition, cognitive development…"

http://eigsti.psy.uconn.edu/jt_attn.JPG


Establishing joint attention is an important step in the development of social interaction skills among young people who have autism spectrum disorders.

More about joint attention:

Joint Attention Study Has Implication for Understanding Autism
Science Daily, 9/29/07

Asperger-Advice: Joint Attention

Autism Games: Joint Attention and Reciprocity

Why is joint attention a pivotal skill in autism?
Tony Charman
Philos Trans R Soc Lond B Biol Sci. 2003 February 28; 358(1430): 315–324.
doi: 10.1098/rstb.2002.1199.

Jul 18, 2008

Natural User Interface: Overview of multi-touch technology and application development by Harry van der Veen - Business to Buttons


Harry van der Veen of Natural User Interface Europe AB was one of the keynote speakers at the Business to Buttons: Designing for Effect conference, held in June 2008.
In this presentation video, Harry discusses the past, present, and future of multi-touch technology, and reviews the importance of multi-touch over single touch displays. He also provides a good overview of gesture interaction, something that he researched when he was a student. This presentation includes several video examples of multi-touch applications in action.

The presentation is well worth the 30-minute view!


"Harry van der Veen is a Bachelor of Multimedia, derived from the Dutch education Communication, Multimedia and Design, focused on Interaction Design and Project Management. He is CEO, co-founder and co-owner of the Sweden based commercial company Natural User Interface Europe AB, which focuses on delivering standardized and customized multi-touch hardware / software solutions and services to the global market. In addition to that, he co-founded the NUIGroup community, which is the worlds largest online platform where a global network of people share their ideas and information in an open source community, focused on multi-touch hardware and software solutions."

NUIGroup Community

Harry van der Veen's blog

Natural User Interface Europe AB (Harry van der Veen's company)

NUIGroup Wiki: This wiki includes tutorials for developing multi-touch applications, building your own low-cost multi-touch table, and information about current projects that are in progress.

Related Information:


The Business to Buttons: Designing for Effect conference was held in June 2008 in Malmö, Sweden, organized by Malmö University and inUse, a user experience consultancy. Partners in this conference included Adaptive Path, a product experience strategy and design company; Patrick W. Jordan, a design, marketing, and brand strategist; the cocktail, a user experience and interaction design studio; Cooper, a product design company; and OresundIT, a non-profit network.


Don Norman, the author of books such as "The Design of Everyday Things" and "The Design of Future Things", presented at this conference. Don Norman is one of the founding fathers of Human-Computer Interaction and related fields, and is the co-founder of the Nielsen Norman Group, a consulting firm that helps companies create human-centered products.

Videos of Don Norman's Presentations:
Emotional Design: Total User Experience
Cautious Cars and Cantankerous Kitchens

Other:
Business to Buttons 2008 Recorded Sessions

Business to Buttons 2008 Downloads

My posts about the work of NUI Group members:

Multi-Touch Plug-in for NASA World Wind?!

More Multitouch: NUI Group's Christopher Jette's multi-touch work featured in Engadget; Croquet?

More Multi-Touch from members of the NUI group!

Multi-touch Crayon Physics from multitouch-barcelona, inspired by Crayon Physics by Kloonigames

Cross Post: Seth Sandler's YouTube Video, "How to Make a Cheap Multi-touch Pad" goes viral

NUI-Group Member Bridger Maxwell Receives High School Science Fair Award for Multi-Touch Screen Project

Look, touch, listen, and play: Seth Sandler's interactive Audio Touch Table video; NUI Group and Google's Summer of Code


Mar 11, 2013

Leap Motion: My Dev Kit Arrived - Now What?! Thoughts About "NUI" Child-Computer-Tech-Interaction - and More



My Leap Motion developer kit arrived last week. I carefully unboxed the small device and tried out the demo apps that came with the SDK.  I'm doing more looking than leaping at this point.

I'd like to create a simple cause-and-effect music, art and movement application for my 2-year-old grandson, knowing that he'll be turning three near the end of this year.  It would be nice if my app could provide young children with enough scaffolding to support gameplay and learning over a few years of development.

Now that I'm a grandmother, I've spent some time thinking about what the evolution of NUI will mean for young children like my grandson. Family and friends captured his first moments after birth with iPhones and shared them across the Internet. Born into the iWorld, he knows how to use an iPad or smartphone to view his earlier digital self on YouTube, without ever touching a mouse or a physical keyboard.

The little guy is pretty creative in his method of interacting with technology, as I've informally documented on video. He was seven months old when he first encountered my iPad. It was fingers-and-toes interaction from the start.

In the first picture below,  he's playing with NodeBeat.  In the second picture, he's 27 months old, experimenting with hand and foot interaction, on a variety of apps.

My grandson is new to motion control applications, so I'm just beginning to learn what he likes and what he is capable of doing. A couple of weeks ago, we played River Rush, from the Kinect Adventures game. He loved jumping up and down as he tried to hit the adventure pins. Most of the time, he kept jumping right out of the raft! (I think next time we'll try Kinect Sesame Street TV or revisit Kinectimals.)


One of the steps I'm taking to prepare for my Leap Motion adventure is to take a look at what people have done with it so far. At least 12,000 developer kits have been released, so hopefully there will be some interesting apps to go along with the retail version of Leap Motion when it arrives at Best Buy on May 19th of this year.

One app I really like is Adam Somers' AirHarp, featured in the video clip below:


I also like the idea behind the following app, developed by undergraduate students:

Social Sign: Multi-User sign language gesture translator using the Leap Motion Controller (git.to/socialSign)
 
"Built at the PennApps Spring 2013 hackathon, Social Sign is a friendly tool for learning sign language! By using the Leap Motion device, the BadApples team implemented a rudimentary machine learning algorithm to track and identify American Sign Language from a user's hand gestures."

"Social Sign visualizes these hand gestures and broadcasts them in textual and visual representations to other signers in a signing room. In a standard chat room fashion, the interface permits written communication but with the benefit of enhanced learning in mind. It's all about learning a new way to communicate."-BadApples Team



There are a few NUI-focused tech companies that have experimented with Leap Motion. Today, I received a link to the following video clip from Joanna Taccone of IntuiLab, featuring their most recent work:
Gesture recognition with Leap Motion using IntuiFace Presentation

"Preview of our work with the Leap Motion controller. In the same spirit as our support for Microsoft Kinect, we have encoded true gesture support, not just mouse emulation, for the creation of interactive applications by non-programmers. The goal is to hide complexity from designers using our product, IntuiFace Presentation (IP). Through the use of IP's trigger/action syntax, designers simply select a gesture as a trigger - Swipe Left, Swipe Right, Point, etc. - and associate that gesture with an action like "turn the page" or "rotate the carousel". As you can see in this video, it works quite well. :-) We will offer Leap support as soon as it ships." -IntuiLab
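The trigger/action idea IntuiLab describes - a designer binds a named gesture to a named action, with no programming - can be sketched as a simple lookup table. The following Python sketch is my own hypothetical illustration of the pattern; it is not IntuiFace's actual syntax or API.

```python
# A toy gesture trigger/action table: bind a named gesture (the trigger)
# to a callable (the action), then dispatch incoming gesture events.
class GestureDispatcher:
    def __init__(self):
        self._bindings = {}

    def bind(self, gesture, action):
        """Associate a gesture name (e.g. 'swipe_left') with a callable."""
        self._bindings[gesture] = action

    def on_gesture(self, gesture):
        """Run the bound action; unknown gestures are silently ignored."""
        action = self._bindings.get(gesture)
        if action is not None:
            action()

# Hypothetical presentation state: a page counter driven by swipes.
state = {"page": 0}

def turn_page_forward():
    state["page"] += 1

def turn_page_back():
    state["page"] -= 1

dispatcher = GestureDispatcher()
dispatcher.bind("swipe_left", turn_page_forward)
dispatcher.bind("swipe_right", turn_page_back)

for gesture in ("swipe_left", "swipe_left", "swipe_right"):
    dispatcher.on_gesture(gesture)
print(state["page"])  # 1
```

The appeal of this design, as the quote suggests, is that the gesture-recognition machinery and the application logic only meet at the binding table, so a non-programmer can rewire triggers to actions without touching either side.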



Below is a demonstration of a couple of guys playing Drop Chord, a collaboration between Leap Motion and Double Fine. From the video, you can tell that they had a blast!

Here is an excerpt from the chatter:  "The thing is that everyone just looks cool..Yeah, I know, it doesn't matter what you are doing...it's got the right amount of speed-up-slow-down stutter-y stuff...it is like a blend of art and science.."

According to the website, Drop Chord is "a music-driven score challenge game for the Leap Motion controller, coming soon for PC, Mac, & iOS from the creators of Kinect Party."

The following video is a demonstration of the use of Leap Motion to control an avatar and other interaction in Second Life:



Below are a few more videos featuring Leap Motion:


Control Your Computer With a Chopstick: Leap Motion Hands On (Mashable)


The Leap Motion Experience at SXSW 2013


LEAP Motion demo: Visualizer, Windows 8, Fruit Ninja, and More...



RELATED
Air Harp for Leap Motion, Responsive Interaction
Leap Motion and Double Fine team on Dropchord, give air guitar skills an outlet
John Fingas, Engadget, 3/7/13
Leap Motion Controller Set To Ship May 13 for Global Pre-Orders, In Best Buy Stores May 19.
Hands on With Leap Motion's Controller
Lance Ulanoff, Mashable, 3/10/13
Leap Motion website
Social Sign
IntuiLab
Leap Motion: Low Cost Gesture Control for Your Computer Display

SOMEWHAT RELATED
Kinect for Windows Academic: Kaplan Early Learning
"3 years & up. Hands-on play with a purpose -- the next generation way. This unique learning tool uses your body as the game controller making it a great opportunity to combine active play and learning all in one. Use any surface to actively engage kinesthetic, visual, and audio learners. Bundle includes the following software: Word Pop, Directions, Patterns, and Shapes."

Comment:
I've been an enthusiastic supporter of natural user interfaces and interaction for years - back in 2007 I worked on touch-screen applications for large displays as a graduate student, and became an early member of the NUI Group. I'm also a school psychologist, and from my experience, I understand how NUI-based applications and technologies, such as interactive whiteboards and touch tablets like the iPad, can support the learning, communication, and leisure needs of students who have significant special needs. It looks like Leap Motion and similar technologies have the potential to support a wide range of applications that target special populations of all ages.

Sep 18, 2010

Interactive Tabletops and Surfaces: 2010 ACM Conference, Nov. 7-10, Saarbrucken, Germany. Wish I could go!

If you are new to this blog, you should know that I'm passionate about interactive tables and surfaces of all sizes!   Although this technology has been around for a while, it is a new concept to most people.  The researchers and practitioners involved in the upcoming 2010 Interactive Tabletops and Surfaces Conference have been an important influence in the way people think about interacting with technology, and have made significant contributions to this emerging field over the past several years.   It hasn't been an easy road, given that most of us have minds brainwashed through years of forced keyboard-and-mouse interaction and traditional WIMP (Windows, Icons, Menus, Pointers) interfaces.

I first learned about the first Interactive Tabletops conference, held in 2006, in early 2007.  At the time, I was working on projects for my HCI and Ubiquitous Computing classes, trying to learn everything I could about natural user interaction, large touch-screen displays, tabletop computing, and multi-touch.   I was inspired by the interesting work going on in this field.  This was before the first iPhone was introduced, before Microsoft's multi-touch Surface was unveiled, and three years before Apple broke out with the iPad.

Many of the people involved with the 2010 Interactive Tabletops and Surfaces Conference are (or have been) affiliated with the NUI Group. NUI stands for Natural User Interface, or Natural User Interaction; the NUI Group is "a global research community focused on the open discovery of natural user interfaces."  I joined the NUI Group in 2007 when I was looking for more information about the nuts and bolts of multi-touch programming and systems, and have been encouraged to see how things have evolved since then.

Members of another group, Sparkon, are also participating in the Interactive Tabletops and Surfaces conference. Sparkon is an online community that includes people involved with interactive technologies, including tabletop and surface computing. "On sparkon, you'll find projects demonstrating the latest interactive techniques, applications, software frameworks, case studies, and blog articles relating to creative and emergent technology."  (I'm also a member of Sparkon.)


Here's the information from the conference website:

ACM Interactive Tabletops and Surfaces, Saarbrücken, Germany:  7-10 November, 2010
"ITS 2010 is a premier venue for presenting research in the design and use of new and emerging tabletop and interactive surface technologies. As a young community, we embrace the growth of the discipline in a wide variety of areas, including innovations in ITS hardware, software, interaction design, and studies expanding our understanding of design considerations of ITS technologies and of their applications in modern society. ITS 2010 will bring together top researchers and practitioners who are interested in both the technical and human aspects of interactive tabletop and surface technologies. It is our hope that we will be able to achieve increased synergy of approaches between the disciplines engaged in the research in the area of interactive tabletops and surfaces, Design, HCI, UbiComp, Psychology, MobileHCI and other related fields. More directly, we intend to encourage immediate interdisciplinary collaboration on future research topics. Young scholars and Ph.D. students are especially encouraged to submit papers and participate in the doctoral colloquium."


Johannes Schöning, DFKI GmbH
Antonio Krüger, DFKI GmbH
Conference General Chairs



KEYNOTE SPEAKER:  W. Bradford Paley

"Bio: W. Bradford Paley uses computers to create visual displays with the goal of making readable, clear, and engaging expressions of complex data. He did his first computer graphics in 1973, founded Digital Image Design Incorporated in 1982, and started doing financial & statistical data visualization in 1986. He has exhibited at the Museum of Modern Art; he created TextArc.org; he is in the ARTPORT collection of the Whitney Museum of American Art; has received multiple grants and awards for both art and design, and his designs are at work every day in the hands of brokers on the floor of the New York Stock Exchange. He is an adjunct associate professor at Columbia University, and is director of Information Esthetics: a fledgling interdisciplinary group exploring the creation and interpretation of data representations that are both readable and esthetically satisfying."


SAMPLE TOPICS



  • Applications
  • Gesture-based interfaces
  • Multi-modal interfaces
  • Tangible interfaces
  • Novel interaction techniques
  • Data handling/exchange on large interactive surfaces
  • Data presentation on large interactive surfaces
  • User-interface technology
  • Computer supported collaborative systems
  • Middleware and network support
  • Augmented reality
  • Social protocols
  • Information visualizations
  • Interactive surface hardware, including sensing and input technologies with novel capabilities
  • Human-centered design & methodologies





RELATED
Previous Conferences

PLUGS
From the conference website - links to the conference sponsors:




We appreciate the generous support of the following sponsors, without whom this conference would not be possible. Click on the logos to learn more about our generous supporters, and let us know if you are interested in becoming a sponsor.


Jan 20, 2009

More Multi-touch Multimedia: Video demonstration of applications created with Snowflake and Flash



This video showcases the work of Natural User Interface-AB, using NUI Suite 1.0 Snowflake and Flash.

Here is the plug from the company's website:
"Natural User Interface (NUI) is a Swedish innovative emerging technology company specializing in commercially available advanced multi-touch software, hardware and service solutions. NUI's solutions can convert an ordinary surface into an interactive, appealing and intelligent display that creates a stunning user experience."

For more information and links:

For Techies and the Tech Curious: Multi-touch/Gesture from the NUI-Group

Search this blog!

Nov 1, 2010

Open-source Eye-tracking: The ITU Gaze Tracker 2.0 Beta Via Martin Tall, NUI-Group Member

I came across the first version of the open-source ITU Gaze Tracker on the NUI Group forum in April of 2009 and played around with it a bit.  I was impressed.  I'm happy to say that the new version looks even better, although I haven't had the time to try it out.  Below are two recent videos that will give you a better understanding about gaze tracking.  


For the tech-curious, make sure you take the time to view the second video!  Links to info & code are below.


GT2 High speed remote eye tracking "Pushing the limits"


Technical Demonstration


Info about  the ITU Gaze Tracker 2.0 Beta from the NUI Group Forum, posted by Martin Tall:



Introducing the ITU Gaze Tracker 2.0 Beta
"We’ve made great progress since the initial release, today we open the doors for version 2.0. Internally we’ve rewritten major parts of the platform to gain flexibility and higher performance.  First version was DIY playtime, this version is nothing short of a screamer. High performance, very accuracy tracking. People are telling us we are crazy giving it away but we’re dedicated to the mission: Accessible eye tracking for all, regardless of nationality and means. We’re making it happen."
Important highlights for GT2.0b:
- Supports three modes of operation, head-mounted, remote mono/binocular
- Vastly improved performance, +500fps head mounted, +170fps remote binocular (both eyes)
- Awesome accuracy, avg. 0.3 - 0.7 degrees of visual angle (remote binocular)
- New U.I, looks so.. 2010
- Automatic tuning (optimization of algorithms parameters)
- Relatively low CPU-utilization and memory footprint (12%, 170Mb, core i7 860 win7-64)
- Many enhancements, bug-fixes etc.
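To put the quoted accuracy of 0.3-0.7 degrees of visual angle in perspective: angular gaze error converts to on-screen distance as roughly distance × tan(angle). The quick check below is my own arithmetic at an assumed 60 cm viewing distance, not a figure from the GT2.0 release notes.

```python
import math

def gaze_error_cm(viewing_distance_cm, error_degrees):
    """On-screen error radius implied by an angular gaze accuracy.

    A gaze estimate that is off by error_degrees of visual angle lands
    roughly distance * tan(angle) away from the true point of regard.
    """
    return viewing_distance_cm * math.tan(math.radians(error_degrees))

# At a typical 60 cm viewing distance, the quoted 0.3-0.7 degree range
# corresponds to roughly 0.3-0.7 cm on screen:
for deg in (0.3, 0.5, 0.7):
    print(f"{deg} deg -> {gaze_error_cm(60, deg):.2f} cm")
```

In other words, at desktop distances the tracker's average error is well under a centimeter - small enough to select ordinary on-screen buttons by gaze alone.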

Aug 31, 2010

Osmosis: Multi-touch systems for... everywhere!

Not long ago I had the opportunity to chat with Stuart McLean, the founder of Osmosis, a company that delivers customized multi-touch hardware and software systems that support human-centered natural user interaction. Stuart has many years of experience working in more traditional IT/business roles, and knows from that experience that there is a better way to support human-computer interaction, including interaction between people.

Like many of us in the "NUI" community, Stuart was impressed by the video of Jeff Han's 2006 TED Talk, which demonstrated a variety of awesome multi-touch, multi-user applications on a high-resolution drafting table.  Stuart saw the importance of natural user interfaces and interaction and became involved with the NUI Group, a "global research community focused on the open discovery of natural user interfaces". 

Unlike traditional tech companies, Osmosis is a collaboration between a global network of engineers, designers, and developers who share the "NUI" vision. This collaboration enables the company to provide solutions for clients across a range of countries, cultures, and domains.


Below is a photo-gallery of some of the applications and systems developed by Osmosis:


Multi-touch by Osmosis
GALLERY
As you can see from the gallery photos, Osmosis provides a range of possibilities for their clients and potential clients.  All of the displays are high-definition.  Some are projection-systems, and others are displays with multi-touch sensing technology.  Since the construction is modular, a variety of form factors are available.  High-quality surround and domed sound systems are available.  Applications include information kiosks, point of sale/digital signage, hospitality, presentation and training, education, and audio-visual performance and production.  Osmosis also provides applications that support interaction with tangible objects.

Below are two videos that give a taste of what Osmosis is all about:

OSMOSIS DEMO REEL

Demo Reel from Osmosis on Vimeo.

MULTI-TOUCH EVERYWHERE

MT Everywhere from Osmosis on Vimeo.

I can see where some of these applications would be great in K-12 educational settings.  Just look at the joy on the faces of the kids in the Multi-Touch Everywhere video!

(Short video clips of the Osmosis applications in action can be found in the showcase page of the company's website.)

Dec 20, 2009

For Techies & Tech Curious: Python and PyMT developments - PyMT and speech recognition

PyMT is short for Python Multi-Touch, a project that is the work of several members of the NUI Group. Sharath Patali experimented with speech input for PyMT, integrating CMU's pocketsphinx library into it. It worked out well, as you can see from his video demo below:


PyMT Speech Recognition from Sharath Patali on Vimeo.

Other people involved with the PyMT project are Nathanael Lecaude, Mathieu Virbel, Thomas Hansen, and Xelapond.


Sharath Patali's Blog/Website Roll (Links to some NUI-group members)


Mathieu Virbel on Vimeo
NUI Group on Vimeo


Python and Game Programming Resources
Adaptation and Evaluation of Numpty Physics for Multi-touch Multiplayer Interaction (pdf)
(A Python-based module called "numptyphysics" was created to integrate Python multi-touch code with the game, converting touch data to C structs passed to the C++ game code using pointers.)
Python Programming Language Official Website
Pygame
PythonGames
Python Game Programming Wiki, by Geoff Howland and Rene Dudfield
Lectures 1-6
Beginning Game Development with Python and Pygame -Book (Will McGugan)
Game Programming with Python - Book (Sean Riley)
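Since the Numpty Physics adaptation listed above passes Python multi-touch data to C++ as C structs, here is a hedged sketch of that marshalling idea using Python's standard struct module. The field layout (an int32 touch id plus two floats, little-endian) is my own invention for illustration, not the layout from the paper.

```python
import struct

# Hypothetical touch-event layout: int32 touch id, then float x, float y
# (little-endian). A C struct with matching fields could read these bytes.
TOUCH_FMT = "<iff"

def pack_touch(touch_id, x, y):
    """Serialize one touch event into a fixed-size binary record."""
    return struct.pack(TOUCH_FMT, touch_id, x, y)

def unpack_touch(raw):
    """Decode a record produced by pack_touch."""
    return struct.unpack(TOUCH_FMT, raw)

raw = pack_touch(7, 0.25, 0.75)
print(len(raw), unpack_touch(raw))  # 12 (7, 0.25, 0.75)
```

Packing events into fixed-size records like this is what lets the C++ side read the data through a plain pointer cast, with no Python objects crossing the language boundary.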

Comment:
The reason I'm putting together resources about Python, multi-touch, and games is that I hope to facilitate an exchange between two of the schools I serve as a school psychologist.

Both of the schools are on the same campus. One is a high school for technology and the arts, and one is a program for teens and young adults who have severe disabilities, including autism.  Next semester, one of the computer teachers will be teaching a game programming class using Python for a class of graduating seniors, and if all goes well, perhaps some of the students will create a game for the students with disabilities that would work well on a SMARTboard.

Even better:  It would be great if the pre-engineering students could build a multi-touch table or two for the students with disabilities, running games in PyMT that the computer students create!

Oct 31, 2009

Sensory-Minds' Ring Wall, an interactive multi-touch wall you don't even have to touch!

Sensory-Minds is a small company in Germany that is focused on research and design in the field of Natural User Interfaces. If you visit the SENSORY-MINDS website, you'll find that it has been designed for touch interaction.



ring°wall from SENSORY-MINDS on Vimeo.

Information about the Ring Wall from Sensory-Minds' Vimeo site:

".....The two-piece ring°wall consists of a LED display and a multitouch information-wall and impresses by its size: a total surface of 425 square meters, which equals more than 6000 computer displays, is the biggest of its kind. An interactive World emerges out of 34 million pixels generated by 15 high definition projectors and is supported by 30 directional speakers.


Multitouch sensors basing on laser technology, convert the usage of the natural user interface into an experience. By direct touching, more than 80 users can simultaneously get informed about news and activities around the ringworld.


The interactive wall is not only a central information system, but also an innovative advertising tool and medium for public viewings."

Heiko Hoffman of Sensory-Minds recently joined the NUI-Group.  Here is a response to a question on the NUI-group forum about the way the system's sensors work:


"The sensor works like spinning radar gun, or like a wiper in a car, that means like a radar gun you get the distance to the object.  It’s not quiet difficult when you know the position of the sensor and the degree from the beam to get the X,Y position.  Yes, you don’t need to touch the surface but we arranged it that the radar beam is very close to the screen so it seems that you have to touch the surface.

At the moment the system (has) limitations because each sensor (gets) coordinates and this means that you got shadows. This is no problem because the people interact very fast. We are working on a system where the data from the sensors were put together and that would be the solution for this problem."
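The geometry Hoffman describes is straightforward: if you know where the sensor sits and the angle of the spinning beam, the measured distance gives the touch point by a polar-to-Cartesian conversion. A small sketch (my own coordinate-frame assumptions, not Sensory-Minds' code):

```python
import math

def beam_hit_to_xy(sensor_x, sensor_y, beam_angle_deg, distance):
    """Convert one laser-scanner reading (beam angle + measured distance)
    into an (x, y) position on the wall, relative to an assumed coordinate
    frame with angles measured from the positive x-axis."""
    theta = math.radians(beam_angle_deg)
    return (sensor_x + distance * math.cos(theta),
            sensor_y + distance * math.sin(theta))

# A sensor at the origin sees a hand 2 m away at a 30-degree beam angle:
x, y = beam_hit_to_xy(0.0, 0.0, 30.0, 2.0)
print(round(x, 3), round(y, 3))  # 1.732 1.0
```

The shadowing limitation he mentions also falls out of this model: a single sensor only reports the first object each beam hits, so a second hand behind the first is invisible until data from multiple sensors is fused.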



Note:

I'll be writing a few more posts updating the work of past and present NUI-group members as well as news from the commercial multi-touch & gesture community.  Be sure to check back soon, or subscribe to this blog!

Jul 12, 2009

NUI-Group Members: What are they doing now?

Multitouch Media Application Pro v3.0 from Falcon4ever on Vimeo.

MMA Pro is a multitouch photo and video organizer built with Adobe AIR (Flex 3), with new features such as Google Maps integration and support for uploading pictures on the fly using Bluetooth. For more information, visit Laurence Muller's website, Multigesture.net, where you can download the application. Make sure you read the install instructions included in the readme file, and also make sure that you have the latest Adobe AIR 1.5.x. Laurence also recommends installing BlueSoleil to handle the pairing of devices and file transfers. (If you've never programmed for Bluetooth, take his advice!)

Laurence Muller (M.Sc.) is a Scientific Programmer at the University of Amsterdam who develops scientific software for multi-touch devices. He is a member of the NUI-Group.

The following video highlights some of the applications from the University of Amsterdam from about a year ago:

Multitouch Applications from Falcon4ever on Vimeo.

Feel free to leave a comment and a link or two if you are a NUI-Group member and like to share your recent projects!

Jan 26, 2009

SPARSH: DIY demo of an open-source multi-touch table and applications by NUI-group members

The following video is a demonstration of "Sparsh", an interactive multi-touch FTIR table built in eight weeks by a group of engineering students in India. Most of the information regarding the hardware and software you see running on this low-cost system can be found on the open-source NUI-group website, forums, and wiki.


Sparsh Multitouch Display from anirudh on Vimeo.

I especially like the multi-touch DJ application!


For more information, view the posts related to the NUI group on this blog.

Sparsh Website

Sep 6, 2013

Eye Tribe Eye Tracker Dev Kit, $99; Open Source ITU Gaze Tracker Grows Up!

The Eye Tribe Eye Tracker developer kit is available for pre-order for $99.00. The kit comes with an SDK for C++, C#, and Java, with full source code included.

I've been waiting for a while to see this happen! 

The Eye Tribe Eye Tracker is an outgrowth of the work of a group of researchers at the IT University of Copenhagen, where it was known as the open-source ITU Gaze Tracker. I came across it a few years ago in a NUI-Group forum, and later wrote a post about it when the 2.0 version was released.

Although the Eye Tribe Tracker was originally developed to meet the needs of people with disabilities who could not access computers, it was found to have potential for a number of other uses that were not really possible before the spread of mobile technologies such as touch-screen tablets and smart phones. 

To get a better understanding of eye-gaze/tracking technology, take a look at the following videos and follow the related links.



Below is a demonstration of the gaze UI on an Android smartphone:


Here is another look at this technology running on a Windows 8 Tablet:





RELATED
The Eye Tribe (website)
Eye Tribe starts taking pre-orders for $99 Windows eye tracker
Senseye will let you control your mobile phone with your eyes
Martin Bryant, The Next Web, 12/2/11
Open-source Eye-tracking: The ITU Gaze Tracker 2.0 Beta Via Martin Tall, NUI-Group Member
Lynn Marentette, Interactive Multimedia Technology, 11/1/10
ITU GazeGroup
Gaze Tracker Development
GazeGroup Forum
Martin Tall


RELATED VIDEOS
The Eye Tribe was formerly known as Senseye. Below is an earlier video that shows how it worked with a webcam on a mobile device:



Open-Source ITU Gaze Tracker

ITU Gaze Tracker from ITUcph on Vimeo.


Earlier Videos of the ITU Gaze Tracker:
Technical Demonstration 





Mar 16, 2013

UPDATE: What's New for Kinect? Fusion, real-time 3D digitizing, design considerations, and more.

The Evolution of Microsoft Kinect

I've been following the evolution of Microsoft's Kinect, and recently discovered a few interesting videos that show how far the system has come. According to Josh Blake, the founder of the OpenKinect community and author of the Deconstructing the NUI blog, the Kinect for Windows SDK v1.7 will be released on Monday, March 18th, at http://www.kinectforwindows.com. More details about this version can be found on Josh's blog as well as the official Kinect for Windows blog.


It is possible to create applications for desktop systems that work with the Kinect in interesting ways, as you'll see in the following videos. I think there is potential here for use in education/edutainment!

Below is a video of Toby Sharp, of Microsoft Research, Cambridge, demonstrating Kinect Fusion.  The software allows you to use a regular Kinect camera to reconstruct the world in 3D.



KinEtre: A Novel Way to Bring Computer Animation to Life
According to information from the YouTube description, "KinÊtre is a research project from Microsoft Research Cambridge that allows novice users to scan physical objects and bring them to life in seconds by using their own bodies to animate them. This system has a multitude of potential uses for interactive storytelling, physical gaming, or more immersive communications."




The following videos are quite long, so feel free to re-visit this post when you have time to relax and take it all in!

Kinect Design Considerations
This video covers Microsoft's Human Interface Guidelines, scenarios for interaction and use, and best practices for user interactions.  It also includes a preview of the next major version of the Kinect SDK. 


Kinect for Windows Programming Deep Dive
This video discusses how to build Windows Desktop apps and experiences with the Kinect, and also previews some future work.




RELATED
Kinect for Windows Developer Downloads
Kinect for Windows Blog
Deconstructing the NUI Blog (Josh Blake)
Microsoft Kinect Learns to Read Hand Gestures, Minority Report-Style Interface Now Possible
Celia Gorman, IEEE Spectrum, 3/13/13
Kinect hand recognition due soon, supports pinch-to-zoom and mouse click gestures.
Tom Warren, The Verge, 3/6/13
Microsoft's KinEtre Animates Household Objects
Samuel K. Moore, IEEE Spectrum, 8/8/12
Kinect Fusion Lets You Build 3-D Models of Anything
Celia Gorman, IEEE Spectrum, 3/6/13
Description of Kinect sessions at Build 2012
Kinect for every developer!
Tom Kerkhove, Kinecting for Windows, 2/15/13
Kinect in the Classroom
Kinect Education

Note: Although I recently received my developer kit for Leap Motion, another gesture-based interface, I haven't lost interest in following news for Kinect.