Showing posts sorted by relevance for query NUI.

Jan 20, 2009

More Multi-touch Multimedia: Video demonstration of applications created with Snowflake and Flash



This video showcases the work of Natural User Interface AB, using NUI Suite 1.0 Snowflake and Flash.

Here is the plug from the company's website:
"Natural User Interface (NUI) is a Swedish innovative emerging technology company specializing in commercially available advanced multi-touch software, hardware and service solutions. NUI's solutions can convert an ordinary surface into an interactive, appealing and intelligent display that creates a stunning user experience."

For more information and links:

For Techies and the Tech Curious: Multi-touch/Gesture from the NUI-Group


Nov 1, 2010

Open-source Eye-tracking: The ITU Gaze Tracker 2.0 Beta Via Martin Tall, NUI-Group Member

I came across the first version of the open-source ITU Gaze Tracker on the NUI Group forum in April of 2009 and played around with it a bit.  I was impressed.  I'm happy to say that the new version looks even better, although I haven't had the time to try it out.  Below are two recent videos that will give you a better understanding of gaze tracking.


For the tech-curious, make sure you take the time to view the second video!  Links to info & code are below.


GT2 High speed remote eye tracking "Pushing the limits"


Technical Demonstration


Info about  the ITU Gaze Tracker 2.0 Beta from the NUI Group Forum, posted by Martin Tall:



Introducing the ITU Gaze Tracker 2.0 Beta
"We’ve made great progress since the initial release, today we open the doors for version 2.0. Internally we’ve rewritten major parts of the platform to gain flexibility and higher performance.  First version was DIY playtime, this version is nothing short of a screamer. High performance, very accuracy tracking. People are telling us we are crazy giving it away but we’re dedicated to the mission: Accessible eye tracking for all, regardless of nationality and means. We’re making it happen."
Important highlights for GT2.0b:
- Supports three modes of operation, head-mounted, remote mono/binocular
- Vastly improved performance, +500fps head mounted, +170fps remote binocular (both eyes)
- Awesome accuracy, avg. 0.3 - 0.7 degrees of visual angle (remote binocular)
- New U.I, looks so.. 2010
- Automatic tuning (optimization of algorithms parameters)
- Relatively low CPU-utilization and memory footprint (12%, 170Mb, core i7 860 win7-64)
- Many enhancements, bug-fixes etc.
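To put the accuracy figure in context: degrees of visual angle translate to on-screen error depending on viewing distance. Here's a minimal sketch of that conversion (the 60 cm viewing distance and ~96 DPI display are my own assumptions for illustration, not GT2.0b specs):

```python
import math

def gaze_error_px(angle_deg, viewing_dist_cm, px_per_cm):
    """On-screen error in pixels for a given visual angle at a given distance."""
    error_cm = 2 * viewing_dist_cm * math.tan(math.radians(angle_deg) / 2)
    return error_cm * px_per_cm

# 0.5 degrees of visual angle at 60 cm on a ~96 DPI display (~37.8 px/cm)
# works out to roughly 20 pixels of error.
print(round(gaze_error_px(0.5, 60, 37.8), 1))
```

So mid-range accuracy of 0.5 degrees keeps the estimated gaze point within about a thumbnail's width of the true target at typical desktop distances.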

Aug 31, 2010

Osmosis: Multi-touch systems for... everywhere!

Not long ago I had the opportunity to chat with Stuart McLean, the founder of Osmosis, a company that delivers customized multi-touch systems of hardware and software that support human-centered natural user interaction.  Stuart has many years of experience working in more traditional IT/business roles, and knows from this experience that there is a better way to support human-computer interaction, including interaction between people.

Like many of us in the "NUI" community, Stuart was impressed by the video of Jeff Han's 2006 TED Talk, which demonstrated a variety of awesome multi-touch, multi-user applications on a high-resolution drafting table.  Stuart saw the importance of natural user interfaces and interaction and became involved with the NUI Group, a "global research community focused on the open discovery of natural user interfaces". 

Unlike traditional tech companies, Osmosis is a collaboration between a global network of engineers, designers, and developers who share the "NUI" vision. This collaboration enables the company to provide solutions for clients across a range of countries, cultures, and domains.


Below is a photo-gallery of some of the applications and systems developed by Osmosis:


Multi-touch by Osmosis
GALLERY
As you can see from the gallery photos, Osmosis provides a range of possibilities for their clients and potential clients.  All of the displays are high-definition.  Some are projection-systems, and others are displays with multi-touch sensing technology.  Since the construction is modular, a variety of form factors are available.  High-quality surround and domed sound systems are available.  Applications include information kiosks, point of sale/digital signage, hospitality, presentation and training, education, and audio-visual performance and production.  Osmosis also provides applications that support interaction with tangible objects.

Below are two videos that give a taste of what Osmosis is all about:

OSMOSIS DEMO REEL

Demo Reel from Osmosis on Vimeo.

MULTI-TOUCH EVERYWHERE

MT Everywhere from Osmosis on Vimeo.

I can see where some of these applications would be great in K-12 educational settings.  Just look at the joy on the faces of the kids in the Multi-Touch Everywhere video!

(Short video clips of the Osmosis applications in action can be found in the showcase page of the company's website.)

Dec 20, 2009

For Techies & Tech Curious: Python and PyMT developments - PyMT and speech recognition

PyMT is short for Python Multi-Touch, a project that is the work of several members of the NUI Group.  Sharath Patali experimented with speech input for PyMT, integrating CMU's pocketsphinx library into it. It worked out well, as you can see from his video demo below:


PyMT Speech Recognition from Sharath Patali on Vimeo.
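The video doesn't show the integration code, but the core idea (routing recognized phrases to UI actions) can be sketched in plain Python. The phrases and handlers below are hypothetical, not the actual PyMT or pocketsphinx API:

```python
# Hypothetical sketch: map recognized speech phrases to UI actions.
def make_dispatcher(commands):
    """commands: dict mapping a spoken phrase to a callable action."""
    def dispatch(hypothesis):
        # Speech recognizers such as pocketsphinx hand back a text
        # hypothesis; normalize it and look up the matching action.
        action = commands.get(hypothesis.strip().lower())
        return action() if action else None
    return dispatch

log = []
dispatch = make_dispatcher({
    "show photos": lambda: log.append("opened photo browser"),
    "clear screen": lambda: log.append("cleared canvas"),
})
dispatch("Show Photos")    # matched (case-insensitive)
dispatch("unknown words")  # no matching command; ignored
```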

Other people involved with the PyMT project are Nathanael Lecaude, Matthew Virbel, Thomas Hansen, and Xelapond.


Sharath Patali's Blog/Website Roll (Links to some NUI-group members)


Matthieu Virbel on Vimeo
NUI Group on Vimeo


Python and Game Programming Resources
Adaptation and Evaluation of Numpty Physics for Multi-touch Multiplayer Interaction (pdf)
(A python-based module called "numptyphysics" was created to integrate Python multi-touch code, allowing the C++ code to parse data and convert it to C structs passed to the game code using pointers.)
Python Programming Language Official Website
Pygame
PythonGames
Python Game Programming Wiki, by Geoff Howland and Rene Dudfield
Lectures 1-6
Beginning Game Development with Python and Pygame -Book (Will McGugan)
Game Programming with Python - Book (Sean Riley)

Comment:
The reason I'm putting together resources about Python, multi-touch, and games is that I hope to facilitate an exchange between two of the schools I serve as a school psychologist.

Both of the schools are on the same campus. One is a high school for technology and the arts, and one is a program for teens and young adults who have severe disabilities, including autism.  Next semester, one of the computer teachers will be teaching a game programming class using Python for a class of graduating seniors, and if all goes well, perhaps some of the students will create a game for the students with disabilities that would work well on a SMARTboard.

Even better:  It would be great if the pre-engineering students could build a multi-touch table or two for the students with disabilities, running games in PyMT that the computer students create!

Oct 31, 2009

Sensory-Minds' Ring Wall, an interactive multi-touch wall you don't even have to touch!

Sensory-Minds is a small company in Germany that is focused on research and design in the field of Natural User Interfaces. If you visit the SENSORY-MINDS website, you'll find that it has been designed for touch interaction.



ring°wall from SENSORY-MINDS on Vimeo.

Information about the Ring Wall from Sensory-Minds' Vimeo site:

"...The two-piece ring°wall consists of a LED display and a multitouch information-wall and impresses by its size: a total surface of 425 square meters, which equals more than 6000 computer displays, is the biggest of its kind. An interactive World emerges out of 34 million pixels generated by 15 high definition projectors and is supported by 30 directional speakers.


Multitouch sensors basing on laser technology, convert the usage of the natural user interface into an experience. By direct touching, more than 80 users can simultaneously get informed about news and activities around the ringworld.


The interactive wall is not only a central information system, but also an innovative advertising tool and medium for public viewings."

Heiko Hoffman of Sensory-Minds recently joined the NUI-Group.  Here is a response to a question on the NUI-group forum about the way the system's sensors work:


"The sensor works like spinning radar gun, or like a wiper in a car, that means like a radar gun you get the distance to the object.  It’s not quiet difficult when you know the position of the sensor and the degree from the beam to get the X,Y position.  Yes, you don’t need to touch the surface but we arranged it that the radar beam is very close to the screen so it seems that you have to touch the surface.

At the moment the system (has) limitations because each sensor (gets) coordinates and this means that you got shadows. This is no problem because the people interact very fast. We are working on a system where the data from the sensors were put together and that would be the solution for this problem."
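The geometry Heiko describes is simple: given the sensor's mounting point, the measured distance, and the beam angle, the touch point falls out of basic trigonometry. A sketch, with my own assumed sensor position and angle convention:

```python
import math

def touch_position(sensor_x, sensor_y, distance, angle_deg):
    """Convert a range/bearing reading from a corner-mounted sensor
    into wall X,Y coordinates (angle measured from the X axis)."""
    theta = math.radians(angle_deg)
    return (sensor_x + distance * math.cos(theta),
            sensor_y + distance * math.sin(theta))

# Sensor at the wall's origin, a hand detected 2 m away at 30 degrees:
x, y = touch_position(0.0, 0.0, 2.0, 30.0)  # x ~ 1.73 m, y ~ 1.0 m
```

The "shadow" problem he mentions follows from the same picture: anything behind the first object along a beam never returns a reading, which is why fusing data from multiple sensors fixes it.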



Note:

I'll be writing a few more posts updating the work of past and present NUI-group members as well as news from the commercial multi-touch & gesture community.  Be sure to check back soon, or subscribe to this blog!

Jul 12, 2009

NUI-Group Members: What are they doing now?

Multitouch Media Application Pro v3.0 from Falcon4ever on Vimeo.

MMA Pro is a multitouch photo and video organizer built in Adobe AIR (Flex 3), with new features such as Google Maps and support for uploading pictures on the fly using Bluetooth. For more information, visit Laurence Muller's website, Multigesture.Net. There you can download the application. Make sure you read the install instructions included in the readme, and also make sure that you have the latest Adobe AIR 1.5.x. Laurence also recommends installing BlueSoleil to handle the pairing of devices and file transfers. (If you've never programmed for Bluetooth, take his advice!)

Laurence Muller (M.Sc.) is a Scientific Programmer at the University of Amsterdam who develops scientific software for multi-touch devices. He is a member of the NUI-Group.

The following video highlights some of the applications from the University of Amsterdam from about a year ago:

Multitouch Applications from Falcon4ever on Vimeo.

Feel free to leave a comment and a link or two if you are a NUI-Group member and like to share your recent projects!

Jan 26, 2009

SPARSH: DIY demo of an open-source multi-touch table and applications by NUI-group members

The following video is a demonstration of "Sparsh", an interactive multi-touch FTIR table built in eight weeks by a group of engineering students in India. Most of the information regarding the hardware and software you see running on this low-cost system can be found on the open-source NUI-group website, forums, and wiki.


Sparsh Multitouch Display from anirudh on Vimeo.

I especially like the multi-touch DJ application!


For more information, view the posts related to the NUI group on this blog.

Sparsh Website

Sep 6, 2013

Eye Tribe Eye Tracker Dev Kit, $99; Open Source ITU Gaze Tracker Grows Up!

The Eye Tribe Eye Tracker developer kit is available for pre-order for $99.00. The kit comes with an SDK for C++, C#, and Java, with full source code included.
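The SDK's actual API isn't covered in this post, but one thing nearly every gaze-tracking client does with a raw sample stream is smooth it, since gaze data is jittery. A minimal sketch of a moving-average filter (the class and method names are my own, not the Eye Tribe SDK):

```python
from collections import deque

class GazeSmoother:
    """Moving-average filter to damp jitter in raw (x, y) gaze samples."""
    def __init__(self, window=5):
        self.samples = deque(maxlen=window)

    def update(self, x, y):
        self.samples.append((x, y))
        n = len(self.samples)
        return (sum(p[0] for p in self.samples) / n,
                sum(p[1] for p in self.samples) / n)

smoother = GazeSmoother(window=3)
for point in [(100, 100), (104, 98), (102, 101)]:
    x, y = smoother.update(*point)
# x, y now hold the average of the last three raw samples.
```

A small window keeps the cursor responsive; a larger one trades lag for stability.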

I've been waiting for a while to see this happen! 

The Eye Tribe Eye Tracker is an outgrowth of the work of a group of researchers at the IT University of Copenhagen.  At the time, it was known as the open-source ITU Gaze Tracker. 
I came across it a few years ago in a NUI-Group forum, and later wrote a post about it when the 2.0 version was released. 

Although the Eye Tribe Tracker was originally developed to meet the needs of people with disabilities who could not access computers, it was found to have potential for a number of other uses that were not really possible before the spread of mobile technologies such as touch-screen tablets and smart phones. 

To get a better understanding of eye-gaze/tracking technology, take a look at the following videos and follow the related links.



Below is a demonstration of the gaze UI on an Android smartphone:


Here is another look at this technology running on a Windows 8 Tablet:





RELATED
The Eye Tribe (website)
Eye Tribe starts taking pre-orders for $99 Windows eye tracker
Senseye will let you control your mobile phone with your eyes
Martin Bryant, The Next Web, 12/2/11
Open-source Eye-tracking: The ITU Gaze Tracker 2.0 Beta Via Martin Tall, NUI-Group Member
Lynn Marentette, Interactive Multimedia Technology, 11/1/10
ITU GazeGroup
Gaze Tracker Development
GazeGroup Forum
Martin Tall


RELATED VIDEOS
Eye Tribe was formerly known as Senseye. Below is an earlier video that shows how it worked with a web-cam on a mobile device:



Open-Source ITU Gaze Tracker

ITU Gaze Tracker from ITUcph on Vimeo.


Earlier Videos of the ITU Gaze Tracker:
Technical Demonstration 




Seeking Sustainable Innovation

Mar 16, 2013

UPDATE: What's New for Kinect? Fusion, real-time 3D digitizing, design considerations, and more.

The Evolution of Microsoft Kinect

I've been following the evolution of Microsoft's Kinect, and recently discovered a few interesting videos that show how far the system has come. According to Josh Blake, the founder of the OpenKinect community and author of the Deconstructing the NUI blog,  the Kinect for Windows SDK v1.7 will be released on Monday, March 18th, from http://www.kinectforwindows.com.  More details about this version can be found on Josh's blog as well as the official Kinect for Windows blog.


It is possible to create applications for desktop systems that work with the Kinect in interesting ways, as you'll see in the following videos. I think there is potential here for use in education/edutainment!

Below is a video of Toby Sharp, of Microsoft Research, Cambridge, demonstrating Kinect Fusion.  The software allows you to use a regular Kinect camera to reconstruct the world in 3D.



KinEtre: A Novel Way to Bring Computer Animation to Life
According to information from the YouTube description, "KinÊtre is a research project from Microsoft Research Cambridge that allows novice users to scan physical objects and bring them to life in seconds by using their own bodies to animate them. This system has a multitude of potential uses for interactive storytelling, physical gaming, or more immersive communications."




The following videos are quite long, so feel free to re-visit this post when you have time to relax and take it all in!

Kinect Design Considerations
This video covers Microsoft's Human Interface Guidelines, scenarios for interaction and use, and best practices for user interactions.  It also includes a preview of the next major version of the Kinect SDK. 


Kinect for Windows Programming Deep Dive
This video discusses how to build Windows Desktop apps and experiences with the Kinect, and also previews some future work.




RELATED
Kinect for Windows Developer Downloads
Kinect for Windows Blog
Deconstructing the NUI Blog (Josh Blake)
Microsoft Kinect Learns to Read Hand Gestures, Minority Report-Style Interface Now Possible
Celia Gorman, IEEE Spectrum, 3/13/13
Kinect hand recognition due soon, supports pinch-to-zoom and mouse click gestures.
Tom Warren, The Verge, 3/6/13
Microsoft's KinEtre Animates Household Objects
Samuel K. Moore, IEEE Spectrum, 8/8/12
Kinect Fusion Lets You Build 3-D Models of Anything Celia Gorman, IEEE Spectrum, 3/6/13
Description of Kinect sessions at Build 2012
Kinect for every developer!
Tom Kerhove, Kinecting for Windows, 2/15/13
Kinect in the Classroom
Kinect Education

Note: Although I recently received my developer kit for Leap Motion, another gesture-based interface, I haven't lost interest in following news for Kinect.

Dec 15, 2011

Christian Bannister's Interactive Multi-touch and Gesture-based Subcycle Project. Use your hands to shape sound, create, and manipulate music. Wow!!

I came across a link to Christian Bannister's Subcycle Labs website when I was taking a look at the Creative Applications Network website.


If you have an interest in music technology and innovative gesture/multitouch applications, you'll appreciate the details that are shared on the Subcycle website.  In the meantime, take the time to watch a few of Christian's videos.  Enjoy!


Blip Shaper Walkthrough

Blip Shaper Walkthrough from christian bannister on Vimeo.
"a) creating percussive patterns with monome b) shaping the individual sounds that make up the patterns with multitouch gestures c) recording touchscreen gestures as automation d) storing, duplicating and navigation patterns e) recording the resulting audio to a dynamic buffer f) manipulating the buffer with a multitouch cut-up approach g) visualizing everything with dual screens"
Subcycle Walkthrough

Subcycle Blip Shaper from christian bannister on Vimeo.
"a) creating percussive patterns with monome b) shaping the individual sounds that make up the patterns with multitouch gestures c) recording touchscreen gestures as automation d) storing, duplicating and navigation patterns e) recording the resulting audio to a dynamic buffer f) manipulating the buffer with a multitouch cut-up approach g) visualizing everything with dual screens"


The following information describing the Blip Shaper is from Christian's Subcycle website/blog:
"For the drum sounds I have Drumaxx running for synthesized sounds and Battery running for sampled sounds.  These are running in parallel so for each voice there is a separate patch running in each VST.  The Parameters are modified with the touchscreen independently but in all cases a single touch gesture on the X-Axis will cross fade between the sampled version of the sound and the synthesized version of the sound.  I love this because I have never seen this before and I can never decide which technique I like better.  The synthesized drums are more malleable and have more interesting parameters to play with but the sampled sounds seem more substantial.  I will post a detailed list of parameters and gestures in the future."   


Christian currently uses Max for Live (Ableton), and codes with Processing/Java, using Eclipse for organization.

Autopilot - Subcycle
autopilot - subcycle labs from christian bannister on Vimeo.
"sound visualization, multi-touch interface, break beat performance engine, autopilot, spacialized sound, dsp, max/MSP, fm synthesis, sonic navigation, sound storm visualization, time machine, granular, interactive sound sculpture, joystick array, more at subcycle.org"


SOMEWHAT RELATED 
Community Core Vision
NUI Group
BTW, I'm wondering if Christian or other NUI group members would like to do something with some of my ideas that incorporate interactive music. To learn more, see my post, "It must be Voronoi: Looking for ideas for my music+art+dance+tech dream...."

Dec 9, 2010

Interested in the OpenNI Initiative? OpenKinect? To learn more, read Josh Blake's Interview of Tamir Berliner of PrimeSense




Josh Blake, Deconstructing the NUI, 12/9/10



Josh Blake recently interviewed Tamir Berliner, one of the founders of PrimeSense.  If you haven't heard, Microsoft's Kinect was based on work by PrimeSense, whose technology Microsoft licensed. PrimeSense provides consumer electronics with natural user interaction capabilities. The good news is that the company recently released open-source middleware for natural interaction, along with depth-camera drivers. It will be interesting to see how this plays out in the near future!




In the interview, Tamir discussed a number of topics related to postWIMP technologies.  He also announced the newly created  OpenNI, "an industry-led, not-for-profit organization formed to certify compatibility and interoperability of Natural Interaction (NI) devices, applications, and middleware."   It is good to see this level of support for the cause!


Here is a quote from the interview that I especially liked:

"I believe that till today the devices we’ve been using, made us learn greatly lot about them before we could use them and gain their value. I’m pretty sure everyone who is reading this has got at least 3 remotes sitting on his living room table, and at least once a week needs to help someone use their computer/media center/phone/etc. It’s time for that to change and it’s up to us, the technologists to make this revolution happen, it’s time for the devices to take the step of understanding what we want and making sure we get that, even without asking if it’s a trivial task as opening a door when we approach, closing the lights when we leave the room, even making sure we have hot water to shower with when we return from work or wake up in the morning, depends on what we normally do." -Tamir


RELATED
Here are a couple of videos from the OpenNI website that demonstrate OpenNI-compliant applications:

OpenNI-compliant real time skeleton tracking by PrimeSense


OpenNI-compliant real time SceneAnalyzer by PrimeSense



FYI: 
Josh Blake is the author of the Deconstructing the NUI blog. Over the past couple of years, he's explored natural user interfaces and interactions through his work on applications designed for Microsoft Surface and Win7 with Windows Presentation Foundation.
About a month ago, Josh organized OpenKinect, an on-line community to support collaboration among people interested in exploring ways to use Kinect with PCs and other devices.  An example of this effort is the open source code, libfreenect, which includes drivers and libraries for Windows, Linux, and OS X. 


The Natural User Interface Revolution
Josh Blake, 1/5/09


Kinect for Xbox 360: The inside story of Microsoft's secret 'Project Natal'  (long, but worth reading) David Rowan, Wired UK, 10/29/10


People of libfreenect

OpenNI User Guide (pdf)

Oct 12, 2010

Update on Josh Blake, newly designated Microsoft Surface MVP

Josh Blake is the Tech Lead of the InfoStrat Advanced Technology Group in DC.  He has been creating multi-touch applications for Microsoft's Surface multi-user table-tops for a while. Recently, his team built a suite of applications designed for use by young children at a museum.  Below is a video demonstration of some of this work. It really looks exciting!


Microsoft Surface and Magical Object Interaction

Josh Blake's blog is called Deconstructing the NUI. For those of you new to this blog, NUI stands for Natural User Interface (also known as Natural User Interaction).  See his post, Microsoft Surface and Magical Object Interaction, for more information!

RELATED
Here is a plug for Josh Blake's book, "Multitouch on Windows"

Book Ordering Information

FYI:  InfoStrat  is hiring  WPF experts as well as Microsoft CRM and Microsoft SharePoint experts.


Microsoft Surface MVPs
Dr. Neil Roodyn
Dennis Vroegop
Rick Barraza
Joshua Blake





Jan 27, 2010

How a multi-touch table is made: NUI Group members Aaron Bitler and Brady Simpson on HAK.5

If you want to know more about the inner workings of DIY multi-touch, you'll be interested in the video below.

Aaron Bitler and Brady Simpson learned how to build a multi-touch table through their participation in the NUI Group, and recently formed a company, 3M8, to distribute multi-touch solutions.  The half-hour video provides an in-depth tour of multi-touch technology in an on-line broadcast on the HAK.5 website.


"Brady Simpson explains the different multi-touch methods. Aaron Bitler gives us a tour of the x19 multitouch tables construction. Brady explains the software used to power the x19 lcdmultitouch table, including the nuigroup CCV application. Aaron wraps up by explaining to us how he got involved in multitouch. He shares with us some sites where others can get started in their own homebrew projects, as well as his own multitouch company 3M8s at LCDMultiTouch.com."

UPDATE:  New website for Brady Simpson's company: http://www.lcdmt.com/

Dec 31, 2009

Josh Blake's Nice Multi-touch and Natural User Interface Applications for Surface (Cross-Post)



Information from Josh's YouTube channel:
"This is a video of some of the cool multi-touch and Natural User Interface (NUI) applications I designed and developed for Surface and Windows 7." The InfoStrat.VE map control for WPF and Surface is available for free at http://virtualearthwpf.codeplex.com.
I especially like the moving ring-menu concept, as it facilitates smoother collaboration between people on an interactive table or surface, where flexible orientation control is important.

At 3:15, the demonstration of Josh's ink-shape recognition begins. This is a feature that would be great to incorporate in my applications for children with disabilities who have some fine-motor limitations.


Josh's Blog:  Deconstructing the NUI
Josh's Recent Post about post-WIMP concepts:
Metaphors and OCGM
Josh works at InfoStrat

Nov 21, 2009

Want to make some multi-touch? Try PyMT- Python Multitouch. Featured in Make. (via Sharath Patali)

Sharath Patali, a member of the NUI-Group, has been working with Python Multitouch, otherwise known as PyMT, to create multi-touch applications.  He shared a link to a recent post in Make, featuring PyMT.  Sharath is the author of the UI Addict blog, and is currently doing his internship at NUITEQ (Natural User Interface Technologies).

I've been told that the beauty of PyMT is that it makes it "easy" to create multi-touch prototype applications using very few lines of code, which is great for trying out different ideas in a short period of time.  It helps if you already know Python!


PyMT - A post-WIMP Multi-Touch UI Toolkit from Thomas Hansen on Vimeo.

"PyMT is a python module for developing multi-touch enabled media rich applications. Currently the aim is to allow for quick and easy interaction design and rapid prototype development. PyMT is written in Python, based on pyglet toolkit."


PyMT Programming Guide


PyMT Website

Note: 
Christopher, author of The Space Station blog, is a member of the NUI-Group, and is building his own multi-touch table running his PyMT-based applications. Christopher is a student in Koblenz, Germany, studying computational visualistics, known as information visualization in the US.

Jul 26, 2009

Multi-Touch Musical Instruments- Surface Editor: Post via the NUI-Group Forum

Multi-touch Everywhere Technology in Action

Below is the video from Future Instruments, a company formed through Geneva's Conservatory of Music and the University of Applied Sciences Western Switzerland. It is worth watching the entire video clip.

The following information was posted in the NUI-Group Forum:

"The research group at Geneva’s Conservatory of Music presents the Surface Editor and Surface Tracker software applications for turning ordinary surfaces into multi-touch musical instruments. The Surface Editor allows users to custom design interfaces by dragging and dropping components, such as buttons, sliders, keyboards, and many others, onto the interface. The surface can then be used to control digital audio workstations, virtual instruments, hardware synthesizers and samplers, VJ software, and other applications, via either the MIDI or OSC protocol. The Surface Tracker is a multi-touch tracking application created for tracking movements on a surface using high speed infrared OptiTrack cameras. These cameras have on-board image processing, which allows them to process 100 frames/sec while sparing the host computer’s CPU. The applications are available at
http://www.surface-editor.com."

Here are more details from the Future Instruments website:

Surface Tracker

"The Surface Tracker is an open source application for low latency multi-touch finger tracking. It was designed initially for our Multi-Touch Everywhere system (MUTE), but it should work more generally as well to track fingers which have been illuminated using the laser light plane method. It only supports low latency OptiTrack cameras; these cameras have frame rates of 100 frames per second, but due to their onboard image processing, these cameras only consume 3-5% of your computer's CPU (depending on your processor)."

"The Surface Tracker is a standalone application, which sends Open Sound Control messages to any connected client application at a user defineable IP address and port. It implements the TUIO protocol, which was specially designed for transmitting multi-touch events on a table surface."

"The Surface Tracker is currently only available on Windows. This is due to the fact that the driver for OptiTrack cameras is currently only available on Windows. We are, however, currently having discussions with the makers of the OptiTrack camera about extending support of their cameras to OS-X."

You can download the Surface Tracker from the Future Instruments download page.
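Since the Surface Tracker speaks TUIO over OSC, client applications receive cursor positions normalized to the [0, 1] range and scale them to their own display. A minimal sketch of that last step (the message fields follow the TUIO 1.1 `/tuio/2Dcur` "set" profile, but the values are made up for illustration):

```python
def tuio_to_screen(norm_x, norm_y, screen_w, screen_h):
    """TUIO reports cursor positions normalized to [0, 1]; scale to pixels."""
    return (round(norm_x * screen_w), round(norm_y * screen_h))

# A /tuio/2Dcur "set" message carries (session_id, x, y, x_vel, y_vel, accel);
# here we only use the position fields.
session_id, x, y = 12, 0.25, 0.5
print(tuio_to_screen(x, y, 1920, 1080))  # (480, 540)
```

This normalization is what lets the same tracker drive any client, from a DAW controller overlay to a full-screen instrument, regardless of resolution.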

I wish this system was around when I was taking a computer music technology class!

RELATED
Greg Kellum's Website: Surface Editor

A Flexible Mapping Editor for Multi-touch Musical Instruments(pdf)
Greg Kellum, Alain Crevoisier (9th International Conference on New Interfaces for Musical Expression - NIME)

Sneak Peek: Musical Instruments of the Future
Eliot Van Buskirk, Gadget Lab, Wired 2/25/09

Jul 7, 2009

Lonely Planet Travel App by Amnesia-Razorfish for the Surface

I came across this video on the Microsoft Surface blog about a demo created for Lonely Planet by Amnesia-Razorfish, from REMIX 2009 Australia this past June:


Lonely Planet proof-of-concept at REMIX Australia 2009

Following a link from the Surface blog, I found yet another video:









Design in the Round. Creating Compelling User Experiences for Surface

The video provides an overview of the history of human-computer interaction and looks towards the future of NUI (natural user interface). The Surface is viewed only as a step towards NUI, which follows a person throughout the day, interacting with technology across a variety of devices and settings.

"Designing for multi-touch, gesture-based and tangible experience like Microsoft Surface presents a new set of challenges. How do you design for a user interface that doesn't have a top? How to allow for multiple simultaneous users without them getting in each other's way?..."

Apr 22, 2009

From the NUITEQ (Natural User Interface) Gallery, via Harry van der Veen

Kids take to multi-touch interaction naturally!

The following photos are from Harry van der Veen's Multi-touch blog. (Harry was one of the founding members of the NUI-Group, and also is the CEO of NUITEQ-Natural User Interface)

The last two pictures are of the HP TouchSmart running NUI Suite Snowflake software, developed by Natural User Interface Europe AB (NUITEQ) for thin LCD, plasma, and FTIR displays.






Feb 4, 2009

Ideum's High Resolution Multitouch Table for Museum Exhibits, running NUI's Snowflake Software Package.

The Don Harrington Discovery Center in Amarillo, Texas, and the Vulcan Park and Museum in Birmingham, Alabama will both have large, high resolution multi-touch table exhibits provided by Ideum.

Below is a demonstration video from Ideum:



Ideum's Portfolio Site
Ideum's multi-touch, multi-user mashup, created with NUI Suite 1.0 Snowflake, a gesture recognition multi-touch software package.
Ideum's mt2 Table Information & Specs (pdf)

Ideum's mt2 Table was tested at a hands-on museum during the development process.

Jan 11, 2009

Usability, Accessibility, and User Experience in a Win7 Environment

Microsoft's newest operating system, Win7 (beta), allows for multi-touch interaction and application development. From what I can tell from the news from the Consumer Electronic Show (CES 2009), touch and gesture interaction holds quite a bit of interest among consumers and application developers alike.

Hopefully people will think carefully about user experience and usability factors when developing the applications we'll be using in the not too distant future!

Because of my background as a school psychologist, I’m interested in how touch/gesture applications can address accessibility and universal usability issues.

I wasn't able to locate user experience or usability information for Win7. Here are a few links from the Microsoft website that I think are worth reading:

Touch Interaction Guidelines

How to Design a Great User Experience

Designing with Windows Presentation Foundation

Accessibility

According to the research from Microsoft, ”more than half of all computer users experience difficulties or impairments related to accessibility, and are likely to benefit from the use of accessible technology. Moreover, approaching software design with the flexibility and inclusiveness that are the hallmarks of accessibility often results in overall improved usability and customer satisfaction.”

Regarding touch/gesture interaction, know that “19% (24.4 million) of computer users have a mild dexterity difficulty or impairment, and 5% (6.8 million) of computer users have a severe dexterity difficulty or impairment.”

Christian Moore, from the NUI-Group, shared this nice graphic that depicts the various fields that intersect or converge with interface design, which I think is useful when thinking about how we will design, develop, and use emerging technologies:



Concepts such as accessibility and universal usability can fit into this design!

For more information regarding multi-touch, see my previous post, "For Techies and the Tech-Curious: Multi-touch/Gesture from the NUI Group"