
Dec 31, 2009

The Post-WIMP Explorers' Club: Update of the Updates, Morning of 12/31/09

What is the Post-WIMP Explorers' Club?
I came up with the name of this semi-fictional club as a way to organize my thoughts (and blog posts) regarding the development of a new metaphor for post-WIMP applications and technologies, related specifically to natural user interfaces, natural user interaction design, and off-the-desktop user experience.  


Update, morning of 12/31/09:
Josh Blake, author of the blog "Deconstructing the NUI", posted Metaphors and OCGM  this morning.  It fleshes out post-WIMP concepts, addressing metaphors & interfaces.  The premise is that NUI metaphors will be less complex than GUI (WIMP) metaphors.    My feeling is that on the surface, this will hold true, especially for consumers/users and people creating light-weight applications and software widgets.  


Underneath the surface, where designers' and developers' brains spend more time than users' and consumers', things might be more complex.  Why? The technology to support the required wizardry is more complex.  With convergence, the creation of new technologies, applications, communication systems, and even electronic entertainment is now dependent upon the work and thinking of people from a wider range of disciplines.  Each discipline brings to the table a set of terms rooted in theory and even research practices.


Update,  late afternoon, 12/30/09:
Richard Monson-Haefel's response to Ron George's "Part 2".  The concept of OCGM might be growing on him now... OCGM: George's Razor: "If Ron George can explain how OCGM encompasses Affordances and Feedback than I'll be convinced that OCGM works for NUI. Otherwise, I think OCGM is a great start that would benefit from an added "A" and "F"." -Richard
  • OCGM relates to Occam's Razor.  It is helpful to read a bit about it if you are interested in the post-WIMP conversations. (The link is to an article from "How Stuff Works", via Richard Monson-Haefel.)
UPDATE 12/30/09  -- This post is part of a discussion between several different bloggers, and was written before Ron George wrote his latest post, Welcome to the OCGM Generation!  Part 2, which I recommend that you read now, or around the same time as this post.  Since I'm not ready to write "Part 2" of this post, I tweaked what I had and added links to a handful of my previous posts that touch on this and related topics.  The links can be found at the bottom of this post.




START HERE FOR THE "ORIGINAL" POST FROM  12/29 & 12/20/09:


Background
About a year ago I responded to a conversation between Johnathan Brill, Josh Blake, and Richard Monson-Haefel discussing "post-WIMP" conceptualization regarding natural user interfaces and interaction, otherwise known as NUI.  The focus of the discussion was on Johnathan's post, "New Multi-touch Interface Conventions". At the time, we were reading Dan Saffer's book, Designing Gestural Interfaces, and contemplating new ways that technology can support human interaction and activities in a more natural, enjoyable, and intuitive manner.  

A few days later, I shared some of the concepts from the discussion on a post on this blog, "Why "new" ways of interaction?".  The post includes video of Johnathan Brill discussing PATA, a post-WIMP analogue to assist with multi-touch/gesture based application development, which he describes as follows:
Places "Lighting, focus, and depth, simplified searching and effecting hyperlinked content."

Animation "Using animation to subtly demonstrate what applications do and how to use them is a better solution than using icons. Animations makes apps easier to learn."

Things "Back in the days of floppy disks, objects helped us organize our content. This limitation was forced by arcane technology, but it did have one huge advantage. We used our spatial memory to help us navigate content. Things will help us organize content and manipulate controllers across a growing variety of devices."

Auras "Auras will help us track what we are tracking and when an interaction has been successful."
(For reference, I've copied some of my responses to the first discussion near the end of this post.)


A year later....
What has changed?   Everything post-WIMP has been covered like a blanket by the NUI-word.  "NUI" now functions as a generic term for anything that is not exactly WIMP.  There is a sense of urgency now to figure out how best to conceptualize post-WIMP interfaces and interactions.  Newer, affordable technologies enable us to interact with friends and family while we are on the go. Netbooks, e-readers, smartphones, large touch-screen displays, interactive HDTV, and new devices with multi-modal I/O abound.  Our grandparents are on Facebook and Twitter from their iPhones.  Our world no longer requires us to be slaves to the WIMP mentality.


So what is the problem?
The technology has moved along so fast that application designers and developers have not had a chance to catch up. (The iPhone is an exception.)  The downturn in the economy has made it difficult for many to take the leap from traditional software or web development and gain new skill sets.  On top of it all, most of us over the age of 15 have been brainwashed from years of working within the constraints of WIMP. It doesn't matter if we are users, consumers, students, designers, or developers.


Even the folks least likely to have difficulty expanding into the post-WIMP world have had some difficulty.  If you've had training in HCI (Human-Computer Interaction), you were inadvertently brainwashed with the best. The bulk of the theory and research you contemplated was launched at a time when WIMP was king, even as the Web expanded. Many of the principles held dear to traditional HCI folks have been shattered, and no one has come up with a "theory of everything" that covers all of the human actions and interactions supported or guided by new technologies.


The problem, in part, is that letting go of WIMP is hard to do, as illustrated by the following post from the Ars Technica website:  Light Touch:  A Design Firm Grapples with Microsoft Surface  (Matthew Braga, 6/29/09) "Ditching the mouse and keyboard means a whole lot more than just doing without two common peripherals.  As those who have worked with Microsoft Surface have found out, you have to jettison decades of GUI baggage and start with a whole new mindset...In actuality, few multi-touch gestures are really anything like what we experience in the physical world. There is no situation in which we pull on the corners of an image to increase its size, or swipe in a direction to reveal more content. So, in the context of real-world interaction, these types of gestures are far from natural...gestures should not only feel natural, but logical; the purpose that gestures like these serve, after all, is to replace GUI elements to the end of making interaction a more organic process."   (Be sure to read the comments.)

Now that the Surface is taking root in more places, and touch-screen all-in-one PCs and tablets are starting to multiply, more people are giving "NUI" some thought. Ron George, an interaction and product designer with experience working with Microsoft's Surface team, has contributed to the post-WIMP discussion and spent some time sharing ideas with Josh Blake, a .NET, SharePoint, and Microsoft Surface consultant for InfoStrat and author of the Deconstructing the NUI blog. The outcome of this discussion was Ron George's December 28th blog post, "OCGM (pronounced Occam['s Razor]) is the replacement for WIMP", and Josh Blake's post, "WIMP is to GUI as OCGM (Occam) is to NUI".   (Be sure to read the comments for both of these posts!)



OCGM (as conceptualized by Ron George)


Objects "are the core of the experience. They can have a direct correlation with something physical, or they can just be objects in the interface."


Containers "will be the “grouping” of the objects. This can manifest itself in whatever the system sees fit to better organize or instruct the user on interactions. They do not have to be, nor should they be, windows. They can be any sort of method of presentation or relationship gathering as seen fit."


Gestures "I went into detail about the differences in Gestures and Manipulations in a previous post [check it out for a refresher]. Gestures are actions performed by the user that initiate a function after its completion and recognition by the system. This is an indirect action on the system because it needs to be completed before the system will react to it."


Manipulations "are the direct influences on an object or a container by the user. These are immediate and responsive. They are generally intuitive and mimic the physical world in some manner. The results are expected and should be non-destructive. These are easily performed and accidental activations should be expected and frequent."
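To make the gesture/manipulation distinction concrete, here is a minimal, hypothetical Python sketch of the four OCGM concepts. The class names and event flow are my own illustration, not part of Ron George's proposal; the key point it models is that a manipulation acts on an object immediately, while a gesture triggers a function only after it is completed and recognized by the system.

```python
from dataclasses import dataclass, field

@dataclass
class NUIObject:
    """An Object: the core unit of the experience."""
    name: str
    scale: float = 1.0

@dataclass
class Container:
    """A Container: a grouping of objects -- deliberately not a window."""
    objects: list = field(default_factory=list)

class ManipulationSession:
    """A Manipulation: direct, immediate, responsive influence on an object."""
    def __init__(self, target):
        self.target = target

    def pinch(self, factor):
        # The object responds while the fingers are still moving.
        self.target.scale *= factor

class GestureRecognizer:
    """A Gesture: indirect -- recognized only after the stroke completes."""
    def __init__(self):
        self.points = []

    def add_point(self, x, y):
        self.points.append((x, y))

    def finish(self):
        # Nothing happened while points accumulated; the system reacts here.
        if len(self.points) >= 2:
            dx = self.points[-1][0] - self.points[0][0]
            if dx > 50:
                return "swipe_right"
            if dx < -50:
                return "swipe_left"
        return None
```

In this toy model, `ManipulationSession(photo).pinch(1.5)` changes the object's scale the instant it is called, while `GestureRecognizer` stays silent until `finish()` is called, mirroring the "completed before the system will react" behavior described above.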

To illustrate a point regarding the validity of the OCGM analogy proposed by Ron George, Josh Blake shares the following video of a presentation from REMIX 2009, in which August de los Reyes, the Principal Director of User Experience for Surface Computing at Microsoft, briefly discusses the TOCA (Touch, Objects, Containers, and Actions) concept, suggested as a replacement for WIMP:

The video wouldn't embed, so go to the following link:


Predicting the Past: A Vision for Microsoft Surface
"Natural User Interface (NUI) is here. New systems of interaction require new approaches to design. Microsoft Surface stands at the forefront of this product space. This presentation looks at one of the richest sources for inventing the future: the past. By analyzing preceding inflection points in user interface, we can derive some patterns that point to the brave NUI world." 


The concepts outlined in the presentation are similar to Microsoft's Vision for 2019


Richard Monson-Haefel added his thoughts on the OCGM discussion in his recent blog post, "What is NUI's WIMP?"  Richard disagrees with the OCGM concept, as he feels it doesn't encompass some important interactions, such as speech/direct voice input.   He'd probably agree that NUI is NOT WIMP 2.0.



Post-NUI, Activity Theory, and Off-the-Desktop Interaction Design:
As I was reading the recent posts and discussions regarding NUI/OCGM, I also contemplated some of what I've been reading over my holiday break, "Acting With Technology:  Activity Theory and Interaction Design", written by Victor Kaptelinin and Bonnie A. Nardi.   Victor Kaptelinin is the co-editor of "Beyond the Desktop Metaphor: Designing Integrated Digital Work Environments" (MIT Press, 2007), and has an interest in computer-supported cooperative work.  Bonnie Nardi brings to the IT world her background in anthropology, and is the co-author of "Information Ecologies:  Using Technology with the Heart" (MIT Press, 1999). The authors know what they are talking about. 


It is important to note that activity theory-based interaction design is viewed as "post-cognitivist", and is informed by some of what I studied in psychology, education, and social science years ago. Within the field of activity theory there are some important differences, which I'll save for a future post.


Below are some concepts taken from the book. I am still mulling them over through the prism of NUI, post-WIMP, PATA, TOCA, OCGM, etc.  That's why there will be a "Part II", with specific examples.


  • "Means and ends - the extent to which the technology facilitates and constrains attaining user's goals and the impact of the technology on provoking or resolving conflicts between different goals
  • Social and physical aspects of the environment - integration of target technology with requirements, tools, resources, and social rules of the environment
  • Learning, cognition, and articulation - internal vs external components of activity and support of their mutual transformations with target technology
  • Development - developmental transformation of the above components as a whole"

"Taken together, these sections cover various aspects of the way the target technology supports, or is intended to support, human actions."  (page 270)


I especially like the activity checklist included in the appendix of the book, as well as the concept of tool mediation. "The Activity Checklist is intended to be used at early phases of system design or for evaluating existing systems.  Accordingly, there are two slightly different versions of the Checklist, the "evaluation version" and the "design version".  Both versions are implemented as organized sets of items covering the contextual factors that can potentially influence the use of computer technology in real-life settings.  It is assumed that the Checklist can help to identify the most important issues, for instance, potential trouble spots that designers can address". (page 269)


"The Checklist covers a large space.  It is intended to be used first by examining the whole space for areas of interest, then focusing on the identified areas of interest in as much depth as possible...there is a heavy emphasis on the principle of tool mediation"  (page 270).


Other Thoughts
What is missing from this picture is a Universal Design component, something that I think holds up across time and technologies.  Following the principles of Universal Design doesn't mean dumbing down or relying on simplicity. It is a multi-faceted approach that relies on constructing flexibility in use, one of the key concepts of Universal Design. I'd like to see this concept embedded in the post-WIMP conceptualization somehow.


Because of my background in education/psychology/special education, I try to follow the principles of Universal Design for Learning (UDL) when I work on technology projects.  I've spent some time thinking about how the principles of UDL could be realized through new interaction/interface systems.   Although this approach focuses on the educational technology domain, it is important to consider, given that a good percentage of our population - potential users, clients, consumers - has a temporary or permanent disability of one kind or another.


Components of Universal Design for Learning:
Multiple Means of Representation
Provide options for perception
Provide options for language and symbols
Provide options for comprehension
Multiple Means of Action and Expression
Provide options for physical action
Provide options for expressive skills and fluency
Provide options for executive functions
Multiple Means of Engagement
Provide options for recruiting interest
Provide options for sustaining effort and persistence
Provide options for self-regulation
-Adapted from the UDL Guidelines/Educator Checklist, which breaks down the components into more specific details.


Note:  The concept of Universal Design for Learning shares historical roots with some of the work behind Activity Theory and Interaction Design. Obviously, there is still much to contemplate regarding OCGM and other permutations of post-WIMP concepts!   


Here are my comments to the discussion on Johnathan Brill's blog from January 2009:
Thoughts: I'm seeing a trend to "optimize" WIMP-type applications so they can be manipulated by gesture and touch. People who program kiosks, ATM's and POS touch screens are examples of what I'm talking about. Touch and hold, two-fingered touch, and double-tap are just a slight transformation of the WIMP world-view, and in my opinion, are still WIMP (wimpy!). The mouse interaction "pretenders" are fine for using legacy productivity applications, OK in the short run.

For example, I have an HP TouchSmart, but I don't use the touch screen as often as I'd hoped. Try using Visual Studio to code something on a touch screen. There is so much more that can be done! I know from the touch-screen prototype/demos I've worked on in various classes that applications that support collaboration and decision-making are important, and not just for work.

What do people DO, really? First of all, we are social beings, most of us. Think of what we share and discuss with others, and think about what sort of interactions on a display might best correspond with this interaction. Here are some of the things I've been DOING recently that involved some sort of technology and communication/collaboration with others:



---Travel planning - I recently went on a cruise and, with various family members, selected activities I wanted to do on the ship as well as planned my shore excursions (a complicated process).


---Picture sharing - I came back from the cruise with lots of pictures that I uploaded to Flickr. Related to this process: picture annotating, tagging, choosing/comparing, and editing. It would be SO cool if I could use two sliders to enhance my pictures just so!


---Talking on the phone and responding to e-mail with friends and family members about the pictures, and what they wanted to see on Flickr- "you know, the ones of the dogs in Jamaica"... of which I had about 68!


---Financial planning with my husband. (I took info-viz last semester, so I know the possibilities are there.)


---Using the touch-screen to check-in at my eye-doctor's office: This was a user-unfriendly experience. Such a nice little screen. I was provided with a WIMPY PowerPoint-like interface which was confusing to use- and time consuming!


---Shopping at the new Super Wal-Mart: I asked the greeter if there was a map, and he said, "Food is over there, and the rest of the stuff is that way." Flat panel displays were all over the store, but of course, they weren't interactive. I had no idea where anything was, and the few items I was looking at had no bar-code tags. There wasn't a clerk in sight.   Wal-Mart TV rolled on and on via the display above my head. If I could only harness the display and have the talking head answer my questions! I gave up on my shopping trip when I was in the facial lotion/potion section. Too many choices, and too much fine print to read.

---Shopping at the mall with my young-adult daughters... I'll have to hold my thoughts on that one for now!

Some suggestions:
I think the artists/designers (even dancers) who are interested in multi-touch and gesture interaction have some interesting things to consider. (I linked to some of my previous posts.)


Again:
I am still mulling things over through the prism of NUI, post-WIMP, PATA, TOCA, OCGM, etc.  That is why there will be a "Part II".  With specific examples!


RELATED
Multimedia, Multi-touch, Gesture, and Interaction Resources


My thoughts:
2007 Letter to the Editor, Pervasive Computing
Useful Usability Studies (pdf)
2007 Blog Post
Usability/Interaction Hall of Shame (In a Hospital)
2008 Blog Posts

Emerging Interactive Technologies, Emerging Interactions, and Emerging Integrated Form Factors
Interactive Touch-Screen Technology, Participatory Design, and "Getting It"
An Example of Convergence: Interactive TV: uxTV 2008
2009 Blog Posts

Why "new" ways of interaction?
Microsoft: Are You Listening?  Cool Cat Teacher (Vicki Davis) Tries out Microsoft's Multi-touch Surface Table
Haptic/Tactile Interface:  Dynamically Changeable Physical Buttons
The Convergence of TV, the Internet, and Interactivity:  Update
UX of ITV:  The User Experience and Interactive TV (or Let's Stamp Out Bad Remote Controls)
Digital Convergence and Interactive Television;  Boxee and Digital Convergence 

ElderGadget Blog: Useful Tech and Tools


Other People's Thoughts
Ron George's blog, OCGM (pronounced Occam['s Razor]) is the replacement for WIMP  12/28/09
Ron George: Welcome to the OCGM Generation! Part 2 
Stephen, Microsoft Kitchen: OCGM, A New Windows User Experience
Richard Monson-Haefel's blog, Multi-touch and NUI:  What is NUI's WIMP?
Richard Monson-Haefel:  OCGM: George's Razor
Josh Blake's blog,  Deconstructing the NUI: WIMP is to GUI as OCGM (Occam) is to NUI
Bill Buxton: Gesture Based Interaction (pdf) (Updated 5/2009)
Bill Buxton: "Surface and Tangible Computing, and the "Small" Matter of People and Design" (pdf) - ISSCC 2008
Dan Saffer, Designing for Gestural Interfaces: Touchscreens and Interactive Devices
Dan Saffer, Designing for Interaction 
Mark Weiser,  Computer for the 21st Century  Scientific American, 09, 1991
Touch User Interface:  Readings in Touch Screen, Multi-Touch, and Touch User Interface
Jacob O Wobbrock, Meredith Ringel Morris, Andrew D. Wilson User-Defined Gestures for Surface Computing CHI 2009, April 4–9, 2009, Boston, Massachusetts, USA.

Dec 26, 2009

DIY multi-touch...

If you follow this blog, you know I like to share what people are doing with multi-touch and related natural user interfaces/interaction. In this post, I'd like to share an article about two students who decided to build and market a multi-touch table. The article below explains the story in depth, and the video shows the nuts and bolts.


Enterprising roommates build multi-touch LCD, market their business to West Coast*
Walter Valencia, Collegiate Times 12/1/09



According to the above article, Aaron Bitler and Brady Simpson were inspired by CNN's Magic Wall during the 2008 election.  Bitler and Simpson learned more about natural user interface/interaction during a presentation in a business class that featured a video about the Microsoft Surface table and natural user interface technologies.  They formed a company, 3M8, to build and market multi-touch displays/tables.


Vision x32 from Aaron Bitler on Vimeo.


From what I can tell, it looks like Bitler and Simpson relied on the DIY information and support from the NUI Group website to carry out their ideas. Bitler and Simpson met with representatives of 22Miles, a company located in San Jose that provides interactive solutions, including multi-touch, for web, mobile, and touch-screen implementations.

I'll post more about 22Miles in an upcoming post.

Until then, take a look at 22Miles' promo video, featuring a huge 3D interactive multi-touch heart:

Dec 20, 2009

For Techies & Tech Curious: Python and PyMT developments - PyMT and speech recognition

PyMT is short for Python Multi-Touch, a project that is the work of several members of the NUI Group.  Sharath Patali experimented with speech input for PyMT, integrating CMU's pocketsphinx library into PyMT. It worked out well, as you can see from his video demo below:
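The general pattern behind a demo like this can be sketched in a few lines of Python: recognized phrases get mapped to application commands. This is a hypothetical illustration only -- the recognizer is stubbed out, and neither pocketsphinx's actual API nor PyMT's widget classes are shown here.

```python
# Hypothetical sketch: routing a recognizer's output to application commands.
# In the actual demo, CMU's pocketsphinx produces the recognized phrases.

class SpeechCommandDispatcher:
    """Maps recognized phrases to callbacks in a (multi-touch) application."""

    def __init__(self):
        self.commands = {}

    def bind(self, phrase, callback):
        self.commands[phrase.lower()] = callback

    def on_hypothesis(self, text):
        """Call this with each final hypothesis from the recognizer."""
        action = self.commands.get(text.strip().lower())
        if action is None:
            return False  # unknown phrase: ignore rather than guess
        action()
        return True

# Bind spoken phrases to canvas actions (the phrases are illustrative).
log = []
dispatcher = SpeechCommandDispatcher()
dispatcher.bind("clear canvas", lambda: log.append("canvas cleared"))
dispatcher.bind("new widget", lambda: log.append("widget added"))

dispatcher.on_hypothesis("Clear Canvas")  # matches case-insensitively
```

The design choice worth noting: unknown phrases are dropped rather than fuzzy-matched, which keeps accidental voice input from triggering destructive actions.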


PyMT Speech Recognition from Sharath Patali on Vimeo.

Other people involved with the PyMT project are Nathanael Lecaude, Matthieu Virbel, Thomas Hansen, and Xelapond.


Sharath Patali's Blog/Website Roll (Links to some NUI-group members)


Matthieu Virbel on Vimeo
NUI Group on Vimeo


Python and Game Programming Resources
Adaptation and Evaluation of Numpty Physics for Multi-touch Multiplayer Interaction (pdf)
(A Python-based module called "numptyphysics" was created to integrate Python multi-touch code, allowing the C++ code to parse data, converting it to C structs passed to the game code using pointers.)
Python Programming Language Official Website
Pygame
PythonGames
Python Game Programming Wiki, by Geoff Howland and Rene Dudfield
Lectures 1-6
Beginning Game Development with Python and Pygame -Book (Will McGugan)
Game Programming with Python - Book (Sean Riley)
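The numptyphysics note above describes converting Python multi-touch data into C structs that native game code reads through pointers. Here is a minimal, hypothetical ctypes sketch of that general pattern; the struct layout and field names are my own illustration, not taken from the paper:

```python
import ctypes

# Pack a Python-side touch event into a C struct so native code could read
# it through a pointer. Layout and names are illustrative only.

class TouchEvent(ctypes.Structure):
    _fields_ = [
        ("touch_id", ctypes.c_int),
        ("x", ctypes.c_float),
        ("y", ctypes.c_float),
        ("pressed", ctypes.c_bool),
    ]

def pack_touch(touch_id, x, y, pressed):
    """Build the struct and a pointer the native side could dereference."""
    event = TouchEvent(touch_id, x, y, pressed)
    return event, ctypes.pointer(event)

event, ptr = pack_touch(7, 0.25, 0.75, True)
# C++ code would read the fields through the pointer; Python can do the same:
assert ptr.contents.touch_id == 7
```

Because the struct's memory layout matches what a C compiler produces, the same bytes can cross the language boundary without serialization, which matters when touch events arrive at screen refresh rates.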

Comment:
The reason I'm putting together resources about Python, multi-touch, and games is that I hope to facilitate an exchange between two of the schools I serve as a school psychologist.

Both of the schools are on the same campus. One is a high school for technology and the arts, and one is a program for teens and young adults who have severe disabilities, including autism.  Next semester, one of the computer teachers will be teaching a game programming class using Python for a class of graduating seniors, and if all goes well, perhaps some of the students will create a game for the students with disabilities that would work well on a SMARTboard.

Even better:  It would be great if the pre-engineering students could build a multi-touch table or two for the students with disabilities, running games in PyMT that the computer students create!

Dec 19, 2009

UPDATED: More News, Videos, and Links about Multi-touch and Screen Technologies

UPDATED
More news about multi-touch and screen technologies!

"Holographic" Multitouch   (Eatang Digital Technology)

Holographic multitouch from eggwong on Vimeo
Eggwong's company is Eatang Digital Technology.  Although most of the website is in Chinese, it looks like it provides interactive design services as well as software kits.


How to Build a 20' Multitouch Screen in Three Weeks (Ben Unsworth - Globacore)

How to Build a 20' Multitouch Screen in 3 Weeks from Ben Unsworth on Vimeo.


Globacore
Globacore used four 72" Blue Ocean Low Gain 16:9 screens to create their twenty-foot multi-touch wall, which I think was also used for the "holographic" screen in the first video. The tracker and base Flash API from the NUI Group were used in this system.
I wasn't sure if I'd heard about Blue Ocean screens.  Much to my surprise, the company behind Blue Ocean is Nippura, located in Japan.  Nippura also has an office in Charlotte, N.C.
For years, Nippura's core business has focused on creating "the world's largest acrylic panels, tunnels, cylinders and spheres for the professional aquarium industry", but it now focuses on screen systems as well.

US Nippura/Blue Ocean Projection Screens
Blue Ocean screens were used in the second video of this post.  Below is some information about the screens from the company's website:
"Blue Ocean® Projection Screens are the first ever to cast the screen into a hi-tolerance viewing plane centered between two ultra-clear panels of cell-cast acrylic, the same type of material used in hi-optic fighter jet canopies and the world’s largest aquariums. Blue Ocean® Projection Screens also offer something never before found in screens: self-rigidity.  The thickness of each screen can be fabricated so that it maintains its form without external frames or tension systems. Frameless high definition projected images without borders amplify the infinite depth viewing experience.  Because of its durability, outdoor applications of entertainment around the garden and pool are also possible. Normal wear and tear scratching has no long-term effect on the screens’ performance, and a ten year old screen, simply re-polished, will literally look and perform exactly as it did the day it left the factory."
Super High Definition Stereo Image using 4K Projection Systems (pdf)
Tetsuro Ogi, Hiroaki Daigo, So Sato, Yoshisuke Tateyama, Yasuaki Nishida, International Conference on Artificial Reality and Telexistence 2008

NUITEQ
 

NUITEQ offers single- and duo-touch plugins for Snowflake Suite that enable the software to run on NEXIO and NextWindow screens, including the NextWindow screens integrated into the HP TouchSmart, Dell Inspiron all-in-one PCs, and others.  Snowflake Suite is also compatible with 3M Touch Systems, N-Trig, and Lumino.  You can view the Snowflake Suite application in action on NUITEQ's YouTube channel.
"NUITEQ's software product Snowflake Suite, which received a Stevie Awards finalist recognition for Best New Product or Service of the Year 2009 - Media & Entertainment, is available to end clients, software developers, system integrators, VAR's and OEM's. Snowflake operates on a wide variety of hardware technology platforms including rear camera based solutions, IR overlay systems, bending wave, surface acoustic wave (SAW), projected and surface capacitive, resistive and other touch technologies for small hand held mobile devices and larger scale interactive displays."
Recent news: NUITEQ Assists LTU Skelleftea at SACO event with new educational multi-touch desk


FingerTapps
Software platform for multi-touch solutions:
Video:  FingerTapps on the Dell Inspiron 19
Dell Studio One 19:  All-in-One Stunner Takes Japan
FingerTapps collaborated with DigitalDM Digital Publishing to support interactive media panels (IMAS) for use in retail settings, as well as digital print publishing.  


NextWindow and All-In-One PC's:
NextWindow, the company that made the large single-touch display I used for a couple of class projects nearly three years ago, has really grown over the past few years.  NextWindow provides optical touch screens for the HP TouchSmart PC, the Dell Studio One All-In-One PC, and the Dell SX2210T desktop monitor.   The next Dell with a NextWindow touch screen will be the Inspiron One 19, an all-in-one PC.   You can also find NextWindow touch screens in the Medion X9613, the NEC ValueStar W, and the Sony L Series.  (NextWindow has Windows 7 certification for touch screens.)

STANTUM and GUNZE USA
Stantum has partnered with Gunze USA, a manufacturer of resistive touch-panels, to co-design and deliver multi-touch products.  "Gunze USA will provide these custom multi-touch modules to the North American market for use in industrial, automotive, defense, aerospace, medical, kiosk/point-of-sales, and other professional applications. Unlike capacitive multi-touch, Stantum’s technology is particularly well suited to these professional markets, thanks to its:
  • immunity to EMI issues, via easy coexistence with Wi-Fi or Bluetooth antennas around the screen;
  • stylus input and high precision, enabling handwriting recognition; and
  • ability to work flawlessly with gloves."
According to the press release, Stantum's multi-touch technology provides advantages such as handling 10 or more touches at a time, finger and stylus input, finger-pressure sensing, and native Windows 7 support, with no need for a driver. Developers can order the "beta" version of the Slate PC, based on the Dell Inspiron Mini netbook platform, from the Stantum website.


GESTURETEK
The video below is of an interactive 3D application created for the Gondwana Das Praehistorium in Schiffweiler, Germany, and provides a simulation of what it might be like to be a flying ancient pterosaur or giant dragonfly.  Two gamers can play at the same time.  If you want to fly, you flap your arms, and navigate by leaning. You crouch to dive, and increase your speed by flapping your arms even faster:
GestureTek's 3D Interactive Flying and Virtual Game Experience for Gonwana Museum

This company always has something going on that looks fun and interactive.  To get an idea of this company's interactive work, take a look at the GestureTek website links below. GestureTek has a variety of product videos on the website.


GestureTek Health
GestureTek Mobile Gaming
GestureTek mobile gaming videos
GestureTek Product Summary
GestureTek in Science Centers, Aquariums, Zoos, and Visitor Centers


Lm3Labs


View more documents from LM3LABS.

Lm3labs Air Strike Overview (Slideshare)


SNIBBE INTERACTIVE
Snibbe Interactive was founded by Scott Snibbe to develop social immersive media as a communication medium that incorporates the language of cinema and engages participation and interaction with others (and technology) through unencumbered body movements.  
Snibbe Interactive (22 of 66) by armadillu.


Get in the Action Prototype

Get in the Action Prototype - Snibbe Interactive from Oriol Ferrer Mesià on Vimeo.




Scott Snibbe's Social Immersive Media presentation at Stanford's Seminar on People, Computers and Design, 5/2009

'Scott Snibbe of Sona Research discusses a distinct form of augmented reality focused on social interaction: social immersive media. He discusses design principles and interaction techniques to create strong emotional responses and social engagement through visceral interaction. He also describes approaches to clearly communicate cultural and scientific ideas through the medium, and how to promote specific distinct social behavior in uses." -YouTube description


RELATED
Globacore  "Globacore was founded by Ben Unsworth and Jeff Williams out of a passion for technology, art, and business. Globacore began as a web hosting and development company supporting the technical needs of sports apparel startup VC Ultimate and its associated companies."
"Globacore Inc. was officially incorporated in April 2003. In 2007 Ben and Jeff were joined by programmer Kyle Davis and graphic designer (and trail biker) Matt Venhuizen. In 2008, the Flexmaster Lukasz Ruminski joined us and continues to be a huge member of our team. Update: We'd like to welcome our newest team member: Developer Dave Rapin!"
Globacore Interactive Technologies
67 Mowat Ave. Suite 431
Toronto, ON
M6K 3E3
Phone: + 1 (416) 925-2082
Tollfree: + 1 (888) 309-7658
Fax: + 1 (647) 477-3270 
General Email: info@globacore.com


BLUE OCEAN/NIPPURA
Blue Ocean/ Nippura US Website
Blue Ocean Product Information (pdf)
3611 Tryclan Dr.
Charlotte, NC  28217
Phone: 704-561-9191
Web: www.usnippura.com



NEXTWINDOW
NextWindow's Touch Technologies Overview
Optical Touch and All-In-One PC's
News Release:  NextWindow Patents (pdf)

Contact: David Villarina, NextWindow
+1 (925) 272-4530
dvillarina@nextwindow.com



NUITEQ
Natural User Interface Technologies AB(Headquarters)
Laboratorgränd 7
93162 Skellefteå
Sweden

Phone: +46 702865975
Fax: +46 851989240
mail@nuiteq.com



3M Touch Systems US
3M's Touch Topics information site
Multi-touch@mmm.com
 
Natural User Interface Asia Pte Ltd (Asia office)
12 Little Road
#06-01 Lian Cheong Industrial Building
Singapore 536986
Phone: +65 6748 5245
Fax: +65 6748 7961


FINGERTAPPS
Contact: James Bell-Booth
james[at]fingertapps.com
+64 21 226 6399

www.fingertapps.com/


STANTUM
Slate PC Proof of Concept Datasheet (pdf)
Contact: Francois Jeanneau
Sales and Business Development
Director, Stantum North America

Berkeley, CA, USA
Cell: +1 (510) 928 56 17
f.jeanneau@stantum.com

GUNZE USA
Product Overview
Contact:Gunze Electronics USA
2113 Wells Branch Parkway, #5400
Austin, Texas 78728
Tel: 512-990-3400 Fax: 512-252-1181



LM3LABS
Website: http://lm3labs.com/ENG/index_eng.html
info@lm3labs.com
Europe:
Antipolis Innovation Campus
300, route des Cretes
06560 Sophia-Antipolis France
Hong Kong, China
+852-8198.8406
Australia
+61(02)8011.3342
USA: +1(646)461.6146
TGarden Square 409
3-5-26 Funabori
Edogawa-ku
134-0091 Tokyo, Japan
+81(0)3.3877.0779


SNIBBE INTERACTIVE
1073 Howard Street
San Francisco, CA 94103
+1 415 822 1442
sales@snibbeinteractive.com
Snibbe Interactive Blog
2010 Interactive Catalog (pdf)
HOLDING TANK
Peau Productions
Diffusers and Projection Screens Comparison Site
(Provides photos and video of various screens and diffusers used for optical-based multi-touch systems.)


NUI-Group
Wiki: Comparison of rear projection films
Forum Thread

Outdoor Media Company (China)
Products
Contact: Kuo Lin  (?)
42"/46" Multi-touch LCD Display

HIWAD TOUCH - Beijing Yhjn Science and Technology Co. LTD



Interactive Sandbox 
"An interactive sand box or sand table for use in tactical training includes an automated control system and image generator so that infinitely variable visual images may be projected onto the contoured surface of the sand for training and preparation for action in the military field and disaster control, industrial planning, environmental protection, and similar situation."


Interactive Holographic Touch Screen


Beijing Yihejianing Science and Technology Co., Ltd
Address: Room 621, Jinyuan Modern shopping Mall, NO.1 Yuanda Road, Haidian District, Beijing, China
Tel: 8610-88892897, 88892895
Fax: 8610-88892867-801
Email: hxrong2001@hotmail.com
http://www.cnhci.cn