The concept of multi-touch/gesture/surface computing is spreading.
Here's more evidence:
Panasonic Touch Air Hockey
The game was demonstrated at ISE 2009 (Integrated Systems Europe) in Amsterdam. The interface was developed by UI Centric, a company based in Soho, London.
Microsoft's SurfaceWare at the Tangible and Embedded Interaction Conference (TEI 2009): SurfaceWare is level-sensing software that alerts waitstaff when glasses need refilling. Photos from Nachiket Apte, via Ru Zarin.
"The world's first multitouch, multiuser table for primary education - the SMART Table - is now available for purchase. Order the SMART Table"
"As a collaborative learning center, the SMART Table enables engaging and motivating small-group learning experiences. Up to eight students can use their fingers intuitively to sweep, slide and spin objects on the interactive screen. The SMART Table's ready-made activities help primary students gain and further their skills in areas like counting and reading."
"The SMART Table also makes an ideal complement to whole-class activities on the SMART Board™ interactive whiteboard. It helps reinforce concepts in a small-group setting and ensures students can participate in interactive and creative learning experiences."
Ubiq'window, by LM3LABS, is a gesture-based system that is used for interactive show windows, interactive in-store marketing, museum installations, and more.
The slides provide details of the Ubiq'window system's specifications, including a gesture recognition set. The slides also highlight "Airstrike", a system that allows for free-air, touchless interaction.
The following video is a demonstration of "Sparsh", an interactive multi-touch FTIR table built in eight weeks by a group of engineering students in India. Most of the information regarding the hardware and software you see running on this low-cost system can be found on the open-source NUI-group website, forums, and wiki.
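An FTIR (frustrated total internal reflection) table like Sparsh shines infrared light into an acrylic sheet; fingertips pressed against it frustrate the internal reflection and show up as bright blobs to a camera underneath, which software then locates. As a rough illustration of that tracking step (my own sketch, not the actual Sparsh code), a minimal connected-component blob detector over a thresholded camera frame might look like this:

```python
from collections import deque

def find_blobs(frame, threshold=128):
    """Find bright blobs (finger contacts) in a grayscale frame.

    frame: 2D list of pixel intensities (0-255).
    Returns a list of (row, col) centroids, one per blob.
    """
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # Flood-fill this blob with a breadth-first search.
                queue, pixels = deque([(r, c)]), []
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and not seen[ny][nx]
                                and frame[ny][nx] >= threshold):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                # Centroid = mean position of the blob's pixels.
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((cy, cx))
    return centroids

# Two synthetic "fingertips" on a dark 6x8 frame:
frame = [[0] * 8 for _ in range(6)]
frame[1][1] = frame[1][2] = frame[2][1] = frame[2][2] = 255  # blob A
frame[4][6] = 255                                            # blob B
print(find_blobs(frame))  # → [(1.5, 1.5), (4.0, 6.0)]
```

Real trackers (such as those discussed on the NUI-Group forums) add background subtraction, smoothing, and frame-to-frame blob matching, but the core idea is this simple.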
I'm preparing myself to explore the multi-touch potential of my HP TouchSmart PC with the beta version of Win7, Microsoft's newest operating system. While I was searching for information, I came across this cute video of a baby interacting with the touch-screen. The dad in the video is software developer Kurt Brockett.
I have lots of ideas for touch-screen interaction applications for kids of all ages. Please leave a comment if you have a TouchSmart and are working with Win7, or plan to do so in the future.
This video showcases the work of Natural User Interface-AB, using NUI Suite 1.0 Snowflake and Flash.
Here is the plug from the company's website: "Natural User Interface (NUI) is a Swedish innovative emerging technology company specializing in commercially available advanced multi-touch software, hardware and service solutions. NUI's solutions can convert an ordinary surface into an interactive, appealing and intelligent display that creates a stunning user experience."
I received a couple of interesting links about interactive multimedia applications from Anthony Uhrick, of NextWindow, who is attending CES (the Consumer Electronics Show). (NextWindow is the company that produces large touch-screen displays with duo- and multi-touch capabilities.)
Kevin Kennedy and his team at InterKnowlogy partnered with Zygote 3D Human Anatomy and Intermountain Health Care to develop a health care application, built in Windows Presentation Foundation, that runs on Microsoft's multi-touch Surface computing table.
The application supports collaboration between health care professionals and could also support collaboration between patients and doctors as well. Aspects of the application could be useful for patient education.
I really liked the part that demonstrates how you can zoom deeply into the 3D heart and look at things from various angles.
The above video demos an application that might be useful for teaching history with an interesting timeline interface dial.
Visit InterKnowlogy for more videos and information about what they are doing with Windows Presentation Foundation and Silverlight as partners with Microsoft's Surface team.
If you are interested in more 3D anatomy, visit Zygote's 3D Human Anatomy site and 3D Science. If you are an educator, you'll see that 3D interaction has potential for creating more engaging science and health education lessons!
Another interesting link is to TouchTV Networks, which has partnered with companies such as Vectorform, who are also working with multi-touch applications using Windows Presentation Foundation.
Video from TouchTV Networks of CES 2009 Demo:
Vectorform's Virtual Drum Kit application:
Vectorform's Surface at School - demonstrates how this can be used in a classroom:
If you are a new visitor to this blog and interested in interactive multimedia, you'll need to know more about the NUI Group. The Natural User Interface (NUI) Group is an interactive media group researching and creating open source machine sensing techniques to benefit artistic and educational applications. (For related information, please read my recent post, Usability, Accessibility, and User Experience in a Win7 Environment.)
Seth Sandler, of the NUI-Group, sent out a great email with links and resources for people who are interested in multi-touch/gesture interaction, hardware, and/or software development. The list of NUI-Group members who have completed projects is listed below, with links to project websites as well as related threads on the NUI-Group forum.
(The information can be found on the NUI-Group Wiki, which boasts a nice icon-based front page.)
Thanks, Seth, for organizing this wealth of information!
From the New York Times, photographed by Yana Paskova
"The Accenture Interactive Network recently installed a large interactive screen at Chicago's O'Hare International Airport. The plan is ultimately to create a network of interactive, wall-sized screens that deliver a variety of information via touch screen to thousands of users."
Video and Photos from TacTable
The following video shows some applications developed by TacTable. As you can see, many of the applications look like they'd be useful in public spaces, including airports and museums.
Below are some pictures from the TacTable website:
Where Did Language Come From? Liberty Science Center, N.J.
Accenture Welcome Wall, London, England
Sprint Studio Digital Lounge Table
Related: TacTable Contacts (USA): Henry Kaufman, henry@tactable.com
"Clients include Sprint, Accenture, Cirque du Soleil, New York Museum of Modern Art (MoMA), Liberty Science Center, Detroit Institute of Art, Georgia Aquarium, London's Millenium Dome, Boston Museum of Science, Chicago Museum of Science and Industry, Chicago Mercantile Exchange, Warner Brothers, Discovery, Disney, British Telecom Retail, and many others."
The video below shows the "pre-alpha" version of the Cobalt Metaverse Browser:
"This video shows the Cobalt metaverse browser being tested on a 13-foot by 5-foot multi-touch visualization wall equipped with six high-definition projectors located at the Renaissance Computing Institute engagement center at Duke University. The input drivers are being developed by Dr. Xunlei Wu so that users can directly manipulate high-resolution data using both hands and multiple fingers for a more natural and intuitive data exploration experience. In the video, Dr. Wu is using both gesture and touch to navigate through, and rearrange content between, two Cobalt virtual worlds."
This is a promotional video from BMW, showing how potential buyers can interact with a tabletop computing system to preview various ways they can customize the car. The system in the video is Microsoft's Surface:
A few years ago I took a class about virtual reality and how it can be used in education and training. One of the topics we covered was telepresence. One of the companies I looked at was Teliris.
According to a whitepaper on the Teliris website, "Business Value of Telepresence", by S. Ann Earon, "Telepresence is what videoconferencing was meant to be: reliable, highly interactive, easy to operate, resulting in a natural meeting with transparent technology and an emphasis on human factors." Teliris now offers something they call Immersive Collaboration, which involves the use of surface computing that supports document and multimedia content sharing across locations, as if all of the group members are in the same room.
Watch the demonstration of the Teliris Collaboration Touch Table in a telepresence meeting. In the video clip below, the narrator shares content from a local Teliris Collaboration Touch Table to a remote meeting participant who is at another table.
"Touch to Telepresence"
Business Holograms! DVE (Digital Video Enterprises) developed a Tele-Immersion room that uses Christie Digital Systems Mirage HD 3D projectors to create holographic images of remotely located meeting participants:
DVE Telepresence: An Introduction (A plug from DVE, but informative.)
DVE Portable Virtual Presentation - a volumetric 3D image from a projector hidden from the audience's view:
This system can display 3D images on the stage, and supports 2-way interactive HD feeds.
The above examples demonstrate how newer technologies, including table-top surfaces, can be used for collaborative business meetings. I can envision this technology used for medical education, medical consultations, and collaboration between artists and musicians.
When the price comes down, perhaps we will have these systems in our family rooms!
Here is a short video demonstration of a TouchTV Networks demo on a display using a NextWindow touch screen. It looks like it was built using Windows 7. I understand that it was created by people with former Microsoft connections.
You can find the demo code on the Codeplex website. Here is a quote:
"The Touchless SDK enables developers to create multi-touch based applications using a webcam for input. Touch without touching."
"Touchless started as Mike Wasserman’s college project at Columbia University. The main idea: to offer users a new and cheap way of experiencing multi-touch capabilities, without the need of expensive hardware or software. All the user needs is a camera, which will track colored markers defined by the user." (I posted about the Touchless SDK previously, but I didn't have the video.)
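The Touchless idea - tracking a colored marker through each webcam frame - can be sketched in a few lines. This is not the SDK's actual code (the SDK is a .NET library), just an illustration of the core step under my own simplifying assumptions: find all pixels near a target RGB color and report their centroid as the "touch" point.

```python
def track_marker(frame, target, tolerance=40):
    """Return the (row, col) centroid of pixels matching the target color.

    frame: 2D list of (r, g, b) tuples; target: an (r, g, b) tuple.
    A pixel matches when every channel is within `tolerance` of the target.
    Returns None if the marker is not visible.
    """
    matches = [
        (y, x)
        for y, row in enumerate(frame)
        for x, (r, g, b) in enumerate(row)
        if all(abs(c - t) <= tolerance for c, t in zip((r, g, b), target))
    ]
    if not matches:
        return None
    cy = sum(p[0] for p in matches) / len(matches)
    cx = sum(p[1] for p in matches) / len(matches)
    return (cy, cx)

# A 3x4 frame that is mostly white, with a green marker at (1, 2):
WHITE, GREEN = (255, 255, 255), (30, 200, 40)
frame = [[WHITE] * 4 for _ in range(3)]
frame[1][2] = GREEN
print(track_marker(frame, GREEN))  # → (1.0, 2.0)
```

Run this once per marker color and you get one "finger" per marker - which is how a plain webcam can fake multi-touch without any special hardware.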
Spencer is a teacher and instructional technology consultant who develops web-based interactive applications for use on interactive whiteboards (IWB's). He's interested in multi-touch applications for education and has some good insights into what HCI researchers call the "problem space".
Here are Spencer's comments:
..."I agree that Flash could have a very important role to play here. I chose Flash as my development tool because it allows quick development of ideas and then easy distribution of the product. The importance of this is that it allows people who have a profession other than software developer to create software with the insight of their main role. In my case, as a teacher, I can identify things I wish I had and then make them. Often I find that other teachers had the same wish and they then appreciate the product."
"The unfortunate thing with multi-touch is that it is far from the technology most of us outside the industry/research areas have to work with. An app created in Flash for single touch follows the mouse and pointer method so it can be developed easily. When done it can be easily tested on a standard IWB for the feel (which is often surprisingly different on the IWB compared to using a mouse)."
"The Flash developer community has a very experimental and creative characteristic and I’m sure would be a great driving force for multi-touch but first there needs to come a reason for more people to have some sort of multi-touch display for general use, beyond facilitating experiments. When the various operating systems support it and have the apps to make having a supporting display viable then the experimentation and ideas will really flow."
"In addition, the display makers need to recognize the benefits of Flash and ensure they address them. At the moment it seems to be too often an afterthought if considered at all. SDKs and APIs make no reference to Flash or they remain indefinitely in beta for older versions of Flash only."
"It is a pity that all of this will take time. The more time that passes the more single touch IWBs are bought and installed which will delay the uptake of the eventual multi-touch ones. Meanwhile children continue to have to keep reminding themselves that they can only touch the board in one place when it is clear that every bit of their brain is telling them to interact with the board in a much more natural multi-touch way."

My response to Spencer's comment:
Spencer,
You make good points regarding the barriers to getting the multi-touch approach adopted by the "mainstream". You're right about what the commercial display makers need to do. If they want to market displays that will have more appeal, they must think about the different sorts of applications and programming environments that the displays should support. Display makers also need to think more about the bigger picture - in what sort of environments will the displays be located? Indoors? Outdoors? Near bright sunlight? What about people with disabilities, children, or the elderly?
I can see that in the future, multi-touch displays and other devices would operate within an embedded systems environment and support mobile computing activities as well. There are existing examples of this concept, of course, but there is much room for creative improvement. An embedded systems approach is complex, and would need to handle input from sensors, support multi-modal signal processing, and also provide users with a range of connectivity modes, including RFID. (Data management and storage needs would have to be addressed, along with privacy and security concerns.)
Most importantly, in my opinion, these systems would need to have the flexibility required to support human activities and interactions that have not yet emerged! Certainly this will need to take a multidisciplinary approach.
There are many unanswered questions....How does this fit in with mobile computing and "cloud" computing? What sort of middleware needs to be developed?
Even if we don't have solutions to the bigger problems, there are many smaller problems that I think could be somewhat easily solved.
As you mentioned, many applications that are designed for single-touch screens don't fully support the way people identify, select, and move items around the screen. Although educators access websites every day for use on interactive whiteboards, they are hungry for more. There are not enough websites that are optimized for single-touch interaction, or touch-screen interaction in 3-D "space".
Teachers who are successful users of interactive whiteboards know exactly what we are talking about. They spend quite a bit of time searching for new on-line resources they can use with their students. They know how much the students want to interact with the screen at the same time and would be so excited to have that capability at their fingertips!
Optimizing websites for touch-screen applications is possible, but the idea hasn't occurred to most web developers. Their jobs don't require it, so there is no incentive. Google is developing FlareBrowser, which can support multi-touch interactions, but according to information on the website, it runs only on Mac OS X Leopard (10.5). The present version is bare-bones. I haven't yet tested the FlareBrowser.
I think that another barrier to getting multi-touch off the ground is that the people who might have the knack for multi-touch application development simply don't know it! We've mentioned that Flash developers have the potential to create good multi-touch applications. I also think that game developers and designers could make good contributions to the multi-touch movement. Just think about what thought goes into programming interactions and event handling for 3-D web-based multi-player games!
Yet another barrier is that people who work in lower-tech fields could benefit from collaborative multi-touch applications, but they don't know it, either. The research I've reviewed tells me that multi-touch applications can support a wide range of human endeavors: work, creativity, data analysis, education, collaboration, planning, and so forth.
What is missing is the input of potential end users from a variety of fields. No specific discipline "owns" multi-touch, so it is hard to figure out how we can make this happen.
Could we set up multi-touch technology playgrounds at professional and trade conferences? What about airports and hospital lobbies? Libraries and museums? Shopping centers? Sports events and rock concerts?
This leads me to my next idea, which is jumping ahead a bit:
One of the barriers to the development of multi-touch applications is that it is not easy to gather user requirements when the users are not familiar with the technology. That is where my "Miracle Question" technique comes into play. I learned this technique when I studied brief solution-focused counseling, and found that, if modified, it can be useful for figuring out user requirements. (The process still needs some fleshing out.)

Why the Miracle Question? The questions that a developer uses to guide the client during the initial planning stages are very important. Keep in mind that people want to use technology because it meets a need and solves a problem, which is similar to the reason a person might seek counseling. The Miracle Question technique (actually, a series of questions) might help to tease things out. The goal of this type of questioning is to help the client use their own creativity, resources, and problem-solving skills so they can become effective partners throughout the development cycle. (People with human-computer interaction training might have an easier time understanding how this technique might be modified and applied to different fields.)
FYI: A good example of the Miracle Question process, as used in therapy and counseling, can be found on the Network of Social Construction Therapies website, in an article written by the late Steve de Shazer:
There aren't many resources about the use of the Miracle Question in IT or business. Here are a couple:
Solution Focused Management of Unplanned IT Outages (read page 132 and the references): http://conferences.vu.edu.au/web2006/images/CDProceedings06.pdf
Proceedings of the 7th International We-B (Working for E-Business) Conference, 2006. Katherine O'Callaghan and Sugumar Mariappandar, Ph.D., School of Business and Informatics, Australian Catholic University.
"I needed to find a screen-free environment!" -John Oliver "It's good to be King." - John King, after disposing of John Oliver...
I just took a look at a hilarious episode about interactive multi-touch screens and a conspiracy theory on the Daily Show. The episode features Jeff Han, the creator of CNN's Magic Wall, John Oliver, John King, and others from CNN.
If you are interested in multi-touch technology, feel free to do a search for additional information on this blog. The following post includes Jeff Han's demonstration of his multi-touch applications from TED 2006, along with resources and links:
"NextWindow Gesture Server Application in conjunction with a NextWindow touch screen enables two-touch gestures to be used on the Microsoft Windows Vista desktop and certain applications.
You perform a gesture by double-tapping or dragging two fingers on the touch surface. The Gesture Server interprets these actions as commands to the operating system. For example a two-touch vertical drag on the Vista desktop can adjust the computer's audio volume control up or down as required."
Also from the website:
Vertical scroll: drag two fingers up or down the touch screen.
Horizontal scroll: drag two fingers left or right on the touch screen.
Zoom: move two fingers apart or together.
Double Tap: double-tap two fingers on screen.
"You can enable or disable the two-touch functionality and adjust the sensitivity of each of the four two-touch gestures. You can also select the command that is executed with the double-tap gesture."
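All four gestures in that list can be inferred from how the two touch points move between frames. As a hypothetical sketch of the logic (not NextWindow's actual Gesture Server code), compare the finger separation and the midpoint across two frames and classify the motion:

```python
import math

def classify_gesture(before, after, min_move=5.0):
    """Classify a two-touch gesture from two frames of touch positions.

    before/after: ((x1, y1), (x2, y2)) - the two touch points in
    each frame. Returns 'zoom', 'vertical scroll',
    'horizontal scroll', or None if the motion is too small.
    """
    def spread(points):
        (x1, y1), (x2, y2) = points
        return math.hypot(x2 - x1, y2 - y1)

    def midpoint(points):
        (x1, y1), (x2, y2) = points
        return ((x1 + x2) / 2, (y1 + y2) / 2)

    # Fingers moving apart or together -> zoom.
    if abs(spread(after) - spread(before)) > min_move:
        return "zoom"
    # Otherwise the pair moved as a unit: scroll along the dominant axis.
    (mx0, my0), (mx1, my1) = midpoint(before), midpoint(after)
    dx, dy = mx1 - mx0, my1 - my0
    if abs(dy) > min_move and abs(dy) >= abs(dx):
        return "vertical scroll"
    if abs(dx) > min_move:
        return "horizontal scroll"
    return None

# Both fingers drag downward together -> vertical scroll:
print(classify_gesture(((10, 10), (30, 10)), ((10, 50), (30, 50))))
# Fingers move apart -> zoom:
print(classify_gesture(((10, 10), (30, 10)), ((0, 10), (40, 10))))
```

The double-tap gesture would be detected separately, by timing two down/up events; the sensitivity settings the quote mentions would correspond to thresholds like `min_move` here.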
Chris Harrison and Scott Hudson, from the Human-Computer Interaction Group at Carnegie Mellon University, presented their latest research at the UIST '08 conference. Take a look at the video below to see how gestures on unpowered input surfaces can be turned into commands, using a stethoscope sensor and filters to capture the sounds those gestures produce:
Yes, every surface is a computer! (Even your pants...)
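Once the stethoscope has picked up the audio, the sensing problem becomes signal processing: detect bursts of energy and map them to events. As a toy illustration of that last step (my own sketch, not Harrison and Hudson's pipeline), here is a detector that counts distinct taps by finding rising edges in an amplitude envelope:

```python
def count_taps(envelope, threshold=0.5):
    """Count distinct taps in an audio amplitude envelope.

    envelope: list of non-negative amplitude values over time.
    A tap is a contiguous run of samples at or above the threshold,
    so each rising edge marks the start of a new tap.
    """
    taps, above = 0, False
    for amp in envelope:
        if amp >= threshold and not above:
            taps += 1        # rising edge: a new tap begins
        above = amp >= threshold
    return taps

# Quiet, one tap, quiet, then two taps separated by a dip:
envelope = [0.0, 0.1, 0.9, 0.8, 0.1, 0.0, 0.7, 0.1, 0.9, 0.2]
print(count_taps(envelope))  # → 3
```

The real system distinguishes richer gestures (scratches, swipes) by looking at the shape and frequency content of each burst, not just its presence, but edge detection like this is the first building block.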
The Best Paper Award at UIST '08 was "Bringing Physics to the Surface", by Andrew Wilson, of Microsoft Research, and Shahram Izadi, Otmar Hilliges, Armando Garcia-Mendoza, and David Kirk, of Microsoft Research, Cambridge.
Here is the abstract:
"This paper explores the intersection of emerging surface technologies, capable of sensing multiple contacts and often shape information, and advanced games physics engines. We define a technique for modeling the data sensed from such surfaces as input within a physics simulation. This affords the user the ability to interact with digital objects in ways analogous to manipulation of real objects. Our technique is capable of modeling both multiple contact points and more sophisticated shape information, such as the entire hand or other physical objects, and of mapping this user input to contact forces due to friction and collisions within the physics simulation. This enables a variety of fine-grained and casual interactions, supporting finger-based, whole-hand, and tangible input. We demonstrate how our technique can be used to add real-world dynamics to interactive surfaces such as a vision-based tabletop, creating a fluid and natural experience. Our approach hides from application developers many of the complexities inherent in using physics engines, allowing the creation of applications without preprogrammed interaction behavior or gesture recognition."

Preparation for the Internet of Surfaces & Things?
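The key move, as I read the abstract, is that each sensed contact becomes a proxy object inside the physics engine, so dragging and flicking emerge from simulated friction and collisions rather than from hand-coded gesture recognizers. Here is a toy one-dimensional illustration of that idea (my own sketch, not the authors' implementation): a moving contact drags a virtual block along through a simple friction model.

```python
def step(block_x, block_v, contact_v, mu=0.8, dt=0.1):
    """One simulation step: a contact drags a block via friction.

    The contact moves at contact_v; friction accelerates the block
    toward the contact's velocity, scaled by coefficient mu.
    Returns the block's new (position, velocity).
    """
    friction_accel = mu * (contact_v - block_v)  # drag toward contact speed
    block_v += friction_accel * dt
    block_x += block_v * dt
    return block_x, block_v

# A finger sliding at 10 units/s gradually carries the block with it:
x, v = 0.0, 0.0
for _ in range(50):
    x, v = step(x, v, contact_v=10.0)
print(round(v, 2))  # the block's speed approaches the finger's 10 units/s
```

Because the behavior falls out of the simulation, the same code handles one finger, a whole hand, or a physical object resting on the table - which is exactly the generality the paper is claiming.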
Despite the increase in interest in systems that support multi-touch, multi-user multimedia interaction, there is a need for creative, tech-savvy types to develop innovative applications. Why? This technology has the potential to make a powerful impact on how people learn, communicate, solve "big picture" problems, and do their various jobs.
CNN's Magic Wall was one of the first applications to gain the attention of the masses, as it was used as an interactive map during the US presidential election process. Touch-screen interaction gained even more notice after the recent SNL parody by Fred Armisen.
If you think about it, the multi-touch applications you see on the news aren't much different than what you'd get from a "single-touch" program.
Fancy, yes. Truly innovative, no.
Just imagine a 3D multi-touch, multi-user, multimedia version of Google Search. I did. I put my sketches in my idea book and hurt my brain thinking about how it could be coded. Jeff Han, the man behind Perceptive Pixel and CNN's magic wall, had much more up his sleeve when he demonstrated his work at TED 2006. Even if you've previously seen this video, it is worth looking at again. (I've provided a link to the transcript below.)
This video presentation had a transformational effect on me as I watched for the first time. Jeff Han brought to life ideas that were similar to my own as a beginning computer student thinking about collaborative educational games and multimedia applications that could be played on interactive whiteboards.
Here are some selected quotes from the video:
"I really really think this is gonna change- really change the way we interact with the machines from this point on." "Again, the interface just disappears here. There's no manual. This is exactly what you kind of expect, especially if you haven't interacted with a computer before."
"Now, when you have initiatives like the hundred dollar laptop, I kind of cringe at the idea that we're gonna introduce a whole new generation of people to computing with kind of this standard mouse-and-windows pointer interface. This is something that I think is really the way we should be interacting with the machines from this point on. (applause)" "Now this is going to be really important as we start getting to things like data visualization. For instance, I think we all really enjoyed Hans Rosling's talk, and he really emphasized the fact that I've been thinking about for a long time too, we have all this great data, but for some reason, it's just sitting there. We're not really accessing it. And one of the reasons why I think that is, is because of things like graphics- will be helped by things like graphics and visualization and inference tools. But I also think a big part of it is gonna be- starting to be able to have better interfaces, to be able to drill down into this kind of data, while still thinking about the big picture here."
So now what?
A recent post by "Alex" on the AFlex World blog discusses a few solutions. Alex had a chance to meet with Harry van der Veen and Pradeep George from the NUI Group, and Georg Kaindl, a multi-touch interaction designer from the Technical University of Vienna. The focus of the discussion was to come up with ideas to encourage Adobe/Flash designers and developers to learn more about multi-touch technology and interaction, and to take steps to create innovative applications.
I especially like the following quote from the post:
"...A quick quote from our conversations: 'When our children will walk up to a display, they will touch it and expect to do something.'"
As a techie and a school psychologist, I see an immediate need for innovative applications. I know that there is a built-in market in the schools, at least for low-cost applications. Despite economic constraints, many school districts continue to invest in interactive whiteboards (IWB's). They are cropping up in preschool and K-12 settings, and teachers are searching for more than what's currently available.
Interactive, collaborative applications are needed in fields such as health care, patient education, finance & economics, urban planning, civil engineering, travel & tourism, museums & exhibitions, special events, entertainment, and more.
SMART Technologies, the company behind the SMART Board, has a new interactive multi-touch, multi-user table designed for K-6 education, the SMART Table. Hewlett-Packard has several versions of the TouchSmart PC, which can support at least duo-touch, if not multi-touch, multi-user applications. There are numerous all-in-one large-screen displays on the market that support multi-touch and multi-user interaction.
Quotes from Harry van der Veen, of Multitouch NL:
"In 10 years from now when a child walks up to a screen he expects it to be a multi-touch screen with which he can interact with by using gestures."
"...multi-touch screens will be as common as for children is the internet nowadays, as common as mobile phones are for us."
Here is a quote from a conversation I had with Spencer, who blogs at TeacherLED. "It was interesting this week as I was in a classroom with a teacher who I've not worked with before... he had 2 students using the whiteboard who kept touching it together by mistake. The teacher, exasperated, said to himself, "Why can't they make these things to accept 2 touches without going crazy!"
Proof of the demand! I think you are right: when teachers spot the limitations, and then see the technology on visits to museums, that might stimulate demand. Spencer creates cool interactive mini-applications, mostly for math, using Flash, that teachers (and students) love to use on interactive whiteboards. (He's interested in multi-touch, too.)