I came across this demonstration of i/o Data Centers' 12-foot by 7-foot interactive video wall, which makes playing around with views of data center modules...interesting! The display is a gesture-based "multi-touch" system. (I'll update this post when I get more information.)
Here is the description from the Datacenter YouTube channel:
"Instead of hauling a 40-foot long modular data center to a trade show, i/o Data Centers is taking a high-tech approach to customer tours of their i/o Anywhere modular data center. The i/o team has created a 12-foot by 7-foot touchscreen video wall to provide interactive tours of the company's facilities. Selecting a "hot spot" pops up a virtual data center, complete with cross sections and product info, following the concept of the touch screens in the sci-fi movie "Minority Report.""
FYI: I/O Data Centers has an application that runs on the Surface.
UPCOMING:
Stay tuned for my upcoming posts!
News about LM3LABS (Previous post)
Interactive Surveillance: Celine Latulipe (technologist) and Annabel Manning (artist)
Focused on interactive multimedia and emerging technologies to enhance the lives of people as they collaborate, create, learn, work, and play.
Showing posts sorted by date for query gesture. Sort by relevance Show all posts
Dec 11, 2010
Dec 6, 2010
UPDATE: Demo 2 of the Kinect Theramin, Therenect, by Martin Kaltenbrunner
I recently posted about the Therenect, a gesture-controlled digital theremin created for Microsoft's Kinect by Martin Kaltenbrunner - Therenect: Theremin for the Kinect! (via Martin Kaltenbrunner). It looks like Martin has been busy polishing up the application over the past few days, as you can see from the video below:
Therenect - Kinect Theremin - 2nd Demo from Martin Kaltenbrunner on Vimeo.
RELATED
Virtual Theremin Made with Kinect; Real Thereminists Will Make it Useful
Peter Kirn, Create Digital Music, 11/30/10
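For the tech-curious: a theremin maps hand position continuously to pitch and volume, which is exactly the kind of thing a depth camera can track. Here is a tiny sketch (my own illustration, not Martin's code) of how normalized hand coordinates might be turned into a frequency and an amplitude; the pitch range is a made-up calibration value:

```python
# A theremin maps hand position continuously to pitch and volume.
# Sketch: convert a tracked hand's normalized coordinates into a
# frequency (Hz) and an amplitude (0..1). These ranges are
# hypothetical calibration values, not taken from Therenect.

PITCH_MIN_HZ = 110.0    # A2
PITCH_MAX_HZ = 1760.0   # A6

def hand_to_theremin(x_norm, y_norm):
    """x_norm, y_norm in [0, 1]: normalized hand position.

    Horizontal position controls pitch exponentially, so equal hand
    movements give equal musical intervals; vertical position
    controls volume (hand raised = louder).
    """
    x = min(max(x_norm, 0.0), 1.0)
    y = min(max(y_norm, 0.0), 1.0)
    frequency = PITCH_MIN_HZ * (PITCH_MAX_HZ / PITCH_MIN_HZ) ** x
    amplitude = 1.0 - y
    return frequency, amplitude
```

With these values, a hand at the horizontal midpoint lands on 440 Hz (concert A), which is why the exponential mapping feels "musical" compared to a linear one.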
Posted by
Lynn Marentette
Labels:
create digital music,
demo,
kinect,
martin kaltenbrunner,
theremin,
therenect,
update,
video
No comments:
Dec 5, 2010
Video: DaVinci Surface Physics Illustrator Interface on Xbox Kinect, with gesture interaction, by Razorfish
DaVinci prototype on Xbox Kinect from Razorfish - Emerging Experiences on Vimeo.
RELATED
Razorfish ports DaVinci interface to Kinect, makes physics cool (video)
Tim Stevens, Engadget, 12/5/10
Razorfish
(I love this website.)
Posted by
Lynn Marentette
Labels:
daVinci,
gesture interaction,
interactive surface,
interface,
kinect,
NUI,
razorfish,
Xbox
No comments:
Dec 3, 2010
More gesture and multi-touch interaction! Windows 7 Navigation with Kinect; Product Browser by Immersive Labs
Here are a couple of new natural user interface videos. The first video, by Evoluce, demonstrates gesture interaction/navigation in Windows 7 applications supported by Kinect. The second video, by Immersive Labs, shows multi-touch product browsing interaction on a large display.
Kinect Treatment of Windows 7, by Evoluce
Evoluce: Leading Surface Technologies
Immersive Labs - Multi-touch Product Browser
Immersive Labs
Posted by
Lynn Marentette
Labels:
evoluce,
gesture,
immersive labs,
kinect,
multi-touch,
NUI,
product browser,
touch,
Windows 7
No comments:
Nov 30, 2010
TuioKinect, by Martin Kaltenbrunner: "A simple TUIO hand gesture tracker for Kinect"
More Kinect from Martin Kaltenbrunner:
Martin Kaltenbrunner's description of TuioKinect:
"TuioKinect tracks simple hand gestures using the Kinect controller and sends control data based on the TUIO protocol. This allows the rapid creation of gesture enabled applications with any platform or environment that supports TUIO tuio.org/ You can download the application from: code.google.com/p/tuiokinect/ Music: Jabon Jabon by El Club de los Astronautas (Institut Fatima)"
I've played around with Tuio and OpenFrameworks, but it has been a while. I can't wait until I have time to dig into this with a Kinect. I think this has great potential for supporting learning and communication among students with special needs.
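For the tech-curious: TUIO rides on top of OSC, and a tracker like TuioKinect repeatedly sends an "alive" list of active session IDs plus "set" messages carrying each cursor's position. A client works out add/update/remove events by diffing consecutive "alive" lists. Here is a little Python sketch of that bookkeeping (my own illustration; the OSC transport and the official TUIO client libraries are left out):

```python
# Minimal sketch of the state-tracking a TUIO 2D-cursor client does.
# The OSC networking layer that would call these handlers is omitted.

class TuioCursorTracker:
    def __init__(self):
        self.cursors = {}  # session_id -> (x, y), normalized coords

    def on_set(self, session_id, x, y):
        """Handle a /tuio/2Dcur "set" message: one cursor's position."""
        self.cursors[session_id] = (x, y)

    def on_alive(self, alive_ids):
        """Handle a /tuio/2Dcur "alive" message: the full set of live
        session IDs. Any tracked cursor missing from the list has been
        lifted; return the removed IDs so the app can react."""
        removed = [sid for sid in self.cursors if sid not in alive_ids]
        for sid in removed:
            del self.cursors[sid]
        return removed
```

The nice property of this design is that it is stateless on the wire: if a packet is dropped, the next "alive" list resynchronizes the client anyway.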
RELATED/SOMEWHAT RELATED
TuioKinect: TUIO Hand tracker for Kinect
Martin Kaltenbrunner, Tangible Interaction Frameworks, 11/27/10
Therenect: Theremin for the Kinect! (via Martin Kaltenbrunner)
Xbox Kinect Interactive Puppet Prototype, from Theo Watson and Emily Gobeille, creators of Funky Forest
Hacked Kinect Multitouch using libFreenect and libTISCH (via Florian Echtler)
Posted by
Lynn Marentette
Nov 29, 2010
International Conference on Multimodal Interaction: ICMI 2011 Call for Papers
The information below was taken from the website for the 13th International Conference on Multimodal Interaction. I'm excited about the range of topics that the conference will cover. I look forward to sharing more about the work of the members of this group on this blog in the future! (I've highlighted the topics that interest me the most.)
INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION CALL FOR PAPERS
The International Conference on Multimodal Interaction, ICMI 2011, will take place in Alicante (Spain), November 14-18, 2011, just after ICCV 2011 (in Barcelona, Spain). This is the thirteenth edition of the International Conference on Multimodal Interfaces, which for the last two years joined efforts with the Workshop on Machine Learning for Multimodal Interaction (ICMI-MLMI 2009 and 2010). Starting with this edition, the conference uses the new, shorter name.
The new ICMI is the premier international forum for multimodal signal processing and multimedia human-computer interaction. The conference will focus on theoretical and empirical foundations, varied component technologies, and combined multimodal processing techniques that define the field of multimodal interaction analysis, interface design, and system development. ICMI 2011 will feature a single-track main conference which includes: keynote speakers, technical full and short papers (including oral and poster presentations), special sessions, demonstrations, exhibits and doctoral spotlight papers. The conference will be followed by workshops. The proceedings of ICMI 2011 will be published by ACM as part of their series of International Conference Proceedings and will also be distributed to attendees on USB memory sticks.
Topics of interest include but are not limited to:
- Multimodal and multimedia interactive processing
Multimodal fusion, multimodal output generation, multimodal interactive discourse and dialogue modeling, machine learning methods for multimodal interaction.
- Multimodal input and output interfaces
Gaze and vision-based interfaces, speech and conversational interfaces, pen-based and haptic interfaces, virtual/augmented reality interfaces, biometric interfaces, adaptive multimodal interfaces, natural user interfaces, authoring techniques, architectures.
- Multimodal and interactive applications
Mobile and ubiquitous interfaces, meeting analysis and meeting spaces, interfaces to media content and entertainment, human-robot interfaces and interaction, audio/speech and vision interfaces for gaming, multimodal interaction issues in telepresence, vehicular applications and navigational aids, interfaces for intelligent environments, universal access and assistive computing, multimodal indexing, structuring and summarization.
- Human interaction analysis and modeling
Modeling and analysis of multimodal human-human communication, audio-visual perception of human interaction, analysis and modeling of verbal and nonverbal interaction, cognitive modeling.
- Multimodal and interactive data, evaluation, and standards
Evaluation techniques and methodologies, annotation and browsing of multimodal and interactive data, standards for multimodal interactive interfaces.
- Core enabling technologies
Pattern recognition, machine learning, computer vision, speech recognition, gesture recognition.
Important dates
| Workshops proposal | March 1, 2011 |
| Paper and demo submission | May 13, 2011 |
| Author notification | August 5, 2011 |
| Camera ready deadline | September 2, 2011 |
| Conference | November 14-16, 2011 |
| Workshops | November 17-18, 2011 |
General Chairs
Hervé Bourlard (Idiap)
Thomas S. Huang (Univ. of Illinois)
Enrique Vidal (Tech. Univ. of Valencia)
Program Chairs
Daniel Gatica-Perez (Idiap)
Louis-Philippe Morency (Univ. South. California)
Nicu Sebe (Univ. of Trento)
Demo Chairs
Kazuhiro Otsuka (NTT Comm. Sci. Lab.)
Jordi Vitrià (UB/CVC, Barcelona)
Workshop Chairs
Fernando de la Torre (Carnegie Mellon Univ.)
Alejandro Jaimes (Yahoo! Research, Barcelona)
Publication Chair
Jose Oncina (Univ. of Alicante)
Student & Doctoral Spotlight Chair
Li Deng (Microsoft Research and Univ. of Washington)
Sponsorship Chair
Nuria Oliver (Telefónica I+D)
Publicity Chair
Helen Mei-Ling Meng (CUHK, Hong Kong)
Local Organization Chair
Luisa Micó (Univ. of Alicante)
Treasurer
Jorge Calera (Univ. of Alicante)
Local organizers
Xavier Anguera (Telefónica I+D)
A. Javier Gallego Sánchez (Univ. of Alicante)
Ida Hui (CUHK, Hong Kong)
Jose Manuel Iñesta (Univ. of Alicante)
Alejandro Toselli (Tech. Univ. of Valencia)
RELATED
Accepted Papers for ICMI-MLMI 2010
NOTE: ICMI 2011 will be held after ICCV 2011, the 13th International Conference on Computer Vision in Barcelona.
Posted by
Lynn Marentette
Nov 13, 2010
HACKED KINECT MULTITOUCH using libFreenect and libTISCH (via Florian Echtler)
MULTI-TOUCH WITH HACKED KINECT
Here is NUI Group member Florian Echtler's proof-of-concept HD video of using a hacked Kinect camera for multitouch-like interaction. The application runs on Ubuntu Linux and was written using libfreenect (by marcan42) and Florian's own creation, libTISCH.
Florian decided to use picture-browsing interaction to demonstrate the proof of concept, so "everybody can focus on more interesting things :-)"
(I have SO many ideas for this! I'll throw a few out there in an upcoming post....maybe someone can run with them!)
RELATED
Hacked Kinect taught to work as multitouch interface
Paul Miller, engadget, 11/11/10
FOR THE TECH-CURIOUS:
TISCH stands for Tangible Interactive Surfaces for Collaboration between Humans, and is a cross-platform, cross-device multi-touch development framework. You can download the source package for Windows, Mac OS X, and Linux from the TISCH Sourceforge website. The Ubuntu Lucid/Karmic version has "superquick installation via PPA" - the instructions can be found on the TISCH Sourceforge website.
LibFreenect- Open Source PC Drivers for Kinect
Xan Tium, XBLOG 360 11/10/10
Marcan is Hector Martin Cantero, the author of the Abort, Retry, Hack? blog.
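My reading of how the trick works (an educated guess, not Florian's actual code): capture a background depth image of the bare surface, then treat any pixel that sits a few centimeters nearer to the camera than that background as a touch. A rough Python sketch of that idea, with made-up threshold values:

```python
# Depth-threshold touch detection, the core trick behind using a
# depth camera as a "multitouch" sensor. The thresholds below are
# hypothetical; a real system would calibrate them per setup.

TOUCH_MIN_MM = 5    # closer than this is probably sensor noise
TOUCH_MAX_MM = 30   # farther than this is a hovering hand, not a touch

def touch_mask(depth, background):
    """depth, background: 2D lists of per-pixel distances in mm.
    Returns a same-shaped boolean mask of "touch" pixels. A blob
    tracker (as in libTISCH's pipeline) would then group these
    pixels into individual fingertips."""
    mask = []
    for row, bg_row in zip(depth, background):
        mask.append([TOUCH_MIN_MM <= (bg - d) <= TOUCH_MAX_MM
                     for d, bg in zip(row, bg_row)])
    return mask
```

Note the lower bound: without it, sensor noise on the bare surface itself would register as phantom touches.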
For your convenience, I've reposted something I wrote about libTISCH back in 2009:
For techies (and the tech-curious) who like technologies that support collaboration and multi-touch interaction, this is great news!
Florian Echtler announced the first stable release of libTISCH, a multi-touch development framework, which can be found on Sourceforge. TISCH stands for Tangible Interactive Surfaces for Collaboration between Humans. libTISCH, a C++ software framework, is included in this project. It provides a means for creating GUIs based on multi-touch and/or tangible input devices.
Here is how it works:

Here is information from libTISCH announcement:
Highlights of this release are, among others, the following features:
- ready-to-use multitouch widgets based on OpenGL
- reconfigurable, hardware-independent gesture recognition engine
- support for widely used (move, scale, rotate..), pre-defined gestures
as well as custom-defined gestures
- hardware drivers for FTIR, DI, Wiimote, DiamondTouch..
- TUIO converters: source and sink
- cross-platform: Linux, MacOS X, Windows (32 and 64 bit)
- cross-language: C++ with bindings for C#, Java, Python
libTISCH has a lot to offer for the multitouch developer. For example,
the textured widgets enable rapid development of applications for many
kinds of multi-touch or tangible interfaces. The separate gesture
recognition engine allows the translation of a wide range of highly
configurable gestures into pre-defined or custom events which are then
acted on by the widgets. While the lower layers of libTISCH provide
functionality similar to tbeta, touche etc. (you can interface existing
TUIO-based software with libTISCH in both directions), it goes far
beyond.
More information about the library and underlying architecture can be found on http://tisch.sf.net/ and in the Sourceforge wiki at http://sourceforge.net/apps/mediawiki/tisch/
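If you are curious what a "move, scale, rotate" gesture engine actually computes, here is a toy example (my own illustration, not libTISCH's API): given two touch points in the previous and current frames, translation comes from the midpoint, scale from the distance between the points, and rotation from the change in the angle between them.

```python
# Toy two-finger gesture math: derive move/scale/rotate deltas from
# two touch points tracked across consecutive frames.
import math

def two_finger_gesture(p1_old, p2_old, p1_new, p2_new):
    """Each point is an (x, y) tuple.
    Returns (dx, dy, scale, rotation_radians)."""
    # Translation: movement of the midpoint between the two touches.
    cx_old = (p1_old[0] + p2_old[0]) / 2
    cy_old = (p1_old[1] + p2_old[1]) / 2
    cx_new = (p1_new[0] + p2_new[0]) / 2
    cy_new = (p1_new[1] + p2_new[1]) / 2
    # Scale and rotation: compare the vector between the touches.
    vx_old, vy_old = p2_old[0] - p1_old[0], p2_old[1] - p1_old[1]
    vx_new, vy_new = p2_new[0] - p1_new[0], p2_new[1] - p1_new[1]
    scale = math.hypot(vx_new, vy_new) / math.hypot(vx_old, vy_old)
    rotation = math.atan2(vy_new, vx_new) - math.atan2(vy_old, vx_old)
    return cx_new - cx_old, cy_new - cy_old, scale, rotation
```

A real engine like libTISCH's adds the hard parts on top of this arithmetic: matching fingers across frames, handling more than two touches, and dispatching the resulting events to widgets.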
Note:
Dr. Florian Echtler is on the scientific staff at the Technische Universität München in Germany. Be sure to check out his webpage.
I especially like the concept of the MeTaTop: "A Multi-Sensory Table Top System for Medical Procedures" that is linked from Florian's website.
Posted by
Lynn Marentette
Labels:
florian echtler,
kinect,
libfreenect,
libtisch,
mac,
mac os x,
MacOsX,
microsoft,
microsoft kinect,
MicrosoftKinect,
multitouch,
NUI hack,
port,
theo watson,
TheoWatson,
tisch,
UI,
video
2 comments:
Nov 11, 2010
Interactive Touch-Screen Technology, Participatory Design, and "Getting It", Revisited
I've been planning on updating one of my popular posts, "Interactive Touch-Screen Technology, Participatory Design, and Getting It" for a while.
Here is a compromise - since much still rings true two years since I wrote it, the bulk of this post remains the same. I've updated a few sections with additional video examples of interactive touch-screen applications, good and bad, along with a few links and resources, located at the bottom of the post.
(The missing piece of information? An update about apps for the iPad and similar touch-screen tablets.)
Sit back and enjoy!


-Images: HP; Wired
There's been some discussion over the reasons why so many people don't understand touch screen, or "surface" computing, even though research in this area has been going on for years.
As the new owner of the HP TouchSmart, I know that I get it.
The research I've conducted in this area suggests that people will "get-it" only if there is a strong commitment to develop touch-screen "surface" applications through a user-centered, participatory design process. In my view, this should incorporate principles of ethnography, and ensure that usability studies are conducted outside of the lab.
This approach was taken with Intel's Classmate PC. Intel has about 40 ethnographic researchers, and sent many of them to work with students and teachers in classrooms around the world. (A video regarding ethnographic research and the Intel Classmate project can be found near the end of this post.)


-Images: Classmate PC
Where to start?
K-12 classrooms and media centers. Public libraries. Malls. Hospital lobbies and doctor's offices. Any waiting room. Staff lounges in medical centers, schools, and universities. Community festivities and events. Movie theater lobbies. Museums and other points of interests.
I believe we need to take a "touching is believing" approach.
Here are some thoughts:
When I try to explain my fascination with developing touch-screen interactive multimedia applications (interactive whiteboards, multi-touch displays and tables, and the like), many of my friends' and family members' eyes glaze over. This is particularly true for people I know who are forty-ish or over.
Even if you are younger, if you never saw the cool technology demonstrated in the movie Minority Report, or if you have limited experience with video games, or if you haven't come within touching distance of an interactive whiteboard, the concept might be difficult to understand.
The reality?
Even people who have the opportunity to use surface computing technology on large screens do not take full advantage of it. Multi-touch screens are often used as single-touch screens, and interactive whiteboards in classrooms often serve as expensive projector screens for teacher-controlled PowerPoint presentations.
Most importantly, there are few software developers who understand the surface computing approach, even with the popularity of the iPhone and iPod Touch. Most focus on traditional business-oriented or marketing applications, and have difficulty envisioning scenarios for surface computing. There is a need for a breath of fresh air!
Another factor is that not all people entrusted to market surface or touch screen computing fully understand it.

Despite a cool website showing off the goods, Microsoft's Surface multi-touch table has been slow to take off, limiting hope of bringing down the price tag to a price most families or schools could afford. (The picture above depicts an application for the Surface designed for health care professionals, not K-12 science education.)
Although you can't buy a Surface table for your family room, it is possible to buy a TouchSmart. HP's TouchSmart website is engaging and highlights some examples of touch-screen interaction, but most people don't seem to know about it. (Since this post was last written, there are many more touch and multi-touch options available to the public, such as the Dell, the iPad, etc.)
Unfortunately, you wouldn't have a clue that the HP TouchSmart exists browsing the aisles at Circuit City or Best Buy!
When I was shopping for my new TouchSmart, I noticed that from a distance, the TouchSmart looked just like the other larger flat-screen monitors filling up the aisles. The salespeople at both stores were not well-informed about the system. The only reason I knew about the new TouchSmart was related to my obsession with interactive multimedia touch-screen applications - designing them, developing them, studying them, reading about them, blogging about them.... ; }
More thoughts:
After studying HCI (Human-Computer Interaction), and relating this knowledge to what I know as a psychologist, my hunch is that the "Window Icon Mouse Pointing-device" (WIMP) and keyboard input mind-set is embedded in our brains, to a certain extent. Like driving a car, it is something automatic and expected. This is true for users AND developers. (Update - See The Post WIMP Explorers' Club: Update of the Updates for a review of a discussion among passionate post WIMP folks)
Think about it.
Suppose one day, you were told that you no longer were allowed to control your car by turning on the ignition, steering the wheel, or using your feet to accelerate, slow down, or stop the car! Instead, you needed to learn a new navigation, integration, and control system that involved waving your hands about and perhaps speaking a few commands.
For new drivers who'd never seen a car before, this new system would be user-friendly and intuitive. Perhaps it would be quite easy for 16-year-old kids to wrap their heads around this concept. For most of us, no. Imagine the disasters we would see on our streets and highways!
When we think about how newer technologies are introduced to people, we should keep this in mind.
In my mind, spreading the word about surface computing is not an "if you build it, they will come" phenomenon, the way it was for the iPhone. We can't ignore the broader picture.
From my middle-age woman's vantage point, I believe it is important that those involved with studying, developing, or marketing surface computing applications realize that many of us simply have no point of reference other than our experiences with ATMs, airline kiosks, supermarket self-serve lanes, and the like. (The video clip at the very end of this post provides a good example of touch-screen technology gone wrong.) -UPDATE: additional videos were added to this post.
Be aware that there are substantial numbers of people who might benefit from surface computing who prefer to avoid the ATMs, airline kiosks, and self-serve grocery shopping.
Realize that the collective experience with technology, in many cases, has not been too pretty. Many people have had such user-unfriendly experiences with productivity applications, forced upon them by their employers, that any interest or desire to explore emerging technologies has been zapped.
My own exposure to interactive "surface" related technology was somewhat accidental.
A few years ago, a huge box was deposited into the room I worked in a couple of days a week as a school psychologist at a middle school. After a week or so, I became curious, and found out that it was a SmartBoard. Until then (2002!), I did not know that interactive whiteboards existed.
The box remained unopened in the room for the entire school year, but no worry. I played with the only other SmartBoard in the school, and found a couple at the high school where I also worked. I hunted for all of the applications and interactive websites that I could find, and tried them out. That is when I was hooked. I could see all kinds of possibilities for interactive, engaging subject area learning activities. I could see the SmartBoard's potential for music and art classes. With my own eyes, I saw how the SmartBoard engaged students with special needs in counseling activities. (By the way, if you are working with middle school students, the PBS Kids It's My Life website activities work great on an interactive whiteboard.)
A few years have passed, and reflecting on all of my fun experiences with interactive whiteboards, with and without students, I now understand that many teachers still have had limited exposure to this technology.
This school year, many teachers are finding themselves teaching in classrooms recently outfitted with interactive whiteboards, scrambling along with educational technology staff development specialists, to figure out how it works best with various groups of students, and what sort of changes need to be made regarding instructional practice.
For the very first time, interactive whiteboards were installed in two classrooms at one of the schools I work at. One of the teachers I know thanked me for telling her about interactive whiteboards and sharing my resources and links.
If I hadn't let her know about this technology, she wouldn't have volunteered to have one installed in her classroom. It has transformed the way she teaches special needs students.
In the few months that she's used the whiteboard, I can see how much it has transformed the way the students learn. They are attentive, more communicative, and engaged. The students don't spend the whole day with the whiteboard - the interactive learning activities are woven into lessons at various times of the day, representing true technology integration.
Now let's see what happens when all-in-one touch-screen PC's are unleashed in our schools!
UPDATE: Take a look at a post I wrote for Innovative Interactivity just after SMART Technologies acquired NextWindow - the post describes in detail how interactive whiteboards are transforming learning and teaching in a program for students with special needs.
SMARTTechnologies Acquires NextWindow: A "smart window" to the world
There are some interesting changes going on at the intersection of HCI and educational technology research. I participated in a workshop at CHI 2010 last April and was impressed by what is going on in this area, around the globe: Next Generation of HCI and Education
Value of ethnographic research:
Ethnographic Research Informed Intel's Classmate PC
"Intel looked closely at how students collaborate and move around in classroom environments. The new tablet feature was implemented so that the device would be more conducive to what Intel calls “micromobility”. Intel wants students to be able to carry around Classmate PCs in much the same way that they currently carry around paper and pencil." -via Putting People First and Ars Technica
The video below is from Intel's YouTube Channel. Information about Intel's approach to ethnographic research in classrooms during the development of the Classroom PC is highlighted. This approach uses participatory design and allows the set of applications developed for the Classmate PC to reflect the needs of local students and teachers. Schools from many different countries were included in this study.
FYI: TOUCH SCREEN DISPLAYS: NEED FOR IMPROVEMENT!
Touch Screen Coke Machine at the Mall: 90 Seconds to get a Coke
User-Unfriendly Interactive Display in the Rain (Ballantyne Village Shopping Center)
User-Unfriendly Information Kiosk Interactive Map
I encountered this puzzling and frustrating interactive directory/map at the Cleveland Clinic. When I went to visit a relative at the hospital a year or so later, the map was no longer there.
BETTER EXAMPLES OF INTERACTIVE SCREENS:


Samsung's new Omnia SDG i900 was re-created in a much larger size, using lm3labs' Ubiq'window touchless technology. For more about lm3labs, including several videoclips, take a look at one of my previous posts: Lm3Labs, Nicolas Leoillot, and Multimedia Interaction
Midwife Toad App on a Microsoft Surface, Discovery Place Science Center
TellTable: Digital Storytelling on the Surface: Microsoft Research, UK
DECEMBER 31, 2009 -Interactive Soda Machine for Fun
The interactive screen on the Coke machine attracted the attention of this young child. He loved spinning the image of the bottle. So did the dad! He said, "I'd like something like this for my home!". I told him about the HP TouchSmart - both the dad and the mom did not know that there were affordable all-in-one touch screens available, but they knew about SMARTboards, because their children's classrooms had them. Note: No one from this family actually purchased a soft drink. I was hoping to time how long it would take them to do so!
If you have plenty of time, take a look at my Post WIMP Explorers' Club YouTube playlist.
"Natural user interfaces, gesture interaction, multi-touch, natural interaction, post WIMP examples and more..."
Here is a compromise - since much still rings true two years since I wrote it, the bulk of this post remains the same. I've updated a few sections with additional video examples of interactive touch-screen applications, good and bad, along with a few links and resources, located at the bottom of the post.
(The missing piece of information? An update about apps for the iPad and similar touch-screen tablets.)
Sit back and enjoy!


-Images: HP; Wired
There's been some discussion over the reasons why so many people don't understand touch screen, or "surface" computing, even though research in this area has been going on for years.
As the new owner of the HP TouchSmart, I know that I get it.
The research I've conducted in this area suggests that people will "get-it" only if there is a strong commitment to develop touch-screen "surface" applications through a user-centered, participatory design process. In my view, this should incorporate principles of ethnography, and ensure that usability studies are conducted outside of the lab.
This approach was taken with Intel's Classmate PC. Intel has about 40 ethnographic researchers, and sent many of them to work with students and teachers in classrooms around the world. (A video regarding ethnographic research and the Intel Classmate project can be found near the end of this post.)


-Images: ClassematePC
Where to start?
K-12 classrooms and media centers. Public libraries. Malls. Hospital lobbies and doctor's offices. Any waiting room. Staff lounges in medical centers, schools, and universities. Community festivities and events. Movie theater lobbies. Museums and other points of interests.
I believe we need to take a "touching is believing" approach.
Here are some thoughts:
When I try to explain my fascination with developing touch-screen interactive multimedia applications (interactive whiteboards, multi-touch displays and tables, and the like), many of my friends' and family members' eyes glaze over. This is particularly true for people I know who are forty-ish or older.
Even if you are younger, if you never saw the cool technology demonstrated in the movie Minority Report, have limited experience with video games, or haven't come within touching distance of an interactive whiteboard, the concept might be difficult to understand.
The reality?
Even people who have the opportunity to use surface computing technology on large screens do not take full advantage of it. Multi-touch screens are often used as single-touch screens, and interactive whiteboards in classrooms often serve as expensive projector screens for teacher-controlled PowerPoint presentations.
Most importantly, few software developers understand the surface computing approach, even with the popularity of the iPhone and iPod Touch. Most focus on traditional business-oriented or marketing applications and have difficulty envisioning scenarios for surface computing. There is a need for a breath of fresh air!
Another factor is that not all people entrusted to market surface or touch screen computing fully understand it.
Despite a cool website showing off the goods, Microsoft's Surface multi-touch table has been slow to take off, dimming hopes of bringing the price down to a level most families or schools could afford. (The picture above depicts an application for the Surface designed for health care professionals, not K-12 science education.)
Although you can't buy a Surface table for your family room, you can buy a TouchSmart. HP's TouchSmart website is engaging and highlights some examples of touch-screen interaction, but most people don't seem to know about it. (Since this post was first written, many more touch and multi-touch options have become available to the public, such as Dell's touch-screen PCs, the iPad, and others.)
Unfortunately, you wouldn't have a clue that the HP TouchSmart exists while browsing the aisles at Circuit City or Best Buy!
When I was shopping for my new TouchSmart, I noticed that from a distance, it looked just like the other large flat-screen monitors filling the aisles. The salespeople at both stores were not well informed about the system. The only reason I knew about the new TouchSmart was my obsession with interactive multimedia touch-screen applications: designing them, developing them, studying them, reading about them, blogging about them... ; }
More thoughts:
After studying HCI (human-computer interaction) and relating this knowledge to what I know as a psychologist, my hunch is that the "windows, icons, menus, pointing device" (WIMP) and keyboard input mind-set is embedded in our brains to a certain extent. Like driving a car, it is automatic and expected. This is true for users AND developers. (Update: See The Post WIMP Explorers' Club: Update of the Updates for a review of a discussion among passionate post-WIMP folks.)
Think about it.
Suppose that one day you were told you could no longer control your car by turning the ignition, turning the steering wheel, or using your feet to accelerate, slow down, or stop. Instead, you needed to learn a new navigation and control system that involved waving your hands about and perhaps speaking a few commands.
For new drivers who had never seen a car before, this new system might well be user-friendly and intuitive; perhaps it would be quite easy for 16-year-old kids to wrap their heads around. For most of us, no. Imagine the disasters we would see on our streets and highways!
When we think about how newer technologies are introduced to people, we should keep this in mind.
In my mind, spreading the word about surface computing is not an "if you build it, they will come" phenomenon, like the iPhone. We can't ignore the broader picture.
From my middle-aged woman's vantage point, I believe it is important that those involved with studying, developing, or marketing surface computing applications realize that many of us simply have no point of reference other than our experiences with ATMs, airline kiosks, supermarket self-serve lanes, and the like. (The video clip at the very end of this post provides a good example of touch-screen technology gone wrong.) UPDATE: Additional videos were added to this post.
Be aware that there are substantial numbers of people who might benefit from surface computing but prefer to avoid ATMs, airline kiosks, and self-serve grocery lanes.
Realize that the collective experience with technology, in many cases, has not been pretty. Many people have had such user-unfriendly experiences with productivity applications, forced upon them by their employers, that any interest in exploring emerging technologies has been zapped.
My own exposure to interactive "surface" related technology was somewhat accidental.
A few years ago, a huge box was deposited in the room where I worked a couple of days a week as a school psychologist at a middle school. After a week or so, I became curious and found out that it was a SmartBoard. Until then (2002!), I did not know that interactive whiteboards existed.
The box remained unopened in the room for the entire school year, but no worry. I played with the only other SmartBoard in the school and found a couple at the high school where I also worked. I hunted for all of the applications and interactive websites I could find and tried them out. That is when I was hooked. I could see all kinds of possibilities for interactive, engaging subject-area learning activities. I could see the SmartBoard's potential for music and art classes. With my own eyes, I saw how the SmartBoard engaged students with special needs in counseling activities. (By the way, if you are working with middle school students, PBS Kids' It's My Life website activities work great on an interactive whiteboard.)
A few years have passed, and reflecting on all of my fun experiences with interactive whiteboards, with and without students, I now understand that many teachers still have limited exposure to this technology.
This school year, many teachers find themselves in classrooms recently outfitted with interactive whiteboards, scrambling, along with educational technology staff development specialists, to figure out how the technology works best with various groups of students and what changes need to be made to instructional practice.
For the very first time, interactive whiteboards were installed in two classrooms at one of the schools I work at. One of the teachers I know thanked me for telling her about interactive whiteboards and sharing my resources and links.
If I hadn't let her know about this technology, she wouldn't have volunteered to have one installed in her classroom. It has transformed the way she teaches special needs students.
In the few months that she's used the whiteboard, I can see how much it has transformed the way the students learn. They are attentive, more communicative, and engaged. The students don't spend the whole day with the whiteboard - the interactive learning activities are woven into lessons at various times of the day, representing true technology integration.
Now let's see what happens when all-in-one touch-screen PC's are unleashed in our schools!
UPDATE: Take a look at a post I wrote for Innovative Interactivity just after SMART Technologies acquired NextWindow - the post describes in detail how interactive whiteboards are transforming learning and teaching in a program for students with special needs.
SMARTTechnologies Acquires NextWindow: A "smart window" to the world
There are some interesting changes going on at the intersection of HCI and educational technology research. I participated in a workshop at CHI 2010 last April and was impressed by what is going on in this area, around the globe: Next Generation of HCI and Education
Value of ethnographic research:
Ethnographic Research Informed Intel's Classmate PC
"Intel looked closely at how students collaborate and move around in classroom environments. The new tablet feature was implemented so that the device would be more conducive to what Intel calls “micromobility”. Intel wants students to be able to carry around Classmate PCs in much the same way that they currently carry around paper and pencil." -via Putting People First and Ars Technica
The video below is from Intel's YouTube channel. It highlights Intel's approach to ethnographic research in classrooms during the development of the Classmate PC. This approach uses participatory design and allows the set of applications developed for the Classmate PC to reflect the needs of local students and teachers. Schools from many different countries were included in the study.
FYI: TOUCH SCREEN DISPLAYS: NEED FOR IMPROVEMENT!
Touch Screen Coke Machine at the Mall: 90 Seconds to get a Coke
User-Unfriendly Interactive Display in the Rain (Ballantyne Village Shopping Center)
User-Unfriendly Information Kiosk Interactive Map
BETTER EXAMPLES OF INTERACTIVE SCREENS:
Here are some interesting pictures from lm3labs, which are in my interactive usability hall of fame:


Samsung's new Omnia SGH-i900 was re-created at a much larger size using lm3labs' Ubiq'window touchless technology. For more about lm3labs, including several video clips, take a look at one of my previous posts: Lm3Labs, Nicolas Leoillot, and Multimedia Interaction
Midwife Toad App on a Microsoft Surface, Discovery Place Science Center
TellTable: Digital Storytelling on the Surface: Microsoft Research, UK
DECEMBER 31, 2009 -Interactive Soda Machine for Fun
The interactive screen on the Coke machine attracted the attention of this young child. He loved spinning the image of the bottle. So did the dad! He said, "I'd like something like this for my home!" I told him about the HP TouchSmart. Neither the dad nor the mom knew that affordable all-in-one touch screens were available, but they knew about SMARTboards, because their children's classrooms had them. Note: No one from this family actually purchased a soft drink. I was hoping to time how long it would take them to do so!
Some resources:
Multimedia, Multitouch, Gesture, and Interaction Resources (Mega-list, work in progress)
lm3labs (catchyoo, ubiq'window)
Sparkon (See members links and multi-touch projects)
(More information and resources can be found by doing a "multitouch" or related search on this blog or The World Is My Interactive Interface.)
FYI: I visited the Ballantyne Village shopping center a couple of months ago to follow up on the interactive displays, including the one I tried to use while it was raining. The shopping center changed ownership, and the displays were replaced by the old-fashioned kind, pictured below:
Posted by Lynn Marentette
