NUI-Group member Christian Bannister, a musician, designer, and developer behind Subcycle Labs, has been experimenting with music and multi-touch interaction. Here's a demonstration of what he's come up with so far:
multi-touch the storm - interactive sound visuals - subcycle labs from christian bannister on Vimeo.
"The big picture goal of this project is to bridge the gap between sound visualization and musical instrument. With multi-touch interaction it is possible to manipulate multiple characteristics of a sound—directly and visually. Right now a lot of electronic music involves staring at the back of the performers laptop. This is a shame because in many cases a lot of really interesting things are happening on the computer that the audience is completely unaware of. This project hopes to create a common visual language and experience for the electronic musician and the audience by enhancing the perception of sound and music on both sides...These sketches are built with Processing and Max/MSP networked with OSC on a single computer..." -Christian Bannister
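Bannister's sketches network Processing and Max/MSP with OSC (Open Sound Control) on one machine. As a rough illustration of what travels over that link, here is a minimal OSC message encoder, written in Python for brevity; the `/filter/cutoff` address and port 7400 are made up for the example, and his actual patches would use a Processing OSC library on one end and a [udpreceive] object in Max on the other:

```python
import socket
import struct

def osc_message(address, value):
    """Encode an OSC message with a single float argument.

    OSC strings are ASCII, null-terminated, and padded to a 4-byte
    boundary; the ",f" type-tag string means "one 32-bit big-endian float".
    """
    def pad(s):
        b = s.encode("ascii") + b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    return pad(address) + pad(",f") + struct.pack(">f", value)

# Send one control value to a patch listening on localhost:7400.
# The port and address are illustrative; they would have to match
# a [udpreceive 7400] object in the Max patch.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/filter/cutoff", 0.75), ("127.0.0.1", 7400))
sock.close()
```

Because OSC messages are just small UDP datagrams like this, the same stream can drive the sound engine and the visuals at once, which is what makes the shared performer/audience view possible.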
JazzMutant Lemur
The multi-touch tech company now known as Stantum evolved from JazzMutant, which was founded in 2002 by Guillaume Largillier and Pascal Joguet and joined by Julien Olivier in 2003. The original focus was to create a multi-touch controller for music applications, and the Lemur was born. It is now in version 2, with features such as a gesture object that provides three ways for people to interact with sound, extended scripting abilities, and remote control of your computer's mouse cursor or keyboard. Stantum recently developed a next-gen multi-touch screen system for use in mobile devices such as smartphones and netbooks. (I'll discuss this further in a future post.)
Mapping Ableton Live to Jazz Mutant's multi-touch Lemur for sound design:
Additional tutorials can be found on the JazzMutant YouTube channel.
Ableton Live, the software used in the above video, will include Max. Max/MSP is now known as Max 5:
"In use for over twenty years by performers, composers, artists, scientists, teachers, and students, Max is the way to make your computer do things that reflect your individual ideas and dreams. Version 5 is a dramatic transformation of Max that begins a new chapter in the program's history."
RELATED
Video: Max for Live
Max is a product of Cycling '74
Cycling '74 created the Make Controller Kit, which includes fully programmable controllers. The kit is network-based, can work with actuators, and can read sensor information into Max.
Jamoma is a platform for interactive art-based research and performance. It is the prototyping environment for SpatDIF, the Spatial Sound Description Interchange Format, and for GDIF, the Gesture Description Interchange Format.
GDIF: Gesture Description Interchange Format, a tool for music related movements, actions, and gestures
Stantum's Mobile Phone Multi-touch Interface: Demonstration of precise interactions on a resistive touch screen
How Stantum's Resistive Multi-Touch Screen Works
Oct 25, 2009
GDIF: Gesture Description Interchange Format, a tool for music-related movements, actions, and gestures.
There has been a flurry of work in the computer music technology world that relates to what has been going on with interactive display technology, multi-touch & gesture interaction. I came across a link to the GDIF website when I was searching for information about interactive music and the use of multi-touch technologies for a future blog post.
So what is GDIF? The Gesture Description Interchange Format
"The Gesture Description Interchange Format (GDIF) is being developed as a tool for streaming and storing data of music-related movements, actions, and gestures. Current general purpose formats developped within the motion capture industry and biomechanical community (e.g. C3D) focus mainly on describing low-level motion of body joints. We are more interested in describing gesture qualities, performer-instrument relationships, and movement-sound relationships in a coherent and consistent way. A common format will simplify working with different software, platforms and devices, and allow for sharing data between institutions." (The Jamoma environment is used to prototype GDIF.)
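To make the "streaming and storing" idea concrete, here is a toy sketch of gesture data organized at two levels of description, the way GDIF distinguishes raw device data from higher-level gesture qualities. The address paths and the "quantity of motion" feature are illustrative choices of mine, not the official GDIF namespace:

```python
# A toy stream of time-stamped gesture descriptions at two levels of
# abstraction: raw sensor values and a derived gesture quality.
# The OSC-style address layout is illustrative, not the GDIF spec.

def describe(frames):
    """Turn (time, (x, y, z)) accelerometer frames into (time, address,
    value) messages at both the raw and descriptive levels."""
    messages = []
    for t, (x, y, z) in frames:
        messages.append((t, "/raw/accelerometer", (x, y, z)))
        # Derived "quantity of motion" -- one simple gesture quality.
        qom = abs(x) + abs(y) + abs(z)
        messages.append((t, "/descriptive/quantity-of-motion", qom))
    return messages

stream = describe([(0.00, (1, 0, 9)),
                   (0.05, (2, 1, 9))])
for t, addr, val in stream:
    print(f"{t:.2f} {addr} {val}")
```

The point of the layered addresses is exactly what the quote describes: software that only understands raw joint or sensor motion can read one level, while analysis tools that care about movement-sound relationships can subscribe to the descriptive level, all in one consistent stream.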
Alexander Refsum Jensenius is the man who initiated the GDIF project. He's written a variety of articles about music, gestures, movement, and emerging technologies.
Here's Alexander's bio: "Alexander (BA, MA, MSc, PhD) is a music researcher and research musician working in the fields of embodied music cognition and new interfaces for musical expression (NIME) at the University of Oslo and at the Norwegian Academy of Music. He studied informatics, mathematics, musicology, music performance and music technology at UiO, Chalmers, UC Berkeley and McGill. Alexander is active in the international computer music community through a number of collaborative projects, and as the initiator of GDIF. He performs on keyboard instruments and live electronics in various constellations, including the Oslo Laptop Orchestra (OLO)."
Related Publications
Godoy, R. I., E. Haga, and A. R. Jensenius (2006b). Playing 'air instruments': Mimicry of sound-producing gestures by novices and experts. In S. Gibet, N. Courty, and J.-F. Kamp (Eds.), Gesture in Human-Computer Interaction and Simulation, GW 2005, Volume LNAI 3881, pp. 256–267. Berlin: Springer-Verlag.
Jensenius, A. R. (2009). Motion capture studies of action-sound couplings in sonic interaction. STSM COST Action SID report. fourMs lab, University of Oslo.
Jensenius, A. R. (2007). Action - Sound: Developing Methods and Tools to Study Music-related Body Movement. PhD thesis. Department of Musicology, University of Oslo, Norway.
Jensenius, A. R., K. Nymoen, and R. I. Godoy (2008). A Multilayered GDIF-Based Setup for Studying Coarticulation in the Movements of Musicians. In Proceedings of the International Computer Music Conference, 24–29 August 2008, Belfast.
Jensenius, A. R., T. Kvifte, and R. I. Godoy (2006). Towards a gesture description interchange format. In N. Schnell, F. Bevilacqua, M. Lyons, and A. Tanaka (Eds.), NIME '06: Proceedings of the 2006 International Conference on New Interfaces for Musical Expression, pp. 176–179. Paris: IRCAM - Centre Pompidou.
Kvifte, T. and A. R. Jensenius (2006). Towards a coherent terminology and model of instrument description and design. In N. Schnell, F. Bevilacqua, M. Lyons, and A. Tanaka (Eds.), Proceedings of New Interfaces for Musical Expression, NIME 06, IRCAM - Centre Pompidou, Paris, France, June 4-8, pp. 220–225. Paris: IRCAM - Centre Pompidou. [PDF]
Marshall, M. T., N. Peters, A. R. Jensenius, J. Boissinot, M. M. Wanderley, and J. Braasch (2006). On the development of a system for gesture control of spatialization. In Proceedings of the 2006 International Computer Music Conference, 6–11 November, New Orleans. [PDF]
RELATED
Sonic Interaction Design (SID)
"Sonic Interaction Design is the exploitation of sound as one of the principal channels conveying information, meaning, and aesthetic/emotional qualities in interactive contexts."
SID Action has four working groups:
- "SoundHack was my main thing for a long time, and I poured a lot of effort into it. It was the place I put my ideas. I did have something of a mission with SoundHack. I wanted to take some computer music techniques that were only used in academia, and get them out there so that all types of musicians could use them." -Tom Erbe, SoundHack Spectral Shapers
Csound Blog "Old School Computer Music"
"Csound is a sound and music synthesis system, providing facilities for composition and performance over a wide range of platforms. It is not restricted to any style of music, having been used for many years in the creation of classical, pop, techno, ambient, experimental, and (of course) computer music, as well as music for film and television."-Csound on Sourceforge
Quote from Dr. Richard Boulanger (the father of Csound):
"For me, music is a medium through which the inner spiritual essence of all things is revealed and shared. Compositionally, I am interested in extending the voice of the traditional performer through technological means to produce a music which connects with the past, lives in the present and speaks to the future. Educationally, I am interested in helping students see technology as the most powerful instrument for the exploration, discovery, and realization of their essential musical nature - their inner voice."
Upcoming post about innovations at Stantum:
I'll be focusing on Stantum and its music and media technologies division, JazzMutant, in my next post. It is interesting to note that the co-founders of Stantum, Guillaume Largillier and Pascal Joguet, have a background in electronic music. Guillaume specializes in multi-modal user interfaces and human-machine interface technologies; Pascal has a background in physics and electronics and has worked as a sound engineer.
My music back-story:
The very first computer-related course I took was Computer Music Technology (in 2003), since I play an electronic MIDI/digital keyboard and had previously tried to teach myself a few things, long before computers and related technologies were "easy" for me to figure out. During the mid-90s, I tried my hand at Dr. Richard Boulanger's Csound and tried to acquaint myself with tools from Cycling '74, but I gave up. Not long after that, I bought the first version of MOTU's Freestyle, which worked nicely on my Performa 600 hooked up to my Ensoniq 32, after the nice people at MOTU sent me an update that was compatible with my set-up. Later on, I came across Tom Erbe's SoundHack freeware.
A lot has changed since then!
Posted by Lynn Marentette
Jan 12, 2011
Multi-modal Interactive Maps for People with Visual Impairments: Featuring a Stantum multitouch screen with a tactile layer.
To learn more about this project, take a look at the video and related publications below. This is a great example of a team that is harnessing emerging technologies to improve the lives of people with disabilities.
Video: "Multimodal Maps for Blind People"
Website
Publications
Anke Brock, Philippe Truillet, Bernard Oriola, Christophe Jouffrais (IRIT CNRS and Université de Toulouse). Usage of Multimodal Maps for Blind People: Why and How.
ITS’10, November 7–10, 2010, Saarbrücken, Germany
Paper: http://www.irit.fr/~Philippe.Truillet/projects/doc/MultimodalMapsForTheBlind-ITS10.pdf
Poster: http://www.irit.fr/~Philippe.Truillet/projects/doc/Poster-ITS10.pdf
Stantum (Multi-touch screen used for the application.)
Ivy Middleware (Used in this application.)
Cross-posted on the TechPsych blog.
May 29, 2010
Preview: Update on Touch & Multitouch Technologies, Websites, and Touch-Interactive Multimedia Apps
It's about time for an update on touch- and gesture-interactive technologies.
I've been researching the latest in "touch" screens and new developments in interactive multimedia content. In just one year, a multitude of websites have been transformed from static to interactive.
Although the initial objective for some of these websites was to optimize the interface and navigation for people accessing websites via touch-screen cell phones, some are ideal for use on touch-enabled slates, the iPad, and even larger touch screen displays and surfaces.
Convergence seems to be the buzzword of the day. Interactive TV. Game sets with Internet access. Movies on your cell phone. Touch-screen Coke machines displaying movie trailers. What's happening now, and what's next?
I welcome input from my readers in the form of links to websites, university labs with grad students and professors who are obsessed with emerging interactive technologies, proof-of-concept video clips, video clips of related technologies that are new-to-market, etc.
I will add video clips to the following playlist:
FYI: I'm also in the middle of writing a series of posts about 3D television technologies for the Innovative Interactivity blog, and welcome input from my readers about this topic.
RELATED (Previous posts)
(the above post includes links to various multi-touch developer kits and resources)
Multi-touch Linux on a Stantum Slate PC & More (links to a nice overview about multi-touch interaction from ENAC)
Multimedia, Multi-touch, Gesture, and Interaction Resources (needs a little updating)
Oct 23, 2012
Tablet and Mobile Day (or Era): Lots of upcoming news about tablets and mobile devices. New Chromebook, too.
I will be devoting a few upcoming posts to tablets, larger smartphones, and other mobile computing devices. I'll be hearing from the folks at Stantum later today.
This morning I woke up and turned on the TV as I was getting ready for work, and the first thing I saw on my local station was this cute family-focused ad for the new Google Chromebook, which, at $249, is affordable:
I especially like the scene where a toddler stomps on a Chromebook left on the floor.
Sep 16, 2011
MindHabits Online Demo: Useful Serious Game for Social Skills Group Activities
I'd like to share the online demo of MindHabits' suite of serious games, which I've found useful in my work with teens and young adults who need support in the area of social-emotional skills.
What I like about the online demo is that it adjusts to the player's responses. This feature made it fun to use during the last few social skills groups I facilitated at work, since it could be played by students with a range of cognitive abilities. I had students take turns playing the game using a SMARTboard, and found that all of the students paid attention to what was going on. In my opinion, using the interactive whiteboard supported "off-the-shoulder" learning among the students who were not at the board.
MindHabits is available for Windows and Mac. The full version is just $19.99, provides 100 game levels, tracks progress, and includes four games.
Here's some information from the company's website:
"Based on social intelligence research conducted at McGill University, these stress busting, confidence boosting games use simple, fun-to-play exercises designed to help players develop and maintain a more positive state of mind."
"Based on the principles of social intelligence: Inhibition - uses game mechanics to promote positive habits; Association - connects personal info to positive feedback; Activation - uses personal references"
Upcoming:
- Interview with Guillaume Largillier about Stantum's multi-touch tablet.
- Collaboration with Stevens Institute of Technology, focusing on a serious game project to support learning of "on-the-job" social skills for teens and young adults with autism spectrum disorders and related challenges.
- More news about large interactive displays, multi-touch, and gesture applications/installations.
- Cross-platform/device interactive "TV".
- Cathy Davidson's new book: Now You See It: How The Brain Science of Attention Will Transform The Way We Live, Work, and Learn (Viking)
Sep 8, 2011
Update, plus iGaze app by Dunedin Multimedia for use during social skills group activities
So what am I up to now?
I'd like to share with my readers that I've decided to continue in my present position as a school psychologist, while still devoting a portion of my free time to technology. From time to time I think deep thoughts about usability, accessibility, and UX/interaction related to off-the-desktop interactive multimedia applications running on screens of all sizes. I'm hoping to create a few multimedia experiments using HTML5 and JavaScript, and to explore jQuery if and when I can find the time!
For the present school year, my main school is a program for students with more significant disabilities, including autism spectrum disorders. My second school is a magnet high school for technology and the arts, located on the same campus. I also consult throughout the district on cases involving students who have suffered traumatic brain injuries, as well as students who have multiple disabilities. I am thankful that I have a job in a school district that values 21st-century technology.
I'm looking forward to another technology-rich school year. I've spent some of the time I usually devote to blogging exploring iPad apps instead. Since I'm new to the world of iPads, I'm still in discovery mode. What an adventure!
There are plenty of educational apps out there, and many of them are suitable for students with special needs. On the other hand, there is much room for improvement across all iPad app categories. Since there is very little research about what makes up a killer app, or suite of apps, for students with special needs, experimenting with iPad apps is uncharted territory.
I made the decision to bring my personal iPad 2 to work after I discovered a number of apps that I thought would be useful in my work as a school psychologist with students who have special needs, including autism spectrum disorders.
One of my intervention themes this year focuses on social skills. This is especially important for students who participate in our school's community-based job training program. I'm using some content from Unique Learning's transition materials, as well as online activities from Do2Learn's JobTips website, because my aim is to facilitate social skills that will be useful in a variety of job and community settings.
Although my main technology tool for working with groups is the SMART Board, I've found a combination of interactive whiteboard and iPad activities to be especially effective. I'm paving the way for more role-play activities in the future, and attempting to use technology to my advantage.
This past week, I used the iGaze app, created by Dunedin Multimedia, to help a group of high-school level students practice establishing and maintaining eye gaze, something that is difficult for most of them to demonstrate "in person". I was amazed. Each student was excited to take his or her turn. Even more amazing? When each student took a turn, the other students looked at their eyes and faces. No one rocked or "stimmed". No one made noises. I observed several instances of joint attention, much to my delight.
Below is a video from Dunedin Multimedia's YouTube channel that is similar to what the students viewed during their group activity:
Here is some information from Dunedin Multimedia about the iGaze app:
"Eye contact is important to communication and social development, and yet the impaired ability to make and maintain eye contact is one of the most striking aspects of autism. iGaze is an eye contact simulator that can help to build confidence in using this important means of nonverbal social communication. The app also contains information on eye contact and eye gaze, with links to relevant research."
During the social skills activity involving the iGaze app, I used the SMART Board to display a large picture of a boss and a worker standing face to face, making eye contact, engaged in conversation. The picture served as an anchor to remind the students of pictures and videos they'd previously viewed that illustrated the concept of face-to-face interaction and the importance of establishing eye contact with others from time to time.
I'm hoping I will be able to access the YouTube videos from Dunedin so I can use them on the SMART Board. It will be interesting to see how this plays out! I'm also planning to take a closer look at Dunedin Multimedia's emotion x app for the iPad.
RELATED
Screenshot of iGaze for the iPad (Dunedin):

SOMEWHAT RELATED
The SMART Table at my school was updated today. I'm looking forward to using it for some group activities, now that it is back in working order and has new applications loaded up and ready to go!
If you are interested in learning more about technology related to students with special needs, be sure to check out Kate Ahern's blog, Teaching Learners with Multiple Special Needs.
Kate's post about the features of Unique Learning Systems.
Upcoming: more about tablets, an interview with the folks from Stantum, a social-skills game in progress, and an update on large displays in public spaces.
Apr 26, 2011
Multi-touch and Gesture Interaction News and Updates You Might Have Missed (Part I)
Over the past couple of months, I've come across many interesting links related to multi-touch and gesture interaction, but I haven't had time to devote a thoughtful post to each one. "Part I" is a nice collection of experimental, commercial, and non-commercial efforts by a variety of creative technologists, with a smattering of industry news that might be of interest to IMT readers.
Ideum's MT55 HD Multitouch Table 4/19/11
New MT55 HD Multitouch Table Now Shipping, Jim Spadaccini, Ideum Blog 4/11/11
Smithsonian American Art Museum to Open Education Center, Sara Beladi, NBC Washington News, 4/4/11 (Rumor has it that the Smithsonian American Art Museum might include touch and multi-touch displays in its plans for a new education center. The center was funded by an anonymous $8 million gift.)
Mac OS X 10.7 Lion: new multi-touch gestures, Dock integration for Exposé, Launchpad, Mission Control, AppleInsider, 4/14/11
Vectorform App featured in Royal Caribbean's Video Promotion: James Brolin, Dean Cain get hands-on with Vectorform app Alison Weber, Vectorform Blog, 3/3/11
3M Touch Systems's YouTube Channel
Social Mirror 3D Gestural Display, Now Using Kinect: Snibbe Interactive
Bill Buxton, Microsoft Research, 4/7/11 - Includes lots of pictures, links to videos, and more information on what might be the first touch screen. Also see Bill Buxton's companion website, Multi-Touch Systems that I have Known and Loved, updated on 3/21/11. Bill Buxton knows all (almost!)
How to make a Multitouch Biggie (Seth Sandler)
"The MTbiggie uses the "Front Diffused Illumination" multitouch technique, with ambient infrared light and a DIY infrared webcam. The MTbiggie is similar to the MTmini, but includes a projected image and infrared webcam (rather than a normal webcam)...The MTbiggie isn’t the most stable and robust setup, but it is the easiest to build. To see other methods of building more stable multitouch displays, view the full multitouch display list." -Seth Sandler
(Also check out NodeBeat, a multi-touch music/audio sequencer/generator app by Seth Sandler and Justin Windle)
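Camera-based rigs like the MTbiggie all come down to the same computer-vision step: threshold the infrared image and find the bright blobs where fingers touch the surface. Here is a pure-Python sketch of that blob-finding step (real trackers such as CCV do this on live IR video at frame rate; the tiny hard-coded "frame" below stands in for a camera image):

```python
from collections import deque

def find_blobs(frame, threshold=128):
    """Find connected bright regions (touch candidates) in a grayscale
    frame, returning the centroid of each blob."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # Flood-fill one connected component of bright pixels.
                pixels, queue = [], deque([(x, y)])
                seen[y][x] = True
                while queue:
                    px, py = queue.popleft()
                    pixels.append((px, py))
                    for nx, ny in ((px+1, py), (px-1, py),
                                   (px, py+1), (px, py-1)):
                        if 0 <= nx < w and 0 <= ny < h and \
                           frame[ny][nx] >= threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((nx, ny))
                # Centroid = the reported touch point.
                cx = sum(p[0] for p in pixels) / len(pixels)
                cy = sum(p[1] for p in pixels) / len(pixels)
                blobs.append((cx, cy))
    return blobs

# Two "fingers" on a tiny 6x4 IR frame:
frame = [[0,   0,   0, 0,   0, 0],
         [0, 200, 210, 0,   0, 0],
         [0, 190,   0, 0, 255, 0],
         [0,   0,   0, 0,   0, 0]]
print(find_blobs(frame))  # two centroids, one per finger
```

Tracking the centroids from frame to frame is what turns these blobs into the drag, pinch, and rotate gestures the rest of the stack consumes.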
Sparkon: Videos and links related to multi-touch and gesture-based applications
"The multitouch microscope brings new dimensions into teaching and research. Researchers at the Institute for Molecular Medicine Finland (FIMM) and Multitouch Ltd have created a hand and finger gesture controlled microscope. The method is a combination between two technologies: web-based virtual microscopy and a giant size multitouch display."
"The result is an entirely new way of performing microscopy: by touching a table- or wall-sized screen the user can navigate and zoom within a microscope sample in the same way as in a conventional microscope. Using the touch control it is possible to move from the natural size of the sample to a 1000-fold magnification, at which cells and even subcellular details can be seen." -multitouch.fi. Also see the MultiTouch website.
Big Size Multitouch Display Turned into a Microscope, Microscopy-News, 3/28/11
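The interaction described above, touching a wall-sized screen to pan and zoom from natural size to 1000-fold magnification, reduces to a coordinate mapping between screen pixels and sample space. A sketch of that math under assumed conventions (the function names and the top-left view-origin convention are mine, not taken from the FIMM system):

```python
def screen_to_sample(sx, sy, view_x, view_y, magnification):
    """Map a touch point on the display to coordinates in the sample.

    (view_x, view_y) is the sample-space position of the top-left of
    the current view; at 1000x magnification, each screen pixel spans
    1/1000 of a sample-space unit.
    """
    return (view_x + sx / magnification,
            view_y + sy / magnification)

def zoom_about(sx, sy, view_x, view_y, old_mag, new_mag):
    """Pinch-zoom that keeps the sample point under the fingers fixed:
    solve for the new view origin so the touched point stays at (sx, sy)."""
    sample_x, sample_y = screen_to_sample(sx, sy, view_x, view_y, old_mag)
    return (sample_x - sx / new_mag,
            sample_y - sy / new_mag)

print(screen_to_sample(500, 250, 10.0, 20.0, 1000))  # (10.5, 20.25)
```

Keeping the touched point anchored during zoom is what makes the giant screen feel like moving a physical slide under an objective rather than operating a scrollbar.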
IntuiLab, 4/13/11
"IntuiLab, a global leader in surface computing software applications, today announced support for the revolutionary Microsoft Kinect device across its full line of IntuiFace products and solutions including IntuiFace Presentation and IntuiFace Commerce...Microsoft Kinect brings distant gesture control to interactive solutions. These gesture controls allow users to interact with displayed digital assets from a distance at their own pace and path – for example, browsing through a large quantity of products in a store catalog or manipulating 3D models (such as a mobile phone) – all without having to actually touch the screen..." -IntuiLab (Take a look at the IntuiLab team- an interactive page!)
EON Reality Launches EON Interact - Interact with 3D Data by Using Your Bare Hands Animation xPress, 3/31/11
Official Kinect SDK to be Open Source, Josh Blake, Deconstructing the NUI, 4/18/11
(This bit of news excited me, but don't get your hopes up. If anyone knows what will happen with the Kinect SDK, please leave a comment.)
"Update 4/18 7:34pm: Mary Jo Foley picks up this story, but the Microsoft spokesperson she talked to denied that the Kinect SDK will be open source. As she notes, Microsoft has pulled 180’s before regarding Kinect. After spokespeople initially were hostile to the idea of Kinect hacking, Xbox executives later embraced the idea that people are using Kinect for non-gaming purposes on the PC. Let’s hope Microsoft stays open to this idea." -Josh Blake
New Channel 9 Video: Josh Blake on NaturalShow Multi-touch in WPF, Pete Brown, 10rem.net, 4/22/11
Kenrick Kin, Tom Miller, Bjoern Bollensdorff, Tony DeRose, Bjoern Hartmann, Maneesh Agrawala (Pixar Online Library)
Flight Race Game on 3DFeel lm3Labs, 4/18/11
JazzMutant Lemur Version 2: "The only multi-touch and modular controller for sequencers, synthesizers, virtual instruments, vjing and lights, now even better."
Harry van der Veen's Multitouch Blog (NUITEQ)
Stantum "Unlimited Multi-Touch" Latest News
At Immersive Labs, Ads Watch Who Looks at Them Amy Lee, Huffington Post, 4/26/11
Immersive Labs
Hard Rock Cafe International Using NextWindow Touch Screens: "Rock Wall Solo displays enhance music lovers' experience in Seattle, Dallas, Detroit and Berlin" 4/12/11 (Full press release pdf)
Music on Touch Screens (NextWindow)
Razorfish: Thoughts on MIX 11, James Ashley, Razorfish Blog, 4/20/11. Also see: Razorfish Lab's Prototypes
"The multitouch microscope brings new dimensions into teaching and research. Researchers at the Institute for Molecular Medicine Finland (FIMM) and Multitouch Ltd have created a hand and finger gesture controlled microscope. The method is a combination between two technologies: web-based virtual microscopy and a giant size multitouch display."
"The result is an entirely new way of performing microscopy: by touching a table- or wall-sized screen the user can navigate and zoom within a microscope sample in the same way as in a conventional microscope. Using the touch control it is possible to move from the natural size of the sample to a 1000-fold magnification, at which cells and even subcellular details can be seen." -multitouch.fi Also see the MultiTouch website.
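The zoom interaction described in the quote can be sketched in a few lines: a two-finger pinch scales the current magnification by the change in finger spacing, clamped between natural size (1x) and the 1000x ceiling mentioned above. This is a hypothetical illustration of the general technique, not FIMM's or MultiTouch Ltd's actual code:

```python
# Hypothetical sketch of pinch-to-zoom for a virtual microscope slide.
# Magnification is scaled by the ratio of current to starting finger
# spacing, then clamped to the 1x..1000x range described above.
import math

def finger_distance(p1, p2):
    """Euclidean distance between two touch points (x, y)."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def pinch_zoom(mag, d_start, d_now, min_mag=1.0, max_mag=1000.0):
    """Scale the current magnification by the change in finger spacing."""
    if d_start <= 0:
        return mag
    return min(max(mag * (d_now / d_start), min_mag), max_mag)
```

For example, spreading two fingers from 100 px to 200 px apart doubles the magnification, while pinching in at 1x simply stays clamped at natural size — the same behavior a conventional microscope's lowest objective would give you.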
Big Size Multitouch Display Turned into a Microscope, Microscopy-News, 3/28/11
Vectorform App featured in Royal Caribbean's Video Promotion: James Brolin, Dean Cain get hands-on with Vectorform app, Alison Weber, Vectorform Blog, 3/3/11
3M Touch Systems' YouTube Channel
Social Mirror 3D Gestural Display, Now Using Kinect: SnibbeInteractive
Posted by
Lynn Marentette
Labels:
3M touch,
ideum,
immersive labs,
Intuilab,
Josh Blake,
kinect,
lm3labs,
microscope,
multitouch,
MultitouchFi,
NextWindow,
nodebeat,
NUI,
razorfish,
vectorform
Nov 11, 2010
Interactive Touch-Screen Technology, Participatory Design, and "Getting It", Revisited
I've been planning on updating one of my popular posts, "Interactive Touch-Screen Technology, Participatory Design, and Getting It" for a while.
Here is a compromise - since much still rings true two years since I wrote it, the bulk of this post remains the same. I've updated a few sections with additional video examples of interactive touch-screen applications, good and bad, along with a few links and resources, located at the bottom of the post.
(The missing piece of information? An update about apps for the iPad and similar touch-screen tablets.)
Sit back and enjoy!


-Images: HP; Wired
There's been some discussion over the reasons why so many people don't understand touch screen, or "surface" computing, even though research in this area has been going on for years.
As the new owner of the HP TouchSmart, I know that I get it.
The research I've conducted in this area suggests that people will "get it" only if there is a strong commitment to developing touch-screen "surface" applications through a user-centered, participatory design process. In my view, this process should incorporate principles of ethnography and ensure that usability studies are conducted outside of the lab.
This approach was taken with Intel's Classmate PC. Intel has about 40 ethnographic researchers, and sent many of them to work with students and teachers in classrooms around the world. (A video regarding ethnographic research and the Intel Classmate project can be found near the end of this post.)


-Images: ClassmatePC
Where to start?
K-12 classrooms and media centers. Public libraries. Malls. Hospital lobbies and doctor's offices. Any waiting room. Staff lounges in medical centers, schools, and universities. Community festivities and events. Movie theater lobbies. Museums and other points of interest.
I believe we need to take a "touching is believing" approach.
Here are some thoughts:
When I try to explain my fascination with developing touch-screen interactive multimedia applications (interactive whiteboards, multi-touch displays and tables, and the like), many of my friends' and family members' eyes glaze over. This is particularly true for people I know who are forty-ish or older.
Even if you are younger, if you never saw the cool technology demonstrated in the movie Minority Report, or if you have limited experience with video games, or if you haven't come within touching distance of an interactive whiteboard, the concept might be difficult to understand.
The reality?
Even people who have the opportunity to use surface computing technology on large screens do not take full advantage of it. Multi-touch screens are often used as single-touch screens, and interactive whiteboards in classrooms often serve as expensive projector screens for teacher-controlled PowerPoint presentations.
Most importantly, there are few software developers who understand the surface computing approach, even with the popularity of the iPhone and iPod Touch. Most focus on traditional business-oriented or marketing applications, and have difficulty envisioning scenarios for surface computing. There is a need for a breath of fresh air!
Another factor is that not all people entrusted to market surface or touch screen computing fully understand it.

Despite a cool website showing off the goods, Microsoft's Surface multi-touch table has been slow to take off, limiting hope of bringing the price down to a level most families or schools could afford. (The picture above depicts an application for the Surface designed for health care professionals, not K-12 science education.)
Although you can't buy a Surface table for your family room, it is possible to buy a TouchSmart. HP's TouchSmart website is engaging and highlights some examples of touch-screen interaction, but most people don't seem to know about it. (Since this post was last written, there are many more touch and multi-touch options available to the public, such as the Dell, the iPad, etc.)
Unfortunately, you wouldn't have a clue that the HP TouchSmart exists browsing the aisles at Circuit City or Best Buy!
When I was shopping for my new TouchSmart, I noticed that from a distance, the TouchSmart looked just like the other larger flat-screen monitors filling up the aisles. The salespeople at both stores were not well-informed about the system. The only reason I knew about the new TouchSmart was my obsession with interactive multimedia touch-screen applications: designing them, developing them, studying them, reading about them, blogging about them.... ; }
More thoughts:
After studying HCI (Human-Computer Interaction), and relating this knowledge to what I know as a psychologist, my hunch is that the "Window Icon Mouse Pointing-device" (WIMP) and keyboard input mind-set is embedded in our brains, to a certain extent. Like driving a car, it is something automatic and expected. This is true for users AND developers. (Update - See The Post WIMP Explorers' Club: Update of the Updates for a review of a discussion among passionate post WIMP folks)
Think about it.
Suppose one day, you were told that you no longer were allowed to control your car by turning on the ignition, steering the wheel, or using your feet to accelerate, slow down, or stop the car! Instead, you needed to learn a new navigation, integration, and control system that involved waving your hands about and perhaps speaking a few commands.
For new drivers who'd never seen a car before, this new system would be user-friendly and intuitive. Perhaps it would be quite easy for 16-year-old kids to wrap their heads around this concept. For most of us, no. Imagine the disasters we would see on our streets and highways!
When we think about how newer technologies are introduced to people, we should keep this in mind.
In my mind, spreading the word about surface computing is not an "if you build it, they will come" phenomenon, like the iPhone. We can't ignore the broader picture.
From my middle-aged woman's vantage point, I believe it is important that those involved with studying, developing, or marketing surface computing applications realize that many of us simply have no point of reference other than our experiences with ATMs, airline kiosks, supermarket self-serve lanes, and the like. (The video clip at the very end of this post provides a good example of touch-screen technology gone wrong.) -UPDATE: additional videos were added to this post.
Be aware that there are substantial numbers of people who might benefit from surface computing who prefer to avoid the ATMs, airline kiosks, and self-serve grocery shopping.
Realize that the collective experience with technology, in many cases, has not been too pretty. Many people have had such user-unfriendly experiences with productivity applications, forced upon them by their employers, that any interest or desire to explore emerging technologies has been zapped.
My own exposure to interactive "surface" related technology was somewhat accidental.
A few years ago, a huge box was deposited into the room I worked in a couple of days a week as a school psychologist at a middle school. After a week or so, I became curious, and found out that it was a SmartBoard. Until then (2002!), I did not know that interactive whiteboards existed.
The box remained unopened in the room for the entire school year, but no worry. I played with the only other SmartBoard in the school, and found a couple at the high school where I also worked. I hunted for all of the applications and interactive websites that I could find, and tried them out. That is when I was hooked. I could see all kinds of possibilities for interactive, engaging subject-area learning activities. I could see the SmartBoard's potential for music and art classes. With my own eyes, I saw how the SmartBoard engaged students with special needs in counseling activities. (By the way, if you are working with middle school students, PBS Kids' It's My Life website activities work great on an interactive whiteboard.)
A few years have passed, and reflecting on all of my fun experiences with interactive whiteboards, with and without students, I now understand that many teachers still have had limited exposure to this technology.
This school year, many teachers are finding themselves teaching in classrooms recently outfitted with interactive whiteboards, scrambling along with educational technology staff development specialists, to figure out how it works best with various groups of students, and what sort of changes need to be made regarding instructional practice.
For the very first time, interactive whiteboards were installed in two classrooms at one of the schools I work at. One of the teachers I know thanked me for telling her about interactive whiteboards and sharing my resources and links.
If I hadn't let her know about this technology, she wouldn't have volunteered to have one installed in her classroom. It has transformed the way she teaches special needs students.
In the few months that she's used the whiteboard, I can see how much it has transformed the way the students learn. They are attentive, more communicative, and engaged. The students don't spend the whole day with the whiteboard - the interactive learning activities are woven into lessons at various times of the day, representing true technology integration.
Now let's see what happens when all-in-one touch-screen PCs are unleashed in our schools!
UPDATE: Take a look at a post I wrote for Innovative Interactivity just after SMART Technologies acquired NextWindow - the post describes in detail how interactive whiteboards are transforming learning and teaching in a program for students with special needs.
SMART Technologies Acquires NextWindow: A "smart window" to the world
There are some interesting changes going on at the intersection of HCI and educational technology research. I participated in a workshop at CHI 2010 last April and was impressed by what is going on in this area, around the globe: Next Generation of HCI and Education
Value of ethnographic research:
Ethnographic Research Informed Intel's Classmate PC
"Intel looked closely at how students collaborate and move around in classroom environments. The new tablet feature was implemented so that the device would be more conducive to what Intel calls “micromobility”. Intel wants students to be able to carry around Classmate PCs in much the same way that they currently carry around paper and pencil." -via Putting People First and Ars Technica
The video below is from Intel's YouTube Channel. Information about Intel's approach to ethnographic research in classrooms during the development of the Classmate PC is highlighted. This approach uses participatory design and allows the set of applications developed for the Classmate PC to reflect the needs of local students and teachers. Schools from many different countries were included in this study.
FYI: TOUCH SCREEN DISPLAYS: NEED FOR IMPROVEMENT!
Touch Screen Coke Machine at the Mall: 90 Seconds to get a Coke
User-Unfriendly Interactive Display in the Rain (Ballantyne Village Shopping Center)
User-Unfriendly Information Kiosk Interactive Map
I encountered this puzzling and frustrating interactive directory/map at the Cleveland Clinic. When I went to visit a relative at the hospital a year or so later, the map was no longer there.
BETTER EXAMPLES OF INTERACTIVE SCREENS:


Samsung's new Omnia SGH-i900 was re-created in a much larger size, using lm3labs' Ubiq'window touchless technology. For more about lm3labs, including several video clips, take a look at one of my previous posts: Lm3Labs, Nicolas Loeillot, and Multimedia Interaction
Midwife Toad App on a Microsoft Surface, Discovery Place Science Center
TellTable: Digital Storytelling on the Surface: Microsoft Research, UK
DECEMBER 31, 2009 -Interactive Soda Machine for Fun
The interactive screen on the Coke machine attracted the attention of this young child. He loved spinning the image of the bottle. So did the dad! He said, "I'd like something like this for my home!" I told him about the HP TouchSmart - neither the dad nor the mom knew that there were affordable all-in-one touch screens available, but they knew about SMARTboards, because their children's classrooms had them. Note: No one from this family actually purchased a soft drink. I was hoping to time how long it would take them to do so!
If you have plenty of time, take a look at my Post WIMP Explorers' Club YouTube playlist.
"Natural user interfaces, gesture interaction, multi-touch, natural interaction, post WIMP examples and more..."
Some resources:
Multimedia, Multitouch, Gesture, and Interaction Resources (Mega-list, work in progress)
lm3labs (catchyoo, ubiq'window)
Sparkon (See members links and multi-touch projects)
(More information and resources can be found by doing a "multitouch" or related search on this blog or The World Is My Interactive Interface.)
FYI: I visited the Ballantyne Village shopping center a couple of months ago to follow up on the interactive displays, including the one I tried to use while it was raining. The shopping center changed ownership, and the displays were replaced by the old-fashioned kind, pictured below:

