From the DISPLAX press kit materials: "DISPLAX™ Multitouch Technology turns any non-conductive material into an interactive multi-touch surface. Based on projected capacitive technology, DISPLAX™ Multitouch Technology has been developed using a transparent thinner-than-paper polymer film which can be applied to a variety of flat and curved surfaces including glass, plastic and wood."
"DISPLAX™ Multitouch Technology was primarily developed to enable touch screen integration for LCD and projection displays. Application of the technology ranges from converting entire store windows into a touch surface, creating information screens, or developing innovative user interfaces. Potential customers are found in retail and diverse industries such as telecoms, museums, property, broadcast, pharma or finance. The technology will also be available for LCD manufacturers, audiovisual integrators or gaming platforms."
"DISPLAX works with partners and directly with customers to deliver multi-touch rich-media applications, enabling people to take full advantage of the latest developments. DISPLAX Multitouch Technology will be available in the market with several embedded business applications at no extra cost, designed especially for the kind of installations that interactive technology companies work on. This Apps Pack will allow customers to display photo and video streams, provide users access to Google Maps and social networks, integrate news streams via RSS and play multitouch games. More applications will be available later in 2010."
Is Amazon Building a SuperKindle? Nick Bilton, NYT, 2/3/2010
"Unlike traditional capacitive sensors, our patent-pending system can detect any object — not just a finger — and can determine how much pressure is being applied to every point on a sensor simultaneously. IFSR sensors are natively multitouch, use less power than capacitive sensors, and are much less expensive to produce, making them a highly disruptive technology with widespread market applications."
I would have to take a week off from my regular day job to follow up on the iPad chatter, which is not possible. In the meantime, here are a few links for those of you who want to be on top of the iPad brouhaha:
Apple Info:
iPad Technical Specifications - Apple
iPad Design - Apple (you can access the iPad video from the iPad Design page)
iPad Features - Apple
Steve Wozniak Talks iPad (Video) via TechCrunch
"I don't see it having a role in video editing or music production" - Steve Wozniak
Hopefully the iPad would support basic video editing and music production, since I use digital video storytelling/social stories and music in my work with students who have autism or other complex special needs. Some of my students help to create content, and despite their limitations, can understand and use storyboarding in this process.
Comment: There are quite a few jokes circulating around the web about the name of this new product. For the males (I'm assuming) who named the iPad, I'll give you a hint. Half of the population knows that "pad" is what we call an important feminine hygiene product.
If you are a talented interactive web designer/developer, game designer/developer, traditional programmer with a creative bent, or someone who is thinking about working with technology in the future as a programmer or designer, I urge you to consider designing/developing multi-touch applications in the near future.
In my opinion, there will be a need for multi-touch web applications as well as for multi-touch education and collaboration applications for the SMART Table, Microsoft's Surface, multi-touch tablets like the rumored iTablet from Apple, and the multi-touch laptops and all-in-ones (Dell, HP, etc.).
Below are direct links to some of my blog posts related to multi-touch applications and screens. If you are fairly new to multi-touch, I'm sure that looking through some of my blog posts will be helpful. All of the posts have links to resources, and most have photos and video clips of multi-touch in action.
If you are new to this blog, I have a great deal of information, links, photos, and video clips of various multi-touch screens and applications. The best way to find them is to enter a keyword in this blog's search box: multitouch, touch screen, gesture, multi-touch, etc.
Here are some links: Do you have an HP TouchSmart, Dell Studio One or NextWindow touch-screen? NUITEQ's Snowflake Suite upgrade provides a multi-touch plug-in http://bit.ly/5tdlhc
The following blog post has a video clip that shows someone from Adobe painting with a multi-touch application in development: More Multi-Touch!: Rumor of the mobile apple iTablet; AdobeXD & Multitouch; 10-finger Mobile Multitouch: http://bit.ly/4S9Upm
Albrecht Schmidt posted a link to the following video on his User Interface Engineering blog -
watch how the multi-touch surface is built from ice blocks:
Jim Spadaccini, of Ideum, shared information about his company's product, Gestureworks, highlighting how it provides better multi-touch and gesture support than Adobe AIR 2 and Flash Player 10.1. Gestureworks supports multiple-point drag, rotate, and scale at the same time. In the video, the application is demonstrated on an HP TouchSmart 600 and a 3M multitouch screen.
Adobe AIR 2 and Flash Player 10.1 vs Gestureworks 1.0: A direct comparison of multitouch and gesture support
"A direct comparison between the built-in support for multitouch found in Adobe Flash Player 10.1 beta / Adobe AIR 2 and that of the Gestureworks multitouch framework for Flash. More about this comparison can be found on the Gestureworks website (http://www.gestureworks.com) and the Ideum website (http://www.ideum.com). There is a blog post with more about this comparison and links to all of the example files at: www.ideum.com/2010/01/true-multitouch-with-adobe-flash/"
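The simultaneous drag, rotate, and scale that frameworks like Gestureworks track for two-finger input comes down to simple geometry on pairs of touch points. Below is a minimal sketch in Python (not the Gestureworks API, which is ActionScript-based; all names here are illustrative):

```python
import math

def two_finger_transform(p1_old, p2_old, p1_new, p2_new):
    """Derive drag, rotate, and scale from two touch points
    sampled on two consecutive frames."""
    def centroid(a, b):
        return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    c_old, c_new = centroid(p1_old, p2_old), centroid(p1_new, p2_new)
    drag = (c_new[0] - c_old[0], c_new[1] - c_old[1])       # translation of the pair
    rotate = angle(p1_new, p2_new) - angle(p1_old, p2_old)  # rotation, in radians
    scale = dist(p1_new, p2_new) / dist(p1_old, p2_old)     # pinch/spread factor
    return drag, rotate, scale

# One finger stays at the origin; the other swings from (2, 0) to (0, 4):
drag, rot, scale = two_finger_transform((0, 0), (2, 0), (0, 0), (0, 4))
```

Because all three quantities are computed from the same pair of points, a single frame of input can drive drag, rotate, and scale at the same time, which is exactly what the video demonstrates.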
The SMART Multitouch Application Contest was recently announced. The SMART Table is a multitouch, multiuser interactive learning center designed for early education which supports hands-on collaborative play and learning activities. I think that this would be a great project for an after-school technology club at a high school.
If you haven't seen a SMART Table, the following videos will give you an idea of what they are all about:
The SMART Table in a multi-age Montessori classroom at an elementary school:
Video from 2008 about the SMART Table:
Below is information about the contest prizes from the SMART Technologies website:
Grand prize – One SMART Table, valued at US$6,499. Plus, the winning application will be promoted on the SMART website for one year, providing you with the opportunity to establish a revenue stream if users purchase your application.
First runner-up – The second-place application will be promoted on the SMART website for one year.
Second runner-up – The third-place application will be promoted on the SMART website for six months.
Here you can review the contest guidelines, terms and conditions, and FAQs, and find out how to enter the contest. There are two categories: the Commercial category is for professional and amateur developers, and the Academic category is for students and educators affiliated with an educational institution.
I don't have much information about Mediateam. I think it might be MediaTeam Oulu, but I'm not sure. MediaTeam Oulu has quite a bit of research that focuses on ubiquitous computing.
This demo was created by the Emerging Experiences team at Razorfish. Here's the video description from Vimeo:
"Customers are being faced with increasingly complex buying decisions, especially when it comes to technology and services. As a result, increased pressure is being placed on store associates to provide knowledgeable service to customers. Our Emerging Experiences team used this opportunity to develop a solution to demonstrate how an immersive interactive experience can assist customers and store associates with complex buying decisions in a retail setting."
Comment: We've graduated from 2D multi-touch manipulation of photos via pan-zoom-rotate-resize-drag to 3D multi-touch manipulation of "objects".
So? I'm expecting much more.
There is much room for creative growth in this area!
There has been a flurry of work in the computer music technology world that relates to what has been going on with interactive display technology, multi-touch & gesture interaction. I came across a link to the GDIF website when I was searching for information about interactive music and the use of multi-touch technologies for a future blog post.
So what is GDIF? Gesture Description Interchange Format.
"The Gesture Description Interchange Format (GDIF) is being developed as a tool for streaming and storing data of music-related movements, actions, and gestures. Current general purpose formats developed within the motion capture industry and biomechanical community (e.g. C3D) focus mainly on describing low-level motion of body joints. We are more interested in describing gesture qualities, performer-instrument relationships, and movement-sound relationships in a coherent and consistent way. A common format will simplify working with different software, platforms and devices, and allow for sharing data between institutions." (The Jamoma environment is used to prototype GDIF.)
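The layered idea behind GDIF, where one frame of sensor data can be described both at a raw device level and at a higher gesture-quality level, can be illustrated with a toy sketch. The OSC-style address paths below are my own illustration, not the published GDIF namespace:

```python
def gdif_messages(accel_xyz, quantity_of_motion):
    """Format one frame of movement data as OSC-style (address, arguments)
    pairs, at two levels of description. Addresses are hypothetical."""
    x, y, z = accel_xyz
    return [
        # Raw layer: numbers straight off the device
        ("/gdif/raw/accelerometer/xyz", [x, y, z]),
        # Cooked layer: a derived, higher-level movement descriptor
        ("/gdif/cooked/quantity-of-motion", [quantity_of_motion]),
    ]

frames = gdif_messages((0.1, -0.2, 9.8), 0.35)
```

The point of a shared format like this is that a visualization tool, an analysis package, and a synthesis engine can all subscribe to the layer they care about without needing to know how the other layers were produced.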
Alexander Refsum Jensenius is the man who initiated the GDIF project. He's written a variety of articles about music, gestures, movement, and emerging technologies.
Godoy, R. I., E. Haga, and A. R. Jensenius (2006b). Playing 'air instruments': Mimicry of sound-producing gestures by novices and experts. In S. Gibet, N. Courty, and J.-F. Kamp (Eds.), Gesture in Human-Computer Interaction and Simulation, GW 2005, Volume LNAI 3881, pp. 256–267. Berlin: Springer-Verlag.
Jensenius, A. R., T. Kvifte, and R. I. Godoy (2006). Towards a gesture description interchange format. In N. Schnell, F. Bevilacqua, M. Lyons, and A. Tanaka (Eds.), NIME '06: Proceedings of the 2006 International Conference on New Interfaces for Musical Expression, Paris, pp. 176–179. Paris: IRCAM - Centre Pompidou.
Kvifte, T. and A. R. Jensenius (2006). Towards a coherent terminology and model of instrument description and design. In N. Schnell, F. Bevilacqua, M. Lyons, and A. Tanaka (Eds.), Proceedings of New Interfaces for Musical Expression, NIME 06, IRCAM - Centre Pompidou, Paris, France, June 4-8, pp. 220–225. Paris: IRCAM - Centre Pompidou. [PDF]
Marshall,M. T., N. Peters, A. R. Jensenius, J. Boissinot, M. M. Wanderley, and J. Braasch (2006). On the development of a system for gesture control of spatialization. In Proceedings of the 2006 International Computer Music Conference, 6-11 November, New Orleans. [PDF]
"Sonic Interaction Design is the exploitation of sound as one of the principal channels conveying information, meaning, and aesthetic/emotional qualities in interactive contexts."
SID Action has four working groups:
WG1: Perceptual, cognitive, and emotional study of sonic interactions
"SoundHack was my main thing for a long time, and I poured a lot of effort into it. It was the place I put my ideas. I did have something of a mission with SoundHack. I wanted to take some computer music techniques that were only used in academia, and get them out there so that all types of musicians could use them." - Tom Erbe
SoundHack Spectral Shapers
"Csound is a sound and music synthesis system, providing facilities for composition and performance over a wide range of platforms. It is not restricted to any style of music, having been used for many years in the creation of classical, pop, techno, ambient, experimental, and (of course) computer music, as well as music for film and television."-Csound on Sourceforge
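Csound's core idea, an "orchestra" of unit generators driven by a "score" of notes, can be mimicked in a few lines. Here is a bare-bones sine-oscillator analogy in Python rather than actual Csound code; the function names and the two-note "score" are just examples:

```python
import math

SR = 44100  # sample rate, in Hz

def oscil(amp, freq, dur):
    """A minimal sine oscillator, loosely analogous to a Csound opcode:
    returns dur seconds of a sine wave at the given amplitude and frequency."""
    n = int(SR * dur)
    return [amp * math.sin(2 * math.pi * freq * i / SR) for i in range(n)]

def mix(*voices):
    """Sum several equal-length note lists sample by sample."""
    return [sum(samples) for samples in zip(*voices)]

# A tiny two-note "score": A4 and E5 sounding together for half a second
out = mix(oscil(0.5, 440.0, 0.5), oscil(0.3, 659.25, 0.5))
```

Real Csound separates the instrument definitions (orchestra) from the note events (score), which is what lets one synthesis system serve classical, techno, and film work alike.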
"For me, music is a medium through which the inner spiritual essence of all things is revealed and shared. Compositionally, I am interested in extending the voice of the traditional performer through technological means to produce a music which connects with the past, lives in the present and speaks to the future. Educationally, I am interested in helping students see technology as the most powerful instrument for the exploration, discovery, and realization of their essential musical nature - their inner voice."
Upcoming post about innovations at Stantum:
I'll be focusing on Stantum and its music and media technologies division, JazzMutant, in my next post. It is interesting to note that the co-founders of Stantum, Guillaume Largillier and Pascal Joguet, have a background in electronic music. Guillaume specializes in multi-modal user interfaces and human-machine interface technologies. Pascal has a background in physics and electronics, and has worked as a sound engineer.
My music back-story:
The very first computer-related course I took was Computer Music Technology (in 2003), since I play an electronic midi/digital keyboard and previously tried to teach myself a few things, long before computers and related technologies were "easy" for me to figure out. During the mid-90's, I tried my hand at Dr. Richard Boulanger's Csound, and tried to acquaint myself with tools from Cycling '74, but I gave up. Not long after that, I bought the first version of MOTU's Freestyle, which worked nicely on my Performa 600, hooked up to my Ensoniq 32, after the nice people at MOTU sent me an update that was compatible with my set-up. Later on, I came across Tom Erbe's SoundHack freeware.
Various multi-touch and natural-user interface/interaction companies have been collaborating, and results are promising.
Here is the result of a collaboration between 3M and NUITEQ, the company based in Sweden:
3M Multi-Touch & NUITEQ Suite Software Video
In this video, you'll see SnowFlake software used in a variety of contexts, such as education, architecture, health, financial consultation, product browsing, games, playing with special effects, and more.
What a difference since 2007, when I shared what I was doing in my graduate classes on this blog!
This is a company I've been meaning to write about! Here is the plug from the IntuiLab website:
"Founded in 2002 and headquartered in Toulouse, France, IntuiLab is a leader in the design and development of surface computing-based applications. Through IntuiFace, the company’s portable, scalable and extensible software surface computing platform, IntuiLab delivers and deploys applications that bring tangible returns on investment to its clients by providing their customers and users with a more natural, immersive and memorable interactive experience. IntuiLab is a Microsoft and Adobe partner and has clients in a wide variety of industries such as retail, food and health, banking, aerospace and defense, telecoms and hospitality."
"We are able to deliver such benefits to our customers thanks to a unique blend of skills (our multidisciplinary IntuiTeam), technology (the IntuiFace Surface Computing Software Platform), process (the IntuiSign design process) and a wide range of partners."
Amazon Multi-touch Client on the IntuiFace Platform
"This video shows an example of how to access an online retail catalogue (such as Amazon) from a Surface Computer rich client, and create, browse and merge lists of queried or selected items in a natural way." - Intuilabs
Multi-user Web Browsing on a Windows 7-based 19" 3M Multi-touch
Pictures from the Intuilab website:
If you take a look at Intuilab's "the team" page, you'll find that they look like...graduate students! They probably are, or were, given the size of their list of published papers. This tells me that they must have poured their hearts, minds, and souls into the business!
According to Priya Ganapati's recent article in WIRED, only 3% of all PCs and notebooks have interactive touchscreens. More are coming to market, such as Sony's Vaio L Touch HD PC, Dell's all-in-one, and Lenovo's tablet PC and ThinkPad laptop. HP came out with the TouchSmart PC and touch-enabled laptops, and will be adding updated versions soon.
This is a great opportunity for developers interested in touch, multi-touch, and gesture interaction applications. I think there will be even more opportunity for web developers to create websites that are touch-enabled, or at least optimized for touch screen interaction.
Might the multi-touch web be right around the corner?
At the moment, there is some confusion about what designers should consider when developing applications. There is no standard system of touch or gesture interaction, and researchers are still very busy figuring things out. Some companies have rushed out and patented gestures, which in my opinion, is like patenting how we breathe. But that is another story.
If you are a designer or developer, you might be interested in the Touch First Microsoft Surface Developer Challenge. It is a chance to win your very own Microsoft Surface! Hurry, the deadline is October 12th.
If you are curious and would like to dig deeper into this topic, take some time to look at some of my blog-posts about multi-touch, touch screen news, innovations, and related musings:
The Medion x9613 is a Windows 7, multi-touch all-in-one that has all of the features needed for playing games, working collaboratively with a colleague or classmate, watching HD video content, and more.
The following YouTube video is from Gizmodo, via Engadget.
According to Cali Lewis, of GeekBrief TV, this all-in-one will be available in the US. (Cali also previews other interesting gadgets and tech on the GeekBrief TV clip below.)
I haven't set eyes on this one yet. If anyone has played with a Medion X9613 a bit, or even used it for work, please leave a comment! How do YOU use your "All-in-One?"
I've been following Yvonne's research for some time now, and I thought I'd take this opportunity to highlight her work. (I've provided additional information for those of you who are new to this blog.)
Background (from my perspective)
Back in the early 1990's, I ran a "paired learning" study skills/social skills group three mornings a week in a computer lab, for a crew of fourth grade students referred to me (the school psychologist) for support for behavior, learning, and social skills problems. Since many of the students needed to work on social skills such as waiting turns, cooperating, and working nicely with others, I decided to pair the students up, two to a computer.
Some of the applications were designed for two students taking turns, but other applications were designed for only one person. For those applications, the students were encouraged to forget about the "rules" and help each other out; in other words, one student would control the mouse, and the other would be the "mouse-seat driver". I spent some time teaching "mouse-sharing" skills. One of the students tried to grab the mouse when it wasn't his turn, and his partner whacked him over the head with it!
Moving on...
When I first touched an interactive whiteboard, it was 2002 or 2003. It was a SmartBoard, and I was fascinated by the possibilities that it held. Since I used it as a tool for group counseling and study skills in my role as a school psychologist, I couldn't help but hope that a new version would come out that would allow at least two students to interact with the screen at the same time.
Better yet, wouldn't it be cool to have the group table offer the same sort of functionality? I realize when people envision a learning environment, they are likely to think of students seated in single desks. While this is the case in many classrooms, many teachers manage to focus on group work, such as group projects, collaborative/cooperative learning lessons, or science experiments...even if they DON'T have tables. They push the desks together to form a "table".
In my life as a school psychologist, particularly during my days at an elementary school, I spent a good part of my day sitting at a table, with 3-6 students, for the purpose of group counseling, social skills training, and so forth. (The speech and language therapist worked with groups of children around a table, too.) You can imagine that something like the SMART Table or the Surface would have been a godsend, provided that it came with a wealth of therapeutic activities and templates that could be easily customized to meet the needs of the students!
Interactive displays and whiteboards are now common in our schools and workplaces. There are a few classrooms that have interactive tables, and these tables are also found in museums and visitor centers. But there is much work yet to be done, in terms of content development, hardware design, interaction design, and research.
No one discipline "owns" this problem-space. Better said, no one owns this "solution-space"!
Yvonne Rogers has been a key player in this arena. She is focused, but at the same time, takes a big-picture, multi-faceted point of view, something that I find lacking in much of the computer science research that crosses my path. Her approach is consistent with that of other forward-minded thinkers in the field.
When she was a professor at Indiana University, Yvonne's team produced a short video outlining the types of problems groups of people encounter when they share information and collaborate on projects.
What I like about this video is that it works towards solutions. Although the video is from 2006, it still rings true.
If you have ever worked with students in small groups forced to share one mouse and computer screen, some of the footage will bring a smile to your face.
More about Dr. Rogers and her colleagues:
Dr. Rogers is the lead researcher of the SharedIT Project, an interdisciplinary endeavor exploring how new technologies, such as multi-touch tables, gesture-based systems, interactive wall displays, and interactive tangibles, can support collaborative activities among small groups of people in a range of situations and settings.
Some of the people affiliated with Dr. Rogers and the Open University Pervasive Interaction Lab are highlighted in the video clip below. The video was filmed at an event held on June 23, 2009 at the Science Museum's Dana Centre (UK), in collaboration with the London Knowledge Lab. The video highlights the work of researchers, artists, educators, and technologists, all exploring collaborative tangible, interactive technologies and surfaces. Some of the questions asked: What sort of new interfaces can support "collective creativity"? How can these technologies support interactions between people who are in different places around the world?
Surface Tension: Interactives and Workshops (June 23, 2009)
Doug Clow provides a good overview of some of the SharedIT projects in his April 2009 post, "Low-hanging fruit: interactive tables for collaborative learning". If you are curious and want to dig deeper, below is a link to the SharedIT publications web page, along with an assortment of articles related to Yvonne Rogers' work.
Rick, J, and Rogers, Y. (2008) From DigiQuilt to DigiTile: Adapting Educational Technology to a Multi-Touch Table. IEEE Tabletops and Interactive Surfaces. 79-86.
Rogers, Y., Lim, Y., Hazlewood, W. R. and Marshall, P. (2008) Equal Opportunities: Do Shareable Interfaces Promote More Group Participation than Single User Displays? To appear in Human-Computer Interaction.
Rogers, Y. (2006) Moving on from Weiser's vision of calm computing: engaging UbiComp experiences. In: P. Dourish and A. Friday (Eds.) Ubicomp 2006 Proceedings, LNCS 4206, pp. 404-421, Springer-Verlag. PDF version
Rick, J., Harris, A., Marshall, P., Fleck, R., Yuill, N. and Rogers, Y. (2009) Children designing together on a multi-touch tabletop: An analysis of spatial orientation and user interactions. Proceedings of Interaction Design and Children (IDC '09), 106-114
Harris, A., Rick, J., Bonnett, V., Yuill, N., Fleck, R., Marshall, P. and Rogers, Y. (2009) Around the Table: Are multiple-touch surfaces better than single-touch for children's collaborative interactions? Proceedings of CSCL 2009
Rick, J., Rogers, Y., Haig, C. and Yuill, N. (2009) Learning by Doing with Shareable Interfaces. Children, Youth & Environments, In Press.
Marshall, P., Fleck, R., Harris, A. Rick, J., Hornecker, E., Rogers, Y., Yuill, N. and Dalton, N. S. (2009) Fighting for Control: Children's Embodied Interactions When Using Physical and Digital Representations. Proceedings of CHI'09, 2149-2152. PDF version
Rogers, Y. and Price, S. (2009) How mobile technologies are changing the way children learn. In A. Druin, (Ed.) On the Move: Children, Learning and Technology. Elsevier. 3-22. PDF version
Rogers, Y. (2008) A comparison of how animation has been used to support formal, informal and playful learning. Learning with Animation, edited by R. Lowe and W. Schnotz, CUP, Cambridge. 286-303.
Rogers, Y. and Price, S. (2008) The role of mobile devices in facilitating collaborative inquiry in situ. Research and Practice in Technology Enhanced Learning. 3 (3), 209-229. PDF version
Rogers, Y. (2008) Using External Visualizations to Extend and Integrate Learning in Mobile and Classroom Settings. In J. Gilbert (ed.), Visualization: Theory and Practice in Science Education. CUP. 89-102.
Rogers, Y. and Muller, H. (2006) A framework for designing sensor-based interactions to promote exploration and reflection. International Journal of Human-Computer Studies, 64 (1), 1-15. PDF version
Brignull, H., Izadi, S., Fitzpatrick, G., Rogers, Y. and Rodden, T. (2004) The Introduction of a Shared Interactive Surface into a Communal Space. In Proc. of CSCW 2004, Conference on Computer Supported Cooperative Work, November 6-10, 2004, Chicago, USA: ACM Press. 49-58. PDF version
Randell, C., Phelps, T. and Rogers, Y. (2003) Ambient Wood: Demonstration of a digitally enhanced field trip for school children. In Adjunct Proc. UbiComp 2003, 100-104. PDF version
Brignull, H. and Rogers, Y (2003) Enticing people to interact with large public displays in public spaces. In Proceedings of INTERACT'03, Zurich, 17-24. PDF version
Rogers, Y., Brignull, H. and Scaife, M. (2002) Designing Dynamic Interactive Visualisations to Support Collaboration and Cognition. In First International Symposium on Collaborative Information Visualization Environments, IV 2002, London, July 10-12, 2002, IEEE, 39-50. PDF version
Of course, there are many other researchers who are exploring collaborative technologies, and I've featured a good number on this blog. I'll be sure to feature more, from a range of disciplines, in future posts.