Nov 23, 2008

Need for Multi-touch, Multi-user Interactive Multimedia Applications, and the Miracle Question

Last week I received a few comments on my post, "Multi-touch and Flash: Links to Resources; Revisiting Jeff Han's Presentation". I started to respond to a thoughtful comment by Spencer, of TeacherLED, and I wanted to share it as a post:

Spencer is a teacher and instructional technology consultant who develops web-based interactive applications for use on interactive whiteboards (IWBs). He's interested in multi-touch applications for education and has some good insights into what HCI researchers call the "problem space".

Here are Spencer's comments:

..."I agree that Flash could have a very important role to play here. I chose Flash as my development tool because it allows quick development of ideas and then easy distribution of the product. The importance of this is that it allows people who have a profession other than software developer to create software with the insight of their main role. In my case, as a teacher, I can identify things I wish I had and then make them. Often I find that other teachers had the same wish and they then appreciate the product."

"The unfortunate thing with multi-touch is that it is far from the technology most of us outside the industry/research areas have to work with. An app created in Flash for single touch follows the mouse and pointer method so it can be developed easily. When done it can be easily tested on a standard IWB for the feel (which is often surprisingly different on the IWB compared to using a mouse)."

"The Flash developer community has a very experimental and creative characteristic and I’m sure would be a great driving force for multi-touch but first there needs to come a reason for more people to have some sort of multi-touch display for general use, beyond facilitating experiments. When the various operating systems support it and have the apps to make having a supporting display viable then the experimentation and ideas will really flow."

"In addition, the display makers need to recognize the benefits of Flash and ensure they address them. At the moment it seems to be too often an afterthought if considered at all. SDKs and APIs make no reference to Flash or they remain indefinitely in beta for older versions of Flash only."

"It is a pity that all of this will take time. The more time that passes the more single touch IWBs are bought and installed which will delay the uptake of the eventual multi-touch ones. Meanwhile children continue to have to keep reminding themselves that they can only touch the board in one place when it is clear that every bit of their brain is telling them to interact with the board in a much more natural multi-touch way."

My response to Spencer's comment:

Spencer,

You make good points regarding the barriers to getting the multi-touch approach adopted by the "mainstream". You're right about what the commercial display makers need to do. If they want to market displays that will have more appeal, they must think about the different sorts of applications and programming environments that the displays should support.

Display makers also need to think more about the bigger picture - in what sort of environments will the displays be located? Indoors? Outdoors? Near bright sunlight? What about people with disabilities, children, or the elderly?

I can see that in the future, multi-touch displays and other devices would operate within an embedded systems environment and support mobile computing activities as well. There are existing examples of this concept, of course, but there is much room for creative improvement. An embedded systems approach is complex, and would need to handle input from sensors, support multi-modal signal processing, and also provide users with a range of connectivity modes, including RFID. (Data management and storage needs would have to be addressed, along with privacy and security concerns.)

Most importantly, in my opinion, these systems would need to have the flexibility required to support human activities and interactions that have not yet emerged! Certainly this will need to take a multidisciplinary approach.

There are many unanswered questions....How does this fit in with mobile computing and "cloud" computing? What sort of middleware needs to be developed?

Even if we don't have solutions to the bigger problems, there are many smaller problems that I think could be somewhat easily solved.

As you mentioned, many applications that are designed for single-touch screens don't fully support the way people identify, select, and move items around the screen. Although educators access websites every day for use on interactive whiteboards, they are hungry for more. There are not enough websites that are optimized for single-touch interaction, or touch-screen interaction in 3-D "space".

Teachers who are successful users of interactive whiteboards know exactly what we are talking about. They spend quite a bit of time searching for new on-line resources they can use with their students. They know how much their students want to interact with the screen at the same time, and would be so excited to have that capability at their fingertips!

Optimizing websites for touch-screen applications is possible, but the idea hasn't occurred to most web developers. Their jobs don't require it, so there is little incentive. Google is developing FlareBrowser, which can support multi-touch interactions, but according to information on the website, it runs only on Mac OS X Leopard (10.5). The present version is bare-bones, and I haven't yet tested FlareBrowser.

I think that another barrier to getting multi-touch off the ground is that the people who might have the knack for multi-touch application development simply don't know it! We've mentioned that Flash developers have the potential to create good multi-touch applications. I also think that game developers and designers could make good contributions to the multi-touch movement. Just think about what thought goes into programming interactions and event handling for 3-D web-based multi-player games!

Yet another barrier is that people who work in lower-tech fields could benefit from collaborative multi-touch applications, but they don't know it, either. The research I've reviewed tells me that multi-touch applications can support a wide range of human endeavors: work, creativity, data analysis, education, collaboration, planning, and so forth.

What is missing is the input of potential end users from a variety of fields. No specific discipline "owns" multi-touch, so it is hard to figure out how we can make this happen.

Could we set up multi-touch technology playgrounds at professional and trade conferences? What about airports and hospital lobbies? Libraries and museums? Shopping centers? Sports events and rock concerts?

This leads me to my next idea, which is jumping ahead a bit:

One of the barriers to the development of multi-touch applications is that it is not easy to gather user requirements when the users are not familiar with the technology.

That is where my "Miracle Question" technique comes into play. I learned this technique when I studied brief solution-focused counseling, and found that, with some modification, it can be useful for figuring out user requirements. (The process still needs some fleshing out.)

Why the Miracle Question?

The questions that a developer uses to guide the client during the initial planning stages are very important. Keep in mind that people want to use technology because it meets a need or solves a problem, which is similar to the reason a person might seek counseling.

The Miracle Question technique (actually, a series of questions) might help to tease things out. The goal of this type of questioning is to help clients use their own creativity, resources, and problem-solving skills so they can become effective partners throughout the development cycle.

(People with human-computer interaction training might have an easier time understanding how this technique could be modified and applied to different fields.)

FYI
A good example of the Miracle Question process, as used in therapy and counseling, can be found on the Network of Social Construction Therapies website in an article written by the late Steve de Shazer:

http://brianmft.talkspot.com/aspx/templates/topmenuclassical.aspx/msgid/366482

There aren't many resources about the use of the Miracle Question in IT or business. Here are a couple:

Solution Focused Management of Unplanned IT Outages (read page 132 and the references)
http://conferences.vu.edu.au/web2006/images/CDProceedings06.pdf
Proceedings of the 7th International We-B (Working for E-Business) Conference, 2006
Katherine O'Callaghan and Sugumar Mariappandar, Ph.D.
School of Business and Informatics, Australian Catholic University

Miracle Question in Executive Coaching
http://www.1to1coachingschool.com/Coaching_Miracle_Question.htm

Nov 22, 2008

Rome Reborn Update: New Google Earth layer of Ancient Rome - Great Idea for Engaging Interactive Whiteboard Activities

Steven Musil's recent article in CNET discusses Google Earth's 3-D view of ancient Rome. The project is an outgrowth of Rome Reborn, the effort of an interdisciplinary, international team of people, including computer scientists, artists, archaeologists, and historians from UCLA, the Politecnico di Milano (Italy), and the Institute for Advanced Technology in the Humanities at the University of Virginia in Charlottesville.

Visitors can explore inside the city's buildings, and obtain related historical information through pop-up windows. The 3-D interaction is great on the large screen or interactive whiteboard.

I posted about the Rome Reborn project here previously.

Below is the "how-to" video:




Google Earth's Ancient Roman Holiday

Rome Curriculum Competition for Educators
Prize Package:

Apple MacBook laptop
Digital classroom projector
Digital camera
3D Navigation mouse
$500 in gift cards to Target or Office Depot
Engraved Google "Top Educator" plaque


"We're accepting curricula from all grade levels and K-12 subject areas including art history, math, social studies, physics, and philosophy, so whether you teach 5th grade art or high school engineering, there's glory and a nice prize package waiting for you."

Rome Example

Related:
Digital Rome

Nov 20, 2008

CNN's Magic Wall Conspiracy Thriller on the Daily Show: John Oliver, Jeff Han, John King and a cast of TouchScreens and Windows...

"I needed to find a screen-free environment!" -John Oliver

"It's good to be King." - John King, after disposing of John Oliver...

I just took a look at a hilarious episode about interactive multi-touch screens and a conspiracy theory on the Daily Show. The episode features Jeff Han, the creator of CNN's Magic Wall, John Oliver, John King, and others from CNN.

Enjoy!



Via John Herrman and Gizmodo


If you are interested in multi-touch technology, feel free to do a search for additional information on this blog. The following post includes Jeff Han's demonstration of his multi-touch applications from TED 2006, along with resources and links:

Multi-touch and Flash: Links to Resources, Revisiting Jeff Han's TED 2006 Presentation

Note: If you are a parent, please screen the video clip before deciding if it is OK for your child to view.

Update from Multi-Touch Barcelona and Guten Touch: Multi-Touch Space Invaders


Multitouch Space Invaders Basic Demo from multitouch-barcelona on Vimeo.

For more sights and sounds from this group, visit the Red Bull Music Academy Barcelona 2008 website.

For more about multi-touch technology, including DIY instructions for creating multi-touch tables and displays, open-source code, and tutorials, visit the NUI Group website.

Nov 19, 2008

More from Hewlett Packard: a TouchSmart Notebook!

The new HP TouchSmart tx2 Notebook is duo-touch, with a swivel display. You can use a stylus or your fingertips to interact with the screen. The price listed on the site is $1,149.99, but it's a bit more expensive if you order "the works".




According to the HP website, some models of the notebook come with a built-in fingerprint reader to assist with log-on or lock-up functions. It includes integrated Altec Lansing stereo speakers and supports multimedia entertainment applications. The screen is 12.1", with an HP BrightView LED display. It is capable of playing HD content.

Watch the video:
http://h30440.www3.hp.com/campaigns/tx2/demo/Model.html

Explore the features in the interactive presentation.
http://h30440.www3.hp.com/campaigns/tx2/demo/Model.html


Read the WSJ Market Watch article:


"The enhanced HP MediaSmart digital entertainment software suite on the tx2 allows users to more naturally select, organize and manipulate digital files such as photos, music, video and web content by simply touching the screen.

"Breezing through websites and enjoying photos or video at the tap, whisk or flick of a finger is an entirely new way to enjoy digital content on a notebook PC," said Ted Clark, senior vice president and general manager, Notebook Global Business Unit, Personal Systems Group, HP. "With the introduction of the TouchSmart tx2, HP is providing users an easier, more natural way to interact with their PCs, and furthering touch innovation." "

The notebook uses capacitive touch technology, and supports gestures such as "pinch, rotate, arc, flick, press and drag, and single & double tap."

For more information, see Hugo Jobling's recent post on the TrustedReviews website.
The touch-screens in HP's products are from NextWindow. NextWindow now has drivers that will work with the upcoming Windows 7, which will allow for multi-touch applications.

FYI: Video clip of HP's TouchSmart single-touch interaction, from July 2008:


From Andy Vandervell's Trusted Reviews post, "Hands On with the HP TouchSmart"

Video of touch interaction on a HP TouchSmart, with NextWindow's Gesture Server Technology

Here is a short video clip of some TouchSmart interaction:



The video shows the new NextWindow Gesture Server Application.

Info from the NextWindow website:

"NextWindow Gesture Server Application in conjunction with a NextWindow touch screen enables two-touch gestures to be used on the Microsoft Windows Vista desktop and certain applications.

You perform a gesture by double-tapping or dragging two fingers on the touch surface. The Gesture Server interprets these actions as commands to the operating system. For example a two-touch vertical drag on the Vista desktop can adjust the computer's audio volume control up or down as required."


Also from the website:

Vertical scroll: drag two fingers up or down the touch screen.

Horizontal scroll: drag two fingers left or right on the touch screen.

Zoom: move two fingers apart or together.

Double tap: double-tap two fingers on the screen.

"You can enable or disable the two-touch functionality and adjust the sensitivity of each of the four two-touch gestures. You can also select the command that is executed with the double-tap gesture."
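The gesture descriptions above give a sense of how a gesture server turns raw two-finger movement into commands. Here is a minimal, illustrative sketch of that kind of classification in Python; the thresholds and gesture names are my own assumptions for the sake of the example, not NextWindow's actual implementation.

```python
import math

def classify_two_touch(start, end):
    """Classify a two-finger gesture from start/end finger positions.

    start, end: pairs of (x, y) tuples, one per finger, in screen
    coordinates. A hypothetical sketch of the kind of logic a gesture
    server might apply; thresholds are illustrative only.
    """
    DRAG_MIN = 20   # pixels of common movement before we call it a scroll
    ZOOM_MIN = 20   # change in finger separation before we call it a zoom

    def spread(points):
        # Distance between the two fingers.
        (x1, y1), (x2, y2) = points
        return math.hypot(x2 - x1, y2 - y1)

    # Fingers moving apart or together => zoom (pinch/spread).
    if abs(spread(end) - spread(start)) > ZOOM_MIN:
        return "zoom"

    # Average motion of both fingers in the same direction => scroll.
    dx = sum(e[0] - s[0] for s, e in zip(start, end)) / 2
    dy = sum(e[1] - s[1] for s, e in zip(start, end)) / 2
    if abs(dy) > max(abs(dx), DRAG_MIN):
        return "vertical-scroll"
    if abs(dx) > max(abs(dy), DRAG_MIN):
        return "horizontal-scroll"
    return "tap"  # no significant movement
```

A real gesture server would of course work from a stream of timestamped touch events rather than just start and end points, which is also what makes double-tap detection (two taps within a short interval) possible.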