
Apr 8, 2009

Joel Eden's Informative Post: Designing for Multi-Touch, Multi-User and Gesture-Based Systems

Joel Eden is a User Experience Consultant at Infragistics. He recently wrote a detailed post in the Architecture & Design section of the Dr. Dobb's Portal, "Designing for Multi-Touch, Multi-User and Gesture-Based Systems". I thought I'd share the link, since I've been writing on the same topic.

In his article, Joel explains the differences between traditional WIMP (Window, Icon, Menu, Pointer) interaction and gesture, multi-touch, and multi-user systems. These systems are also known as Natural User Interfaces, or NUIs. He recommends that "rather than trying to come up with new complicated ways to interact with digital objects, your first goal should be to try to leverage how people already interact with objects and each other when designing gesture based systems."

Joel goes on to outline UX (User Experience), IxD (Interaction Design), and HCI (Human-Computer Interaction) concepts that designers should consider when developing new systems: Affordances, Engagement, Feedback, and "Don't Make Us Think". He summarizes these in the conclusion of his article.

I especially liked Joel's references:

Clark, Andy. Supersizing the Mind: Embodiment, Action, and Cognitive Extension

Few, Stephen. Information Dashboard Design: The Effective Visual Communication of Data

Gibson, James J. The Ecological Approach to Visual Perception

Krug, Steve. Don't Make Me Think: A Common Sense Approach to Web Usability, Second Edition

Norman, Don. The Design of Everyday Things

Norman, Don. Things That Make Us Smart: Defending Human Attributes In The Age Of The Machine

I would also add the following references:
Bill Buxton
Multi-touch Systems I have Known and Loved
(Regularly updated!)
Sketching User Experiences: Getting the Design Right and the Right Design

"Our lack of attention to place, time, function, and human considerations means these fancy new technologies fail to deliver their real potential to real people." - Bill Buxton

Dan Saffer
Designing for Interaction: Creating Smart Applications and Clever Devices
Designing Gestural Interfaces

SAP
Touchscreen Usability in Short
(Summary by Gerd Waloszek of the SAP Design Guild)
SAP Design Guild Resources (User-Centered Design, User Experience, Usability, UI Guidelines, Visual Design, Accessibility)
Kevin Arthur (Synaptics)
Touch Usability
Bruce "Tog" Tognazzini
Ask Tog: Interaction Design Solutions for the Real World
Inclusive Design, Part I
First Principles of Interaction Design
John M. Carroll
Human Computer Interaction (HCI) (History of HCI)
Bill Moggridge
Designing Interactions
Ben Shneiderman
Leonardo's Laptop: Human Needs and the New Computing Technologies
Edward Tufte

Visual Explanations
Beautiful Evidence
The Visual Display of Quantitative Information
Envisioning Information
Rudolf Arnheim (Gestalt)
Art and Visual Perception: A Psychology of the Creative Eye

Update: A great reading list on general HCI. Some of the authors were involved in the early days of touch, bi-manual, and multi-touch interaction.

Jan's Top Ten List of Books on Human-Computer Interaction


FYI: If you know much about Windows Presentation Foundation, you probably know that Josh Smith, WPF guru, also works at Infragistics.


Feb 27, 2009

Tangible User Interfaces Part II: More Examples, Resources, and Uses for TUIs in Education

In Part I of my "mini-series" about Tangible User Interfaces, I discussed the origins of TUIs and provided some examples of Siftables. In this post, I've provided some links to information about Tangible User Interfaces, along with the abstracts of two articles pertaining to TUIs in educational settings.

Zen Waves: A Digital (musical) Zen Garden



reactable from Nick M. on Vimeo.

Reactable
http://upload.wikimedia.org/wikipedia/commons/e/e3/Reactable_Multitouch.jpg
More about the Reactable
"The reactable hardware is based on a translucent, round multi-touch surface. A camera situated beneath the table, continuously analyzes the surface, tracking the player's finger tips and the nature, position and orientation of physical objects that are distributed on its surface. These objects represent the components of a classic modular synthesizer, the players interact by moving these objects, changing their distance, orientation and the relation to each other. These actions directly control the topological structure and parameters of the sound synthesizer. A projector, also from underneath the table, draws dynamic animations on its surface, providing a visual feedback of the state, the activity and the main characteristics of the sounds produced by the audio synthesizer."
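The mapping described above (tracked object position and orientation driving synthesizer parameters) can be illustrated with a minimal sketch. All names, parameter ranges, and mapping choices here are hypothetical, for illustration only; the real reactable's tracker and synthesis engine are far more sophisticated.

```python
import math
from dataclasses import dataclass

@dataclass
class TrackedObject:
    """One physical token on the table, as reported by the camera tracker."""
    kind: str        # which synth module it represents, e.g. "oscillator"
    x: float         # normalized position on the surface (0..1)
    y: float
    angle: float     # orientation in radians

def connection_strength(a: TrackedObject, b: TrackedObject) -> float:
    """Closer objects are more strongly patched together (1 = touching, 0 = far apart)."""
    dist = math.hypot(a.x - b.x, a.y - b.y)
    return max(0.0, 1.0 - dist)

def oscillator_params(obj: TrackedObject) -> dict:
    """Map an oscillator token's orientation to frequency, and its y-position to amplitude."""
    # A full rotation sweeps 100 Hz .. 1000 Hz; these ranges are invented for the sketch.
    freq = 100.0 + (obj.angle % (2 * math.pi)) / (2 * math.pi) * 900.0
    amp = 1.0 - obj.y   # higher on the table = louder
    return {"freq_hz": round(freq, 1), "amplitude": round(amp, 2)}

osc = TrackedObject("oscillator", x=0.30, y=0.40, angle=math.pi)  # half-turn
flt = TrackedObject("filter",     x=0.35, y=0.45, angle=0.0)
print(oscillator_params(osc))         # half-turn lands mid-range: 550 Hz
print(connection_strength(osc, flt))  # nearby tokens -> strong connection
```

In the actual system, a loop like this would run per camera frame, and the resulting parameters and connections would be fed continuously to the audio engine and to the projector drawing feedback on the surface.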


The Bubblegum Sequencer: Making Music with Candy



Jabberstamp: Embedding Sound and Voice in Children's Drawings
(pdf)
(A TUI application to support literacy development in children)

Affective TouchCasting
(pdf)

TapTap: A Haptic Wearable for Asynchronous Distributed Touch Therapy
(pdf)

BodyBeats: Whole-Body, Musical Interfaces for Children
(pdf)

Telestory is a Siftables application that looks like it would be quite useful for supporting children who have communication disorders or autism spectrum disorders.

Telestory Siftables application from Jeevan Kalanithi on Vimeo.

"Telestory is an educational, language learning application created by Seth Hunter. In this video, the child is looking at a television screen. He can control onscreen characters, events and objects with the siftables. For example, he has the dog and cat interact by placing the dog and cat siftables next to each other."
TeleStory Project Website

Here is a video of how Siftables can be used as equation editors:


Siftables Equation Editor from Jeevan Kalanithi on Vimeo.

RESOURCES ABOUT TUIs:


5 lessons about tangible interfaces, GDC Lyon, December 2007 (pdf), Nicolas Nova


Special Issue on Tangible and Embedded Interaction (Guest Editors: Eva Hornecker, Albrecht Schmidt, Brygg Ullmer), International Journal of Arts and Technology (IJART), Volume 1, Issue 3/4, 2008


Reality-Based Interaction: A Framework for Post-WIMP Interfaces (pdf)


Here are a couple of abstracts of articles related to the use of TUIs in education:

Evaluation of the Efficacy of Computer-Based Training Using Tangible User Interface for Low-Functioning Children with Autism Proceedings of the 2008 IEEE International Conference on Digital Games and Intelligent Toys

"Recently, the number of children having autism disorder increases rapidly all over the world. Computer-based training (CBT) has been applied to autism spectrum disorder treatment. Most CBT applications are based on the standard WIMP interface. However, recent study suggests that a Tangible User Interface (TUI) is easier to use for children with autism than the WIMP interface. In this paper, the efficiency of the TUI training system is considered, in comparison with a conventional method of training basic geometric shape classification. A CBT system with TUI was developed using standard computer equipment and a consumer video camera. The experiment was conducted to measure learning efficacy of the new system and the conventional training method. The results show that, under the same time constraint, children with autism who practiced with the new system were able to learn more shapes than those participating in the conventional method."

Towards a framework for investigating tangible environments for learning Sara Price, Jennifer G. Sheridan, Taciana Pontual Falcao, George Roussos, London Knowledge Lab, 2008

"External representations have been shown to play a key role in mediating cognition. Tangible environments offer the opportunity for novel representational formats and combinations, potentially increasing representational power for supporting learning. However, we currently know little about the specific learning benefits of tangible environments, and have no established framework within which to analyse the ways that external representations work in tangible environments to support learning. Taking external representation as the central focus, this paper proposes a framework for investigating the effect of tangible technologies on interaction and cognition. Key artefact-action-representation relationships are identified, and classified to form a structure for investigating the differential cognitive effects of these features. An example scenario from our current research is presented to illustrate how the framework can be used as a method for investigating the effectiveness of differential designs for supporting science learning"

Sep 1, 2008

Interactive Touch-Screen Technology, Participatory Design, and "Getting It"...

PLEASE SEE THE UPDATED VERSION OF THIS POST:
Interactive Touch Screen Technology, Participatory Design, and "Getting It", Revisited

http://www.ehomeupgrade.com/wordpress/wp-content/uploads/hp_touchsmart_pc.jpg
http://www.wired.com/images/article/full/2008/08/han_interview_630px.jpg

There's been some discussion about why so many people don't understand touch-screen, or "surface," computing, even though research in this area has been going on for years.

As the new owner of the HP TouchSmart, I know that I get it.

The research I've conducted in this area suggests that people will "get it" only if there is a strong commitment to developing touch-screen "surface" applications through a user-centered, participatory design process. In my view, this should incorporate principles of ethnography and ensure that usability studies are conducted outside of the lab.


This approach was taken with Intel's Classmate PC. Intel has about 40 ethnographic researchers, and sent many of them to work with students and teachers in classrooms around the world. (A video regarding ethnographic research and the Intel Classmate project can be found near the end of this post.)

http://download.intel.com/pressroom/kits/events/idffall_2008/images/Picture007.jpg
http://www.classmatepc.com/images/advocateImage.jpg

Where to start?
K-12 classrooms and media centers. Public libraries. Malls. Hospital lobbies and doctor's offices. Any waiting room. Staff lounges in medical centers, schools, and universities. Community festivities and events. Movie theater lobbies. Museums and other points of interest.


I believe we need to take a "touching is believing" approach.

Here are some thoughts:

When I try to explain my fascination with developing touch-screen interactive multimedia applications (interactive whiteboards, multi-touch displays and tables, and the like), many of my friends' and family members' eyes glaze over. This is particularly true for people I know who are forty-ish or older.

Even if you are younger, if you never saw the cool technology demonstrated in the movie Minority Report, have limited experience with video games, or haven't come within touching distance of an interactive whiteboard, the concept might be difficult to understand.


The reality?

Even people who have the opportunity to use surface computing technology on large screens do not take full advantage of it. Multi-touch screens are often used as single-touch screens, and interactive whiteboards in classrooms often serve as expensive projector screens for teacher-controlled PowerPoint presentations.


Most importantly, there are few software developers who understand the surface computing approach, even with the popularity of the iPhone and iPod Touch. Most focus on traditional business-oriented or marketing applications, and have difficulty envisioning scenarios in which surface computing would be a welcome breath of fresh air.

Another factor is that not all people entrusted to market surface or touch screen computing fully understand it.

http://blogs.msdn.com/blogfiles/healthblog/WindowsLiveWriter/MicrosoftHUGWishyouwerehereDay2_82D3/IMG_0550_thumb.jpg
Despite a cool website showing off the goods, Microsoft's Surface multi-touch table has been slow to take off, dimming hopes of bringing the price down to one that most families or schools could afford. (The picture above depicts an application for the Surface designed for health care professionals, not K-12 science education.)

Although you can't buy a Surface table for your family room, it is possible to buy a TouchSmart.

HP's TouchSmart website is engaging and highlights some examples of touch-screen interaction, but most people don't seem to know about it.


Unfortunately, you wouldn't have a clue that the HP TouchSmart exists while browsing the aisles at Circuit City or Best Buy!

When I was shopping for my new TouchSmart, I noticed that from a distance, it looked just like the other large flat-screen monitors filling up the aisles. The salespeople at both stores were not well informed about the system. The only reason I knew about the new TouchSmart was my obsession with interactive multimedia touch-screen applications: designing them, developing them, studying them, reading about them, blogging about them.... ; }

More thoughts:

After studying HCI (Human-Computer Interaction) and relating this knowledge to what I know as a psychologist, my hunch is that the "Window, Icon, Mouse, Pointing-device" (WIMP) and keyboard-input mind-set is embedded in our brains, to a certain extent. Like driving a car, it is something automatic and expected. This is true for users AND developers.

Think about it.

Suppose that one day you were told you were no longer allowed to control your car by turning on the ignition, steering the wheel, or using your feet to accelerate, slow down, or stop the car! Instead, you needed to learn a new navigation, integration, and control system that involved waving your hands about and perhaps speaking a few commands.

For new drivers who'd never seen a car before, this new system might be user-friendly and intuitive. Perhaps it would be quite easy for 16-year-old kids to wrap their heads around. For most of us, no. Imagine the disasters we would see on our streets and highways!

When we think about how newer technologies are introduced to people, we should keep this in mind.

In my mind, spreading the word about surface computing is not an "if you build it, they will come" phenomenon, like the iPhone. We can't ignore the broader picture.

From my middle-aged woman's vantage point, I believe it is important that those involved with studying, developing, or marketing surface computing applications realize that many of us simply have no point of reference other than our experiences with ATMs, airline kiosks, supermarket self-serve lanes, and the like.

(The video clip at the very end of this post provides a good example of touch-screen technology gone wrong.)


Be aware that there are substantial numbers of people who might benefit from surface computing but who prefer to avoid ATMs, airline kiosks, and self-serve grocery lanes.

Realize that the collective experience with technology, in many cases, has not been too pretty. Many people have had such user-unfriendly experiences with productivity applications, forced upon them by their employers, that any interest or desire to explore emerging technologies has been zapped.

My own exposure to interactive "surface" related technology was somewhat accidental.

A few years ago, a huge box was deposited in the room where I worked a couple of days a week as a school psychologist at a middle school. After a week or so, I became curious and found out that it was a SmartBoard. Until then (2002!), I did not know that interactive whiteboards existed.

The box remained unopened in the room for the entire school year, but no matter. I played with the only other SmartBoard in the school, and found a couple at the high school where I also worked. I hunted for all of the applications and interactive websites I could find and tried them out. That is when I was hooked. I could see all kinds of possibilities for interactive, engaging subject-area learning activities. I could see the SmartBoard's potential for music and art classes. With my own eyes, I saw how the SmartBoard engaged students with special needs in counseling activities.

(By the way, if you are working with middle school students, the activities on PBS Kids' It's My Life website work great on an interactive whiteboard.)

A few years have passed, and reflecting on all of my fun experiences with interactive whiteboards, with and without students, I now understand that many teachers have still had limited exposure to this technology.

This school year, many teachers find themselves teaching in classrooms recently outfitted with interactive whiteboards, scrambling, along with educational technology staff development specialists, to figure out how the technology works best with various groups of students and what sort of changes need to be made to instructional practice.


For the very first time, interactive whiteboards were installed in two classrooms at one of the schools I work at. One of the teachers I know thanked me for telling her about interactive whiteboards and sharing my resources and links.

If I hadn't let her know about this technology, she wouldn't have volunteered to have one installed in her classroom. It has transformed the way she teaches special needs students.

In the few months that she's used the whiteboard, I can see how much it has transformed the way the students learn. They are attentive, more communicative, and engaged. The students don't spend the whole day with the whiteboard - the interactive learning activities are woven into lessons at various times of the day, representing true technology integration.

Now let's see what happens when all-in-one touch-screen PCs are unleashed in our schools!

Some resources:
HP TouchSmart PC website, with demo
HP's TouchSmart YouTube videos
lm3labs (catchyoo, ubiq'window)
NUI Group (See member's links)
NextWindow
Fingertapps
thirteen23
SmartTechnologies
Perceptive Pixel - Jeff Han
Microsoft Surface
iPhone
(More can be found by doing a search on this blog or The World Is My Interactive Interface.)

Value of ethnographic research:
Ethnographic Research Informed Intel's Classmate PC
"Intel looked closely at how students collaborate and move around in classroom environments. The new tablet feature was implemented so that the device would be more conducive to what Intel calls “micromobility”. Intel wants students to be able to carry around Classmate PCs in much the same way that they currently carry around paper and pencil." -via Putting People First and Ars Technica

The video below is from Intel's YouTube Channel. Information about Intel's approach to ethnographic research in classrooms during the development of the Classmate PC is highlighted. This approach uses participatory design and allows the set of applications developed for the Classmate PC to reflect the needs of local students and teachers. Schools from many different countries were included in this study.




FYI:

Need for Improvement: User-Unfriendly Information Kiosk Interactive Map


Here are some interesting pictures from lm3labs, which are in my interactive usability hall of fame:

http://catchyoo.typepad.com/photos/uncategorized/2008/06/30/4654.jpg
http://farm3.static.flickr.com/2172/2233673451_6a48db8bff.jpg?v=0



Samsung's new Omnia SGH-i900 was re-created at a much larger size, using lm3labs' Ubiq'window touchless technology.


For more about lm3labs, including several videoclips, take a look at one of my previous posts:
Lm3Labs, Nicolas Leoillot, and Multimedia Interaction