Showing posts with label interaction. Show all posts

Jan 11, 2009

Usability, Accessibility, and User Experience in a Win7 Environment

Microsoft's newest operating system, Win7 (beta), allows for multi-touch interaction and application development. From what I can tell from the news coming out of the Consumer Electronics Show (CES 2009), touch and gesture interaction holds quite a bit of interest among consumers and application developers alike.

Hopefully people will think carefully about user experience and usability factors when developing the applications we'll be using in the not too distant future!

Because of my background as a school psychologist, I'm interested in how touch/gesture applications can address accessibility and universal usability issues.

I wasn't able to locate user experience or usability information specific to Win7, but here are a few links from the Microsoft website that I think are worth reading:

Touch Interaction Guidelines

How to Design a Great User Experience

Designing with Windows Presentation Foundation

Accessibility

According to research from Microsoft, "more than half of all computer users experience difficulties or impairments related to accessibility, and are likely to benefit from the use of accessible technology. Moreover, approaching software design with the flexibility and inclusiveness that are the hallmarks of accessibility often results in overall improved usability and customer satisfaction."

Regarding touch/gesture interaction, know that “19% (24.4 million) of computer users have a mild dexterity difficulty or impairment, and 5% (6.8 million) of computer users have a severe dexterity difficulty or impairment.”

Christian Moore, from the NUI-Group, shared this nice graphic that depicts the various fields that intersect or converge with interface design, which I think is useful when thinking about how we will design, develop, and use emerging technologies:



Concepts such as accessibility and universal usability can fit into this design!

For more information regarding multi-touch, see my previous post, "For Techies and the Tech-Curious: Multi-touch/Gesture from the NUI Group"

Jan 8, 2009

For Techies and the Tech Curious: Multi-Touch/Gesture from the NUI-Group

If you are a new visitor to this blog and interested in interactive multimedia, you'll need to know more about the NUI-Group. The Natural User Interface (NUI) Group is an interactive media group researching and creating open source machine sensing techniques to benefit artistic and educational applications.
(For related information, please read my recent post, Usability, Accessibility, and User Experience in a Win7 Environment.)

Seth Sandler, of the NUI-Group, sent out a great email with links and resources for people who are interested in multi-touch/gesture interaction, hardware, and/or software development. NUI-Group members who have completed projects are listed below, with links to project websites as well as related threads on the NUI-Group forum.

(The information can be found on the NUI-Group Wiki, which boasts a nice icon-based front page.)

Thanks, Seth, for organizing this wealth of information!

The following projects are divided up by type. (Links to information about the various types of multi-touch and gesture systems can be found near the end of this message.)

FTIR - Frustrated Total Internal Reflection

http://img144.imageshack.us/img144/8105/shemeftir2yu7.jpg
Name: Seth (cerupcat)
Project Name: AudioTouch
Project Website: http://ssandler.wordpress.com
Project Threads: http://nuigroup.com/forums/viewthread/1352/ and http://nuigroup.com/forums/viewthread/2309/

Name: bassmang5
Project Name: Æ-table
Project Thread: http://nuigroup.com/forums/viewthread/3144/

Name: Daniel (Zin)
Project Name: Prometheus
Project Thread: http://nuigroup.com/forums/viewthread/2612/

Name: Carsten (carschdn)
Project Name: aTRACKtive
Project Thread: http://nuigroup.com/forums/viewthread/3223/

LLP - Laser Light Plane

http://www.codelaboratories.com/images/LLP/HiPressure.jpg

Name: Denis Santelli (dsan)
Project Website: http://www.touchwall.fr/
Project Thread: http://nuigroup.com/forums/viewthread/3051/


LED-LP - Light-Emitting Diode Laser Plane

Name: Nolan (PeauProductions)
Project Name: PeauProductions (LCD)
Project Website: http://peauproductions.blogspot.com/
Project Thread: http://nuigroup.com/forums/viewthread/3291/

DI - Diffused Illumination

http://img359.imageshack.us/img359/1143/shemedi2bx3.jpg

Name: Seth (cerupcat)
Project Name: MTmini
Project Website: http://ssandler.wordpress.com/MTmini
Project Thread: http://nuigroup.com/forums/viewthread/1731/

Name: Fairlane
Project Name: ORION v2
Project Website: http://orionmultitouch.blogspot.com/
Project Thread: http://nuigroup.com/forums/viewthread/1709/

Name: Abdullah (EfeNDy)
Project Name: EfeNDy's Diffused Illumination MT
Project Thread: http://nuigroup.com/forums/viewthread/3176/

Name: Sandor
Project Name: EXPO REAL 2008
Project Website: http://vimeo.com/2240537
Project Thread: http://nuigroup.com/forums/viewthread/3576/

Name: Matthew (MatthewW)
Project Name: Design Garage
Project Website: http://www.gotuasciencecenter.org/
Project Thread: http://nuigroup.com/forums/viewthread/3730/


Touchless

Name: Jimi Hertz
Project Name: Touchless Wall
Project Website: http://sassexperience.org/projettouchwall.html
(try: http://sassexperience.org/multitouch/inprogress.html )
Project Thread: http://nuigroup.com/forums/viewthread/2414/

YouTube: http://www.youtube.com/watch?v=KCFbWPf37jw

Other Types:
DSI - Diffused Screen Illumination

http://iad.projects.zhdk.ch/multitouch/wp-content/uploads/2008/06/sheme_dsi_web.jpg



RELATED LINKS

Tips for Success

Resources
Thanks to all NUI-Group members who have been working so hard at this mission!

I'll end this post with a YouTube video created by Jimi Hertz, a NUI-Group member. "MULTI-TOUCHLESS WALL HOW TO?"

I especially like the music!


Dec 7, 2008

Demo of Duke University's multi-touch wall at RENCI, running the Cobalt Metaverse Browser

The video below shows the "pre-alpha" version of the Cobalt Metaverse Browser:



"This video shows the Cobalt metaverse browser being tested on a 13-foot by 5-foot multi-touch visualization wall equipped with six high-definition projectors located at the Renaissance Computing Institute engagement center at Duke University. The input drivers are being developed by Dr. Xunlei Wu so that users can directly manipulate high-resolution data using both hands and multiple fingers for a more natural and intuitive data exploration experience. In the video, Dr. Wu is using both gesture and touch to navigate through, and rearrange content between, two Cobalt virtual worlds."

Related:
The Open Cobalt Project (on ning)
Cobalt Website & link to download the latest pre-alpha build
Cobalt Community
Cobalt Google Group
EduSim (A 3D multi-user virtual world platform and authoring toolkit for K-12 interactive whiteboards. The latest version is powered by Cobalt)

Cobalt can import objects from the Google 3D Warehouse as well as Google SketchUp:
Video Tutorial: Using Google 3D Warehouse to build Cobalt & Edusim Virtual Worlds

People:
Julian Lombardi, Duke University
Xunlei Wu, Senior Visualization Researcher, RENCI, Duke University
Rich White, EduSim; Greenbush Education Service Center, Girard, KS

Kids using Cobalt-based EduSim on desktop computers, via Rich White:

Dec 6, 2008

Windows Vista User Experience Guidelines: All Windows programs should be touchable!

If you are a designer/developer who would like to develop applications that can support touch interaction, it will save you some time to study the Windows Vista User Experience "Touch" guidelines. You'll find information about the eight flick events (gestures that correspond to keyboard shortcuts for navigation and editing), the 40 or so gestures recognized by Vista, tips for creating controls, the GetMessageExtraInfo function, and more.


"As touch spreads from Tablet PCs to other types of computers, software program developers and designers will find it increasingly important to support touch as well. All Windows programs should have a great touch experience. Users should be able to perform your program's most important tasks efficiently using a finger."
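The flick model described in the guidelines, a quick stroke in one of eight directions, can be sketched as a small classifier. This is an illustrative reconstruction, not Vista's actual flick API; the direction names and the wedge math are my own.

```python
import math

# Eight flick directions laid out counterclockwise from "right".
# These names are illustrative, not the actual API constants.
FLICK_DIRECTIONS = ["right", "up-right", "up", "up-left",
                    "left", "down-left", "down", "down-right"]

def classify_flick(start, end):
    """Map a quick stroke from start to end to one of eight directions."""
    dx, dy = end[0] - start[0], start[1] - end[1]  # screen y grows downward
    angle = math.degrees(math.atan2(dy, dx)) % 360
    # Each direction owns a 45-degree wedge centered on its axis.
    index = int((angle + 22.5) % 360 // 45)
    return FLICK_DIRECTIONS[index]

print(classify_flick((100, 100), (200, 100)))  # right
print(classify_flick((100, 100), (100, 20)))   # up
```

In the real system, each recognized direction would then be mapped to a navigation or editing shortcut, which is the part the guidelines let applications customize.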


Nov 19, 2008

Video of touch interaction on a HP TouchSmart, with NextWindow's Gesture Server Technology

Here is a short video clip of some TouchSmart interaction:



The video shows the new NextWindow Gesture Server Application.

Info from the NextWindow website:

"NextWindow Gesture Server Application in conjunction with a NextWindow touch screen enables two-touch gestures to be used on the Microsoft Windows Vista desktop and certain applications.

You perform a gesture by double-tapping or dragging two fingers on the touch surface. The Gesture Server interprets these actions as commands to the operating system. For example, a two-touch vertical drag on the Vista desktop can adjust the computer's audio volume up or down as required."


Also from the website:

Vertical scroll: drag two fingers up or down the touch screen.

Horizontal scroll: drag two fingers left or right on the touch screen.

Zoom: move two fingers apart or together.

Double tap: double-tap two fingers on the screen.

"You can enable or disable the two-touch functionality and adjust the sensitivity of each of the four two-touch gestures. You can also select the command that is executed with the double-tap gesture."
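Based on the gesture descriptions above, a toy classifier might separate zoom from scroll by comparing how much the distance between the two fingers changes versus how much both fingers move together. This is a sketch of the general idea only, not NextWindow's actual algorithm; the threshold value is arbitrary.

```python
import math

def classify_two_touch(p1_start, p1_end, p2_start, p2_end, threshold=20):
    """Rough two-touch classifier in the spirit of the Gesture Server:
    if the inter-finger distance changes a lot, it's a zoom; otherwise
    the common translation decides vertical vs horizontal scroll."""
    d0 = math.dist(p1_start, p2_start)  # finger spacing at gesture start
    d1 = math.dist(p1_end, p2_end)      # finger spacing at gesture end
    if abs(d1 - d0) > threshold:
        return "zoom in" if d1 > d0 else "zoom out"
    # Average motion of both fingers (the shared translation)
    dx = ((p1_end[0] - p1_start[0]) + (p2_end[0] - p2_start[0])) / 2
    dy = ((p1_end[1] - p1_start[1]) + (p2_end[1] - p2_start[1])) / 2
    return "vertical scroll" if abs(dy) >= abs(dx) else "horizontal scroll"
```

A real gesture server would also need to decide when a gesture starts and ends, and would apply per-gesture sensitivity settings like the ones the NextWindow application exposes.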

Nov 16, 2008

Every Surface a Computer: "Scratch Input" - Capturing Finger Input on Surfaces Using Sound. Video by Chris Harrison and Scott Hudson - UIST '08

Chris Harrison and Scott Hudson, from the Human-Computer Interaction Group at Carnegie Mellon University, presented their latest research at the UIST '08 conference. Take a look at the video below to see how the sounds produced by scratching gestures, picked up with a stethoscope sensor and filters, can turn unpowered surfaces into finger input devices:



Yes, every surface is a computer!
(Even your pants...)

For detailed information, read the paper presented at UIST '08 by Chris Harrison and Scott E. Hudson:
Scratch Input: Creating Large, Inexpensive, Unpowered, and Mobile Finger Input Surfaces
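The basic idea, counting discrete scratch events in the amplitude of an acoustic signal, can be sketched as a toy threshold detector. This is a stand-in for the paper's actual filtering pipeline (which works on the signal from a modified stethoscope); the threshold and gap values here are arbitrary.

```python
def count_scratches(samples, threshold=0.5, min_gap=100):
    """Count scratch events in an amplitude stream: each run of samples
    above the threshold, separated from the previous run by at least
    min_gap quiet samples, counts as one scratch."""
    events, quiet, in_event = 0, min_gap, False
    for s in samples:
        if abs(s) > threshold:
            if not in_event and quiet >= min_gap:
                events += 1          # a new loud burst begins
                in_event = True
            quiet = 0
        else:
            quiet += 1
            if quiet >= min_gap:
                in_event = False     # enough silence to end the event
    return events
```

Counting and timing such events is what lets the system distinguish, say, a double scratch from a single long drag.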

RELATED:

The Best Paper Award at UIST '08 was "Bringing Physics to the Surface", by Andrew Wilson, of Microsoft Research, and Shahram Izadi, Otmar Hilliges, Armando Garcia-Mendoza, and David Kirk, of Microsoft Research Cambridge.

Here is the abstract:

"This paper explores the intersection of emerging surface technologies, capable of sensing multiple contacts and often shape information, and advanced games physics engines. We define a technique for modeling the data sensed from such surfaces as input within a physics simulation. This affords the user the ability to interact with digital objects in ways analogous to manipulation of real objects. Our technique is capable of modeling both multiple contact points and more sophisticated shape information, such as the entire hand or other physical objects, and of mapping this user input to contact forces due to friction and collisions within the physics simulation. This enables a variety of fine-grained and casual interactions, supporting finger-based, whole-hand, and tangible input. We demonstrate how our technique can be used to add real-world dynamics to interactive surfaces such as a vision-based tabletop, creating a fluid and natural experience. Our approach hides from application developers many of the complexities inherent in using physics engines, allowing the creation of applications without preprogrammed interaction behavior or gesture recognition."
Preparation for the Internet of Surfaces & Things?
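The core mapping the abstract describes, sensed contacts turned into forces inside a physics simulation, might look like this in toy form. The friction constant and the per-contact force model here are invented for illustration; the paper's approach is far richer, handling whole-hand shapes and collisions.

```python
def contact_forces(contacts, prev_contacts, friction=0.8):
    """Toy version of the idea: each tracked touch contact becomes a
    force on the object under it, proportional to how far the contact
    moved since the last frame (a crude friction-drag model).
    contacts/prev_contacts map contact IDs to (x, y) positions."""
    forces = {}
    for cid, (x, y) in contacts.items():
        if cid in prev_contacts:  # only moving, persistent contacts push
            px, py = prev_contacts[cid]
            forces[cid] = (friction * (x - px), friction * (y - py))
    return forces
```

In a real engine these forces would be fed into the simulation each frame, so dragging a finger across a virtual object shoves it the way a real finger would.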




(Cross-posted on the Technology-Supported Human World Interaction blog)

Oct 20, 2008

The atracTable Multi-Touch System from Atracsys

The atracTable is a multi-touch presentation system developed by the Swiss engineering and development group, Atracsys. It is similar to Microsoft's Surface. Interaction on the table can be triggered by laying objects on the table.

(Marc Hottinger and Lionel Tardy, of Amorpik, designed the interface.)

http://www.atracsys.com/images/atracTable2_ex.jpg





atracTable FAQs

From the Atracsys Website -"How does it work":

"atracTable is the combination of a video-based movement tracking system, a computer, a beamer, and a screen.

When you lay an object on the screen, the tracking system recognizes the object with a visual tracking tag on the base of the product. At the same time, the tracking system detects the positions and movements of your fingers and of the objects on the screen.

The whole pieces of information concerning the product(s) and the different locations and movements are sent to the computer. The data is processed and sent back to the beamer.

The processing is performed instantaneously. The real-time interaction is obtained by continuous detection of finger and object movements. The whole technology fits in the table and is invisible to customers."
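The loop the FAQ describes, recognize tagged objects, track fingers, and send a scene back to the beamer, can be sketched roughly as follows. The tag IDs and product names are invented for illustration; Atracsys's actual system is a vision pipeline, not a lookup table.

```python
# Hypothetical tag registry: the tracking system reads a visual tag ID
# off the base of each product and looks up what to display.
PRODUCT_TAGS = {
    17: "espresso machine",
    42: "wristwatch",
}

def process_frame(detected_tags, finger_points):
    """One pass of the table's loop: recognize tagged objects, note
    finger positions, and return what the beamer should render.
    detected_tags is a list of (tag_id, (x, y)); finger_points is
    a list of (x, y) touch positions."""
    scene = []
    for tag_id, (x, y) in detected_tags:
        product = PRODUCT_TAGS.get(tag_id, "unknown object")
        scene.append(f"show info for {product} at ({x}, {y})")
    for (x, y) in finger_points:
        scene.append(f"draw cursor at ({x}, {y})")
    return scene
```

Running this once per camera frame is what produces the "instantaneous" feel the FAQ mentions: the projected scene is rebuilt continuously from whatever the tracker currently sees.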

Another creation by Atracsys is beMerlin, a gesture-based interactive system that plays out as an interactive window. Although it is used for visual merchandising, it looks like it has potential for other uses, such as wayfinding, building directories, interactive museum exhibits, and education.


Click to enlarge image

http://www.atracsys.com/images/beMerlin2_ex.jpg

http://www.atracsys.com/images/beMerlin1_ex.jpg

How it works:

Oct 17, 2008

Time for More Touch! Part Two: Microsoft's "Oahu", a hypothetical (?), affordable version of the Surface multi-touch table.

Long Zheng, from the I Started Something blog, was privy to a survey from Microsoft about "Oahu" (via someone named Kerien).

The following description of Oahu is a quote from Long Zheng's website, and reportedly was the introductory section of Microsoft's survey:

"The following questions refer to a computing device called “Oahu” that has an innovative multi-touch screen. Oahu is a flat screen that sits horizontally like a table top. You can interact with Oahu by touching the screen, instead of using a mouse, and more than one person can interact with Oahu at the same time. You and others can move objects on the screen with your hands and touch icons to open up programs, games, or music. People using the device can also use their fingertips to expand and shrink objects on the screen. The screen recognizes people’s hand movements and touches and reacts accordingly. You can bring up an on-screen keyboard to input information. Oahu also works with other devices (such as digital cameras, cell phones, and MP3 players) by getting information from or sending information to them. Oahu is on with no waiting time to start up. Oahu can come as a freestanding table, placed into a piece of furniture, or built into a countertop. The type of Oahu devices we are asking about today are not portable but if they are furniture or tables, they can be placed anywhere in your home."

mmmm.... sounds just like a Surface....


Photo via I Started Something

The price of Oahu quoted in the survey? $1,499.00. A substantial savings, considering that the price of Microsoft's Surface is $10,000.00.

This price approaches the affordable range for schools. I wonder if any questions in the survey addressed the learning aspects of the Oahu, other than helping children with homework. With the upcoming Windows 7 OS and its multi-touch capabilities, I'm sure we'll be seeing the spread of this technology.

FYI:
Long Zheng is working on a Business Commerce and Multimedia Systems double degree at Monash University in Australia. He blogs to stay on the cutting edge of breaking technology news.

For more information about Windows 7, see the Engineering Windows 7 blog.

Aug 16, 2008

Microsoft Research project: MouseMischief - Multi-user, Multi-Mice Interaction on Large Displays

This is an interesting demonstration of multiple mice controlled by children on an interactive whiteboard. The collaborative application uses Microsoft's MultiPoint technology. For more information and free downloads, go to MouseMischief.org.


Jun 18, 2008

Hands On Computing: How Multi-Touch Screens Could Change the Way we Interact with Computers and Each Other (link to Scientific American Article)

More Multi-Touch!

Scientific American, June 2008: Hands On Computing: How Multi-touch Screens Could Change the Way We Interact with Computers and Each Other. "The iPhone and even wilder interfaces could improve collaboration without a mouse or keyboard."

"It is easy to imagine how photographers, graphic designers or architects—professionals who must manipulate lots of visual material and who often work in teams—would welcome this multi-touch computing. Yet the technology is already being applied in more far-flung situations in which anyone without any training can reach out during a brainstorming session and move or mark up objects and plans." -Stuart Brown

Link: Emotiv System's Neural Game Controller Headset: Human-Computer Interface of the Future?

If you are looking for information about brain-computer interfaces, follow the link to my post about the Emotiv Systems neural interface on the Technology-Supported Human-World Interaction blog.

Emotiv System's Neural Game Controller Headset: Human-Computer Interface of the Future?

Also see:
Game Interaction via Thoughts and Facial Expressions: EPOC - Emotiv Systems Neural Interface

Jun 13, 2008

Fingertapps and NextWindow's Multi-Touch Firmware on the New HP TouchSmart PC



Fingertapps is working on an application that utilizes NextWindow's multi-touch firmware, running on the second-generation HP TouchSmart PC. Fingertapps will be demonstrating their work in the NextWindow booth at InfoComm 2008 in Las Vegas, June 14-20.

May 28, 2008

Windows 7 Demo on a Dell laptop: More Multi Touch


Video: Multi-Touch in Windows 7

Via SoapBox, Greenbush Labs, and CNET

If you are new to this blog and would like to learn more about multi-touch interaction and technology, enter a keyword in the search box.

Also take a look at the Technology Supported Human-World Interaction blog.

For multi-touch DIY, check out the NUI Group!

Apr 10, 2008

Video of OLPC's Social Features

Take a look at One Laptop Per Child's interactive user interface:


Here is a video about OLPC's social features:




What do you think?


Somewhat off-topic:
Who's Your User? Usability Rap video, via 90 Percent of Everything via Information Architecture Television

Sep 17, 2007

Nintendo Wii - Link to John Kirriemuir's "First Impressions"

I've been sharing a Wii with my daughter since December 2006, and have been meaning to share my impressions of this innovative game system. Even my mother, who is in her 70s, has played it.

John Kirriemuir's extensive post, "Nintendo Wii: First Impressions", describes the Wii system in detail, with a set of pictures of happy players spanning the generations. John links to other Wii resources and concludes that the Wii will become a focus of research among academics from a variety of disciplines.

What I love best about the Wii is flopping down on the sofa, spinning the Wii Globe, and reading the electronic newspaper articles across my wide-screen TV. The fonts can be increased easily via the WiiMote, so those of you of a certain age won't need to remember your reading glasses to stay informed.