One of my interests is how the power and potential of post-WIMP interactive technologies can be harnessed for formal and informal education, including life-long collaborative learning.
In May, I had a chance to meet with a number of like-minded people during the CHI 2011 conference at the 2nd Workshop on UI Technologies and Impact on Educational Pedagogy. I was impressed with the depth and breadth of the presentations at the workshop. Since then, I've been looking through other papers and videos from CHI 2011 to find interesting applications that hold potential for use in educational settings.
I've come across a good number of interesting applications and prototypes, so be sure to check back for future posts on this topic. For now, here are a few applications that I'd like to share.
Below are a few videos from Philip Chi-Wing Fu. (He doesn't know it yet, but I've admitted his videos into the Post-WIMP Explorers' Club.)
Interactive Multi-touch Sketching Interface for Diffusion Curves
"A novel multi-touch sketching interface enabling interactive and practical design with 2D diffusion curves is proposed; featured interaction techniques include simultaneous sketching of multiple diffusion curves and at-the-spot colors tuning."
Distinguishing Multiple Smart-Phone Interactions on a Multi-touch Wall Display using Tilt Correlation
"This paper proposes a novel matching technique, called tilt correlation, which employs the built-in tilt sensor on smart-phones to identify their concurrent contacts on a common multi-touch wall display."
WYSIWYF: Exploring and Annotating Volume Data with a Tangible Handheld Device (CHI 2011)
"Integration of a multi-touch wall display with a tangible handheld device with multi-touch and tilt sensing capabilities to provide intuitive what-you-see-is-what-you-feel visual exploration and annotation of volume data."
The following videos were uploaded by alucero:
Pass-Them-Around: Collaborative Use of Mobile Phones for Photo Sharing (CHI 2011)
"Pass-Them-Around is a phone-based application that allows a small group of collocated people to share photos using the metaphor of passing paper photos around. The prototype encourages people to share their devices and use them interchangeably while discussing photos face-to-face. The prototype supports ad-hoc photo sharing in different contexts by taking into account the spatial arrangement of users around a table, measured with sensors embedded in their mobile phones."
The next video was part of MobileHCI '10:
MindMap: Collaborative Use of Mobile Phones for Brainstorming
Showing posts sorted by relevance for query gesture.
Jul 6, 2011
Revisiting CHI 2011: Videos of Interactive Touch, Gesture, Large Surface, and Mobile Apps with Potential for Use in Education (CHI = Computer Human Interaction)
Posted by Lynn Marentette
Labels:
#CHI2011,
CHI 2011,
education,
gesture,
HCI,
interaction,
learning,
mobile,
multitouch,
NUI,
pedagogy,
post-WIMP,
teaching,
UI technology
Jun 17, 2011
In case you missed this: Microsoft Releases Kinect SDK Beta for PC
Kinect for Windows SDK Beta! IT IS TRUE!!!!!
My Kinect and PC are waiting for my summer project. What a great opportunity to "practice" programming over my five-week summer break! I already know C#, and I've done a little game programming (e.g., AI for Game Development, using XNA Game Studio Express), though it has been a while.

-Photo credit: Microsoft Research
I have some cool ideas for basic games that might be good for the students I work with who have autism spectrum disorders... and some ideas that might be fun for my grand-baby. I can't wait to have time to code again!
Here's some info from the Microsoft Kinect for Windows SDK Beta website:
"The Kinect for Windows SDK beta is a programming toolkit for application developers. It enables the academic and enthusiast communities easy access to the capabilities offered by the Microsoft Kinect device connected to computers running the Windows 7 operating system."
"The Kinect for Windows SDK beta includes drivers, rich APIs for raw sensor streams and human motion tracking, installation documents, and resource materials. It provides Kinect capabilities to developers who build applications with C++, C#, or Visual Basic by using Microsoft Visual Studio 2010."
This SDK includes the following features:
Raw sensor streams
Access to raw data streams from the depth sensor, color camera sensor, and four-element microphone array enables developers to build upon the low-level streams that are generated by the Kinect sensor.
Skeletal tracking
The capability to track the skeleton image of one or two people moving within the Kinect field of view makes it easy to create gesture-driven applications.
Advanced audio capabilities
Audio processing capabilities include sophisticated acoustic noise suppression and echo cancellation, beam formation to identify the current sound source, and integration with the Windows speech recognition API.
Sample code and documentation
The SDK includes more than 100 pages of technical documentation. In addition to built-in help files, the documentation includes detailed walkthroughs for most samples provided with the SDK.
Easy installation
The SDK installs quickly, requires no complex configuration, and the complete installer size is less than 100 MB. Developers can get up and running in just a few minutes with a standard standalone Kinect sensor unit (widely available at retail outlets).
Nicholas Kolakowski, Application Development News, 6/16/11
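Skeletal tracking is the feature I'm most excited about for my game ideas: once the SDK hands you per-frame joint positions, even a very simple rule gets you a usable gesture. A Python sketch of the idea (illustration only - the real beta SDK is used from C#, C++, or Visual Basic, and "head", "hand_left", and "hand_right" are my own labels, not the SDK's joint names):

```python
def raised_hands(joints):
    """One frame of skeletal-tracking output as joint name -> (x, y, z)
    in metres, with y pointing up. Returns the hands held above the
    head -- about the simplest 'gesture' you can build on joint
    positions, and a deliberate, easy-to-perform one for the students
    I work with. (Joint names here are made up for illustration.)"""
    head_y = joints["head"][1]
    return [hand for hand in ("hand_left", "hand_right")
            if joints[hand][1] > head_y]

# A made-up frame: right hand up, left hand down.
frame = {"head": (0.0, 1.55, 2.0),
         "hand_left": (-0.3, 1.10, 2.0),
         "hand_right": (0.25, 1.80, 2.0)}
```

A real game would smooth the joint stream over several frames before firing the gesture, but the per-frame logic stays this simple.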
Apr 9, 2011
"Playful Interface Cultures" exhibit video showreel, just released (Ars Electronica Festival 2010)
Playful Interface Cultures from Interface Culture on Vimeo.
"This video showreel provides a walkthrough through the "Playful Interface Cultures" exhibition at the Ars Electronica Festival 2010. The works shown were created within the Master and Ph.D. program at the Interface Culture Lab, University of Art and Industrial Design in Linz, Austria." http://www.interface.ufg.ac.at/
"The exhibition design has been based on traditional Austrian tobacco brands in reference to the historic venue of the former tobacco factory built by the German designer and architect Peter Behrens." -Interface Culture Lab
Instructors:
Georg Russegger
Martin Kaltenbrunner
Michaela Ortner
Varvara Guljajeva
Design
Vesela Milhaylova
Henning Schulze
Production
Reinhard Gupfinger
Technical Support
Mar Canet
Anika Hirt
University of Art and Industrial Design, Linz
SOMEWHAT RELATED
TuioKinect, by Martin Kaltenbrunner: "A simple Tuio hand gesture tracker for Kinect"
Reactable
Mar 27, 2011
Multitouch and Gesture News (Part I) : NUITEQ Updates
In this post, I'd like to share some news from NUITEQ, a privately held multi-touch software company based in Sweden. NUITEQ is known for Snowflake Suite, a multi-touch application suite used in a variety of settings.
"NUITEQ is as happy as a kid in a candy shop to announce that it has released version upgrade 1.9.4 of its award-winning, industry leading multi-touch software product Snowflake Suite" - Harry van der Veen, NUITEQ
The screen shot below gives a nice view of the range of applications included in Snowflake Suite:


-Photo from NUITEQ
DEMONSTRATION OF SNOWFLAKE SUITE
RELATED
NUITEQ's CTO Mikael Bauer nominated for the Mer-IT award in the category IT-person of the year
NUITEQ Awarded CONNECT Norr Company of the Year 2011
NUITEQ's Partners
From the NUITEQ Gallery, via Harry van der Veen
"NUITEQ is as happy as a kid in a candy shop to announce that it has released version upgrade 1.9.4 of its award-winning, industry leading multi-touch software product Snowflake Suite" - Harry van der Veen, NUITEQ
The screen shot below gives a nice view of the range of applications that included in Snowflake Suite:


-Photo from NUITEQ
DEMONSTRATION OF SNOWFLAKE SUITE
RELATED
NUITEQ's CTO Mikael Bauer nominated for the Mer-IT award in the category IT-person of the year
NUITEQ Awarded CONNECT Norr Company of the Year 2011
NUITEQ's Partners
From the NUITEQ Gallery, via Harry van der Veen
Mar 16, 2011
Cute TV Commercial! Intel Gesture Recognition Technology
This was the first commercial I saw on TV this morning. I had to stop my morning routine to watch it. :)
RELATED
From ChannelIntel: Interview of Francis MacDougall, CTO of GestureTek
Posted by
Lynn Marentette
Mar 4, 2011
Window Shopping in the Web Outside: Interactive Window Displays (videos and links)
This is one of my favorite interactive window display projects:
Interactive Window Concept made for the module Advanced Interface Design at Hyper Island hosted by North Kingdom. Shot with a RED ONE camera at 120fps - Viralmente
Scion Interactive Window Campaign - Monster Media
Starbuck's Interactive Window Display PSFK Salon, 2/7/11
Starbucks Interactive Storefront from The Media Merchants on Vimeo.
Creative Agency: Blast Radius; Media Agency: MediaCom; Execution and Interactivity: The Media Merchants

Video: Gesture Interactive Window Display at a Repetto Store in France, by the Marcel Agency
http://www.repetto.com/boutique/home
Chanel Interactive Windows
"Some of the interactive shopping windows conceived & deployed by Vanksen (www.vanksen.com) for Chanel Coco-mademoiselle perfume relaunch."
LM3Labs' Ubiq'window
Ubiq'window installation from Nicolas Loeillot on Vimeo.
Hamleys Interactive Summer Window Display - One Off, Kingston University

Interactive Window Shopping: Just Wave Your Hands -Science Daily, 1/17/11

(Credit: Fraunhofer HHI)
EyeTouch Interactive Window Solution (Eyeclick)
Interactive Window Projection- Diesel
Interactive Catalog Window, by YDreams
Windows 7 Phone Launch: Interactive Window Display
Yourwindow, Vision2Watch
Interactive window from Vision2Watch on Vimeo.
Please leave a comment and a link of any other interactive window displays for my "collection"!
Labels:
blast radius,
chanel,
hamleys,
hyper island,
lm3labs,
Marcel Agency,
media merchants,
mediacom,
Repetto,
scion,
starbucks,
viralmente,
zenith
Mar 2, 2011
Video: Gesture Interactive Window Display at a Repetto Store in France, by the Marcel Agency
The video below features an interactive window created by the Marcel agency for a Repetto store in France. The display lets passers-by interact with the content through arm gestures, picked up by motion sensors.
Repetto - Vitrine interactive. from Marcel on Vimeo.
Labels:
dance,
gesture interaction,
interactive,
Marcel Agency,
motion sensor,
NUI,
Repetto
Feb 26, 2011
Why bother switching from GUI to NUI? - Asked and Answered by Josh Blake; My 2-cents; Stevie B’s Microsoft Research Video; Marco Silva’s NUI-HCI Presentation (and links)
In Chapter 1 of Natural User Interfaces in .NET, Josh Blake asks and answers a question posed by many people who have been under the spell of keyboard input and GUI/WIMP interaction:
Why bother switching from GUI to NUI? The answer? Read Chapter 1 (pdf) of the book - the chapter is free.
Here are a few of my personal reasons:
1. I want to buy the next version of the iPad or something like it.
2. I want to buy a new large-screen Internet HD TV.
3. I want to buy a Kinect.
4. I do NOT want to interact with my new TV with a Sony remote. Too many tiny buttons!
5. I do NOT want to interact with my new TV with a keyboard, because it reminds me of...work.
6. Most importantly:
I want to design apps for the people I care about, and others with similar needs:
My mom.
My grandson.
Moms and dads with kids in tow.
People with special needs and/or health concerns, and the people who care for and guide them.
Knowledge sharers and (life-long) learners....
RELATED
"Smart" Interactive Display Research
View more presentations from Marco Silva
My YouTube Playlist:
"Natural user interfaces, gesture interaction, multi-touch, natural interaction, post WIMP examples and more... "
Interactive Multimedia Technology Post:
Interactive Touch-Screen Technology, Participatory Design, and "Getting It": Revisited
RELATED - and somewhat related
Encyclopedia: Human Computer Interaction, Interaction Design, User Experience, Information Architecture, Usability and More (Interaction-Design.org)
"Blake.NUI is a collection of helpful controls, utilities, and samples useful for multi-touch and NUI development with WPF, Surface, and Silverlight."
OpenKinect Community Founder Joshua Blake Comes Up With His Own Version of Win 7 Multitouch on Kinect (Madhav K, KinectHacks 1/14/11)
Kinect SDK Could Open New Worlds for Motion-Controlled Interfaces (Mike Martin, TechNews World, 2/22/11)
(This is not an exhaustive list.)
GUI to NUI Post-WIMP Manifesto: TBA
Labels:
design,
HCI,
interaction,
Josh Blake,
kinect,
Marco Silva,
microsoft research,
NUI,
openkinect,
post Wimp,
programming,
seth sandler,
Stevie B,
usability
Feb 24, 2011
NCSU D.H. Hill Library has a Perceptive Pixel Multitouch Display (updated)
NCSU 'sandbox' lets students touch the technology
Jay Price, Charlotte Observer, 2/22/11

"The Perceptive Pixel display wall, which was switched on this week, is the $100,000 centerpiece of the new Technology Sandbox at the D.H. Hill Library. The sandbox is a room of cutting-edge, interactive hardware that's aimed partly at familiarizing students with the latest high-tech hardware, particularly large displays and gesture-based computing. It was paid for with a federal grant from the Institute of Museum and Library Services that is administered by the State Library of North Carolina."

Credit: Corey Lowenstein - clowenst@newsobserver.com
RELATED
Technology Sandbox introduces NCSU students to latest gadgets
WRAL Techwire, 8/25/10
New Learning Spaces for New Learning Styles (pdf)
Terry B. Hill, Mohan Ramaswamy, NCSU
HUNT LIBRARY
Perceptive Pixel
Feb 13, 2011
Wii Just Dance2 and Kinect Dance Central: UI and Usability Approaches; Challenges for Developing Accessible Games (revised)
I love to dance- I studied dance through college, and off and on as an adult. I have a DDR (Dance Dance Revolution) game-floor pad somewhere in my attic gathering dust. I'm ready for new challenges.
I'm planning on buying a couple of new dance games for the Wii and the Kinect. There is more to this story: given my interest in off-the-desktop, post-WIMP HCI (human-computer interaction), interactive multimedia and games, and my career as a school psychologist dedicated to young people with disabilities, I'm excited to see where new technologies, interfaces, and interactions will take us.
So what do the wise men of usability have to say about new ways of interacting with games and other applications?
"Kinect has many great design elements that clearly show that the team (a) knows usability, (b) did user testing, and (c) had management support to prioritize usability improvements, even when they required extra development work." -Jakob Nielsen
Jakob Nielsen, one of the godfathers of usability, shared a few words of wisdom about the Kinect in his 12/27/10 Alertbox post: Kinect Gestural UI: First Impressions. Although he did not review Dance Central, he concludes that the game he reviewed, Kinect Adventures, was fun to play, despite usability problems.
If this is a topic that interests you, I recommend you read Nielsen's post and take a look at the issues outlined there. Also take a look at a recent essay Nielsen co-authored with Don Norman, another godfather of usability: Gestural Interfaces: A Step Backwards in Usability.
Why is this topic important to me?
I have been involved in the Games for Health and Game Accessibility movement for many years. Lately I've been exploring the OpenKinect project with an aim to create ways of making movement-oriented games accessible for young people with more complex disabilities. For example, there is a need to have dance and movement games modified for students (and adults!) who need wheelchairs or walkers. There are students who have milder mobility challenges who love to dance, and the current games don't address their needs. Some of my students have vision or hearing impairments, too. They deserve a chance to play things designed for the Kinect.
"OpenKinect is an open community of people interested in making use of the amazing Xbox Kinect hardware with our PCs and other devices. We are working on free, open source libraries that will enable the Kinect to be used with Windows, Linux, and Mac."
Note: I currently work as a school psychologist with students up to age 22. My main office is adjacent to a large OT and PT room at Wolfe, a program for students who have special needs. We just had a large interactive whiteboard installed in the room that is begging for us to connect it with the school's Wii, and soon (we hope), a Kinect. If we are going to use dance games to help promote healthy activities among our special students, the games need to be accessible for students with cognitive, motor, and other limitations.
FIRST STEPS
Although I can dance, I understand what the world is like through the eyes of many of the young people I work with who have motor coordination and sensory integration problems that interfere with their ability to move and dance, let alone access fast-paced dance games on the Wii or Kinect.
My initial plan is to look at what the new dance games might be like from the view of someone who doesn't know how to dance and admits to having "two left feet" - and perhaps no sense of rhythm. Where would I start?
Wii's Just Dance2 seems to offer some support for learning how to dance through the use of simple movement icons, in the form of outlined figures, that provide information about how to move with the dancer on the screen. As you can see from the video below, the gamer is provided with information about upcoming moves throughout the game.
I decided to take a look at Just Dance2's MIKA "Big Girl" (You Are Beautiful) because some of the adolescent females I work with have weight concerns that interfere with their health. During the teen years, this can become a vicious cycle, resulting in less movement, and less participation with peers in physical activities, such as playing dance games. If a teen has depression as part of this mix, we know that exercise can help, and a fun dance game might be a life-saver, in more ways than one.
The screen shots below show how the movement icons are used in the game:
I thought it would be useful to learn more about the story behind the making of JustDance2.
At 2:22, Alexia, the project's usability expert, makes her presence known. From what I can tell, she focused on aspects of the game that would make it more usable for non-dancers, including those with "two left feet", to play the game. (I don't know if there was anyone consulted about accessibility concerns for the game.)
Kinect Dance Central
Dance Central uses a different approach when it comes to "teaching" people how to dance along through the game. It would be interesting to test out Dance Central and JustDance 2 with the same set of people to get a better feel for what works and what doesn't. Below is a video that previews, in split-screen, the interaction that takes place in Dance Central:
Dance Central Full Motion Preview
In Dance Central, gamers are provided with information about the moves through icons that cycle up the right-hand side of the screen. Keeping up with the moves requires a challenging level of coordination at times, even for people who are decent dancers. Players can select dances according to level of difficulty.
Kinect Usability with Regular People
Steve Cable (CX Partners) shared his team's look at usability issues related to the Kinect by testing several games, including Dance Central, with groups of people in his article, "Designing for Xbox Kinect - a usability study". The quote below is from Steve's article:
"We’ve loved playing with the Kinect. There’s no doubt that the game play is lots of fun. In-game menus are a barrier to that fun. Kinect should allow players to move through menus quickly and compensate for inaccuracy.
We felt the Kinect would benefit from some standardised global controls – much like a controller uses the A button to select and the B button to move backwards. We also think it needs a more responsive pause gesture – one that doesn’t interfere with the user’s game play.
Most of our participants found the Dance Central menu to be more effective, more efficient and more satisfying to use. Here are our recommendations for designing a Kinect menu interface:
- Allow users to make selections through positive gestures, rather than timed positions
- Place options on a single axis to make them easier and quicker to select
- Allow users to control menus with the game pad if they prefer
- Use large easy to read text
- Don’t make users scroll through options unnecessarily – it takes too long
- Users will be distracted if used in a social setting – test your menus in a social context to see if they are prone to errors
- Avoid the cursor metaphor, it’s not what gamers are used to seeing in game menus, and makes it harder to implement alternative joypad controls"
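The first recommendation - positive gestures instead of timed hover positions - is easy to picture in code. A hover menu waits out a dwell timer; a push-to-select menu fires as soon as the hand moves decisively toward the screen. A little Python sketch of the push idea (my own illustration with made-up numbers, not anything from the article or the Kinect SDK):

```python
def push_to_select(depth_samples, resting_depth, threshold=0.15):
    """Scan per-frame hand depths (metres from the sensor) and fire a
    selection the moment the hand has moved toward the screen by more
    than `threshold` relative to its resting depth -- a deliberate,
    'positive' gesture rather than a hover-and-wait timer.
    (Numbers and signature are illustrative, not from any real SDK.)"""
    for frame, depth in enumerate(depth_samples):
        if resting_depth - depth > threshold:
            return frame  # selection fires here
    return None  # no push detected

# Hand drifts toward the screen and crosses the threshold at frame 3.
selected_at = push_to_select([2.0, 1.95, 1.90, 1.80, 1.70], resting_depth=2.0)
```

The nice property is that small, accidental drift never selects anything, while a deliberate push registers immediately instead of after a dwell delay.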
Below are screen shots that provide examples of how the movement icons are displayed in Dance Central:
RELATED
Just Dance 2 Review: Get your body moving. No, really. Give it a shot.
Keza MacDonald, 10/19/10
Labels:
accessibility,
Dance Central,
Games for Health,
Just Dance2,
kinect,
NUI design,
openkinect,
quora,
review,
usability,
video,
Wii
Feb 3, 2011
Interactions (ACM) Cover Article- "Proxemic Interactions: The New Ubicomp?" - Plus - Close encounters with displays at the airport and JC Penney

Today I came home and found one of my favorite publications, Interactions, on my kitchen counter. Imagine my excitement when I saw that the cover article for the Jan-Feb issue was "Proxemic Interactions: The New Ubicomp?" (pdf). The authors of this article are Saul Greenberg, Nicolai Marquardt, Till Ballendat, Rob Diaz-Marino, and Miaosen Wang, from the University of Calgary. (A list of some of the articles from the current issue of Interactions can be found at the end of this post.)
On the topic of proxemic interactions: I welcome the research that is germinating in this area, and agree with the authors of the Interactions cover article that there is much work to be done in this space!
I've been searching for great examples of interactive displays in public spaces. Since there is much room for improvement in this field, much of what I share on this blog is a bit disheartening. Most of my user-unfriendly encounters with large displays happen when I'm in an airport, mall, health care facility, or a cruise ship. Things are improving, but at a slow pace. The best work I've seen so far has been in museums.
A few weeks ago I wrote a post about my nearly comical interaction with a large display that featured a QR tag that simply would NOT cooperate with my smartphone's tag reader. What were they thinking when they decided that the best place for a QR tag on a large display should be at knee level?
Interactive Display with QR Tag: Close Encounter at the Orlando Airport
There is more to this story! On my way home, I had a few hours to kill at the Orlando airport. I stopped by a convenience store at the airport and discovered the non-static display below. I hadn't noticed the display the first time I was in the store that day.
I spotted another QR tag!

There were a few problems with this set-up. First, the display was mounted on a wall close to the ceiling, making it very inconvenient to scan with my phone. The page with the QR tag didn't stay up long enough for me to attempt a scan, although I waited around until it cycled back into view - three times. I gave up and took a picture instead, standing on my tip-toes with my arms extended over my head - the same position required to scan the QR tag, if it would only stay put!
Another problem was that the display was perched above a high-traffic area, right above the refrigerated drinks. The only place to scan the QR tag was in a narrow aisle, behind the throngs of thirsty travelers making a bee-line to the cold drink section of the store. I was in the way of a number of people in this bee-line.
What were they thinking when they decided that the best place for this large display featuring a QR tag would be several feet above eye level in this particular location?

The displays in front of the store were located on either side of the doorway, at "body" level. This is a logical place to put a display. Potential customers can view the displays without getting in the way of other travelers.
I didn't notice QR tags on these displays. QR tags linking to web-based reviews of books I'd like to purchase would be useful in this scenario.
Non-static displays outside of the store:


PART II: Interactive Touch Screen Display Encounter at JC Penney
Interactive displays are cropping up everywhere. The display featured below is part of the Find More at JC Penney campaign. The following is a quote from a JC Penney press release:
"JC Penney has announced the launch of a new mobile initiative that connects mobile commerce with social and instore media applications that include location-based check-in offers and enhancements to its mobile apps. JCPenney's mobile initiatives are part of the company's investment in its overall digital platform, which also includes jcp.com, social media, and its in-store FindMore™ smart fixture."
"The smart fixture integrates JCPenney's online and offline stores by providing customers with a 52-inch interactive touch screen experience that allows them to view and purchase items from jcp.com while in a JCPenney store. Customers can also scan a bar code to get product details, access recommendations, or email data about an item of interest to themselves or friends."
Can you find the large interactive touch screen display? After shopping around in the mall, I decided to go to JC Penney and check out the sales. Below is what I passed as I entered the store. I went up and down the main aisle a few times before I noticed a large interactive display along a side aisle.

The stack of red holiday boxes obstructed my view:

I didn't notice the display the first few times I walked in the other direction. From the back, the display looked like a poster.

When I finally discovered the display, I was impressed with the slick graphics and stylish models.

I wasn't so impressed by the placement of the display. To interact with the screen, I had to stand in a narrow aisle, in the path of a steady stream of slow-walking shoppers, often moving in small groups, including parents with babies in strollers.
I wasn't impressed by the confusing array of input methods on the display. I noticed that the light blue square in the lower left-hand corner of the display is designed to provide access for people with disabilities. I couldn't figure it out.

Accessibility: There is a need for research in this area!

I shot video of my interaction with the screen, but found that I was in the way of shoppers, so I stopped. Not to worry. Below is a video taken by Mike Cearly, shared on his 11th Screen blog post, "Out and About: JC Penney's "Find More" Touch Screen". (Mike's impressions of this display were similar to mine, and worth reading if this is an area that interests you.)
Mike uses his 11th Screen Scorecard to evaluate the technology he encounters in public spaces. I am updating a similar rubric to use when evaluating displays in public spaces. It is important to look at multiple dimensions, such as the physical placement of the display, weather and lighting conditions, how the display interoperates with mobile devices or a system of displays, how the content is presented/accessed across channels and devices, and how users share the experience with others.
Components of the 11th Screen Scorecard -Mike Cearly
Purpose: "What is the purpose of the solution? Is it to drive awareness? Acquisition? Loyalty? What is the brand trying to accomplish in this medium?"
Drama: "Does the solution make a big impact on the user? Does it make them stop and interact?"
Usability: "Can the user navigate through the experience with ease? Are the paths to information intuitive? There's also an element of functions, too, but I think that is much more subjective. Do the functions enhance the user experience?"
Interactivity: "How does the user interact with it? Is it gesture based? Is it touch-based? Can the user interact with it through any other enabling technology?"
Information: "How much and what kind of content is available for the user to interact with? Generally speaking, the more information and the different formats of information, the better."
Personalization: "What level of personalization does the experience provide?"
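For readers who like to tinker, a scorecard along these lines can be captured in a few lines of code. This is my own hypothetical sketch - the dimension names mirror Mike's categories, but the 1-5 rating scale and the simple average are my assumptions, not his actual scoring method:

```python
# Hypothetical public-display scorecard, loosely modeled on the 11th Screen
# dimensions above. The 1-5 scale and averaging are illustrative assumptions.

DIMENSIONS = ("purpose", "drama", "usability",
              "interactivity", "information", "personalization")

def score_display(ratings):
    """Average the 1-5 ratings across all scorecard dimensions."""
    missing = [d for d in DIMENSIONS if d not in ratings]
    if missing:
        raise ValueError(f"missing dimensions: {missing}")
    return sum(ratings[d] for d in DIMENSIONS) / len(DIMENSIONS)

# Example: an informal, made-up scoring of a store touch-screen display
# (strong visuals and content, weak usability and personalization).
example = {"purpose": 4, "drama": 4, "usability": 2,
           "interactivity": 3, "information": 4, "personalization": 2}
overall = score_display(example)
```

A flat average is the simplest choice; a weighted version (say, weighting usability more heavily for displays in high-traffic aisles) would be an easy extension.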
I've been obsessed with large displays and the amazing potential they hold for supporting all sorts of interactions - collaboration, planning, creating, information sharing, playing, viewing, communicating, learning, shopping, way-finding, artistic expression.
I'm also drawn to smaller screens and how we use our mobile devices - smartphones, iPhones, e-readers, digital cameras, etc. - to interact with the displays we encounter in public spaces. We are just beginning to figure out how technology can effectively - and seamlessly - support interaction between people who are co-located as well as located in different places.
INTERACTIONS Volume XVIII.1
Here is a list of some articles from the current issue of Interactions:
The (re)usability of everyday computational things: why industrial design will be the new interaction design (Roel Vertegaal)
Visual analytics and human-computer interaction (Richard Arias-Hernandez, John Dill, Brian Fisher, Tera Marie Green)
The Cloud (Yue Pan, Eli Blevis)
Beyond interfaces and flows: abstractions for mapping organic architectures (Davide Bolchini, Adam Neddo)