In this post, I'd like to share some news from NUITEQ, a privately held multi-touch software technology company based in Sweden. NUITEQ is known for Snowflake Suite, a multi-touch application that is used in a variety of situations.
"NUITEQ is as happy as a kid in a candy shop to announce that it has released version upgrade 1.9.4 of its award-winning, industry leading multi-touch software product Snowflake Suite" - Harry van der Veen, NUITEQ
The screen shot below gives a nice view of the range of applications that are included in Snowflake Suite:
This is one of my favorite interactive window display projects:
Interactive Window Concept made for the module Advanced Interface Design at Hyper Island hosted by North Kingdom. Shot with a RED ONE camera at 120fps - Viralmente
"Some of the interactive shopping windows conceived & deployed by Vanksen (www.vanksen.com) for Chanel Coco-mademoiselle perfume relaunch."
LM3Labs' Ubiq'window
The video below features an interactive window created by the Marcel agency for a Repetto store in France. The display allows passers-by to interact with the content in the display through arm gestures, via motion sensors.
In Chapter 1 of Natural User Interfaces in .NET, Josh Blake asks and answers a question posed by many people who have been under the spell of keyboard input and GUI/WIMP interaction:
Why bother switching from GUI to NUI? The answer? Read Chapter 1 (pdf) of the book - the chapter is free.
Here are a few of my personal reasons:
1. I want to buy the next version of the iPad or something like it.
2. I want to buy a new large-screen Internet HD TV.
3. I want to buy a Kinect.
4. I do NOT want to interact with my new TV with a Sony remote. Too many tiny buttons!
5. I do NOT want to interact with my new TV with a keyboard, because it reminds me of...work.
6. Most importantly:
I want to design apps for the people I care about, and others with similar needs:
My mom.
My grandson.
Moms and dads with kids in tow.
People with special needs and/or health concerns, and the people who care for and guide them.
Knowledge sharers and (life-long) learners....
"The Perceptive Pixel display wall, which was switched on this week, is the $100,000 centerpiece of the new Technology Sandbox at the D.H. Hill Library. The sandbox is a room of cutting-edge, interactive hardware that's aimed partly at familiarizing students with the latest high-tech hardware, particularly large displays and gesture-based computing. It was paid for with a federal grant from the Institute of Museum and Library Services that is administered by the State Library of North Carolina."
I love to dance. I studied dance through college, and off and on as an adult. I have a DDR (Dance Dance Revolution) game-floor pad gathering dust somewhere in my attic. I'm ready for new challenges.
I'm planning on buying a couple of new dance games for the Wii and the Kinect. There is more to this story: given my interest in off-the-desktop, post-WIMP HCI (human-computer interaction), interactive multimedia and games, and a career as a school psychologist dedicated to young people with disabilities, I'm excited to see where new technologies, interfaces, and interactions will take us.
So what do the wise men of usability have to say about new ways of interacting with games and other applications?
"Kinect has many great design elements that clearly show that the team (a) knows usability, (b) did user testing, and (c) had management support to prioritize usability improvements, even when they required extra development work." -Jakob Nielsen
Jakob Nielsen, one of the godfathers of usability, shared a few words of wisdom about the Kinect in his 12/27/10 Alertbox post: Kinect Gestural UI: First Impressions. Although he did not review Dance Central, he concludes that the game he reviewed, Kinect Adventures, was fun to play, despite usability problems.
If this is a topic that interests you, I recommend you read Nielsen's post and take a look at the usability issues that are outlined in it. Also take a look at a recent essay Nielsen co-authored with Don Norman, another godfather of usability: Gestural Interfaces: A Step Backwards In Usability
Why is this topic important to me?
I have been involved in the Games for Health and Game Accessibility movement for many years. Lately I've been exploring the OpenKinect project with an aim to create ways of making movement-oriented games accessible for young people with more complex disabilities. For example, there is a need to have dance and movement games modified for students (and adults!) who need wheelchairs or walkers. There are students who have milder mobility challenges who love to dance, and the current games don't address their needs. Some of my students have vision or hearing impairments, too. They deserve a chance to play things designed for the Kinect.
"OpenKinect is an open community of people interested in making use of the amazing Xbox Kinect hardware with our PCs and other devices. We are working on free, open source libraries that will enable the Kinect to be used with Windows, Linux, and Mac."
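To give a flavor of what a modification for seated players might involve: the Kinect delivers each depth frame as a 2D grid of distance values, so a game mod could ignore everything below chair height and score only upper-body movement. The snippet below is a hypothetical sketch of that idea - the function names are my own, a plain NumPy array stands in for a real depth frame, and none of this is part of the OpenKinect API.

```python
import numpy as np

def upper_body_mask(depth_frame, near_mm=500, far_mm=2500, top_fraction=0.6):
    """Keep only depth pixels within range and in the upper part of the frame,
    so a seated player's arm movements register while the wheelchair does not.
    (Hypothetical sketch -- names and thresholds are my own assumptions.)"""
    rows = depth_frame.shape[0]
    mask = (depth_frame > near_mm) & (depth_frame < far_mm)
    mask[int(rows * top_fraction):, :] = False  # drop the lower part of the frame
    return mask

def movement_score(prev_frame, cur_frame, threshold_mm=40):
    """Fraction of upper-body pixels that moved more than threshold_mm
    between two frames -- a crude stand-in for a dance-move detector."""
    mask = upper_body_mask(cur_frame)
    if not mask.any():
        return 0.0
    moved = np.abs(cur_frame.astype(int) - prev_frame.astype(int)) > threshold_mm
    return float((moved & mask).sum() / mask.sum())

# Toy frames: a 4x4 "scene" two meters away, with one upper pixel moving
prev = np.full((4, 4), 2000)
cur = prev.copy()
cur[0, 0] = 1800  # an arm moved 200 mm closer to the sensor
print(movement_score(prev, cur))  # 0.125 -- one of the eight upper pixels moved
```

The point of the sketch is that the accessibility change lives entirely in the mask: the same movement-scoring logic could serve standing players (full frame) and seated players (upper portion only) without redesigning the game.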
Note: I currently work as a school psychologist with students up to age 22. My main office is adjacent to a large OT and PT room at Wolfe, a program for students who have special needs. We just had a large interactive whiteboard installed in the room that is begging for us to connect it with the school's Wii, and soon (we hope), a Kinect. If we are going to use dance games to help promote healthy activities among our special students, the games need to be accessible for students with cognitive, motor, and other limitations.
FIRST STEPS
Although I can dance, I understand what the world is like through the eyes of many of the young people I work with, who have motor coordination and sensory integration problems that interfere with their ability to move and dance, let alone access fast-paced dance games on the Wii or Kinect.
My initial plan is to look at what the new dance games might be like from the view of someone who doesn't know how to dance and admits to having "two left feet" - and perhaps, no sense of rhythm. Where would I start?
The Wii's Just Dance 2 seems to offer some support for learning how to dance through the use of simple movement icons, in the form of outlined figures, that provide information about how to move with the dancer on the screen. As you can see from the video below, the gamer is provided with information about upcoming moves throughout the game.
I decided to take a look at Just Dance 2's MIKA track "Big Girl (You Are Beautiful)" because some of the adolescent females I work with have weight concerns that interfere with their health. During the teen years, this can become a vicious cycle, resulting in less movement, and less participation with peers in physical activities, such as playing dance games. If a teen has depression as part of this mix, we know that exercise can help, and a fun dance game might be a life-saver, in more ways than one.
The screen shots below show how the movement icons are used in the game:
I thought it would be useful to learn more about the story behind the making of Just Dance 2. At 2:22, Alexia, the project's usability expert, makes her presence known. From what I can tell, she focused on aspects of the game that would make it more usable for non-dancers, including those with "two left feet". (I don't know if anyone was consulted about accessibility concerns for the game.)
Kinect Dance Central
Dance Central uses a different approach when it comes to "teaching" people how to dance along through the game. It would be interesting to test out Dance Central and Just Dance 2 with the same set of people to get a better feel for what works and what doesn't. Below is a video that previews, in split-screen, the interaction that takes place in Dance Central:
Dance Central Full Motion Preview
In Dance Central, gamers are provided with information about the moves through icons that cycle up the right-hand side of the screen. The level of dance coordination required to keep up with the moves is challenging at times, even for people who are OK at dancing. Players can select dances according to level of difficulty.
Kinect Usability with Regular People
Steve Cable (CX Partners) shared his team's look at usability issues related to the Kinect by testing several games, including Dance Central, with groups of people in his article, "Designing for Xbox Kinect - a usability study". The quote below is from Steve's article:
"We’ve loved playing with the Kinect. There’s no doubt that the game play is lots of fun. In-game menus are a barrier to that fun. Kinect should allow players to move through menus quickly and compensate for inaccuracy.
We felt the Kinect would benefit from some standardised global controls – much like a controller uses the A button to select and the B button to move backwards. We also think it needs a more responsive pause gesture – one that doesn’t interfere with the user’s game play.
Most of our participants found the Dance Central menu to be more effective, more efficient and more satisfying to use. Here are our recommendations for designing a Kinect menu interface:
Allow users to make selections through positive gestures, rather than timed positions
Place options on a single axis to make them easier and quicker to select
Allow users to control menus with the game pad if they prefer
Use large easy to read text
Don’t make users scroll through options unnecessarily – it takes too long
Users will be distracted if used in a social setting – test your menus in a social context to see if they are prone to errors
Avoid the cursor metaphor, it’s not what gamers are used to seeing in game menus, and makes it harder to implement alternative joypad controls"
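The first recommendation above - positive gestures rather than timed positions - is worth lingering on. A timed-hover menu eventually selects whatever the hand happens to be resting over, while a push-to-select menu waits for a deliberate forward motion. The toy detector below is my own illustration of that distinction, not CX Partners' implementation or a real Kinect SDK call: it scans a series of hand-depth samples (distance to the sensor, in millimeters) and fires only when the hand moves toward the screen far enough within a short window.

```python
def detect_push(depth_samples_mm, min_travel_mm=120, max_frames=8):
    """Return True if the hand moved at least min_travel_mm toward the
    sensor within any window of max_frames samples -- a deliberate
    'positive gesture', as opposed to simply hovering in place.
    (Toy sketch; thresholds are assumptions, not Kinect SDK values.)"""
    for start in range(len(depth_samples_mm)):
        for end in range(start + 1, min(start + max_frames, len(depth_samples_mm))):
            if depth_samples_mm[start] - depth_samples_mm[end] >= min_travel_mm:
                return True
    return False

hovering = [900, 905, 898, 902, 899, 901]  # hand resting over a menu option
push = [900, 880, 840, 790, 750, 760]      # hand pushes toward the screen
print(detect_push(hovering))  # False -- hovering alone never selects
print(detect_push(push))      # True -- a deliberate push does
```

A hover-timer version of the same menu would select the resting hand after a couple of seconds, which is exactly the accidental-selection problem the CX Partners team observed.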
Below are screen shots that provide examples of how the movement icons are displayed in Dance Central:
Today I came home and found one of my favorite publications, Interactions, on my kitchen counter. Imagine my excitement when I saw that the cover article for the Jan-Feb issue was "Proxemic Interactions: The New Ubicomp?" (pdf). The authors of this article are Saul Greenberg, Nicolai Marquardt, Till Ballendat, Rob Diaz-Marino, and Miaosen Wang, from the University of Calgary. (A list of some of the articles from the current issue of Interactions can be found at the end of this post.)
On the topic of proxemic interactions.... I welcome the research that is germinating in this area, and agree with the authors of the Interactions cover article that there is much work to be done in this space!
I've been searching for great examples of interactive displays in public spaces. Since there is much room for improvement in this field, much of what I share on this blog is a bit disheartening. Most of my user-unfriendly encounters with large displays happen when I'm in an airport, mall, health care facility, or a cruise ship. Things are improving, but at a slow pace. The best work I've seen so far has been in museums.
A few weeks ago I wrote a post about my nearly comical interaction with a large display that featured a QR tag that simply would NOT cooperate with my smartphone's tag reader. What were they thinking when they decided that the best place for a QR tag on a large display should be at knee level? Interactive Display with QR Tag: Close Encounter at the Orlando Airport
There is more to this story! On my way home, I had a few hours to kill at the Orlando airport. I stopped by a convenience store there and discovered the non-static display below. I hadn't noticed the display the first time I was in the store that day.
I spotted another QR tag! There were a few problems with this set-up. First, the display was located on a wall close to the ceiling, making it very inconvenient to scan with my phone. The page with the QR tag didn't stay up long enough for me to attempt to scan it, although I waited around until it cycled back into view - three times. I gave up and took a picture instead, standing on my tip-toes with my arms extended over my head, the same position required to scan the QR tag, if it would stay put!
Another problem was that the display was perched above a high-traffic area, right above the refrigerated drinks. The only place to scan the QR tag was in a narrow aisle, behind the throngs of thirsty travelers making a bee-line to the cold drink section of the store. I was in the way of a number of people in this bee-line.
What were they thinking when they decided that the best place for this large display featuring a QR tag should be several feet above eye-level in this particular location? The displays in front of the store were located on either side of the doorway, at "body" level. This is a logical place to put a display. Potential customers can view the displays without getting in the way of other travelers.
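There is even a rough rule of thumb that makes the problem concrete: the comfortable scanning distance for a QR code is often said to be about ten times the printed width of the code. The little helper below is my own illustration (the 10:1 ratio is a common heuristic, not a formal standard), and it shows why a tag near the ceiling fails: at the distance a shopper can actually hold a phone from it, the code would need to be far larger than the ones printed on these displays.

```python
def min_qr_width_cm(scan_distance_cm, distance_to_size_ratio=10):
    """Minimum printed QR width for a given scanning distance, using the
    common 10:1 distance-to-size rule of thumb (an assumption, not a spec)."""
    return scan_distance_cm / distance_to_size_ratio

# At arm's length from a doorway display (~50 cm), a modest 5 cm tag works:
print(min_qr_width_cm(50))   # 5.0
# For a tag near the ceiling, roughly 250 cm from a phone held overhead:
print(min_qr_width_cm(250))  # 25.0
```

A 25-centimeter QR code is poster-sized - which is another way of saying: put the tag at eye level instead.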
I didn't notice QR tags on these displays. QR tags linking to web-based reviews of books I'd like to purchase would be useful in this scenario.
Non-static displays outside of the store:
PART II: Interactive Touch Screen Display Encounter at JC Penney
Interactive displays are cropping up everywhere. The display featured below is part of the Find More at JC Penney campaign. The following is a quote from a JC Penney press release: "JC Penney has announced the launch of a new mobile initiative that connects mobile commerce with social and instore media applications that include location-based check-in offers and enhancements to its mobile apps. JCPenney's mobile initiatives are part of the company's investment in its overall digital platform, which also includes jcp.com, social media, and its in-store FindMore™ smart fixture."
"The smart fixture integrates JCPenney's online and offline stores by providing customers with a 52-inch interactive touch screen experience that allows them to view and purchase items from jcp.com while in a JCPenney store. Customers can also scan a bar code to get product details, access recommendations, or email data about an item of interest to themselves or friends."
Can you find the large interactive touch screen display? After shopping around in the mall, I decided to go to JC Penney and check out the sales. Below is what I passed as I entered the store. I went up and down the main aisle a few times before I noticed a large interactive display along a side aisle.
The stack of red holiday boxes obstructed my view:
I didn't notice the display the first few times I walked in the other direction. From the back, the display looked like a poster. When I finally discovered the display, I was impressed with the slick graphics and stylish models.
I wasn't so impressed by the placement of the display. To interact with the screen, I had to stand in a narrow aisle, in the path of a steady stream of slow-walking shoppers, often moving in small groups, including parents with babies in strollers.
I also wasn't impressed by the confusing array of input methods on the display. I noticed that the light blue square in the lower left-hand corner of the display is designed to provide access for people with disabilities, but I couldn't figure it out. Accessibility: there is a need for research in this area!
The accessibility feature was confusing to use. The virtual trackball activated a cursor on the screen, but the cursor did not accurately track the virtual trackball movements. This section of the screen was located in the lower left-hand corner, which would be convenient for someone in a wheelchair to use. The placement of the "accessibility window" was too low, however, for someone who might want to interact with the screen in an accessible manner but doesn't happen to need a wheelchair.
I shot video of my interaction with the screen, but found that I was in the way of shoppers, so I stopped. Not to worry. Below is a video taken by Mike Cearly, shared on his 11th Screen blog post, "Out and About: JC Penney's "Find More" Touch Screen". (Mike's impressions of this display were similar to mine and are worth reading if this is an area that interests you.)
Mike uses his 11th Screen Scorecard to evaluate the technology he encounters in public spaces. I am updating a similar rubric to use when evaluating displays in public spaces. It is important to look at multiple dimensions, such as the physical placement of the display, weather and lighting conditions, how the display interoperates with mobile devices or a system of displays, how the content is presented/accessed across channels and devices, and how users share the experience with others.
Components of the 11th Screen Scorecard - Mike Cearly

Purpose: "What is the purpose of the solution? Is it to drive awareness? Acquisition? Loyalty? What is the brand trying to accomplish in this medium?"

Drama: "Does the solution make a big impact on the user? Does it make them stop and interact?"

Usability: "Can the user navigate through the experience with ease? Are the paths to information intuitive? There's also an element of functions, too, but I think that is much more subjective. Do the functions enhance the user experience?"

Interactivity: "How does the user interact with it? Is it gesture based? Is it touch-based? Can the user interact with it through any other enabling technology?"

Information: "How much and what kind of content is available for the user to interact with? Generally speaking, the more information and the different formats of information, the better."

Personalization: "What level of personalization does the experience provide?"
COMMENT
I've been obsessed with large displays and the amazing potential they hold for supporting all sorts of interactions - collaboration, planning, creating, information sharing, playing, viewing, communicating, learning, shopping, way-finding, artistic expression.
I'm also drawn to smaller screens and how we use our mobile devices - smartphones, iPhones, e-readers, digital cameras, etc. - to interact with the displays we encounter in public spaces. We are just beginning to figure out how technology can effectively - and seamlessly - support interaction between people who are co-located as well as located in different places.
INTERACTIONS Volume XVIII.1
Here is a list of some articles from the current issue of Interactions:
"Our research shows that the vast majority of people polled in both developed and emerging markets see great potential for NUI applications beyond entertainment. This is especially true in China and India, where 9 out of 10 respondents indicate they are likely to use NUI technology across a range of lifestyle areas – from work, education and healthcare, to social connections, entertainment and the environment. We believe that taking technology to the next billion can be aided by NUI – making technology more accessible and more intuitive to a wider audience". - Steve Clayton, Microsoft
The people at Microsoft don't own the concept! I've been a member of the NUI Group since May 2007, and also belong to SparkOn. Both are on-line communities where you can find people who live and breathe NUI, learn about their work, and even share designs and code. If you are intrigued by NUI - as a designer, developer, or user - please join us.
Note: I've been an evangelist and cheerleader for the NUI cause for many years. If you search this blog for "post-WIMP", "NUI", "multi-touch", "gesture", "off-the-desktop", "natural user interaction", "natural user interface", or even "DOOH", you'll be provided with an overwhelming number of posts that include videos, photographs, and links to NUI-related resources, including scholarly articles. There is a small-but-growing number of people from many disciplines, quietly working on NUI-related projects.