
Apr 3, 2011

INTERACTIVE TABLETOPS AND SURFACES 2011 CALL FOR PAPERS AND NOTES (6th Annual ACM ITS 2011 - Nov. 13-16)

Thanks to Johannes Schöning for sharing this information!

CALL FOR PAPERS AND NOTES
6th Annual ACM Conference on Interactive Tabletops and Surfaces 2011
ITS 2011
November 13-16, 2011
Portopia Hotel, Kobe, Japan
The Interactive Tabletops and Surfaces 2011 Conference (ITS) is a premier venue for presenting research in the design and use of new and emerging tabletop and interactive surface technologies. As a young community, we embrace the growth of the discipline in a wide variety of areas, including innovations in ITS hardware, software, and design, and projects expanding our understanding of the design considerations of ITS technologies and of their applications.

Building on its success in previous years, ITS again features Papers and Notes presentations, as well as tutorial, poster, demonstration, and doctoral symposium tracks. ITS 2011 will also include workshops.

ITS 2011 will bring together top researchers and practitioners who are interested in both the technical and human aspects of ITS technology. On behalf of the conference organizing committee, we invite you to begin planning your submissions and participation for this year's conference!

CALL FOR PARTICIPATION: PAPERS AND NOTES
The use of interactive surfaces is an exciting and emerging research area. Display technologies, such as projectors, LCD and OLED flat panels, and even flexible display substrates, coupled with input sensors capable of enabling direct interaction, make it reasonable to envision a not-so-distant future in which many of the common surfaces in our environment will function as digital interactive displays. ITS brings together researchers and practitioners from a variety of backgrounds and interests, such as camera and projector based systems, new display technologies, multi-touch sensing, user interface technologies, augmented reality, computer vision, multimodal interaction, novel input and sensing technologies, computer supported cooperative work (CSCW), and information visualization.

The intimate size of this single-track symposium provides an ideal venue for leading researchers and practitioners to exchange research results and experiences. We encourage submissions on (but not limited to) the following topic areas as they relate to interactive tabletops and surfaces:

* Gesture-based interfaces
* Multi-modal interfaces
* Tangible interfaces
* Novel interaction techniques
* Data handling/exchange on large interactive surfaces
* Data presentation on large interactive surfaces
* Software engineering methods
* Computer supported collaborative work
* Middleware and network support
* Virtual reality and augmented reality
* Social protocols
* Information visualization
* Hardware, including sensing and input technologies with novel capabilities
* Human-centered design & methodologies
* Applications

CONFERENCE COMMITTEE
Conference Co-Chairs
Jun Rekimoto, The University of Tokyo / SonyCSL, Japan
Hideki Koike, University of Electro-Communications, Japan
Kentaro Fukuchi, Meiji University, Japan

Program Co-Chairs
Yoshifumi Kitamura, Tohoku University, Japan
Daniel Wigdor, University of Toronto, Canada

SUBMISSIONS
Papers/Notes: We invite submissions of two kinds: Papers (10 pages) and Notes (4 pages). Papers must present original, highly innovative, prospective and forward-looking research, possibly in one or more of the themes given above. Notes must also report novel and complete research, but where the scope and scale of the contribution is more focused and succinct than in Papers. Submissions must be a single PDF file in the ACM format, uploaded through the submission system. A template for submissions can be found on the ITS 2011 website.

All accepted submissions will be presented at ITS 2011, appear in the ITS digital proceedings, and be archived in the ACM Digital Library.

IMPORTANT DATES
Paper/Note Submissions: June 30, 2011
Paper/Note Author Notifications: September 2, 2011
Paper/Note Camera-Ready Deadline: October 21, 2011

The calls for the posters, demos, tutorials, workshops, and the doctoral symposium will be published soon; please check our website for further information.

RELATED
ITS 2010 Awards

Best Paper: Proxemic Interaction: Designing for a Proximity and Orientation-Aware Environment by Till Ballendat, Nicolai Marquardt, Saul Greenberg
Best Note: MudPad: Tactile Feedback and Haptic Texture Overlay for Touch Surfaces by Yvonne Jansen, Thorsten Karrer, Jan Borchers
Best Poster: A Multi-Touch Alignment Guide for Interactive Displays by Mathias Frisch, Ricardo Langner, Sebastian Kleinau, Raimund Dachselt
Best Demo: MudPad - A Tactile Memory Game by Yvonne Jansen, Thorsten Karrer, Jan Borchers




Who's Who
ITS 2010 Conference General Chairs
Johannes Schöning, DFKI GmbH
Antonio Krüger, DFKI GmbH

ITS 2010 Organization Committee

Daniel Wigdor, one of the ITS 2011 Program Co-Chairs, co-authored the following book with Dennis Wixon. The book has not yet been released.
 
Brave NUI World: Designing User Interfaces for Touch and Gesture



Note: This post includes links; please let me know of anything I should correct.

Mar 27, 2011

Multitouch and Gesture News (Part I) : NUITEQ Updates

In this post, I'd like to share some news from NUITEQ, a privately held multi-touch software company based in Sweden. NUITEQ is known for Snowflake Suite, a multi-touch application suite used in a variety of settings.


"NUITEQ is as happy as a kid in a candy shop to announce that it has released version upgrade 1.9.4 of its award-winning, industry leading multi-touch software product Snowflake Suite" - Harry van der Veen, NUITEQ


The screen shot below gives a nice view of the range of applications included in Snowflake Suite:
http://www.nuiteq.com/images/menu01.png


-Photo from NUITEQ


DEMONSTRATION OF SNOWFLAKE SUITE




RELATED
NUITEQ's CTO Mikael Bauer nominated for the Mer-IT award in the category IT-person of the year
NUITEQ Awarded CONNECT Norr Company of the Year 2011
NUITEQ's Partners
From the NUITEQ Gallery, via Harry van der Veen

Mar 16, 2011

Cute TV Commercial! Intel Gesture Recognition Technology

This was the first commercial I saw on TV this morning. I had to stop my morning routine to watch it : )



RELATED
From ChannelIntel: Interview of Francis MacDougall, CTO of GestureTek

Mar 4, 2011

Window Shopping in the Web Outside: Interactive Window Displays (videos and links)

This is one of my favorite interactive window display projects:

Interactive Window Concept made for the module Advanced Interface Design at Hyper Island hosted by North Kingdom. Shot with a RED ONE camera at 120fps - Viralmente

Scion Interactive Window Campaign - Monster Media


Starbucks Interactive Window Display, PSFK Salon, 2/7/11

Starbucks Interactive Storefront from The Media Merchants on Vimeo.
Creative Agency: Blast Radius; Media Agency: MediaCom; Execution and Interactivity: The Media Merchants

Video: Gesture Interactive Window Display at a Repetto Store in France, by the Marcel Agency

http://www.repetto.com/boutique/home

Chanel Interactive Windows

"Some of the interactive shopping windows conceived & deployed by Vanksen (www.vanksen.com) for Chanel Coco-mademoiselle perfume relaunch."
LM3Labs' Ubiq'window

Ubiq'window installation from Nicolas Loeillot on Vimeo.


Hamleys Interactive Summer Window Display - One Off, Kingston University
Hamleys interactive summer window display
Interactive Window Shopping: Just Wave Your Hands -Science Daily, 1/17/11

(Credit: Fraunhofer HHI)

EyeTouch Interactive Window Solution (Eyeclick)


Interactive Window Projection- Diesel


Interactive Catalog Window, by YDreams


Windows 7 Phone Launch: Interactive Window Display


Yourwindow, Vision2Watch

Interactive window from Vision2Watch on Vimeo.


Please leave a comment and a link to any other interactive window displays for my "collection"!

Mar 2, 2011

Video: Gesture Interactive Window Display at a Repetto Store in France, by the Marcel Agency

The video below features an interactive window created by the Marcel agency for a Repetto store in France. The display allows passers-by to interact with the content in the display through arm gestures, via motion sensors.


Repetto - Vitrine interactive. from Marcel on Vimeo.

Feb 26, 2011

Why bother switching from GUI to NUI? - Asked and Answered by Josh Blake; My 2-cents; Stevie B’s Microsoft Research Video; Marco Silva’s NUI-HCI Presentation (and links)

In Chapter 1 of Natural User Interfaces in .NET, Josh Blake asks and answers a question posed by many people who have been under the spell of keyboard input and GUI/WIMP interaction:


Why bother switching from GUI to NUI?  The answer?  Read Chapter 1 (pdf) of the book - the chapter is free.


Here are a few of my personal reasons:  
1.  I want to buy the next version of the iPad or something like it.
2.  I want to buy a new large-screen Internet HD TV.
3.  I want to buy a Kinect.
4.  I do NOT want to interact with my new TV with a Sony remote.  Too many tiny buttons!


5. I do NOT want to interact with my new TV with a keyboard,  because it reminds me of...work.

6.  Most importantly: 

I want to design apps for the people I care about, and others with similar needs:
    My mom.
    My grandson.
    Moms and dads with kids in tow.
    People with special needs and/or health concerns, and the people who care for and guide them.
    Knowledge sharers and (life-long) learners....

RELATED

"Smart" Interactive Display Research

 
View more presentations from Marco Silva

My YouTube Playlist:
"Natural user interfaces, gesture interaction, multi-touch, natural interaction, post WIMP examples and more... "
RELATED - and somewhat related   
Encyclopedia:  Human Computer Interaction, Interaction Design, User Experience, Information Architecture, Usability and More (Interaction-Design.org)

Josh Blake's Blog: Deconstructing the NUI    Book: Chapter 1 (pdf)  Free!
Blake.NUI
"Blake.NUI is a collection of helpful controls, utilities, and samples useful for multi-touch and NUI development with WPF, Surface, and Silverlight."
(This is not an exhaustive list.)


GUI to NUI Post-WIMP Manifesto:  TBA

Feb 24, 2011

NCSU D.H. Hill Library has a Perceptive Pixel Multitouch Display (updated)

NCSU 'sandbox' lets students touch the technology
Jay Price, News & Observer, 2/22/11



"The Perceptive Pixel display wall, which was switched on this week, is the $100,000 centerpiece of the new Technology Sandbox at the D.H. Hill Library. The sandbox is a room of cutting-edge, interactive hardware that's aimed partly at familiarizing students with the latest high-tech hardware, particularly large displays and gesture-based computing. It was paid for with a federal grant from the Institute of Museum and Library Services that is administered by the State Library of North Carolina."


Credit: Corey Lowenstein - clowenst@newsobserver.com

RELATED
Technology Sandbox introduces NCSU students to latest gadgets
WRAL Techwire, 8/25/10
New Learning Spaces for New Learning Styles (pdf)
Terry B. Hill, Mohan Ramaswamy,  NCSU

HUNT LIBRARY



North Carolina State University Libraries
Perceptive Pixel

Vision-Based Hand-Gesture Applications: Video from Communications of the ACM



The latest edition of Communications of the ACM, via "snail mail", was the inspiration for this post:



Vision-Based Hand-Gesture Applications
Juan Pablo Wachs, Mathias Kolsch, Helman Stern and Yael Edan

"Body posture and finger pointing are a natural modality for human-machine interaction, but first the system must know what it's seeing."


More to come!

Feb 13, 2011

Wii Just Dance 2 and Kinect Dance Central: UI and Usability Approaches; Challenges for Developing Accessible Games (revised)

I love to dance. I studied dance through college, and off and on as an adult. I have a DDR (Dance Dance Revolution) game-floor pad somewhere in my attic gathering dust. I'm ready for new challenges.


I'm planning on buying a couple of new dance games for the Wii and the Kinect. There is more to this story: given my interest in off-the-desktop, post-WIMP HCI (human-computer interaction), interactive multimedia and games, and my career as a school psychologist dedicated to young people with disabilities, I'm excited to see where new technologies, interfaces, and interactions will take us.


So what do the wise men of usability have to say about new ways of interacting with games and other applications?


"Kinect has many great design elements that clearly show that the team (a) knows usability, (b) did user testing, and (c) had management support to prioritize usability improvements, even when they required extra development work." -Jakob Nielsen


Jakob Nielsen, one of the godfathers of usability, shared a few words of wisdom about the Kinect in his 12/27/10 Alertbox post, Kinect Gestural UI: First Impressions. Although he did not review Dance Central, he concludes that the game he did review, Kinect Adventures, was fun to play despite usability problems.


If this is a topic that interests you, I recommend you read Nielsen's post and take a look at the design issues outlined in it. Also take a look at a recent essay Nielsen co-authored with Don Norman, another godfather of usability: Gestural Interfaces: A Step Backwards in Usability


Why is this topic important to me?
I have been involved in the Games for Health and Game Accessibility movement for many years.  Lately I've been exploring the OpenKinect project with an aim to create ways of making movement-oriented games accessible for young people with more complex disabilities.  For example, there is a need to have dance and movement games modified for students (and adults!) who need wheelchairs or walkers.  There are students who have milder mobility challenges who love to dance, and the current games don't address their needs.  Some of my students have vision or hearing impairments, too.  They deserve a chance to play things designed for the Kinect.

"OpenKinect is an open community of people interested in making use of the amazing Xbox Kinect hardware with our PCs and other devices. We are working on free, open source libraries that will enable the Kinect to be used with Windows, Linux, and Mac."

Note: I currently work as a school psychologist with students up to age 22. My main office is adjacent to a large OT and PT room at Wolfe, a program for students who have special needs. We just had a large interactive whiteboard installed in the room, and it is begging for us to connect it to the school's Wii and, soon (we hope), a Kinect. If we are going to use dance games to help promote healthy activities among our special students, the games need to be accessible for students with cognitive, motor, and other limitations.

FIRST STEPS
Although I can dance, I understand what the world is like through the eyes of many of the young people I work with who have motor coordination and sensory integration problems that interfere with their ability to  move and dance, let alone access fast-paced dance games on the Wii or Kinect. 


My initial plan is to look at what the new dance games might be like from the view of someone who doesn't know how to dance and admits to having "two left feet" - and perhaps, no sense of rhythm. Where would I start?


Wii's Just Dance 2 seems to offer some support for learning how to dance through the use of simple movement icons, in the form of outlined figures, that provide information about how to move with the dancer on the screen. As you can see from the video below, the gamer is given information about upcoming moves throughout the game.


I decided to take a look at Just Dance 2's MIKA "Big Girl (You Are Beautiful)" because some of the adolescent females I work with have weight concerns that interfere with their health. During the teen years, this can become a vicious cycle, resulting in less movement and less participation with peers in physical activities, such as playing dance games. If a teen has depression as part of this mix, we know that exercise can help, and a fun dance game might be a life-saver, in more ways than one.




The screen shots below show how the movement icons are used in the game:

I thought it would be useful to learn more about the story behind the making of Just Dance 2.
At 2:22 in the video, Alexia, the project's usability expert, makes her presence known. From what I can tell, she focused on aspects of the game that would make it easier for non-dancers, including those with "two left feet," to play. (I don't know if anyone was consulted about accessibility concerns for the game.)


Kinect Dance Central
Dance Central uses a different approach when it comes to "teaching" people how to dance along with the game. It would be interesting to test Dance Central and Just Dance 2 with the same set of people to get a better feel for what works and what doesn't. Below is a video that previews, in split-screen, the interaction that takes place in Dance Central:

Dance Central Full Motion Preview


In Dance Central, gamers are given information about the moves through icons that cycle up the right-hand side of the screen. The level of dance coordination needed to keep up with the moves is challenging at times, even for people who are OK at dancing. Players can select dances according to level of difficulty.


Kinect Usability with Regular People

In his article "Designing for Xbox Kinect - a usability study", Steve Cable (CX Partners) shared his team's look at usability issues related to the Kinect, based on testing several games, including Dance Central, with groups of people. The quote below is from Steve's article:

"We’ve loved playing with the Kinect. There’s no doubt that the game play is lots of fun. In-game menus are a barrier to that fun. Kinect should allow players to move through menus quickly and compensate for inaccuracy.

We felt the Kinect would benefit from some standardised global controls – much like a controller uses the A button to select and the B button to move backwards. We also think it needs a more responsive pause gesture – one that doesn’t interfere with the user’s game play.

Most of our participants found the Dance Central menu to be more effective, more efficient and more satisfying to use. Here are our recommendations for designing a Kinect menu interface:
  1. Allow users to make selections through positive gestures, rather than timed positions
  2. Place options on a single axis to make them easier and quicker to select
  3. Allow users to control menus with the game pad if they prefer
  4. Use large easy to read text
  5. Don’t make users scroll through options unnecessarily – it takes too long
  6. Users will be distracted if used in a social setting – test your menus in a social context to see if they are prone to errors
  7. Avoid the cursor metaphor, it’s not what gamers are used to seeing in game menus, and makes it harder to implement alternative joypad controls"

Below are screen shots that provide examples of how the movement icons are displayed in Dance Central:

RELATED
Just Dance 2 Review: Get your body moving. No, really. Give it a shot.
Keza MacDonald, 10/19/10

Feb 3, 2011

Interactions (ACM) Cover Article- "Proxemic Interactions: The New Ubicomp?" - Plus - Close encounters with displays at the airport and JC Penney

Cover Image

Today I came home and found one of my favorite publications, Interactions, on my kitchen counter. Imagine my excitement when I saw that the cover article for the Jan-Feb issue was "Proxemic Interactions: The New Ubicomp?" (pdf). The authors of this article are Saul Greenberg, Nicolai Marquardt, Till Ballendat, Rob Diaz-Marino, and Miaosen Wang, from the University of Calgary. (A list of some of the articles from the current issue of Interactions can be found at the end of this post.)


On the topic of proxemic interactions.... I welcome the research that is germinating in this area, and agree with the authors of Interactions' cover article that there is much work to be done in this space!


I've been searching for great examples of interactive displays in public spaces. Since there is much room for improvement in this field, much of what I share on this blog is a bit disheartening. Most of my user-unfriendly encounters with large displays happen when I'm in an airport, mall, or health care facility, or on a cruise ship. Things are improving, but at a slow pace. The best work I've seen so far has been in museums.


Info display at the Orlando airport with a QR tag.

A few weeks ago I wrote a post about my nearly comical interaction with a large display that featured a QR tag that simply would NOT cooperate with my smartphone's tag reader. What were they thinking when they decided that the best place for a QR tag on a large display was at knee level?
Interactive Display with QR Tag: Close Encounter at the Orlando Airport

There is more to this story! On my way home, I had a few hours to kill at the Orlando airport. I stopped by a convenience store there and discovered the non-static display below. I didn't notice the display the first time I was in the store that day.


I spotted another QR tag!

There were a few problems with this set-up. First, the display was located on a wall close to the ceiling, making it very inconvenient to scan with my phone. The page with the QR tag didn't stay up long enough for me to attempt to scan it, although I waited around until it cycled back into view - three times. I gave up and took a picture instead, standing on my tip-toes with my arms extended over my head, the same position that would be required to scan the QR tag, if it would stay put!


Another problem was that the display was perched above a high-traffic area, right above the  refrigerated drinks.  The only place to scan the QR tag was in a narrow aisle, behind the throngs of thirsty travelers making a bee-line to the cold drink section of the store.  I was in the way of a number of people in this bee-line.


What were they thinking when they decided that the best place for this large display featuring a QR tag was several feet above eye-level, in this particular location?

The displays in front of the store were located on either side of the doorway, at "body" level. This is a logical place to put a display.  Potential customers can view the displays without getting in the way of other travelers.


I didn't notice QR tags on these displays.  QR tags linking to web-based reviews of books I'd like to purchase would be useful in this scenario.


Non-static displays outside of the store:

PART II: Interactive Touch Screen Display Encounter at JC Penney

Interactive displays are cropping up everywhere.  The display featured below is part of the Find More at JC Penney  campaign. The following is a quote from a JC Penney press release: 

"JC Penney has announced the launch of a new mobile initiative that connects mobile commerce with social and in-store media applications that include location-based check-in offers and enhancements to its mobile apps. JCPenney's mobile initiatives are part of the company's investment in its overall digital platform, which also includes jcp.com, social media, and its in-store FindMore™ smart fixture."


"The smart fixture integrates JCPenney's online and offline stores by providing customers with a 52-inch interactive touch screen experience that allows them to view and purchase items from jcp.com while in a JCPenney store. Customers can also scan a bar code to get product details, access recommendations, or email data about an item of interest to themselves or friends."

Can you find the large interactive touch screen display?    After shopping around in the mall, I decided to go to JC Penney and check out the sales.  Below is what I passed as I entered the store.  I went up and down the main aisle a few times before I noticed a large interactive display along a side aisle.  



The stack of red holiday boxes obstructed my view:



I didn't notice the display the first few times I walked in the other direction. From the back, the display looked like a poster. 
  
When I finally discovered the display, I was impressed with the slick graphics and stylish models. 



I wasn't so impressed by the placement of the display. To interact with the screen, I had to stand in a narrow aisle, in the path of a steady stream of slow-walking shoppers, often moving in small groups, including parents with babies in strollers. 


I wasn't impressed by the confusing array of input methods on the display.  I noticed that the light blue square in the lower left-hand corner of the display is designed to provide access for people with disabilities.   I couldn't figure it out.

Accessibility:  There is a need for research in this area!

The accessibility feature was confusing to use. The virtual trackball activated a cursor on the screen, but the cursor did not accurately track the virtual trackball's movements. This section of the screen was located in the lower left-hand corner, which would be convenient for someone in a wheelchair to use. However, the placement of the "accessibility window" was too low for someone who might want to interact with the screen in an accessible manner but doesn't happen to need a wheelchair.
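For what it's worth, the behavior I expected from that virtual trackball is easy to describe: the cursor should move by a scaled copy of each trackball delta and stay within the screen. Here is a rough sketch of that mapping (hypothetical names, not the kiosk's actual software):

```python
# Sketch of the expected behavior (hypothetical names, not the
# kiosk's actual software): a relative, trackball-style control
# moves the cursor by a scaled copy of each input delta, clamped
# to the screen bounds so it can never be lost off-screen.

SCREEN_W, SCREEN_H = 1920, 1080

def move_cursor(cursor, delta, gain=3):
    """Apply a trackball delta (dx, dy) to a cursor (x, y) with gain."""
    x = min(max(cursor[0] + delta[0] * gain, 0), SCREEN_W - 1)
    y = min(max(cursor[1] + delta[1] * gain, 0), SCREEN_H - 1)
    return (x, y)

cursor = (960, 540)                      # start at screen center
cursor = move_cursor(cursor, (10, 0))    # small swipe right -> (990, 540)
cursor = move_cursor(cursor, (-500, 0))  # large swipe left clamps at x = 0
print(cursor)  # (0, 540)
```

When the on-screen cursor drifts out of proportion to the user's input (as it did here), even this simple mapping has failed, and the control becomes unusable for exactly the people it was meant to help.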


I shot video of my interaction with the screen, but found that I was in the way of shoppers, so I stopped. Not to worry. Below is a video taken by Mike Cearley, shared on his 11th Screen blog post, "Out and About: JC Penney's "Find More" Touch Screen". (Mike's impressions of this display were similar to mine and are worth reading if this is an area that interests you.)




Mike uses his 11th Screen Scorecard to evaluate the technology he encounters in public spaces.  I am updating a similar rubric to use when evaluating displays in public spaces.   It is important to look at multiple dimensions, such as the physical placement of the display, weather and lighting conditions,  how the display interoperates with mobile devices or a system of displays,  how the content is presented/accessed across channels and devices, and how users share the experience with others.

Components of the 11th Screen Scorecard - Mike Cearley
Purpose: "What is the purpose of the solution? Is it to drive awareness? Acquisition?  Loyalty?  What is the brand trying to accomplish in this medium?"
Drama: "Does the solution make a big impact on the user? Does it make them stop and interact?"
Usability: "Can the user navigate through the experience with ease? Are the paths to information intuitive?  There's also an element of functions, too, but I think that is much more subjective.  Do the functions enhance the user experience?"
Interactivity:  "How does the user interact with it?  Is it gesture based? Is it touch-based? Can the user interact with it through any other enabling technology?"
Information: "How much and what kind of content is available for the user to interact with?  Generally speaking, the more information and the different formats of information, the better."
Personalization: "What level of personalization does the experience provide?"


COMMENT
I've been obsessed with large displays and the amazing potential they hold for supporting all sorts of  interactions -  collaboration, planning, creating, information sharing, playing, viewing, communicating, learning, shopping,  way-finding, artistic expression.  

I'm also drawn to smaller screens and how we use our mobile devices - smartphones, iPhones, e-readers, digital cameras, etc. - to interact with the displays we encounter in public spaces. We are just beginning to figure out how technology can effectively, and seamlessly, support interaction between people who are co-located as well as those located in different places.

INTERACTIONS Volume XVIII.1 
Here is a list of some articles from the current issue of Interactions:
The Cloud (Yue Pan, Eli Blevis)