
Nov 4, 2011

Le Chal: Smart shoes for the visually impaired, with haptic/vibrational feedback, sensors and GPS


Le Chal is a haptic/vibrational feedback shoe designed for the blind and visually impaired, one of the projects of Anirudh Sharma, a member of the NUI (Natural User Interface) Group.  Le Chal was conceptualized at the 2011 MIT Media Lab Design and Innovation Workshop at COEP.
Take me there: Shoe/Le Chal

Shoe for visually impaired - Le Chal Himanshu Khanna, 10/31/11

RELATED
touchaddict blog

May 18, 2011

CHI 2011, Bill Buxton, and the Buxton Collection: Explore 35 years of interactive devices, online!

Bill Buxton is a researcher at Microsoft who focuses on Human-Computer Interaction and is known for his work in user experience design and natural user interfaces such as multi-touch surfaces.  His talk at the recent CHI 2011 conference, held in Vancouver, Canada, was an overview of the Buxton Collection, an on-line historical archive of interactive input devices spanning the past 35 years.


It was interesting to note that at the time of the presentation, the Vancouver Convention Centre, where the conference was taking place, was having serious problems with its network/internet connections; as a consequence, Buxton was not able to demonstrate the on-line version of his collection as planned.

Not to worry.  The physical version of Buxton's archive was on display during the conference, along with Buxton himself, who was happy to tell, with much enthusiasm, the story behind every device and gadget in the archive. The slideshow below provides a glimpse of the Bill Buxton archive displayed at CHI 2011:



My Buxton Collection Slideshow, CHI 2011, Vancouver, Canada

Buxton's archive of gadgets comes with a rich history accumulated over the years. Much of this history, until now, has resided in Buxton's head.  Holding and touching the items in the archive while listening to Buxton's passionate stories about each one was unlike anything I had ever experienced.  His archive is a labor of love, and the interactive, on-line version of the Buxton Collection is his way of sharing his knowledge with the world.


During his talk, Buxton pointed out that computer science programs do not require students to have much exposure to the "history of ideas" in the field.  Huge chunks of work are often ignored in the literature, and in some cases the wheel is unknowingly reinvented; according to Buxton, this is something that must be addressed within the CHI community.


I agree.

RELATED
Previous IMT posts about Bill Buxton
Bill Buxton's Presentation Video: "A Little Tale about Touch" (Microsoft Worldwide Partner Conference, 2010)
Two good articles by Bill Buxton: The Mad Dash Towards Touch Technology; The Long Nose of Innovation
Buxton Collection
Buxton Collection, PivotViewer
Back to the Past: Bill Buxton Shows Off 36 Years of Tech Devices
Microsoft News Center, 5/9/11
Microsoft's Bill Buxton exhibits gadget collection 35 years in the making
Donald Melanson, engadget, 5/9/11
Bill Buxton's Haptic Input References (pdf)
Bill Buxton's website
Multi-touch Systems that I Have Known and Loved (Bill Buxton)
CES 2010: NUI with Bill Buxton

On Engineering and Design: An Open Letter - Microsoft Research Principal Scientist Bill Buxton calls for engineers and user experience designers to learn to appreciate one another
Bill Buxton, Bloomberg Businessweek, 4/29/09

Apr 26, 2011

Multi-touch and Gesture Interaction News and Updates You Might Have Missed (Part I)

Over the past couple of months, I've come across many interesting links related to multi-touch and gesture interaction, but I haven't had time to devote a thoughtful post to each one.  "Part I" is a nice collection of experimental, commercial, and non-commercial efforts by a variety of creative technologists, with a smattering of industry news that might be of interest to IMT readers.


Ideum's MT55 HD Multitouch Table 4/19/11

New MT55 HD Multitouch Table Now Shipping,  Jim Spadaccini, Ideum Blog 4/11/11

Smithsonian American Art Museum to Open Education Center, Sara Beladi, NBC Washington News, 4/4/11 (Rumor has it that the Smithsonian American Art Museum might include touch and multi-touch displays in its plans for a new education center.  The center was funded by an anonymous $8 million gift.)

Bill Buxton, Microsoft Research, 4/7/11 - Includes lots of pictures, links to videos, and more information about what might be the first touch screen.  Also see Bill Buxton's companion website, Multi-Touch Systems that I Have Known and Loved, updated on 3/21/11.  Bill Buxton knows all (almost!)


"The MTbiggie uses the "Front Diffused Illumination" multitouch technique, with ambient infrared light and a DIY infrared webcam. The MTbiggie is similar to the MTmini, but includes a projected image and infrared webcam (rather than a normal webcam)...The MTbiggie isn’t the most stable and robust setup, but it is the easiest to build. To see other methods of building more stable multitouch displays, view the full multitouch display list." -Seth Sandler
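The front-diffused-illumination technique Sandler describes boils down to simple computer vision: fingertips touching the surface reflect ambient infrared light and show up as bright blobs in the camera image, and each blob's centroid becomes a touch point. The sketch below is my own illustration of that idea, not Sandler's code; the function name, threshold, and minimum blob area are all invented for the example. It finds bright connected regions in a grayscale frame with a naive flood fill:

```python
import numpy as np

def detect_touch_blobs(frame, threshold=200, min_area=20):
    """Naive touch detection for a front-diffused-illumination setup.

    frame: 2-D uint8 array (grayscale IR camera image).
    Returns a list of (row, col) centroids, one per detected touch.
    """
    mask = frame >= threshold          # bright pixels = candidate touches
    labels = np.zeros(frame.shape, dtype=int)
    current = 0
    centroids = []
    for r in range(frame.shape[0]):
        for c in range(frame.shape[1]):
            if mask[r, c] and labels[r, c] == 0:
                current += 1
                # flood fill (4-connectivity) to collect this blob's pixels
                stack = [(r, c)]
                pixels = []
                labels[r, c] = current
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < frame.shape[0] and 0 <= nx < frame.shape[1]
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = current
                            stack.append((ny, nx))
                # ignore tiny specks of IR noise
                if len(pixels) >= min_area:
                    ys, xs = zip(*pixels)
                    centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centroids
```

A real tracker such as the CCV software used by the NUI Group community also does background subtraction, smoothing, and frame-to-frame blob tracking, but the core detection step looks much like this.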

(Also check out NodeBeat, a multi-touch music/audio sequencer/generator app by Seth Sandler and Justin Windle)

Intuilab, 4/13/11
"IntuiLab, a global leader in surface computing software applications, today announced support for the revolutionary Microsoft Kinect device across its full line of IntuiFace products and solutions including IntuiFace Presentation and IntuiFace Commerce...Microsoft Kinect brings distant gesture control to interactive solutions. These gesture controls allow users to interact with displayed digital assets from a distance at their own pace and path – for example, browsing through a large quantity of products in a store catalog or manipulating 3D models (such as a mobile phone) – all without having to actually touch the screen..."  -IntuiLab (Take a look at the IntuiLab team- an interactive page!)




Sparkon:  Videos and links related to multi-touch and gesture-based applications



Official Kinect SDK to be Open Source, Josh Blake, Deconstructing the NUI, 4/18/11
(This bit of news excited me, but don't get your hopes up. If anyone knows what will happen with the Kinect SDK, please leave a comment.)
"Update 4/18 7:34pm: Mary Jo Foley picks up this story, but the Microsoft spokesperson she talked to denied that the Kinect SDK will be open source. As she notes, Microsoft has pulled 180’s before regarding Kinect. After spokespeople initially were hostile to the idea of Kinect hacking, Xbox executives later embraced the idea that people are using Kinect for non-gaming purposes on the PC. Let’s hope Microsoft stays open to this idea." -Josh Blake

Kenrick Kin, Tom Miller, Bjoern Bollensdorff, Tony DeRose, Bjoern Hartmann, Maneesh Agrawala (Pixar Online Library)

Flight Race Game on 3DFeel, lm3Labs, 4/18/11


JazzMutant Lemur Version 2: "The only multi-touch and modular controller for sequencers, synthesizers, virtual instruments, vjing and lights, now even better."


Harry van der Veen's Multitouch Blog (NUITEQ)


Stantum "Unlimited Multi-Touch" Latest News

At Immersive Labs, Ads Watch Who Looks at Them Amy Lee, Huffington Post, 4/26/11 

Immersive Labs

Hard Rock Cafe International Using NextWindow Touch Screens:  "Rock Wall Solo displays enhance music lovers' experience in Seattle, Dallas, Detroit and Berlin" 4/12/11 (Full press release pdf)
Music on Touch Screens (NextWindow)

Razorfish: Thoughts on MIX 11, James Ashley, Razorfish Blog, 4/20/11.  Also see: Razorfish Lab's Prototypes




"The multitouch microscope brings new dimensions into teaching and research. Researchers at the Institute for Molecular Medicine Finland (FIMM) and Multitouch Ltd have created a hand and finger gesture controlled microscope. The method is a combination between two technologies: web-based virtual microscopy and a giant size multitouch display."
"The result is an entirely new way of performing microscopy: by touching a table- or wall-sized screen the user can navigate and zoom within a microscope sample in the same way as in a conventional microscope. Using the touch control it is possible to move from the natural size of the sample to a 1000-fold magnification, at which cells and even subcellular details can be seen."  -Multitouch.fi  Also see the MultiTouch website.
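The touch-driven navigation described above reduces to familiar pinch-gesture math: the ratio of the distances between two fingertips gives the zoom factor, and the motion of their midpoint gives the pan. Here is a minimal sketch of that mapping; it is my own illustration, not code from FIMM or MultiTouch Ltd, and the function name is invented:

```python
import math

def pinch_transform(p1_old, p2_old, p1_new, p2_new):
    """Map a two-finger pinch gesture to zoom and pan.

    Each argument is an (x, y) fingertip position in screen coordinates,
    before and after one frame of movement.
    Returns (scale, (dx, dy)): scale > 1 zooms in, scale < 1 zooms out,
    and (dx, dy) is the translation of the fingers' midpoint.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    def mid(a, b):
        return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

    # zoom factor: how much the finger spread grew or shrank
    scale = dist(p1_new, p2_new) / dist(p1_old, p2_old)
    # pan: how far the midpoint between the fingers moved
    mo, mn = mid(p1_old, p2_old), mid(p1_new, p2_new)
    return scale, (mn[0] - mo[0], mn[1] - mo[1])
```

For example, fingers spreading from 100 to 200 pixels apart yields a 2x zoom; applied repeatedly, frame after frame, this is how a wall-sized display can carry a user smoothly from a whole slide down to subcellular detail.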



Big Size Multitouch Display Turned into a Microscope, Microscopy-News, 3/28/11
Mac OS X 10.7 Lion: new multi-touch gestures, Dock integration for Expose, Launchpad, Mission Control, AppleInsider, 4/14/11


Vectorform App featured in Royal Caribbean's Video Promotion: James Brolin, Dean Cain get hands-on with Vectorform app, Alison Weber, Vectorform Blog, 3/3/11


3M Touch Systems' YouTube Channel

Social Mirror 3D Gestural Display, Now Using Kinect: Snibbe Interactive




Apr 3, 2011

INTERACTIVE TABLETOPS AND SURFACES 2011 CALL FOR PAPERS AND NOTES (6th Annual ACM ITS 2011 - Nov. 13-16)

Thanks to Johannes Schöning for sharing this information!

CALL FOR PAPERS AND NOTES
6th Annual ACM Conference on Interactive Tabletops and Surfaces 2011
ITS 2011
November 13-16, 2011
Portopia Hotel, Kobe, Japan
The Interactive Tabletops and Surfaces 2011 Conference (ITS) is a premiere venue for presenting research in the design and use of new and emerging tabletop and interactive surface technologies. As a new community, we embrace the growth of the discipline in a wide variety of areas, including innovations in ITS hardware, software, design, and projects expanding our understanding of design considerations of ITS technologies and of their applications.

Building on its success in previous years, ITS again features Papers and Notes presentations, as well as tutorial, poster, and demonstration tracks and a doctoral symposium. ITS 2011 will also include workshops.

ITS 2011 will bring together top researchers and practitioners who are interested in both the technical and human aspects of ITS technology. On behalf of the conference organizing committee, we invite you to begin planning your submissions and participation for this year's conference!

CALL FOR PARTICIPATION: PAPERS AND NOTES
The use of interactive surfaces is an exciting and emerging research area. Display technologies, such as projectors, LCD and OLED flat panels, and even flexible display substrates, coupled with input sensors capable of enabling direct interaction, make it reasonable to envision a not-so-distant future in which many of the common surfaces in our environment will function as digital interactive displays. ITS brings together researchers and practitioners from a variety of backgrounds and interests, such as camera and projector based systems, new display technologies, multi-touch sensing, user interface technologies, augmented reality, computer vision, multimodal interaction, novel input and sensing technologies, computer supported cooperative work (CSCW), and information visualization.

The intimate size of this single-track symposium provides an ideal venue for leading researchers and practitioners to exchange research results and experiences. We encourage submissions on (but not limited to) the following topic areas as they relate to interactive tabletops and surfaces:

* Gesture-based interfaces
* Multi-modal interfaces
* Tangible interfaces
* Novel interaction techniques
* Data handling/exchange on large interactive surfaces
* Data presentation on large interactive surfaces
* Software engineering methods
* Computer supported collaborative work
* Middleware and network support
* Virtual reality and augmented reality
* Social protocols
* Information visualizations
* Hardware, including sensing and input technologies with novel capabilities
* Human-centered design & methodologies
* Applications

CONFERENCE COMMITTEE
Conference Co-Chairs
Jun Rekimoto, The University of Tokyo / SonyCSL, Japan
Hideki Koike, University of Electro-Communications, Japan
Kentaro Fukuchi, Meiji University, Japan

Program Co-Chairs
Yoshifumi Kitamura, Tohoku University, Japan
Daniel Wigdor, University of Toronto, Canada

SUBMISSIONS
Papers/Notes: We invite paper submissions of two kinds: Papers (10 pages) and Notes (4 pages). Papers must present original, highly innovative, prospective and forward-looking research, possibly in one or more of the themes given above. Notes must also report novel and complete research, but where the scope and scale of the contribution is more focused and succinct than papers. Submissions must be made as a single PDF file in the ACM format through the submission system. A template for submissions can be found on the ITS 2011 website.

All accepted submissions will be presented at ITS 2011, appear in the ITS digital proceedings, and be archived in the ACM Digital Library.

IMPORTANT DATES
Paper/Note Submissions: June 30, 2011
Paper/Note Author Notifications: September 2, 2011
Paper/Note Camera-Ready Deadline: October 21, 2011

The calls for the posters, demos, tutorials, workshops and the doctoral symposium will be published soon, please check our website for further information.

RELATED
ITS 2010 Awards

Best Paper: Proxemic Interaction: Designing for a Proximity and Orientation-Aware Environment by Till Ballendat, Nicolai Marquardt, Saul Greenberg
Best Note: MudPad: Tactile Feedback and Haptic Texture Overlay for Touch Surfaces by Yvonne Jansen, Thorsten Karrer, Jan Borchers
Best Poster: A Multi-Touch Alignment Guide for Interactive Displays by Mathias Frisch, Ricardo Langner, Sebastian Kleinau, Raimund Dachselt
Best Demo: MudPad - A Tactile Memory Game by Yvonne Jansen, Thorsten Karrer, Jan Borchers




Who's Who
ITS 2010 Conference General Chairs
Johannes Schöning, DFKI GmbH
Antonio Krüger, DFKI GmbH 

ITS 2010 Organization Committee

Daniel Wigdor, one of the ITS 2011 Program Co-chairs, co-authored the following book with Dennis Wixon. The book has not yet been released.
 
Brave NUI World: Designing User Interfaces for Touch and Gesture



Note: This post includes many links; please let me know of anything I should correct.

Feb 26, 2011

Why bother switching from GUI to NUI? - Asked and Answered by Josh Blake; My 2-cents; Stevie B’s Microsoft Research Video; Marco Silva’s NUI-HCI Presentation (and links)

In Chapter 1 of Natural User Interfaces in .NET,  Josh Blake asks and answers a question posed by many people who have been under the spell of keyboard input and GUI/ WIMP interaction: 


Why bother switching from GUI to NUI?  The answer?  Read Chapter 1 (pdf) of the book - the chapter is free.


Here are a few of my personal reasons:  
1.  I want to buy the next version of the iPad or something like it.
2.  I want to buy a new large-screen Internet HD TV.
3.  I want to buy a Kinect.
4.  I do NOT want to interact with my new TV with a Sony remote.  Too many tiny buttons!


5. I do NOT want to interact with my new TV with a keyboard,  because it reminds me of...work.

6.  Most importantly: 

I want to design apps for the people I care about, and others with similar needs:
    My mom.  
    My grandson.
Moms and dads with kids in tow.
People with special needs and/or health concerns, and the people who care and guide them.
Knowledge sharers and (life-long) learners....

RELATED

"Smart" Interactive Display Research

 
View more presentations from Marco Silva

My YouTube Playlist:
"Natural user interfaces, gesture interaction, multi-touch, natural interaction, post WIMP examples and more... "
RELATED - and somewhat related   
Encyclopedia:  Human Computer Interaction, Interaction Design, User Experience, Information Architecture, Usability and More (Interaction-Design.org)

Josh Blake's Blog: Deconstructing the NUI    Book: Chapter 1 (pdf)  Free!
Blake.NUI
"Blake.NUI is a collection of helpful controls, utilities, and samples useful for multi-touch and NUI development with WPF, Surface, and Silverlight."
 (This is not an inclusive list.)


GUI to NUI Post-WIMP Manifesto:  TBA

Feb 13, 2011

Wii Just Dance2 and Kinect Dance Central: UI and Usability Approaches; Challenges for Developing Accessible Games (revised)

I love to dance - I studied dance through college, and off and on as an adult.  I have a DDR (Dance Dance Revolution) game-floor pad somewhere in my attic gathering dust.  I'm ready for new challenges.


I'm planning on buying a couple of new dance games for the Wii and the Kinect. There is more to this story: given my interest in off-the-desktop, post-WIMP HCI (human-computer interaction), interactive multimedia and games, and a career as a school psychologist dedicated to young people with disabilities, I'm excited to see where new technologies, interfaces, and interactions will take us.


So what do the wise men of usability have to say about new ways of interacting with games and other applications?


"Kinect has many great design elements that clearly show that the team (a) knows usability, (b) did user testing, and (c) had management support to prioritize usability improvements, even when they required extra development work." -Jakob Nielsen


Jakob Nielsen, one of the godfathers of usability,  shared a few words of wisdom about the Kinect in his 12/27/10 Alertbox post: Kinect Gestural UI: First Impressions.  Although he did not review Dance Central, he concludes that the game he reviewed, Kinect Adventures, was fun to play, despite usability problems.


If this is a topic that interests you, I recommend you read Nielsen's post and take a look at the usability issues outlined there.  Also take a look at a recent essay Nielsen co-authored with Don Norman, another godfather of usability: Gestural Interfaces: A Step Backwards In Usability


Why is this topic important to me?
I have been involved in the Games for Health and Game Accessibility movement for many years.  Lately I've been exploring the OpenKinect project with an aim to create ways of making movement-oriented games accessible for young people with more complex disabilities.  For example, there is a need to have dance and movement games modified for students (and adults!) who need wheelchairs or walkers.  There are students who have milder mobility challenges who love to dance, and the current games don't address their needs.  Some of my students have vision or hearing impairments, too.  They deserve a chance to play things designed for the Kinect.

"OpenKinect is an open community of people interested in making use of the amazing Xbox Kinect hardware with our PCs and other devices. We are working on free, open source libraries that will enable the Kinect to be used with Windows, Linux, and Mac."

Note:  I currently work as a school psychologist with students up to age 22. My main office is adjacent to a large OT and PT room at Wolfe, a program for students who have special needs.   We just had a large interactive whiteboard installed in the room that is begging for us to connect it with the school's Wii,  and soon (we hope), a Kinect.   If we are going to use dance games to help promote healthy activities among our special students, the games need to be accessible for students with cognitive, motor, and other limitations.

FIRST STEPS
Although I can dance, I understand what the world is like through the eyes of many of the young people I work with, whose motor coordination and sensory integration problems interfere with their ability to move and dance, let alone access fast-paced dance games on the Wii or Kinect.


My initial plan is to look at what the new dance games might be like from the view of someone who doesn't know how to dance and admits to having "two left feet" - and perhaps no sense of rhythm.  Where would I start?


Wii's Just Dance 2 seems to offer some support for learning how to dance through the use of simple movement icons, in the form of outlined figures, that provide information about how to move with the dancer on the screen. As you can see from the video below, the gamer is provided with information about upcoming moves throughout the game.


I decided to take a look at Just Dance 2's MIKA "Big Girl (You Are Beautiful)" routine because some of the adolescent females I work with have weight concerns that interfere with their health. During the teen years, this can become a vicious cycle, resulting in less movement and less participation with peers in physical activities, such as playing dance games.  If a teen has depression as part of this mix, we know that exercise can help, and a fun dance game might be a life-saver, in more ways than one.




The screen shots below show how the movement icons are used in the game:








I thought it would be useful to learn more about the story behind the making of Just Dance 2.
At 2:22 in the video, Alexia, the project's usability expert, makes her presence known. From what I can tell, she focused on aspects of the game that would make it more usable for non-dancers, including those with "two left feet".  (I don't know if anyone was consulted about accessibility concerns for the game.)


Kinect Dance Central
Dance Central uses a different approach when it comes to "teaching" people how to dance along with the game. It would be interesting to test out Dance Central and Just Dance 2 with the same set of people to get a better feel for what works and what doesn't.  Below is a video that previews, in split-screen, the interaction that takes place in Dance Central:

Dance Central Full Motion Preview


In Dance Central, gamers are provided with information about the moves through icons that cycle up the right-hand side of the screen.  The level of dance coordination needed to keep up with the moves is challenging at times, even for people who are OK at dancing.  Players can select dances according to level of difficulty.


Kinect Usability with Regular People

Steve Cable (CX Partners) shared his team's look at usability issues related to the Kinect by testing several games, including Dance Central, with groups of people in his article, "Designing for XBox Kinect - a usability study".  The quote below is from Steve's article:

"We’ve loved playing with the Kinect. There’s no doubt that the game play is lots of fun. In-game menus are a barrier to that fun. Kinect should allow players to move through menus quickly and compensate for inaccuracy.

We felt the Kinect would benefit from some standardised global controls – much like a controller uses the A button to select and the B button to move backwards. We also think it needs a more responsive pause gesture – one that doesn’t interfere with the user’s game play.

Most of our participants found the Dance Central menu to be more effective, more efficient and more satisfying to use. Here are our recommendations for designing a Kinect menu interface:
  1. Allow users to make selections through positive gestures, rather than timed positions
  2. Place options on a single axis to make them easier and quicker to select
  3. Allow users to control menus with the game pad if they prefer
  4. Use large easy to read text
  5. Don’t make users scroll through options unnecessarily – it takes too long
  6. Users will be distracted if used in a social setting – test your menus in a social context to see if they are prone to errors
  7. Avoid the cursor metaphor, it’s not what gamers are used to seeing in game menus, and makes it harder to implement alternative joypad controls"

Below are screen shots that provide examples of how the movement icons are displayed in Dance Central:







RELATED
Just Dance 2 Review: Get your body moving. No, really. Give it a shot.
Keza MacDonald, 10/19/10







Feb 2, 2011

"Undercover User Experience" by Cennydd Bowles and James Box: Charlotte UX Book Club Skype Session and Discussion

I just got back from a meeting of the Charlotte UX Book Club.  The group meets in the loft office of the Charlotte branch of Atlanta-based Macquarium,  and provides a great opportunity for like-minded folks to socialize and toss around informed ideas.


Tonight's meeting featured Cennydd Bowles and James Box, the authors of Undercover User Experience.  They joined the group via Skype, on a large screen TV.  After a minute or so, it was as if they were right in the room, even though they were in the UK.  What a delightful experience!


Cennydd and James both work at Clearleft, a team of designers and creative technologists that provides user-experience consulting.  They are bright, insightful, and know their stuff.  The book packs a lot of good information into an easy-to-read, concise, and thought-provoking package.  The authors address tools of the trade within the context of how work gets done across disciplines in organizations.


In my opinion, this book would be useful to people working on traditional projects as well as those who focus on NUI design & development for emerging technologies.


BTW: Cennydd Bowles will be the closing plenary speaker at the IA Summit 2011 in March.




"Deliverables are a step on the journey, not the end of the line" -Undercover UX Design





Here are a couple of quotes from the Amazon.com editorial reviews:

  Product Description

"Once you catch the user experience bug, the world changes. Doors open the wrong way, websites don't work, and companies don't seem to care. And while anyone can learn the UX remedies (usability testing, personas, prototyping and so on), unless your organization 'gets it', putting them into practice is trickier. Undercover User Experience is a pragmatic guide from the front lines, giving frank advice on making UX work in real companies with real problems. Readers will learn how to fit research, ideation, prototyping and testing into their daily workflow, and how to design good user experiences under the all-too-common constraints of time, budget and culture."

  From the Back Cover

"Once you catch the user experience bug, the world changes. Doors open the wrong way, websites don't work, and companies don't seem to care. And while anyone can learn the UX remedies (usability testing, personas, prototyping and so on), unless your organization 'gets it', putting them into practice is tricky. Undercover User Experience is a pragmatic guide from the front lines, giving frank advice on making UX work in real companies with real problems. Readers will learn how to fit research, ideation, prototyping and testing into their daily workflow, and how to design good user experiences under the all-too-common constraints of time, budget and culture."
RELATED