
Nov 6, 2012

Jeff Han Discusses Windows 8 for Large Displays, Microsoft's New Multi-touch, Ink & Pen Tech; Experience Design for Large Displays (2012 Microsoft Build)

Earlier this year, Microsoft acquired Perceptive Pixel, the company founded by Jeff Han.  Han now works in the Microsoft Office division, where there is a push to bring new and improved large multi-touch displays to the corporate world. With this off-the-desktop move, developers are challenged to build applications for Windows 8 geared toward supporting collaboration in corporate settings.

Jeff Han and Nathan Fish were at Microsoft's Build conference recently to give a presentation about the nuts and bolts of the new technology: how it works, along with suggestions for design.  Although the video presentation is over an hour long, it is worth setting aside some time to watch!

New multi-touch, ink, and pen technologies that can revolutionize your apps on any size of screens (10/30/12, Microsoft Build)



Here is the description of the presentation:
"Develop solutions that flow with your customers' business - starting with individuals and information workers on their phones, desks or virtual offices, moving into conference rooms, brainstorm areas, boardrooms and common areas.  Learn how Microsoft's new pen and touch capabilities, software and hardware will enable you to develop products that will help businesses be more efficient.  Have a sneak peek at what our large multi-touch displays (82" and 55") can bring to your solutions." -Microsoft Build

Jeff Han emphasized that touch interaction is everywhere and is a fundamental requirement in applications, and that Windows 8 apps should be designed for all screen sizes: "It's all about collaboration of people and a collaboration of devices that gets your work done....this is the real thing, and it is about to happen.  Devices (referring to large displays) are coming... we do have a mechanism for early access hardware."

During the second half of the video (38:00), Nathan Fish discusses guidelines for designing experiences on large displays, including optimization, handling various interactions via pen, touch, and gestures, the use of application bars and context menus, and more.

At the beginning of his presentation, Nathan Fish briefly discusses the uncharted territory of designing for collaborative large displays, and tells the audience that the guidelines are just suggestions: "nothing is written, we are figuring this out as we go."

My fear is that Windows 8 developers might not have the inclination to think past the smaller screens of smartphones and tablets, resulting in touchy-feely variations of PowerPoint and other Office applications that have been around for too long.

RELATED
Microsoft Welcomes Perceptive Pixel to the Office Division
The Official Microsoft Blog, 7/30/12


What happened to the Surface Table?
I was surprised that Jeff Han's company was not folded into PixelSense, formerly known as Surface, the division of Microsoft that focuses on large multi-touch tables. In my opinion, the tables had some potential to support collaborative work in the corporate world.  It will be interesting to see how things unfold!

PixelSense
Take a look at the video clip below for more information:
 

Discover the Samsung SUR40 (with PixelSense)
Samsung SUR40 with Microsoft PixelSense
PixelSense Development Partners
Insights on Interactive Tabletops: A Survey of Researchers and Developers (pdf)
Hrvoje Benko, Meredith Ringel Morris, A.J. Bernheim Brush, and Andrew D. Wilson 2009, Microsoft Research

SOMEWHAT RELATED
I've been interested in large screen displays and how they interact with smaller devices for a very long time.  In 2007, I was taking graduate courses at UNC-Charlotte and planned to pursue further research and work related to this area.  At the time, I prepared a paper to present at a conference, but was unable to present due to a serious health emergency in my family.  The abstract is below:


"Large screen displays support the collaboration of two or more people, especially when the nature of the content is visual.  These displays, in the form of walls, windows, boards, and touch-tables, will become ubiquitous in the future, and can be found in workplaces, educational institutions, museums, and other public spaces.  New display systems have been developed that allow for co-located communication between groups of people, and also allow for collaboration with individuals in remote locations, using mobile computing or communication devices.  This paper will provide an overview of the technologies that support this new way of communication, discuss the challenges involved in establishing "interoperability" within these new systems, and examine the possibilities for the future."

If this topic interests you, take a look at one of my Flickr collections:

Ubiquitous Sightings of Urban Screens: Interactive Displays in Public Spaces, Digital Signage, and Screens of All Sizes

Description:

"All kinds of displays are cropping up in shopping centers, malls, banks, stores, airports, schools, businesses, and meeting places. What is the purpose of all of these displays?

Displays and mobile devices, everywhere we go.

It doesn't look like we've really figured out how to harness the potential of the technology that surrounds us."





Jul 29, 2012

Blast from the 2009 past: News, Videos, and Links about Multi-touch and Screen Technologies

One of the things I like to do is share updates about the world of multimedia, multi-touch, gesture, screen, surface, and interactive technologies, focusing on off-the-desktop applications and systems. When I started this blog, I had to put forth quite a bit of effort just to FIND interesting things to blog about.  


These days, there are so many sources that focus on emerging (and now commonplace) interactive technologies that my main challenge is to filter the noise.  Where do I begin?


My archives are vast.   I randomly picked the year 2009 and came across one of my previous posts, "News, Videos, and Links about Multitouch and Screen Technologies."   The post is long, and contains a number of videos and links that probably will be of value to a future curator of the history of technology.


I welcome comments from readers who might be able to help me update information about various applications and systems I've featured on this blog in the past. 

The pictures are screenshots from the results of an  image search for "interactivemultimediatechnology".  Over the past 6 years, I've posted quite a few!








Jul 21, 2012

Musings about NUI, Perceptive Pixel and Microsoft, Rapid Creative Prototyping (Lots of video and links) Revised

It just might be the right time for everyone to brush up on 21st century tech skills. iPads and touch-phones are ubiquitous. Touch-enabled interactive whiteboards and displays are in schools and boardrooms.  With Microsoft's Windows 8 and the news that the company recently acquired Jeff Han's company, Perceptive Pixel, I think that there will be good support - and more opportunities - for designers and developers interested in moving from GUI to NUI.


In the video below, from CES 2012, Jeff Han provides a good overview of where things are headed.  We are in a post-WIMP world and there is a lot of catching up to do!

CES 2012  Perceptive Pixel and the Future of Multitouch (IEEE Spectrum YouTube Channel)



During the video clip, Jeff explains how far things have come during the past few years:
 "Five and 1/2 years ago I had to explain to everybody what multi-touch was and meant. And then, frankly, we've seen some great products from folks like Apple, and really have executed so brilliantly, that everyone really sees what a good implementation can be, and have come to expect it.  I also think though, that the explosion of NUI is less about just multi-touch, but an awareness that finally people have that you don't have to use a keyboard and mouse, you can demand something else beside that.  People are now willing to say, "Oh, this is something I can try, you know, touch is something I can try as my friendlier interface"."

Who wouldn't want to interact with a friendlier interface?  Steve Ballmer doesn't curb his enthusiasm about Windows 8 and Perceptive Pixel.  Jeff Han is happy with how designs created in Windows 8 scale for use on screens large and small. He explains how Windows 8 can support collaboration. The Story Board application (7:58) on the large touchscreen display looks interesting.

I continue to be frustrated by the poor usability of many web-based and desktop applications.  I like my iPad, but only because so many dedicated souls have given some thought to the user experience when creating their apps.  I am often disappointed by the interactive displays I encounter when I'm out and about during the day.  It is 2012, and it seems that there are a lot of application designers and developers who have never read Don Norman's The Design of Everyday Things!



I enjoy making working prototypes and demo apps, but my skill set is stuck in 2008, the last year I took a graduate-level computer course.  I was thinking about taking a class next semester, something hands-on, creative, and also practical, to move me forward. I can only do so much when I'm in the DIY mode alone in my "lab" at home.  I need to explore new tools, alongside like-minded others.  


There ARE many more tools available to designers and developers than there were just four years ago.  Some of them are available online, free or for a modest fee.  I was inspired by a link posted by my former HCI professor, Celine Latulipe, to her updated webpage devoted to Rapid Prototyping tools. The resources on her website look like a good place to start for people who are interested in creating applications for the "NUI" era.  (Celine has worked on many interesting projects that explore how technology can support new and creative interaction, such as Dance.Draw.) Below is her description of her updated HCI resources:

"New HCI resource to share: I have created a few pages on my web site devoted to Rapid Prototyping tools, books, and methods. These pages contain reviews of various digital tools, including 7 different desktop prototyping apps, and including 8 different iPad apps for wireframing/prototyping. I hope it's useful to others. Feel free to share... and please send me comments and suggestions if you find anything inaccurate, or if you think there is stuff that I should be adding. I will be continuing to update this resource." -http://www.celinelatulipe.com (click on the rapid prototyping link at the top)



IDEAS
Below are just a few of my ideas that I'd like to implement in some way. I can't claim ownership of these ideas; they are mash-ups of what comes to me in my dreams, usually after reading scholarly publications from ACM or IEEE, or attending tech conferences.
  • An interactive timeline, (multi-dimensional, multi-modal, multimedia) for off-the-desktop interaction, collaboration, data/info analysis exploration.  It might be useful for medical researchers, historians, genealogists, or people who are into the "history of ideas".  Big Data folks would love it, too. It would handle data from a variety of sources, including sensor networks. It would be beautiful to use.
  • A web-based system of delivering seamless interactive, multi-modal, immersive experiences, across devices, displays, and surfaces. The system would support multi-user, collaborative interaction.  The system would provide an option for tangible interaction.
  • A visual/auditory display interface that presents network activity, including potential intrusions, malfunctions, or anything that needs immediate attention that would be likely to be missed under present monitoring methods. 
  • Interactive video tools for creation, collaboration, storytelling.  (No bad remote controllers needed.)
  • A "wearable" that provides new ways for people to express and communicate creatively, through art, music, dance, with wireless capability. (It can interact with wireless sensor networks.)*
  • A public health application designed to provide information useful in understanding and preventing sepsis. This application would utilize the timeline concept described at the top of this list. This concept could also be useful in analyzing other medical puzzles, such as autism.
Most of these ideas could translate nicely to educational settings, and the focus on natural user interaction and multi-modal i/o aligns with the principles of Universal Design for Learning, something that is important to consider, given the number of "at-risk" learners and young people who have disabilities.

I welcome comments from readers who are working on similar projects, or who know of similar projects.  I also encourage graduate students and researchers who are interested in natural user interfaces to move forward with an off-the-desktop NUI project.  I hope that my efforts can play a part in helping people make the move from GUI to NUI!



Below are a few videos of some interesting projects, along with a list of a few references and links.


SMALLab (Multi-modal embodied immersive learning)


PUPPET PARADE: Interactive Kinect Puppets (CineKid 2011)



MEDIA FACADES: When Buildings Start to Twitter

HUMANAQUARIUM (CHI 2012)

 

NANOSCIENCE NRC Cambridge (Nokia's Morph project)






 
Examples: YouTube Playlists
POST WIMP EXPLORERS' CLUB
POST-WIMP EXPLORER'S CLUB II

Web Resources
Celine Latulipe's Rapid Prototyping Resources 
Creative Applications
NUI Group: Natural User Interface Group
OpenFrameworks and Interactive Multimedia: Funky Forest Installation for CineKid
SMALLab Learning
OpenExhibits: Free multi-touch + multiuser software initiative for museums, education, nonprofits, and students.
OpenSense Wiki 
CINEKID 2012 Website 
Multitouch Systems I Have Known and Loved (Bill Buxton)
Windows 8
Perceptive Pixel
Books
Natural User Interfaces in .NET  WPF 4, Surface2, and Kinect (Josh Blake, Manning Publications)
Chapter 1 pdf (Free)
Brave NUI World: Designing Natural User Interfaces for Touch and Gesture (Daniel Wigdor and Dennis Wixon)
Designing Gestural Interfaces (Dan Saffer)
Posts
Bill Snyder, ReadWrite Web, 7/20/12

I noticed some interesting tools on the Chrome web store - I plan to devote a few more posts to NUI tools in the future.

Sep 8, 2011

Update, plus iGaze app by Dunedin Multimedia for use during social skills group activities

So what am I up to now?  


I'd like to share with my readers that I've decided to continue in my present position as a school psychologist, while still devoting a portion of my free time to technology. From time to time I think deep thoughts about usability, accessibility, and UX/interaction related to off-the-desktop interactive multimedia applications running on screens of all sizes.  I'm hoping to create a few multimedia experiments using HTML5 and JavaScript, and explore jQuery if and when I can find the time!
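To give a sense of the kind of HTML5/JavaScript multi-touch experiment I have in mind, here is a minimal sketch: it tracks the fingers currently on a canvas and draws a dot under each one. It's just an illustration, not a finished app; the canvas id ("sketchpad") and the dot size are my own made-up choices.

```javascript
// Pure helper: maintain the set of active touch identifiers as
// touch events arrive. Kept DOM-free so it can be tested anywhere.
function updateTouches(active, type, changedIds) {
  for (const id of changedIds) {
    if (type === "touchstart") active.add(id);
    else if (type === "touchend" || type === "touchcancel") active.delete(id);
  }
  return active;
}

// Browser-only wiring: draw a dot for every finger on the canvas.
if (typeof document !== "undefined") {
  const canvas = document.getElementById("sketchpad"); // hypothetical element
  const ctx = canvas.getContext("2d");
  const active = new Set();
  for (const type of ["touchstart", "touchmove", "touchend", "touchcancel"]) {
    canvas.addEventListener(type, (e) => {
      e.preventDefault(); // keep the page from scrolling while drawing
      updateTouches(active, type, [...e.changedTouches].map((t) => t.identifier));
      ctx.clearRect(0, 0, canvas.width, canvas.height);
      for (const t of e.touches) {
        ctx.beginPath();
        ctx.arc(t.clientX, t.clientY, 20, 0, 2 * Math.PI);
        ctx.fill();
      }
    });
  }
}
```

Even something this small makes the "screens of all sizes" point concrete: the same touch events fire on a phone, a tablet, or a large display.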


For the present school year, my main school is a program for students with more significant disabilities, including autism spectrum disorders.  My second school is a magnet high school for technology and the arts,  located on the same campus.  I also consult throughout the district on cases involving students who have suffered traumatic brain injuries, as well as students who have multiple disabilities.  I am thankful that I have a job in a school district that values 21st Century technology.  


I'm looking forward to another technology-rich school year.  I've spent some of the time I usually devote to blogging exploring iPad apps instead.  Since I'm new to the world of iPads, I'm still in discovery mode.  What an adventure!


There are plenty of educational apps out there, and many of them are suitable for students with special needs.  On the other hand, there is much room for improvement - across all iPad app categories.  Since there is very little research about what makes up a killer app - or suite of apps - for students with special needs, experimenting with iPad apps is uncharted territory.



I made the decision to bring my personal iPad2 to work after I discovered a number of apps that I thought would be useful in my work as a school psychologist with students who have special needs, including autism spectrum disorders.  

One of my intervention themes this year focuses on social skills.  This is especially important for students who participate in our school's community-based job training program.  I'm using some content from Unique Learning's transition materials, as well as online activities from Do2Learn's JobTips website, because my aim is to facilitate social skills that will be useful in a variety of job and community settings.


Although my main technology tool for working with groups is the SMART Board, I've found a combination of interactive whiteboard and iPad activities to be especially effective.  I'm paving the way for more role-play activities in the future, and attempting to use technology to my advantage.


This past week, I used the iGaze app, created by Dunedin Multimedia, to help a group of high-school level students practice establishing and maintaining eye gaze, something that is difficult for most of them to demonstrate "in-person".  I was amazed.  Each student was excited to take his or her turn.  Even more amazing?  When each student took a turn, the other students looked at their eyes and faces.  No one rocked or "stimmed".  No one made noises.  I observed several instances of joint attention, much to my delight.   


Below is a video from Dunedin Multimedia's YouTube channel that is similar to what the students viewed during their group activity:

Here is some information from Dunedin Multimedia about the iGaze app:
"Eye contact is important to communication and social development, and yet the impaired ability to make and maintain eye contact is one of the most striking aspects of autism. iGaze is an eye contact simulator that can help to build confidence in using this important means of nonverbal social communication.  The app also contains information on eye contact and eye gaze, with links to relevant research."

During the social skills activity involving the iGaze app,  I used the SMART Board to display a large picture of a boss and a worker standing face to face, making eye contact, engaged in conversation.  The picture served as an anchor to remind the students of pictures and videos they'd previously viewed that illustrated the concept of face-to-face interaction and the importance of establishing eye-contact with others from time-to-time.


I'm hoping I will be able to access the YouTube videos from Dunedin so I can use them on the SMART Board. It will be interesting to see how this plays out!   I'm also planning to take a closer look at Dunedin Multimedia's emotion x app for the iPad.


RELATED 
Screenshot of iGaze for the iPad (Dunedin Multimedia)
SOMEWHAT RELATED
The SMART Table at my school was updated today - I'm looking forward to using it for some group activities, now that it is back in working order and has new applications loaded up and ready to go!


If you are interested in learning more about technology related to students with special needs, be sure to check out Kate Ahern's blog, Teaching Learners with Multiple Special Needs

Kate's post about the features of Unique Learning Systems.


Upcoming:  more about tablets, interview with folks from Stantum, social-skills game-in-progress.....large displays in public spaces update....

Feb 13, 2011

Wii Just Dance2 and Kinect Dance Central: UI and Usability Approaches; Challenges for Developing Accessible Games (revised)

I love to dance- I studied dance through college, and off and on as an adult.   I have a DDR (Dance Dance Revolution) game-floor pad somewhere in my attic gathering dust.  I'm ready for new challenges.


I'm planning on buying a couple of new dance games for the Wii and the Kinect. There is more to this story: given my interest in off-the-desktop, post-WIMP HCI (human-computer interaction), interactive multimedia and games, and a career as a school psychologist dedicated to young people with disabilities, I'm excited to see where new technologies, interfaces, and interactions will take us.


So what do the wise men of usability have to say about new ways of interacting with games and other applications?


"Kinect has many great design elements that clearly show that the team (a) knows usability, (b) did user testing, and (c) had management support to prioritize usability improvements, even when they required extra development work." -Jakob Nielsen


Jakob Nielsen, one of the godfathers of usability,  shared a few words of wisdom about the Kinect in his 12/27/10 Alertbox post: Kinect Gestural UI: First Impressions.  Although he did not review Dance Central, he concludes that the game he reviewed, Kinect Adventures, was fun to play, despite usability problems.


If this is a topic that interests you, I recommend you read Nielsen's post and take a look at the usability issues outlined there.  Also take a look at a recent essay Nielsen co-authored with Don Norman, another godfather of usability: Gestural Interfaces: A Step Backwards In Usability


Why is this topic important to me?
I have been involved in the Games for Health and Game Accessibility movement for many years.  Lately I've been exploring the OpenKinect project with an aim to create ways of making movement-oriented games accessible for young people with more complex disabilities.  For example, there is a need to have dance and movement games modified for students (and adults!) who need wheelchairs or walkers.  There are students who have milder mobility challenges who love to dance, and the current games don't address their needs.  Some of my students have vision or hearing impairments, too.  They deserve a chance to play things designed for the Kinect.

"OpenKinect is an open community of people interested in making use of the amazing Xbox Kinect hardware with our PCs and other devices. We are working on free, open source libraries that will enable the Kinect to be used with Windows, Linux, and Mac."

Note:  I currently work as a school psychologist with students up to age 22. My main office is adjacent to a large OT and PT room at Wolfe, a program for students who have special needs.   We just had a large interactive whiteboard installed in the room that is begging for us to connect it with the school's Wii,  and soon (we hope), a Kinect.   If we are going to use dance games to help promote healthy activities among our special students, the games need to be accessible for students with cognitive, motor, and other limitations.
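One accessibility modification I keep coming back to is calibration: instead of demanding the same gesture amplitude from every player, the game could record each player's comfortable range of motion and grade moves against that. Below is a toy sketch of the idea; it is my own speculation, not code from OpenKinect or any Kinect SDK, and every number in it is made up for illustration.

```javascript
// Hypothetical accessibility helper: calibrate to a player's own range of
// motion so that smaller or seated movements can register as "full" moves.
// calibrationSamples: hand heights (in meters) recorded while the player
// moves as far as is comfortable for them.
function makeNormalizer(calibrationSamples) {
  const min = Math.min(...calibrationSamples);
  const max = Math.max(...calibrationSamples);
  const range = max - min || 1; // guard against a zero-size range
  // Returns 0..1: where the player's hand is within their OWN range,
  // clamped so out-of-range readings still make sense.
  return (y) => Math.min(1, Math.max(0, (y - min) / range));
}
```

A player using a wheelchair might calibrate a 20 cm vertical range; a standing player might calibrate 80 cm. Both would then be scored on the same 0-to-1 scale, which is the kind of modification I'd like to see dance games support.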

FIRST STEPS
Although I can dance, I understand what the world is like through the eyes of many of the young people I work with who have motor coordination and sensory integration problems that interfere with their ability to  move and dance, let alone access fast-paced dance games on the Wii or Kinect. 


My initial plan is to look at what the new dance games might be like from the view of someone who doesn't know how to dance, and admits to having "two left feet" - and perhaps no sense of rhythm.  Where would I start?


Wii's Just Dance2 seems to offer some support for learning how to dance through the use of simple movement icons, in the form of outlined figures, that provide information about how to move with the dancer on the screen. As you can see from the video below, the gamer is provided with information about upcoming moves throughout the game.


I decided to take a look at Just Dance2's  MIKA "Big Girl" (You Are Beautiful) because some of the adolescent females I work with have weight concerns that interfere with their health. During the teen years, this can become a vicious cycle, resulting  in less movement, and less participation with peers in physical activities, such as playing dance games.  If a teen has depression as part of this mix, we know that exercise can help,  and a fun dance game might be a life-saver, in more ways than one.




The screen shots below show how the movement icons are used in the game:








I thought it would be useful to learn more about the story behind the making of Just Dance 2.
At 2:22, Alexia, the project's usability expert, makes her presence known. From what I can tell, she focused on aspects that would make the game more usable for non-dancers, including those with "two left feet".  (I don't know if anyone was consulted about accessibility concerns for the game.)


Kinect Dance Central
Dance Central uses a different approach when it comes to "teaching" people how to dance through the game. It would be interesting to test Dance Central and Just Dance 2 with the same set of people to get a better feel for what works and what doesn't.  Below is a video that previews, in split-screen, the interaction that takes place in Dance Central:

Dance Central Full Motion Preview


In Dance Central, gamers are provided with information about the moves through icons that cycle up the right-hand side of the screen.  The level of dance coordination needed to keep up with the moves is challenging at times, even for people who are OK at dancing.  Players can select dances according to level of difficulty.


Kinect Usability with Regular People

Steve Cable (CX Partners) shared his team's look at usability issues related to the Kinect by testing several games, including Dance Central, with groups of people in his article, "Designing for XBox Kinect - a usability study".  The quote below is from Steve's article:

"We’ve loved playing with the Kinect. There’s no doubt that the game play is lots of fun. In-game menus are a barrier to that fun. Kinect should allow players to move through menus quickly and compensate for inaccuracy.

We felt the Kinect would benefit from some standardised global controls – much like a controller uses the A button to select and the B button to move backwards. We also think it needs a more responsive pause gesture – one that doesn’t interfere with the user’s game play.

Most of our participants found the Dance Central menu to be more effective, more efficient and more satisfying to use. Here are our recommendations for designing a Kinect menu interface:
  1. Allow users to make selections through positive gestures, rather than timed positions
  2. Place options on a single axis to make them easier and quicker to select
  3. Allow users to control menus with the game pad if they prefer
  4. Use large easy to read text
  5. Don’t make users scroll through options unnecessarily – it takes too long
  6. Users will be distracted if used in a social setting – test your menus in a social context to see if they are prone to errors
  7. Avoid the cursor metaphor, it’s not what gamers are used to seeing in game menus, and makes it harder to implement alternative joypad controls"
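Recommendation 1 above (positive gestures rather than timed positions) is worth making concrete. Here is a toy classifier of my own devising, not from any Kinect SDK: it treats a menu selection as a deliberate "push" toward the sensor rather than a dwell timer, so a player who hesitates over an option never selects it by accident. The 12 cm travel threshold is an illustrative guess.

```javascript
// Toy "positive gesture" detector: a selection counts only if the hand
// moves toward the sensor (decreasing depth, in meters) monotonically
// by at least minTravel within the sampled window. A hovering or
// hesitating hand never triggers a selection, unlike a dwell timer.
function detectPush(zSamples, minTravel = 0.12) {
  if (zSamples.length < 2) return false;
  for (let i = 1; i < zSamples.length; i++) {
    if (zSamples[i] > zSamples[i - 1]) return false; // drifted backwards
  }
  return zSamples[0] - zSamples[zSamples.length - 1] >= minTravel;
}
```

The design choice mirrors the A-button analogy in the quote: the user performs an explicit action to confirm, instead of the system inferring intent from elapsed time.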

Below are screen shots that provide examples of how the movement icons are displayed in Dance Central:







RELATED
Just Dance 2 Review: Get your body moving. No, really. Give it a shot.
Keza MacDonald, 10/19/10







Jan 28, 2011

"Microsoft is Imagining a NUI Future". You can, too!

Microsoft is Imagining a NUI Future
Steve Clayton, Next at Microsoft Blog, 1/26/11


"Our research shows that the vast majority of people polled in both developed and emerging markets see great potential for NUI applications beyond entertainment. This is especially true in China and India, where 9 out of 10 respondents indicate they are likely to use NUI technology across a range of lifestyle areas – from work, education and healthcare, to social connections, entertainment and the environment. We believe that taking technology to the next billion can be aided by NUI – making technology more accessible and more intuitive to a wider audience". - Steve Clayton, Microsoft


The people at Microsoft don't own the concept!  I'm a member of the NUI Group (since May 2007) and SparkOn, both online communities where you can find people who live and breathe NUI, learn about their work, and even share designs and code. If you are intrigued by NUI - as a designer, developer, or user - please join us.


Note: 
I've been an evangelist and cheerleader for the NUI cause for many years.  If you search this blog for "post-WIMP", "NUI", "multi-touch", "gesture", "off-the-desktop", "natural user interaction", "natural user interface", or even "DOOH", you'll be provided with an overwhelming number of posts that include videos, photographs, and links to NUI-related resources, including scholarly articles.  There is a small-but-growing number of people from many disciplines quietly working on NUI-related projects.


RELATED
Microsoft Plans a Natural Interface Future Full of Gestures, Touchscreens, and Haptics
Kit Eaton, Fast Company, 1/26/11
Rethinking Computing (video)
Craig Mundie, Microsoft
Interactive Touch-Screen Technology, Participatory Design, and "Getting It" - Revised
Touch Screen Interaction in Public Spaces:  Room for Improvement, if "every surface is to be a computer".

Jan 15, 2011

Saturday Quick Links: Digital Signage and Digital Out of Home (DOOH), via The Digital Signage Insider, Digital Signage Today, and more

The links:

Walmart and Microsoft Talk Up Digital Signage at the NRF
Bill Gerba, The Digital Signage Insider, 1/14/11
Wirespring


NRF: Walmart unveils ROI data for in-store digital signage campaigns
James Bickers, Digital Signage Today, 1/10/11


"C-Tailing" (Converged Retailing)  - NCR
NRF: Convergent technologies on the retail horizon
Cherryh Butler, Digital Signage Today, 1/11/11

RELATED
Connecting Your Business to Devices and Customers with Windows Embedded

View more presentations from Microsoft Windows Embedded.
Barb Edson, Sr. Director of Marketing, Windows Embedded



Here are a few resources related to this topic:


LocaModa blog

LocaModa's whitepapers

COMMENT
Designers and developers need to think about off-the-desktop technologies as a new form of the web/internet.  Information architects who understand interactive media/transmedia, cross-display/device, cross-platform, and interaction design within a broader context are sorely needed in this space.  


The presentation below, by Chris Thorne, Lead Information Architect and User Experience Consultant at the BBC, provides a good overview of this topic:

Over the past months, I've been gathering video and photos of my encounters with digital signage as a consumer/customer/user during everyday activities such as shopping, traveling, and vacationing.  These will be included in a post or series of posts in the near future.


I have content related to interaction with various QR tags, interactive kiosks, interactive displays and TV on a cruise ship, an interactive touch screen at J.C. Penney, and more.  If you are a regular reader of this blog, you know that my interests include the user experience of DOOH, interactive displays in public spaces, and so forth.


Note:  Despite all of the technological innovations in this converging field, issues related to context, usability, and accessibility are not consistently addressed from a broader systems point of view.

Dec 12, 2010

LM3LAB's Useful Map of Interactive Gesture-Based Technologies: Tracking fingers, bodies, faces, images, movement, motion, gestures - and more

Nicolas Loeillot, of LM3LABS, has been ahead of the natural user interaction/interface game for many years as his company has expanded. He's done quite a bit of deep thinking about the work of his company, and has used this wisdom to create a nice concept map that describes how LM3LABS' solutions fit into the world of gesture-based control and interaction:




In my opinion, this chart would make a great template for mapping out other natural interaction applications and products!


Here is the description of the concepts outlined in the chart:


"If all of them belong to the “gesture control” world, the best segmentation is made from 4 categories:
  • Finger tracking: precise finger tracking, it can be single touch or multi-touch (this latest not always being a plus). Finger tracking also encompasses hand tracking which comes, for LM3LABS products, as a gestures.
  • Body tracking: using one’s body as a pointing device. Body tracking can be associated to “passive” interactivity (users are engaged without their decision to be) or “active” interactivity like 3D Feel where “players” use their body to interact with content.
  • Face tracking: using user face as a pointing device. It can be mono user or multiple users. Face tracking is a “passive” interactivity tool for engaging user in an interactive relationship with digital content.
  • Image Tracking: Augmented Reality (AR) lets users use images (flyers, real products, t-shirts, faces,…) to interact with digital content. AR can be markerless or marker-based. Markerless technology has advantages but marker-based AR is easier for users to understand. (Please note here that Markerless AR is made in close collaboration with AR leader Total Immersion)."  -LM3LABS
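The four tracking categories above could be modeled as a simple data structure. Here is a minimal sketch in Python; all names are illustrative and are not part of any LM3LABS API:

```python
from enum import Enum

class TrackingCategory(Enum):
    """The four gesture-control segments described in LM3LABS' chart."""
    FINGER = "finger"  # precise single- or multi-touch pointing; includes hand tracking
    BODY = "body"      # whole body as a pointing device; passive or active (e.g. 3D Feel)
    FACE = "face"      # face as a pointing device; mono- or multi-user
    IMAGE = "image"    # Augmented Reality: marker-based or markerless image tracking

def can_be_passive(category: TrackingCategory) -> bool:
    """Per the chart's description, body and face tracking can engage users
    without an explicit decision to interact; finger and image tracking
    require deliberate input from the user."""
    return category in (TrackingCategory.BODY, TrackingCategory.FACE)
```

This is just one way to capture the segmentation; the passive/active distinction in particular is a simplification, since body tracking can be either.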
If you are interested in this subject and want to view some good examples of off-the-desktop interfaces and interactions, take a look at the LM3LABS blog, as well as Nicolas Loeillot's Vimeo channel.  Also take a look at the sample of posts I've written about LM3LABS over the last few years - the links are at the end of this post.

I love LM3LABS' Interactive Balloon:

Interactive balloons from Nicolas Loeillot on Vimeo.


Interactive Balloons v lm3 labs v2 (SlideShare)



Background
I first discovered LM3LABS when I was taking a VR class and researching interactive, immersive large displays in 2005 or 2006.  Back then, there wasn't much information about this sort of technology.  A lot has changed since then!


I've learned quite a bit from watching LM3LABS (and others) grow, given my passion for postWIMP interactive technology and my commitment to blogging about this subject.  Nicolas has really worked hard in this arena.  As early as 2005, LM3LABS was working with Scala to provide "smart" interactive displays, and his company's applications have been supported by computer vision technologies for many years, allowing for gesture-based, or "touch-less," interaction, as demonstrated by the Catchyoo Interactive Table.  This application caught my eye back in early 2007, when I was working on projects for large interactive displays for my HCI and ubicomp classes and was thinking about creating a table-top application.


My hunch is that LM3LABS has set the foundation for further growth in the future, given the lessons they've learned by taking risks with postWIMP technologies over the past few years!


Previous Blog Posts Related to LM3LABS:
Interactive Retail Book (Celebrating the History of Christian Dior from 1948-2010) (video)
Ubiq Motion Sensor Display at Future Ready Singapore (video)
Interactive Virtual DJ on a Transparent Pane, by LM3LABS and Brief Ad
LM3LABS' Catchyoo Interactive Koi Pond: Release of ubiq'window 2.6 Development Kit and Reader
A Few Things from LM3LABS
LM3LABS, Nicolas Loeillot, and Multi-touch
More from LM3LABS: Ubiq'window and Reactor.cmc's touch screen shopping catalog, Audi's touch-less showroom screen, and the DNP Museum Lab.


About LM3LABS
"Founded in 2003 by a team of passionate researchers, engineers, designers, and marketers from various international backgrounds, focused on fast transformation of innovation into unique products, LM3LABS is a recognized pioneer in computer vision-based interactivity solutions. Keeping a strong customer focus, LM3LABS' team of unique people pioneers new directions, explores new concepts, new technologies and new interactions.  Engaging, playful and magic, LM3LABS' products and solutions are always scalable and reliable." -LM3LABS

info@lm3labs.com

Note to readers:
Over the past couple of years there has been an explosion of postWIMP technologies and applications, and at this pace, it has been difficult for me to keep abreast of it all. There is quite a bit I miss, given my full-time job and daily life!

I welcome information about postWIMP interactive technologies and applications from my readers.  Due to time constraints (not lack of interest), I am not always able to post about a topic as soon as I'd like.  That is OK, as my intention is not to be the first blogger to spread the latest tech news.  I like to dig in deep when I can and make connections between innovative, interesting technologies and the people and ideas behind them.