Dec 12, 2010

LM3LAB's Useful Map of Interactive Gesture-Based Technologies: Tracking fingers, bodies, faces, images, movement, motion, gestures - and more

Nicolas Loeillot, of LM3LABS, has been ahead of the natural user interaction/interface game for many years as his company has expanded. He's done quite a bit of deep thinking about the work of his company, and has used this wisdom to create a nice concept map that describes how LM3LABS' solutions fit into the world of gesture-based control and interaction:




In my opinion, this chart would make a great template for mapping out other natural interaction applications and products!


Here is the description of the concepts outlined in the chart:


"If all of them belong to the “gesture control” world, the best segmentation is into four categories:
  • Finger tracking: precise finger tracking, which can be single-touch or multi-touch (the latter not always being a plus). Finger tracking also encompasses hand tracking, which comes, in LM3LABS products, as gestures.
  • Body tracking: using one’s body as a pointing device. Body tracking can be associated with “passive” interactivity (users are engaged without deciding to be) or “active” interactivity like 3D Feel, where “players” use their bodies to interact with content.
  • Face tracking: using the user’s face as a pointing device. It can be single-user or multi-user. Face tracking is a “passive” interactivity tool for engaging the user in an interactive relationship with digital content.
  • Image tracking: Augmented Reality (AR) lets users use images (flyers, real products, t-shirts, faces,…) to interact with digital content. AR can be markerless or marker-based. Markerless technology has advantages, but marker-based AR is easier for users to understand. (Please note that markerless AR is made in close collaboration with AR leader Total Immersion)."  -LM3LABS
If you are interested in this subject and want to view some good examples of off-the-desktop interfaces and interactions, take a look at the LM3LABS blog, as well as Nicolas Loeillot's Vimeo channel. Also see the sample of posts I've written about LM3LABS over the last few years; the links are at the end of this post.

I love LM3LABS' Interactive Balloon:

Interactive balloons from Nicolas Loeillot on Vimeo.


Interactive Balloons v lm3 labs v2 (SlideShare)



Background
I first discovered LM3LABS when I was taking a VR class and researching interactive, immersive large displays in 2005 or 2006.  Back then, there wasn't much information about this sort of technology.  A lot has changed since then!


I've learned quite a bit from watching LM3LABS (and others) grow, given my passion for postWIMP interactive technology and my commitment to blogging about this subject. Nicolas has really worked hard in this arena. As early as 2005, LM3LABS was working with Scala to provide "smart" interactive displays, and his company's applications have been supported by computer vision technologies for many years, allowing for gesture-based, "touch-less" interaction, as demonstrated by the Catchyoo Interactive Table. That application caught my eye back in early 2007, when I was working on large interactive display projects for my HCI and Ubicomp classes and thinking about creating a table-top application.


My hunch is that LM3LABS has set the foundation for further growth in the future, given the lessons they've learned by taking risks with postWIMP technologies over the past few years!


Previous Blog Posts Related to LM3LABS:
Interactive Retail Book Celebrating the History of Christian Dior, 1948-2010 (video)
Ubiq Motion Sensor Display at Future Ready Singapore (video)
Interactive Virtual DJ on a Transparent Pane, by LM3LABS and Brief Ad
LM3LABS' Catchyoo Interactive Koi Pond: Release of ubiq'window 2.6 Development Kit and Reader
A Few Things from LM3LABS
LM3LABS, Nicolas Loeillot, and Multi-touch
More from LM3LABS: Ubiq'window and Reactor.cmc's touch screen shopping catalog, Audi's touch-less showroom screen, and the DNP Museum Lab.


About LM3LABS
"Founded in 2003 by a team of passionate researchers, engineers, designers, and marketers from various international backgrounds, focused on fast transformation of innovation into unique products, LM3LABS is a recognized pioneer in computer vision-based interactivity solutions. Keeping a strong customer focus, LM3LABS' team of unique people pioneers new directions, explores new concepts, new technologies and new interactions. Engaging, playful and magic, LM3LABS' products and solutions are always scalable and reliable." -LM3LABS

info@lm3labs.com

Note to readers:
Over the past couple of years there has been an explosion of postWIMP technologies and applications, and at this pace, it has been difficult for me to keep abreast of it all. There is quite a bit I miss, given my full-time job and daily life!

I welcome information about postWIMP interactive technologies and applications from my readers.  Due to time constraints, not interest, I am not always able to post about a topic as soon as I'd like.  That is OK, as my intention is not to be the first blogger to spread the latest tech news.  I like to dig in deep when I can and make connections between innovative, interesting technologies and the people and ideas behind them. 




Dec 11, 2010

SMALLab Update: Embodied and Engaged Learning - ASU researchers partner with GameDesk

SMALLab is an interdisciplinary collaborative project at the Arts, Media and Engineering program at Arizona State University, and includes people from fields such as education, art, theatre, computer science, engineering, and psychology.  The SMALLab provides students with a multi-sensory, multi-modal way of learning concepts in an immersive environment, and uses a motion capture system that tracks the position of the students as they move and interact within the environment.

SMALLab's project lead is David Birchfield,  a media artist, researcher, and educator who focuses on K-12 learning, media art installations, and live computer music performances.  SMALLab researchers have recently partnered with GameDesk to develop a 6th grade curriculum for a GameDesk charter school in 2012. (Information and links related to GameDesk are located in the RELATED section of this post.)

Below is a detailed excerpt from an overview of SMALLab:
"In today’s world, digital technology must play a central role in students’ learning. A convergence of trends in the learning sciences and human-computer interaction (HCI) research offers new theoretical and technological frameworks for learning. In particular, mixed-reality, experiential media systems can support learning in a way that is social, collaborative, multimodal, and embodied. These systems comprise a new breed of student-centered learning environments (SCLEs). Importantly, they must address the practicalities of today’s classrooms and informal learning environments (e.g., space, infrastructure, financial resources) while embracing the innovative forms of interactivity that are emerging from our media research communities (e.g., multimodal sensing, real-time interactive media, context-aware computing)...
...SMALLab is an extensible platform for semi-immersive, mixed-reality learning. By semi-immersive, we mean that the mediated space of SMALLab is physically open on all sides to the larger environment. Participants can freely enter and exit the space without the need for wearing specialized display or sensing devices such as head-mounted displays (HMD) or motion capture markers. Participants seated or standing around SMALLab can see and hear the dynamic media, and they can directly communicate with their peers that are interacting in the space. As such, the semi-immersive framework establishes a porous relationship between SMALLab and the larger physical learning environment. By mixed-reality, we mean that there is an integration of physical manipulation objects, 3D physical gestures, and digitally mediated components. By extensible, we mean that researchers, teachers, and students can create new learning scenarios in SMALLab using a set of custom designed authoring tools and programming interfaces."

Below are a few videos about SMALLab, and information about GameDesk, an organization that is collaborating with SMALLab in California.


Below is a demonstration of a SMALLab learning activity:

SMALLab from SMALLab on Vimeo.


RELATED
Sara Corbett, NYTimes Magazine, 9/15/10

Info about GameDesk, from the GameDesk website:
"GameDesk is a 501(c)3 nonprofit research and outreach organization that seeks to reshape models for learning through game-play and game development. The organization looks to help close the achievement gap and engage students to learn core STEM curriculum. It develops project-based learning with a strong focus on purpose, ownership, and personal value. The organization (originally developed out of research and support at the University of Southern California's IMSC) has now been in development, practice, and/or evaluation for over two years in various schools in the Los Angeles area." -Gamedesk

Gamedesk Concept Chart

Cross-posted on the Tech Psych blog.

Very Cute! Department of Defense Acquisition Mini Learning Games

If you are the Department of Defense, how do you make sure your workers in the Acquisition Department engage in required learning activities?   


Games!  You can access the games via the Defense Acquisition University game portal. Below are some screenshots, descriptions, and links:


Procurement Fraud Indicators

"Investigate potential Procurement Fraud Indicators in this game which allows you to form hypotheses, test your theories, even question individuals who might have something to hide!" -CLC DAU

Homeward Bound
DoD Casual Games
"Join Ratner's friends and help guide him back to the Pentagon, across rivers, highways, and highly guarded walls, using your knowledge of Acquisition Strategy and Contract Execution." -CLC DAU


Acquisition Proposition

"How well do you know the Acquisition Lifecycle? Test your knowledge in this fast paced game!" -CLC DAU


About the Defense Acquisition University
"The Defense Acquisition University is the one institution that touches nearly every member of the Defense Acquisition Workforce throughout all career stages. The university provides a full range of basic, intermediate, and advanced certification training, assignment-specific training, applied research, and continuous learning opportunities. The university also fosters professional development through mission assistance, rapid-deployment training on emerging acquisition initiatives, online knowledge-sharing tools, and continuous learning modules." - DAU Website


RELATED
Listen to the DoD Roundtable:  Interview and discussion about the casual learning games, featuring Dr. Alicia Sanchez, Games Czar, Defense Acquisition University
DoD Roundtable Transcript (pdf)
Defense video games perfectly capture excitement of acquisition process
Stephen Losey, Fedline, 12/10/10
DoD launches its own casual games site
milgamer, 12/8/10

Quick Post: Journey, the next game from thatgamecompany (developers of Flower, flOw, and Cloud).

I've been following the work of some of the people behind thatgamecompany since they were graduate students at USC, working on Cloud, an enchanting and relaxing game. They went on to develop Flower and flOw, and are now working on Journey, the next game planned for release:





To view video trailers of other games by thatgamecompany, see the following post:


Games to Lift Stress Away: Flower, flOw (and Cloud), from thatgamecompany

Also visit thatgamecompany's website!

Gesture "multitouch" 12 x 7 interactive video wall provides tours of I/O Data Centers' facilities

I came across this demonstration of I/O Data Centers' 12-foot by 7-foot interactive video wall, which makes exploring views of data center modules... interesting! The display is a gesture-based "multi-touch" system. (I'll update this post when I get more information.)



Here is the description from the Datacenter YouTube channel:


"Instead of hauling a 40-foot long modular data center to a trade show, i/o Data Centers is taking a high-tech approach to customer tours of their i/o Anywhere modular data center. The i/o team has created a 12-foot by 7-foot touchscreen video wall to provide interactive tours of the company's facilities. Selecting a "hot spot" pops up a virtual data center, complete with cross sections and product info, following the concept of the touch screens in the sci-fi movie "Minority Report.""


FYI: I/O Data Centers has an application that runs on the Surface.

UPCOMING:
Stay tuned for my upcoming posts! 


News about LM3LABS (Previous post)
Interactive Surveillance: Celine Latulipe (technologist) and Annabel Manning (artist)

Dec 9, 2010

Interested in the OpenNI Initiative? OpenKinect? To learn more, read Josh Blake's Interview of Tamir Berliner of PrimeSense




Josh Blake, Deconstructing the NUI, 12/9/10



Josh Blake recently interviewed Tamir Berliner, one of the founders of PrimeSense.  If you haven't heard, Microsoft's Kinect is based on PrimeSense technology, which Microsoft licensed. PrimeSense provides consumer electronics with natural user interaction capabilities. The good news is that the company recently released open-source middleware for natural interaction, along with depth-camera drivers. It will be interesting to see how this plays out in the near future!




In the interview, Tamir discussed a number of topics related to postWIMP technologies.  He also announced the newly created OpenNI, "an industry-led, not-for-profit organization formed to certify compatibility and interoperability of Natural Interaction (NI) devices, applications, and middleware."  It is good to see this level of support for the cause!


Here is a quote from the interview that I especially liked:

"I believe that until today, the devices we’ve been using made us learn a great deal about them before we could use them and gain their value. I’m pretty sure everyone reading this has at least 3 remotes sitting on their living room table, and at least once a week needs to help someone use their computer/media center/phone/etc. It’s time for that to change, and it’s up to us, the technologists, to make this revolution happen. It’s time for devices to take the step of understanding what we want and making sure we get it, even without asking, whether it’s a trivial task like opening a door when we approach, turning off the lights when we leave the room, or even making sure we have hot water to shower with when we return from work or wake up in the morning, depending on what we normally do." -Tamir


RELATED
Here are a couple of videos from the OpenNI website that demonstrate OpenNI-compliant applications:

OpenNI-compliant real-time skeleton tracking by PrimeSense


OpenNI-compliant real-time SceneAnalyzer by PrimeSense



FYI: 
Josh Blake is the author of the Deconstructing the NUI blog. Over the past couple of years, he's explored natural user interfaces and interactions through his work on applications designed for Microsoft Surface and Win7 with Windows Presentation Foundation.
About a month ago, Josh organized OpenKinect, an online community to support collaboration among people interested in exploring ways to use Kinect with PCs and other devices.  An example of this effort is the open-source project libfreenect, which includes drivers and libraries for Windows, Linux, and OS X. 
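To give a feel for what depth-camera data from projects like libfreenect makes possible, here is a minimal, hypothetical sketch (plain Python with NumPy, not actual libfreenect API calls) of the basic primitive behind touch-less pointing: finding the nearest point in a depth frame, which is usually an outstretched hand or fingertip. Kinect-class sensors report per-pixel distances in millimeters, with 0 meaning "no reading"; the frame below is synthetic.

```python
import numpy as np

def nearest_point(depth_frame, min_valid=1):
    """Return (row, col, depth_mm) of the closest valid pixel.

    A depth of 0 means 'no reading' on Kinect-class sensors, so those
    pixels are masked out before taking the minimum.
    """
    # Replace invalid pixels with a huge depth so argmin ignores them.
    masked = np.where(depth_frame >= min_valid, depth_frame,
                      np.iinfo(np.uint16).max)
    idx = np.unravel_index(np.argmin(masked), masked.shape)
    return idx[0], idx[1], int(depth_frame[idx])

# Synthetic 480x640 frame: a background ~2 m away, one "fingertip"
# at 0.8 m, and a dead pixel (0) that must be ignored.
frame = np.full((480, 640), 2000, dtype=np.uint16)
frame[100, 320] = 800   # the fingertip
frame[50, 50] = 0       # no reading

row, col, mm = nearest_point(frame)
print(row, col, mm)     # prints: 100 320 800
```

Tracking that nearest point frame-to-frame (plus some smoothing) is enough to drive a simple touch-less cursor; full skeleton tracking, as in the PrimeSense videos above, builds far more sophisticated models on top of the same depth data.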


The Natural User Interface Revolution
Josh Blake, 1/5/09


Kinect for Xbox 360: The inside story of Microsoft's secret 'Project Natal'  (long, but worth reading) David Rowan, Wired UK, 10/29/10


People of libfreenect

OpenNI User Guide (pdf)