Showing posts sorted by relevance for query "natural user interaction". Sort by date Show all posts

Jul 21, 2012

Musings about NUI, Perceptive Pixel and Microsoft, Rapid Creative Prototyping (Lots of video and links) Revised

It just might be the right time for everyone to brush up on 21st century tech skills. iPads and touch phones are ubiquitous. Touch-enabled interactive whiteboards and displays are in schools and boardrooms. With Microsoft's Windows 8 and the news that the company recently acquired Jeff Han's company, Perceptive Pixel, I think that there will be good support, and more opportunities, for designers and developers interested in moving from GUI to NUI.


In the video below, from CES 2012, Jeff Han provides a good overview of where things are heading. We are in a post-WIMP world and there is a lot of catching up to do!

CES 2012  Perceptive Pixel and the Future of Multitouch (IEEE Spectrum YouTube Channel)



During the video clip, Jeff explains how far things have come during the past few years:
 "Five and 1/2 years ago I had to explain to everybody what multi-touch was and meant. And then, frankly, we've seen some great products from folks like Apple, and really have executed so brilliantly, that everyone really sees what a good implementation can be, and have come to expect it.  I also think though, that the explosion of NUI is less about just multi-touch, but an awareness that finally people have that you don't have to use a keyboard and mouse, you can demand something else beside that.  People are now willing to say, "Oh, this is something I can try, you know, touch is something I can try as my friendlier interface"."

Who wouldn't want to interact with a friendlier interface? Steve Ballmer doesn't curb his enthusiasm about Windows 8 and Perceptive Pixel. Jeff Han is happy with how designs created in Windows 8 scale for use on screens large and small. He explains how Windows 8 can support collaboration. The Story Board application (7:58) on the large touchscreen display looks interesting.

I continue to be frustrated by the poor usability of many web-based and desktop applications. I like my iPad, but only because so many dedicated souls have given some thought to the user experience when creating their apps. I am often disappointed by the interactive displays I encounter when I'm out and about during the day. It is 2012, and it seems that there are a lot of application designers and developers who have never read Don Norman's The Design of Everyday Things!



I enjoy making working prototypes and demo apps, but my skill set is stuck in 2008, the last year I took a graduate-level computer course.  I was thinking about taking a class next semester, something hands-on, creative, and also practical, to move me forward. I can only do so much when I'm in the DIY mode alone in my "lab" at home.  I need to explore new tools, alongside like-minded others.  


There ARE many more tools available to designers and developers than there were just four years ago. Some of them are available online, free, or for a modest fee. I was inspired by a link posted by my former HCI professor, Celine Latulipe, to her updated webpage devoted to Rapid Prototyping tools. The resources on her website look like a good place to start for people who are interested in creating applications for the "NUI" era. (Celine has worked on many interesting projects that explore how technology can support new and creative interaction, such as Dance.Draw.) Below is her description of her updated HCI resources:

"New HCI resource to share: I have created a few pages on my web site devoted to Rapid Prototyping tools, books, and methods. These pages contain reviews of various digital tools, including 7 different desktop prototyping apps, and including 8 different iPad apps for wireframing/prototyping. I hope it's useful to others. Feel free to share... and please send me comments and suggestions if you find anything inaccurate, or if you think there is stuff that I should be adding. I will be continuing to update this resource." -http://www.celinelatulipe.com (click on the rapid prototyping link at the top)



IDEAS
Below are just a few of the ideas I'd like to implement in some way. I can't claim ownership of these ideas; they are mash-ups of what comes to me in my dreams, usually after reading scholarly publications from ACM or IEEE, or attending tech conferences.
  • An interactive timeline (multi-dimensional, multi-modal, multimedia) for off-the-desktop interaction, collaboration, and data/information analysis and exploration. It might be useful for medical researchers, historians, genealogists, or people who are into the "history of ideas". Big Data folks would love it, too. It would handle data from a variety of sources, including sensor networks. It would be beautiful to use.
  • A web-based system of delivering seamless interactive, multi-modal, immersive experiences, across devices, displays, and surfaces. The system would support multi-user, collaborative interaction.  The system would provide an option for tangible interaction.
  • A visual/auditory display interface that presents network activity, including potential intrusions, malfunctions, or anything that needs immediate attention that would be likely to be missed under present monitoring methods. 
  • Interactive video tools for creation, collaboration, storytelling.  (No bad remote controllers needed.)
  • A "wearable" that provides new ways for people to express and communicate creatively, through art, music, dance, with wireless capability. (It can interact with wireless sensor networks.)*
  • A public health application designed to provide information useful in understanding and preventing sepsis. This application would utilize the timeline concept described at the top of this list. This concept could also be useful in analyzing other medical puzzles, such as autism.
Most of these ideas could translate nicely to educational settings, and the focus on natural user interaction and multi-modal i/o aligns with the principles of Universal Design for Learning, something that is important to consider, given the number of "at-risk" learners and young people who have disabilities.
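As a toy illustration of the timeline idea above: the easy first step is merging timestamped events from several pre-sorted sources (sensor readings, archival records, publications) into one chronological stream. Everything here is my own sketch, not code from any existing system:

```python
from heapq import merge

def merged_timeline(*sources):
    """Merge several event streams into one chronological timeline.

    Each source is a list of (timestamp, label) pairs, already sorted
    by timestamp. heapq.merge combines the pre-sorted streams lazily,
    so large sources don't have to be concatenated and re-sorted.
    """
    return list(merge(*sources))

# Two hypothetical sources: sensor readings and archival records.
sensors = [(1, "sensor reading A"), (5, "sensor reading B")]
records = [(2, "archival record X"), (4, "archival record Y")]
timeline = merged_timeline(sensors, records)
```

The hard parts of the idea (multi-modal rendering, collaboration, beauty) are of course not captured here; this only shows that the data model can start very simply.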

I welcome comments from readers who are working on similar projects, or who know of similar projects. I also encourage graduate students and researchers who are interested in natural user interfaces to move forward with an off-the-desktop NUI project. I hope that my efforts can play a part in helping people make the move from GUI to NUI!



Below are a few videos of some interesting projects, along with a list of a few references and links.


SMALLab (Multi-modal embodied immersive learning)


PUPPET PARADE: Interactive Kinect Puppets (CineKid 2011)



MEDIA FACADES: When Buildings Start to Twitter

HUMANAQUARIUM (CHI 2012)

 

NANOSCIENCE NRC Cambridge (Nokia's Morph project)






 
Examples: YouTube Playlists
POST WIMP EXPLORERS' CLUB
POST-WIMP EXPLORER'S CLUB II

Web Resources
Celine Latulipe's Rapid Prototyping Resources 
Creative Applications
NUI Group: Natural User Interface Group
OpenFrameworks and Interactive Multimedia: Funky Forest Installation for CineKid
SMALLab Learning
OpenExhibits: Free multi-touch + multiuser software initiative for museums, education, nonprofits, and students.
OpenSense Wiki 
CINEKID 2012 Website 
Multitouch Systems I Have Known and Loved (Bill Buxton)
Windows 8
Perceptive Pixel
Books
Natural User Interfaces in .NET: WPF 4, Surface 2, and Kinect (Josh Blake, Manning Publications)
Chapter 1 pdf (Free)
Brave NUI World: Designing Natural User Interfaces for Touch and Gesture (Daniel Wigdor and Dennis Wixon)
Designing Gestural Interfaces (Dan Saffer)
Posts
Bill Snyder, ReadWrite Web, 7/20/12

I noticed some interesting tools on the Chrome web store - I plan to devote a few more posts to NUI tools in the future.

Jul 18, 2008

Natural User Interface: Overview of multi-touch technology and application development by Harry van der Veen - Business to Buttons


Harry van der Veen from Natural User Interface Europe AB was one of the keynote speakers at the Business to Buttons: Designing for Effect conference, held in June 2008.
In this presentation video, Harry discusses the past, present, and future of multi-touch technology, and reviews the advantages of multi-touch over single-touch displays. He also provides a good overview of gesture interaction, something that he researched when he was a student. The presentation includes several video examples of multi-touch applications in action.

The presentation is well worth the 30-minute view!


"Harry van der Veen is a Bachelor of Multimedia, derived from the Dutch education Communication, Multimedia and Design, focused on Interaction Design and Project Management. He is CEO, co-founder and co-owner of the Sweden based commercial company Natural User Interface Europe AB, which focuses on delivering standardized and customized multi-touch hardware / software solutions and services to the global market. In addition to that, he co-founded the NUIGroup community, which is the worlds largest online platform where a global network of people share their ideas and information in an open source community, focused on multi-touch hardware and software solutions."

NUIGroup Community

Harry van der Veen's blog

Natural User Interface Europe AB (Harry van der Veen's company)

NUIGroup Wiki: This wiki includes tutorials for developing multi-touch applications, building your own low-cost multi-touch table, and information about current projects that are in progress.

Related Information:


The Business to Buttons: Designing for Effect conference was held on June 12-12 in Malmo, Sweden, organized by Malmo University and inUse, a user experience consultancy. Partners in this conference included Adaptive Path, a product experience strategy and design company, Patrick W. Jordan, a design, marketing, and brand strategist, the cocktail, a user experience and interaction design studio, cooper, a product design company, and OresundIT, a non-profit network.


Don Norman, the author of books such as "The Design of Everyday Things" and "The Design of Future Things", presented at this conference. Don Norman is one of the founding fathers of Human-Computer Interaction and related fields, and is the co-founder of the Nielsen Norman Group, a consulting firm that helps companies create human-centered products.

Videos of Don Norman's Presentations:
Emotional Design: Total User Experience
Cautious Cars and Cantankerous Kitchens

Other:
Business to Buttons 2008 Recorded Sessions

Business to Buttons 2008 Downloads

My posts about the work of NUI Group members:

Multi-Touch Plug-in for NASA World Wind?!

More Multitouch: NUI Group's Christopher Jette's multi-touch work featured in Engadget; Croquet?

More Multi-Touch from members of the NUI group!

Multi-touch Crayon Physics from multitouch-barcelona, inspired by Crayon Physics by Kloonigames

Cross Post: Seth Sandler's YouTube Video, "How to Make a Cheap Multi-touch Pad" goes viral

NUI-Group Member Bridger Maxwell Receives High School Science Fair Award for Multi-Touch Screen Project

Look, touch, listen, and play: Seth Sandler's interactive Audio Touch Table video; NUI Group and Google's Summer of Code


Apr 8, 2009

Joel Eden's Informative Post: Designing for Multi-Touch, Multi-User and Gesture-Based Systems

Joel Eden is a User Experience Consultant at Infragistics; he recently wrote a detailed article in the Architecture & Design section of Dr. Dobb's Portal, "Designing for Multi-Touch, Multi-User and Gesture Based Systems". I thought I'd share the link, since I've been writing on the same topic.

In his article, Joel explains the differences between traditional WIMP (Window, Icon, Menu, Pointer) interaction and gesture-based, multi-touch, and multi-user systems. These systems are also known as Natural User Interfaces, or NUIs. He recommends that "rather than trying to come up with new complicated ways to interact with digital objects, your first goal should be to try to leverage how people already interact with objects and each other when designing gesture based systems."

Joel goes on to outline UX (User Experience), IxD (Interaction Design), and HCI (Human-Computer Interaction) concepts that designers should consider when developing new systems - Affordances, Engagement, Feedback, and "Don't Make Us Think" - which he summarizes in the conclusion of his article.
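None of this code appears in Joel's article, but the idea of leveraging how people already handle physical objects can be illustrated with a toy gesture classifier. The function name and threshold below are my own invention, not any toolkit's API: two fingers moving together or apart reads naturally as zooming, while fingers keeping their spacing reads as dragging.

```python
import math

def classify_two_finger_gesture(start, end, threshold=20.0):
    """Classify a two-finger gesture as 'pinch', 'spread', or 'pan'.

    start, end: pairs of (x, y) touch points, one per finger, captured
    at the beginning and end of the gesture.
    threshold: minimum change in finger spacing (in pixels) before the
    gesture counts as a pinch or spread rather than a pan.
    """
    def spacing(points):
        (x1, y1), (x2, y2) = points
        return math.hypot(x2 - x1, y2 - y1)

    delta = spacing(end) - spacing(start)
    if delta < -threshold:
        return "pinch"    # fingers moved together -> zoom out
    if delta > threshold:
        return "spread"   # fingers moved apart -> zoom in
    return "pan"          # spacing roughly unchanged -> drag
```

A real recognizer would run continuously over touch events and track velocity and rotation as well, but even this crude version shows why such gestures need no explanation to new users.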

I especially liked Joel's references:

Clark, Andy. Supersizing the Mind: Embodiment, Action, and Cognitive Extension

Few, Stephen. Information Dashboard Design: The Effective Visual Communication of Data

Gibson, James J. The Ecological Approach to Visual Perception

Krug, Steve. Don't Make Me Think: A Common Sense Approach to Web Usability, Second Edition

Norman, Don. The Design of Everyday Things

Norman, Don. Things That Make Us Smart: Defending Human Attributes In The Age Of The Machine

I would also add the following references:
Bill Buxton
Multi-touch Systems I have Known and Loved
(Regularly updated!)
Sketching User Experiences: Getting the Design Right and the Right Design

"Our lack of attention to place, time, function, and human considerations means these fancy new technologies fail to deliver their real potential to real people." - Bill Buxton

Dan Saffer
Designing for Interaction: Creating Smart Applications and Clever Devices
Designing Gestural Interfaces

SAP
Touchscreen Usability in Short
(Summary by Gerd Waloszek of the SAP Design Guild)
SAP Design Guild Resources (User-Centered Design, User Experience, Usability, UI Guidelines, Visual Design, Accessibility)
Kevin Arthur (Synaptics)
Touch Usability
Bruce "Tog" Tognazzini
Ask Tog: Interaction Design Solutions for the Real World
Inclusive Design, Part I
First Principles of Interaction Design
John M. Carroll
Human Computer Interaction (HCI) (History of HCI)
Bill Moggridge
Designing Interactions
Ben Shneiderman
Leonardo's Laptop: Human Needs and the New Computing Technologies
Edward Tufte

Visual Explanations
Beautiful Evidence
The Visual Display of Quantitative Information
Envisioning Information
Rudolf Arnheim (Gestalt)
Art and Visual Perception: A Psychology of the Creative Eye

Update: A great reading list on general HCI. Some of the authors were involved in the early days of touch, bi-manual, and multi-touch interaction.

Jan's Top Ten List of Books on Human-Computer Interaction


FYI: If you know much about Windows Presentation Foundation, you probably know that Josh Smith, WPF guru, also works at Infragistics.


Feb 26, 2011

Why bother switching from GUI to NUI? - Asked and Answered by Josh Blake; My 2-cents; Stevie B’s Microsoft Research Video; Marco Silva’s NUI-HCI Presentation (and links)

In Chapter 1 of Natural User Interfaces in .NET, Josh Blake asks and answers a question posed by many people who have been under the spell of keyboard input and GUI/WIMP interaction:


Why bother switching from GUI to NUI?  The answer?  Read Chapter 1 (pdf) of the book - the chapter is free.


Here are a few of my personal reasons:  
1.  I want to buy the next version of the iPad or something like it.
2.  I want to buy a new large-screen Internet HD TV.
3.  I want to buy a Kinect.
4.  I do NOT want to interact with my new TV with a Sony remote.  Too many tiny buttons!


5. I do NOT want to interact with my new TV with a keyboard,  because it reminds me of...work.

6.  Most importantly: 

I want to design apps for the people I care about, and others with similar needs:
    My mom.
    My grandson.
    Moms and dads with kids in tow.
    People with special needs and/or health concerns, and the people who care for and guide them.
    Knowledge sharers and (life-long) learners....

RELATED

"Smart" Interactive Display Research

 
View more presentations from Marco Silva

My YouTube Playlist:
"Natural user interfaces, gesture interaction, multi-touch, natural interaction, post WIMP examples and more... "
RELATED - and somewhat related   
Encyclopedia:  Human Computer Interaction, Interaction Design, User Experience, Information Architecture, Usability and More (Interaction-Design.org)

Josh Blake's Blog: Deconstructing the NUI    Book: Chapter 1 (pdf)  Free!
Blake.NUI
"Blake.NUI is a collection of helpful controls, utilities, and samples useful for multi-touch and NUI development with WPF, Surface, and Silverlight."
 (This is not an exhaustive list.)


GUI to NUI Post-WIMP Manifesto:  TBA

Aug 31, 2010

Osmosis: Multi-touch systems for... everywhere!

Not long ago I had the opportunity to chat with Stuart McLean, the founder of Osmosis, a company that delivers customized multi-touch systems of hardware and software that support human-centered natural user interaction. Stuart has many years of experience working in more traditional IT/business roles, and knows from this experience that there is a better way to support human-computer interaction, including interaction between people.

Like many of us in the "NUI" community, Stuart was impressed by the video of Jeff Han's 2006 TED Talk, which demonstrated a variety of awesome multi-touch, multi-user applications on a high-resolution drafting table.  Stuart saw the importance of natural user interfaces and interaction and became involved with the NUI Group, a "global research community focused on the open discovery of natural user interfaces". 

Unlike traditional tech companies, Osmosis is a collaboration between a global network of engineers, designers, and developers who share the "NUI" vision. This collaboration enables the company to provide solutions for clients across a range of countries, cultures, and domains.


Below is a photo-gallery of some of the applications and systems developed by Osmosis:


Multi-touch by Osmosis
GALLERY
As you can see from the gallery photos, Osmosis provides a range of possibilities for clients and potential clients. All of the displays are high-definition. Some are projection systems, and others are displays with multi-touch sensing technology. Since the construction is modular, a variety of form factors are available. High-quality surround and domed sound systems are available. Applications include information kiosks, point of sale/digital signage, hospitality, presentation and training, education, and audio-visual performance and production. Osmosis also provides applications that support interaction with tangible objects.

Below are two videos that give a taste of what Osmosis is all about:

OSMOSIS DEMO REEL

Demo Reel from Osmosis on Vimeo.

MULTI-TOUCH EVERYWHERE

MT Everywhere from Osmosis on Vimeo.

I can see where some of these applications would be great in K-12 educational settings.  Just look at the joy on the faces of the kids in the Multi-Touch Everywhere video!

(Short video clips of the Osmosis applications in action can be found in the showcase page of the company's website.)

Jul 7, 2011

I want to travel around the globe and attend all of the cool conferences about innovative interactive technologies. Any sponsors? (Yes, I'm day-dreaming)

Here are a few I missed:


NIME 2011 OSLO: The International Conference on New Interfaces for Musical Expression - NIME is an outgrowth of a workshop held at CHI 2001 (Human Factors in Computing Systems).
"The NIME conference draws a varied group of participants, including researchers (musicology, computer science, interaction design, etc.), artists (musicians, composers, dancers, etc.) and developers (self-employed and industrial). The common denominator is the mutual interest in groundbreaking technology and music, and contributions to the conference cover everything from basic research on human cognition through experimental technological devices to multimedia performances." Just take a look at all of the presentations that were at NIME 2011!  NIME 2011 Program (pdf)


Touch the Web 2011: 2nd International Workshop on Web-Enabled Objects, June 20-24, 2011, Paphos, Cyprus (in conjunction with the International Conference on Web Engineering, ICWE)
"The vision of the Internet of Things builds upon the use of embedded systems to control devices, tools and appliances. With the addition of novel communications capabilities and identification means such as RFID, systems can now gather information from other sensors, devices and computers on the network, or enable user-oriented customization and operations through short-range communication. When the information gathered by different sensors is shared by means of open Web standards, new services can be defined on top of physical elements. In addition, the new generation of mobile phones enables a true mobile Internet experience. These phones are today's ubiquitous information access tool, and the physical token of our "Digital Me". These meshes of things and "Digital Me" will become the basis upon which future smart living, working and production places will be created, delivering services directly where they are needed."


Upcoming Conferences and Workshops


Ubicomp 2011, September 17-21, 2011, Beijing, China
"Ubicomp is the premier outlet for novel research contributions that advance the state of the art in the design, development, deployment, evaluation and understanding of ubiquitous computing systems. Ubicomp is an interdisciplinary field of research and development that utilizes and integrates pervasive, wireless, embedded, wearable and/or mobile technologies to bridge the gaps between the digital and physical worlds. The Ubicomp 2011 program features keynotes, technical paper sessions, specialized workshops, live demonstrations, posters, video presentations, panels, industrial exhibition and a Doctoral Colloquium."

"Given its multi-disciplinary nature, Ubicomp has developed a broad base of audience over the past 12 years. Key audience communities are: Human Computer Interaction, Pervasive Computing, Distributed and Mobile Computing, Real World Modeling, Sensors and Devices, Middleware and Systems research, Programming Models and Tools, and Human Centric Validation and Experience Characterization. More detailed information about the topical focus of Ubicomp can be found in the Call for Papers."



Eurodisplay 2011: XXXI International Display Research Conference, September 19-22, Bordeaux-Arcachon, France
"Eurodisplay 2011 in Bordeaux/Arcachon provides researchers, engineers, and technical managers a unique opportunity to present their results and update their knowledge in all display-related research fields... The two keynote addresses on the first day of the Eurodisplay 2011 Symposium on September 20th will be a unique chance to hear a global overview on the future of the display market (Samsung LCD) and on display applications in the automotive industry (Daimler AG)."


"Eurodisplay '11 in Bordeaux-Arcachon will be the right spot to learn more about the latest results on display research and related fields such as organic electronics. The cognitic science will give a new vision on the impact of display in our day to day life, not only from a perception but from the Information standpoint, since SID deals with "Information Display" and not only technologies... The two invited talks of Pr. Bernard Claverie and Dr. Lauren Palmateer will focus on this aspect... A dedicated business oriented session will address two important aspects of our business, from display company creation by Thierry Leroux to end user trends analysis for the TV market by Dr. Jae Shin. This conference will provide a global vision of our Display World, including not only the well-known display leaders but also the very active BRIC countries... Dr. V. Pellegrini Mammana will give a vivid illustration of this display industry dynamic in Brazil."

UIST Symposium, October 16-19, 2011, Santa Barbara, California
"UIST (ACM Symposium on User Interface Software and Technology) is the premier forum for innovations in the software and technology of human-computer interfaces. Sponsored by ACM's special interest groups on computer-human interaction (SIGCHI) and computer graphics (SIGGRAPH), UIST brings together researchers and practitioners from diverse areas that include traditional graphical & web user interfaces, tangible & ubiquitous computing, virtual & augmented reality, multimedia, new input & output devices, and CSCW. The intimate size, single track, and comfortable surroundings make this symposium an ideal opportunity to exchange research results and implementation experiences."


VisWeek 2011: Vis, InfoVis, VAST, October 23-28, Providence, RI
"Computer-based information visualization centers around helping people explore or explain data through interactive software that exploits the capabilities of the human perceptual system. A key challenge in information visualization is designing a cognitively useful spatial mapping of a dataset that is not inherently spatial and accompanying the mapping by interaction techniques that allow people to intuitively explore the dataset. Information visualization draws on the intellectual history of several traditions, including computer graphics, human-computer interaction, cognitive psychology, semiotics, graphic design, statistical graphics, cartography, and art. The synthesis of relevant ideas from these fields with new methodologies and techniques made possible by interactive computation are critical for helping people keep pace with the torrents of data confronting them."

6th Annual ACM Conference on Interactive Tabletops and Surfaces 2011 ITS 2011
November 13-16, 2011 Portopia Hotel, Kobe, Japan
"The Interactive Tabletops and Surfaces 2011 Conference (ITS) is a premiere venue for presenting research in the design and use of new and emerging tabletop and interactive surface technologies. As a new community, we embrace the growth of the discipline in a wide variety of areas, including innovations in ITS hardware, software, design, and projects expanding our understanding of design considerations of ITS technologies and of their applications."


AFFINE: 4th International Workshop on Affective Interaction in Natural Environments (ICMI 2011)  November 17, 2011, Alicante, Spain (CFP deadline is August 19, 2011)
Scope: "Computer gaming has been acknowledged as one of the computing disciplines which proposes new interaction paradigms to be replicated by software engineers and developers in other fields. The abundance of high-performance, yet lightweight and mobile devices and wireless controllers has revolutionized gaming, especially when taking into account the individual affective expressivity of each player and the possibility to exploit social networking infrastructure. As a result, new gaming experiences are now possible, maximizing users’ skill level, while also maintaining their interest to the challenges in the same, resulting in a state which psychologists call flow: “a state of concentration or complete absorption with the activity at hand and the situation”. The result of this amalgamation of gaming, affective and social computing has brought increased interest in the field in terms of interdisciplinary research........Natural interaction plays an important role in this process, since it gives game players the opportunity to leave behind traditional interaction paradigms, based on keyboards and mice, and control games using the same concepts they employ in everyday human-human interaction: hand gestures, facial expressions and head nods, body stance and speech. These means of interaction are now easy to capture, thanks to low-cost visual, audio and physiological signal sensors, while models from psychology, theory of mind and ergonomics can be put to use to map features from those modalities to higher-level concepts, such as desires, intentions and goals. 
In addition to that, non-verbal cues such as eye gaze and facial expressions can serve as valuable indicators of player satisfaction and help game designers provide the optimal experience for players: games which are not frustratingly hard, but still challenging and not boring.........Another aspect which makes computer gaming an important field for multimodal interaction is the new breed of multimodal data it can generate: besides videos of people playing games in front of computer screens or consoles, which include facial, body and speech expressivity, researchers in the field of affective computing and multimodal interaction may benefit from mapping events in those videos (e.g. facial signs of frustration) to specific events in the game (large number of enemies or obstacles close to the player) and infer additional user states such as engagement and immersion. Individual and prototypical user models can be built based on that information, helping produce affective and immersive experiences which maintain the concept of ‘flow’. This workshop will cover real-time and off-line computational techniques for the recognition and interpretation of multimodal verbal and non-verbal activity and behaviour, modelling and evolution of player and interaction contexts, and synthesis of believable behaviour and task objectives for non-player characters in games and human-robot interaction.The workshop also welcomes studies that provide new insights into the use of gaming to capture multimodal, affective databases, low-cost sensors to capture user expressivity beyond the visual and speech modalities and concepts from collective intelligence and group modelling to support multi-party interaction."

Intelligent User Interfaces (IUI 2012) Lisbon, Portugal, February 14-17 (pdf) (CFP Submission Deadline is October 21, 2011)
"Major Topics of Interest to IUI include: Intelligent interactive interfaces, systems, and devices, Ubiquitous interfaces, Smart environments and tools, Human-centered interfaces, Mobile interfaces, Multimodal interfaces, Pen-based interfaces, Spoken and natural language interfaces, Conversational interfaces, Affective and social interfaces, Tangible interfaces, Collaborative multiuser interfaces, Adaptive interfaces, Sensor-based interfaces, User modeling and interaction with novel interfaces and devices, Interfaces for personalization and recommender systems, Interfaces for plan-based systems, Interfaces that incorporate knowledge- or agent-based approaches, Help interfaces for complex tasks, Example- and demonstration-based interfaces, Interfaces for intelligent generation and presentation of information, Intelligent authoring systems, Synthesis of multimodal virtual characters and social robots, Interfaces for games and entertainment, learning based interactions, health informatics, Empirical studies and evaluations of IUI interfaces, New approaches to designing Intelligent User Interfaces, and related areas"


IXDA: Interaction 12, Dublin, Ireland, February 1-4, 2012
"Interaction|12 is an ideal venue to showcase the most inspiring and original stories in interaction design. We have a world of creative talent to tap into, so our conference roster will fill up fast. We’re on the lookout for thoughtful, original proposals that will inspire our community of interaction designers from all over the world. Do you have an uncommon or enlightening design story, valuable lessons learned from hands-on experience and want to be a part of the programme at Interaction|12? You do? Great!"


More to come!


BTW, I'd like to go to a few Urban Screens or Media Facades festivals:

Media Facades Festival Europe 2010 from MediaFacades on Vimeo


Of course, I'd like to go to educational technology, school psychology, and special education conferences...

Dec 9, 2010

Interested in the OpenNI Initiative? OpenKinect? To learn more, read Josh Blake's Interview of Tamir Berliner of PrimeSense




Josh Blake, Deconstructing the NUI, 12/9/10



Josh Blake recently interviewed Tamir Berliner, one of the founders of PrimeSense. If you haven't heard, Microsoft's Kinect is based on technology licensed from PrimeSense, a company that provides consumer electronics with natural user interaction capabilities. The good news is that the company recently open-sourced its middleware for natural interaction, along with depth-camera drivers. It will be interesting to see how this plays out in the near future!




In the interview, Tamir discussed a number of topics related to post-WIMP technologies. He also announced the newly created OpenNI, "an industry-led, not-for-profit organization formed to certify compatibility and interoperability of Natural Interaction (NI) devices, applications, and middleware." It is good to see this level of support for the cause!


Here is a quote from the interview that I especially liked:

"I believe that till today the devices we’ve been using, made us learn greatly lot about them before we could use them and gain their value. I’m pretty sure everyone who is reading this has got at least 3 remotes sitting on his living room table, and at least once a week needs to help someone use their computer/media center/phone/etc. It’s time for that to change and it’s up to us, the technologists to make this revolution happen, it’s time for the devices to take the step of understanding what we want and making sure we get that, even without asking if it’s a trivial task as opening a door when we approach, closing the lights when we leave the room, even making sure we have hot water to shower with when we return from work or wake up in the morning, depends on what we normally do." -Tamir


RELATED
Here are a couple of videos from the OpenNI website that demonstrate OpenNI-compliant applications:

OpenNI-compliant real-time skeleton tracking by PrimeSense


OpenNI-compliant real-time SceneAnalyzer by PrimeSense



FYI: 
Josh Blake is the author of the Deconstructing the NUI blog. Over the past couple of years, he's explored natural user interfaces and interactions through his work on applications designed for Microsoft Surface and Win7 with Windows Presentation Foundation.
About a month ago, Josh organized OpenKinect, an online community to support collaboration among people interested in exploring ways to use the Kinect with PCs and other devices. An example of this effort is libfreenect, an open-source project that includes drivers and libraries for Windows, Linux, and OS X.
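As a small taste of what working with libfreenect involves: the Kinect's depth camera reports 11-bit raw readings rather than metric distances, so applications convert them. The sketch below uses an empirical tangent fit shared by the OpenKinect community (the constants are that community approximation, not an official PrimeSense calibration, and real code would pull raw frames from the libfreenect driver).

```python
import math

def raw_depth_to_meters(raw):
    """Convert an 11-bit Kinect raw depth reading to approximate meters.

    Uses the empirical tangent fit from the OpenKinect community wiki.
    Readings at or above ~1092 (including 2047, the sensor's "no reading"
    value) fall outside the fit's valid range and are treated as infinite.
    """
    if not 0 <= raw < 1092:
        return float("inf")
    return 0.1236 * math.tan(raw / 2842.5 + 1.1863)
```

For example, a raw reading of 600 comes out to roughly 0.7 m, and larger raw values map to greater distances, which matches the sensor's working range of about 0.5 m to a few meters.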


The Natural User Interface Revolution
Josh Blake, 1/5/09


Kinect for Xbox 360: The inside story of Microsoft's secret 'Project Natal'  (long, but worth reading) David Rowan, Wired UK, 10/29/10


People of libfreenect

OpenNI User Guide (pdf)

Nov 29, 2010

International Conference on Multimodal Interaction: ICMI 2011 Call for Papers

The information below was taken from the website for the 13th International Conference on Multimodal Interaction. I'm excited about the range of topics that the conference will cover.  I look forward to sharing more about the work of the members of this group on this blog in the future!  (I've highlighted the topics that interest me the most.)

INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION CALL FOR PAPERS

The International Conference on Multimodal Interaction, ICMI 2011, will take place in Alicante, Spain, November 14-18, 2011, just after ICCV 2011 (in Barcelona, Spain). This is the thirteenth edition of the International Conference on Multimodal Interfaces, which for the last two years joined efforts with the Workshop on Machine Learning for Multimodal Interaction (ICMI-MLMI 2009 and 2010). Starting with this edition, the conference uses the new, shorter name.

The new ICMI is the premier international forum for multimodal signal processing and multimedia human-computer interaction. The conference will focus on theoretical and empirical foundations, varied component technologies, and combined multimodal processing techniques that define the field of multimodal interaction analysis, interface design, and system development. ICMI 2011 will feature a single-track main conference which includes: keynote speakers, technical full and short papers (including oral and poster presentations), special sessions, demonstrations, exhibits and doctoral spotlight papers. The conference will be followed by workshops. The proceedings of ICMI 2011 will be published by ACM as part of its series of International Conference Proceedings and will also be distributed to the attendees on USB memory sticks.


Topics of interest include but are not limited to:

  • Multimodal and multimedia interactive processing
    Multimodal fusion, multimodal output generation, multimodal interactive discourse and dialogue modeling, machine learning methods for multimodal interaction.
  • Multimodal input and output interfaces
    Gaze and vision-based interfaces, speech and conversational interfaces, pen-based and haptic interfaces, virtual/augmented reality interfaces, biometric interfaces, adaptive multimodal interfaces, natural user interfaces, authoring techniques, architectures.
  • Multimodal and interactive applications
    Mobile and ubiquitous interfaces, meeting analysis and meeting spaces, interfaces to media content and entertainment, human-robot interfaces and interaction, audio/speech and vision interfaces for gaming, multimodal interaction issues in telepresence, vehicular applications and navigational aids, interfaces for intelligent environments, universal access and assistive computing, multimodal indexing, structuring and summarization.
  • Human interaction analysis and modeling
    Modeling and analysis of multimodal human-human communication, audio-visual perception of human interaction, analysis and modeling of verbal and nonverbal interaction, cognitive modeling.
  • Multimodal and interactive data, evaluation, and standards
    Evaluation techniques and methodologies, annotation and browsing of multimodal and interactive data, standards for multimodal interactive interfaces.
  • Core enabling technologies
    Pattern recognition, machine learning, computer vision, speech recognition, gesture recognition.

Important dates

Workshop proposals: March 1, 2011
Paper and demo submission: May 13, 2011
Author notification: August 5, 2011
Camera-ready deadline: September 2, 2011
Conference: November 14-16, 2011
Workshops: November 17-18, 2011


General Chairs

Hervé Bourlard (Idiap)
Thomas S. Huang (Univ. of Illinois)
Enrique Vidal (Tech. Univ. of Valencia)

Program Chairs

Daniel Gatica-Perez (Idiap)
Louis-Philippe Morency (Univ. South. California)
Nicu Sebe (Univ. of Trento)

Demo Chairs

Kazuhiro Otsuka (NTT Comm. Sci. Lab.)
Jordi Vitrià (UB/CVC, Barcelona)

Workshop Chairs

Fernando de la Torre (Carnegie Mellon Univ.)
Alejandro Jaimes (Yahoo! Research, Barcelona)

Publication Chair

Jose Oncina (Univ. of Alicante)

Student & Doctoral Spotlight Chair

Li Deng (Microsoft Research and Univ. of Washington)

Sponsorship Chair

Nuria Oliver (Telefónica I+D)

Publicity Chair

Helen Mei-Ling Meng (CUHK, Hong Kong)

Local Organization Chair

Luisa Micó (Univ. of Alicante)

Treasurer

Jorge Calera (Univ. of Alicante)

Local organizers

Xavier Anguera (Telefónica I+D)
A. Javier Gallego Sánchez (Univ. of Alicante)
Ida Hui (CUHK, Hong Kong)
Jose Manuel Iñesta (Univ. of Alicante)
Alejandro Toselli (Tech. Univ. of Valencia)



RELATED
Accepted Papers for ICMI-MLMI 2010


NOTE:  ICMI 2011 will be held after ICCV 2011, the 13th International Conference on Computer Vision in Barcelona.