Showing posts sorted by date for query "natural user interaction".

Mar 1, 2012

Seamless Collaborative Computing Between Tables and Tablets: nsquared Presenter - Video

I came across the following video about nsquared's Presenter application. It looks like it has potential. It supports "seamless" collaboration among people, multimedia content, interactive tables, interactive touch screens, and tablet devices.






RELATED
nsquared website
Video presentation about "Seamless Computing", by Neil Roodyn


More blog posts on the topic of natural user interaction, interfaces, and examples of seamless computing are planned for the future!


This was shared by one of my FB friends.

Dec 12, 2011

UPDATE POST: Educational Interfaces, Software, and Technology: 2012 ACM-CHI Workshop Call for Papers/Presentations

There is still time left to submit your paper!


CALL FOR PAPERS
EDUCATIONAL INTERFACES, SOFTWARE, AND TECHNOLOGY 2012
3rd Workshop on UI Technologies and Educational Pedagogy
May 5-6, 2012
in conjunction with ACM-CHI 2012, Austin, Texas

This will be our third annual workshop in conjunction with CHI 2012.



One of the primary goals of teaching is to prepare learners for life in the real world. In this ever-changing world of technologies such as mobile interaction, cloud computing, natural user interfaces, and gestural interfaces like the Nintendo Wii and Microsoft Kinect, people have a greater selection of tools for the task at hand. Teachers and students can leverage these tools to improve learning outcomes. Educational interfaces and software are needed to ensure that new technologies serve a clear purpose in the classrooms and homes of the future.



Since teachers are always looking for creative ways to engage 21st-century learners, there needs to be an academic venue where researchers can discuss novel educational tools and their role in improving learning outcomes. This workshop aims to fill that void by combining the pedagogical expertise of the cooperative learning and learning sciences communities with the technical creativity of the CHI, UIST, and interactive surface communities. The objective of this workshop is to become a conference within two years.


We invite authors to present position papers about potential design challenges and perspectives on how the community should handle the next generation of HCI in education. 




Topics of interest include:

  • Gestural input, multitouch, large displays
  • Mobile Devices, response systems (clickers)
  • Tangible, VR, AR & MR, Multimodal interfaces
  • Console gaming, 3D input devices
  • Co-located interaction, presentations
  • Educational Pedagogy, learner-centric, Child Computer Interaction
  • Empirical methods, case studies
  • Multi-display interaction
  • Wearable educational media
Submission:  The deadline for workshop paper submissions is Dec 20, 2011. Interested researchers should submit a 4-page position paper in the ACM CHI adjunct proceedings style to the workshop management system. Acceptance notifications will be sent out February 20, 2012. The workshop will be held May 5-6, 2012 in Austin, Texas. Please note that at least one author of an accepted position paper must register for the workshop and for one or more days of the CHI 2012 conference.

Website: http://smarttech.com/eist2012
Contact: Edward Tse, SMART Technologies, edwardtse@smarttech.com


RELATED
Educational Interfaces, Software, and Technology Workshop Organizers
Edward Tse, SMART Technologies
Lynn V. Marentette, Union County Public Schools
Syed Ishtiaque Ahmed, Cornell University
Alex Thayer, University of Washington
Jochen Huber, Technische Universität Darmstadt
Max Mühlhäuser, Technische Universität Darmstadt
Si Jung "Jun" Kim, University of Central Florida
Quincy Brown, Bowie State University

Oct 11, 2011

Hacking Autism: Touch Technology for Young People with Autism Spectrum Disorders (October 11 is the Hackathon!)

October 11, 2011 is a special day. A number of software programmers will be working to develop "innovative, touch-enabled applications for the autism community and make this software available for free on HackingAutism.org." Take a moment to watch the following video clip, and then explore the Hacking Autism website!
"When touch-enabled computing was introduced to the world, no one could have anticipated that this technology might help open up a new world of communication, learning and social possibilities for autistic children. Yet it has. Hacking Autism is a story of technology and hope and the difference it's making in the lives of some people who need it most. Hacking Autism doesn't seek to cure autism, but rather it aims to facilitate and accelerate technology-based ideas to help give those with autism a voice." -hackingautism.org
Touch technology + people with autism spectrum disorders = one of the reasons why I returned to school to take computer courses and explore natural user interfaces and interaction.

RELATED
Interacting with HP TouchSmart Notes: Photo, Video, Audio and More
Interactive Visual Supports for Children with Autism:  Gillian Hayes' Work at the Social and Technology Action Research Group
Open Source Multi-touch Software for Young People with Autism
Interactive iPad Apps for Kids with Autism: Could some of these be transformed for multi-touch tabletop activities?
iPad Apps: Supporting Communication for Young People with Autism (links to Moms with Apps)
Reflections about interactivity in my present world (Aug. 2010)
Interactive Multi-touch for Children with Autism Spectrum Disorders: Research and Apps by Juan Pablo Hourcade, Thomas Hanson, and Natasha Bullock-Rest, University of Iowa
Open Autism Software "Where Social Skills and Interest in Computers Meet"
Sen H. Hirano, Michael T. Yeganyan, Gabriela Marcu, David H. Nguyen, Lou Anne Boyd, Gillian R. Hayes. vSked: Evaluation of a System to Support Classroom Activities for Children with Autism. In CHI 2010 (Atlanta, GA, 2010). (pdf)
Gillian R. Hayes, Sen Hirano, Gabriela Marcu, Mohamad Monibi, David H. Nguyen, and Michael Yeganyan. Interactive Visual Supports for Children with Autism. Personal and Ubiquitous Computing, April 2010.
Monibi, M., Hayes, G.R. Mocotos: Mobile Communication Tools for Children with Special Needs. Proceedings of Interaction Design and Children, pages 121-124, ACM, 2008.
SOMEWHAT RELATED
Hope Technology School
Do2Learn JobTips
Autism Research Group at Georgia Tech
Immersive Cocoon Interaction: "It's people who are now the interface"
Today I hooked up a Wii to the IWB in the school's therapy room. Next, a Kinect?
(IWBs + Games + Social Skills)

Jul 7, 2011

I want to travel around the globe and attend all of the cool conferences about innovative interactive technologies. Any sponsors? (Yes, I'm day-dreaming)

Here are a few I missed:


NIME 2011 OSLO: The International Conference on New Interfaces for Musical Expression - NIME is an outgrowth of a workshop held at CHI 2001 (Human Factors in Computing Systems).
"The NIME conference draws a varied group of participants, including researchers (musicology, computer science, interaction design, etc.), artists (musicians, composers, dancers, etc.) and developers (self-employed and industrial). The common denominator is the mutual interest in groundbreaking technology and music, and contributions to the conference cover everything from basic research on human cognition through experimental technological devices to multimedia performances." Just take a look at all of the presentations that were at NIME 2011!  NIME 2011 Program (pdf)


Touch the Web 2011: 2nd International Workshop on Web-Enabled Objects June 20-24, 2011, Paphos, Cyprus (in conjunction with the International Conference on Web Engineering ICWE)
"The vision of the Internet of Things builds upon the use of embedded systems to control devices, tools and appliances. With the addition of novel communications capabilities and identification means such as RFID, systems can now gather information from other sensors, devices and computers on the network, or enable user-oriented customization and operations through short-range communication. When the information gathered by different sensors is shared by means of open Web standards, new services can be defined on top of physical elements. In addition, the new generation of mobile phones enables a true mobile Internet experience. These phones are today's ubiquitous information access tool, and the physical token of our 'Digital Me'. These meshes of things and 'Digital Me' will become the basis upon which future smart living, working and production places will be created, delivering services directly where they are needed."


Upcoming Conferences and Workshops


Ubicomp 2011, September 17-21, 2011, Beijing, China
"Ubicomp is the premier outlet for novel research contributions that advance the state of the art in the design, development, deployment, evaluation and understanding of ubiquitous computing systems. Ubicomp is an interdisciplinary field of research and development that utilizes and integrates pervasive, wireless, embedded, wearable and/or mobile technologies to bridge the gaps between the digital and physical worlds. The Ubicomp 2011 program features keynotes, technical paper sessions, specialized workshops, live demonstrations, posters, video presentations, panels, industrial exhibition and a Doctoral Colloquium."

"Given its multi-disciplinary nature, Ubicomp has developed a broad base of audience over the past 12 years. Key audience communities are: Human Computer Interaction, Pervasive Computing, Distributed and Mobile Computing, Real World Modeling, Sensors and Devices, Middleware and Systems research, Programming Models and Tools, and Human Centric Validation and Experience Characterization. More detailed information about the topical focus of Ubicomp can be found in the Call for Papers."



Eurodisplay 2011: XXXI International Display Research Conference, September 19-22, Bordeaux-Arcachon, France
"Eurodisplay 2011 in Bordeaux/Arcachon provides researchers, engineers, and technical managers a unique opportunity to present their results and update their knowledge in all display-related research fields...The two keynote addresses on the first day of the Eurodisplay 2011 Symposium, on September 20th, will be a unique chance to hear a global overview on the future of the display market (Samsung LCD) and on display applications in the automotive industry (Daimler AG)...


Eurodisplay '11 in Bordeaux-Arcachon will be the right spot to learn more about the latest results in display research and related fields such as organic electronics. Cognitive science will give a new vision of the impact of displays in our day-to-day life, not only from a perception standpoint but from an information standpoint, since SID deals with 'Information Display' and not only technologies. The two invited talks, by Prof. Bernard Claverie and Dr. Lauren Palmateer, will focus on this aspect...A dedicated business-oriented session will address two important aspects of our business, from display company creation by Thierry Leroux to end-user trend analysis for the TV market by Dr. Jae Shin. This conference will provide a global vision of our Display World, including not only the well-known display leaders but also the very active BRIC countries...Dr. V. Pellegrini Mammana will give a vivid illustration of this display industry dynamic in Brazil."

UIST Symposium, October 16-19, 2011, Santa Barbara, California
"UIST (ACM Symposium on User Interface Software and Technology) is the premier forum for innovations in the software and technology of human-computer interfaces. Sponsored by ACM's special interest groups on computer-human interaction (SIGCHI) and computer graphics (SIGGRAPH), UIST brings together researchers and practitioners from diverse areas that include traditional graphical & web user interfaces, tangible & ubiquitous computing, virtual & augmented reality, multimedia, new input & output devices, and CSCW. The intimate size, single track, and comfortable surroundings make this symposium an ideal opportunity to exchange research results and implementation experiences."


VisWeek 2011: Vis, InfoVis, VAST, October 23-28, Providence, RI
"Computer-based information visualization centers around helping people explore or explain data through interactive software that exploits the capabilities of the human perceptual system. A key challenge in information visualization is designing a cognitively useful spatial mapping of a dataset that is not inherently spatial and accompanying the mapping by interaction techniques that allow people to intuitively explore the dataset. Information visualization draws on the intellectual history of several traditions, including computer graphics, human-computer interaction, cognitive psychology, semiotics, graphic design, statistical graphics, cartography, and art. The synthesis of relevant ideas from these fields with new methodologies and techniques made possible by interactive computation are critical for helping people keep pace with the torrents of data confronting them."

6th Annual ACM Conference on Interactive Tabletops and Surfaces (ITS 2011)
November 13-16, 2011, Portopia Hotel, Kobe, Japan
"The Interactive Tabletops and Surfaces 2011 Conference (ITS) is a premiere venue for presenting research in the design and use of new and emerging tabletop and interactive surface technologies. As a new community, we embrace the growth of the discipline in a wide variety of areas, including innovations in ITS hardware, software, design, and projects expanding our understanding of design considerations of ITS technologies and of their applications."


AFFINE: 4th International Workshop on Affective Interaction in Natural Environments (ICMI 2011)  November 17, 2011, Alicante, Spain (CFP deadline is August 19, 2011)
Scope: "Computer gaming has been acknowledged as one of the computing disciplines which proposes new interaction paradigms to be replicated by software engineers and developers in other fields. The abundance of high-performance, yet lightweight and mobile devices and wireless controllers has revolutionized gaming, especially when taking into account the individual affective expressivity of each player and the possibility to exploit social networking infrastructure. As a result, new gaming experiences are now possible, maximizing users' skill level while also maintaining their interest in the challenges of the game, resulting in a state which psychologists call flow: 'a state of concentration or complete absorption with the activity at hand and the situation'. The result of this amalgamation of gaming, affective and social computing has brought increased interest in the field in terms of interdisciplinary research...Natural interaction plays an important role in this process, since it gives game players the opportunity to leave behind traditional interaction paradigms, based on keyboards and mice, and control games using the same concepts they employ in everyday human-human interaction: hand gestures, facial expressions and head nods, body stance and speech. These means of interaction are now easy to capture, thanks to low-cost visual, audio and physiological signal sensors, while models from psychology, theory of mind and ergonomics can be put to use to map features from those modalities to higher-level concepts, such as desires, intentions and goals.
In addition, non-verbal cues such as eye gaze and facial expressions can serve as valuable indicators of player satisfaction and help game designers provide the optimal experience for players: games which are not frustratingly hard, but still challenging and not boring...Another aspect which makes computer gaming an important field for multimodal interaction is the new breed of multimodal data it can generate: besides videos of people playing games in front of computer screens or consoles, which include facial, body and speech expressivity, researchers in the field of affective computing and multimodal interaction may benefit from mapping events in those videos (e.g. facial signs of frustration) to specific events in the game (a large number of enemies or obstacles close to the player) and infer additional user states such as engagement and immersion. Individual and prototypical user models can be built based on that information, helping produce affective and immersive experiences which maintain the concept of 'flow'. This workshop will cover real-time and off-line computational techniques for the recognition and interpretation of multimodal verbal and non-verbal activity and behaviour, modelling and evolution of player and interaction contexts, and synthesis of believable behaviour and task objectives for non-player characters in games and human-robot interaction. The workshop also welcomes studies that provide new insights into the use of gaming to capture multimodal, affective databases, low-cost sensors to capture user expressivity beyond the visual and speech modalities, and concepts from collective intelligence and group modelling to support multi-party interaction."

Intelligent User Interfaces (IUI 2012) Lisbon, Portugal, February 14-17 (pdf) (CFP Submission Deadline is October 21, 2011)
"Major Topics of Interest to IUI include: Intelligent interactive interfaces, systems, and devices, Ubiquitous interfaces, Smart environments and tools, Human-centered interfaces, Mobile interfaces, Multimodal interfaces, Pen-based interfaces, Spoken and natural language interfaces, Conversational interfaces, Affective and social interfaces, Tangible interfaces, Collaborative multiuser interfaces, Adaptive interfaces, Sensor-based interfaces, User modeling and interaction with novel interfaces and devices, Interfaces for personalization and recommender systems, Interfaces for plan-based systems, Interfaces that incorporate knowledge- or agent-based approaches, Help interfaces for complex tasks, Example- and demonstration-based interfaces, Interfaces for intelligent generation and presentation of information, Intelligent authoring systems, Synthesis of multimodal virtual characters and social robots, Interfaces for games and entertainment, learning-based interactions, health informatics, Empirical studies and evaluations of IUI interfaces, New approaches to designing Intelligent User Interfaces, and related areas"


IXDA: Interaction 12, Dublin, Ireland, February 1-4, 2012
"Interaction|12 is an ideal venue to showcase the most inspiring and original stories in interaction design. We have a world of creative talent to tap into, so our conference roster will fill up fast. We’re on the lookout for thoughtful, original proposals that will inspire our community of interaction designers from all over the world. Do you have an uncommon or enlightening design story, valuable lessons learned from hands-on experience and want to be a part of the programme at Interaction|12? You do? Great!"


More to come!


BTW, I'd like to go to a few Urban Screens or Media Facades festivals:

Media Facades Festival Europe 2010 from MediaFacades on Vimeo


Of course, I'd like to go to educational technology, school psychology, and special education conferences...

May 18, 2011

CHI 2011, Bill Buxton, and the Buxton Collection: Explore 35 years of interactive devices, online!

Bill Buxton is a researcher at Microsoft who focuses on human-computer interaction and is known for his work in user experience design and natural user interfaces such as multi-touch surfaces. His talk at the recent CHI 2011 conference, held in Vancouver, Canada, was an overview of the Buxton Collection, an on-line historical archive of interactive input devices spanning the past 35 years.


It was interesting to note that at the time of the presentation, the Vancouver Conference Center, where the conference was taking place, was having serious problems with the network/internet connections, and as a consequence, Buxton was not able to demonstrate the on-line version of his collection as planned.  

Not to worry. The physical version of Buxton's archive was on display during the conference, along with Buxton himself, who was happy to tell, with much enthusiasm, the story behind every device and gadget in the archive. The slideshow below provides a glimpse of the Bill Buxton archive displayed at CHI 2011:



My Buxton Collection Slideshow, CHI 2011, Vancouver, Canada

Buxton's archive of gadgets comes with a rich history, accumulated over the years. Much of this history, until now, has resided in Buxton's head. Holding and touching the items in the archive while listening to Buxton's passionate stories about each one was unlike anything I had ever experienced. His archive is a labor of love, and the interactive, on-line version of the Buxton Collection is his way of sharing his knowledge with the world.


During his talk, Buxton pointed out that computer science programs rarely require students to have much exposure to the "history of ideas" in the field. Huge chunks of work are often ignored in the literature, and in some cases the wheel is unknowingly reinvented; according to Buxton, this is something that must be addressed within the CHI community.


I agree.

RELATED
Previous IMT posts about Bill Buxton
Bill Buxton's Presentation Video: "A Little Tale about Touch" (Microsoft Worldwide Partner Conference, 2010)
Two good articles by Bill Buxton: The Mad Dash Towards Touch Technology; The Long Nose of Innovation
Buxton Collection
Buxton Collection, PivotViewer
Back to the Past: Bill Buxton Shows Off 36 Years of Tech Devices
Microsoft News Center, 5/9/11
Microsoft's Bill Buxton exhibits gadget collection 35 years in the making
Donald Melanson, engadget, 5/9/11
Bill Buxton's Haptic Input References (pdf)
Bill Buxton's website
Multi-touch Systems that I Have Known and Loved (Bill Buxton)
CES 2010: NUI with Bill Buxton

On Engineering and Design: An Open Letter Microsoft Research Principal Scientist Bill Buxton calls for engineers and user experience designers to learn to appreciate one another
Bill Buxton, Bloomberg Businessweek, 4/29/09

Apr 26, 2011

Multi-touch and Gesture Interaction News and Updates You Might Have Missed (Part I)

Over the past couple of months, I've come across many interesting links related to multi-touch and gesture interaction, but I haven't had time to devote a thoughtful post to each one. "Part I" is a nice collection of experimental, commercial, and non-commercial efforts by a variety of creative technologists, with a smattering of industry news that might be of interest to IMT readers.


Ideum's MT55 HD Multitouch Table 4/19/11

New MT55 HD Multitouch Table Now Shipping,  Jim Spadaccini, Ideum Blog 4/11/11

Smithsonian American Art Museum to Open Education Center, Sara Beladi, NBC Washington News, 4/4/11 (Rumor has it that the Smithsonian American Art Museum might include touch and multi-touch displays in its plans for a new education center. The center was funded by an anonymous $8 million gift.)

Bill Buxton, Microsoft Research, 4/7/11 - Includes lots of pictures, links to videos, and more information about what might be the first touch screen. Also see Bill Buxton's companion website, Multi-Touch Systems that I have Known and Loved, updated on 3/21/11. Bill Buxton knows all (almost!)


"The MTbiggie uses the "Front Diffused Illumination" multitouch technique, with ambient infrared light and a DIY infrared webcam. The MTbiggie is similar to the MTmini, but includes a projected image and infrared webcam (rather than a normal webcam)...The MTbiggie isn’t the most stable and robust setup, but it is the easiest to build. To see other methods of building more stable multitouch displays, view the full multitouch display list." -Seth Sandler

(Also check out NodeBeat, a multi-touch music/audio sequencer/generator app by Seth Sandler and Justin Windle)

Intuilab, 4/13/11
"IntuiLab, a global leader in surface computing software applications, today announced support for the revolutionary Microsoft Kinect device across its full line of IntuiFace products and solutions including IntuiFace Presentation and IntuiFace Commerce...Microsoft Kinect brings distant gesture control to interactive solutions. These gesture controls allow users to interact with displayed digital assets from a distance at their own pace and path – for example, browsing through a large quantity of products in a store catalog or manipulating 3D models (such as a mobile phone) – all without having to actually touch the screen..."  -IntuiLab (Take a look at the IntuiLab team- an interactive page!)




Sparkon:  Videos and links related to multi-touch and gesture-based applications



Official Kinect SDK to be Open Source, Josh Blake, Deconstructing the NUI, 4/18/11
(This bit of news excited me, but don't get your hopes up. If anyone knows what will happen with the Kinect SDK, please leave a comment.)
"Update 4/18 7:34pm: Mary Jo Foley picks up this story, but the Microsoft spokesperson she talked to denied that the Kinect SDK will be open source. As she notes, Microsoft has pulled 180’s before regarding Kinect. After spokespeople initially were hostile to the idea of Kinect hacking, Xbox executives later embraced the idea that people are using Kinect for non-gaming purposes on the PC. Let’s hope Microsoft stays open to this idea." -Josh Blake

Kenrick Kin, Tom Miller, Bjoern Bollensdorff, Tony DeRose, Bjoern Hartmann, Maneesh Agrawala (Pixar Online Library)

Flight Race Game on 3DFeel, lm3Labs, 4/18/11


JazzMutant Lemur Version 2: "The only multi-touch and modular controller for sequencers, synthesizers, virtual instruments, vjing and lights, now even better."


Harry van der Veen's Multitouch Blog (NUITEQ)


Stantum "Unlimited Multi-Touch" Latest News

At Immersive Labs, Ads Watch Who Looks at Them Amy Lee, Huffington Post, 4/26/11 

Immersive Labs

Hard Rock Cafe International Using NextWindow Touch Screens:  "Rock Wall Solo displays enhance music lovers' experience in Seattle, Dallas, Detroit and Berlin" 4/12/11 (Full press release pdf)
Music on Touch Screens (NextWindow)

Razorfish: Thoughts on MIX 11 ,James Ashley, Razorfish Blog, 4/20/11  Also see: Razorfish Lab's Prototypes




"The multitouch microscope brings new dimensions into teaching and research. Researchers at the Institute for Molecular Medicine Finland (FIMM) and Multitouch Ltd have created a hand and finger gesture controlled microscope. The method is a combination between two technologies: web-based virtual microscopy and a giant size multitouch display."
"The result is an entirely new way of performing microscopy: by touching a table- or wall-sized screen the user can navigate and zoom within a microscope sample in the same way as in a conventional microscope. Using the touch control it is possible to move from the natural size of the sample to a 1000-fold magnification, at which cells and even subcellular details can be seen." -multitouch.fi  Also see the MultiTouch website.
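To get a feel for what the quoted description implies, here is a toy sketch of the zoom mapping such a viewer needs: a pinch gesture scales the current magnification multiplicatively, clamped between the sample's natural size (1x) and the 1000-fold maximum mentioned above. This is purely illustrative; the function name and parameters are my own, not the actual FIMM/MultiTouch implementation.

```python
# Toy sketch (NOT the actual FIMM/MultiTouch code): map pinch gestures
# to a magnification level for a virtual-microscopy viewer, clamped to
# the 1x..1000x range described in the quote above.

def apply_pinch(magnification: float, pinch_scale: float,
                min_mag: float = 1.0, max_mag: float = 1000.0) -> float:
    """Scale the current magnification by a pinch factor and clamp it."""
    return max(min_mag, min(max_mag, magnification * pinch_scale))

mag = 1.0
for scale in (2.0, 10.0, 100.0):   # successive zoom-in pinches
    mag = apply_pinch(mag, scale)
print(mag)  # clamped at the 1000-fold maximum: 1000.0
```

Multiplicative scaling matches how pinch zoom feels in practice: each gesture changes magnification by a ratio, so zooming from 10x to 20x takes the same gesture as 100x to 200x.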



Big Size Multitouch Display Turned into a Microscope, Microscopy-News, 3/28/11
Mac OS X 10.7 Lion: new multi-touch gestures, Dock integration for Exposé, Launchpad, Mission Control, AppleInsider, 4/14/11


Vectorform App featured in Royal Caribbean's Video Promotion: James Brolin, Dean Cain get hands-on with Vectorform app Alison Weber, Vectorform Blog, 3/3/11


3M Touch Systems's YouTube Channel

Social Mirror 3D Gestural Display, Now Using Kinect: Snibbe Interactive




Mar 29, 2011

SIFTEO, the next-gen Siftables! (Tangible User Interfaces for All)

Despite my enthusiasm for TUIs, I somehow missed the news about the transformation of Siftables into a commercial version, Sifteo:

Sifteo Inc. Debuts Sifteo™ Cubes - A New Way To Play (PDF)



"Sifteo cubes are 1.5 inch computers with full-color displays that sense their motion, sense each other, and wirelessly connect to your computer. You, your friends, and your family can play an ever-growing array of interactive games that get your brain and body engaged.
Sifteo’s initial collection of titles includes challenging games for adults, fun learning puzzles for kids, and games people can play together." -Sifteo website
For more information, see the Sifteo website,  blog, and YouTube  channel.  If you can't wait to get your own set,  take a look at Josh Blake's Sifteo Cube Unboxing Video!

RELATED
About two years ago, I was interviewed about my thoughts on the interactive, hands-on, programmable cubes, then called Siftables, for an article published in IEEE's Computing Now magazine: Siftables Offer New Interaction Mode (James Figueroa, Computing Now, 3/2009).

For those of you who'd like more information about tangible user interfaces (TUIs) and the development of Siftables, I've copied my 2009 post, Tangible User Interfaces, Part I: Siftables, below:

TANGIBLE USER INTERFACES, PART I: SIFTABLES (2009)
In 1997, the vision of tangible user interfaces, also known as TUIs, was outlined by Hiroshi Ishii and Brygg Ullmer of the Tangible Media Group at MIT in their paper, "Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms" (pdf). According to this vision, "the goal of Tangible Bits is to bridge the gaps between both cyberspace and the physical environment, as well as the foreground and background of human activities." This article is a must-read for anyone interested in "new" interactive technologies.

The pictures in the article of the metaDesk, transBoard, activeLENS, and ambientRoom, along with the references, are worth a look, for those interested in this seminal work.

Another must-read is Hiroshi Ishii's 2008 article, Tangible Bits: Beyond Pixels (pdf). In this article, Ishii provides a good overview of TUI concepts as well as the contributions of his lab to the field since the first paper was written.

Related to Tangible User Interface research is the work of the Fluid Interfaces Group at MIT. The Fluid Interfaces Group was formerly known as the Ambient Intelligence Group, and many of the group's projects incorporate concepts related to TUI and ambient intelligence. 



According to the Fluid Interfaces website, the goal of this research group is to "radically rethink the human-machine interactive experience. By designing interfaces that are more immersive, more intelligent, and more interactive we are changing the human-machine relationship and creating systems that are more responsive to people's needs and actions, and that become true "accessories" for expanding our minds."

The Siftables project is an example of how TUI and fluid interface (FI) interaction can be combined. Siftables is the work of David Merrill and Pattie Maes, in collaboration with Jeevan Kalanithi, and was brought to popular attention through David Merrill's recent TED talk:

David Merrill's TED Talk: Siftables - Making the digital physical
-Grasp Information Physically

"Siftables aims to enable people to interact with information and media in physical, natural ways that approach interactions with physical objects in our everyday lives. As an interaction platform, Siftables applies technology and methodology from wireless sensor networks to tangible user interfaces. Siftables are independent, compact devices with sensing, graphical display, and wireless communication capabilities. They can be physically manipulated as a group to interact with digital information and media. Siftables can be used to implement any number of gestural interaction languages and HCI applications....
Siftables can sense their neighbors, allowing applications to utilize topological arrangement...No special sensing surface or cameras are needed."
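The neighbor-sensing idea in the quote above can be sketched in a few lines: if each cube reports which of its four sides touches another cube, an application can recover the whole topological arrangement with no camera or special surface. This is a hypothetical simulation of the concept, not Siftables' actual firmware or API; the grid positions simply stand in for the physical layout.

```python
# Hypothetical sketch of the Siftables concept: neighbor sensing alone
# lets an application recover the cubes' topological arrangement.
# Grid positions stand in for where the cubes sit on a table.

def neighbors(positions):
    """Map each cube id to the ids directly adjacent on its four sides."""
    by_pos = {pos: cid for cid, pos in positions.items()}
    adj = {}
    for cid, (x, y) in positions.items():
        adj[cid] = {
            side: by_pos[p]
            for side, p in {"left": (x - 1, y), "right": (x + 1, y),
                            "top": (x, y - 1), "bottom": (x, y + 1)}.items()
            if p in by_pos
        }
    return adj

def left_to_right_chain(positions):
    """Read a horizontal row of cubes as an ordered sequence
    (e.g. for a music-sequencer or word-building application)."""
    return [cid for _, cid in sorted((pos[0], cid)
                                     for cid, pos in positions.items())]

cubes = {"A": (0, 0), "B": (1, 0), "C": (2, 0)}   # three cubes in a row
print(neighbors(cubes)["B"])        # {'left': 'A', 'right': 'C'}
print(left_to_right_chain(cubes))   # ['A', 'B', 'C']
```

Applications like the music sequencer shown in the video below can be built on exactly this kind of adjacency information: the left-to-right chain becomes the playback order.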





Siftables Music Sequencer from Jeevan Kalanithi on Vimeo.



More about Siftables:
Rethinking display technology (Scott Kirsner, Boston Globe, 7/27/08)
TED: Siftable Computing Makes Data Physical
Siftables: Toward Sensor Network User Interfaces (pdf)

It seems that people either really like the Siftables concept or they don't see the point. I found the following humorous critique of Siftables on YouTube:

"Imagine if all the little programs you had on your iPhone were little separate chicklets in your pocket.
You'd lose em.
Your cat would eat em.
You'd vacuum them up.
They'd fall down in the sofa.
They'd be all over the car floor.
You'd throw them away by mistake..."

In my opinion, it is exciting to learn that some of this technology has the potential to become mainstream.


Feb 26, 2011

Why bother switching from GUI to NUI? - Asked and Answered by Josh Blake; My 2-cents; Stevie B’s Microsoft Research Video; Marco Silva’s NUI-HCI Presentation (and links)

In Chapter 1 of Natural User Interfaces in .NET, Josh Blake asks and answers a question posed by many people who have been under the spell of keyboard input and GUI/WIMP interaction:


Why bother switching from GUI to NUI?  The answer?  Read Chapter 1 (pdf) of the book - the chapter is free.


Here are a few of my personal reasons:  
1.  I want to buy the next version of the iPad or something like it.
2.  I want to buy a new large-screen Internet HD TV.
3.  I want to buy a Kinect.
4.  I do NOT want to interact with my new TV with a Sony remote.  Too many tiny buttons!


5. I do NOT want to interact with my new TV with a keyboard,  because it reminds me of...work.

6.  Most importantly: 

I want to design apps for the people I care about, and others with similar needs:
    My mom.  
    My grandson.
Moms and dads with kids in tow.
People with special needs and/or health concerns, and the people who care and guide them.
Knowledge sharers and (life-long) learners....

RELATED

"Smart" Interactive Display Research

 
View more presentations from Marco Silva

My YouTube Playlist:
"Natural user interfaces, gesture interaction, multi-touch, natural interaction, post WIMP examples and more... "
RELATED - and somewhat related   
Encyclopedia:  Human Computer Interaction, Interaction Design, User Experience, Information Architecture, Usability and More (Interaction-Design.org)

Josh Blake's Blog: Deconstructing the NUI. Book: Chapter 1 (pdf), free!
Blake.NUI
"Blake.NUI is a collection of helpful controls, utilities, and samples useful for multi-touch and NUI development with WPF, Surface, and Silverlight."
 (This is not an inclusive list.)


GUI to NUI Post-WIMP Manifesto:  TBA