Showing posts sorted by date for query "natural user interface". Sort by relevance Show all posts

Jan 17, 2013

Xbox Kinect in the OR: Kinect supports gesture interaction with 3D imaging of the patient during surgery.

Here's an interesting use of technology for health - the Xbox Kinect in the OR!

Thanks to Harry van der Veen for the link!


RELATED
Kinect sensor poised to leap into everyday life
Niall Firth, NewScientist, 1/17/13

For the tech-curious:
PrimeSense (Company that developed the 3D depth sensor that powers the Kinect, the sensor in iRobot's Ava healthcare robot, and more.)

OpenNI (Framework for the development of 3D sensing middleware libraries and applications.)

NiTE: Natural Interaction Technology for End-user (Middleware layer of perception algorithms for 3D computer vision; supports hand locating and tracking, scene analysis, and skeleton joint tracking.)
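To give a sense of what middleware like NiTE does on top of raw depth data, here is a toy swipe detector over a stream of tracked hand positions. This is purely illustrative Python, not the NiTE API (which is a C++ library); the thresholds and function name are my own invention.

```python
# Toy illustration of gesture inference over tracked hand positions,
# the kind of higher-level processing NiTE-style middleware performs.

def detect_swipe(positions, min_dx=0.3, max_dy=0.1):
    """Classify a sequence of (x, y) hand positions as a swipe.

    positions: list of (x, y) tuples in meters, oldest first.
    Returns 'swipe_right', 'swipe_left', or None.
    """
    if len(positions) < 2:
        return None
    (x0, y0), (x1, y1) = positions[0], positions[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dy) > max_dy:          # too much vertical drift: not a swipe
        return None
    if dx >= min_dx:
        return "swipe_right"
    if dx <= -min_dx:
        return "swipe_left"
    return None

print(detect_swipe([(0.0, 0.5), (0.2, 0.51), (0.45, 0.52)]))  # swipe_right
```

A real pipeline would, of course, first segment the hand from the depth image and smooth the trajectory before classifying it.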

Jan 10, 2013

Gesture Markup Language (GML) for Natural User Interaction and Interfaces

Quick post:
"GML is an extensible markup language used to define gestures that describe interactive object behavior and the relationships between objects in an application.  Gesture Markup Language has been designed to enhance the development of multiuser multi-touch and other HCI device driven applications." -Gesture ML Wiki

GestureML is created and maintained by Ideum.
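Since GML is ordinary XML, an application can load gesture definitions with any XML parser. The fragment below is a simplified, hypothetical GML-style document (not verbatim syntax from the Ideum spec), parsed with Python's standard library:

```python
# Sketch of reading a GML-style gesture definition. The XML here is a
# simplified, invented fragment in the spirit of GestureML, not the
# actual GestureML schema.
import xml.etree.ElementTree as ET

gml = """
<GestureMarkupLanguage>
  <Gesture id="n-drag" type="drag">
    <match><action touches="1"/></match>
  </Gesture>
  <Gesture id="n-rotate" type="rotate">
    <match><action touches="2"/></match>
  </Gesture>
</GestureMarkupLanguage>
"""

root = ET.fromstring(gml)
# Map each gesture id to its type so the app can dispatch on it.
gestures = {g.get("id"): g.get("type") for g in root.iter("Gesture")}
print(gestures)  # {'n-drag': 'drag', 'n-rotate': 'rotate'}
```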

More information to come!
The Pano

Photo credit: Ideum

RELATED
Ideum Blog

OpenExhibits: Free multitouch and multiuser software initiative for museums, education, nonprofits, and students

GestureWorks: Multi-touch authoring for Windows 8 & Windows 7



Jul 21, 2012

Musings about NUI, Perceptive Pixel and Microsoft, Rapid Creative Prototyping (Lots of video and links) Revised

It just might be the right time for everyone to brush up on 21st century tech skills. iPads and touch-phones are ubiquitous. Touch-enabled interactive whiteboards and displays are in schools and boardrooms. With Microsoft's Windows 8 and the news that the company recently acquired Jeff Han's company, Perceptive Pixel, I think that there will be good support, and more opportunities, for designers and developers interested in moving from GUI to NUI.


In the video below, from CES 2012, Jeff Han provides a good overview of where things are moving in the future.  We are in a post-WIMP world and there is a lot of catching up to do!

CES 2012  Perceptive Pixel and the Future of Multitouch (IEEE Spectrum YouTube Channel)



During the video clip, Jeff explains how far things have come during the past few years:
 "Five and a half years ago I had to explain to everybody what multi-touch was and meant. And then, frankly, we've seen some great products from folks like Apple, and really have executed so brilliantly, that everyone really sees what a good implementation can be, and have come to expect it. I also think, though, that the explosion of NUI is less about just multi-touch, but an awareness that finally people have that you don't have to use a keyboard and mouse, you can demand something else beside that. People are now willing to say, 'Oh, this is something I can try, you know, touch is something I can try as my friendlier interface.'"

Who wouldn't want to interact with a friendlier interface?  Steve Ballmer doesn't curb his enthusiasm about Windows 8 and Perceptive Pixel.  Jeff Han is happy with how designs created in Windows 8 scale for use on screens large and small. He explains how Windows 8 can support collaboration. The Story Board application (7:58) on the large touchscreen display looks interesting.

I continue to be frustrated by the poor usability of many web-based and desktop applications.  I like my iPad, but only because so many dedicated souls have given some thought to the user experience when creating their apps.  I'm often disappointed by the interactive displays I encounter when I'm out and about during the day.  It is 2012, and it seems that there are a lot of application designers and developers who have never read Don Norman's The Design of Everyday Things!



I enjoy making working prototypes and demo apps, but my skill set is stuck in 2008, the last year I took a graduate-level computer course.  I was thinking about taking a class next semester, something hands-on, creative, and also practical, to move me forward. I can only do so much when I'm in the DIY mode alone in my "lab" at home.  I need to explore new tools, alongside like-minded others.  


There ARE many more tools available to designers and developers than there were just four years ago.  Some of them are available online, free, or for a modest fee.  I was inspired by a link posted by my former HCI professor, Celine Latulipe, to her updated webpage devoted to Rapid Prototyping tools. The resources on her website look like a good place to start for people who are interested in creating applications for the "NUI" era.  (Celine has worked on many interesting projects that explore how technology can support new and creative interaction, such as Dance.Draw.) Below is her description of her updated HCI resources:

"New HCI resource to share: I have created a few pages on my web site devoted to Rapid Prototyping tools, books, and methods. These pages contain reviews of various digital tools, including 7 different desktop prototyping apps, and including 8 different iPad apps for wireframing/prototyping. I hope it's useful to others. Feel free to share... and please send me comments and suggestions if you find anything inaccurate, or if you think there is stuff that I should be adding. I will be continuing to update this resource." -http://www.celinelatulipe.com (click on the rapid prototyping link at the top)



IDEAS
Below are just a few of my ideas that I'd like to implement in some way. I can't claim ownership to these ideas- they are mash-ups of what comes to me in my dreams, usually after reading scholarly publications from ACM or IEEE, or attending tech conferences. 
  • An interactive timeline, (multi-dimensional, multi-modal, multimedia) for off-the-desktop interaction, collaboration, data/info analysis exploration.  It might be useful for medical researchers, historians, genealogists, or people who are into the "history of ideas".  Big Data folks would love it, too. It would handle data from a variety of sources, including sensor networks. It would be beautiful to use.
  • A web-based system of delivering seamless interactive, multi-modal, immersive experiences, across devices, displays, and surfaces. The system would support multi-user, collaborative interaction.  The system would provide an option for tangible interaction.
  • A visual/auditory display interface that presents network activity, including potential intrusions, malfunctions, or anything that needs immediate attention that would be likely to be missed under present monitoring methods. 
  • Interactive video tools for creation, collaboration, storytelling.  (No bad remote controllers needed.)
  • A "wearable" that provides new ways for people to express and communicate creatively, through art, music, dance, with wireless capability. (It can interact with wireless sensor networks.)*
  • A public health application designed to provide information useful in understanding sepsis and supporting prevention efforts. This application would utilize the timeline concept described at the top of this list. The concept could also be useful in analyzing other medical puzzles, such as autism.
Most of these ideas could translate nicely to educational settings, and the focus on natural user interaction and multi-modal i/o aligns with the principles of Universal Design for Learning, something that is important to consider, given the number of "at-risk" learners and young people who have disabilities.

I welcome comments from readers who are working on similar projects, or who know of similar projects.  I also encourage graduate students and researchers who are interested in natural user interfaces to move forward with an off-the-desktop NUI project.  I hope that my efforts can play a part in helping people make the move from GUI to NUI!



Below are a few videos of some interesting projects, along with a list of a few references and links.


SMALLab (Multi-modal embodied immersive learning)


PUPPET PARADE: Interactive Kinect Puppets (CineKid 2011)



MEDIA FACADES: When Buildings Start to Twitter

HUMANAQUARIUM (CHI 2012)

 

NANOSCIENCE NRC Cambridge (Nokia's Morph project)






 
Examples: YouTube Playlists
POST WIMP EXPLORERS' CLUB
POST-WIMP EXPLORER'S CLUB II

Web Resources
Celine Latulipe's Rapid Prototyping Resources 
Creative Applications
NUI Group: Natural User Interface Group
OpenFrameworks and Interactive Multimedia: Funky Forest Installation for CineKid
SMALLab Learning
OpenExhibits: Free multi-touch + multiuser software initiative for museums, education, nonprofits, and students.
OpenSense Wiki 
CINEKID 2012 Website 
Multitouch Systems I Have Known and Loved (Bill Buxton)
Windows 8
Perceptive Pixel
Books
Natural User Interfaces in .NET: WPF 4, Surface 2, and Kinect (Josh Blake, Manning Publications)
Chapter 1 pdf (Free)
Brave NUI World: Designing Natural User Interfaces for Touch and Gesture (Daniel Wigdor and Dennis Wixon)
Designing Gestural Interfaces (Dan Saffer)
Posts
Bill Snyder, ReadWrite Web, 7/20/12

I noticed some interesting tools on the Chrome web store - I plan to devote a few more posts to NUI tools in the future.

Nov 4, 2011

Le Chal: Smart shoes for the visually impaired, with haptic/vibrational feedback, sensors and GPS


Le Chal is a haptic/vibrational feedback shoe designed for the blind and visually impaired, one of the projects of Anirudh Sharma, a member of the NUI Group (Natural User Interface Group).  Le Chal was conceptualized at the 2011 MIT Media Lab Design and Innovation Workshop at COEP.
Take me there Shoe/ Le Chal

Shoe for visually impaired - Le Chal Himanshu Khanna, 10/31/11
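The core idea behind a haptic navigation shoe can be sketched in a few lines: compare the wearer's heading with the GPS bearing to the next waypoint, and pulse the vibration motor on the corresponding side. The motor layout and angle thresholds below are my own invented illustration, not Le Chal's actual design.

```python
# Toy sketch of haptic turn-by-turn feedback: map the relative bearing to a
# destination onto one of four hypothetical vibration motors in the shoe.

def motor_for_bearing(heading_deg, bearing_deg):
    """Pick which motor to pulse.

    heading_deg: wearer's current compass heading (0-360).
    bearing_deg: compass bearing from wearer to the next waypoint.
    """
    rel = (bearing_deg - heading_deg) % 360  # relative angle, 0 = straight ahead
    if rel < 45 or rel >= 315:
        return "front"
    if rel < 135:
        return "right"
    if rel < 225:
        return "back"
    return "left"

print(motor_for_bearing(0, 90))     # right
print(motor_for_bearing(180, 170))  # front
```

A real device would also smooth noisy compass and GPS readings before driving the motors.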

RELATED
touchaddict blog

Oct 11, 2011

Hacking Autism: Touch Technology for Young People with Autism Spectrum Disorders (October 11 is the Hackathon!)

October 11, 2011 is a special day. A number of software programmers will be working to develop "innovative, touch-enabled applications for the autism community and make this software available for free on HackingAutism.org." Take a moment to watch the following video clip, and then explore the Hacking Autism website!
"When touch-enabled computing was introduced to the world, no one could have anticipated that this technology might help open up a new world of communication, learning and social possibilities for autistic children. Yet it has. Hacking Autism is a story of technology and hope and the difference it's making in the lives of some people who need it most. Hacking Autism doesn't seek to cure autism, but rather it aims to facilitate and accelerate technology-based ideas to help give those with autism a voice." -hackingautism.org
Touch technology + people with autism spectrum disorders = 
One of the reasons why I returned to school to take computer courses and explore natural user interfaces and interaction.   

RELATED
Interacting with HP TouchSmart Notes: Photo, Video, Audio and More
Interactive Visual Supports for Children with Autism:  Gillian Hayes' Work at the Social and Technology Action Research Group
Open Source Multi-touch Software for Young People with Autism
Interactive iPad Apps for Kids with Autism: Could some of these be transformed for multi-touch tabletop activities?
iPad Apps: Supporting Communication for Young People with Autism (links to Moms with Apps)
Reflections about interactivity in my present world (Aug. 2010)
Interactive Multi-touch for Children with Autism Spectrum Disorders: Research and Apps by Juan Pablo Hourcade, Thomas Hanson, and Natasha Bullock-Rest, University of Iowa
Open Autism Software "Where Social Skills and Interest in Computers Meet"
Sen H. Hirano, Michael T. Yeganyan, Gabriela Marcu, David H. Nguyen, Lou Anne Boyd, Gillian R. Hayes. vSked: Evaluation of a System to Support Classroom Activities for Children with Autism. In CHI 2010 (Atlanta, GA, 2010). (pdf)
Gillian R. Hayes, Sen Hirano, Gabriela Marcu, Mohamad Monibi, David H. Nguyen, and Michael Yeganyan. Interactive Visual Supports for Children with Autism. Personal and Ubiquitous Computing, April 2010.
Monibi, M., Hayes, G.R. Mocotos: Mobile Communication Tools for Children with Special Needs. Proceedings of Interaction Design and Children, pages 121-124, ACM, 2008.
SOMEWHAT RELATED
Hope Technology School
Do2Learn JobTips
Autism Research Group at Georgia Tech
Immersive Cocoon Interaction: "It's people who are now the interface"
Today I hooked up a Wii to the IWB in the school's therapy room.  Next, a Kinect?
(IWBs + Games + Social Skills)

Jul 22, 2011

Quicklinks: Cute video about need for Google+, Spielberg on 3D, Tactile Pixels, Touch Screen Steering Wheel, and More

Here are a few interesting links  and a couple of videos.  Enjoy exploring!


Comic-Con 2011: Steven Spielberg Gives His Thoughts on 3D
Jason Barr, Collider, 7/22/11



Kwame Opam, Gizmodo, 7/9/11

Albrecht Schmidt, User Interface Engineering Blog, 7/17/11


Potential to improve some user experiences, using HTML 5
HCI 596 Course Blog, Iowa State University, 7/11/11


PBS Launches LearningMedia, a New Digital Repository for Educational Content
Audrey Watters, Hack Education, 6/27/11


Link to Microsoft Surface 2.0 SDK and Resources
Luis Cabrera, Surface Blog, 7/12/11


AI (Artificial Intelligence) Demonstrates Natural Learning, Applies New Skills To Civilization
Devin Coldewey, TechCrunch, 7/13/11


21 Google+ Privacy Tips: the Ultimate Guide
Craig Kanalley, Huffington Post, 7/21/11


iPad K-12 Sales Outpace Mac Products
Ian Quillen, Education Week, 7/20/11


Wearable lab coat TV packs thousands of LEDs, heads for Burning Man
Zach Honig, Engadget, 7/13/11

(I'd like a job where I can do tech experiments, silly ones, too!)




Jul 14, 2011

Multi-touch Update from Stantum

The people at Stantum have been working hard to improve multi-touch technology, focusing on smaller tablet-sized systems.  Stantum is a company I've been following for several years, from the time it was known as Jazz Mutant.  I have been impressed by Stantum's focus on the needs of people as well as the company's careful attention to important details.


I'm pleased to see that the company has an idea of how its multi-modal technology can support multi-touch in education:   "Ambidexterity and multi-modality are the two pillars of Stantum's core project – making the use of touch-enabled devices more creative and productive. Amongst others, there is one field of application where we truly see a soaring need for ambidexterity and multi-modality – augmented textbooks." -Guillaume Largillier


At the Society for Information Display's Display Week exhibition this past May, Stantum introduced a new palm rejection feature for its Interpolated Voltage Sensitivity technology. This technology provides users with a more natural way to interact with the interface and application content on tablets.  The technology supports Android's multi-touch framework and is also Windows 7 certified.  The palm rejection feature will be a welcome improvement for future multi-touch applications designed for education settings, where more than one hand, or person, is likely to be interacting with content on the screen at the same time.


Below are two videos that provide a glimpse of Stantum's innovations:




Stantum's technology can enable ten simultaneous touches, is highly responsive, and supports high-resolution content. According to a May press release, "Palm rejection is available as an API (application programming interface) to Windows and Android operating systems on x86 and ARM platforms. IVSM touch modules are offered to OEMs through the company’s Qualified Manufacturers Partners, comprising tier-one touch-screen manufacturers with high-volume production capabilities. More information is available at info@stantum.com"
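Stantum's actual IVSM palm rejection algorithm is proprietary, but the basic concept is easy to illustrate: contacts larger than a fingertip are treated as a resting palm and filtered out before gesture recognition. The area threshold and data shape below are assumptions for the sake of the sketch.

```python
# Toy palm-rejection filter (concept illustration only, not Stantum's
# algorithm). Contacts bigger than a fingertip-sized area are dropped.
FINGER_MAX_AREA_MM2 = 150.0  # hypothetical cutoff between finger and palm

def reject_palms(contacts):
    """contacts: list of dicts with 'x', 'y', and 'area_mm2' keys.
    Returns only the finger-sized contacts."""
    return [c for c in contacts if c["area_mm2"] <= FINGER_MAX_AREA_MM2]

touches = [
    {"x": 10, "y": 20, "area_mm2": 60},    # fingertip
    {"x": 40, "y": 25, "area_mm2": 900},   # resting palm
    {"x": 12, "y": 80, "area_mm2": 75},    # fingertip
]
print(len(reject_palms(touches)))  # 2
```

Production implementations also use contact shape and history (a palm tends to appear alongside a stylus or fingers and stay put), not area alone.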


RELATED
Stantum's TouchPoints Newsletter (July 2011)


Stantum Whitepapers:
How to Evaluate Multi-Touch While Standing in a Store (pdf) - a great source of information.
Jim Meador, Pascal Auriel, Gauthier Chastan, Stantum
Specifying and Characterizing Tactile Performances for Multi-touch Panels: Toward a User-Centric Metrology (pdf) - outlines some important points!
Guillaume Largillier, Pascal Joguet, Cyril Recoquillon, Pascal Auriel, Axel Balley, Jim Meador, Julien Olivier, Gauthier Chastan





Jul 7, 2011

I want to travel around the globe and attend all of the cool conferences about innovative interactive technologies. Any sponsors? (Yes, I'm day-dreaming)

Here are a few I missed:


NIME 2011 OSLO: The International Conference on New Interfaces for Musical Expression - NIME is an outgrowth of a workshop held at CHI 2001 (Human Factors in Computing Systems).
"The NIME conference draws a varied group of participants, including researchers (musicology, computer science, interaction design, etc.), artists (musicians, composers, dancers, etc.) and developers (self-employed and industrial). The common denominator is the mutual interest in groundbreaking technology and music, and contributions to the conference cover everything from basic research on human cognition through experimental technological devices to multimedia performances." Just take a look at all of the presentations that were at NIME 2011!  NIME 2011 Program (pdf)


Touch the Web 2011: 2nd International Workshop on Web-Enabled Objects June 20-24, 2011, Paphos, Cyprus (in conjunction with the International Conference on Web Engineering ICWE)
"The vision of the Internet of Things builds upon the use of embedded systems to control devices, tools and appliances. With the addition of novel communications capabilities and identification means such as RFID, systems can now gather information from other sensors, devices and computers on the network, or enable user-oriented customization and operations through short-range communication. When the information gathered by different sensors is shared by means of open Web standards, new services can be defined on top of physical elements. In addition, the new generation of mobile phones enables a true mobile Internet experience. These phones are today's ubiquitous information access tool, and the physical token of our "Digital Me". These meshes of things and "Digital Me" will become the basis upon which future smart living, working and production places will be created, delivering services directly where they are needed."


Upcoming Conferences and Workshops


Ubicomp 2011, September 17-21, 2011, Beijing, China
"Ubicomp is the premier outlet for novel research contributions that advance the state of the art in the design, development, deployment, evaluation and understanding of ubiquitous computing systems. Ubicomp is an interdisciplinary field of research and development that utilizes and integrates pervasive, wireless, embedded, wearable and/or mobile technologies to bridge the gaps between the digital and physical worlds. The Ubicomp 2011 program features keynotes, technical paper sessions, specialized workshops, live demonstrations, posters, video presentations, panels, industrial exhibition and a Doctoral Colloquium."

"Given its multi-disciplinary nature, Ubicomp has developed a broad base of audience over the past 12 years. Key audience communities are: Human Computer Interaction, Pervasive Computing, Distributed and Mobile Computing, Real World Modeling, Sensors and Devices, Middleware and Systems research, Programming Models and Tools, and Human Centric Validation and Experience Characterization. More detailed information about the topical focus of Ubicomp can be found in the Call for Papers."



Eurodisplay 2011: XXXI International Display Research Conference, September 19-22, Bordeaux-Arcachon, France
"Eurodisplay 2011 in Bordeaux/Arcachon provides researchers, engineers, and technical managers a unique opportunity to present their results and update their knowledge in all display-related research fields... The two keynote addresses on the first day of the Eurodisplay 2011 Symposium on September 20th will be a unique chance to hear a global overview on the future of the display market (Samsung LCD) and on display applications in the automotive industry (Daimler AG)."


"Eurodisplay '11 in Bordeaux-Arcachon will be the right spot to learn more about the latest results in display research and related fields such as organic electronics. Cognitive science will give a new vision of the impact of displays in our day-to-day life, not only from a perception standpoint but from the information standpoint, since SID deals with "Information Display" and not only technologies. The two invited talks of Pr. Bernard Claverie and Dr. Lauren Palmateer will focus on this aspect... A dedicated business-oriented session will address two important aspects of our business, from display company creation by Thierry Leroux to end-user trend analysis for the TV market by Dr. Jae Shin. This conference will provide a global vision of our Display World, including not only the well-known display leaders but also the very active BRIC countries... Dr. V. Pellegrini Mammana will give a vivid illustration of this display industry dynamic in Brazil."

UIST Symposium, October 16-19, 2011, Santa Barbara, California
"UIST (ACM Symposium on User Interface Software and Technology) is the premier forum for innovations in the software and technology of human-computer interfaces. Sponsored by ACM's special interest groups on computer-human interaction (SIGCHI) and computer graphics (SIGGRAPH), UIST brings together researchers and practitioners from diverse areas that include traditional graphical & web user interfaces, tangible & ubiquitous computing, virtual & augmented reality, multimedia, new input & output devices, and CSCW. The intimate size, single track, and comfortable surroundings make this symposium an ideal opportunity to exchange research results and implementation experiences."


VisWeek 2011: Vis, InfoVis, VAST, October 23-28, Providence, RI
"Computer-based information visualization centers around helping people explore or explain data through interactive software that exploits the capabilities of the human perceptual system. A key challenge in information visualization is designing a cognitively useful spatial mapping of a dataset that is not inherently spatial and accompanying the mapping by interaction techniques that allow people to intuitively explore the dataset. Information visualization draws on the intellectual history of several traditions, including computer graphics, human-computer interaction, cognitive psychology, semiotics, graphic design, statistical graphics, cartography, and art. The synthesis of relevant ideas from these fields with new methodologies and techniques made possible by interactive computation are critical for helping people keep pace with the torrents of data confronting them."

6th Annual ACM Conference on Interactive Tabletops and Surfaces (ITS 2011)
November 13-16, 2011 Portopia Hotel, Kobe, Japan
"The Interactive Tabletops and Surfaces 2011 Conference (ITS) is a premiere venue for presenting research in the design and use of new and emerging tabletop and interactive surface technologies. As a new community, we embrace the growth of the discipline in a wide variety of areas, including innovations in ITS hardware, software, design, and projects expanding our understanding of design considerations of ITS technologies and of their applications."


AFFINE: 4th International Workshop on Affective Interaction in Natural Environments (ICMI 2011)  November 17, 2011, Alicante, Spain (CFP deadline is August 19, 2011)
Scope: "Computer gaming has been acknowledged as one of the computing disciplines which proposes new interaction paradigms to be replicated by software engineers and developers in other fields. The abundance of high-performance, yet lightweight and mobile devices and wireless controllers has revolutionized gaming, especially when taking into account the individual affective expressivity of each player and the possibility to exploit social networking infrastructure. As a result, new gaming experiences are now possible, maximizing users' skill level while also maintaining their interest in the challenges of the game, resulting in a state which psychologists call flow: "a state of concentration or complete absorption with the activity at hand and the situation". The result of this amalgamation of gaming, affective and social computing has brought increased interest in the field in terms of interdisciplinary research... Natural interaction plays an important role in this process, since it gives game players the opportunity to leave behind traditional interaction paradigms, based on keyboards and mice, and control games using the same concepts they employ in everyday human-human interaction: hand gestures, facial expressions and head nods, body stance and speech. These means of interaction are now easy to capture, thanks to low-cost visual, audio and physiological signal sensors, while models from psychology, theory of mind and ergonomics can be put to use to map features from those modalities to higher-level concepts, such as desires, intentions and goals.
In addition to that, non-verbal cues such as eye gaze and facial expressions can serve as valuable indicators of player satisfaction and help game designers provide the optimal experience for players: games which are not frustratingly hard, but still challenging and not boring... Another aspect which makes computer gaming an important field for multimodal interaction is the new breed of multimodal data it can generate: besides videos of people playing games in front of computer screens or consoles, which include facial, body and speech expressivity, researchers in the field of affective computing and multimodal interaction may benefit from mapping events in those videos (e.g. facial signs of frustration) to specific events in the game (large number of enemies or obstacles close to the player) and infer additional user states such as engagement and immersion. Individual and prototypical user models can be built based on that information, helping produce affective and immersive experiences which maintain the concept of 'flow'. This workshop will cover real-time and off-line computational techniques for the recognition and interpretation of multimodal verbal and non-verbal activity and behaviour, modelling and evolution of player and interaction contexts, and synthesis of believable behaviour and task objectives for non-player characters in games and human-robot interaction. The workshop also welcomes studies that provide new insights into the use of gaming to capture multimodal, affective databases, low-cost sensors to capture user expressivity beyond the visual and speech modalities and concepts from collective intelligence and group modelling to support multi-party interaction."

Intelligent User Interfaces (IUI 2012) Lisbon, Portugal, February 14-17 (pdf) (CFP Submission Deadline is October 21, 2011)
"Major Topics of Interest to IUI include: Intelligent interactive interfaces, systems, and devices, Ubiquitous interfaces, Smart environments and tools, Human-centered interfaces, Mobile interfaces, Multimodal interfaces, Pen-based interfaces, Spoken and natural language interfaces, Conversational interfaces, Affective and social interfaces, Tangible interfaces, Collaborative multiuser interfaces, Adaptive interfaces, Sensor-based interfaces, User modeling and interaction with novel interfaces and devices, Interfaces for personalization and recommender systems, Interfaces for plan-based systems, Interfaces that incorporate knowledge- or agent-based approaches, Help interfaces for complex tasks, Example- and demonstration-based interfaces, Interfaces for intelligent generation and presentation of information, Intelligent authoring systems, Synthesis of multimodal virtual characters and social robots, Interfaces for games and entertainment, learning-based interactions, health informatics, Empirical studies and evaluations of IUI interfaces, New approaches to designing Intelligent User Interfaces, and related areas"


IxDA: Interaction 12, Dublin, Ireland, February 1-4, 2012
"Interaction|12 is an ideal venue to showcase the most inspiring and original stories in interaction design. We have a world of creative talent to tap into, so our conference roster will fill up fast. We’re on the lookout for thoughtful, original proposals that will inspire our community of interaction designers from all over the world. Do you have an uncommon or enlightening design story, valuable lessons learned from hands-on experience and want to be a part of the programme at Interaction|12? You do? Great!"


More to come!


BTW, I'd like to go to a few Urban Screens or Media Facades festivals:

Media Facades Festival Europe 2010 from MediaFacades on Vimeo


Of course, I'd like to go to educational technology, school psychology, and special education conferences...

Mar 29, 2011

SIFTEO, the next-gen Siftables! (Tangible User Interfaces for All)

Despite my enthusiasm for TUIs, I somehow missed the news about the transformation of Siftables into a commercial version, Sifteo:

Sifteo Inc. Debuts Sifteo™ Cubes - A New Way To Play (PDF)



"Sifteo cubes are 1.5 inch computers with full-color displays that sense their motion, sense each other, and wirelessly connect to your computer. You, your friends, and your family can play an ever-growing array of interactive games that get your brain and body engaged.
Sifteo’s initial collection of titles includes challenging games for adults, fun learning puzzles for kids, and games people can play together." -Sifteo website
For more information, see the Sifteo website,  blog, and YouTube  channel.  If you can't wait to get your own set,  take a look at Josh Blake's Sifteo Cube Unboxing Video!

RELATED
About two years ago, I was interviewed about my thoughts on the interactive, hands-on, programmable cubes, then called Siftables, for an article published in IEEE's Computing Now magazine: Siftables Offer New Interaction Mode (James Figueroa, Computing Now, 3/2009).

For those of you who'd like more information about tangible user interfaces (TUIs) and the development of Siftables, I've copied my 2009 post, Tangible User Interfaces, Part I: Siftables, below:

TANGIBLE USER INTERFACES, PART I: SIFTABLES (2009)
In 1997, the vision of tangible user interfaces, also known as TUIs, was outlined by Hiroshi Ishii and Brygg Ullmer of the Tangible Media Group at MIT in their paper, "Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms" (pdf).   According to this vision, "the goal of Tangible Bits is to bridge the gaps between both cyberspace and the physical environment, as well as the foreground and background of human activities." This article is a must-read for anyone interested in "new" interactive technologies.

The pictures in the article of the metaDesk, transBoard, activeLENS, and ambientRoom, along with the references, are worth a look, for those interested in this seminal work.

Another must-read is Hiroshi Ishii's 2008 article, Tangible Bits: Beyond Pixels (pdf). In this article, Ishii provides a good overview of TUI concepts as well as the contributions of his lab to the field since the first paper was written.

Related to Tangible User Interface research is the work of the Fluid Interfaces Group at MIT. The Fluid Interfaces Group was formerly known as the Ambient Intelligence Group, and many of the group's projects incorporate concepts related to TUI and ambient intelligence. 



According to the Fluid Interfaces website, the goal of this research group is to "radically rethink the human-machine interactive experience. By designing interfaces that are more immersive, more intelligent, and more interactive we are changing the human-machine relationship and creating systems that are more responsive to people's needs and actions, and that become true "accessories" for expanding our minds."

The Siftables project is an example of how TUI and fluid interface (FI) interaction can be combined. Siftables is the work of David Merrill and Pattie Maes, in collaboration with Jeevan Kalanithi, and was brought to popular attention through David Merrill's recent TED talk:

David Merrill's TED Talk: Siftables - Making the digital physical
-Grasp Information Physically

"Siftables aims to enable people to interact with information and media in physical, natural ways that approach interactions with physical objects in our everyday lives. As an interaction platform, Siftables applies technology and methodology from wireless sensor networks to tangible user interfaces. Siftables are independent, compact devices with sensing, graphical display, and wireless communication capabilities. They can be physically manipulated as a group to interact with digital information and media. Siftables can be used to implement any number of gestural interaction languages and HCI applications....
Siftables can sense their neighbors, allowing applications to utilize topological arrangement... No special sensing surface or cameras are needed."
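To get a feel for what "topological arrangement" means here, consider a minimal sketch (this is hypothetical illustration code, not the actual Siftables SDK): tiles placed on a grid, where each tile can report which other tiles sit directly beside it. The `Tile` class and `neighbors` function below are my own invented names for the purpose of the example.

```python
# Hypothetical sketch of neighbor-sensing tiles, the core idea behind
# topological arrangement in tangible user interfaces. Each tile knows
# its grid position; adjacency is anything sharing an edge.

from dataclasses import dataclass

@dataclass(frozen=True)
class Tile:
    name: str
    x: int
    y: int

def neighbors(tile, tiles):
    """Return the tiles directly adjacent (sharing an edge) to `tile`."""
    return [t for t in tiles
            if abs(t.x - tile.x) + abs(t.y - tile.y) == 1]

tiles = [Tile("A", 0, 0), Tile("B", 1, 0), Tile("C", 3, 3)]

for t in tiles:
    adjacent = [n.name for n in neighbors(t, tiles)]
    print(f"{t.name} senses neighbors: {adjacent}")
```

An application built on this kind of platform would react to changes in the adjacency graph, e.g. triggering a "pairing" behavior when two tiles are pushed together, rather than relying on a camera or a special sensing surface.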





Siftables Music Sequencer from Jeevan Kalanithi on Vimeo.



More about Siftables:
Rethinking display technology (Scott Kirsner, Boston Globe, 7/27/08)
TED: Siftable Computing Makes Data Physical
Siftables: Toward Sensor Network User Interfaces (pdf)

It seems that people either really like the Siftables concept or don't see the point. I found the following humorous critique of Siftables on YouTube:

"Imagine if all the little programs you had on your iphone were little separate chicklets in your pocket.
You'd lose em.
Your cat would eat em.
You'd vacuum them up.
They'd fall down in the sofa.
They'd be all over the car floor.
You'd throw them away by mistake..."

In my opinion, it is exciting that some of this technology has the potential to become mainstream.


Feb 20, 2011

Human-Computer Interaction (HCI) Is Changing the World: BLUR Conference, February 22-23, Omni Orlando Resort (Includes video)

Tuesday, February 22, 2011, 8:00 AM - Wednesday, February 23, 2011, 5:00 PM (ET)
Omni Orlando Resort, 1500 Masters Blvd., ChampionsGate, Florida 33896
Phone: (407) 390-6664
Blur Conference

ABOUT BLUR 
(from the conference website)

"It’s easy to forget that the computer mouse is over 45 years old."

"What’s not as easy to forget is that we’re now collectively getting used to interacting with computers via means and interfaces that have moved way beyond the keyboard and the mouse — the iPhone and Wii being the most prominent examples."

"The truth is that we stand on the verge of a major revolution in the models of Human Computer Interaction (HCI). A revolution that will fly right past academic and into a world of retail, medical, gaming, military, public event, sporting, personal and marketing applications."

"From multi-touch to motion capture to spatial operating environments, over the next 10 years, everything we know about HCI will change."

"Blur is the only conference that is exploring the line of interaction between computers and humans in a substantive, real-world and hands-on way."

"At Blur, vendors, strategists, buyers and visionaries assemble to not only discuss the larger issues of HCI, but also to lay their hands on the latest in HCI technology. Blur is the only forum for a focused, hands-on exploration of the varied technologies evolving in the HCI."

"Come play, investigate, learn and apply at Blur — where we’re changing how you interact with computers forever." -Blur




BLUR Conference Agenda
(Note:   I added the links to conference participants and/or their organizations. Feel free to leave a comment if you know of any corrections or better links!)
Keynotes:

Neuroergonomics: How an Understanding of the Brain is Changing the Practice of Human Factors Engineering - Dr. Kay Stanney, Design Interactive
When Computers Feel: Understanding Human Emotional Measurement  - Hans Lee, EmSense
A Quick Hit on Mobility and HCI - Juan Pons, Swype
Panel Discussion: Haptics- The Beginnings and Future of Touch  - Nimish Mehta
Why HCI will lead the biggest tech revolution yet - Andrew Tschesnok, Organic Motion
Location as a Primary Interface Input - Matt Galligan, SimpleGeo; Nick Brachet, Skyhook Wireless
Robotics, Gaming and The Future of Entertainment- Paul Berberian, Orbotix
Virtual Coaches in Healthcare: A Vision of the Future - Dan Siewiorek, Carnegie Mellon University
10 reasons to be happy about giving computers emotion sensing - Dr. Rosalind Picard, MIT
Commercializing HCI Technology - Dr. Paul Kedrosky, Ewing Marion Kauffman Foundation and Dr. Gerry Barnett


Breakout Sessions:
Human Instrumentation - James Park, FitBit; Ben Rubin, Zeo; Jason Jacobs, RunKeeper; Steve Larsen, moderator

New Museum Experiences: Learning from Multitouch and Multiuser Installations - Jim Spadaccini, Ideum
Kinect Hacks - Jonathan C. Hall; Lonergan Harrington; Jim Spadaccini; Sean Kean, moderator
Interactive Ads and Consumer Experiences - Alessio Signorini, Immersive Labs; Jon Fox, Helios Interactive
Augmented Reality - Ready for Primetime? - Vikas Reddy, Occipital; Carlin Getliffe, Omniar; Edwin Rivera, Credelis; Dan Rua, moderator
Building an Interface for Endangered Language Learners - Finn Thye and Kelson Adams, Univ of Colorado - Boulder
Alternative Interface Inputs - Gary Clayton, Nuance; Nick Langdale-Smith, Seeing Machines; RJ Auburn, Voxeo; Steve Larsen, moderator
Ewing Marion Kauffman Foundation "Idea Hack" - Commercializing HCI Technology: A Discussion - led by Paul Kedrosky
Building Natural User Interfaces - Thomas Peterson, SoftKinetic; Ohad Shvueli, PrimeSense; David Minnen, Oblong
3D Interactive Design for the Human Body - Albert Hwang
Panel: Will the Kinect Change the HCI Industry Forever? A Group Discussion
HCI in the 21st Century:  Technologies for Extending and Amplifying the Human Experience (pdf) - Dr. Charlie Hughes, UCF; Dr. David Pratt; Dr. Joseph LaViola;  moderated by Steve Fiore, UCF


Some Videos of HCI/Tech featured at Blur 2011 
360 Panorama occipitalhq


"Illuminous" Eric Gradman



"Standard Gravity" Eric Gradman, OpenKinect (libfreenect/python)


Organic Motion Markerless Motion Capture


Advisory:
Steve Fiore, University of Central Florida
Bob Allen, Disney R&D
Kay Stanney, Design Interactive
Capt. Dylan Schmorrow, USN