Showing posts sorted by relevance for query "natural user interaction". Sort by date Show all posts

Jun 26, 2010

A few links: GizmoWatch's 10 Interactive User Interfaces for the Future, CNN's Eatocracy, EVA 2010 and More!

Here's a quick link to a recent GizmoWatch post, Ten Interactive User Interfaces for the Future.  Bharat, the author, reviews a variety of interfaces, input methods, and interaction techniques, such as Skinput, a water-based touch screen, a muscle-computer interface, air gestures, brain-computer systems, and even a mud-tub interface.


I was fortunate to see some of these interaction techniques and interfaces when I attended CHI 2010 this past April, and plan to share some of my photos and video clips from the conference on this blog soon.


Totally Unrelated


Online connection for foodies
Eatocracy is a new website within the CNN pages that provides news, and more, about all things related to food. The categories on the site include "main", "news", "bite", "sip", "make", "think", and "buzz".  The best part, in my opinion, is the heirloom recipe collection index, where people can upload and share family recipes and the stories behind them.


Here is the description of Eatocracy from the website:

Eatocracy  "is your online home for smart, passionate conversation and information about food news, politics, culture. We'll highlight regional and family recipes, dive into restaurants and food shopping, chat with celebrity and local chefs, and show you what's for dinner around the world tonight. Grab a place at the table and read with your mouth full."

Enjoy!

(The above is a repost from The World is My Interactive Interface)

Coming Soon
--More about 3D TV and Interactive TV
--Highlights from CHI 2010 (better late than never!)
--My experiments- SMARTTable, a game, interactive timeline prototype pictures...
--A post about Lieven van Velthoven's interesting Post-WIMP explorations - here are some links that he recently sent me:
As I took a peek at Lieven's video links, I noticed an interesting video mash-up Lieven created from the open-source code for Radiohead's House of Cards music video and his One Million Particles app. I'll post them soon.

I'll try to get video, pictures, and commentary about EVA 2010.  EVA stands for Electronic Visualization and the Arts: "Electronic Information, the Visual Arts, and Beyond."

FYI
I'm in the process of sorting through and re-organizing my blogs, which have been around for over four years!  During this time, my blogs have attracted a growing number of readers. Because of this, I'd like to make things a bit more user-centered.  So expect to see little changes here and there.  I promise I'll give my readers advance warning if I make any serious changes!

If you are new to this blog, you should know that my blogs started out as on-line filing cabinets, open to the world.  Although there is a bit of overlap of material and some cross-posting between the blogs, they are arranged to serve as a paper-less way of keeping track of things that I've learned through my coursework, conference attendance, readings, and research. Since emerging technologies are high on my list of interests, I also use my blogs to share interesting things that cross my path.    
  
I changed the name of my World Is My Interface blog to The World Is My Interactive Interface.   "Off-the-desktop natural user interfaces, interaction, and user experience" are the main topics of the blog.  It sometimes includes information about ubiquitous computing and DOOH, otherwise known as Digital Out Of Home.

I plan to tinker with my TechPsych blog later on. It focuses on topics that are useful to psychologists, educators, special education teachers, speech and language therapists, health and wellness professionals, and parents.

Feel free to leave comments, as I welcome your input.

Jul 12, 2012

TechCrunch Charlotte Highlights: T1 Vision; inTouch Collaborative Software


Yesterday evening I attended a meetup of TechCrunch and Charlotte-area techies, held at the uptown Packard Place entrepreneurial center.  It was jam-packed with people all abuzz with tech start-up fever, fueled by awesome food provided by Zen Fusion.  Although my main purpose for attending the TechCrunch meetup was to learn more about innovative technology start-ups in my region, I also was hoping to capture a few shots of interesting people.   I like to keep an eye out for T-shirt slogans, and one worn by a young gentleman caught my attention: it proclaimed that he'd seen the future, and it is in his browser.  On the back of his T-shirt was a bright HTML5 logo, something that is dear to my heart, as I am moving from HTML4 to HTML5.  He was polite and agreed to pose for a couple of photographs:
 






It turned out that the HTML5 guy was at the TechCrunch event with one of his colleagues from T1 Visions, a social touchscreen solutions company that I've featured previously on this blog.  They caught me up on the growth of this start-up company, which now has 15 employees and has broadened its reach beyond table-top restaurant applications to the healthcare, education, corporate, retail, and broader hospitality sectors.

What I like about table-top systems is that they provide support for "natural user interaction".  They allow for multiple modes of interaction with, and presentation of, multimedia content.  Over the past several years, these systems have proven useful to a wide range of people and settings.  Interfaces that support touch and gesture interaction are no longer viewed as novelties, given the pervasiveness of touch-phones and tablets and their ease of use for most people.
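At their core, many multi-touch gestures reduce to simple geometry. As an illustrative sketch (not tied to T1 Visions' software or any particular vendor's SDK), a pinch-to-zoom recognizer can compare the distance between two touch points when the gesture starts with their distance now:

```python
import math

def pinch_scale(start_points, current_points):
    """Return the zoom factor implied by two touch points moving
    apart (scale > 1) or together (scale < 1)."""
    def distance(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    start = distance(*start_points)
    if start == 0:
        return 1.0
    return distance(*current_points) / start

# Two fingers start 100 px apart and spread to 200 px: a 2x zoom.
print(pinch_scale([(0, 0), (100, 0)], [(0, 0), (200, 0)]))  # 2.0
```

A real table-top system layers gesture recognizers like this on top of a stream of per-finger touch events, but the ratio-of-distances idea is the same.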

A useful product from T1 Visions is the T1 Collaboration Table. It supports touch-screen interaction and can also handle up to four simultaneously connected laptops.   The table system provides a media viewer that supports sharing of photos across screens, devices, and surfaces.  It also contains a web browser, a presentation viewer, and a whiteboard that is compatible with video conferencing.  The company provides customized applications for its clients.   In the Charlotte area, some of the tables can be found in restaurants, such as the Mellow Mushroom, Cowfish, and Harpers.  A few were recently installed in the Atkins library at UNC-Charlotte, to support group-work among students.

To learn more about what T1 Visions has to offer, take a few minutes to view the following videos and follow the links at the end of this post!






Demonstration of how the collaboration table can work within a business environment:


Demonstration of the T1VISION touch wall:
RELATED
T1 Visions Gallery
T1 Visions: Social Touchscreen Solutions
Interactive tabletops bring people together
Marty Minchin, Charlotte Observer, South Charlotte News, 2/20/12
Interactive Technology in the Carolinas: T-1 Visions Update

NOTE:
TechCrunch is a technology media group founded in 2005 that focuses on innovative technologies.  This summer, a group of TechCrunchers are visiting cities in the south that were previously not under their radar, such as Savannah and my home region, Charlotte, N.C.   The Charlotte TechCrunch meetup was held on Wednesday, July 11, 2012.  I plan to devote a few more blog posts to share what I learned.

Aug 21, 2010

Reflections About Interactivity in my Present World

Background:  For those of you who are new to this blog, I work as a school psychologist in a public school district in North Carolina. During the mid-2000s, I took a variety of computer courses, leading up to a decision to work part time while taking graduate courses in human-computer interaction, games, information visualization, ubiquitous computing, VR, and related topics. I returned to work full time in 2008, due to the economic downturn.

I continue to maintain my passion for technology that supports natural user interaction, and share my interests on this blog and to a lesser extent, two other blogs.  I initially started blogging because it was a requirement for one of my courses, and found that my blogs served me well as on-line filing cabinets.

Update:  I don't usually post much about my work on this blog, but I have some exciting things to share.

New School Year, New SMARTBoard: I am excited about the upcoming school year because my "home" school, Wolfe, a program that serves mostly teens and young adults up to age 22 who have more severe disabilities (including autism), now has an interactive whiteboard in every classroom.

School started for Wolfe students on July 26th, and three more whiteboards were installed.  One of the new SMARTBoards is in the speech and language room, along with a multi-touch SMARTTable the school received last April; one is in the Art/Work Adjustment room; and one is in the room I use with small groups of students, which is used for music one day a week.


Although I have used interactive whiteboards in my work with students for several years, this is the first year that I have had one available to me most of the time in a school. Since most of the teachers received their IWBs over the past year, there is a pressing need to figure out what works, why it works, and how to create useful interactive content that is appropriate for the needs of students who have severe or multiple disabilities. We really can't wait.  

Why is this important to me?

Observations of positive changes:
Students who previously had limited interest in applications on the computer, or a limited attention span for such activities, demonstrate a high degree of interest and engagement with the activities on the IWBs we've tried so far.  Many students who do not attend to learning materials, or attend only briefly, actively look at the IWB screen, interact with it, and smile.   Some students who rarely (or never) speak sometimes vocalize more often during IWB activities!   Some are making connections between learning materials and what they interact with on the screen, something that simply did not happen previously.  Some students who rarely established eye contact with others or demonstrated joint attention do so during IWB activities, something I discussed in a post on the TechPsych blog.

Technology and my role as a school psychologist:
Designing effective interventions is part of my job as a school psychologist, and mixing interactive whiteboards and interventions is fairly uncharted territory.  I know that there are probably a few other school psychologists, counselors, social workers, and speech/language therapists out there who are wondering what to make of this technology and the students they serve directly, or indirectly through consultation.  There isn't much literature about this topic, so it is difficult to know what is truly "evidence based".

We are in the process of discovery. Within a school that has an IWB in every classroom, serving students with the most complex needs within the school district, how can I best follow professional school psychology practices (pdf) regarding the provision of direct and indirect services? Some of these expectations are listed below:

"School psychologists promote the development and maintenance of learning environments that support resilience and academic growth, promote high rates of academic engaged time, and reduce negative influences on learning and behavior."

"School psychologists use information and assistive technology resources to enhance students’ cognitive and academic skills."

"School psychologists work with other school personnel to develop, implement, and evaluate effective interventions for increasing the amount of time students are engaged in learning."

"School psychologists facilitate design and delivery of curricula to help students develop effective behaviors, such as self-regulation and self-monitoring, planning/ organization, empathy, and healthy decision-making."

Working towards solutions:
I work closely with my colleague Kelly Cross, a speech and language therapist who serves Wolfe school and two classrooms of students with severe autism at a "regular" elementary school. Kelly also serves as the assistive technology and augmentative communication consultant for our school district.  She's used interactive applications and web resources for many years along with more traditional "hands-on" materials,  but like me, has had limited access to IWB's until recently.  She now has a SMARTboard in her room, along with the portable multi-touch SMARTTable Wolfe received last April.  We work with many of the same students, so with the influx of IWBs in our school, we've stepped up our collaboration.

One of Kelly's challenges is to figure out ways for teachers to integrate assistive technology and/or augmentative communication systems into their work with students during activities that involve interactive whiteboards.   Most of the research in the area of assistive technology/aug com focuses on the use of technology to access applications as they are displayed on laptops or computer monitors, or factors related to the use of individual communication devices. The vSked project, led by Dr. Gillian Hayes at the University of California at Irvine, is on the right track in that it incorporates the use of a large interactive display with students with autism, along with smaller hand-held communication systems, which I've discussed in a previous post.


vSKED in action

Workshop Presentation
This past week, Kelly and I presented at a workshop held in the Arlington VA school district that focused on the use of interactive whiteboards and related applications and web resources with students with special needs.  We shared some of the exciting things we've noticed with our students and shared "before and after" examples.  Included in the presentation were some of the video clips I quickly put together during the first days of school that had an impressive positive impact with students when they were used during interactive whiteboard activities. (Some of these video clips can be found in a previous post: Video clips that help students with autism learn and feel calm at the same time!)

During the workshop, we discussed a few guiding principles, such as the "least dangerous assumption" and the concept of Universal Design for Learning.   We also provided a sampling of resources previously used on the "small screen" that appear to be powerful when used with students interacting on the larger whiteboard screens.  Below are a few of the resources we shared:

Clicker5 is an application to support reading and writing that provides natural voice output and a child-friendly word processor. It works well on interactive whiteboards and can incorporate several communication symbol systems. It is a good tool for informal assessment of students with communication and other related needs.




Sensory World provides an intuitive navigation system for students to use as they explore various areas of a house and engage in learning and related activities. The site map for all of the areas of the house, as well as the activities, is graphical.




We shared resources from the National Gallery of Art's Kids Zone, a great website that offers plenty of interactive content:


For teachers who want a limited browsing environment for their students, the Zac Browser is great for use on IWBs and the new large all-in-one touch screen displays:
Zac Browser

Another website that is good for visual-based browsing is Symbol World:

"The intention is that symbol readers, teachers and carers will contribute, and we are delighted at the number of contributors that have already sent us their work"

We shared a recent find, Taptu, which can be used on an IWB.  Although Taptu focuses on the mobile touchable web, accessed through smartphones with internet access, it provides links to a wide range of websites designed with intuitive touch interaction in mind, perfect for students with special needs.  In many cases, Taptu web navigation resources work fine on the larger screen of the IWB.  The advantage is that an icon/visual-based touch navigation system supports students who can't type, and also those who have very limited reading skills.


Sports Navigation for Taptu

Below is a slideshow about the touchable web:
Taptu: Virtual Roundtable


Update: SMARTTable
At Wolfe, we'd like to create a conversation wheel with related activities for the SMARTTable that coordinate with SMARTBoard activities.  We'll probably experiment with the conversation wheel concept on the SMARTBoard first.


Considerations:
One thing we've learned over the past few months is that it is a bit more complicated to create content and activities for the SMARTTable than for the SMARTBoard notebook or Promethean flipchart.   We hope to customize it to support scaffolded classroom activities for students with special needs, but it might take more time than we had anticipated.  The school's immediate need is for touchable interactive content for all of the classroom IWBs, suitable for our students.

Complications:
Programming for the SMARTTable, for those who have 64-bit computers, requires Windows 7, Visual Studio 2008, and Expression Blend 3.   My HP Touchsmart, my home computer, is 64 bit, but runs Vista.  My school-issued laptop runs Windows XP and is also 64-bit.  To upgrade it, I will have to send it to the tech department for several weeks.   It will be slow-going, since I have to plan for my day-to-day activities and evaluations with students first.

Possible SMARTTable Solutions:
I'm hoping that some students from UNC-Charlotte might want to try their hand at multi-touch programming and help us out!  We'd welcome volunteers from other universities as well.   Wolfe's principal, Mary Jo Breckenridge, is very supportive of the use of innovative technologies with students with special needs, and would figure out a way to make a collaboration happen.










Upcoming Interactive Multimedia Technology Posts
I have about 6 posts in draft form about interesting interactive technologies, research, people, and companies involved with multi-touch, interactive multimedia, and natural user interface/interaction applications. I'll get them up as soon as possible.

One post is about 3M Touch Systems. Another is about an interesting multi-touch start-up company, Osmosis.  I revisited Ballantyne Village to update the use of interactive touch screen technology, something I discussed in detail, along with photos and video clips, in a 2008 post, Technology Supported Shopping and Entertainment User Experience at Ballantyne Village: "A" for concept, "D" for touch-screen usability.   I got some inside scoop about the rationale behind the changes at the upscale center from the seasoned owner of a high-end audio-visual store I interviewed about 3D TV.   I have some interesting information about current research in "glasses-less" 3D displays. Sharp will be coming out with a 3D cell-phone camera AND a glasses-less cell-phone display in the not-too-distant future.

I'm preparing for posts that highlight a few of my favorite blogs. Until then, take a look at the recent posts on Tracy Boyer's Innovative Interactivity blog, and also InteractiveTV Today.

RELATED & SOMEWHAT RELATED
"The AAC-RERC conducts a comprehensive program of research, development, training, and dissemination activities that address the NIDRR priorities and seek to improve technologies for individuals who rely on augmentative and alternative communication (AAC) technologies. The mission of the AAC-RERC is to assist people who rely on augmentative and alternative communication to achieve their goals by advancing and promoting AAC technologies and supporting the individuals who use, manufacture, and recommend them."



Janet Light, Kathy Drager, Penn State University




Jeff Higginbotham's Bibliography:  Viewing AAC Through Authentic Social Interactions (pdf)

ProLoquo2Go (iPhone/iPod Touch/iPad communication system)


Also posted on the TechPsych blog

Nov 4, 2012

CFP for Special Issue of Personal and Ubiquitous Computing on Educational Interfaces, Software, and Technology (EIST) -Extended Deadline: December 9, 2012


Overview 
One of the primary goals of teaching is to prepare learners for life in the real world. In this ever-changing world of technologies such as mobile interaction, cloud computing, natural user interfaces, and gestural interfaces like the Nintendo Wii and Microsoft Kinect, people have a greater selection of tools for the task at hand. Given the potential of these new interfaces, software, and technologies as learning tools, as well as the ubiquitous application of interactive technology in formal and informal learning environments, there is a growing need to explore how next-generation technologies will impact education in the future. 

As a community of Human-Computer Interaction (HCI) and educational researchers, we need to theorize and discuss how new technologies should be integrated into the classrooms and homes of the future. In the last three years, three CHI workshops have provided a forum to discuss key issues of this sort, particularly in the context of next-generation education. The aim of this special issue of Personal and Ubiquitous Computing is to summarize the potential design challenges and perspectives on how the community should handle next-generation technologies in the education domain for both teachers and students. 


We invite authors to present position papers about potential design challenges and perspectives on how the community should handle the next generation of HCI in education. Topics of interest include but are not limited to: 

  • Gestural input, multitouch, large displays 
  • Mobile devices, response systems (clickers) 
  • Tangible, VR, AR & MR, multimodal interfaces 
  • Console gaming, 3D input devices 
  • Co-located interaction, presentations 
  • Educational pedagogy, learner-centric, child computer interaction 
  • Empirical methods, case studies 
  • Multi-display interaction 
  • Wearable educational media 
Important Dates 

  • Full papers due: December 9, 2012 
  • Initial reviews to authors: January 18, 2013 
  • Revised papers due: March 15, 2013 
  • Final reviews to authors: April 26, 2013 
  • Final papers due: June 14, 2013 


Submission Guidelines 

Submissions should be prepared according to the Word template located at the bottom of this page. All manuscripts are subject to peer review. Manuscripts must be submitted as a PDF to the easychair submission system. Submissions should be no more than 8000 words in length. 

Guest Editors and Contact Information 

  • Syed Ishtiaque Ahmed, Cornell University 
  • Quincy Brown, Bowie State University 
  • Jochen Huber, Technische Universität Darmstadt 
  • Si Jung “Jun” Kim, University of Central Florida 
  • Lynn Marentette, Union County Public Schools, Wolfe School 
  • Max Mühlhäuser, Technische Universität Darmstadt 
  • Alexander Thayer, University of Washington 
  • Edward Tse, SMART Technologies 

Contact: eistjournal2012@easychair.org 

Information about the Journal of Personal and Ubiquitous Computing 


Submission Template: PUC_EIST_article_template.docx  (59k)

Jul 12, 2012

CFP for Special Issue of Personal and Ubiquitous Computing on Educational Interfaces, Software, and Technology (EIST)



Overview 
One of the primary goals of teaching is to prepare learners for life in the real world. In this ever-changing world of technologies such as mobile interaction, cloud computing, natural user interfaces, and gestural interfaces like the Nintendo Wii and Microsoft Kinect, people have a greater selection of tools for the task at hand. Given the potential of these new interfaces, software, and technologies as learning tools, as well as the ubiquitous application of interactive technology in formal and informal learning environments, there is a growing need to explore how next-generation technologies will impact education in the future. 


As a community of Human-Computer Interaction (HCI) and educational researchers, we need to theorize and discuss how new technologies should be integrated into the classrooms and homes of the future. In the last three years, three CHI workshops have provided a forum to discuss key issues of this sort, particularly in the context of next-generation education. The aim of this special issue of Personal and Ubiquitous Computing is to summarize the potential design challenges and perspectives on how the community should handle next-generation technologies in the education domain for both teachers and students. 

We invite authors to present position papers about potential design challenges and perspectives on how the community should handle the next generation of HCI in education. Topics of interest include but are not limited to: 

  • Gestural input, multitouch, large displays 
  • Mobile devices, response systems (clickers) 
  • Tangible, VR, AR & MR, multimodal interfaces 
  • Console gaming, 3D input devices 
  • Co-located interaction, presentations 
  • Educational pedagogy, learner-centric, child computer interaction 
  • Empirical methods, case studies 
  • Multi-display interaction 
  • Wearable educational media
Important Dates
  • Full papers due: November 9, 2012
  • Initial reviews to authors: January 18, 2013
  • Revised papers due: March 15, 2013
  • Final reviews to authors: April 26, 2013
  • Final papers due: June 14, 2013
Submission Guidelines
Submissions should be prepared according to the Word template located at the bottom of this page. All manuscripts are subject to peer review. Manuscripts must be submitted as a PDF to the easychair submission system. Submissions should be no more than 8000 words in length.

Guest Editors and Contact Information
  • Syed Ishtiaque Ahmed, Cornell University
  • Quincy Brown, Bowie State University
  • Jochen Huber, Technische Universität Darmstadt
  • Si Jung “Jun” Kim, University of Central Florida
  • Lynn Marentette, Union County Public Schools, Wolfe School
  • Max Mühlhäuser, Technische Universität Darmstadt
  • Alexander Thayer, University of Washington 
  • Edward Tse, SMART Technologies

Information about the Journal of Personal and Ubiquitous Computing

Feb 27, 2009

Tangible User Interfaces Part I: Siftables

In 1997, the vision of tangible user interfaces, also known as TUIs, was outlined by Hiroshi Ishii and Brygg Ullmer of the Tangible Media Group at MIT in their paper, "Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms" (pdf). According to this vision, "the goal of Tangible Bits is to bridge the gaps between both cyberspace and the physical environment, as well as the foreground and background of human activities." This article is a must-read for anyone interested in "new" interactive technologies.

The pictures in the article of the metaDesk, transBoard, activeLENS, and ambientRoom, along with the references, make this seminal work worth at least a glance.


Another must-read is Hiroshi Ishii's 2008 article, Tangible Bits: Beyond Pixels (pdf). In this article, Ishii provides a good overview of TUI concepts as well as the contributions of his lab to the field since the first paper was written.

Related to the Tangible User Interface research is the work of the Fluid Interfaces Group at MIT. The Fluid Interfaces Group was formerly known as the Ambient Intelligence Group, and many of the group's projects incorporate concepts related to TUI and ambient intelligence. According to the Fluid Interfaces website, the goal of this research group is to "radically rethink the human-machine interactive experience. By designing interfaces that are more immersive, more intelligent, and more interactive we are changing the human-machine relationship and creating systems that are more responsive to people's needs and actions, and that become true "accessories" for expanding our minds."

The Siftables project is an example of how TUI and fluid interface (FI) interaction can be combined. Siftables is the work of David Merrill and Pattie Maes, in collaboration with Jeevan Kalanithi, and was brought to popular attention through David Merrill's recent TED talk:

David Merrill's TED Talk: Siftables - Making the digital physical
-Grasp Information Physically

"Siftables aims to enable people to interact with information and media in physical, natural ways that approach interactions with physical objects in our everyday lives. As an interaction platform, Siftables applies technology and methodology from wireless sensor networks to tangible user interfaces. Siftables are independent, compact devices with sensing, graphical display, and wireless communication capabilities. They can be physically manipulated as a group to interact with digital information and media. Siftables can be used to implement any number of gestural interaction languages and HCI applications... Siftables can sense their neighbors, allowing applications to utilize topological arrangement... No special sensing surface or cameras are needed."
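The neighbor sensing described above is what makes applications aware of how the tiles are arranged. The real devices detect adjacency in hardware, but the topological idea can be sketched in a few lines; the tile ids and grid coordinates below are hypothetical:

```python
# Sketch of Siftables-style neighbor sensing: each tile reports a
# (col, row) grid position, and two tiles count as neighbors when
# they sit exactly one cell apart horizontally or vertically.
def neighbors(tiles):
    """Map each tile id to the ids of tiles directly beside it."""
    result = {tid: [] for tid in tiles}
    for a, (ax, ay) in tiles.items():
        for b, (bx, by) in tiles.items():
            if a != b and abs(ax - bx) + abs(ay - by) == 1:
                result[a].append(b)
    return result

tiles = {"A": (0, 0), "B": (1, 0), "C": (3, 0)}
print(neighbors(tiles))  # A and B touch; C sits off on its own
```

An application built on an adjacency map like this can then interpret the arrangement, for example triggering a word game when letter tiles line up in a row.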



Siftables Music Sequencer from Jeevan Kalanithi on Vimeo.


More about Siftables:

Rethinking display technology (Scott Kirsner, Boston Globe, 7/27/08)
TED: Siftable Computing Makes Data Physical
Siftables: Toward Sensor Network User Interfaces (pdf)

It seems that people either really like the Siftables concept or don't see the point. I found the following humorous critique of Siftables on YouTube:

"Imagine if all the little programs you had on your iphone were little separate chicklets in your pocket.
You'd lose em.
Your cat would eat em.
You'd vacuum them up.
They'd fall down in the sofa.
They'd be all over the car floor.
You'd throw them away by mistake..."


In my opinion, it is exciting to learn that some of this technology has the potential to become mainstream.

Oct 31, 2010

Microsoft is acquiring Canesta, Inc., a developer of 3-D electronic perception technology for natural user interaction, gaming, and more.

Microsoft to Acquire 3-D Chip Firm Canesta
Michael Baron, TheStreet 10/29/10

Thanks to Harry Van Der Veen, of NUITEQ, for this link!

RELATED
The following video is from the Canesta3D YouTube channel. It demonstrates the 3D input sensor in action, with four people moving around in a living room. The chip used in the system depicted in the video was the precursor to the current chip, called the "Cobra 320x200".


Below is a demo of gesture interaction using Canesta3D technology to control and select information and content on a large display.  In my opinion, this will change the way we interact with our TVs, at least for those of us who hate using bad remotes!  Microsoft's acquisition of Canesta is good news, especially if the technology is made available to the masses.   I'm pretty sure it is capable of supporting interaction with internet-ready HD TVs, and can support GoogleTV, Leanback, and Vimeo's Couch Mode.
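To give a feel for how a depth sensor drives this kind of interface, here is an illustrative sketch (not Canesta's actual API; the function names and the 80 mm threshold are my own assumptions): the sensor reports a hand position plus its distance from the camera, the position is scaled to screen coordinates, and a quick move toward the sensor is treated as a "push to select" gesture:

```python
# Hypothetical sketch of depth-camera cursor control.
def hand_to_cursor(x, y, sensor_w, sensor_h, screen_w, screen_h):
    """Scale sensor-space hand coordinates to screen-space."""
    return (round(x / sensor_w * screen_w), round(y / sensor_h * screen_h))

def is_push(prev_depth_mm, curr_depth_mm, threshold_mm=80):
    """A push is a sufficiently large decrease in hand-to-sensor distance."""
    return prev_depth_mm - curr_depth_mm >= threshold_mm

# A hand at the center of a 320x200 sensor frame maps to the center
# of a 1080p display.
print(hand_to_cursor(160, 100, 320, 200, 1920, 1080))  # (960, 540)
print(is_push(600, 500))  # True: the hand moved 100 mm toward the sensor
```

The third dimension is the whole point: a 2D camera can track a hand across the screen, but only depth data makes a forward "push" distinguishable from ordinary movement.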




Canesta Announces Definitive Agreement to be Acquired by Microsoft
Press Release, 10/29/10, Canesta

About Canesta (From the Canesta website)
"Canesta (www.canesta.com) is the inventor of revolutionary, low cost electronic perception technology and leading provider of single chip CMOS 3-D sensors that fundamentally change the relationship between devices and their users. This capability makes possible true 3-D perception as input to everyday devices, rather than the widely understood 3-D representational technologies as output. Canesta’s 3-D input technology, based upon tiny, CMOS 3-D imaging chips or “sensors”, enables fine-grained, 3-dimensional depth-perception in a wide range of applications. Products based on this capability can then react on sight to the actions or motions of individuals and objects in their field of view, gaining levels of functionality and ease of use that were simply not possible in an era when such devices were blind. Canesta’s focus is on mass market consumer electronics, but many applications exist in other markets as well. Canesta is located in Sunnyvale, CA. The company has filed in excess of fifty patents, 44 of which have been granted so far."


Canesta Corporate Fact Sheet (pdf)
Videos: http://canesta.com/applications/consumer-electronics/gesture-controls

I posted some videos about Canesta's technologies in the following post. Two of the videos show how Canesta's 3D depth camera works with a Hitachi flat-panel display: Interactive Displays 2009 Conference

For more information about interactive TV, GoogleTV, Leanback and Couch Mode, see the second section of my recent post:
Philipp Geist: Blending the Physical with the Digital;  Google TV/Leanback, Vimeo's new Couch Mode, oh..and ViewSonic's 3D (glasses-less) pocket camcorder...

Nov 11, 2010

Interactive Touch-Screen Technology, Participatory Design, and "Getting It", Revisited

I've been planning on updating one of my popular posts, "Interactive Touch-Screen Technology, Participatory Design, and Getting It" for a while. 


Here is a compromise - since much still rings true two years after I wrote it, the bulk of this post remains the same.   I've updated a few sections with additional video examples of interactive touch-screen applications, good and bad, along with a few links and resources, located at the bottom of the post.


(The missing piece of information?  An update about apps for the iPad and similar touch-screen tablets.)

Sit back and enjoy!


http://www.ehomeupgrade.com/wordpress/wp-content/uploads/hp_touchsmart_pc.jpghttp://www.wired.com/images/article/full/2008/08/han_interview_630px.jpg
-Images: HP; Wired

There's been some discussion about why so many people don't understand touch-screen, or "surface," computing, even though research in this area has been going on for years.

As the new owner of the HP TouchSmart, I know that I get it.

The research I've conducted in this area suggests that people will "get it" only if there is a strong commitment to developing touch-screen "surface" applications through a user-centered, participatory design process. In my view, this should incorporate principles of ethnography and ensure that usability studies are conducted outside of the lab.


This approach was taken with Intel's Classmate PC. Intel has about 40 ethnographic researchers, and sent many of them to work with students and teachers in classrooms around the world. (A video about ethnographic research and the Intel Classmate project can be found near the end of this post.)

http://download.intel.com/pressroom/kits/events/idffall_2008/images/Picture007.jpghttp://www.classmatepc.com/images/advocateImage.jpg
-Images: ClassmatePC


Where to start?
K-12 classrooms and media centers. Public libraries. Malls. Hospital lobbies and doctor's offices. Any waiting room. Staff lounges in medical centers, schools, and universities. Community festivities and events. Movie theater lobbies. Museums and other points of interests.


I believe we need to take a "touching is believing" approach.

Here are some thoughts:
When I try to explain my fascination with developing touch-screen interactive multimedia applications (interactive whiteboards, multi-touch displays and tables, and the like), many of my friends' and family members' eyes glaze over. This is particularly true for people I know who are forty-ish or over.

Even if you are younger, if you never saw the cool technology demonstrated in the movie Minority Report, have limited experience with video games, or haven't come within touching distance of an interactive whiteboard, the concept might be difficult to understand.


The reality?
Even people who have the opportunity to use surface computing technology on large screens do not take full advantage of it. Multi-touch screens are often used as single-touch screens, and interactive whiteboards in classrooms often serve as expensive projector screens for teacher-controlled PowerPoint presentations.
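The single-touch habit shows up in software, too. Here's a small, hypothetical sketch (not from any particular framework) of the difference: many "multi-touch" applications only ever read the first touch point, which quietly reduces a multi-touch surface to a single-touch screen. Supporting true multi-touch means reacting to the full list of active touch points:

```javascript
// Hypothetical helper: classify a gesture by how many fingers are on the
// surface. An app that only ever looks at the first touch point can never
// reach the two-finger or multi-finger branches.
function classifyGesture(activeTouches) {
  // activeTouches: array of {id, x, y} objects, one per finger on the surface
  if (activeTouches.length === 0) return "none";
  if (activeTouches.length === 1) return "single-touch (tap/drag)";
  if (activeTouches.length === 2) return "two-finger (pinch/rotate)";
  return "multi-finger gesture";
}

// In a browser, you would feed this the full touch list from a touch event,
// e.g.: element.addEventListener("touchmove", e => classifyGesture([...e.touches]));
console.log(classifyGesture([{ id: 0, x: 10, y: 20 }]));
console.log(classifyGesture([{ id: 0, x: 10, y: 20 }, { id: 1, x: 50, y: 60 }]));
```

The point of the sketch: a pinch or rotate gesture simply cannot exist unless the software tracks every active finger, which is exactly the step many developers (and users) skip.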


Most importantly, few software developers understand the surface computing approach, even with the popularity of the iPhone and iPod Touch. Most focus on traditional business-oriented or marketing applications and have difficulty envisioning scenarios for surface computing.  There is a need for a breath of fresh air!

Another factor is that not all people entrusted to market surface or touch screen computing fully understand it.
http://blogs.msdn.com/blogfiles/healthblog/WindowsLiveWriter/MicrosoftHUGWishyouwerehereDay2_82D3/IMG_0550_thumb.jpg
Despite a cool website showing off the goods, Microsoft's Surface multi-touch table has been slow to take off, limiting hope of bringing the price down to a level most families or schools could afford. (The picture above depicts an application for the Surface designed for health care professionals, not K-12 science education.)

Although you can't buy a Surface table for your family room, it is possible to buy a TouchSmart.
HP's TouchSmart website is engaging and highlights some examples of touch-screen interaction, but most people don't seem to know about it. (Since this post was first written, many more touch and multi-touch options have become available to the public, such as Dell's touch-screen models, the iPad, etc.)

Unfortunately, you wouldn't have a clue that the HP TouchSmart exists while browsing the aisles at Circuit City or Best Buy!

When I was shopping for my new TouchSmart, I noticed that from a distance, the TouchSmart looked just like the other large flat-screen monitors filling the aisles. The salespeople at both stores were not well-informed about the system. The only reason I knew about the new TouchSmart was my obsession with interactive multimedia touch-screen applications- designing them, developing them, studying them, reading about them, blogging about them.... ; }

More thoughts:

After studying HCI (Human-Computer Interaction) and relating this knowledge to what I know as a psychologist, my hunch is that the "Windows, Icons, Menus, Pointing device" (WIMP) and keyboard input mind-set is embedded in our brains, to a certain extent. Like driving a car, it is something automatic and expected. This is true for users AND developers. (Update - See The Post WIMP Explorers' Club: Update of the Updates for a review of a discussion among passionate post-WIMP folks.)

Think about it.

Suppose one day you were told that you were no longer allowed to control your car by turning the ignition, steering with the wheel, or using your feet to accelerate, slow down, or stop the car! Instead, you needed to learn a new navigation, integration, and control system that involved waving your hands about and perhaps speaking a few commands.

For new drivers who'd never seen a car before, this new system would be user-friendly and intuitive. Perhaps it would be quite easy for 16-year-old kids to wrap their heads around this concept. For most of us, no. Imagine the disasters we would see on our streets and highways!

When we think about how newer technologies are introduced to people, we should keep this in mind.

In my mind, spreading the word about surface computing is not an "if you build it, they will come" phenomenon, like the iPhone. We can't ignore the broader picture.

From my middle-aged woman's vantage point, I believe it is important that those involved with studying, developing, or marketing surface computing applications realize that many of us simply have no point of reference other than our experiences with ATMs, airline kiosks, supermarket self-serve lanes, and the like. (The video clip at the very end of this post provides a good example of touch-screen technology gone wrong.) -UPDATE: additional videos have been added to this post.

Be aware that there are substantial numbers of people who might benefit from surface computing but prefer to avoid ATMs, airline kiosks, and self-serve grocery lanes.

Realize that the collective experience with technology, in many cases, has not been too pretty. Many people have had such user-unfriendly experiences with productivity applications, forced upon them by their employers, that any interest or desire to explore emerging technologies has been zapped.

My own exposure to interactive "surface" related technology was somewhat accidental.
A few years ago, a huge box was deposited in the room where I worked a couple of days a week as a school psychologist at a middle school. After a week or so, I became curious and found out that it was a SmartBoard. Until then (2002!), I did not know that interactive whiteboards existed.

The box remained unopened in the room for the entire school year, but no worry. I played with the only other SmartBoard in the school, and found a couple at the high school where I also worked. I hunted down all of the applications and interactive websites that I could find, and tried them out. That is when I was hooked. I could see all kinds of possibilities for interactive, engaging subject-area learning activities. I could see the SmartBoard's potential for music and art classes. With my own eyes, I saw how the SmartBoard engaged students with special needs in counseling activities. (By the way, if you are working with middle school students, PBS Kids' ItsMyLife website activities work great on an interactive whiteboard.)

A few years have passed, and reflecting on all of my fun experiences with interactive whiteboards, with and without students, I now understand that many teachers still have had limited exposure to this technology.

This school year, many teachers find themselves teaching in classrooms recently outfitted with interactive whiteboards, scrambling, along with educational technology staff development specialists, to figure out how the technology works best with various groups of students and what sort of changes need to be made to instructional practice.


For the very first time, interactive whiteboards were installed in two classrooms at one of the schools where I work. One of the teachers I know thanked me for telling her about interactive whiteboards and sharing my resources and links.

If I hadn't let her know about this technology, she wouldn't have volunteered to have one installed in her classroom. It has transformed the way she teaches special needs students.

In the few months that she's used the whiteboard, I can see how much it has transformed the way the students learn. They are attentive, more communicative, and engaged. The students don't spend the whole day with the whiteboard - the interactive learning activities are woven into lessons at various times of the day, representing true technology integration.

Now let's see what happens when all-in-one touch-screen PC's are unleashed in our schools!

UPDATE:  Take a look at a post I wrote for Innovative Interactivity just after SMART Technologies acquired NextWindow - the post describes in detail how interactive whiteboards are transforming learning and teaching in a program for students with special needs.
SMARTTechnologies Acquires NextWindow: A "smart window" to the world


There are some interesting changes going on at the intersection of HCI and educational technology research.  I participated in a workshop at CHI 2010 last April and was impressed by what is going on in this area, around the globe:   Next Generation of HCI and Education

Value of ethnographic research:
Ethnographic Research Informed Intel's Classmate PC
"Intel looked closely at how students collaborate and move around in classroom environments. The new tablet feature was implemented so that the device would be more conducive to what Intel calls “micromobility”. Intel wants students to be able to carry around Classmate PCs in much the same way that they currently carry around paper and pencil." -via Putting People First and Ars Technica

The video below is from Intel's YouTube channel. It highlights Intel's approach to ethnographic research in classrooms during the development of the Classmate PC. This approach uses participatory design and allows the set of applications developed for the Classmate PC to reflect the needs of local students and teachers. Schools from many different countries were included in the study.




FYI: TOUCH SCREEN DISPLAYS:  NEED FOR IMPROVEMENT!

Touch Screen Coke Machine at the Mall: 90 Seconds to get a Coke


User-Unfriendly Interactive Display in the Rain (Ballantyne Village Shopping Center)

User-Unfriendly Information Kiosk Interactive Map
I encountered this puzzling and frustrating interactive directory/map at the Cleveland Clinic.  When I went to visit a relative at the hospital a year or so later, the map was no longer there.


BETTER EXAMPLES OF INTERACTIVE SCREENS:
Here are some interesting pictures from lm3labs, which are in my interactive usability hall of fame:

http://catchyoo.typepad.com/photos/uncategorized/2008/06/30/4654.jpghttp://farm3.static.flickr.com/2172/2233673451_6a48db8bff.jpg?v=0


Samsung's new Omnia SDG i900 was re-created in a much larger size, using lm3labs' Ubiq'window touchless technology. For more about lm3labs, including several video clips, take a look at one of my previous posts: Lm3Labs, Nicolas Leoillot, and Multimedia Interaction

Midwife Toad App on a Microsoft Surface, Discovery Place Science Center


TellTable:  Digital Storytelling on the Surface:  Microsoft Research, UK


DECEMBER 31, 2009 -Interactive Soda Machine for Fun

The interactive screen on the Coke machine attracted the attention of this young child. He loved spinning the image of the bottle. So did the dad! He said, "I'd like something like this for my home!" I told him about the HP TouchSmart - neither the dad nor the mom knew that affordable all-in-one touch screens were available, but they knew about SMARTBoards, because their children's classrooms had them.  Note:  No one from this family actually purchased a soft drink.  I was hoping to time how long it would take them to do so!



Some resources:
lm3labs (catchyoo, ubiq'window)
NUITeq
NUI Group (See members' links)
Sparkon (See members' links and multi-touch projects)

(More information and resources can be found by doing a "multitouch" or related search on this blog or The World Is My Interactive Interface.)

If you have plenty of time, take a look at my Post WIMP Explorers' Club YouTube playlist.
"Natural user interfaces, gesture interaction, multi-touch, natural interaction, post WIMP examples and more..."

FYI: I visited the Ballantyne Village shopping center a couple of months ago to follow up on the interactive displays, including the one I tried to use while it was raining.   The shopping center changed ownership, and the displays were replaced by the old-fashioned kind, pictured below: