Mar 6, 2009

Interaction with the Web of Things; LIFT '09

I recently came across the Web of Things blog and found it to be a gem. Vlad Trifa and Dominique Guinard are the brains behind the blog. They have lots of interesting ideas that are at the intersection of WSN (Wireless Sensor Networks) and HCI (Human Computer Interaction).

Vlad and Dominique recently presented at LIFT '09. If you haven't heard of Lift, take some time and visit the website:

"LIFT is a series of events to inspire and connect the community of doers and thinkers exploring the social impact of new technologies. Each LIFT conference is a three days experience made of talks, workshops, interactive art and discussions to understand and anticipate the most important social changes, and meet the people behind them."


Everything on the Web of Things blog inspired me to write two posts, back to back, on my Technology Supported Human World Interaction blog:

More cool things from the Web of Things blog


The Web of Things, Wireless Sensor Networks, Embedded Systems, and (Everyware) Health Care
(The above post was inspired by my experience at the Cleveland Clinic, spending time every day with my father, who has been in the cardio ICU since his surgery a few days ago.)

Mar 4, 2009

Microsoft and the Future of Interaction

Over the past week or so, I've heard quite a bit about Microsoft's vision for the future, across a variety of domains.

To get started, take a look at the following video from CES 2009: Microsoft Future Products Demo


Take a look at the Microsoft Office Labs Vision 2019 video, presented by Microsoft's Business Division president Stephen Elop at the Wharton Business Conference, via the istartedsomething blog:

Video: Future Vision Montage (http://video.msn.com/?mkt=en-GB&playlist=videoByUuids:uuids:a517b260-bb6b-48b9-87ac-8e2743a28ec5&showPlaylist=true&from=shared)


Stephen Elop's Keynote PowerPoint Presentation Link

The Microsoft Office Labs Vision 2019 video sparked an interesting discussion in the comments section of the istartedsomething blog post.

There is more! Coldwell Banker will be using a customized home-search application built on Microsoft Surface technology:




Futuristic Microsoft in the News:


Microsoft aims to turn PCs into personal assistants, teachers (or robot healers).
3/3/09, Byron Acohido, USA Today

Microsoft Mapping Course to a Jetsons-Style Future
3/2/09, Ashlee Vance, New York Times


Photo by Stuart Isett for The New York Times "Eric Horvitz, left, and Dan Bohus of Microsoft with the prototype of a virtual assistant that can understand its surroundings"


Stuart Isett for The New York Times "Hrvoje Benko demonstrating a Microsoft projection system that lets people manipulate large video images with their hands"
Yet another video:
Microsoft Research: A look at tomorrow's health solutions today: Part I
Laura Foy, 8/19/08


"In this special two-part video edition of House Calls for Healthcare Professionals, Bill Crounse, MD, visits with researchers at Microsoft Research. Each program reviews three promising areas of research that may one day lead to solutions with a direct or indirect application to health and healthcare. Viewers will gain insight to advanced ideas and technologies now in the labs at Microsoft Research long before they find their way into future products, solutions, or applications."



Surface Computing in Health Care: VitraView from InterKnowlogy:



Here is a 2008 video from Microsoft: Office Labs: Future of Personal Health Concept



Interesting concepts, but will they translate to the real world? Time will tell.

RELATED

Microsoft HealthVault Beta
3D Multi Touch Application for Heart Surgeries - Microsoft Surface and Health Vault
Microsoft Research Blogs

Feb 27, 2009

Tangible User Interfaces Part II: More Examples, Resources, and Use for TUI's in Education

In Part I of my "mini-series" about Tangible User Interfaces, I discussed the origins of TUI and provided some examples of Siftables. In this section, I've provided some links to information about Tangible User Interfaces and the abstracts of two articles pertaining to TUI's in educational settings.

Zen Waves: A Digital (musical) Zen Garden



reactable from Nick M. on Vimeo.

Reactable
http://upload.wikimedia.org/wikipedia/commons/e/e3/Reactable_Multitouch.jpg
More about the Reactable
"The reactable hardware is based on a translucent, round multi-touch surface. A camera situated beneath the table, continuously analyzes the surface, tracking the player's finger tips and the nature, position and orientation of physical objects that are distributed on its surface. These objects represent the components of a classic modular synthesizer, the players interact by moving these objects, changing their distance, orientation and the relation to each other. These actions directly control the topological structure and parameters of the sound synthesizer. A projector, also from underneath the table, draws dynamic animations on its surface, providing a visual feedback of the state, the activity and the main characteristics of the sounds produced by the audio synthesizer."


The Bubblegum Sequencer: Making Music with Candy



Jabberstamp: Embedding Sound and Voice in Children's Drawings
(pdf)
(A TUI application to support literacy development in children)

Affective TouchCasting
(pdf)

TapTap: A Haptic Wearable for Asynchronous Distributed Touch Therapy
(pdf)

BodyBeats: Whole-Body, Musical Interfaces for Children
(pdf)

Telestory is a Siftables application that looks like it would be quite useful for supporting children who have communication disorders or autism spectrum disorders.

Telestory Siftables application from Jeevan Kalanithi on Vimeo.

"Telestory is an educational, language learning application created by Seth Hunter. In this video, the child is looking at a television screen. He can control onscreen characters, events and objects with the siftables. For example, he has the dog and cat interact by placing the dog and cat siftables next to each other."
TeleStory Project Website

Here is a video of how Siftables can be used as equation editors:


Siftables Equation Editor from Jeevan Kalanithi on Vimeo.

RESOURCES ABOUT TUI'S:


5 lessons about tangible interfaces, GDC Lyon, December 2007 (pdf), Nicolas Nova


Special Issue on Tangible and Embedded Interaction (Guest Editors: Eva Hornecker, Albrecht Schmidt, Brygg Ullmer), International Journal of Arts and Technology (IJART), Volume 1, Issue 3/4, 2008


Reality-Based Interaction: A Framework for Post-WIMP Interfaces (pdf)


Here are a couple of abstracts of articles related to the use of TUI's in education:

Evaluation of the Efficacy of Computer-Based Training Using Tangible User Interface for Low-Functioning Children with Autism. Proceedings of the 2008 IEEE International Conference on Digital Games and Intelligent Toys.

"Recently, the number of children having autism disorder increases rapidly all over the world. Computer-based training (CBT) has been applied to autism spectrum disorder treatment. Most CBT applications are based on the standard WIMP interface. However, recent study suggests that a Tangible User Interface (TUI) is easier to use for children with autism than the WIMP interface. In this paper, the efficiency of the TUI training system is considered, in comparison with a conventional method of training basic geometric shape classification. A CBT system with TUI was developed using standard computer equipment and a consumer video camera. The experiment was conducted to measure learning efficacy of the new system and the conventional training method. The results show that, under the same time constraint, children with autism who practiced with the new system were able to learn more shapes than those participating in the conventional method."

Towards a framework for investigating tangible environments for learning. Sara Price, Jennifer G. Sheridan, Taciana Pontual Falcao, George Roussos, London Knowledge Lab, 2008.

"External representations have been shown to play a key role in mediating cognition. Tangible environments offer the opportunity for novel representational formats and combinations, potentially increasing representational power for supporting learning. However, we currently know little about the specific learning benefits of tangible environments, and have no established framework within which to analyse the ways that external representations work in tangible environments to support learning. Taking external representation as the central focus, this paper proposes a framework for investigating the effect of tangible technologies on interaction and cognition. Key artefact-action-representation relationships are identified, and classified to form a structure for investigating the differential cognitive effects of these features. An example scenario from our current research is presented to illustrate how the framework can be used as a method for investigating the effectiveness of differential designs for supporting science learning"

Tangible User Interfaces Part I: Siftables

In 1997, the vision of tangible user interfaces, also known as TUI's, was outlined by Hiroshi Ishii and Brygg Ullmer of the Tangible Media Group at MIT, in their paper, "Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms" (pdf). According to this vision, "the goal of Tangible Bits is to bridge the gaps between both cyberspace and the physical environment, as well as the foreground and background of human activities." This article is a must-read for anyone interested in "new" interactive technologies.

The pictures in the article of the metaDesk, transBoard, activeLENS, and ambientRoom, along with the references, make this seminal work worth at least a glance.


Another must-read is Hiroshi Ishii's 2008 article, Tangible Bits: Beyond Pixels (pdf). In this article, Ishii provides a good overview of TUI concepts as well as the contributions of his lab to the field since the first paper was written.

Related to the Tangible User Interface research is the work of the Fluid Interfaces Group at MIT. The Fluid Interfaces Group was formerly known as the Ambient Intelligence Group, and many of the group's projects incorporate concepts related to TUI and ambient intelligence. According to the Fluid Interfaces website, the goal of this research group is to "radically rethink the human-machine interactive experience. By designing interfaces that are more immersive, more intelligent, and more interactive we are changing the human-machine relationship and creating systems that are more responsive to people's needs and actions, and that become true "accessories" for expanding our minds."

The Siftables project is an example of how TUI and fluid interface (FI) interaction can be combined. Siftables is the work of David Merrill and Pattie Maes, in collaboration with Jeevan Kalanithi, and was brought to popular attention through David Merrill's recent TED talk:

David Merrill's TED Talk: Siftables - Making the digital physical
-Grasp Information Physically

"Siftables aims to enable people to interact with information and media in physical, natural ways that approach interactions with physical objects in our everyday lives. As an interaction platform, Siftables applies technology and methodology from wireless sensor networks to tangible user interfaces. Siftables are independent, compact devices with sensing, graphical display, and wireless communication capabilities. They can be physically manipulated as a group to interact with digital information and media. Siftables can be used to implement any number of gestural interaction languages and HCI applications....
Siftables can sense their neighbors, allowing applications to utilize topological arrangement
..No special sensing surface or cameras are needed."
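To make the neighbor-sensing idea more concrete, here is a small, hypothetical Python sketch of how an application might chain tiles together from their pairwise neighbor reports and then interpret the arrangement, loosely in the spirit of the equation-editor demo mentioned in Part II. The Tile structure, message format, and evaluation rules are assumptions of mine, not the actual Siftables API.

```python
# Hypothetical sketch of an application built on neighbor-sensing tiles.
# Each tile reports which tile it senses on its right edge; the application
# chains those reports into an ordered sequence and evaluates it left to right.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Tile:
    tile_id: int
    symbol: str                     # what this tile currently displays, e.g. "3" or "+"
    right_neighbor: Optional[int]   # tile_id sensed on the right edge, or None

def order_tiles(tiles):
    """Reconstruct the left-to-right chain from pairwise neighbor reports."""
    by_id = {t.tile_id: t for t in tiles}
    has_left = {t.right_neighbor for t in tiles if t.right_neighbor is not None}
    # The leftmost tile is the one no other tile points to.
    current = next(t for t in tiles if t.tile_id not in has_left)
    chain = []
    while current is not None:
        chain.append(current)
        current = by_id.get(current.right_neighbor)
    return chain

def evaluate(chain):
    """Evaluate the chained symbols strictly left to right (no precedence)."""
    result = float(chain[0].symbol)
    for op_tile, num_tile in zip(chain[1::2], chain[2::2]):
        value = float(num_tile.symbol)
        if op_tile.symbol == "+":
            result += value
        elif op_tile.symbol == "-":
            result -= value
        elif op_tile.symbol == "*":
            result *= value
    return result

if __name__ == "__main__":
    tiles = [Tile(1, "3", right_neighbor=2),
             Tile(2, "+", right_neighbor=3),
             Tile(3, "4", right_neighbor=None)]
    print(evaluate(order_tiles(tiles)))   # -> 7.0
```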



Siftables Music Sequencer from Jeevan Kalanithi on Vimeo.

http://web.media.mit.edu/~dmerrill/images/music-against-wood-320x213.jpg

More about Siftables:

Rethinking display technology (Scott Kirsner, Boston Globe, 7/27/08)
TED: Siftable Computing Makes Data Physical
Siftables: Toward Sensor Network User Interfaces
(pdf)

It seems that people either really like the Siftables concept or they don't see the point. I found the following humorous critique of Siftables on YouTube:

"Imagine if all the little programs you had on your iphone were little separate chicklets in your pocket.
You'd lose em.
Your cat would eat em.
You'd vacuum them up.
They'd fall down in the sofa.
They'd be all over the car floor.
You'd throw them away by mistake..."


In my opinion, it is exciting that some of this technology has the potential to become mainstream.

Feb 23, 2009

YDreams: Interactive Experiences, Real Time Interaction with Augmented Reality Characters

YDreams is doing some interesting things. Watch the delight on this little girl's face as she plays with an avatar in mixed reality, viewed on a large display:


YDreams on Vimeo.

More from YDreams:
"...Flapi, YDreams' in-house mascot, and other virtual characters interact in real-time with a little girl and other physical obstacles in a new seamless augmented playground environment."

http://www.ydreams.com/ydreams_2005/images/contents/uploaded/Image/ylabs2(1).jpg
Photo from YDreams Lab
"YLabs’ main focus is on Reality Computing, which uses new technologies such as mobile computing, augmented reality and ubiquitous interactivity to bridge the distance between the user, information and the machine, in a physical, post-browser environment, where the real and the digital come together."

http://www.ydreams.com/ydreams_2005/images/contents/uploaded/Image/mbook1(1).jpg
This is a photo of YDreams' Architek yMagic Books. Architek is used to create interactive digital content, including children's storybooks that are manipulated on a touch screen.


This is a demonstration of Architek's yWalk, an immersive virtual playground that can be vertically projected onto soft mats and floors.

The Architek software provides information about user interaction. yWalk looks like it might be useful for occupational or physical therapists in their work with young children.

Interesting work!

Feb 22, 2009

Rich White's "Mobile Immersive Learning Lab" Project; EduSim Update

Rich White, an educational technologist at the Greenbush Education Service Center in Kansas, has been working with the 3D interactive virtual world EduSim for quite a while. He's taking EduSim to the next level.

"The
concept is one of an enclosed virtual learning space - with surrounded projection of the virtual learning world the students are exploring - similar to the StarLab Concept (with a rectangular configuration). along the lines of a CAVE - however simpler, mobile, and relatively in-expensive by comparison."

The project is at the beginning prototype stage.

Below is a demo of the virtual world as it is projected on two screens that are placed next to each other at a right angle, with the center of the virtual-world view positioned where the two screens meet:


(Demo images: digital_dome_01.jpg, real_cave.png, picture-2.png)
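For readers curious about the geometry, here is a tiny, illustrative Python sketch of how a two-screen corner projection could be driven: one virtual camera per physical screen, each yawed so the two rendered views meet exactly at the corner. The eye position and the 90-degree per-screen field of view are assumptions for illustration, not details from Rich White's prototype.

```python
# Illustrative sketch of the two-screen, right-angle ("corner") projection idea:
# one virtual camera per screen, yawed so the views join at the corner.
def corner_screen_cameras(center_yaw_deg=0.0, screens=2, fov_per_screen_deg=90.0):
    """Return a (yaw, horizontal FOV) description for each screen's camera.

    Assuming the eye sits equidistant from two screens joined at a right angle,
    each screen subtends 90 degrees, so the cameras sit at center_yaw +/- 45."""
    cameras = []
    for i in range(screens):
        offset = (i - (screens - 1) / 2.0) * fov_per_screen_deg
        cameras.append({"screen": i,
                        "yaw_deg": center_yaw_deg + offset,
                        "h_fov_deg": fov_per_screen_deg})
    return cameras

if __name__ == "__main__":
    # Two screens, view centered on the corner: cameras at -45 and +45 degrees.
    for cam in corner_screen_cameras():
        print(cam)
```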
This might be a great way of reaching students who have autism!


More about EduSim:

EduSim, for those of you who haven't seen my previous posts on the topic, is a multi-user 3D interactive environment used in classrooms with interactive whiteboards:




Information from Rich White's Greenbush blog about Edusim:

Wikipedia entry:
"Edusim is a Cave Automatic Virtual Environment based concept of lesson driven 3D virtual worlds on the classroom interactive whiteboard or classroom interactive surface. The Edusim concept is demonstrated by the Edusim free and open source multi-user 3D Open Cobalt virtual world platform and authoring tool kit modified for the classroom interactive whiteboard or surface. The Edusim application is a modified edition of the open source Open Cobalt Project and relies heavily on the affordances of direct manipulation of 3D virtual learning models and Constructionist Learning Principles."


History of Edusim:
"The Edusim project began in September 2007 at the Greenbush Education Service Center in Southeast
Kansas as an effort to bring an engaging 3D experience to the classroom interactive whiteboard. Pilot groups were established with 6th and 7th grade middle school students throughout Southeast Kansas to observe how students would be engaged through the software, and how the user interface would need to be augmented to account for the affordances of the whiteboard, and the usability of the students.
"

Here is a virtual world in Edusim, running on Cobalt, showing how a drag-and-drop function is used for in-world VNC application sharing:



The Cobalt 3D metaverse browser has been modified for multi-touch interaction by some of the members of RENCI, a collaborative venture of Duke University and several other North Carolina universities. The video below shows Dr. Xunlei Wu demonstrating how gesture and touch are used to manipulate items and navigate through two Cobalt virtual worlds:



Some of the members of RENCI built a multi-touch table in addition to the collaborative multi-touch wall. For more information:

RENCI: Multi-Touch Collaborative Wall and Table using TouchLib: More about UNC-C's Viz Lab


(Cross posted on the TechPsych blog.)