Feb 27, 2009

Tangible User Interfaces Part II: More Examples, Resources, and Uses for TUIs in Education

In Part I of my "mini-series" about Tangible User Interfaces, I discussed the origins of TUIs and provided some examples of Siftables. In this post, I've provided some links to information about Tangible User Interfaces and the abstracts of two articles pertaining to TUIs in educational settings.

Zen Waves: A Digital (musical) Zen Garden



reactable from Nick M. on Vimeo.

Reactable
http://upload.wikimedia.org/wikipedia/commons/e/e3/Reactable_Multitouch.jpg
More about the Reactable
"The reactable hardware is based on a translucent, round multi-touch surface. A camera situated beneath the table, continuously analyzes the surface, tracking the player's finger tips and the nature, position and orientation of physical objects that are distributed on its surface. These objects represent the components of a classic modular synthesizer, the players interact by moving these objects, changing their distance, orientation and the relation to each other. These actions directly control the topological structure and parameters of the sound synthesizer. A projector, also from underneath the table, draws dynamic animations on its surface, providing a visual feedback of the state, the activity and the main characteristics of the sounds produced by the audio synthesizer."

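The quote above describes the reactable's basic architecture: a camera under the table tracks fingertips and tagged objects, the objects represent synthesizer modules, and their arrangement on the surface defines the synthesis topology. Below is a minimal Python sketch of that idea, with hypothetical names throughout; it is not the reactable's actual implementation, just an illustration of mapping tracked objects into a synth graph by proximity.

```python
# A minimal sketch (hypothetical names, not the reactable's actual code) of the
# idea described above: each tangible tracked by the camera becomes a node in a
# synthesizer graph, and proximity between tangibles defines the signal routing.

from dataclasses import dataclass
from math import hypot

@dataclass
class TrackedObject:
    object_id: int   # marker id reported by the (assumed) camera tracker
    kind: str        # e.g. "oscillator", "filter", "output"
    x: float         # position on the table surface (normalized 0..1)
    y: float
    angle: float     # orientation, often mapped to a parameter such as frequency

def build_synth_graph(objects, max_link_distance=0.25):
    """Connect each non-output object to its nearest neighbor within range.

    Returns a list of (source_id, target_id) edges describing signal flow.
    """
    edges = []
    for src in objects:
        if src.kind == "output":
            continue  # the output node only receives signal
        others = [o for o in objects if o is not src]
        if not others:
            continue
        nearest = min(others, key=lambda o: hypot(o.x - src.x, o.y - src.y))
        if hypot(nearest.x - src.x, nearest.y - src.y) <= max_link_distance:
            edges.append((src.object_id, nearest.object_id))
    return edges

# One example frame from the (hypothetical) tracker:
frame = [
    TrackedObject(1, "oscillator", 0.30, 0.40, angle=1.2),
    TrackedObject(2, "filter",     0.48, 0.42, angle=0.3),
    TrackedObject(3, "output",     0.60, 0.45, angle=0.0),
]
print(build_synth_graph(frame))  # [(1, 2), (2, 3)]
```

In the real system, each frame's graph and the objects' orientations would then drive an audio synthesizer, with the projector drawing the corresponding connections on the surface.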

The Bubblegum Sequencer: Making Music with Candy



Jabberstamp: Embedding Sound and Voice in Children's Drawings
(pdf)
(A TUI application to support literacy development in children)

Affective TouchCasting
(pdf)

TapTap: A Haptic Wearable for Asynchronous Distributed Touch Therapy
(pdf)

BodyBeats: Whole-Body, Musical Interfaces for Children
(pdf)

Telestory is a Siftables application that looks like it would be quite useful for supporting children who have communication disorders or autism spectrum disorders.

Telestory Siftables application from Jeevan Kalanithi on Vimeo.

"Telestory is an educational, language learning application created by Seth Hunter. In this video, the child is looking at a television screen. He can control onscreen characters, events and objects with the siftables. For example, he has the dog and cat interact by placing the dog and cat siftables next to each other."
TeleStory Project Website

Here is a video showing how Siftables can be used as an equation editor:


Siftables Equation Editor from Jeevan Kalanithi on Vimeo.

RESOURCES ABOUT TUIs:


5 lessons about tangible interfaces, GDC Lyon, December 2007 (pdf), Nicolas Nova


Special Issue on Tangible and Embedded Interaction (Guest Editors: Eva Hornecker, Albrecht Schmidt, Brygg Ullmer), International Journal of Arts and Technology (IJART), Volume 1, Issue 3/4, 2008


Reality-Based Interaction: A Framework for Post-WIMP Interfaces (pdf)


Here are a couple of abstracts of articles related to the use of TUIs in education:

Evaluation of the Efficacy of Computer-Based Training Using Tangible User Interface for Low-Functioning Children with Autism, Proceedings of the 2008 IEEE International Conference on Digital Games and Intelligent Toys

"Recently, the number of children having autism disorder increases rapidly all over the world. Computer-based training (CBT) has been applied to autism spectrum disorder treatment. Most CBT applications are based on the standard WIMP interface. However, recent study suggests that a Tangible User Interface (TUI) is easier to use for children with autism than the WIMP interface. In this paper, the efficiency of the TUI training system is considered, in comparison with a conventional method of training basic geometric shape classification. A CBT system with TUI was developed using standard computer equipment and a consumer video camera. The experiment was conducted to measure learning efficacy of the new system and the conventional training method. The results show that, under the same time constraint, children with autism who practiced with the new system were able to learn more shapes than those participating in the conventional method."

Towards a framework for investigating tangible environments for learning, Sara Price, Jennifer G. Sheridan, Taciana Pontual Falcao, George Roussos, London Knowledge Lab, 2008

"External representations have been shown to play a key role in mediating cognition. Tangible environments offer the opportunity for novel representational formats and combinations, potentially increasing representational power for supporting learning. However, we currently know little about the specific learning benefits of tangible environments, and have no established framework within which to analyse the ways that external representations work in tangible environments to support learning. Taking external representation as the central focus, this paper proposes a framework for investigating the effect of tangible technologies on interaction and cognition. Key artefact-action-representation relationships are identified, and classified to form a structure for investigating the differential cognitive effects of these features. An example scenario from our current research is presented to illustrate how the framework can be used as a method for investigating the effectiveness of differential designs for supporting science learning"

Tangible User Interfaces Part I: Siftables

In 1997, Hiroshi Ishii and Brygg Ullmer of the Tangible Media Group at MIT outlined the vision of tangible user interfaces, also known as TUIs, in their paper, "Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms" (pdf). According to this vision, "the goal of Tangible Bits is to bridge the gaps between both cyberspace and the physical environment, as well as the foreground and background of human activities." This article is a must-read for anyone interested in "new" interactive technologies.

The pictures in the article of the metaDesk, transBoard, activeLENS, and ambientRoom, along with the references, make this seminal work worth at least a quick look.


Another must-read is Hiroshi Ishii's 2008 article, Tangible Bits: Beyond Pixels (pdf). In this article, Ishii provides a good overview of TUI concepts as well as the contributions of his lab to the field since the first paper was written.

Related to the Tangible User Interface research is the work of the Fluid Interfaces Group at MIT. The Fluid Interfaces Group was formerly known as the Ambient Intelligence Group, and many of the group's projects incorporate concepts related to TUI and ambient intelligence. According to the Fluid Interfaces website, the goal of this research group is to "radically rethink the human-machine interactive experience. By designing interfaces that are more immersive, more intelligent, and more interactive we are changing the human-machine relationship and creating systems that are more responsive to people's needs and actions, and that become true "accessories" for expanding our minds."

The Siftables project is an example of how TUI and fluid interface (FI) interaction can be combined.
Siftables is the work of David Merrill and Pattie Maes, in collaboration with Jeevan Kalanithi, and was brought to popular attention through David Merrill's recent TED talk:

David Merrill's TED Talk: Siftables - Making the digital physical
-Grasp Information Physically

"Siftables aims to enable people to interact with information and media in physical, natural ways that approach interactions with physical objects in our everyday lives. As an interaction platform, Siftables applies technology and methodology from wireless sensor networks to tangible user interfaces. Siftables are independent, compact devices with sensing, graphical display, and wireless communication capabilities. They can be physically manipulated as a group to interact with digital information and media. Siftables can be used to implement any number of gestural interaction languages and HCI applications....
Siftables can sense their neighbors, allowing applications to utilize topological arrangement... No special sensing surface or cameras are needed."
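To make the "sense their neighbors" idea concrete, here is a small Python sketch of the kind of neighbor-driven application logic the quote describes, using the Telestory dog-and-cat example from above. The API here is hypothetical, not the actual Siftables SDK.

```python
# A small sketch of neighbor-based application logic in the spirit of Siftables
# (hypothetical API, not the real Siftables SDK): each tile reports what it
# senses on each side, and the application reacts to particular adjacencies,
# e.g. Telestory's "dog placed next to cat" interaction.

NEIGHBOR_SIDES = ("left", "right", "top", "bottom")

class Tile:
    def __init__(self, name):
        self.name = name
        self.neighbors = {side: None for side in NEIGHBOR_SIDES}

def place_next_to(a, b):
    """Simulate setting tile b down against the right side of tile a."""
    a.neighbors["right"] = b
    b.neighbors["left"] = a

def on_neighbor_change(tile, rules):
    """Fire any rule whose pair of tile names is now adjacent."""
    for side, other in tile.neighbors.items():
        if other is None:
            continue
        pair = frozenset({tile.name, other.name})
        if pair in rules:
            rules[pair](tile, other, side)

# Application-level rule: dog + cat adjacency triggers an on-screen event.
rules = {
    frozenset({"dog", "cat"}):
        lambda a, b, side: print(f"{a.name} meets {b.name} (on {a.name}'s {side})"),
}

dog, cat = Tile("dog"), Tile("cat")
place_next_to(dog, cat)
on_neighbor_change(dog, rules)   # prints: dog meets cat (on dog's right)
```

The same pattern (tiles report adjacency events, applications map particular arrangements to behaviors) is what lets one hardware platform support storytelling, music sequencing, and equation editing.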



Siftables Music Sequencer from Jeevan Kalanithi on Vimeo.

http://web.media.mit.edu/~dmerrill/images/music-against-wood-320x213.jpg

More about Siftables:

Rethinking display technology (Scott Kirsner, Boston Globe, 7/27/08)
TED: Siftable Computing Makes Data Physical
Siftables: Toward Sensor Network User Interfaces
(pdf)

It seems that people either really like the Siftables concept, or they don't see the point. I found the following humorous critique of Siftables on YouTube:

"Imagine if all the little programs you had on your iphone were little separate chicklets in your pocket.
You'd lose em.
Your cat would eat em.
You'd vacuum them up.
They'd fall down in the sofa.
They'd be all over the car floor.
You'd throw them away by mistake..."


In my opinion, it is exciting to learn that some of this technology has the potential to become mainstream.

Feb 23, 2009

YDreams: Interactive Experiences, Real Time Interaction with Augmented Reality Characters

YDreams is doing some interesting things. Watch the delight on this little girl's face as she plays with an avatar in mixed reality, viewed on a large display:


YDreams on Vimeo.

More from YDreams:
"...Flapi, YDreams' in-house mascot, and other virtual characters interact in real-time with a little girl and other physical obstacles in a new seamless augmented playground environment."

http://www.ydreams.com/ydreams_2005/images/contents/uploaded/Image/ylabs2(1).jpg
Photo from YDreams Lab
"YLabs’ main focus is on Reality Computing, which uses new technologies such as mobile computing, augmented reality and ubiquitous interactivity to bridge the distance between the user, information and the machine, in a physical, post-browser environment, where the real and the digital come together."

http://www.ydreams.com/ydreams_2005/images/contents/uploaded/Image/mbook1(1).jpg
This is a photo of YDreams' Architek yMagic Books. Architek is used to create interactive digital content, including children's storybooks that are manipulated on a touch-screen.


This is a demonstration of Architek's yWalk, an immersive virtual playground that can be vertically projected onto soft mats and floors.

The Architek software provides information about user interaction. yWalk looks like it might be useful for occupational or physical therapists in their work with young children.

Interesting work!

Feb 22, 2009

Rich White's "Mobile Immersive Learning Lab" Project; EduSim Update

Rich White, an educational technologist at the Greenbush Education Service Center in Kansas, has been working with the 3D interactive virtual world EduSim for quite a while. He's now taking EduSim to the next level.

"The concept is one of an enclosed virtual learning space - with surrounded projection of the virtual learning world the students are exploring - similar to the StarLab concept (with a rectangular configuration), along the lines of a CAVE - however simpler, mobile, and relatively inexpensive by comparison."

The project is at the beginning prototype stage.

Below is a demo of the virtual world as it is projected on two screens that are placed next to each other at a right angle, with the center of the virtual-world view positioned where the two screens meet:


This might be a great way of reaching students who have autism!


More about EduSim:

EduSim, for those of you who haven't seen my previous posts on the topic, is a multi-user 3D interactive environment used in classrooms with interactive whiteboards:




Information from Rich White's Greenbush blog about Edusim:

Wikipedia entry:
"Edusim is a Cave Automatic Virtual Environment based concept of lesson driven 3D virtual worlds on the classroom interactive whiteboard or classroom interactive surface. The Edusim concept is demonstrated by the Edusim free and open source multi-user 3D Open Cobalt virtual world platform and authoring tool kit modified for the classroom interactive whiteboard or surface. The Edusim application is a modified edition of the open source Open Cobalt Project and relies heavily on the affordances of direct manipulation of 3D virtual learning models and Constructionist Learning Principles."


History of Edusim:
"The Edusim project began in September 2007 at the Greenbush Education Service Center in Southeast Kansas as an effort to bring an engaging 3D experience to the classroom interactive whiteboard. Pilot groups were established with 6th and 7th grade middle school students throughout Southeast Kansas to observe how students would be engaged through the software, and how the user interface would need to be augmented to account for the affordances of the whiteboard, and the usability of the students."

Here is a virtual world in Edusim, running on Cobalt, showing how a drag-and-drop function is used for in-world VNC application sharing:



The Cobalt 3D metaverse browser has been modified for multi-touch interaction by some of the members of RENCI, a collaborative venture of Duke University and several other North Carolina universities. The video below shows Dr. Xunlei Wu demonstrating how gesture and touch are used to manipulate items and navigate through two Cobalt virtual worlds:



Some of the members of RENCI built a multi-touch table in addition to the collaborative multi-touch wall. For more information:

RENCI: Multi-Touch Collaborative Wall and Table using TouchLib: More about UNC-C's Viz Lab


(Cross posted on the TechPsych blog.)

Jonathan Jarvis: Crisis of Credit Animated Short; Interactive Oracles

Jonathan Jarvis created a series of animated shorts as a project for his work as a graduate student in the Media Design Program at Art Center College of Design. He started exploring the concept of system diagrams and integrating them into motion interactions.


The Crisis of Credit Visualized from Jonathan Jarvis on Vimeo.


(Note: Someone commented about the negative way the family who represents sub-prime mortgage holders was depicted in the short.)

Related Economic Sounds:

The short was influenced by information from the following "This American Life" radio broadcasts.

Click on the following links to listen to the broadcasts:

Another Frightening Show About the Economy
Transcript (pdf)
The Giant Pool of Money
Transcript (pdf)


More about Jonathan Jarvis:

Crisis of Credit Project Page
The back story behind Jonathan's work on the Crisis of Credit Project, with storyboard scenes and his research sketch of the Crisis of Credit system diagram.
Jonathan's Website
Jonathan's Global Storytelling Project (pdf)

Jonathan worked with a team on concept development, interface design, and content development for a multi-touch project, Interactive Oracles, for Acura:
http://www.madein.la/featuredprojects/interactiveoracles/wp-content/uploads/madeinla_interactiveoracles_intro.jpg

(Some of the information is cross-posted on the Economic Sounds and Sights blog.)

Feb 20, 2009

More Multi-Touch and Surface Computing...

The concept of multi-touch/gesture/surface computing is spreading.

Here's more evidence:

Panasonic Touch Air Hockey


The game was demonstrated at ISE 2009 (Integrated Systems Europe) in Amsterdam. The interface was developed by UI Centric, a company based in Soho, London.

Microsoft's SurfaceWare at the Tangible and Embedded Interaction Conference (TEI 2009):

SurfaceWare is a level-sensing application that alerts waitstaff when glasses need refilling.
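In rough terms, the alerting logic could be as simple as a threshold check over sensed fill levels. The sketch below is only an illustration with hypothetical names, not Microsoft's implementation.

```python
# A rough illustration (hypothetical names, not Microsoft's implementation) of
# level-sensing alerts: poll a fill level for each glass on the table and flag
# any glass that drops below a refill threshold.

REFILL_THRESHOLD = 0.20   # flag a glass when it is less than 20% full

def glasses_needing_refill(glass_levels, threshold=REFILL_THRESHOLD):
    """glass_levels maps a glass id to a fill fraction between 0.0 and 1.0."""
    return [glass_id for glass_id, level in glass_levels.items() if level < threshold]

# Example reading from the (assumed) sensing layer:
reading = {"table3-glass1": 0.85, "table3-glass2": 0.12, "table3-glass3": 0.05}
print(glasses_needing_refill(reading))   # ['table3-glass2', 'table3-glass3']
```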


http://photos-e.ak.fbcdn.net/photos-ak-snc1/v2385/101/124/727430870/n727430870_2768500_5763.jpg
Photos from Nachiket Apte, via Ru Zarin

More to come...

Feb 18, 2009

Ready for the SMART Table?

The SMART Table is now available for purchase!



Here is the plug:

"The world's first multitouch, multiuser table for primary education - the SMART Table - is now available for purchase."
Order the SMART Table

"As a collaborative learning center, the SMART Table enables engaging and motivating small-group learning experiences. Up to eight students can use their fingers intuitively to sweep, slide and spin objects on the interactive screen. The SMART Table's ready-made activities help primary students gain and further their skills in areas like counting and reading."

"The SMART Table also makes an ideal complement to whole-class activities on the SMART Board interactive whiteboard. It helps reinforce concepts in a small-group setting and ensures students can participate in interactive and creative learning experiences."

(Cross-posted on the TechPsych and Technology-Supported Human-World Interaction blogs.)

Feb 15, 2009

Interactive Displays 2009 Conference: Tuesday, April 21 - Thursday, April 23, Hilton San Jose, California

The Interactive Displays Conference, sponsored by Intertech Pira, will highlight an interesting mix of existing and emerging interactive display technologies and applications. The conference will be held at the Hilton in San Jose, California, from Tuesday, April 21st through Thursday, April 23rd.

The pre-conference seminar will feature Sakuya Morimoto, of CANESTA, who will present his company's innovative single-chip 3D image sensor technology that supports gesture interaction.
Keynote speakers will be Jeff Han, of Perceptive Pixel, and Steven Bathiche, of Microsoft US.

Some Highlights:

Pre-conference Seminar: Gesture Navigation in the World of Digital Contents, Enabled by a Single-Chip 3D Image Sensor. Presenter: Sakuya Morimoto, Senior Director, Business Development in Asia, CANESTA, Japan

Related:
Hitachi at CES 2009: Use of Canesta's 3D sensor to control television and home systems using hand gestures.



"With the wave of a hand, with the shake of a hand, you can control volume, you can actually change the channels, watch your favorite program...the most exciting thing, I think, is that you can actually control your temperature and the lighting in the room, the environmental lighting. So..it is very unique technology that is out there.."

Another demonstration of Hitachi's gesture interaction using Canesta's 3-D depth camera:



When a TV Remote is Just Too Much Effort, Wave - Jennifer Bergen, PC Magazine
CANESTA Corporate Fact Sheet (pdf)

How does Canesta's Electronic Perception Technology Work?
"Canesta’s electronic perception technology forms 3-D, real time moving images in a single chip through patented methods which use light photons to “range” the image, similar to radar. The silicon sensor chip develops 3-D depth maps at a rate in excess of 30 frames per second, and then performs additional processing on these depth maps to resolve the images into application specific information that can easily be processed by embedded processor(s) in the end-use device or machine. Since Canesta’s software starts with a three-dimensional view of the world, provided immediately by the hardware, it has a substantial advantage over classical image processing software that struggles to construct three-dimensional representations using complex mathematics, and images from multiple cameras or points of view. This dramatic reduction in complexity makes it possible to embed the processing software directly into the chips themselves so they may be used in the most cost-conscious applications."

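As a rough illustration of what "resolving depth maps into application-specific information" can mean, here is a hypothetical Python sketch (not Canesta's code): it finds the centroid of the nearest object in each depth frame and treats a large horizontal sweep of that centroid across frames as a wave gesture, in the spirit of the Hitachi demo above.

```python
# A simplified sketch (hypothetical code, not Canesta's) of turning depth-map
# output into application-specific information: find the centroid of everything
# closer than a cutoff (presumably the user's hand) in each frame, then treat a
# large horizontal sweep of that centroid across frames as a "wave" gesture.

NEAR_CUTOFF_MM = 600   # treat anything closer than 60 cm as the hand

def hand_centroid(depth_frame, cutoff=NEAR_CUTOFF_MM):
    """depth_frame is a 2D list of distances in millimeters.

    Returns the (row, col) centroid of near pixels, or None if nothing is close.
    """
    row_sum = col_sum = count = 0
    for r, row in enumerate(depth_frame):
        for c, depth in enumerate(row):
            if depth < cutoff:
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None
    return row_sum / count, col_sum / count

def detect_wave(centroids, min_sweep_cols=3):
    """Crude gesture rule: the hand centroid sweeps far enough horizontally."""
    cols = [point[1] for point in centroids if point is not None]
    return len(cols) >= 2 and abs(cols[-1] - cols[0]) >= min_sweep_cols

# Three toy 4x8 frames in which the near region drifts to the right:
def frame_with_hand_at(col):
    frame = [[2000] * 8 for _ in range(4)]   # background about 2 m away
    for r in range(1, 3):
        frame[r][col] = 400                  # a small near blob at this column
    return frame

frames = [frame_with_hand_at(c) for c in (1, 4, 7)]
print(detect_wave([hand_centroid(f) for f in frames]))   # True
```

The point of the quoted passage is that because the hardware hands the software a depth map directly, even simple per-frame processing like this can be pushed down into the chip rather than reconstructed from multiple 2D cameras.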


I will highlight some of the featured presentations in future blog posts:

Steven Bathiche, Director of Research, Applied Sciences Group, Entertainment and Devices Division, MICROSOFT, US
Guillaume Largillier, Chief Strategy Officer and Co-Founder, STANTUM, France
Jeff Han, PERCEPTIVE PIXEL, US
Mark Fihn, Publisher, VERITAS ET VISUS, US
Derek Mitchell, Conference Producer, INTERTECHPIRA, US
Vinita Jakhanwal, Principal Analyst, Small/Medium Displays, ISUPPLI CORPORATION, US
Joseph Carsanaro, President and CEO, F-ORIGIN, US
Tommi Ilmonen, CEO, MULTITOUCH OY, Finland
Stephen Sedaker, Director of Component Sales, WACOM TECHNOLOGY CORPORATION, US
Brad Gleeson, Managing Director, Business Development, TARGETPATH GLOBAL LLC., US
Henry Kaufman, President and Founder, TACTABLE, US
Christophe Ramstein, Chief Technology Officer, IMMERSION CORPORATION, US
Mary Lou Jepsen, CEO, PIXEL QI, US
John Newton, Chief Technology Officer, NEXTWINDOW, New Zealand
Herve Martin, CEO, SENSITIVE OBJECT, France
Scott Hagermoser, Gaming Business Unit Manager, 3M TOUCH SYSTEMS, US
Bob Cooney, Vice President, Business Development, ECAST, US
Brent Bushnell, Chief Technology Officer, UWINK, US
Stephan Durach, Head, Technology Office, BMW GROUP, US
Jeff Doerr, Senior Manager, Business Development, Self Service Solutions Group, FLEXTRONICS, US
Andy Wilson, Senior Researcher, Adaptive Systems and Interaction Group, MICROSOFT, US
Mats W. Johansson, Chief Executive Officer, EON REALITY, US
Lenny Engelhardt, Vice President for Business Development, N-TRIG, Israel
Dr Paul Diefenbach, Director, RePlay Lab, DREXEL UNIVERSITY, US
Andrew Hsu, Technical Marketing and Strategic Partnerships Manager, SYNAPTICS, US
Dean LaCoe, Business Development Manager, GESTURETEK, Canada
Keith Pradhan, Global Director of Product Management, TYCO ELECTRONICS, ELO TOUCHSYSTEMS, US
Jerry Bertrand, Managing Member/Acting CEO, MICROSCENT, LLC, US
Frederic Kaplan, CEO and Co-Founder, OZWE, Switzerland


Related

Visionary Jeff Han and Microsoft's Steven Bathiche to Keynote at Interactive Displays 2009

Feb 11, 2009

Update on Accessibility and Interactive Games

It has been a while since I shared information about accessible games. If you are a parent of a child or teen with a disability, if you have a disability, or if you hope to keep on gaming through your golden years no matter what ails you, keeping an eye on innovations in this field is worth your time.

The following descriptions are from the IGDA Game Accessibility Special Interest Group blog:

Global Assistive Technology Wiki - AbilityNet GATE

Via One Switch:
"AbilityNet have put together a wonderful open project called the Global Assistive Technology Wiki or GATE for short. In their words:

"GATE is actually a Wiki, which is a piece of server software that allows users to freely create and edit Web page content using any Web browser. It's a little like Wikipedia, but just concentrating on assistive technology. GATE is very simple to use, with a control panel enabling you to add content and more. More about Wikis . . .

This wiki has been created by AbilityNet, the UK's largest provider of advice and information on all aspects of Access to technology. The purpose of the wiki is to provide live and up to date information on all aspects of Assistive Technology."

"A really good place to start is their Switch Systems entry here. They are lacking their own accessible games section so hopefully someone (maybe me) will take up the gauntlet for that soon."

https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj30Koft1v8scNw2p41F_QGj8QHtKhsR1lVpMprgi11ZY4HStzVZ-P8ijIbocjl-zXNCb3kYn-fSATlUIEgzFZ3ztf_3W027mXg1sI-BCosmeWtHh4eCIl0_t8rWdP780LoMtsh/s400/MysticMine.jpg

Mystic Mine Multi-player One Switch Game

"Just released this February - Mystic Mine is now available to buy on-line at Koonsolo for $19.95 (use www.xe.com for a currency conversion). It is massive fun as a multi-player game and highly recommended by OneSwitch.org.uk. If I had an all time top 10 list of one-switch games this would be a strong contender to make the list. Free demo version available here. Sweet." -posted by One-Switch Games on the Game Accessibility blog


https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhv3B_Hqs9ryh6d_EATZlU8HX15WwKDJuCi5UjjKNP-0kjhNzIYMQcydGNWir1y-UL3rztRHFyEfm54AR6lo1NbTX_dClTy0qvfZ_kyfrOfe4SilHaHLMovEpeKBurgxOPGHRrj3A/s400/Sip-Puff.jpg

"
Pantech is one of three giant mobile phone manufacturers in Korea, among them Samsung and LG. Pantech sold over 10 million phones internationally in 2008, and under the "Sky" brand in the local market. Now, it's about to launch a blow-controlled mobile phone, the IM-S410K, which is also known as the Sky Wind." "Looks like it's got potential for some fun accessible games using sip/puff control." Link Via: Thomas Westin at IGDA GASIG Mailing List, via OneSwitch Games


Stevie Wonder calls for accessible technology


"Stevie Wonder is calling for greater access in technology: '[technology] being more accessible is always a plus and I think really, for various companies ... making it exciting and accessible for people who can see, it would take very little to make it accessible to everyone. So I encourage all the manufacturers to do that.'"

"When you can ... make it accessible and make it possible, you should just include that in the overall picture."
Link via: Mike Taylor of Excitim.


Accessible Games Controller Video



Games for Health Trailer



http://www.oneswitch.org.uk/IMAGES/ads/GASIGblog.gif

http://gamescc.rbkdesign.com/images/GamesCC_logo_579x180.jpg

http://www.game-accessibility.com/pics/artwork/gafullbannerv1.jpg

GASIG Links
Game Accessibility Forums
Other Links from the GASIG blog: