Mar 22, 2009

Boxee and Digital Convergence

This post is an update to a previous post, Digital Convergence & Interactive Television.

Have you heard about Boxee?  I first learned of it today when I came across an interesting post on the Boxee Blog about a debate with Mark Cuban, author of the Blog Maverick post, Why Do Internet People Think Content People Are Stupid?

This debate led me to the following related posts:
Boxee CEO: Consumers Will Get a la Carte Online
(Ryan Lawler, Contentinople: Networking the Digital Media Industry)
Bewkes Defends TV Everywhere (Post by Georg Szalai in The Hollywood Reporter on how Time Warner would let subscribers view cable TV network content online.)

What is boxee? Sort of a social-network-internet-cable-TV-Wii application....
"On a laptop or connected to an HDTV, boxee gives you a true entertainment experience to enjoy your movies, TV shows, music and photos, as well as streaming content from websites like Netflix, CBS, Comedy Central, Last.fm, and flickr."

quick intro to boxee from boxee on Vimeo.

From the boxee blog:
"boxee is the developer of the first “social” media center. boxee plays media from your computer and other devices in your home network, as well as connect you to various Internet sources that allow you to stream or download movies, tv shows, music and photos. boxee is based on the xbmc open-source project. we have been working with team-xbmc since early 2007.
We are in the process of alpha testing boxee, so this is why you need to get an invite to participate.. sorry. the alpha is for Mac OS X 10.4 (and above), Ubuntu and Apple TV (a Windows version will come out towards the end of the year)."

Mar 18, 2009

Interview of Henry Jenkins on Games-Based Learning and Importance of Collaboration, Learning Ecology, and Media Literacy



Henry Jenkins will be moving from MIT to USC's Annenberg School of Communication and the Cinematic Arts School. He'll be teaching courses such as "Transmedia Storytelling" and "New Media Literacies".

"...there is a learning ecology now, one that takes place outside of the classroom in the after school world...and right now, schools are cutting themselves out of the learning ecology by blocking games, by blocking YouTube, by putting filters on the computers. They block off ways the students are technologically connected, from the best ways of learning..and they leave those students who are trapped behind the participation gap from having access to the experiences that prepare the technically literate for the future."

"..Good teachers are fighting a valiant battle just to be able to access the materials of YouTube .. the other day we discovered that students could not access online resources about Moby Dick, the great American novel, because it had the word "Dick" in it."


My high school blocks websites about games. TeacherTube is blocked, too!

In the above video, a school that integrates the use of games within the curriculum is mentioned. The school is Quest to Learn, scheduled to open for the 2009-2010 school year. It is known as "New York's school for today's digital kids".

http://farm4.static.flickr.com/3341/3328403307_a55563628d_m.jpg

"Quest supports a dynamic curriculum that uses the underlying design principles of games to create highly immersive, game-like learning experiences for students. Games and other forms of digital media also model the complexity and promise of “systems.” Understanding and accounting for this complexity is a fundamental literacy of the 21st century."




Institute of Play (A partner of the Quest to Learn school)
Quest to Learn Press Links

A Win-Win Scenario: "Game School" Aims to Engage and Educate
Eliza Strickland, Wired 8/6/07

More for Multi-touch: NextWindow Plug-in for Natural User Interface's Snowflake Multi-touch Software - and more.



Those of you who have an HP TouchSmart, a Dell Studio One PC, or a NextWindow display might be interested in the new NUI plug-in that supports the NUI Suite Snowflake software. Here are the features of the plug-in, according to information from the Natural User Interface website:
  • Detailed user manual included with FAQ
  • Developed on fast and reliable C++ platform
  • Intuitive
  • Customizable
  • Gesture recognition library
  • TUIO/OSC (Open Sound Control) support (sending and receiving events)
  • Low level API
  • Hardware accelerated rendering
  • Support for wide variety of media types
  • Advanced window handler that supports scaling and rotation
  • Suitable for Windows® XP and Windows® Vista (Mac OSX and Linux can be developed on request)
  • Audio support
  • Single, dual support
  • Multi-threaded resource handler (For fast data visualization)

"NUI has partnered up with NextWindow™, an international leader in the development of optical multi-touch technology and the manufacturer of optical multi-touch screens, overlays and OEM touch components."

"NextWindow™'s integrated technology allows for natural and intuitive interaction of digital content on flat TFT, LCD and Plasma solutions."

"The NUI NextWindow™ plug-in can be used with any programming language that supports TUIO, i.e. C/C++/C#, Java, Flash, Python, VVVV etc, meaning that software developers can run their own applications on NextWindow™, utilizing the NUI NextWindow™ plug-in."

Comment:
I became a fan of NextWindow touch-screen displays in early 2007 when I worked on a couple of touch-screen projects in my HCI and Ubicomp classes at UNC-Charlotte.


I've been using my HP TouchSmart PC at work with students with disabilities. I'm experimenting with the NUI Suite SnowFlake on my TouchSmart, and found that interacting with the Particles application delighted students with severe autism. The activities provided opportunities to establish joint attention. I also noticed an increase in the number of vocalizations and/or verbalizations among the students. Of course, this was NOT a scientific study.

RELATED
Definition of Joint Attention from UConn:

"Joint Attention is the process of sharing one’s experience of observing an object or event, by following gaze or pointing gestures. It is critical for social development, language acquisition, cognitive development…"

http://eigsti.psy.uconn.edu/jt_attn.JPG


Establishing joint attention is an important step in the development of social interaction skills among young people who have autism spectrum disorders.

More about joint attention:

Joint Attention Study Has Implication for Understanding Autism
Science Daily, 9/29/07

Asperger-Advice: Joint Attention

Autism Games: Joint Attention and Reciprocity

Why is joint attention a pivotal skill in autism?
Tony Charman
Philos Trans R Soc Lond B Biol Sci. 2003 February 28; 358(1430): 315–324.
doi: 10.1098/rstb.2002.1199.

Mar 13, 2009

User Interface and Digital Cameras: Gizmodo's Detailed Visual Tour; PMA 2009


I just love it when someone takes the time to do a detailed review of a broad spectrum of user interfaces, with loads of comments and pics. Matt Buchanan, of Gizmodo, has done so in his recent post, Click: A Visual Tour of Camera Interfaces. Matt has a nice section dedicated to cameras with touchscreen user interfaces, with all of the touch-screen pictures in one spot. He liked the Sony camera touch screen UI the best.

I liked one of the comments to Matt's post:

"Yeah, but when will they come out with a UI that my grandmother can use on her own without calling me every d--- time she wants to know how to turn on the camera and take a picture?"
-someoneUKno

Someday.

More about cameras: PMA 2009

Mar 12, 2009

Dell's All-In-One Studio One 19, With Optional Multi-touch Technology Released in Japan

http://cache.gawker.com/assets/images/gizmodo/2009/03/dellstudioone.jpg
Via BusinessWire
http://i.i.com.com/cnwk.1d/i/bto/20090311/DellStudioOne19desktop_610x457.JPG
Photo via Rafe Needleman/CNET


Dell's Studio One 19 All-in-One System Fits Anywhere in the Home
(BusinessWire)

Here are the specs from the press release:

  • Easy multi-touch photo editing, slideshow creation, playlist compilation, notes, and even web browsing.
  • Unleash creativity with You Paint finger painting software.
  • Record videos and upload directly to YouTube with the touch of a finger.
  • Flick to Flickr – Upload photos to Flickr to share with family and friends.
  • Create a musical masterpiece with the multi-touch percussion center.

†Software is optional and works with multi-touch configurations only.

Power & performance:

  • Intel® Celeron, Dual Core Celeron, Pentium Dual Core, Core 2 Duo, and Core 2 Quad Core Processor options
  • Choice of nVidia GeForce 9200 or GeForce 9400 integrated graphics[i]
  • Up to 4GB[ii] dual channel memory
  • Up to 750GB[iii] HDD
  • Slot load Optical drive
  • 7-in-1 media card reader, six USB ports
  • Optional integrated wireless, web camera, Blu-ray Disc™
  • Optional multi-touch capability
  • Optional facial recognition security (with webcam)

According to Warner Crocker of GottaBeMobile, the Studio One All-in-One will be available in the U.S. later this spring, with the non-multi-touch version starting at around $700.

I'll post more information about this soon!

Update

Here are a few more pics of the Studio One, via Darren Gladstone, PC World:

http://images.pcworld.com/news/graphics/161113-P1020787_350.JPG


http://images.pcworld.com/news/graphics/161113-P1020795_350.JPG

Multi-touch Drum Application on the Dell Studio One 19

Extensive PC World Review:

Dell Studio One 19: All-in-One Stunner Takes Japan

Update:

After I wrote this post, I received a comment from Nicolas (see below). If you are interested in this sort of interaction, take a look at lm3lab's touchless interaction. No fingerprints!

Mar 9, 2009

Digital Convergence & Interactive Television

The idea of interactive television was born long before the Internet era, but it never really took hold.

Why? Television programming was designed to be the opposite of interactive. The medium centered on lulling viewers into passive submission, with mesmerized minds wide open to the influence of entertainers, talking heads, and commercials. All of this helped to perpetuate our growing consumer economy, which was not really a bad thing, right?

It appears that interactive television is re-emerging. Today, DISH Network announced the premiere of HISTORY Interactive, "an enhanced 24/7 interactive (iTV) experience." The offering is a collaboration between HISTORY and Ensequence; DISH Network customers with an OpenTV-enabled receiver can watch the Battles BC series beginning Monday, March 9 (ET/PT).

So now what?

To get a better understanding of this concept, I dug up some information and found myself somewhat entertained by the process. Take a look:

Interactive Television: A Short History (Interactive Television Alliance)
(Scroll down to the history section)

A "must-see" gem from 1998:
http://www.pbs.org/opb/crashcourse/3.gif
Welcome to Digital TV: A Cringely Crash Course, Robert X. Cringely, 1998, PBS Online

This interactive presentation includes a nice overview of the history of television. It also includes a letter written in 1998 by the late Fred Rogers, of Mr. Roger's Neighborhood. I love this quote:

"...Imagine how much more meaningful any television can be when children have a caring person sitting right there beside them ... someone who wants to listen to their questions or comments ... someone who encourages their careful looking and listening and learning! That's what I call "interactive."

"We're glad to be your neighbors, and we applaud all the "interactive" ways you and your family are using television." -Fred Rogers

http://www.elsevier.com/framework_products/images/48/675948.gif
Interactive Television Production Mark Gawlinski, 2003

The Road to Convergence: Network Transformation and IP
David Russell, Converge Digest, 5/17/06

Development and Current Issues of Interactive Television in the UK
pdf Barbara Katz, 2004(?)

Blog: bitdamaged - Mike Ryan, Interactive Television Specialist

Translation Please: Broadband cable TV technology explained by Leslie Ellis
Leslie's blog is a treasure trove of technical information related to trends in broadband television. The information on the blog is well-organized, and newer technologies are tagged as "Translation Please 2.0". Here are a few of Leslie's posts that I found interesting:

Translation Please 2.0: Digging Deeper into DSG
06/02/08
A Wireless Decoder for Wired People 7/28/08
What's Up in the Upstream 2/23/09
Widget World (Widgets on your Interactive TV)

RELATED
Another Gem
for techies and the tech-curious:
OEDN: The OCAP/EBIF Developer Network
"Founded in 2007, the mission of the OCAP/EBIF Developer Network (OEDN) community is to drive awareness of and development efforts using the two primary interactive cable television open standards for middleware: OCAP (known to consumers as tru2way) and EBIF."

"As interactive television application development for cable is a (relatively) young field, the initial focus of OEDN is on sharing information and facilitating communication between those "in the know" and those who are new to interactive television development - especially academic researchers and university students. As the community grows and its needs mature, this site will support deeper collaboration.
"

Update to this post, including information about boxee

Mar 6, 2009

Interaction with the Web of Things; LIFT '09

I recently came across the Web of Things blog and found it to be a gem. Vlad Trifa and Dominique Guinard are the brains behind the blog. They have lots of interesting ideas that are at the intersection of WSN (Wireless Sensor Networks) and HCI (Human Computer Interaction).

Vlad and Dominique recently presented at LIFT '09. If you haven't heard of Lift, take some time and visit the website:

"LIFT is a series of events to inspire and connect the community of doers and thinkers exploring the social impact of new technologies. Each LIFT conference is a three days experience made of talks, workshops, interactive art and discussions to understand and anticipate the most important social changes, and meet the people behind them."


Everything on the Web of Things blog inspired me to write two posts, back to back, on my Technology Supported Human World Interaction blog:

More cool things from the Web of Things blog


The Web of Things, Wireless Sensor Networks, Embedded Systems, and (Everyware) Health Care
(The above post was inspired by my experience at the Cleveland Clinic, spending time every day with my father, who has been in the cardio ICU since his surgery a few days ago.)

Mar 4, 2009

Microsoft and the Future of Interaction

Over the past week or so, I've heard quite a bit about Microsoft's vision for the future, across a variety of domains.

To get started, take a look at the following video from CES 2009: Microsoft Future Products Demo


Take a look at the Microsoft Office Labs Vision 2019 video, presented by Microsoft's Business Division president Stephen Elop at the Wharton Business Conference, via the istartedsomething blog:

Video: Future Vision Montage
http://video.msn.com/?mkt=en-GB&playlist=videoByUuids:uuids:a517b260-bb6b-48b9-87ac-8e2743a28ec5&showPlaylist=true&from=shared


Stephen Elop's Keynote Powerpoint Presentation Link

The Microsoft Office Labs Vision 2019 video sparked an interesting discussion in the comments section of the istartedsomething blog post.

There is more! Coldwell Banker will be using a customized home searching application using the technology of Microsoft Surface:




Futuristic Microsoft in the News:


Microsoft aims to turn PCs into personal assistants, teachers (or robot healers).
3/3/09, Byron Acohido, USA Today

Microsoft Mapping Course to a Jetsons-Style Future
3/2/09, Ashlee Vance, New York Times


Photo by Stuart Isett for The New York Times "Eric Horvitz, left, and Dan Bohus of Microsoft with the prototype of a virtual assistant that can understand its surroundings"


Stuart Isett for The New York Times "Hrvoje Benko demonstrating a Microsoft projection system that lets people manipulate large video images with their hands"
Yet another video:
Microsoft Research: A look at tomorrow's health solutions today: Part I
Laura Foy, 8/19/08


"In this special two-part video edition of House Calls for Healthcare Professionals, Bill Crounse, MD, visits with researchers at Microsoft Research. Each program reviews three promising areas of research that may one day lead to solutions with a direct or indirect application to health and healthcare. Viewers will gain insight to advanced ideas and technologies now in the labs at Microsoft Research long before they find their way into future products, solutions, or applications."



Surface Computing in Health Care: VitraView from InterKnowlogy:



Here is a 2008 video from Microsoft: Office Labs: Future of Personal Health Concept



Interesting concepts, but will they translate to the real world? Time will tell.

RELATED

Microsoft HealthVault Beta
3D Multi Touch Application for Heart Surgeries - Microsoft Surface and Health Vault
Microsoft Research Blogs

Feb 27, 2009

Tangible User Interfaces Part II: More Examples, Resources, and Use for TUI's in Education

In Part I of my "mini-series" about Tangible User Interfaces, I discussed the origins of TUI and provided some examples of Siftables. In this section, I've provided some links to information about Tangible User Interfaces and the abstracts of two articles pertaining to TUI's in educational settings.

Zen Waves: A Digital (musical) Zen Garden



reactable from Nick M. on Vimeo.

Reactable
http://upload.wikimedia.org/wikipedia/commons/e/e3/Reactable_Multitouch.jpg
More about the Reactable
"The reactable hardware is based on a translucent, round multi-touch surface. A camera situated beneath the table, continuously analyzes the surface, tracking the player's finger tips and the nature, position and orientation of physical objects that are distributed on its surface. These objects represent the components of a classic modular synthesizer, the players interact by moving these objects, changing their distance, orientation and the relation to each other. These actions directly control the topological structure and parameters of the sound synthesizer. A projector, also from underneath the table, draws dynamic animations on its surface, providing a visual feedback of the state, the activity and the main characteristics of the sounds produced by the audio synthesizer."


The Bubblegum Sequencer: Making Music with Candy



Jabberstamp: Embedding Sound and Voice in Children's Drawings
(pdf)
(A TUI application to support literacy development in children)

Affective TouchCasting
(pdf)

TapTap: A Haptic Wearable for Asynchronous Distributed Touch Therapy
(pdf)

BodyBeats: Whole-Body, Musical Interfaces for Children
(pdf)

Telestory is a Siftables application that looks like it would be quite useful for supporting children who have communication disorders or autism spectrum disorders.

Telestory Siftables application from Jeevan Kalanithi on Vimeo.

"Telestory is an educational, language learning application created by Seth Hunter. In this video, the child is looking at a television screen. He can control onscreen characters, events and objects with the siftables. For example, he has the dog and cat interact by placing the dog and cat siftables next to each other."
TeleStory Project Website

Here is a video of how Siftables can be used as equation editors:


Siftables Equation Editor from Jeevan Kalanithi on Vimeo.

RESOURCES ABOUT TUI'S:


5 lessons about tangible interfaces, GDC Lyon, December 2007(pdf) Nicolas Nova


Special Issue on Tangible and Embedded Interaction (Guest Editors: Eva Hornecker, Albrecht Schmidt, Brygg Ullmer) Journal of Arts and Technology (IJART) Volume 1 Issue 3/4 - 2008


Reality-Based Interaction: A Framework for Post-WIMP Interfaces (pdf)


Here are a couple of abstracts of articles related to the use of TUI's in education:

Evaluation of the Efficacy of Computer-Based Training Using Tangible User Interface for Low-Functioning Children with Autism Proceedings of the 2008 IEEE International Conference on Digital Games and Intelligent Toys

"Recently, the number of children having autism disorder increases rapidly all over the world. Computer-based training (CBT) has been applied to autism spectrum disorder treatment. Most CBT applications are based on the standard WIMP interface. However, recent study suggests that a Tangible User Interface (TUI) is easier to use for children with autism than the WIMP interface. In this paper, the efficiency of the TUI training system is considered, in comparison with a conventional method of training basic geometric shape classification. A CBT system with TUI was developed using standard computer equipment and a consumer video camera. The experiment was conducted to measure learning efficacy of the new system and the conventional training method. The results show that, under the same time constraint, children with autism who practiced with the new system were able to learn more shapes than those participating in the conventional method."

Towards a framework for investigating tangible environments for learning Sara Price, Jennifer G. Sheridan, Taciana Pontual Falcao, George Roussos, London Knowledge Lab, 2008

"External representations have been shown to play a key role in mediating cognition. Tangible environments offer the opportunity for novel representational formats and combinations, potentially increasing representational power for supporting learning. However, we currently know little about the specific learning benefits of tangible environments, and have no established framework within which to analyse the ways that external representations work in tangible environments to support learning. Taking external representation as the central focus, this paper proposes a framework for investigating the effect of tangible technologies on interaction and cognition. Key artefact-action-representation relationships are identified, and classified to form a structure for investigating the differential cognitive effects of these features. An example scenario from our current research is presented to illustrate how the framework can be used as a method for investigating the effectiveness of differential designs for supporting science learning"

Tangible User Interfaces Part I: Siftables

In 1997, the vision of tangible user interfaces, also known as TUI's, was outlined by Hiroshi Ishii and Brygg Ullmer of the Tangible Media Group at MIT, in their paper, "Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms" (pdf). According to this vision, "the goal of Tangible Bits is to bridge the gaps between both cyberspace and the physical environment, as well as the foreground and background of human activities." This article is a must-read for anyone interested in "new" interactive technologies.

The pictures in the article of the metaDesk, transBoard, activeLENS, and ambientRoom, along with the references, make this seminal work worth at least a glance.


Another must-read is Hiroshi Ishii's 2008 article, Tangible Bits: Beyond Pixels (pdf). In this article, Ishii provides a good overview of TUI concepts as well as the contributions of his lab to the field since the first paper was written.

Related to the Tangible User Interface research is the work of the Fluid Interfaces Group at MIT. The Fluid Interfaces Group was formerly known as the Ambient Intelligence Group, and many of the group's projects incorporate concepts related to TUI and ambient intelligence. According to the Fluid Interfaces website, the goal of this research group is to "radically rethink the human-machine interactive experience. By designing interfaces that are more immersive, more intelligent, and more interactive we are changing the human-machine relationship and creating systems that are more responsive to people's needs and actions, and that become true "accessories" for expanding our minds."

The Siftables project is an example of how TUI and fluid interface (FI) interaction can be combined. Siftables is the work of David Merrill and Pattie Maes, in collaboration with Jeevan Kalanithi, and was brought to popular attention through David Merrill's recent TED talk:

David Merrill's TED Talk: Siftables - Making the digital physical
-Grasp Information Physically

"Siftables aims to enable people to interact with information and media in physical, natural ways that approach interactions with physical objects in our everyday lives. As an interaction platform, Siftables applies technology and methodology from wireless sensor networks to tangible user interfaces. Siftables are independent, compact devices with sensing, graphical display, and wireless communication capabilities. They can be physically manipulated as a group to interact with digital information and media. Siftables can be used to implement any number of gestural interaction languages and HCI applications....
Siftables can sense their neighbors, allowing applications to utilize topological arrangement
..No special sensing surface or cameras are needed."
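Since the Siftables firmware isn't public, here is only a toy sketch of the neighbor-sensing idea the quote describes: each tile knows which tiles sit directly beside it, and an application reads meaning from that topological arrangement (for example, a dog tile placed next to a cat tile triggering an on-screen interaction):

```python
# Toy sketch (hypothetical, not Siftables' real code) of deriving a
# neighbor graph from tile positions: two tiles are "neighbors" when
# their grid cells touch edge-to-edge.

def neighbors(tiles):
    """Return the set of id pairs whose grid cells are edge-adjacent."""
    pairs = set()
    for a, (ax, ay) in tiles.items():
        for b, (bx, by) in tiles.items():
            # Manhattan distance 1 means the cells share an edge;
            # a < b keeps each pair in one canonical order.
            if a < b and abs(ax - bx) + abs(ay - by) == 1:
                pairs.add((a, b))
    return pairs

# Three tiles in a row: "dog" beside "cat", "cat" beside "ball"
layout = {"dog": (0, 0), "cat": (1, 0), "ball": (2, 0)}
adjacent = neighbors(layout)
```

In the real devices this graph comes from each tile's own proximity sensing rather than from known coordinates, which is exactly why no camera or instrumented surface is needed.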



Siftables Music Sequencer from Jeevan Kalanithi on Vimeo.

http://web.media.mit.edu/~dmerrill/images/music-against-wood-320x213.jpg

More about Siftables:

Rethinking display technology (Scott Kirsner, Boston Globe, 7/27/08)
TED: Siftable Computing Makes Data Physical
Siftables: Toward Sensor Network User Interfaces
(pdf)

It seems that people either really like the Siftables concept or they don't see the point. I found the following humorous critique of Siftables on YouTube:

"Imagine if all the little programs you had on your iphone were little separate chicklets in your pocket.
You'd lose em.
Your cat would eat em.
You'd vacuum them up.
They'd fall down in the sofa.
They'd be all over the car floor.
You'd throw them away by mistake..."


In my opinion, it is exciting to learn that some of this technology has the potential to become mainstream.

Feb 23, 2009

YDreams: Interactive Experiences, Real Time Interaction with Augmented Reality Characters

YDreams is doing some interesting things. Watch the delight on this little girl's face as she plays with an avatar in mixed reality, viewed on a large display:


YDreams on Vimeo.

More from YDreams:
"...Flapi, YDreams' in-house mascot, and other virtual characters interact in real-time with a little girl and other physical obstacles in a new seamless augmented playground environment."

http://www.ydreams.com/ydreams_2005/images/contents/uploaded/Image/ylabs2(1).jpg
Photo from YDreams Lab
"YLabs’ main focus is on Reality Computing, which uses new technologies such as mobile computing, augmented reality and ubiquitous interactivity to bridge the distance between the user, information and the machine, in a physical, post-browser environment, where the real and the digital come together."

http://www.ydreams.com/ydreams_2005/images/contents/uploaded/Image/mbook1(1).jpg
This is a photo of YDreams' Architek yMagic Books. Architek is used to create interactive digital content, including children's storybooks that are manipulated on a touch screen.


This is a demonstration of Architek's yWalk, an immersive virtual playground that can be vertically projected onto soft mats and floors.

The Architek software provides information about user interaction. yWalk looks like it might be useful for occupational or physical therapists in their work with young children.

Interesting work!

Feb 22, 2009

Rich White's "Mobile Immersive Learning Lab" Project; EduSim Update

Rich White, an educational technologist at the Greenbush Education Service Center in Kansas, has been working with the 3D interactive virtual world EduSim for quite a while. He's taking EduSim to the next level.

"The
concept is one of an enclosed virtual learning space - with surrounded projection of the virtual learning world the students are exploring - similar to the StarLab Concept (with a rectangular configuration). along the lines of a CAVE - however simpler, mobile, and relatively in-expensive by comparison."

The project is at the beginning prototype stage.

Below is a demo of the virtual world as it is projected on two screens that are placed next to each other at a right angle, with the center of the virtual-world view positioned where the two screens meet:


This might be a great way of reaching students who have autism!


More about EduSim:

EduSim, for those of you who haven't seen my previous posts on the topic, is a multi-user 3D interactive environment used in classrooms with interactive whiteboards:




Information from Rich White's Greenbush blog about Edusim:

Wikipedia entry:
"Edusim is a Cave Automatic Virtual Environment based concept of lesson driven 3D virtual worlds on the classroom interactive whiteboard or classroom interactive surface. The Edusim concept is demonstrated by the Edusim free and open source multi-user 3D Open Cobalt virtual world platform and authoring tool kit modified for the classroom interactive whiteboard or surface. The Edusim application is a modified edition of the open source Open Cobalt Project and relies heavily on the affordances of direct manipulation of 3D virtual learning models and Constructionist Learning Principles."


History of Edusim:
"The Edusim project began in September 2007 at the Greenbush Education Service Center in Southeast
Kansas as an effort to bring an engaging 3D experience to the classroom interactive whiteboard. Pilot groups were established with 6th and 7th grade middle school students throughout Southeast Kansas to observe how students would be engaged through the software, and how the user interface would need to be augmented to account for the affordances of the whiteboard, and the usability of the students.
"

Here is a virtual world in Edusim, built on Cobalt, showing how a drag-and-drop function is used for in-world VNC application sharing:



The Cobalt 3D metaverse browser has been modified for multi-touch interaction by some of the members of RENCI, a collaborative venture of Duke University and several other North Carolina universities. The video below shows Dr. Xunlei Wu demonstrating how gesture and touch are used to manipulate items and navigate through two Cobalt virtual worlds:



Some of the members of RENCI built a multi-touch table in addition to the collaborative multi-touch wall. For more information:

RENCI: Multi-Touch Collaborative Wall and Table using TouchLib: More about UNC-C's Viz Lab


(Cross posted on the TechPsych blog.)

Jonathan Jarvis: Crisis of Credit Animated Short; Interactive Oracles

Jonathan Jarvis created a series of animated shorts as a project for his work as a graduate student in the Media Design Program at Art Center College of Design. He started exploring the concept of system diagrams and integrated them into motion interactions.


The Crisis of Credit Visualized from Jonathan Jarvis on Vimeo.


(Note: Someone commented about the negative way the family who represents sub-prime mortgage holders was depicted in the short.)

Related Economic Sounds:

The short was influenced by information from the following "This American Life" radio broadcasts.

Click on the following links to listen to the broadcasts:

Another Frightening Show About the Economy
Transcript (pdf)
The Giant Pool of Money
Transcript (pdf)


More about Jonathan Jarvis:

Crisis of Credit Project Page
The back story behind Jonathan's work on the Crisis of Credit Project, with story board scenes and his research sketch of the Crisis of Credit system diagram.
Jonathan's Website
Jonathan's Global Storytelling Project (pdf)

Jonathan worked on concept development, interface design & content development with a team for a multi-touch project, Interactive Oracles, for Acura:
http://www.madein.la/featuredprojects/interactiveoracles/wp-content/uploads/madeinla_interactiveoracles_intro.jpg

(Some of the information is cross-posted on the Economic Sounds and Sights blog.)

Feb 20, 2009

More Multi-Touch and Surface Computing...

The concept of multi-touch/gesture/surface computing is spreading.

Here's more evidence:

Panasonic Touch Air Hockey


The game was demonstrated at ISE 2009 (Integrated Systems Europe) in Amsterdam. The interface was developed by UI Centric, a company based in Soho, London.

Microsoft's SurfaceWare at the Tangible and Embedded Interaction Conference (TEI 2009):

SurfaceWare is level-sensing software that alerts waitstaff when glasses need refilling.


http://photos-e.ak.fbcdn.net/photos-ak-snc1/v2385/101/124/727430870/n727430870_2768500_5763.jpg
Photos from Nachiket Apte, via Ru Zarin

More to come...

Feb 18, 2009

Ready for the SMART Table?

The SMART Table is now available for purchase!



Here is the plug:

"The world's first multitouch, multiuser table for primary education - the SMART Table - is now available for purchase.Order the SMART Table"

"As a collaborative learning center, the SMART Table enables engaging and motivating small-group learning experiences. Up to eight students can use their fingers intuitively to sweep, slide and spin objects on the interactive screen. The SMART Table's ready-made activities help primary students gain and further their skills in areas like counting and reading."

"The SMART Table also makes an ideal complement to whole-class activities on the SMART Board interactive whiteboard. It helps reinforce concepts in a small-group setting and ensures students can participate in interactive and creative learning experiences."

(Cross-posted on the TechPsych and Technology-Supported Human-World Interaction blogs.)