Showing posts with label interactive multimedia. Show all posts

Jan 24, 2009

Digital Storytelling Platforms and Multiple Perspectives: A look at the work of Jonathan Harris - Food for Thought for Interactive Timeline Design

I'm in the process of creating an interactive timeline, and as I revisited my links and bookmarks, I came across a link to a video of Jonathan Harris discussing his ideas regarding digital storytelling, overlapping threads, and multiple perspectives.

Jonathan explores real-life stories and celebrates the interconnections between events, ideas, feelings, and people. Linear narratives and linear timelines do not do justice to the richness and complexity of human experience.


"Combining elements of computer science, anthropology, visual art and storytelling, Jonathan Harris designs systems to explore and explain the human world."



"Jonathan Harris is redefining the idea of what it means to tell a story. Take a ride through an arctic whale hunt and plunge headfirst into the feelings Harris finds running rampant in cyberspace as he describes what he calls “storytelling platforms.”"

Below are links to two storytelling platforms described in the presentations. The Whale Hunt is organized so that the user can explore the story through a variety of perspectives and interfaces, and at different points in time.

The first screenshot shows how the user can select one of the "cast" members to see how the story unfolds from that person's point of view.





"Every few minutes, the system searches the world's newly posted blog entries for occurrences of the phrases "I feel" and "I am feeling". When it finds such a phrase, it records the full sentence, up to the period, and identifies the "feeling" expressed in that sentence (e.g. sad, happy, depressed, etc.). Because blogs are structured in largely standard ways, the age, gender, and geographical location of the author can often be extracted and saved along with the sentence, as can the local weather conditions at the time the sentence was written. All of this information is saved...The result is a database of several million human feelings, increasing by 15,000 - 20,000 new feelings per day. Using a series of playful interfaces, the feelings can be searched and sorted across a number of demographic slices, offering responses to specific questions..."
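The extraction step Harris describes is simple enough to sketch. Below is a toy Python version (my own sketch, not the actual We Feel Fine code, and the tiny feeling lexicon is just a stand-in for the project's much larger list): it scans text for sentences containing "I feel" or "I am feeling", up to the period, and pulls out the feeling word.

```python
import re

# A toy lexicon standing in for We Feel Fine's much larger feeling list (assumption).
FEELINGS = {"sad", "happy", "depressed", "lonely", "excited"}

# Match a sentence containing "I feel" or "I am feeling", up to the period.
SENTENCE = re.compile(r"[^.]*\bI (?:feel|am feeling)\b[^.]*\.", re.IGNORECASE)

def extract_feelings(post_text):
    """Return (sentence, feeling) pairs found in a blog post."""
    results = []
    for match in SENTENCE.finditer(post_text):
        sentence = match.group(0).strip()
        for word in re.findall(r"[a-z]+", sentence.lower()):
            if word in FEELINGS:
                results.append((sentence, word))
                break
    return results

print(extract_feelings("It rained all day. I feel sad and a bit lonely."))
# -> [('I feel sad and a bit lonely.', 'sad')]
```

A real system would, as the quote notes, also pull the author's age, gender, location, and local weather from the blog's standard page structure.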


Jonathan Harris also presented at the December 2007 EG Conference. The video and related information can be found on the TED website.

Jonathan Harris: The art of collecting stories


If you have some time on your hands, explore Jonathan's Universe project:

http://universe.daylife.com/common/statement-universe.gif
This photo depicts the nine stages of the Universe environment.

"Universe is a system that supports the exploration of personal mythology, allowing each of us to find our own constellations, based on our own interests and curiosities. Everyone's path through Universe is different, just as everyone's path through life is different. Using the metaphor of an interactive night sky, Universe presents an immersive environment for navigating the world's contemporary mythology, as found online in global news and information from Daylife. Universe opens with a color-shifting aurora borealis, at the center of which is a moon, and through which thousands of stars slowly move. Each star has a specific counterpart in the physical world — a news story, a quote, an image, a person, a company, a team, a place — and moving the cursor across the star field causes different stars to connect, forming constellations. Any constellation can be selected, making it the center of the universe, and sending everything else into its orbit."

Universe was created using Processing, an open-source programming environment used by people from all sorts of disciplines to create interactive information visualizations and more. The data used in Universe comes from Daylife. For more information about Daylife, visit the Daylife Labs.
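To get a feel for the constellation-forming interaction described above, here is a small Python sketch (my guess at the mechanism, not Harris's actual Processing code): as the cursor moves, the stars nearest to it get linked into a chain.

```python
import math

def nearest_stars(stars, cursor, k=5):
    """Return the k stars closest to the cursor position."""
    return sorted(stars, key=lambda s: math.dist(s, cursor))[:k]

def constellation_edges(stars, cursor, k=5):
    """Connect the nearest stars into a simple chain, as a stand-in for
    Universe's constellation-forming behavior (a guess at the mechanism)."""
    chain = nearest_stars(stars, cursor, k)
    return list(zip(chain, chain[1:]))

stars = [(0, 0), (1, 1), (5, 5), (2, 2), (9, 9)]
print(constellation_edges(stars, (0, 0), k=3))
# -> [((0, 0), (1, 1)), ((1, 1), (2, 2))]
```

Each "star" here is just a 2-D point; in Universe every star carries a news story, quote, image, or person behind it.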

Sidenote:
Jonathan Harris collaborated with Sep Kamvar on the We Feel Fine project. Sep Kamvar teaches classes like "Social Software" and "Computational Methods in Data Mining" at Stanford University. Sep is part of the Stanford Human Computer Interaction (HCI) Group.

Stanford's HCI group's weekly seminars highlight a variety of interesting speakers. Current and previous talks are available via Stanford OnLine. You can link to current presentations and videos from the Human Computer Interaction Seminar website. If you are curious, past presentations/abstracts can be accessed on-line alphabetically or by date, going as far back as 1990.

Dan Saffer, author of Designing Gestural Interfaces, presented "Tap is the New Click" on January 23rd, 2009 at Stanford. You can access the video directly.

Pop! Tech PopCast

Jan 2, 2009

Play a flute by blowing on your iPhone!

Smule's Ocarina application: Play a flute by blowing on your iPhone!

I meant to post about this a while ago. Here is a group of people playing a relaxing version of "Stairway to Heaven" on their iPhones:



"How To" Video:



Picture via Brothersoft Editor's Blog:
http://blog.brothersoft.com/wp-content/uploads/2008/11/ocarina_01-295x300.png

"- Blow into your microphone to generate music
- Hold down combinations of the holes to change your pitch
- Tilt your phone from front to back to change the vibrato depth
- Tilt your phone from side to side to change the vibrato rate
"
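Those four controls amount to a mapping from raw sensor inputs to synth parameters. Here is a hypothetical Python sketch of such a mapping; the breath threshold, hole-to-pitch table, and tilt scaling are all my own guesses for illustration, not Smule's actual values.

```python
# A hypothetical sketch of Ocarina-style control mapping; the hole-to-pitch
# table, breath threshold, and tilt scaling are assumptions for illustration.

# Four holes give 16 combinations; map a few to semitone offsets from C5.
HOLE_TO_SEMITONE = {
    (1, 1, 1, 1): 0,   # all holes covered -> lowest note
    (0, 1, 1, 1): 2,
    (0, 0, 1, 1): 4,
    (0, 0, 0, 0): 12,  # all open -> octave up
}

def ocarina_state(mic_level, holes, tilt_pitch, tilt_roll):
    """Translate raw inputs into synth parameters."""
    playing = mic_level > 0.1                  # breath threshold (assumed)
    semitone = HOLE_TO_SEMITONE.get(holes, 0)
    frequency = 523.25 * 2 ** (semitone / 12)  # equal-tempered offsets from C5
    vibrato_depth = abs(tilt_pitch)            # front/back tilt
    vibrato_rate = 1 + 5 * abs(tilt_roll)      # side-to-side tilt
    return playing, round(frequency, 2), vibrato_depth, vibrato_rate

print(ocarina_state(0.5, (0, 1, 1, 1), 0.2, 0.1))
# -> (True, 587.33, 0.2, 1.5)
```

The real app would feed these parameters into a synthesizer every audio frame; here they are just returned for inspection.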


You can connect with other Ocarina players around the world via your iPhone.

Video of Smule's Zephyr application:


Take some time and explore the other applications developed by Smule.

Two more examples of multi-touch and gesture interaction out in public: Accenture at O'Hare Airport, TacTable

Accenture's Touch Wall at O'Hare Airport
Note: Also posted on the Technology-Supported Human-World Interaction blog.


From the New York Times, photographed by Yana Paskova



Video and Photos from TacTable

The following video shows some applications developed by TacTable. As you can see, many of the applications look like they'd be useful in public spaces, including airports and museums.



Below are some pictures from the TacTable website:


http://www.tactable.com/images/homeimage1.jpg

http://www.tactable.com/images/handsTouchingTableLSC.jpg
Where Did Language Come From? Liberty Science Center, N.J.

http://www.tactable.com/images/AccentureWallMultiTouch.jpg

http://www.tactable.com/images/accenture1.jpg
Accenture Welcome Wall, London, England

http://www.tactable.com/images/sprint.jpg
Sprint Studio Digital Lounge Table

Related:

TacTable Contacts:
USA:
Henry Kaufman
henry@tactable.com


Tinsley Galyean
tinsley@tactable.com


UK:
Graham Cosier
graham@tactable.com


"Clients include Sprint, Accenture, Cirque du Soleil, New York Museum of Modern Art (MoMA), Liberty Science Center, Detroit Institute of Art, Georgia Aquarium, London's Millenium Dome, Boston Museum of Science, Chicago Museum of Science and Industry, Chicago Mercantile Exchange, Warner Brothers, Discovery, Disney, British Telecom Retail, and many others."

New Accenture Technology Lands at O'Hare International Airport

Flight Delays Radiate from Chicago and Atlanta

Nov 23, 2008

For the Tech Curious: "Get in Touch with Touchless": Multi-touch with just a webcam and the free demo application!

Via the Seattle Tech Report's Microsoft Blog:



You can find the demo code on the Codeplex website. Here is a quote:

"The Touchless SDK enables developers to create multi-touch based applications using a webcam for input. Touch without touching."

"Touchless started as Mike Wasserman’s college project at Columbia University. The main idea: to offer users a new and cheap way of experiencing multi-touch capabilities, without the need of expensive hardware or software. All the user needs is a camera, which will track colored markers defined by the user."
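The color-marker tracking idea is easy to sketch. The snippet below (plain Python of my own, not the actual C# Touchless SDK code) finds the centroid of pixels close to a marker color in a frame; a real tracker would run this on every webcam frame and smooth the result.

```python
def track_marker(frame, target, tolerance=40):
    """Find the centroid of pixels close to the marker color.
    frame: list of rows of (r, g, b) tuples; target: (r, g, b)."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            # Sum of per-channel differences as a cheap color distance.
            if abs(r - target[0]) + abs(g - target[1]) + abs(b - target[2]) <= tolerance:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # marker not visible
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# A tiny 3x3 "frame" with a green marker in the lower-right corner.
G, K = (0, 255, 0), (0, 0, 0)
frame = [[K, K, K],
         [K, K, G],
         [K, G, G]]
print(track_marker(frame, (0, 255, 0)))  # -> centroid near the lower-right corner
```

With one tracked centroid per colored marker, each marker becomes an independent "touch" point — which is how a single webcam can fake multi-touch.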


(I posted about the Touchless SDK previously, but I didn't have the video.)

Need for Multi-touch, Multi-user Interactive Multimedia Applications, and the Miracle Question

Last week I received a few comments on my post, "Multi-touch and Flash: Links to Resources; Revisiting Jeff Han's Presentation". I started to respond to a thoughtful comment by Spencer, of TeacherLED, and I wanted to share it as a post:

Spencer is a teacher and instructional technology consultant who develops web-based interactive applications for use on interactive whiteboards (IWBs). He's interested in multi-touch applications for education and has some good insights into what HCI researchers call the "problem space".

Here are Spencer's comments:

..."I agree that Flash could have a very important role to play here. I chose Flash as my development tool because it allows quick development of ideas and then easy distribution of the product. The importance of this is that it allows people who have a profession other than software developer to create software with the insight of their main role. In my case, as a teacher, I can identify things I wish I had and then make them. Often I find that other teachers had the same wish and they then appreciate the product."

"The unfortunate thing with multi-touch is that it is far from the technology most of us outside the industry/research areas have to work with. An app created in Flash for single touch follows the mouse and pointer method so it can be developed easily. When done it can be easily tested on a standard IWB for the feel (which is often surprisingly different on the IWB compared to using a mouse)."

"The Flash developer community has a very experimental and creative characteristic and I’m sure would be a great driving force for multi-touch but first there needs to come a reason for more people to have some sort of multi-touch display for general use, beyond facilitating experiments. When the various operating systems support it and have the apps to make having a supporting display viable then the experimentation and ideas will really flow."

"In addition, the display makers need to recognize the benefits of Flash and ensure they address them. At the moment it seems to be too often an afterthought if considered at all. SDKs and APIs make no reference to Flash or they remain indefinitely in beta for older versions of Flash only."

"It is a pity that all of this will take time. The more time that passes the more single touch IWBs are bought and installed which will delay the uptake of the eventual multi-touch ones. Meanwhile children continue to have to keep reminding themselves that they can only touch the board in one place when it is clear that every bit of their brain is telling them to interact with the board in a much more natural multi-touch way."

My response to Spencer's comment:

Spencer,

You make good points regarding the barriers to getting the multi-touch approach adopted by the "mainstream". You're right about what the commercial display makers need to do. If they want to market displays that will have more appeal, they must think about the different sorts of applications and programming environments that the displays should support.

Display makers also need to think more about the bigger picture - in what sort of environments will the displays be located? Indoors? Outdoors? Near bright sunlight? What about people with disabilities, children, or the elderly?

I can see that in the future, multi-touch displays and other devices would operate within an embedded systems environment and support mobile computing activities as well. There are existing examples of this concept, of course, but there is much room for creative improvement. An embedded systems approach is complex: it would need to handle input from sensors, support multi-modal signal processing, and also provide users with a range of connectivity modes, including RFID. (Data management and storage needs would have to be addressed, along with privacy and security concerns.)

Most importantly, in my opinion, these systems would need to have the flexibility required to support human activities and interactions that have not yet emerged! Certainly this will need to take a multidisciplinary approach.

There are many unanswered questions....How does this fit in with mobile computing and "cloud" computing? What sort of middleware needs to be developed?

Even if we don't have solutions to the bigger problems, there are many smaller problems that I think could be somewhat easily solved.

As you mentioned, many applications that are designed for single-touch screens don't fully support the way people identify, select, and move items around the screen. Although educators access websites every day for use on interactive whiteboards, they are hungry for more. There are not enough websites that are optimized for single-touch interaction, or touch-screen interaction in 3-D "space".

Teachers who are successful users of interactive whiteboards know exactly what we are talking about. They spend quite a bit of time searching for new on-line resources they can use with their students. They know how much the students want to interact with the screen at the same time, and would be so excited to have that capability at their fingertips!

Optimizing websites for touch-screen applications is possible, but the idea hasn't occurred to most web developers. Their jobs don't require it, so there is no incentive. Google is developing FlareBrowser, a browser that can support multi-touch interactions, but according to information on the website, it runs only on Mac OS X 10.5 (Leopard). The present version is bare-bones, and I haven't yet tested it.

I think that another barrier to getting multi-touch off the ground is that the people who might have the knack for multi-touch application development simply don't know it! We've mentioned that Flash developers have the potential to create good multi-touch applications. I also think that game developers and designers could make good contributions to the multi-touch movement. Just think about the thought that goes into programming interactions and event handling for 3-D web-based multi-player games!

Yet another barrier is that people who work in lower-tech fields could benefit from collaborative multi-touch applications, but they don't know it, either. The research I've reviewed tells me that multi-touch applications can support a wide range of human endeavors- work, creativity, data analysis, education, collaboration, planning, and so forth.

What is missing is the input of potential end users from a variety of fields. No specific discipline "owns" multi-touch, so it is hard to figure out how we can make this happen.

Could we set up multi-touch technology playgrounds at professional and trade conferences? What about airports and hospital lobbies? Libraries and museums? Shopping centers? Sports events and rock concerts?

This leads me to my next idea, which is jumping ahead a bit:

One of the barriers to the development of multi-touch applications is that it is not easy to gather user requirements when the users are not familiar with the technology.
That is when my "Miracle Question" technique comes into play. I learned this technique when I studied brief solution-focused counseling, and found that, if modified, it can be useful for figuring out user requirements. (The process still needs some fleshing out.)
Why the Miracle Question?
The questions that a developer uses to guide the client during the initial planning stages are very important. Keep in mind that people want to use technology because it meets a need and solves a problem, which is similar to the reason a person might seek counseling.
The Miracle Question technique (actually, a series of questions) might help to tease things out. The goal of this type of questioning is to help the client use their own creativity, resources, and problem-solving skills so they can become effective partners throughout the development cycle.
(People with human-computer interaction training might have an easier time understanding how this technique might be modified and applied to different fields.)

FYI
A good example of the Miracle Question process, as used in therapy and counseling, can be found on the Network of Social Construction Therapies website, in an article written by the late Steve de Shazer:

http://brianmft.talkspot.com/aspx/templates/topmenuclassical.aspx/msgid/366482

There aren't many resources about the use of the Miracle Question in IT or business. Here are a couple:

Solution Focused Management of Unplanned IT Outages (see page 132 and the references)
http://conferences.vu.edu.au/web2006/images/CDProceedings06.pdf
Proceedings of the 7th International We-B (Working for E-Business) Conference, 2006
Katherine O'Callaghan and Sugumar Mariappandar, Ph.D.
School of Business and Informatics, Australian Catholic University

Miracle Question in Executive Coaching
http://www.1to1coachingschool.com/Coaching_Miracle_Question.htm

Oct 26, 2008

Grafiti - a multi-touch, table-top, surface computing application, from a member of the NUI group.

Are you interested in multi-touch gestures for multimedia tabletop applications, tangible or surface computing, and interaction design? Take a look at this video of Grafiti:



Alessandro De Nardi, a computer science student at the University of Pisa, Italy, worked on Grafiti during Google's Summer of Code and is still involved with the project. Alessandro, a member of the NUI group, is also in the Music Technology Group of Barcelona's UPF University. He was supervised by Martin Kaltenbrunner and Sergi Jorda, of the reactable project.

According to De Nardi's Google Code website:

"Grafiti is a C# framework built on top of the Tuio client that manages multi-touch interactions in table-top interfaces. The possible use of tangible objects is particularly contemplated. It is designed to support the use of third party modules for (specialized) gesture recognition algorithms. However a set of modules for the recognition of some basic gestures is included in this project.

The development is in the alpha state, going to beta soon.

The goals I've been aiming at are: generality, versatility, speed of execution, extensibility, ease of programming (integration) of external modules (simple APIs and protocols) and effective interface design with regards to HCI aspects." -Alessandro De Nardi

Demos in binary form are available for download for Mac OS X and Windows on the Google Grafiti site. There is also a Grafiti site on SourceForge. Apparently the alpha version will work on all 32-bit MS Windows, BSD, and POSIX (Linux) platforms.
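To illustrate the kind of pluggable gesture-recognition design De Nardi describes, here is a toy Python stand-in (Grafiti itself is a C# framework, and this sketch invents its own tiny API): recognizer modules are registered with a manager, which offers each touch trace to every module until one claims it.

```python
class GestureManager:
    """A toy version of Grafiti's pluggable design (a Python stand-in for the
    actual C# framework): third-party recognizer modules are registered, and
    each touch trace is offered to every module until one claims it."""

    def __init__(self):
        self.recognizers = []

    def register(self, recognizer):
        self.recognizers.append(recognizer)

    def dispatch(self, trace):
        for recognizer in self.recognizers:
            name = recognizer(trace)
            if name:
                return name
        return "unrecognized"

def tap(trace):
    # A tap: very little movement between first and last points.
    (x0, y0), (x1, y1) = trace[0], trace[-1]
    if abs(x1 - x0) < 5 and abs(y1 - y0) < 5:
        return "tap"

def swipe_right(trace):
    # A swipe: large net movement to the right.
    if trace[-1][0] - trace[0][0] > 50:
        return "swipe-right"

manager = GestureManager()
manager.register(tap)
manager.register(swipe_right)
print(manager.dispatch([(0, 0), (2, 1)]))   # -> tap
print(manager.dispatch([(0, 0), (80, 3)]))  # -> swipe-right
```

The design point, which matches the Grafiti goals quoted above, is that new gesture modules can be added without touching the dispatch code.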

(Note: This post was updated on 12/14/08 and includes corrections.)

Oct 9, 2008

Ideum's multi-touch, multi-user table prototype



The above picture is from Jim Spadaccini's post about Ideum's multi-touch table prototype, which uses NUI (Natural User Interface) software called "Snowflake". Ideum is working on gesture-recognition software for use in future projects.

Ideum will be designing for museum installations, and will include user-testing as they work towards applications that will support collaboration among museum visitors as they interact around the exhibits.

For more information see the Ideum blog, and also Ideum's portfolio page.

Oct 5, 2008

Microsoft Surface and the Nintendo Wii?! Interesting mashup from Stimulant


Stimulant: Microsoft Surface + Nintendo Wii Mashup from Stimulant on Vimeo.


From Stimulant, a company that has partnered with Microsoft to develop applications for the Surface multi-touch table:

"What do you get when you mash-up Microsoft Surface with a Wii Balance Board? Pressure-sensitive surface computing! Yes, the Surface is sitting directly on the Balance Board (it supports 600 pounds, we checked). Here, Josh demonstrates a simple application that lets users create bubbles of various sizes and roll them around the table using pressure on the edges of the Surface. You also get a sneak peek at the WPF/Silverlight physics engine we've been working on as well. Pressure sensitivity adds an extra dimension to the Surface experience and opens new doors on an already highly advanced platform."
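The Balance Board reports load from four corner sensors, so a center-of-pressure estimate like the one driving those bubbles can be sketched as a weighted average. The Python below is my own illustration of the idea, not Stimulant's code.

```python
def center_of_pressure(tl, tr, bl, br):
    """Estimate the (x, y) center of pressure from the Balance Board's four
    corner load sensors (top-left, top-right, bottom-left, bottom-right),
    normalized to -1..1 on each axis. Readings are raw load values."""
    total = tl + tr + bl + br
    if total == 0:
        return (0.0, 0.0)  # nothing on the board
    x = ((tr + br) - (tl + bl)) / total  # right minus left
    y = ((tl + tr) - (bl + br)) / total  # top minus bottom
    return (x, y)

# All weight on the right edge -> the bubble would roll right.
print(center_of_pressure(0, 10, 0, 10))
# -> (1.0, 0.0)
```

An app like the one in the video would sample this every frame and feed the (x, y) offset into the physics engine as a tilt or force.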


I'd like to learn more about Stimulant's other Surface experiments!

Nintendo Wii

Sep 27, 2008

If you like Spore, you'll appreciate the Genetic Algorithm website!

Since I've been playing Spore and thinking about the artificial life/genetic algorithm used to create the Black Shoals Stock Market Planetarium (see my previous post), I thought I'd revisit Philippe Rennard's interactive genetic algorithms website. I first visited his site when I was taking a course in artificial intelligence for game design.

http://www.rennard.org/alife/img/ga/gav.gif

The first time I visited the site, I viewed it from a small laptop screen. This time, I explored the site on my 22-inch HP TouchSmart. What a difference! I can't wait until the school week to see how this looks on an interactive whiteboard.

Although the website was created several years ago, it is still a fun site to visit, especially when you watch all of your little things morph according to the variables you choose.
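If you'd like to see the core loop behind demos like these, here is a minimal genetic algorithm in Python, evolving bitstrings toward all ones (the classic "OneMax" toy problem, not Rennard's actual applet): fitness-based selection, one-point crossover, and point mutation.

```python
import random

def evolve(pop_size=20, genome_len=16, generations=60, seed=1):
    """A minimal genetic algorithm: evolve bitstrings toward all ones.
    (A toy "OneMax" example, not the code behind Rennard's applets.)"""
    rng = random.Random(seed)
    fitness = sum  # fitness = number of 1 bits in the genome
    pop = [[rng.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, genome_len)  # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(genome_len)       # point mutation: flip one bit
            child[i] ^= 1
            children.append(child)
        pop = children
    return max(fitness(g) for g in pop)

print(evolve())  # best fitness found; the maximum possible is genome_len
```

Rennard's applets and NERO work on the same select-recombine-mutate cycle, just with much richer genomes (shape parameters, neural network weights) and fitness functions.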

Another website I recently revisited was the NERO website.
NERO stands for Neuro-Evolving Robotic Operatives. The original NERO project was developed at the University of Texas at Austin a few years ago. The latest version can be found on the NERO 2.0 website.

If you have some time on your hands, visit the site and download the program. You use genetic algorithms to train up soldiers via real-time neuroevolution. See what happens if you let them evolve overnight on your computer!

http://z.cs.utexas.edu/users/nn/nero/images/territory_screen2_detail.png
"Evolve your own robot army by tuning their artificial brains for challenging tasks, then pit them against your friends' teams in online competitions"

Somewhat Related


While I was at Rennard's website, I linked to his page about a book he edited in 2006 that I want to run out and read ASAP: "Handbook of Research on Nature Inspired Computing for Economics and Management". I think this book might help me with my quest to better inform myself about the events surrounding the economic situation.

I'm sharing what I find on my Technology Supported Human-World Interaction blog, using one of my recent posts to store my collection of pictures, media, quotes, articles, references, and links related to the quest. I hope to incorporate this, with the help of others, into a dynamic, interactive web-based time-line of the history we are living through right now. For now, here is the link to the "repository":

The Celestial Economic Sphere, DataViz for the Finance Biz, Truthiness, Behavioral Finance, and Greed

Sep 21, 2008

Resources for the (therapeutic) use of digital and multimedia storytelling and social stories for children and teens...

The most frequently visited post on this blog is Interactive Multimedia for Social Skills, Understanding Feelings, Relaxation, and Coping Strategies, and the hits are increasing. For that reason, I'm working on an extensive update, which is a very involved process, given the increase in research in this area. Until then, I thought it would be helpful to provide a mini-update.

Below are some resources and links for those of you who are interested in digital multimedia storytelling or digital social stories with young people. Although some of the resources are specific to children or teens with autism spectrum disorders, I've also included information that is appropriate for use in regular classroom settings.

(If this is your first visit to this blog, it might be a good idea to first check out the resources from the above link, and then return to this post.)


Digital Storytelling and 21st Century Skills (pdf)

This nine-page primer is useful for anyone interested in learning how to create digital stories or develop digital storytelling activities with young people. Written by David Jakes, an instructional technology coordinator for Community High School District 99 in Downers Grove, IL, it makes a good case for digital storytelling and outlines the process of implementing related activities at the high school level.

David Jakes has a website, Jakesonline.org, that contains additional resources about digital storytelling, including strategies for instruction. The website also provides information about collaborative tools and an extensive collection of web resources.

Center for Digital Storytelling

USING MULTIMEDIA SOCIAL STORIES TO INCREASE APPROPRIATE SOCIAL ENGAGEMENT IN YOUNG CHILDREN WITH AUTISM (doc)

Encouraging Positive Behavior with Social Stories: An Intervention for Children with Autism Spectrum Disorders
pdf (Shannon Crozier, Nancy M.Sileo) Teaching Exceptional Children, July/August 2005 pp. 26-31

This article provides information that supports a systematic method of implementing social stories, integrated into a student's Functional Behavioral Assessment and IEP.

Process:

• Team identifies the need for behavior intervention.
• Functional assessment is completed.
• Social stories are included in the behavior plan.
• Social story is written.
• Social story is introduced and progress is monitored with data.
• Success is evaluated with data.

Note: The authors cite an article written in 1999 that mentions that the use of multimedia social stories has not been demonstrated to be effective, as little research had been completed in this area at the time. Take this with a grain of salt, as there are recent studies (see below) regarding the effective use of multimedia applications, including computer/video games and VR, for social skills training as well as counseling.


An evaluation of the integrated use of a multimedia storytelling system within a psychotherapy intervention for adolescents. (pdf)

Current Autism Research on Social Stories (Vol 2, Issue 8; August 2007) Positively Autism

Multimedia Instruction of Social Skills
(CITEd Research Center- Center for Implementing Technology in Education: Multimedia Technologies)

This link provides extensive information about on-line resources for programs that simulate social interaction. It also includes information about the use of social stories with students, and resources for putting together multimedia social stories. Included are some summaries of research about multimedia social stories and the use of multimedia for instructional activities.


Be sure to explore the rest of the CITEd site when you have the time.

Related:

Post: Special issue on Multimedia, Media Convergence, and Digital Storytelling

Digital Stories Targeting Social Skills for Children with Disabilities. Cori More (PRO-ED Journal, 2008)

Digital/Multimedia Storytelling
from A Storied Career: Kathy Hansen's Blog to explore traditional and postmodern forms/uses of storytelling

Digital Storytelling - Katie Christo's Wiki - how-to, resources, tutorials, rubrics, lesson plans, digital storytelling across the curriculum, etc.

The Story-Centered Curriculum
- eLearn Magazine

Mind Reading: An Interactive Guide to Reading Emotions


Mind Habits: The Stress Relief Game

Sep 11, 2008

Special Issue on Multimedia, Media Convergence, and Digital Storytelling (RCETJ)

Mark van 't Hooft, the chair of ISTE's SIGHC, recently shared information and links to an interesting special issue of an on-line publication about multimedia-related topics:

An Introduction to the RCETJ Special Issue on Multimedia, Media Convergence, and Digital Storytelling (Journal of the Research Center for Educational Technology, Fall 2008 Special Issue)

Video-Based Additional Instruction (Marc Franciszkowicz)

Design and Assessment of an On-line Pre-lab Model in General Chemistry: A Case Study (Juan Antonio Llorens-Molina)

The iPod Project: A Mobile Mini-Lab (Nikhil Sathe and Jorg Waltje)

Media Convergence in a College Newsroom: A Longitudinal Study of Identification and Commitment to a Collaborative Web Site (Fred Endres)

Sep 1, 2008

Interactive Touch-Screen Technology, Participatory Design, and "Getting It"....

PLEASE SEE THE UPDATED VERSION OF THIS POST:
Interactive Touch Screen Technology, Participatory Design, and "Getting It", Revisited

http://www.ehomeupgrade.com/wordpress/wp-content/uploads/hp_touchsmart_pc.jpg
http://www.wired.com/images/article/full/2008/08/han_interview_630px.jpg

There's been some discussion over the reasons why so many people don't understand touch screen, or "surface" computing, even though research in this area has been going on for years.

As the new owner of the HP TouchSmart, I know that I get it.

The research I've conducted in this area suggests that people will "get it" only if there is a strong commitment to developing touch-screen "surface" applications through a user-centered, participatory design process. In my view, this should incorporate principles of ethnography and ensure that usability studies are conducted outside of the lab.


This approach was taken with Intel's Classmate PC. Intel has about 40 ethnographic researchers, and sent many of them to work with students and teachers in classrooms around the world. (A video regarding ethnographic research and the Intel Classmate project can be found near the end of this post.)

http://download.intel.com/pressroom/kits/events/idffall_2008/images/Picture007.jpg
http://www.classmatepc.com/images/advocateImage.jpg

Where to start?
K-12 classrooms and media centers. Public libraries. Malls. Hospital lobbies and doctor's offices. Any waiting room. Staff lounges in medical centers, schools, and universities. Community festivities and events. Movie theater lobbies. Museums and other points of interests.


I believe we need to take a "touching is believing" approach.

Here are some thoughts:

When I try to explain my fascination with developing touch-screen interactive multimedia applications (interactive whiteboards, multi-touch displays and tables, and the like), many of my friends' and family members' eyes glaze over. This is particularly true for people I know who are forty-ish or over.

Even if you are younger, if you never saw the cool technology demonstrated in the movie Minority Report, have limited experience with video games, or haven't come within touching distance of an interactive whiteboard, the concept might be difficult to understand.


The reality?

Even people who have the opportunity to use surface computing technology on large screens do not take full advantage of it. Multi-touch screens are often used as single-touch screens, and interactive whiteboards in classrooms often serve as expensive projector screens for teacher-controlled PowerPoint presentations.


Most importantly, there are few software developers who understand the surface computing approach, even with the popularity of the iPhone and iPod Touch. Most focus on traditional business-oriented or marketing applications, and have difficulty envisioning scenarios in which surface computing would be a welcome breath of fresh air.

Another factor is that not all people entrusted to market surface or touch screen computing fully understand it.

http://blogs.msdn.com/blogfiles/healthblog/WindowsLiveWriter/MicrosoftHUGWishyouwerehereDay2_82D3/IMG_0550_thumb.jpg
Despite a cool website showing off the goods, Microsoft's Surface multi-touch table has been slow to take off, dimming hopes of bringing the price down to a point most families or schools could afford. (The picture above depicts an application for the Surface designed for health care professionals, not K-12 science education.)

Although you can't buy a Surface table for your family room, it is possible to buy a TouchSmart.

HP's TouchSmart website is engaging and highlights some examples of touch-screen interaction, but most people don't seem to know about it.


Unfortunately, you wouldn't have a clue that the HP TouchSmart exists while browsing the aisles at Circuit City or Best Buy!

When I was shopping for my new TouchSmart, I noticed that from a distance, the TouchSmart looked just like the other large flat-screen monitors filling the aisles. The salespeople at both stores were not well-informed about the system. The only reason I knew about the new TouchSmart was my obsession with interactive multimedia touch-screen applications: designing them, developing them, studying them, reading about them, blogging about them.... ; }

More thoughts:

After studying HCI (Human-Computer Interaction), and relating this knowledge to what I know as a psychologist, my hunch is that the WIMP ("Windows, Icons, Menus, Pointer") and keyboard input mind-set is embedded in our brains, to a certain extent. Like driving a car, it is automatic and expected. This is true for users AND developers.

Think about it.

Suppose one day, you were told that you no longer were allowed to control your car by turning on the ignition, steering the wheel, or using your feet to accelerate, slow down, or stop the car! Instead, you needed to learn a new navigation, integration, and control system that involved waving your hands about and perhaps speaking a few commands.

For new drivers who'd never seen a car before, this new system would be user-friendly and intuitive. Perhaps it would be quite easy for 16-year-old kids to wrap their heads around. For most of us, it would not be. Imagine the disasters we would see on our streets and highways!

When we think about how newer technologies are introduced to people, we should keep this in mind.

In my mind, spreading the word about surface computing is not an "if you build it, they will come" phenomenon, as it was with the iPhone. We can't ignore the broader picture.

From my middle-aged woman's vantage point, I believe it is important that those involved with studying, developing, or marketing surface computing applications realize that many of us simply have no point of reference other than our experiences with ATMs, airline kiosks, supermarket self-serve lanes, and the like.

(The video clip at the very end of this post provides a good example of touch-screen technology gone wrong.)


Be aware that there are substantial numbers of people who might benefit from surface computing, but who prefer to avoid ATMs, airline kiosks, and self-serve grocery lanes.

Realize that the collective experience with technology, in many cases, has not been too pretty. Many people have had such user-unfriendly experiences with productivity applications, forced upon them by their employers, that any interest or desire to explore emerging technologies has been zapped.

My own exposure to interactive "surface" related technology was somewhat accidental.

A few years ago, a huge box was deposited in the room where I worked a couple of days a week as a school psychologist at a middle school. After a week or so, I became curious, and found out that it was a SmartBoard. Until then (2002!), I did not know that interactive whiteboards existed.

The box remained unopened in the room for the entire school year, but no worry. I played with the only other SmartBoard in the school, and found a couple at the high school where I also worked. I hunted down all of the applications and interactive websites that I could find, and tried them out. That is when I was hooked. I could see all kinds of possibilities for interactive, engaging subject-area learning activities. I could see the SmartBoard's potential for music and art classes. With my own eyes, I saw how the SmartBoard engaged students with special needs in counseling activities.

(By the way, if you are working with middle school students, PBS Kids' It's My Life website activities work great on an interactive whiteboard.)

A few years have passed, and reflecting on all of my fun experiences with interactive whiteboards, with and without students, I now understand that many teachers still have limited exposure to this technology.

This school year, many teachers find themselves in classrooms recently outfitted with interactive whiteboards, scrambling along with educational technology staff development specialists to figure out how the technology works best with various groups of students, and what changes need to be made to instructional practice.


For the very first time, interactive whiteboards were installed in two classrooms at one of the schools I work at. One of the teachers I know thanked me for telling her about interactive whiteboards and sharing my resources and links.

If I hadn't let her know about this technology, she wouldn't have volunteered to have one installed in her classroom. It has transformed the way she teaches special needs students.

In the few months that she's used the whiteboard, I can see how much it has transformed the way the students learn. They are attentive, more communicative, and engaged. The students don't spend the whole day with the whiteboard - the interactive learning activities are woven into lessons at various times of the day, representing true technology integration.

Now let's see what happens when all-in-one touch-screen PC's are unleashed in our schools!

Some resources:
HP TouchSmart PC website, with demo
HP's TouchSmart YouTube videos
lm3labs (catchyoo, ubiq'window)
NUI Group (See member's links)
NextWindow
Fingertapps
thirteen23
SmartTechnologies
Perceptive Pixel - Jeff Hans
Microsoft Surface
iPhone
(More can be found by doing a search on this blog or The World Is My Interactive Interface.)

Value of ethnographic research:
Ethnographic Research Informed Intel's Classmate PC
"Intel looked closely at how students collaborate and move around in classroom environments. The new tablet feature was implemented so that the device would be more conducive to what Intel calls “micromobility”. Intel wants students to be able to carry around Classmate PCs in much the same way that they currently carry around paper and pencil." -via Putting People First and Ars Technica

The video below is from Intel's YouTube Channel. Information about Intel's approach to ethnographic research in classrooms during the development of the Classroom PC is highlighted. This approach uses participatory design and allows the set of applications developed for the Classmate PC to reflect the needs of local students and teachers. Schools from many different countries were included in this study.




FYI:

Need for Improvement: User-Unfriendly Information Kiosk Interactive Map


Here are some interesting pictures from lm3labs, which are in my interactive usability hall of fame:




Samsung's new Omnia SGH-i900 was re-created in a much larger size, using lm3lab's Ubiq'window touchless technology.


For more about lm3labs, including several videoclips, take a look at one of my previous posts:
Lm3Labs, Nicolas Leoillot, and Multimedia Interaction

Jun 18, 2008

Oculusia, an Interactive Multi-Touch Table Game for Four People

The following video is a nice demonstration of a multi-touch table game, Oculusia. The game was created by four students from Aalborg University, Copenhagen, for Medialogy 2008. The students, Jens Fursund, Thomas Frohlich, Thomas Miksa, and Tobias Brogaard, also built the table.



More Information:
Students Explore Multi-touch with Oculusia

Related:

The "Workbench" used in the Medialogy Lab at Aalborg University, Copenhagen:

"The Workbench displays 3D stereo images and the tracking system allows for interaction with virtual objects on the display surface. The tracking system uses infrared light to illuminate the display surface and the camera is only sensitive to infrared light. Basic image processing techniques are then applied to acquire position of the hand."

A Multi-Touch Digital Painting Application pdf (Søren Støvlbæk, Larsen Gavin, Francis Murray & Jesper Højgaard Pedersen: Institute of Electronic Systems, Computer Vision, and Graphics, Aalborg University, 2007)

The ReacTable: A Tangible Musical Instrument pdf (Martin Kaltenbrunner, Music Technology Group, Universitat Pompeu Fabra, 2006)

May 29, 2008

Umajin Creative -Digital Story Telling for Interactive Whiteboard or Touch Screen- free demo available


Umajin Creative is an application designed for digital storytelling. I haven't had a chance to try it yet, but I was impressed with the pictures on the website. It looks kid- and teacher-friendly, and I can see that it has potential! If you use this application, please leave a comment.

According to the website, you can "compose multi-page digital documents with rich text, digital photography (including blue screen support), illustrations, sound, video, 3D models, particle fx, interactive functionality, and so much more." It runs on both Mac and PC, and it supports interactive whiteboards and touch-screen PCs, so you can interact directly with the content. With the HP TouchSmart PC you can use real brushes on the screen to experience digital painting with variable-width brush strokes!

Interactive Digital Storybook:

Screenshots on the Umajin site show an interactive digital storybook, cool-looking digital brushes, sample 3D models, blue-screened cutout images, animated particle effects, and seven custom functions.

I think the people from Fingertapps are responsible for UMAJIN. Below is a video highlighting multi-touch applications:

May 24, 2008

Game Based Learning: Second European Conference

The 2008 Second European Conference on Game-Based Learning will be held in Barcelona, Spain, October 16-17, hosted by the Universitat Oberta de Catalunya.

Speaker Bios

Conference Program

If you are working in a K-12 setting and interested in sharing your ideas about game-based learning, please leave me a message. I'm especially interested in how interactive games support engaged, meaningful learning.


May 19, 2008

More Multi-Touch from members of the NUI group!

It is always exciting to see what members of the NUI group are doing!

Here is a new video of a multi-touch creation by some of the members of the NUI group. Although this is a proof-of-concept example, it is fun to see how it is played out, using the little iPhone-like touch-pad widgets as a navigation tool for the large screen.
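The navigation idea in the demo (a small iPhone-like pad widget steering a much larger display) boils down to a simple coordinate mapping: normalize the touch position on the pad, then scale it to the screen. This is a hypothetical illustration of that idea, not code from the actual Lux framework:

```python
# Hypothetical sketch: map a touch on a small square pad widget to a
# position on a much larger display by normalizing and rescaling.

def pad_to_screen(pad_x, pad_y, pad_size, screen_size):
    """Map a touch at (pad_x, pad_y) on a square pad to screen pixels."""
    nx = pad_x / pad_size   # normalize to 0..1
    ny = pad_y / pad_size
    screen_w, screen_h = screen_size
    return (round(nx * screen_w), round(ny * screen_h))

# A touch at the center of a 320-pixel pad lands at the center of a
# 1920x1080 wall display:
print(pad_to_screen(160, 160, 320, (1920, 1080)))  # (960, 540)
```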


Read Gizmodo's "Multi-touch Goodness" article, an interview with Christian Moore about this demo and his Lux open-source framework. (Christian is a colleague of Harry van der Veen; both are members of the NUI group.)

Here is an excerpt from the interview:
"JD: Why Flash?
CM: Because it's fast to prototype in. However, the software is broken into several segments. One C++ application that tracks hands that talks to Flash... WPF... or another C++ app... and basically everything you can imagine. You can enable multitouch in any environment, like Cocoa."

High-resolution screen shots and additional information can be found on the nuiman website.

For my tech-minded readers:
I'm pretty sure that the C++ application that tracks hands and fingers in the video demo uses Touchlib, a library for creating multi-touch interaction. Touchlib can work with TUIO, a protocol for tabletop tangible user interfaces. Applications such as Flash and Processing support TUIO. For more information about TUIO, read
"TUIO: A Protocol for Table-Top Tangible User Interfaces".
(Information from the NUI group website mentions that OpenCV, the Open Computer Vision Library, found on SourceForge, can support blob detection and tracking.)
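For my tech-minded readers, here is a rough sketch of the bookkeeping a TUIO client performs: "set" messages update a touch point's position, and "alive" messages list the session ids still on the surface, so anything missing from the list has been lifted. The message tuples and class below are my own illustration, not the official TUIO client API:

```python
# Illustrative sketch (not the official TUIO API) of how a client might
# track multi-touch cursors. The TUIO 2D cursor profile sends, per frame:
#   /tuio/2Dcur set <session_id> <x> <y> ...   (position update)
#   /tuio/2Dcur alive <session_id> ...         (ids still touching)

class CursorTracker:
    """Maintains the set of live touch points across TUIO frames."""
    def __init__(self):
        self.cursors = {}  # session_id -> (x, y), normalized 0..1

    def handle(self, message):
        cmd, *args = message
        if cmd == "set":
            session_id, x, y = args[0], args[1], args[2]
            self.cursors[session_id] = (x, y)
        elif cmd == "alive":
            alive = set(args)
            # Drop cursors whose session ids are no longer alive
            for sid in list(self.cursors):
                if sid not in alive:
                    del self.cursors[sid]

tracker = CursorTracker()
tracker.handle(("set", 1, 0.25, 0.75))   # finger 1 touches down
tracker.handle(("set", 2, 0.5, 0.5))     # finger 2 touches down
tracker.handle(("alive", 1, 2))          # both still on the surface
tracker.handle(("alive", 2))             # finger 1 lifted
print(sorted(tracker.cursors))           # [2]
```

A real client would also handle the "fseq" frame-sequence messages and arrive over OSC/UDP, but the set/alive cycle above is the heart of the protocol.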

The people behind TUIO are from the Reactable project, of the Music Technology Group at Pompeu Fabra University in Barcelona:

Here is my plug for the NUI group, once again!

"The NUI group, or Natural User Interface Group, is an interactive media group researching and creating open source machine sensing techniques to benefit artistic and educational applications.

We offer a collaborative environment for developers that are interested in learning and sharing new HCI (Human Computer Interaction) methods and concepts. This may include topics such as: voice/handwriting/gesture recognition, touch computing, computer vision, and information visualization."


FYI
I first came across Harry van der Veen of the NUI group in early 2007 when I was working on touch-screen projects for my HCI and Ubicomp classes, and I've been inspired by all of the creativity I've seen coming from this group.

If you'd like to see more demos, visit the Natural User Interface website, a commercial outgrowth of the work of Harry and his colleagues, where you can view a reel that includes a few touch-screen games. I love the vision statement on this site:

"Technology should enable us to interact with computers, in the same way we interact with the real world; in a way which is natural to us, namely through gestures, expressions, movements, and manipulations. Our vision is to change the way people interact with computers."

May 14, 2008

Paul Bourke's Experiments in Interactive Visual Immersion in Second Life - and more....


"Preprocessed spherical projections from inside a crystal including the Hershfield surface"
Evaluating Second Life as a Collaborative Tool for Scientific Visualization
http://local.wasp.uwa.edu.au/~pbourke/papers/cgat08/

I thought I'd devote a post to Paul Bourke's exciting work in the area of what I call "interactive visual immersion".

According to a bio on a math art website, Paul's working life "has revolved around high performance computing and computer graphics, in particular scientific visualisation: the use of computer graphics to represent scientific data with the aim of enhancing understanding of the underlying structure and processes." Paul is also interested in the use of "novel display technologies that can further enhance the understanding of large and complicated geometric datasets...These engage capabilities of our visual system not exploited by traditional computer displays..."

Paul recently participated in the Computer Games, Multimedia & Allied Technology 08: International Conference & Industry Symposium on Computer Games, Animation, Multimedia, IPTV & Edutainment (CGAT'08, Singapore, April 28-30).

The above picture is from his paper, Evaluating Second Life as a Tool for Collaborative Scientific Visualization. (pdf), taken from Paul's webpage, where you can find a link to the slides from his talk.

Bourke points out that there are few software tools available that support shared, remote collaborative scientific visualization, and that what is available is expensive, exploratory, and/or difficult to install and operate. He proposes that Second Life might prove to be a useful tool for collaborative SciViz, despite some current limitations. Second Life is a cross-platform 3D environment designed specifically for interaction among many people, and it allows for texture mapping, which is a plus for visualization work. Bourke notes limitations as well, such as Second Life's low geometric complexity.

Paul Bourke's website includes a page that lists all of his papers, articles, and seminars, with impressive visuals alongside each entry. Since I recently completed a visualization class and also gave a short presentation about accessible games for health, I found many of Bourke's papers intriguing:

Vertical dome (iDome): Visualisation and Navigable movies
Visualization for scientists, museums, public outreach, and education
Immersive environments and applications to gaming
Exploiting our sense of touch for scientific visualization.
Tactile Visualization: Feel your data!
Apple technology powering displays that engage the human visual system
iDome and digital projection into hemispherical domes

Take the time to explore Bourke's entire website. Look at his texture library and his "other" section!


FYI: Here is a link to the CGAT'08 list of speakers, complete with bios and abstracts.

Mar 6, 2008

More about virtual worlds in education: Edutopia's articles and resources

The recent on-line newsletter from Edutopia has a list of links to articles and media about virtual worlds and games in education. This information complements the video, "Virtual Worlds Tour", highlighted in my previous post.

Get a Life: Students Collaborate in Simulated Roles (Laila Weir)

The School of Second Life (Wagner James Au)

Simulation Nation: The Promise of Virtual Learning Activities: "Inventive computer sims can turn dull lessons into hyperreal experiences, if we can get educators to use them" (Marc Prensky)

Related video: No Gamer Left Behind: "Computer simulations are a natural learning tool for a generation of video game players"

Let the Games Begin: Entertainment Meets Education (Jenn Shreve)

Related:

Loud and Clear: Students Find Their Voices Through Multimedia
(Edutopia Staff)

Edutopia is part of the non-profit George Lucas Educational Foundation

Post a comment if you are using virtual worlds or games in your school!

Feb 10, 2008

British Library's Online Gallery: Turning the Pages- Great for an interactive whiteboard or display

The British Library offers an on-line interactive gallery, "Turning the Pages", where you can "turn" the pages of great books, page by page, view additional information about the page you are looking at, and listen to an audio description. You also have the option of using a magnifying feature that you can move over the surface of the page to get a closer view of the text or illustrations.

If you have an interactive touch screen display or whiteboard, you can turn the pages with your hand, and also move the magnifying viewport about the screen. What a great resource for the classroom!

Here is additional information from the "Turning the Pages" website that will help you get started:

"The standard version of Turning the Pages™ uses the Shockwave plugin, which can be downloaded from the Adobe website, to simulate the action of turning the pages of a real book. For Mac OS X users there is an alternative download. The volumes may not open if you block popups on your computer. Technical specification"

"A new version, Turning the Pages 2.0™, runs on Microsoft Vista operating system (and on Windows XP with the .NET 3 framework). It will also run on other operating systems using the Microsoft Silverlight plugin. Technical specification"

"And there are alternative versions which do not need a plugin but display static images (and enlargements) in standard web pages, in the same window."

Here are some sample titles:

Masterpiece of the Renaissance, Landmarks in Medical History, Glimpses of Medieval Art, Mozart's Musical Diary (with 75 audio excerpts), 15th-Century Church Book, Flemish Masters in Miniature, Sketches by Leonardo, Classic of Botanical Illustration, The Original Alice, by Lewis Carroll....