Showing posts with label HCI.

Nov 15, 2008

Multi-touch and Flash: Links to resources, revisiting Jeff Han's TED 2006 presentation

Despite the increase in interest in systems that support multi-touch, multi-user multimedia interaction, there is a need for creative, tech-savvy types to develop innovative applications. Why? This technology has the potential to make a powerful impact on how people learn, communicate, solve "big picture" problems, and do their various jobs.

CNN's Magic Wall was one of the first applications to gain the attention of the masses, as it was used as an interactive map during the US presidential election process. Touch-screen interaction gained even more notice after the recent SNL parody by Fred Armisen.

If you think about it, the multi-touch applications you see on the news aren't much different from what you'd get from a "single-touch" program.

Fancy, yes. Truly innovative, no.

Just imagine a 3D multi-touch, multi-user, multimedia version of Google Search. I did. I put my sketches in my idea book and hurt my brain thinking about how it could be coded.

Jeff Han, the man behind Perceptive Pixel and CNN's magic wall, had much more up his sleeve when he demonstrated his work at TED 2006. Even if you've previously seen this video, it is worth looking at again. (I've provided a link to the transcript below.)



Transcript of Jeff Han's TED 2006 Presentation

This video presentation had a transformational effect on me as I watched for the first time. Jeff Han brought to life ideas that were similar to my own as a beginning computer student thinking about collaborative educational games and multimedia applications that could be played on interactive whiteboards.

Here are some selected quotes from the video:

"
I really really think this is gonna change- really change the way we interact with the machines from this point on."

"
Again, the interface just disappears here. There's no manual. This is exactly what you kind of expect, especially if you haven't interacted with a computer before."

"Now, when you have initiatives like the hundred dollar laptop, I kind of cringe at the idea that we're gonna introduce a whole new generation of people to computing with kind of this standard mouse-and-windows pointer interface. This is something that I think is really the way we should be interacting with the machines from this point on. (applause)"

"Now this is going to be really important as we start getting to things like data visualization. For instance, I think we all really enjoyed Hans Rosling's talk, and he really emphasized the fact that I've been thinking about for a long time too, we have all this great data, but for some reason, it's just sitting there. We're not really accessing it. And one of the reasons why I think that is, is because of things like graphics- will be helped by things like graphics and visualization and inference tools. But I also think a big part of it is gonna be- starting to be able to have better interfaces, to be able to drill down into this kind of data, while still thinking about the big picture here."

So now what?

A recent post by "Alex" on the AFlex World blog discusses a few solutions. Alex had a chance to meet with Harry van der Veen and Pradeep George from the NUI Group, and Georg Kaindl, a multi-touch interaction designer from the Technical University of Vienna. The focus of the discussion was to come up with ideas to encourage Adobe/Flash designers and developers to learn more about multi-touch technology and interaction, and take steps to create innovative applications.

I especially like the following quote from the post:

"...A quick quote from our conversations: “When our children will walk up to a display, they will touch it and expect to do something.”"

As a techie and a school psychologist, I see an immediate need for innovative applications. I know that there is a built-in market in the schools, at least for low-cost applications. Despite economic constraints, many school districts continue to invest in interactive whiteboards (IWB's). They are cropping up in preschool and K-12 settings, and teachers are searching for more than what's currently available.

Interactive, collaborative applications are needed in fields such as health care, patient education, finance & economics, urban planning, civil engineering, travel & tourism, museums & exhibitions, special events, entertainment, and more.

Smart Technologies, the company behind SmartBoards, has a new interactive multi-touch, multi-user table designed for K-6 education, the Smart Table. Hewlett Packard has several versions of the TouchSmart PC, which can support at least duo-touch, if not multi-touch, multi-user applications. There are numerous all-in-one large-screen displays on the market that support multi-touch and multi-user interaction.

Quotes from Harry van der Veen, of Multitouch NL:

"In 10 years from now when a child walks up to a screen he expects it to be a multi-touch screen with which he can interact with by using gestures."

"...multi-touch screens will be as common for children as the internet is nowadays, as common as mobile phones are for us."


Here is a quote from a conversation I had with Spencer, who blogs at TeacherLED.

"It was interesting this week as I was in a classroom with a teacher who I've not worked with before... he had 2 students using the whiteboard who kept touching it together by mistake. The teacher, exasperated, said to himself, "Why can't they make these things to accept 2 touches without going crazy!"

Proof of the demand! I think you are right when teachers spot the limitations and then see the technology on visits to museums, that might stimulate demand."


Spencer creates cool interactive mini-applications, mostly for math, using Flash, that teachers (and students) love to use on interactive whiteboards. (He's interested in multi-touch, too.)


So what are we waiting for?!

Related:
Natural User Interface Europe AB meets Adobe
Georg's Touche Framework
NUI Group
TeacherLED
Interactive Touch-Screen Technology, Participatory Design, and "Getting It".
Hans Rosling's 2007 TED talk

Oct 20, 2008

The atracTable Multi-Touch System from Atracsys

The atracTable is a multi-touch presentation system developed by the Swiss engineering and development group Atracsys. It is similar to Microsoft's Surface. Interaction can be triggered by laying objects on the table.

(Marc Hottinger and Lionel Tardy, of Amorpik, designed the interface.)

http://www.atracsys.com/images/atracTable2_ex.jpg





AtracTable FAQs

From the Atracsys website, "How does it work":

"atracTable is the combination of a video-based movements tracking system, a computer, a beamer, and a screen.

When you lay an object on the screen, the tracking system recognizes the object with a visual tracking tag on the base of the product. At the same time, the tracking system detects the positions and movements of your fingers and of the objects on the screen.

The whole pieces of information concerning the product(s) and the different locations and movements are sent to the computer. The data is processed and sent back to the beamer.

The processing is performed instantaneously. The real-time interaction is obtained by continuous detection of finger and object movements. The whole technology fits in the table and is invisible for customers."

Another creation by Atracsys is beMerlin, a gesture-based interactive system that plays out as an interactive window. Although it is used for visual merchandising, it looks like it has potential for other uses, such as wayfinding, building directories, interactive museum exhibits, and education.



http://www.atracsys.com/images/beMerlin2_ex.jpg

http://www.atracsys.com/images/beMerlin1_ex.jpg


Oct 17, 2008

Time for More Touch! Part Two: Microsoft's "Oahu", a hypothetical (?), affordable version of the Surface multi-touch table.

Long Zheng, from the I Started Something blog, was privy to a survey from Microsoft about "Oahu" (via someone named Kerien).

The following description of Oahu is a quote from Long Zheng's website, and reportedly was the introductory section of Microsoft's survey:

"The following questions refer to a computing device called “Oahu” that has an innovative multi-touch screen. Oahu is a flat screen that sits horizontally like a table top. You can interact with Oahu by touching the screen, instead of using a mouse, and more than one person can interact with Oahu at the same time. You and others can move objects on the screen with your hands and touch icons to open up programs, games, or music. People using the device can also use their fingertips to expand and shrink objects on the screen. The screen recognizes people’s hand movements and touches and reacts accordingly. You can bring up an on-screen keyboard to input information. Oahu also works with other devices (such as digital cameras, cell phones, and MP3 players) by getting information from or sending information to them. Oahu is on with no waiting time to start up. Oahu can come as a freestanding table, placed into a piece of furniture, or built into a countertop. The type of Oahu devices we are asking about today are not portable but if they are furniture or tables, they can be placed anywhere in your home."

mmmm.... sounds just like a Surface....
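The "expand and shrink objects" gesture the survey describes comes down to comparing the distance between two fingertips from one frame to the next. Here is a minimal sketch of that idea; the function name and pixel values are my own illustration, not anything from Microsoft:

```python
import math


def pinch_scale(p1_old, p2_old, p1_new, p2_new):
    """Return the zoom factor implied by two fingers moving on the surface:
    the ratio of the new finger spacing to the old finger spacing."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return dist(p1_new, p2_new) / dist(p1_old, p2_old)


# Two fingers 100 px apart spread to 200 px apart: the object doubles in size.
factor = pinch_scale((100, 100), (200, 100), (50, 100), (250, 100))
```

A real table also has to track which touch belongs to which finger between frames, which is where the session-tracking protocols discussed elsewhere on this blog come in.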


Photo via I Started Something

The price of Oahu quoted in the survey? $1,499.00. A substantial savings, considering that the price of Microsoft's Surface is $10,000.00.

This price approaches the affordable range for schools. I wonder if any questions in the survey addressed the learning aspects of the Oahu, other than helping children with homework. With the upcoming Windows 7 OS and its multi-touch capabilities, I'm sure we'll be seeing the spread of this technology.

FYI:
Long Zheng is working on a Business Commerce and Multimedia Systems double degree at Monash University in Australia. His purpose in blogging is to stay on the cutting edge of breaking technology news.

For more information about Windows 7, see the Engineering Windows 7 blog.

Sep 3, 2008

Lazybrains 3D game: Another Brain-Computer Interface!

I came across an article about the BCI (Brain-Computer Interface) 3D game, Lazybrains, on the Wired website today. "Brain Scanners, Fingercams Take Computer Interfaces Beyond Multitouch"

LazyBrains was a Digital Media Senior Project of Aaron Bohenick, James Borden, Sachary Brooks, Kenneth Oum, and Jordan Santell, students at Drexel University.


Here is a video:

Game Teaser


Description of the BCI, a fNIR:
  • "The Functional Near-Infrared Imaging Device (fNIR) is a technology that was developed at the University of Pennsylvania, but is currently being used by the Drexel University biomedical department. The device shines infrared light into the user's forehead, and records the amount of light that gets transmitted back. The change in the amount of light can be used to deduce information about the amount of oxygen in the blood. When the user concentrates, their frontal lobe needs more oxygen and this change can be detected by the device."
http://www.voxel6.com/images/fNIR_CUTOUT_thumb.png
For more information, see the Voxel6 website.
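The detection principle in that description - concentration raises the frontal lobe's oxygen demand, which changes how much light comes back to the sensor - can be caricatured as a comparison against a resting baseline. A toy sketch with invented numbers and an invented threshold, not the Drexel team's actual signal processing:

```python
def is_concentrating(samples, baseline, threshold=0.10):
    """Flag concentration when the average reflected-light reading
    deviates from the resting baseline by more than the threshold.
    The 10% threshold is made up for illustration."""
    avg = sum(samples) / len(samples)
    return abs(avg - baseline) / baseline > threshold


resting = is_concentrating([1.00, 1.01, 0.99], baseline=1.0)  # near baseline
focused = is_concentrating([0.85, 0.83, 0.86], baseline=1.0)  # more light absorbed
```

Real fNIR processing is far more involved (filtering, motion-artifact rejection, calibration per user), but the game-control idea is essentially this: map a sustained deviation to an input signal.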


Here is a link to a post about a similar BCI system:

Emotiv System's Neural Game Controller Headset: Human-Computer Interface of the Future?

It will be interesting to see how this technology unfolds. In my opinion, it will be quite useful for cognitive rehabilitation, as well as providing access to games for people who have significant physical limitations.

Jul 30, 2008

Microsoft's Multi-Touch Sphere "Photo Globe"



I just had to post about this, even though this news has been rapidly circulating around the blogSPHERE.

The sphere could support an interactive travel planning/travel memory application.

Imagine if you were on a cruise ship, and uploaded your photos to the globe, and voila, they'd show up on the sphere in your mother's living room! Geo-tagged, cross-referenced, synched with your 2.0 apps, and linked to your vacation video-clips you previously uploaded to YouTube.

Jun 18, 2008

Hands On Computing: How Multi-Touch Screens Could Change the Way we Interact with Computers and Each Other (link to Scientific American Article)

More Multi-Touch!

Scientific American, June 2008: "Hands On Computing: How Multi-touch Screens Could Change the Way We Interact with Computers and Each Other." "The iPhone and even wilder interfaces could improve collaboration without a mouse or keyboard."

"It is easy to imagine how photographers, graphic designers or architects—professionals who must manipulate lots of visual material and who often work in teams—would welcome this multi-touch computing. Yet the technology is already being applied in more far-flung situations in which anyone without any training can reach out during a brainstorming session and move or mark up objects and plans." -Stuart Brown

Link: Emotiv System's Neural Game Controller Headset: Human-Computer Interface of the Future?

If you are looking for information about brain-computer interfaces, follow the link to my post about Emotiv Systems' neural interface on the Technology-Supported Human-World Interaction blog.

Emotiv System's Neural Game Controller Headset: Human-Computer Interface of the Future?

Also see:
Game Interaction via Thoughts and Facial Expressions: EPOC - Emotiv Systems Neural Interface

May 24, 2008

Dance.Draw Project : Exquisite Interaction - Collaboration between Software Information Systems -HCI- and Dance Departments at UNC-Charlotte


DANCE.DRAW: EXQUISITE INTERACTION
(Updated)

"The movement of the visualizations are artifacts in real-time of the movements of the dancers. They draw while they dance, they dance together and they draw together. Every performance generates a new visual imprint." -DanceDraw website


Interactive multimedia technology, blended with the arts!

Dr. Celine LaTulipe, from UNC-Charlotte's Software and Information Systems Department, Professor Sybil Huskey, from the dance department, dance students, and others collaborated to create an amazing performance that I had the opportunity to see during the Visualization in the World Symposium in April 2008.

If you look closely, you will see that each dancer holds two wireless mice, one in each hand. The mice trigger the visualization that is projected in the background. Dr. LaTulipe has focused some of her research on two-handed computer interaction. It is interesting to see how her work has been applied to this beautiful "off-the-desktop" application.

Dance.Draw is a work in progress- visit the following links for more information:

Website (Updated)
Movie
Technical Info
Dr. Kosara's Eager Eyes post about Dance.Draw

Note:
Dr. LaTulipe was my HCI professor- Dr. Kosara was my Visualization/Visual Communication professor.



May 19, 2008

More Multi-Touch from members of the NUI group!

It is always exciting to see what members of the NUI group are doing!

Here is a new video of a multi-touch creation by some of the members of the NUI group. Although this is a proof-of-concept example, it is fun to see how it is played out, using the little iPhone-like touch-pad widgets as a navigation tool for the large screen.


Read the "Multi-touch Goodness" article on Gizmodo, an interview with Christian Moore about this demo and his Lux open-source framework. (Christian is a colleague of Harry van der Veen; both are members of the NUI group.)

Here is an excerpt from the interview:
"JD: Why Flash?
CM: Because it's fast to prototype in. However, the software is broken into several segments. One C++ application that tracks hands that talks to Flash... WPF... or another C++ app... and basically everything you can imagine. You can enable multitouch in any environment, like Cocoa."

High-resolution screen shots and additional information can be found on the nuiman website.

For my tech-minded readers:
I'm pretty sure that the C++ application that tracks hands and fingers in the video demo uses Touchlib, a library for creating multi-touch interaction. Touchlib can work with TUIO, a protocol for tabletop tangible user interfaces. Applications such as Flash and Processing support TUIO. For more information about TUIO, read "TUIO: A Protocol for Table-Top Tangible User Interfaces".
(Information from the NUI group website mentions that OpenCV, the Open Computer Vision Library, found on SourceForge, can support blob detection and tracking.)
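For the tech-minded: a TUIO 1.0 tracker reports touches each frame with "alive", "set", and "fseq" messages sent over OSC. Here is a minimal sketch of that update cycle, with messages represented as plain Python tuples rather than decoded OSC packets; the class is my own illustration, not Touchlib code:

```python
class TuioCursorTracker:
    """Tracks active touch points from TUIO-style alive/set/fseq messages."""

    def __init__(self):
        self.cursors = {}  # session id -> (x, y), normalized 0..1 coordinates
        self.frame = 0

    def handle(self, message):
        kind = message[0]
        if kind == "alive":
            # "alive" lists the session ids still on the surface;
            # anything missing has been lifted, so drop it.
            alive = set(message[1])
            self.cursors = {s: p for s, p in self.cursors.items() if s in alive}
        elif kind == "set":
            # "set" carries one cursor's state: id and position.
            # (The full profile also includes velocity and acceleration.)
            _, session_id, x, y = message[:4]
            self.cursors[session_id] = (x, y)
        elif kind == "fseq":
            # "fseq" closes the frame with a sequence number.
            self.frame = message[1]


tracker = TuioCursorTracker()
tracker.handle(("alive", [1, 2]))
tracker.handle(("set", 1, 0.25, 0.50))
tracker.handle(("set", 2, 0.75, 0.50))
tracker.handle(("fseq", 1))

tracker.handle(("alive", [2]))  # cursor 1 lifted off the surface
tracker.handle(("set", 2, 0.80, 0.55))
tracker.handle(("fseq", 2))
```

Flash and Processing clients do essentially this under the hood, which is why the same tracker can drive applications written in very different environments.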

The people behind TUIO are from the Reactable project, of the Music Technology Group at Pompeu Fabra University in Barcelona.

Here is my plug for the NUI group, once again!

"The NUI group, or Natural User Interface Group, is an interactive media group researching and creating open source machine sensing techniques to benefit artistic and educational applications.

We offer a collaborative environment for developers that are interested in learning and sharing new HCI (Human Computer Interaction) methods and concepts. This may include topics such as: voice/handwriting/gesture recognition, touch computing, computer vision, and information visualization."


FYI
I came across Harry van der Veen and the NUI group in early 2007 when I was working on touch-screen projects for my HCI and Ubicomp classes, and I'm inspired by all of the creativity I've seen coming from this group.

If you'd like to see more demos, visit the Natural User Interface website, a commercial outgrowth of the work of Harry and his colleagues, where you can view a reel that includes a few touch-screen games. I love the vision statement on this site:

"Technology should enable us to interact with computers, in the same way we interact with the real world; in a way which is natural to us, namely through gestures, expressions, movements, and manipulations. Our vision is to change the way people interact with computers."

Mar 17, 2008

Look, touch, listen, and play: Seth Sandler's Interactive Audio Touch Table Video; NUI Group and Google's Summer of Code

Seth Sandler's most recent video of the Audio Touch interactive table provides a good demonstration of how multi-touch on a table can work with music.



Seth is a member of the NUI group (Natural User Interface). He is finishing a Bachelors degree in Interdisciplinary Computing and the Arts, with an emphasis on Music, at the University of California, San Diego. His research and development work centers around multi-touch, multi-user musical interfaces.

Here is an update about the NUI group:

"Natural User Interface Group, or NUI Group, is an interactive media group researching and creating open source machine sensing techniques to benefit artistic and educational applications."

"We offer a collaborative environment for developers that are interested in learning and sharing new HCI (Human Computer Interaction) methods and concepts. This may include topics such as: voice/handwriting/gesture recognition, touch computing, computer vision, and information visualization."

The NUI group has been selected as a mentoring organization for Google's Summer of Code, for those of you who are interested in working on open-source code for multi-touch systems. The student application process begins Monday, March 24th, 2008, and ends Monday, March 31st, 2008.

NUI group's project ideas page outlines the requirements for the application, which include a 7500-word project proposal. The project page has a long list of ideas to spark some thinking for potential Summer of Code applicants.

For those of you who aren't into coding, I encourage you to take a look at the NUI Group's project ideas page just to get an idea of the interesting ideas that are being explored. The page has a list of links to other good resources.

Spread the word to anyone who might be interested in the NUI Group's projects for the Summer of Code. We need to get more people interested in STEM careers, and the project ideas outlined by the NUI group look enticing.


Sep 8, 2007

About: Interaction Design (off the desktop)

Interaction design is a relatively new field that combines concepts related to human-computer interaction (HCI), mobile, pervasive, and ubiquitous computing (ubicomp), interface design, service design, user-experience design, interactive media design, and more.

According to Dan Saffer, an interaction designer at Adaptive Path, and author of Interaction Design: Creating Smart Applications and Clever Devices, interaction design is "about people: how people connect with other people through the products and services they use." If you are interested in learning more about designing for interaction, Saffer's book is a great starting point.

Saffer has recently established a wiki about interactive gestures, a site for the "dissemination of gestural interface information such as found on the iPhone and Wii." This is an important resource for those of us who are interested in developing useful interactive applications for emerging technologies.

(Related information can also be found on this blog.)

Aug 19, 2007

Technology Supported Human-World Interaction Blog

I've started another blog: Technology Supported Human-World Interaction (TSHWI)

"TSHWI is about the development of newer technologies that support human-world interaction. This concept incorporates the best of HCI, CSCW, universal usability, interaction design, game design, educational technology, and Ubicomp/Pervasive Computing. The definition of "World" can include virtual worlds, simulations, VR, or augmented reality - just about anything humans interact with in their daily lives."

Jul 28, 2007

HP TouchSmart PC - an interactive touch screen for the home- and classroom?


Photos from the HP TouchSmart Website





I think this PC has possibilities for use in libraries, school media centers, and in classrooms.




This is a short video clip about the HP TouchSmart PC, a medium-sized touch-screen display that runs on Windows Vista. From what I understand, it uses technology from NextWindow.

If you use TouchSmart PC or something similar, or if you develop applications for the TouchSmart or other interactive touch applications, let me know what you think!

For related videos, visit the TouchSmart YouTube channel.
http://www.youtube.com/user/TouchSmart

Jul 16, 2007

More touch screen "surface" display musings...



I had my first chance to use an interactive touch-screen SmartBoard, by Smart Technologies in 2002-03. Since I work mostly with kids and teens, I wondered why large-display touch screen technology wasn't more widespread, since there are so many free, interactive websites that provide pretty engaging activities for users.

One of the things I learned was that large-display touch-screen technology is in the preschool stage. There are problems with screen responsiveness, screen resolution, durability, and input.

In recent years, the idea of a touch screen has evolved to table-tops and drafting boards, embedded within wireless systems that allow for interoperability with mobile devices and remote applications.

Great technology exists, but no one has pulled all the components together in a way that can easily scale for the people who would benefit from this sort of technology the most - people who spend most of their day teaching, learning, or both. I had a great experience using a NextWindow Human Touch large-screen display for some of my projects last semester. It was difficult for me to track one down, but once I got my hands on it, I liked it, even though it did not have multi-touch capabilities.

One laptop for each child? That was a good idea for the late 1990s and early 2000s. One high-quality, affordable, large touch-screen display or table for each classroom would be more effective.

One touch-screen display/table for every 4-6 students would be better.
Is there anyone out there who is up for the challenge?

Next Post: Updated links to interactive multimedia websites appropriate for large touch screen surfaces.

May 21, 2007

First attempt at a touch-screen "Poetry Picture Share" application




This was my first attempt at a "poetry picture share" application. It was designed for use on a multi-touch table and can be accessed remotely so people in different places can move things around on the screen.   The video shows how the application works on a NextWindow Human Touch interactive large-screen display.

Version 2 will be posted soon. I am planning on adapting this application for use with students with special needs, such as those who have autism or other communication disorders.

Google Earth with photo overlays on a touch screen 2


Here is another demo videoclip of a globe created in Google Earth using photo overlays, with links to video clips uploaded to YouTube and embedded in individual posts on a blog. The above photo and the video clip show the application on a NextWindow Human Touch large-screen display.

This application would be great on a touch-table or touch-table set up on a drafting board. Although it was designed for a travel-planning application, it would work well in educational settings in subjects such as geography.

NextWindow Human Touch Interactive display using photo overlays on Google Earth




This application was part of a travel-planning prototype developed for a course in Human-Computer Interaction.   The application was demonstrated on a NextWindow Human Touch large screen display.

Would it work on the iPhone?

Jan 25, 2007

Link to info about a "super touch screen" for Google Earth - it has multiple uses.

Watch this video about a "super touch screen" for Google Earth from Perceptive Pixel! More information about this can be found on the TechPsych blog, and from the Google Earth Blog. I think this application would be great for visual learners.

Applications like this are immersive and engaging. If you are an educator, think about the ways you could use this application in your classroom!

Link to related article.

-Lynn