Showing posts with label multi-touch. Show all posts

May 28, 2009

Multi-player multi-touch: "NuMTyPysics", based on Tim Edmonds's Numpty Physics (similar to Crayon Physics)

I have Numpty Physics on my Nokia n800 internet tablet, and Crayon Physics on my HP TouchSmart PC. Both are designed for single touch, and are fun to play. Since my TouchSmart can handle duo-touch input, I wondered what Crayon Physics might be like if it supported two players at once.

http://www.tuxi.com.ar/wp-content/uploads/numpty-physics-tuxi.jpg

As you can see from the video, Thomas Perl and his colleagues have figured this out, at least with Numpty Physics!



The music is worth the watch. It's by Triplexity.

Numpty Physics and Crayon Physics both use the Box2D engine. Here is some information from the website:

"NuMTyPYsics are our enhancements to Tim Edmond's NumptyPhysics game. We added support for receiving TUIO messages from tbeta via PyTUIO by embedding an Python interpreter into the NumptyPhysics code. Currently, we simply emulate mouse input by pushing hand-crafted input events (SDL_Event) onto the SDL event queue (SDL_PushEvent). In the future, we plan to do bi-directional communication between the game engine (written in C++) and our multi-touch handling code, which will be written in Python."
-Thomas Perl
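The event-injection trick Thomas describes can be sketched in a few lines of Python. This is a toy illustration, not the NuMTyPysics source: the function names and the 800x480 screen size are my own inventions, but the two facts it leans on are real ones - TUIO cursor coordinates are normalized to the range [0, 1], and SDL_PushEvent simply appends a hand-built SDL_Event to the SDL event queue.

```python
# Toy sketch of the mouse-emulation idea (illustrative only; all names
# here are hypothetical, not from the NuMTyPysics code).

def tuio_to_pixels(x_norm, y_norm, width, height):
    """TUIO cursor coordinates are normalized to [0, 1]; map them to
    integer pixel coordinates for a given screen size."""
    return (int(x_norm * width), int(y_norm * height))

def to_mouse_event(x_norm, y_norm, pressed, width=800, height=480):
    """Build a dict standing in for a hand-crafted SDL_Event
    (SDL_MOUSEBUTTONDOWN on touch-down, SDL_MOUSEBUTTONUP on lift)."""
    x, y = tuio_to_pixels(x_norm, y_norm, width, height)
    return {"type": "MOUSEBUTTONDOWN" if pressed else "MOUSEBUTTONUP",
            "x": x, "y": y, "button": 1}

event_queue = []  # stands in for the SDL event queue (SDL_PushEvent)
event_queue.append(to_mouse_event(0.5, 0.25, pressed=True))
```

Since each touch is translated into an ordinary mouse event, the unmodified single-touch game logic handles it; the bi-directional C++/Python bridge they mention would be the next step beyond this.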

Note:
I've used Crayon Physics Deluxe with several of the students I work with who have severe autism. It is amazing how well they can figure out solutions to the levels. It would be even better if it could be enabled for duo-touch. The activity supports joint attention, which is a very important social interaction skill for young people with autism to develop.

May 20, 2009

xXtraLab's Multi-touch Projects

xXtraLab is an interaction design firm located in Taiwan. The xXtraLab team has been working on some interesting multi-touch projects. Take a look!

Multi-touch wall for briefing and real-time info sharing
Multisensory iTea-table



"xXtraLab Design Co. is one of the leading multimedia company in Taiwan, focusing on the design & engineering of HCI (Human-Computer Interaction) interfaces in museum, exposition, and showrooms (client lists here). Members of xXtraLab come from diversifying fields such as visual design, digital media, architecture, interior design, information engineering, design computing, industrial design, and fine art. we respect different cultural views and work as a multi-disciplinary team to offer inclusive design services."

May 15, 2009

iPod Touch Apps, WiiMote Whiteboards, 3D multi-user environments in education, and a teacher's video of the SMART Table in action.

I thought I'd share the last two posts from my TechPsych blog here, since they focus on newer technologies that involve multi-touch or multi-user interaction.

A teacher explores the multi-touch, multi-user SMART Table in his classroom

From what I can see, multi-touch, multi-user applications are ideal for helping students learn collaborative, cooperative social skills at the same time they learn academic skills. SMART Technologies, well known in the education world for its interactive whiteboards, has placed a few of its SMART Tables in classrooms. One teacher, Tom Barrett, is sharing his journey with technology, including the SMART Table, on-line via his blog, SPACE FOR ME TO EXPLORE.

The following is a video of young children doing math on a multi-touch SMART Table. In order to solve the finger-arithmetic problems, the students must work cooperatively.


Addition App - Set to multi-touch finger counts from Tom Barrett on Vimeo.

(In the video, you will see some shapes that Tom mistakenly added, so disregard them as you view the video.)
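The cooperative mechanic is simple to describe: the table counts how many fingertips are on the surface at once and compares that count to the answer. Here is a toy sketch of that logic - my own guess at how such an activity works, not SMART's actual software:

```python
# Toy model of the finger-counting addition activity (illustrative only;
# not SMART's implementation).

def check_answer(active_touches, target_sum):
    """active_touches: list of (x, y) contact points currently on the
    surface; the answer is correct when the number of simultaneous
    touches equals the target sum."""
    return len(active_touches) == target_sum

# For 3 + 4, seven fingers must be down at the same time - more than
# one child can comfortably supply alone, so the group must cooperate.
touches = [(120, 80), (130, 85), (300, 200), (310, 210),
           (500, 90), (510, 95), (520, 100)]
correct = check_answer(touches, 3 + 4)
```

Because a multi-touch surface reports all contacts at once, the "answer" is necessarily a joint product of everyone around the table, which is exactly the teamwork Tom describes below.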


Here is a quote from Tom's blog about his experience with the addition application:

"I was most pleased with the level of engagement from the children and although on the surface this seems to be a simple application, it definitely requires a level of teamwork that you often do not get.

It is intriguing watching the children’s first attempts and how they realise they need to work together. As the challenge is small scale, once they have been successful they begin to refine their approach, communicate better and so get to later answers quicker."


Educational iPod Touch Apps for Students and Teachers: Eric Sailers' blog
Eric Sailers is a speech and language pathologist and assistive technology specialist who explores new technologies that he's found useful in the schools. Below is Eric's demonstration of applications such as "I Write Words", Wikipanion, Preschool Adventure, Twitterific, Google Mobile, and the calendar.

To demonstrate the iPod Touch, Eric uses the Elmo document camera, which projects onto a screen. Note that as Eric demonstrates the Twitterific application, he navigates to a link to a blog of one of his colleagues, which highlights the way one school is using the Wii as an augmentative communication tool and also as an assessment tool for occupational therapy.



Take some time to explore Eric's Speech-Language Pathology Sharing blog. It is full of great information!

Update: Here are two video clips Eric created to prepare for an interview as a finalist for the Cox Communications Innovation in Special Education award. In one of the videos, Eric discusses the EduSim application, a 3D multi-user virtual world platform and authoring toolkit intended for classroom interactive whiteboards.

Interactive Applications for Special Education: Wiimote Whiteboards and iPod Touch in Special Education, Part I


Wiimote Whiteboards and iPod Touch in Special Education, Part II

May 10, 2009

Michael Haller Discusses Multi-touch, Interactive Surfaces, and Emerging Technologies for Learning

I came across an excellent overview of interactive display technologies that hold promise for education. The link below is to a research article written by Michael Haller for BECTA, formally known as the British Educational Communications and Technology Agency.

Emerging Technologies for Learning: Interactive Displays and Next Generation Interfaces(pdf)
Becta Research Report, Volume 3 (2008), Michael Haller


"Multi-touch and interactive surfaces are becoming more interesting, because they allow a natural and intuitive interaction with the computer system.

These more intuitive and natural interfaces could help students to be more
actively involved in working together with content and could also help improve whole-class teaching activities. As these technologies develop, the barrier of having to learn and work with traditional computer interfaces may diminish.

It is still unclear how fast these interfaces will become part of our daily life and
how long it will take for them to be used in every classroom. However, we strongly believe that the more intuitive the interface is, the faster it will be accepted and used. There is a huge potential in these devices, because they allow us to use digital technologies in a more human way." -Michael Haller

Michael Haller works at the department of Digital Media of the Upper Austria University of Applied Sciences (Hagenberg, Austria), where he is the head of the Media Interaction Lab.

Michael co-organized the Interaction Tomorrow course at SIGGRAPH 2007, along with Chia Shen, of the Mitsubishi Electric Research Laboratories (MERL). Lecturers included Gerald Morrison, of SMART Technologies, Bruce H. Thomas, of the University of South Australia, and Andy Wilson, of Microsoft Research. The course materials from Interaction Tomorrow are available on-line, and include videos, slides, and course notes.

Below is an excerpt from the description of the Interaction Tomorrow SIGGRAPH 2007 course:

"Conventional metaphors and underlying interface infrastructure for single-user desktop systems have been traditionally geared towards single mouse and keyboard-based WIMP interface design, while people usually meet around a table, facing each other. A table/wall setting provides a large interactive visual surface for groups to interact together. It encourages collaboration, coordination, as well as simultaneous and parallel problem solving among multiple people.

In this course, we will describe particular challenges and solutions for the design of direct-touch tabletop and interactive wall environments. The participants will learn how to design a non-traditional user interface for large horizontal and vertical displays. Topics include physical setups (e.g. output displays), tracking, sensing, input devices, output displays, pen-based interfaces, direct multi-touch interactions, tangible UI, interaction techniques, application domains, current commercial systems, and future research."

It is worth taking the time to look over Haller's other publications. Here are a few that would be good to read:

M. Haller, C. Forlines, C. Koeffel, J. Leitner, and C. Shen, 2009. "Tabletop Games: Platforms, Experimental Games and Design Recommendations." Springer, 2009. in press [bibtex]

A. D. Cheok, M. Haller, O. N. N. Fernando, and J. P. Wijesena, 2009.
"Mixed Reality Entertainment and Art," International Journal of Virtual Reality, vol. X, p. X, 2009. in press [bibtex]

J. Leitner, C. Köffel, and M. Haller, 2009. "Bridging the gap between real and virtual objects for tabletop games," International Journal of Virtual Reality, vol. X, p. X, 2009. in press [bibtex]


M. Haller and M. Billinghurst, 2008.
"Interactive Tables: Requirements, Design Recommendations, and Implementation." IGI Publishing, 2008. [bibtex]

D. Leithinger and M. Haller, 2007. "Improving Menu Interaction for Cluttered Tabletop Setups with User-Drawn Path Menus," Horizontal Interactive Human-Computer Systems, 2007. TABLETOP 07. Second Annual IEEE International Workshop on, pp. 121-128, 2007. [bibtex]


J. Leitner, J. Powell, P. Brandl, T. Seifried, M. Haller, B. Doray, and P. To, 2009. "Flux: a tilting multi-touch and pen based surface," in CHI EA 09: Proceedings of the 27th international conference extended abstracts on Human factors in computing systems, New York, NY, USA, 2009, pp. 3211-3216. [bibtex]

P. Brandl, J. Leitner, T. Seifried, M. Haller, B. Doray, and P. To, 2009. "Occlusion-aware menu design for digital tabletops," in CHI EA 09: Proceedings of the 27th international conference extended abstracts on Human factors in computing systems, New York, NY, USA, 2009, pp. 3223-3228. [bibtex]


References from the BECTA paper:

Elrod, S., Bruce, R., Gold, R., Goldberg, D., Halasz, F., Janssen, W., Lee, D., McCall, K., Pedersen, E., Pier, F., Tang, J., and Welch, B., Liveboard: a large interactive display supporting group meetings, presentations, and remote collaboration, CHI ’92 (New York, NY, USA), ACM Press, 1992, pp. 599–607.

Morrison, G., ‘A Camera-Based Input Device for Large Interactive Displays’, IEEE Computer Graphics and
Applications, vol. 25, no. 4, pp. 52-57, Jul/Aug, 2005.

Albert, A. E. The effect of graphic input devices on performance in a cursor positioning task. Proceedings of the Human Factors Society 26th Annual Meeting, Santa Monica, CA: Human Factors Society, 1982, pp. 54-58.

Dietz, P.H., Leigh, D.L., DiamondTouch: A Multi-User Touch Technology, ACM Symposium on User
Interface Software and Technology (UIST), ISBN: 1-58113-438-X, pp. 219-226, November 2001.

Rekimoto, J., SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces,

CHI 2002, 2002.

Kakehi, Y., Iida, M., Naemura, T., Shirai, Y., Matsushita, M.,
Ohguro, T., ‘Lumisight Table: Interactive View-Dependent Tabletop Display Surrounded by Multiple Users’, In IEEE Computer
Graphics and Applications, vol. 25, no.1, pp 48 – 53, 2005.

Streitz, N., Prante, P., Röcker, C., van Alphen, D., Magerkurth, C.,
Stenzel, R., ‘Ambient Displays and Mobile Devices for the Creation of Social Architectural Spaces: Supporting informal communication and social awareness in organizations’ in Public and Situated Displays: Social and Interactional Aspects of Shared Display Technologies, Kluwer Publishers, 2003. pp. 387-409.

Ishii, H., Underkoffler, J., Chak, D., Piper, B., Ben-Joseph, E.,
Yeung, L. and Zahra, K., Augmented Urban Planning Workbench: Overlaying Drawings, Physical Models and Digital Simulation. IEEE and ACM International Symposium on Mixed and Augmented Reality ACM Press, Darmstadt, Germany.

Han, J. Y., Low-cost multi-touch sensing through frustrated total internal reflection, UIST ’05 (New York), ACM
Press, 2005, pp. 115–118.

Hull, J., Erol, B., Graham, J., Ke, Q., Kishi, H., Moraleda, J., Olst, D., Paper-Based Augmented Reality. In
Proceedings of the 17th International Conference on Artificial Reality and Telexistence (Esbjerg, Denmark, November 28-30, 2007). ICAT ’07. IEEE, 205-209.

Haller, M., Leithinger, D., Leitner, J., Seifried, T., Brandl, P., Zauner, J., Billinghurst, M., The shared design space. In SIGGRAPH ’06: ACM SIGGRAPH 2006 Emerging technologies, page 29, New York, NY, USA, 2006. ACM Press.

Research email: emtech@becta.org.uk

Main email: becta@becta.org.uk
URL: www.becta.org.uk

(This was also posted on the TechPsych blog.)

May 7, 2009

Rhizome 2009: A Lovely Interactive Multi-touch App on a Flexible Lycra Screen

Loran Bey is a member of the NUI Group. He created Rhizome 2009 using Unity 3D and tbeta, now known as CCV (Community Core Vision). The screen in the video is made from flexible lycra, which provides a tangible interaction effect. The music in the background is Aphex Twin's "Avril 14th".

MultiTouch Screen Lycra from Loran Bey on Vimeo.

Unity 3D is a game development tool for browser-based games, including games optimized for the iPhone. (If you visit the Unity 3D website, be sure to download their 3D web plugin and visit their relaxing on-line Tropical Paradise.)

The screen displayed in the video was inspired by the 2005 Khronos Projector installation, by Alvaro Cassinelli, an assistant professor at the Ishikawa Komuro Laboratory at the University of Tokyo. Khronos is described as a "video time-warping machine with a deformable screen."

The Khronos Projector website provides several simulation applets built with Processing that you can play with. They were fun to interact with on my HP TouchSmart PC. I liked Behind the Door the best.


http://www.k2.t.u-tokyo.ac.jp/members/alvaro/Khronos/SNAPSHOTS/PressureCity2_blurred.jpg


The following video demonstrates how the Khronos application works:


Take a look at Alvaro Cassinelli's archive of his interactive media art if you have the chance!

Cassinelli's Meta-Perception research group is doing some interesting things, too:

"The goal of this group is to research methods for capturing and manipulating information that is normally inaccessible to humans and machines. In doing so, we hope to create new ways of perceiving the world and interacting with technology. Our research methods span fields such as human-computer interaction, media arts, physiology, and ethics."

Apr 26, 2009

Good news about the San Jose Interactive Displays Conference (that I couldn't attend)

Sadly, I was unable to attend the Interactive Displays conference that was held last week in San Jose, California. Jeff Han, the guy in the video clip in my sidebar, was one of the keynote presenters. I really wish I could have seen - and touched - the action!

Update (4/27/09) from Thomas Hansen, a member of NUI Group, who attended IDC:

"..
It was certainly an interesting event. I got to talk (albeit briefly) to both Andy Wilson and Jeff Han. Both of whom produce very inspiring and world class UI/HCI research on a consistent basis. Further, it was a great honor to meet some of the other members of the nuigroup community in person, some of whom where inspiring not only because of their amazing intellect and artistic talents, but especially due to their friendliness, benevolence, and maturity."

Here are a few excerpts from people who were fortunate to attend and then write about the experience:


Putting our arms around the future of touch, Ina Fried, CNET, 4/23/09

"..if you used one of the interactive displays here to show a heat map of this industry, it would glow red hot. That's because touch displays, for years relegated to kiosks and industrial uses, are quickly becoming mainstream. Hewlett-Packard and
Dell already have touch-capable machines, while Microsoft is set to make gesture input standard with Windows 7...For his own part, Han said he was inspired by seeing a PBS documentary in the early 1980s that showed Microsoft researcher Bill Buxton, then at the University of Toronto, using multitouch to compose music on a computer. The computer itself was a green screen with an ancient processor and little memory, but the key underlying concept was already there..

"Sometimes it takes that long for these things to marinate and gestate," Han said....And while things are now taking off, Han urged the crowd not to forsake quality in the rush to take advantage of a hot market. "That will ruin it and mess it up for all of us, and that would be a real shame," Han said."

Interactive Displays Conference Highlights Kevin Arthur, Touch Usability 4/22/09

".
..He (Jeff Han) showed a great clip from an early 80s TV show called Bits and Bytes that featured a young Bill Buxton demonstrating some of his tablet work at the University of Toronto...Jeff Han's point in showing this clip was not just to share what inspired him as a kid to pursue computer science as a career. He also wanted to make the point that none of this stuff is really new. He urged the audience not to jump on whatever tech is cool this week, but to be aware of the history and to do the research. Be thoughtful and careful about what you're making. He said one of his fears with the multitouch craze is that the waters will be poisoned by bad and poorly conceived implementations. He said "don't add noise" to the ecosystem by using terms sloppily -- like "multitouch"...

The importance of being more thoughtful and mindful of prior work are not exactly new to most of us with design, HCI, or CS backgrounds, but the audience here is largely made up of marketing or other business types, I believe, who sometimes tend to get a bit carried away, you might say. I mean no disrespect to my friends in marketing..."

Interactive Displays Conference San Jose, Harry van der Veen, 4/25/09

"Big thumbs up for Pira tech for managing to get so many multi-touch industry professionals (Wacom, Mindstorm, NextWindow, 3M, Jeff Han (Perceptive Pixel), LG, Tyco Electronics, Stantum and more) and hobbyists together."


Apr 25, 2009

How soon will we see interactive information visualization for multi-touch & gesture systems?

The field of information visualization is growing. Until recently, most visualizations were created for use on a single PC or larger screen, allowing for interaction by only one user at a time. I have a feeling that this will be changing in the very near future.

Why? Interactive duo- and multi-touch interfaces are becoming more common, and now come in all screen sizes, from the iPhone to the Surface to CNN's multi-touch "Magic Wall". People of all ages play interactive games on the Wii, often on large flat-panel displays. In my opinion, the time is right for those developing applications for the InfoViz world to think about harnessing the power of multi-touch.


Below is a picture of the front page of the Visual Complexity website. If you go to the site, you can select a visualization, and then explore it more in detail, as each picture links to a web-page that provides background information about the visualization project, the artist or team behind the project, and links to the project's website.

I took a look at a variety of the examples posted on the Visual Complexity website, and think many would be enhanced by a transformation to a multi-touch, gesture, bi-manual, or duo/multi-user system. I'm interested in learning what others think about this. If you are working on a collaborative information visualization project, feel free to add a comment and post a link.

Here is a nice quote from the website:

"Functional visualizations are more than innovative statistical analyses and computational algorithms. They must make sense to the user and require a visual language system that uses colour, shape, line, hierarchy and composition to communicate clearly and appropriately, much like the alphabetic and character-based languages used worldwide between humans."

Matt Woolman
Digital Information Graphics




Update: I did a search for "multi-touch" on the Visual Complexity website and found a couple of interactive applications:

Reactable (I've posted about this system a few times!)
(Reactable website)

Prototouch

(Wirmachenbunt Website)

Apr 24, 2009

SMART Table in the Classroom: Tom Barrett's Journey


Tom Barrett is a teacher who is using a SMART Table in his classroom. His recent post, "SMART Table in my classroom - Days 2-5: Teething Problems", provides some insight into potential problems teachers might face when introducing this sort of technology to students.

(Tom blogs about educational technology, including topics such as "Using the Nintendo Wii to Support My Numeracy Lesson")

Here are Tom's first-glance comments about the SMART Table:

"A couple of things that I have learned already:

There is a long way to go in terms of the toolkit and software development"

"The table is very robust."

"There is a place in the primary classroom for this type of technology, it feels natural to have this style of technology in my classroom. "

"My instincts tell me their is a future in this style of work for kids."

"Multi-touch and the behind the scenes technology that is needed to operate it, can be very temperamental."

"Children take to the medium very easily and naturally."

"They can be networked"

"3rd party software can run on them but you would lose the 40 touch capability"


"For 9 and 10 year olds (upper junior), the optimum number for using the Table is 4. Any more and it gets a little congested, limiting the screen real estate that you can use. This is crucial, you might be able to get 6 Year 5s around it but they will not get significant enough access to the surface and so the learning activity. "

Apr 22, 2009

From the NUITEQ (Natural User Interface) Gallery, via Harry van der Veen

Kids take to multi-touch interaction naturally!

The following photos are from Harry van der Veen's Multi-touch blog. (Harry was one of the founding members of the NUI Group, and is also the CEO of NUITEQ - Natural User Interface.)

The last two pictures are of the HP TouchSmart running NUI Suite Snowflake software, developed by Natural User Interface Europe AB (NUITEQ) for thin LCD, Plasma, and FT displays.






Dell Studio One 19 Touch Zone App by Fingertapps: The Video

Here is the video demonstration of Dell Studio One 19 Touch Zone, developed by Fingertapps, a New Zealand software company:



The Dell Studio One with Fingertapps' multi-touch natural user interface software is due for launch soon, according to Ben Wilde and Dave Brebner of Fingertapps. Here is a link to a recent Engadget article by Paul Miller: Dell demos multitouch on the Studio One 19 (with additional videos)

http://www.fingertapps.com/fingertapps-brand_linear.png

Apr 17, 2009

Pervasive Checkers on Microsoft's Surface: The Gamepack Video

In early 2007, before we knew of the existence of Microsoft's multi-touch surface, I worked on a "Pervasive Checkers" project with Johnny Hopkins, a classmate in my Ubiquitous Computing class. I'd previously worked with XNA Game Studio Express in an AI for Games class, and thought that it would be cool to make a casual checkers game on a multi-touch table that could be played in gathering spots such as coffee houses and neighborhood cafes.

Below is a screen shot of what we created using Inspiration software (in the application, you can click on an item and it expands to reveal additional information):



Two years later, the Pervasive Checkers idea is a reality, though I wasn't involved in the process. Checkers is one of the games included in a game pack created specifically for the Surface.

Take a look at the video:



From Surface Computer News:

"The Microsoft Surface Games Pack is a clear illustration of where the Natural User Interface of Windows 7 has the potential to take games. Windows is traditionally the number 1 gaming platform around the world. With the introduction of the NUI, allowing players to literally have titles at their fingertips via touch, Windows 7 can kick the gaming experience up a notch. Provided that developers rise up to the challenge."

Interactive Displays in Public Spaces

Daniel Michelis recently completed his Ph.D. dissertation on a topic that is dear to my heart. Information about his research can be found on his Interactive Displays in Public Spaces blog.

(Note: This was cross posted on the Technology-Supported Human World Interaction blog.)


Here are links to a few of his posts:


Interactive Displays: Perception, Awareness, and Interaction


Evaluating Interaction with Display Applications in Public Space


I especially like the diagrams Daniel uses to depict zones of interaction:

Figure 3: Four-phase Model
(Source: Daniel Michelis (2009), according to: Vogel and Balakrishnan, 2004)

(Author: Daniel Michelis, Institute for Media and Communications Management, University of St. Gallen)


4 Interaction Zones

http://magicalmirrors2006.files.wordpress.com/2008/07/rogersbrignull.jpg

Interaction Thresholds

Figure 1: Perception and Usage of Interactive Displays
(Source: Daniel Michelis (2009), according to: Brignull & Rogers, 2003)


Apr 8, 2009

Joel Eden's Informative Post: Designing for Multi-Touch, Multi-User and Gesture-Based Systems

Joel Eden is a User Experience Consultant at Infragistics. He recently wrote a detailed article/post in the Architecture & Design section of Dr. Dobb's Portal, "Designing for Multi-Touch, Multi-User and Gesture Based Systems". I thought I'd share the link, since I've been writing on the same topic.

In his article, Joel explains the differences between traditional WIMP (Window, Icon, Menu, Pointer) interaction and gesture, multi-touch, and multi-user systems. These systems are also known as Natural User Interfaces, or NUIs. He recommends that "rather than trying to come up with new complicated ways to interact with digital objects, your first goal should be to try to leverage how people already interact with objects and each other when designing gesture based systems."

Joel goes on to outline UX (User Experience), IxD (Interaction Design), and HCI (Human-Computer Interaction) concepts that designers should consider when developing new systems - Affordances, Engagement, Feedback, and "Don't Make Us Think" - which he summarizes in the conclusion of his article.

I especially liked Joel's references:

Clark, Andy. Supersizing the Mind: Embodiment, Action, and Cognitive Extension

Few, Stephen. Information Dashboard Design: The Effective Visual Communication of Data

Gibson, James J. The Ecological Approach to Visual Perception

Krug, Steve. Don't Make Me Think: A Common Sense Approach to Web Usability, Second Edition

Norman, Don. The Design of Everyday Things

Norman, Don. Things That Make Us Smart: Defending Human Attributes In The Age Of The Machine

I would also add the following references:
Bill Buxton
Multi-Touch Systems that I Have Known and Loved
(Regularly updated!)
Sketching User Experiences: Getting the Design Right and the Right Design

"Our lack of attention to place, time, function, and human considerations means these fancy new technologies fail to deliver their real potential to real people." - Bill Buxton

Dan Saffer
Designing for Interaction: Creating Smart Applications and Clever Devices
Designing Gestural Interfaces

SAP
Touchscreen Usability in Short
(Summary by Gerd Waloszek of the SAP Design Guild)
SAP Design Guild Resources (User-Centered Design, User Experience, Usability, UI Guidelines, Visual Design, Accessibility)
Kevin Arthur (Synaptics)
Touch Usability
Bruce "Tog" Tognazzini
Ask Tog: Interaction Design Solutions for the Real World
Inclusive Design, Part I
First Principles of Interaction Design
John M. Carroll
Human Computer Interaction (HCI) (History of HCI)
Bill Moggridge
Designing Interactions
Ben Shneiderman
Leonardo's Laptop: Human Needs and the New Computing Technologies
Edward Tufte

Visual Explanations
Beautiful Evidence
The Visual Display of Quantitative Information
Envisioning Information
Rudolf Arnheim (Gestalt)
Art and Visual Perception: A Psychology of the Creative Eye

Update: A great reading list on general HCI. Some of the authors were involved in the early days of touch, bi-manual, and multi-touch interaction.

Jan's Top Ten List of Books on Human-Computer Interaction


FYI: If you know much about Windows Presentation Foundation, you probably know that Josh Smith, WPF guru, also works at Infragistics.


Apr 4, 2009

Put-That-There: Voice and Gesture at the Graphics Interface, and More Blasts from the 1980s HCI Past


bigkif's notes about "Put-That-There" give a good description of this video:

Put-That-There at CHI '84

"In 1980, Richard A. Bolt from MIT wrote Put-that-there : voice and gesture at the graphics interface. It was a pioneering multimodal application that combined speech and gesture recognition.

This demo shows users commanding simple shapes about a large-screen graphics display surface. Because voice can be augmented with simultaneous pointing, the free usage of pronouns becomes possible, with a corresponding gain in naturalness and economy of expression. Conversely, gesture aided by voice gains precision in its power to reference."

Richard A. Bolt "Put-That-There": Voice and Gesture at the Graphics Interface
(pdf) SIGGRAPH '80
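The quote's point about pronouns can be made concrete with a toy sketch. This is my own illustration, not Bolt's implementation: "that" is resolved to whichever on-screen object is nearest the position being pointed at when the word is spoken, and "there" becomes the pointing position when that word is spoken.

```python
import math

# Toy resolution of the deictic command "put that there" (illustrative;
# not the 1980 Media Room system). Pointing disambiguates the pronouns.

def nearest_object(objects, pointed_at):
    """objects: name -> (x, y) positions; pointed_at: (x, y) where the
    user is pointing while saying 'that'. Returns the closest object."""
    return min(objects, key=lambda name: math.dist(objects[name], pointed_at))

def put_that_there(objects, that_point, there_point):
    """Move the object selected by pointing ('that') to the position
    pointed at when 'there' is spoken."""
    name = nearest_object(objects, that_point)
    objects[name] = there_point
    return name

scene = {"circle": (2.0, 3.0), "square": (8.0, 1.0)}
moved = put_that_there(scene, that_point=(7.5, 1.2), there_point=(4.0, 4.0))
```

Without the simultaneous pointing data, the speech "put that there" is unresolvable; with it, the user never has to name the shape or its coordinates, which is the "economy of expression" the description refers to.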

Here is another blast from the '80s:

Kankaanpaa, A., "FIDS - A Flat-Panel Interactive Display System," IEEE Computer Graphics and Applications, March 1988. (Nokia Information Systems)

"Although the needs and expectations of these various users are very diverse, they all have a common requirement: more natural and easier methods for communicating with the computer than are available today. Furthermore, they do not want to interact with the computer; they want to communicate with the application they are using. They do not want to use computer jargon; they want to use the same natural methods that they use when they perform the same tasks without a computer."

“We believe that only three of the flat-panel technologies described above, namely LCD, EL, and plasma, will be sufficiently advanced for mass production within this decade.”

Bill Buxton was working on multi-touch and gesture interaction in the 1980s, but his dreams did not become a reality until this century, for a variety of reasons. He shared his thoughts about the paradox of the speed of technology in a presentation at the 2008 IEEE International Solid-State Circuits Conference: "Surface and Tangible Computing, and the 'Small' Matter of People and Design" (pdf)

‘Carrying on from an earlier thesis in our department (Mehta , 1982) , we built a tablet that was sensitive to simultaneous touches at multiple locations, and with the ability to sense the degree of each touch independently (Lee, Buxton & Smith, 1984). We stopped the work in late 1984 when I saw a much better implementation at Bell Labs – one that was transparent and mounted over a CRT. The problem was that they never released the technology, so, the whole multi-touch venture went dormant for 20 years. But, I never stopped dreaming about it. (Lesson: don’t stop your research just because someone else is way ahead of you. It might be transitory, and anyhow, remember the story of the tortoise and the hare.)

"I spoke earlier about the paradox in the speed of technology development: it goes at rocket speed, but that of a glacier as well; simultaneously! In the perfect world, this would be ideal: we could go through several iterations of ideas so that by the time the new paradigms of interaction, such as Surface and Tangible computing, are ready for prime time, everything will be in place. But, the rapid iteration is more directed at supporting the old paradigms faster and cheaper, rather than helping shape the new ones. The reasons are not hard to understand. From the perspective of circuit design, the problems are really hard. So, one has to have one's head down working flat out to get anything done. But, there is a side of me that motivated this paper that asks: if it is so hard, then isn't it worth making sure that the things one is working on are things that are worthy of one's hard-earned skills?"

SOMEWHAT RELATED

Bill Buxton's Haptic Input References
(pdf)

Mar 24, 2009

Struktable Multi-touch Installation at TOCA ME Design Conference






Struktable Multitouch Installation from Gregor Hofbauer on Vimeo.


Strukt is a design studio in Vienna, Austria, that specializes in interactive and generative design for a variety of purposes, such as interactive environments and installations, ambient intelligent environments, games, and multi-touch tables, screens, and walls. The video is a demonstration of applications that were presented at the March 2009 TOCA ME Design Conference in Munich, Germany. The applications were developed using vvvv. (More information regarding vvvv can be found at the end of this post.)


MT Table 01

INFO FOR THE TECH-SAVVY OR TECH-CURIOUS:

According to information from the vvvv website, vvvv is a "toolkit for real time video synthesis. It is designed to facilitate the handling of large media environments with physical interfaces, real-time motion graphics, audio and video that can interact with many users simultaneously. vvvv is a visual programming interface. Therefore it provides a graphical programming language for easy prototyping and development. vvvv is real time: where many other languages have distinct modes for building and running programs, vvvv only has one mode, run-time. vvvv is free for non-commercial use."

VVVV Screenshots

VVVV's Propaganda Page
Other projects using VVVV
Struktable: the 70-inch Multitouch Table

STRUK ON A SPHERE: Interactive installation at a Mercedes Benz conference

Pattie Maes TED Talk: Sixth Sense - Mobile Wearable Interface and Gesture Interaction (for the price of a cell phone?!)

In the following video, Pattie Maes and Pranav Mistry, of MIT's Fluid Interfaces Group, demonstrate SixthSense, a wearable device that combines a pocket projector, a mirror, and a camera:


From the SixthSense website:

"The SixthSense prototype is comprised of a pocket projector, a mirror and a camera. The hardware components are coupled in a pendant like mobile wearable device. Both the projector and the camera are connected to the mobile computing device in the user’s pocket. The projector projects visual information enabling surfaces, walls and physical objects around us to be used as interfaces; while the camera recognizes and tracks user's hand gestures and physical objects using computer-vision based techniques. The software program processes the video stream data captured by the camera and tracks the locations of the colored markers (visual tracking fiducials) at the tip of the user’s fingers using simple computer-vision techniques. The movements and arrangements of these fiducials are interpreted into gestures that act as interaction instructions for the projected application interfaces. The maximum number of tracked fingers is only constrained by the number of unique fiducials, thus SixthSense also supports multi-touch and multi-user interaction."

Photo from the SixthSense website

Of course it is similar to the gesture interface in Minority Report, but I don't consider that a problem: SixthSense supports multi-touch and multi-user interaction, and it is a "mobile" application of useful technology.
http://digitaldaily.allthingsd.com/files/2009/01/minority_report_interface.png
The interaction that had me at "hello world".

Mar 12, 2009

Dell's All-In-One Studio One 19, With Optional Multi-touch Technology Released in Japan

http://cache.gawker.com/assets/images/gizmodo/2009/03/dellstudioone.jpg
Via BusinessWire
http://i.i.com.com/cnwk.1d/i/bto/20090311/DellStudioOne19desktop_610x457.JPG
Photo via Rafe Needleman/CNET


Dell's Studio One 19 All-in-One System Fits Anywhere in the Home
(BusinessWire)

Here are the specs from the press release:

  • Easy multi-touch photo editing, slideshow creation, playlist compilation, notes, and even web browsing.
  • Unleash creativity with You Paint finger painting software.
  • Record videos and upload directly to YouTube with the touch of a finger.
  • Flick to Flickr – Upload photos to Flickr to share with family and friends.
  • Create a musical masterpiece with the multi-touch percussion center.

†Software is optional and works with multi-touch configurations only.

Power & performance:

  • Intel® Celeron, Dual Core Celeron, Pentium Dual Core, Core 2 Duo, and Core 2 Quad Core Processor options
  • Choice of nVidia GeForce 9200 or GeForce 9400 integrated graphics[i]
  • Up to 4GB[ii] dual channel memory
  • Up to 750GB[iii] HDD
  • Slot load Optical drive
  • 7-in-1 media card reader, six USB ports
  • Optional integrated wireless, web camera, Blu-ray Disc™
  • Optional multi-touch capability
  • Optional facial recognition security (with webcam)

According to Warner Crocker, from Gottabemobile, the Studio One All-in-One will be available in the U.S. later this spring, with a starting price for the non multi-touch version around $700.00.

I'll post more information about this soon!

Update

Here are a few more pics of the Studio One, via Darren Gladstone, PC World:

http://images.pcworld.com/news/graphics/161113-P1020787_350.JPG


http://images.pcworld.com/news/graphics/161113-P1020795_350.JPG

Multi-touch Drum Application on the Dell Studio One 19

Extensive PC World Review:

Dell Studio One 19: All-in-One Stunner Takes Japan

Update:

After I wrote this post, I received a comment from Nicolas (see below). If you are interested in this sort of interaction, take a look at lm3lab's touchless interaction. No fingerprints!

Feb 22, 2009

Rich White's "Mobile Immersive Learning Lab" Project; EduSim Update

Rich White, an educational technologist at Greenbush in Kansas, has been working with the 3D interactive virtual world EduSim for quite a while. He's taking EduSim to the next level.

"The
concept is one of an enclosed virtual learning space - with surrounded projection of the virtual learning world the students are exploring - similar to the StarLab Concept (with a rectangular configuration). along the lines of a CAVE - however simpler, mobile, and relatively in-expensive by comparison."

The project is at the beginning prototype stage.

Below is a demo of the virtual world as it is projected on two screens that are placed next to each other at a right angle, with the center of the virtual-world view positioned where the two screens meet:


This might be a great way of reaching students who have autism!


More about EduSim:

EduSim, for those of you who haven't seen my previous posts on the topic, is a multi-user 3D interactive environment used in classrooms with interactive whiteboards:




Information from Rich White's Greenbush blog about Edusim:

Wikipedia entry:
"Edusim is a Cave Automatic Virtual Environment based concept of lesson driven 3D virtual worlds on the classroom interactive whiteboard or classroom interactive surface. The Edusim concept is demonstrated by the Edusim free and open source multi-user 3D Open Cobalt virtual world platform and authoring tool kit modified for the classroom interactive whiteboard or surface. The Edusim application is a modified edition of the open source Open Cobalt Project and relies heavily on the affordances of direct manipulation of 3D virtual learning models and Constructionist Learning Principles."


History of Edusim:
"The Edusim project began in September 2007 at the Greenbush Education Service Center in Southeast
Kansas as an effort to bring an engaging 3D experience to the classroom interactive whiteboard. Pilot groups were established with 6th and 7th grade middle school students throughout Southeast Kansas to observe how students would be engaged through the software, and how the user interface would need to be augmented to account for the affordances of the whiteboard, and the usability of the students.
"

Here is a virtual world in Edusim, running on Cobalt, showing how a drag-and-drop function is used for in-world VNC application sharing:



The Cobalt 3D metaverse browser has been modified for multi-touch interaction by some of the members of RENCI, a collaborative venture of Duke University and several other North Carolina universities. In the video below, Dr. Xunlei Wu demonstrates how gesture and touch are used to manipulate items and navigate through two Cobalt virtual worlds:



Some of the members of RENCI built a multi-touch table in addition to the collaborative multi-touch wall. For more information:

RENCI: Multi-Touch Collaborative Wall and Table using TouchLib: More about UNC-C's Viz Lab


(Cross posted on the TechPsych blog.)