Showing posts with label multi-touch. Show all posts

Feb 20, 2009

More Multi-Touch and Surface Computing...

The concept of multi-touch/gesture/surface computing is spreading.

Here's more evidence:

Panasonic Touch Air Hockey


The game was demonstrated at ISE 2009 (Integrated Systems Europe) in Amsterdam. The interface was developed by UI Centric, a company based in Soho, London.

Microsoft's SurfaceWare at the Tangible Embedded Interactions Conference (TEI 2009):

SurfaceWare is level-sensing software that alerts waitstaff when glasses need refilling.


http://photos-e.ak.fbcdn.net/photos-ak-snc1/v2385/101/124/727430870/n727430870_2768500_5763.jpg
Photos from Nachiket Apte, via Ru Zarin

More to come...

Feb 18, 2009

Ready for the SMART Table?

The SMART Table is now available for purchase!



Here is the plug:

"The world's first multitouch, multiuser table for primary education - the SMART Table - is now available for purchase."

"As a collaborative learning center, the SMART Table enables engaging and motivating small-group learning experiences. Up to eight students can use their fingers intuitively to sweep, slide and spin objects on the interactive screen. The SMART Table's ready-made activities help primary students gain and further their skills in areas like counting and reading."

"The SMART Table also makes an ideal complement to whole-class activities on the SMART Board interactive whiteboard. It helps reinforce concepts in a small-group setting and ensures students can participate in interactive and creative learning experiences."

(Cross-posted on the TechPsych and Technology-Supported Human-World Interaction blogs.)

Jan 28, 2009

Details about gesture and free-air interaction from LM3LABS and Ubiq'window

Ubiq'window, by LM3LABS, is a gesture-based system that is used for interactive show windows, interactive in-store marketing, museum installations, and more.

The slides provide details of the Ubiq'window system's specifications, including its gesture recognition set. The slides also highlight "Airstrike", a system that allows for free-air, touchless interaction.



RELATED
LM3LABS Blog

Jan 26, 2009

SPARSH: DIY demo of an open-source multi-touch table and applications by NUI-Group members

The following video is a demonstration of "Sparsh", an interactive multi-touch FTIR table built in eight weeks by a group of engineering students in India. Most of the information regarding the hardware and software you see running on this low-cost system can be found on the open-source NUI-group website, forums, and wiki.


Sparsh Multitouch Display from anirudh on Vimeo.

I especially like the multi-touch DJ application!


For more information, view the posts related to the NUI group on this blog.

Sparsh Website

Jan 20, 2009

Baby Multi-Touch Interaction on a Win7 HP TouchSmart PC running BabySmash

I'm preparing myself to explore the multi-touch potential of my HP TouchSmart PC with the beta version of Win7, Microsoft's newest operating system. While I was searching for information, I came across this cute video of a baby interacting with the touch-screen. The dad in the video is software developer Kurt Brockett.

The application is BabySmash, a free application created by Scott Hanselman for his little ones. If you are interested in learning more about how BabySmash was created, see Scott's 6-part tutorial, "Learning WPF (Windows Presentation Foundation) with BabySmash". It includes information about incorporating speech synthesis into the application. The BabySmash! source code can be found on the CodePlex site. Ideas for improving the application can be found on the BabySmash! feedback forum.


I have lots of ideas for touch-screen interaction applications for kids of all ages. Please leave a comment if you have a TouchSmart and are working with Win7, or plan to do so in the future.

More Multi-touch Multimedia: Video demonstration of applications created with Snowflake and Flash



This video showcases the work of Natural User Interface AB, using NUI Suite 1.0 Snowflake and Flash.

Here is the plug from the company's website:
"Natural User Interface (NUI) is a Swedish innovative emerging technology company specializing in commercially available advanced multi-touch software, hardware and service solutions. NUI's solutions can convert an ordinary surface into an interactive, appealing and intelligent display that creates a stunning user experience."

For more information and links:

For Techies and the Tech Curious: Multi-touch/Gesture from the NUI-Group

Search this blog!

Jan 9, 2009

Interactive Multimedia and Multi-touch at CES

I received a couple of interesting links about interactive multimedia applications from Anthony Uhrick of NextWindow, who is attending CES (the Consumer Electronics Show). (NextWindow is the company that produces large touch-screen displays with dual-touch and multi-touch capabilities.)

Kevin Kennedy and his team at InterKnowlogy partnered with Zygote 3D Human Anatomy and Intermountain Health Care to develop a health care application in Windows Presentation Foundation that runs on Microsoft's multi-touch Surface computing table.

The application supports collaboration between health care professionals and could also support collaboration between patients and doctors as well. Aspects of the application could be useful for patient education.

I really liked the part that demonstrates how you can zoom deeply into the 3D heart and look at things from various angles.


http://silverlight.interknowlogy.com/Videos/VitruView/default.html

Best of InterKnowlogy Surface:



The above video demos an application that might be useful for teaching history with an interesting timeline interface dial.

Visit InterKnowlogy for more videos and information about what they are doing with Windows Presentation Foundation and Silverlight as partners with Microsoft's Surface team.

If you are interested in more 3D anatomy, visit Zygote's 3D Human Anatomy site and 3D Science. If you are an educator, you'll see that 3D interaction has potential for creating more engaging science and health education lessons!

Another interesting link is to TouchTV Networks, which has partnered with companies such as Vectorform, who are also working with multi-touch applications using Windows Presentation Foundation.

Video from TouchTV Networks of CES 2009 Demo:


Vectorform's Virtual Drum Kit application:


Vectorform's Surface at School - demonstrates how this can be used in a classroom:


Vectorform's Surface DJ


Does anyone want to give me a multi-touch table?

Jan 8, 2009

For Techies and the Tech Curious: Multi-Touch/Gesture from the NUI-Group

If you are a new visitor to this blog and interested in interactive multimedia, you'll need to know more about the NUI-Group. The Natural User Interface (NUI) Group is an interactive media group researching and creating open-source machine-sensing techniques to benefit artistic and educational applications.
(For related information, please read my recent post, Usability, Accessibility, and User Experience in a Win7 Environment.)

Seth Sandler, of the NUI-Group, sent out a great email with links and resources for people who are interested in multi-touch/gesture interaction, hardware, and/or software development. NUI-Group members who have completed projects are listed below, with links to project websites as well as related threads on the NUI-Group forum.

(The information can be found on the NUI-Group Wiki, which boasts a nice icon-based front page.)

Thanks, Seth, for organizing this wealth of information!

The following projects are divided up by type. (Links to information about the various types of multi-touch and gesture systems can be found near the end of this message.)

FTIR - Frustrated Total Internal Reflection

http://img144.imageshack.us/img144/8105/shemeftir2yu7.jpg
Name: Seth (cerupcat)
Project Name: AudioTouch
Project Website: http://ssandler.wordpress.com
Project Thread: http://nuigroup.com/forums/viewthread/1352/ http://nuigroup.com/forums/viewthread/2309/

Name: bassmang5
Project Name: Æ-table
Project Thread: http://nuigroup.com/forums/viewthread/3144/

Name: Daniel (Zin)
Project Name: Prometheus
Project Thread: http://nuigroup.com/forums/viewthread/2612/

Name: Carsten (carschdn)
Project Name: aTRACKtive
Project Thread: http://nuigroup.com/forums/viewthread/3223/

LLP - Laser Light Plane

http://www.codelaboratories.com/images/LLP/HiPressure.jpg

Name: Denis Santelli (dsan)
Project Website: http://www.touchwall.fr/
Project Thread: http://nuigroup.com/forums/viewthread/3051/


LED-LP - Light-Emitting Diode Laser Plane

Name: Nolan (PeauProductions)
Project Name: PeauProductions (LCD)
Project Website: http://peauproductions.blogspot.com/
Project Thread: http://nuigroup.com/forums/viewthread/3291/

DI - Diffused Illumination

http://img359.imageshack.us/img359/1143/shemedi2bx3.jpg
Name: Seth (cerupcat)
Project Name: MTmini
Project Website: http://ssandler.wordpress.com/MTmini
Project Thread: http://nuigroup.com/forums/viewthread/1731/

Name: Fairlane
Project Name: ORION v2
Project Website: http://orionmultitouch.blogspot.com/
Project Thread: http://nuigroup.com/forums/viewthread/1709/

Name: Abdullah (EfeNDy)
Project Name: EfeNDy’s Diffused Illumination MT
Project Thread: http://nuigroup.com/forums/viewthread/3176/

Name: Sandor
Project Name: EXPO REAL 2008
Project Website: http://vimeo.com/2240537
Project Thread: http://nuigroup.com/forums/viewthread/3576/

Name: Matthew (MatthewW)
Project Name: Design Garage
Project Website: http://www.gotuasciencecenter.org/
Project Thread: http://nuigroup.com/forums/viewthread/3730/


Touchless

Name: Jimi Hertz
Project Name: Touchless Wall
Project Website: http://sassexperience.org/projettouchwall.html
(try: http://sassexperience.org/multitouch/inprogress.html )
Project Thread: http://nuigroup.com/forums/viewthread/2414/

YouTube: http://www.youtube.com/watch?v=KCFbWPf37jw

Other Types:
DSI - Diffused Screen Illumination

http://iad.projects.zhdk.ch/multitouch/wp-content/uploads/2008/06/sheme_dsi_web.jpg



RELATED LINKS

Tips for Success

Resources
Thanks to all NUI-Group members who have been working so hard at this mission!

I'll end this post with a YouTube video created by Jimi Hertz, a NUI-Group member. "MULTI-TOUCHLESS WALL HOW TO?"

I especially like the music!


Jan 2, 2009

Two more examples of multi-touch and gesture interaction out in public: Accenture at O'Hare Airport, TacTable

Accenture's Touch Wall at O'Hare Airport
Note: Also posted on the Technology-Supported Human-World Interaction blog.


From the New York Times, photographed by Yana Paskova



Video and Photos from TacTable

The following video shows some applications developed by TacTable. As you can see, many of the applications look like they'd be useful in public spaces, including airports and museums.



Below are some pictures from the TacTable website:


http://www.tactable.com/images/homeimage1.jpg

http://www.tactable.com/images/handsTouchingTableLSC.jpg
Where Did Language Come From? Liberty Science Center, N.J.

http://www.tactable.com/images/AccentureWallMultiTouch.jpg

http://www.tactable.com/images/accenture1.jpg
Accenture Welcome Wall, London, England

http://www.tactable.com/images/sprint.jpg
Sprint Studio Digital Lounge Table

Related:

TacTable Contacts:
USA:
Henry Kaufman
henry@tactable.com


Tinsley Galyean
tinsley@tactable.com


UK:
Graham Cosier
graham@tactable.com


"Clients include Sprint, Accenture, Cirque du Soleil, New York Museum of Modern Art (MoMA), Liberty Science Center, Detroit Institute of Art, Georgia Aquarium, London's Millenium Dome, Boston Museum of Science, Chicago Museum of Science and Industry, Chicago Mercantile Exchange, Warner Brothers, Discovery, Disney, British Telecom Retail, and many others."

New Accenture Technology Lands at O'Hare International Airport

Flight Delays Radiate from Chicago and Atlanta

Dec 16, 2008

Bloom - Play Music with Colors: Seth Sandler's relaxing little on-line application!


Bloom: Play music with colors (link to application)

For more information about Seth Sandler's work, visit his AudioTouch website.

Here are a few pictures of his applications:


http://img73.imageshack.us/img73/7506/mg9471wd8.jpg

http://img391.imageshack.us/img391/5619/mg9475nb4.jpg

http://img92.imageshack.us/img92/7146/mg9466va8.jpg

Seth integrates music into his multi-touch applications, as he has a background in both music and art. He is a member of the NUI-Group.

Dec 7, 2008

Demo of Duke University's multi-touch wall at RENCI, running the Cobalt Metaverse Browser

The video below shows the "pre-alpha" version of the Cobalt Metaverse Browser:



"This video shows the Cobalt metaverse browser being tested on a 13-foot by 5-foot multi-touch visualization wall equipped with six high-definition projectors located at the Renaissance Computing Institute engagement center at Duke University. The input drivers are being developed by Dr. Xunlei Wu so that users can directly manipulate high-resolution data using both hands and multiple fingers for a more natural and intuitive data exploration experience. In the video, Dr. Wu is using both gesture and touch to navigate through, and rearrange content between, two Cobalt virtual worlds."

Related:
The Open Cobalt Project (on ning)
Cobalt Website & link to download to the latest pre-alpha build
Cobalt Community,
Cobalt Google Group
EduSim (A 3D multi-user virtual world platform and authoring toolkit for K-12 interactive whiteboards. The latest version is powered by Cobalt)

Cobalt can import objects from the Google 3D Warehouse as well as Google SketchUp:
Video Tutorial: Using Google 3D Warehouse to build Cobalt & Edusim Virtual Worlds

People:
Julian Lombardi, Duke University
Xunlei Wu, Senior Visualization Researcher, RENCI, Duke University
Rich White, EduSim; Greenbush Education Service Center, Girard, KS

Kids using Cobalt-based EduSim on desktop computers, via Rich White:

Dec 1, 2008

BMW and Surface Computing: Video of Tabletop Interaction

This is a promotional video from BMW, showing how potential buyers can interact with a tabletop computing system to preview various ways they can customize the car. The system in the video is Microsoft's Surface:



Via Gizmodo Australia

(I wonder if Microsoft is working on a few educational games for the surface....)

Nov 26, 2008

Teliris InterAct TouchTable and TouchWall: Immersive Collaboration & Telepresence; DVE's Holographic Tele-Immersion Room

A few years ago I took a class about virtual reality and how it can be used in education and training. One of the topics we covered was telepresence. One of the companies I looked at was Teliris.

According to a whitepaper on the Teliris website, "Business Value of Telepresence", by S. Ann Earon, "Telepresence is what videoconferencing was meant to be: reliable, highly interactive, easy to operate, resulting in a natural meeting with transparent technology and an emphasis on human factors."

Teliris now offers something they call Immersive Collaboration, which involves the use of surface computing to support document and multimedia content sharing across locations, as if all of the group members were in the same room.


Watch the demonstration of the Teliris Collaboration Touch Table in a telepresence meeting. In the video clip below, the narrator shares content from a local Teliris Collaboration Touch Table to a remote meeting participant who is at another table.

"Touch to Telepresence"



Business Holograms!
DVE (Digital Video Enterprises) developed a Tele-Immersion room that uses Christie Digital Systems' Mirage HD 3D projectors to create holographic images of remotely located meeting participants:

DVE Telepresence: An Introduction (A plug from DVE, but informative.)

DVE Portable Virtual Presentation -A Volumetric 3D image from a projector hidden from the audience's view:


This system can display 3D images on the stage, and supports 2-way interactive HD feeds.


The above examples demonstrate how newer technologies, including table-top surfaces, can be used for collaborative business meetings. I can envision this technology used for medical education, medical consultations, and collaboration between artists and musicians.

When the price comes down, perhaps we will have these systems in our family rooms!

Nov 23, 2008

Touch TV Networks Demo using Windows 7 and a NextWindow display

Here is a short video demonstration of a Touch TV Networks demo on a display using a NextWindow touch screen. It looks like it was built using Windows 7. I understand that it was created by people with former Microsoft connections.

Video: Touch TV Networks Demo at Microsoft REC - http://video.msn.com/video.aspx?vid=74b3a821-3e82-4b50-bc64-04ae4b75bdaf

For more information, take a look at the Touch TV Networks website.

For the Tech Curious: "Get in Touch with Touchless": Multi-touch with just a webcam and the free demo application!

Via the Seattle Tech Report's Microsoft Blog:



You can find the demo code on the Codeplex website. Here is a quote:

"The Touchless SDK enables developers to create multi-touch based applications using a webcam for input. Touch without touching."

"Touchless started as Mike Wasserman’s college project at Columbia University. The main idea: to offer users a new and cheap way of experiencing multi-touch capabilities, without the need of expensive hardware or software. All the user needs is a camera, which will track colored markers defined by the user."
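The idea described in the quote - tracking user-defined colored markers with an ordinary webcam and treating each marker as a touch point - can be sketched in a few lines. The Touchless SDK itself is a C# library; the Python below is only my illustration of the core color-thresholding idea, with made-up function names and a toy 4x4 "frame" standing in for webcam video.

```python
# Illustrative sketch (not the actual Touchless SDK): find a
# user-defined colored marker by thresholding each frame and
# reporting the marker's centroid as a "touch" point.

def close_enough(pixel, target, tol=40):
    """True if an (r, g, b) pixel is within tol of the target color."""
    return all(abs(p - t) <= tol for p, t in zip(pixel, target))

def find_marker(frame, marker_color):
    """Return the (x, y) centroid of marker-colored pixels, or None.

    frame is a list of rows; each row is a list of (r, g, b) tuples.
    """
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, pixel in enumerate(row):
            if close_enough(pixel, marker_color):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# A tiny 4x4 "frame": black background with a green 2x2 marker.
BLACK, GREEN = (0, 0, 0), (0, 200, 0)
frame = [[BLACK] * 4 for _ in range(4)]
for y in (1, 2):
    for x in (1, 2):
        frame[y][x] = GREEN

print(find_marker(frame, GREEN))  # -> (1.5, 1.5)
```

Running this centroid calculation on every webcam frame gives a stream of marker positions; tracking several differently colored markers at once is what makes the approach "multi-touch without touching".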


(I posted about the Touchless SDK previously, but I didn't have the video.)

Need for Multi-touch, Multi-user Interactive Multimedia Applications, and the Miracle Question

Last week I received a few comments on my post, "Multi-touch and Flash: Links to Resources; Revisiting Jeff Han's Presentation". I started to respond to a thoughtful comment by Spencer, of TeacherLED, and I wanted to share it as a post:

Spencer is a teacher and instructional technology consultant who develops web-based interactive applications for use on interactive whiteboards (IWBs). He's interested in multi-touch applications for education and has some good insights into what HCI researchers call the "problem space".

Here are Spencer's comments:

..."I agree that Flash could have a very important role to play here. I chose Flash as my development tool because it allows quick development of ideas and then easy distribution of the product. The importance of this is that it allows people who have a profession other than software developer to create software with the insight of their main role. In my case, as a teacher, I can identify things I wish I had and then make them. Often I find that other teachers had the same wish and they then appreciate the product."

"The unfortunate thing with multi-touch is that it is far from the technology most of us outside the industry/research areas have to work with. An app created in Flash for single touch follows the mouse and pointer method so it can be developed easily. When done it can be easily tested on a standard IWB for the feel (which is often surprisingly different on the IWB compared to using a mouse)."

"The Flash developer community has a very experimental and creative characteristic and I’m sure would be a great driving force for multi-touch but first there needs to come a reason for more people to have some sort of multi-touch display for general use, beyond facilitating experiments. When the various operating systems support it and have the apps to make having a supporting display viable then the experimentation and ideas will really flow."

"In addition, the display makers need to recognize the benefits of Flash and ensure they address them. At the moment it seems to be too often an afterthought if considered at all. SDKs and APIs make no reference to Flash or they remain indefinitely in beta for older versions of Flash only."

"It is a pity that all of this will take time. The more time that passes the more single touch IWBs are bought and installed which will delay the uptake of the eventual multi-touch ones. Meanwhile children continue to have to keep reminding themselves that they can only touch the board in one place when it is clear that every bit of their brain is telling them to interact with the board in a much more natural multi-touch way."

My response to Spencer's comment:

Spencer,

You make good points regarding the barriers to getting the multi-touch approach adopted by the "mainstream". You're right about what the commercial display makers need to do. If they want to market displays that will have more appeal, they must think about the different sorts of applications and programming environments that the displays should support.

Display makers also need to think more about the bigger picture - in what sort of environments will the displays be located? Indoors? Outdoors? Near bright sunlight? What about people with disabilities, children, or the elderly?

I can see that in the future, multi-touch displays and other devices would operate within an embedded systems environment and support mobile computing activities as well. There are existing examples of this concept, of course, but there is much room for creative improvement. An embedded systems approach is complex, and would need to handle input from sensors, support multi-modal signal processing, and also provide users with a range of connectivity modes, including RFID. (Data management and storage needs would have to be addressed, along with privacy and security concerns.)

Most importantly, in my opinion, these systems would need to have the flexibility required to support human activities and interactions that have not yet emerged! Certainly this will need to take a multidisciplinary approach.

There are many unanswered questions....How does this fit in with mobile computing and "cloud" computing? What sort of middleware needs to be developed?

Even if we don't have solutions to the bigger problems, there are many smaller problems that I think could be somewhat easily solved.

As you mentioned, many applications that are designed for single-touch screens don't fully support the way people identify, select, and move items around the screen. Although educators access websites every day for use on interactive whiteboards, they are hungry for more. There are not enough websites that are optimized for single-touch interaction, or touch-screen interaction in 3-D "space".

Teachers who are successful users of interactive whiteboards know exactly what we are talking about. They spend quite a bit of time searching for new on-line resources they can use with their students. They know how much the students want to interact with the screen at the same time and would be so excited to have that capability at their fingertips!

Optimizing websites for touch-screen applications is possible, but this idea hasn't occurred to most web developers. Their jobs don't require it, so there is no incentive. Google is developing FlareBrowser, which can support multi-touch interactions, but according to information on the website, it runs on Mac OS X Leopard (10.5) and nothing else. The present version is bare-bones. I haven't yet tested FlareBrowser.

I think that another barrier to getting multi-touch off the ground is that the people who might have the knack for multi-touch application development simply don't know it! We've mentioned that Flash developers have the potential to create good multi-touch applications. I also think that game developers and designers could make good contributions to the multi-touch movement. Just think about what thought goes into programming interactions and event handling for 3-D web-based multi-player games!

Yet another barrier is that people who work in lower-tech fields could benefit from collaborative multi-touch applications, but they don't know it, either. The research I've reviewed tells me that multi-touch applications can support a wide range of human endeavors- work, creativity, data analysis, education, collaboration, planning, and so forth.

What is missing is the input of potential end users from a variety of fields. No specific discipline "owns" multi-touch, so it is hard to figure out how we can make this happen.

Could we set up multi-touch technology playgrounds at professional and trade conferences? What about airports and hospital lobbies? Libraries and museums? Shopping centers? Sports events and rock concerts?

This leads me to my next idea, which is jumping ahead a bit:

One of the barriers to the development of multi-touch applications is that it is not easy to gather user requirements when the users are not familiar with the technology.
That is where my "Miracle Question" technique comes into play. I learned this technique when I studied brief solution-focused counseling, and found that, if modified, it can be useful for figuring out user requirements. (The process still needs some fleshing out.)

Why the Miracle Question?

The questions that a developer uses to guide the client during the initial planning stages are very important. Keep in mind that people want to use technology because it meets a need and solves a problem, which is similar to the reason a person might seek counseling.

The Miracle Question technique (actually, a series of questions) might help to tease things out. The goal of this type of questioning is to help the client use their own creativity, resources, and problem-solving skills so they can become effective partners throughout the development cycle.

(People with human-computer interaction training might have an easier time understanding how this technique might be modified and applied to different fields.)

FYI
A good example of the Miracle Question process, as used in therapy and counseling, can be found on the Network of Social Construction Therapies website, in an article written by the late Steve de Shazer:

http://brianmft.talkspot.com/aspx/templates/topmenuclassical.aspx/msgid/366482

There aren't many resources about the use of the Miracle Question in IT or business. Here are a couple:

Solution Focused Management of Unplanned IT Outages (read page 132 and the references)
http://conferences.vu.edu.au/web2006/images/CDProceedings06.pdf
Proceedings of the 7th International We-B (Working for E-Business) Conference, 2006
Katherine O'Callaghan and Sugumar Mariappandar, Ph.D., School of Business and Informatics, Australian Catholic University

Miracle Question in Executive Coaching
http://www.1to1coachingschool.com/Coaching_Miracle_Question.htm

Nov 20, 2008

CNN's Magic Wall Conspiracy Thriller on the Daily Show: John Oliver, Jeff Han, John King and a cast of TouchScreens and Windows...

"I needed to find a screen-free environment!" -John Oliver

"It's good to be King." - John King, after disposing of John Oliver...

I just took a look at a hilarious episode about interactive multi-touch screens and a conspiracy theory on the Daily Show. The episode features Jeff Han, the creator of CNN's Magic Wall, along with John Oliver, John King, and others from CNN.

Enjoy!



Via John Herrman and Gizmodo


If you are interested in multi-touch technology, feel free to do a search for additional information on this blog. The following post includes Jeff Han's demonstration of his multi-touch applications from TED 2006, along with resources and links:

Multi-touch and Flash: Links to Resources, Revisiting Jeff Han's TED 2006 Presentation

Note: If you are a parent, please screen the video clip before deciding if it is OK for your child to view.

Nov 19, 2008

Video of touch interaction on a HP TouchSmart, with NextWindow's Gesture Server Technology

Here is a short video clip of some TouchSmart interaction:



The video shows the new NextWindow Gesture Server Application.

Info from the NextWindow website:

"NextWindow Gesture Server Application in conjunction with a NextWindow touch screen enables two-touch gestures to be used on the Microsoft Windows Vista desktop and certain applications.

You perform a gesture by double-tapping or dragging two fingers on the touch surface. The Gesture Server interprets these actions as commands to the operating system. For example a two-touch vertical drag on the Vista desktop can adjust the computer's audio volume control up or down as required."


Also from the website:

Vertical scroll: drag two fingers up or down the touch screen.

Horizontal scroll: drag two fingers left or right on the touch screen.

Zoom: move two fingers apart or together.

Double tap: double-tap two fingers on the screen.

"You can enable or disable the two-touch functionality and adjust the sensitivity of each of the four two-touch gestures. You can also select the command that is executed with the double-tap gesture."
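The gesture descriptions quoted above boil down to a small classification problem: if the distance between the two fingers changes, it's a zoom; if both fingers translate together, it's a scroll. NextWindow's actual Gesture Server logic and thresholds aren't public, so the following Python is a hypothetical sketch of that classification, with invented names and threshold values (and without the double-tap timing a real gesture server would also track).

```python
# Hypothetical sketch of two-touch gesture classification; not
# NextWindow's implementation, just the idea behind the gestures.
import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def classify(start, end, pinch_threshold=20, drag_threshold=20):
    """Classify a two-finger gesture.

    start/end are pairs of (x, y) points: the two finger positions
    at the beginning and end of the gesture.
    """
    # Change in finger spread dominates -> pinch/zoom.
    spread_change = distance(*end) - distance(*start)
    if abs(spread_change) > pinch_threshold:
        return "zoom in" if spread_change > 0 else "zoom out"
    # Both fingers moving together -> two-finger drag (scroll).
    dx = sum(e[0] - s[0] for s, e in zip(start, end)) / 2
    dy = sum(e[1] - s[1] for s, e in zip(start, end)) / 2
    if abs(dy) > abs(dx) and abs(dy) > drag_threshold:
        return "vertical scroll"
    if abs(dx) > drag_threshold:
        return "horizontal scroll"
    return "tap"

# Two fingers moving apart -> zoom in.
print(classify(start=[(100, 100), (120, 100)],
               end=[(60, 100), (160, 100)]))  # -> zoom in
```

Mapping the resulting label to an OS command (scroll, volume, zoom) is then a simple lookup, which is essentially what the quote describes the Gesture Server doing for Vista.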

Nov 16, 2008

Every Surface a Computer: "Scratch Input" - Capturing Finger Input on Surfaces Using Sound. Video by Chris Harrison and Scott Hudson - UIST '08

Chris Harrison and Scott Hudson, from the Human-Computer Interaction Institute at Carnegie Mellon University, presented their latest research at the UIST '08 conference. Take a look at the video below to see how gestures that produce sound can be captured on unpowered finger-input surfaces, using a modified stethoscope sensor and filters:



Yes, every surface is a computer!
(Even your pants...)

For detailed information, read the paper presented at UIST '08 by Chris Harrison and Scott E. Hudson:
Scratch Input: Creating Large, Inexpensive, Unpowered, and Mobile Finger Input Surfaces
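As I understand the paper, a scratch or tap shows up as a burst of acoustic energy picked up by the sensor, so simple gestures can be distinguished by segmenting the amplitude envelope into bursts and counting them. The Python below is my own simplification of that idea (not the authors' code), run on a simulated envelope.

```python
# Toy sketch of the Scratch Input idea (my simplification, not
# Harrison & Hudson's implementation): segment an audio amplitude
# envelope into bursts to tell, say, one scratch from two.

def count_bursts(envelope, threshold=0.3):
    """Count contiguous runs of the envelope above threshold."""
    bursts, inside = 0, False
    for level in envelope:
        if level > threshold and not inside:
            bursts += 1
            inside = True
        elif level <= threshold:
            inside = False
    return bursts

# Simulated envelope: quiet, scratch, quiet, scratch, quiet.
envelope = [0.0, 0.1, 0.8, 0.9, 0.2, 0.1, 0.7, 0.6, 0.1, 0.0]
print(count_bursts(envelope))  # -> 2, i.e. a "double scratch"
```

The real system does more (high-pass filtering to reject ambient sound, and richer features than burst counts), but this is the flavor of signal processing that turns a table, wall, or pant leg into an input surface.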

RELATED:

The Best Paper Award at UIST '08 went to "Bringing Physics to the Surface", by Andrew Wilson of Microsoft Research, and Shahram Izadi, Otmar Hilliges, Armando Garcia-Mendoza, and David Kirk of Microsoft Research Cambridge.

Here is the abstract:

"This paper explores the intersection of emerging surface technologies, capable of sensing multiple contacts and often shape information, and advanced games physics engines. We define a technique for modeling the data sensed from such surfaces as input within a physics simulation. This affords the user the ability to interact with digital objects in ways analogous to manipulation of real objects. Our technique is capable of modeling both multiple contact points and more sophisticated shape information, such as the entire hand or other physical objects, and of mapping this user input to contact forces due to friction and collisions within the physics simulation. This enables a variety of fine-grained and casual interactions, supporting finger-based, whole-hand, and tangible input. We demonstrate how our technique can be used to add real-world dynamics to interactive surfaces such as a vision-based tabletop, creating a fluid and natural experience. Our approach hides from application developers many of the complexities inherent in using physics engines, allowing the creation of applications without preprogrammed interaction behavior or gesture recognition."
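To make the abstract a bit more concrete, here is a toy Python sketch of the central idea as I read it: each sensed contact applies a friction-like force to the virtual object it touches, so dragging a finger pulls the object along, and a physics step integrates the result. All names and numbers here are my own invention, not the authors' implementation.

```python
# Toy reading of "Bringing Physics to the Surface": contacts on a
# virtual object exert friction-like forces proportional to the
# relative velocity between finger and object.

def step(obj, contacts, friction=0.5, dt=1.0):
    """Advance a virtual object one time step (unit mass, Euler).

    obj: dict with 'pos' (x, y) and 'vel' (vx, vy).
    contacts: list of dicts with 'pos' and 'vel' for each touch
    point currently on the object.
    """
    fx = fy = 0.0
    for c in contacts:
        # Friction force from relative contact velocity.
        fx += friction * (c["vel"][0] - obj["vel"][0])
        fy += friction * (c["vel"][1] - obj["vel"][1])
    vx = obj["vel"][0] + fx * dt
    vy = obj["vel"][1] + fy * dt
    x = obj["pos"][0] + vx * dt
    y = obj["pos"][1] + vy * dt
    return {"pos": (x, y), "vel": (vx, vy)}

# A finger moving right at 2 units/step drags a resting object.
obj = {"pos": (0.0, 0.0), "vel": (0.0, 0.0)}
finger = {"pos": (0.0, 0.0), "vel": (2.0, 0.0)}
for _ in range(3):
    obj = step(obj, [finger])
print(obj["pos"])  # -> (4.25, 0.0): the object accelerates toward finger speed
```

The paper's contribution is doing this with a full physics engine and whole-hand shape input; the appeal is visible even in this sketch, since the object follows the finger without any gesture recognition code at all.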
Preparation for the Internet of Surfaces & Things?




(Cross-posted on the Technology-Supported Human World Interaction blog)

Nov 15, 2008

Multi-touch and Flash: Links to resources, revisiting Jeff Han's TED 2006 presentation

Despite the increase in interest in systems that support multi-touch, multi-user multimedia interaction, there is a need for creative, tech-savvy types to develop innovative applications. Why? This technology has the potential to make a powerful impact on how people learn, communicate, solve "big picture" problems, and do their various jobs.

CNN's Magic Wall was one of the first applications to gain the attention of the masses, as it was used as an interactive map during the US presidential election process. Touch-screen interaction gained even more notice after the recent SNL parody by Fred Armisen.

If you think about it, the multi-touch applications you see on the news aren't much different from what you'd get from a "single-touch" program.

Fancy, yes. Truly innovative, no.

Just imagine a 3D multi-touch, multi-user, multimedia version of Google Search. I did. I put my sketches in my idea book and hurt my brain thinking about how it could be coded.

Jeff Han, the man behind Perceptive Pixel and CNN's magic wall, had much more up his sleeve when he demonstrated his work at TED 2006. Even if you've previously seen this video, it is worth looking at again. (I've provided a link to the transcript below.)



Transcript of Jeff Han's TED 2006 Presentation

This video presentation had a transformational effect on me when I watched it for the first time. Jeff Han brought to life ideas similar to ones I had as a beginning computer student, when I thought about collaborative educational games and multimedia applications that could be played on interactive whiteboards.

Here are some selected quotes from the video:

"I really really think this is gonna change- really change the way we interact with the machines from this point on."

"Again, the interface just disappears here. There's no manual. This is exactly what you kind of expect, especially if you haven't interacted with a computer before."

"Now, when you have initiatives like the hundred dollar laptop, I kind of cringe at the idea that we're gonna introduce a whole new generation of people to computing with kind of this standard mouse-and-windows pointer interface. This is something that I think is really the way we should be interacting with the machines from this point on. (applause)"

"Now this is going to be really important as we start getting to things like data visualization. For instance, I think we all really enjoyed Hans Rosling's talk, and he really emphasized the fact that I've been thinking about for a long time too, we have all this great data, but for some reason, it's just sitting there. We're not really accessing it. And one of the reasons why I think that is, is because of things like graphics- will be helped by things like graphics and visualization and inference tools. But I also think a big part of it is gonna be- starting to be able to have better interfaces, to be able to drill down into this kind of data, while still thinking about the big picture here."

So now what?

A recent post by "Alex" on the AFlex World blog discusses a few solutions. Alex had a chance to meet with Harry van der Veen and Pradeep George from the NUI Group, and Georg Kaindl, a multi-touch interaction designer from the Technical University of Vienna. The focus of the discussion was to come up with ideas to encourage Adobe/Flash designers and developers to learn more about multi-touch technology and interaction, and take steps to create innovative applications.

I especially like the following quote from the post:

"...A quick quote from our conversations: “When our children will walk up to a display, they will touch it and expect to do something.”"

As a techie and a school psychologist, I see an immediate need for innovative applications. I know that there is a built-in market in the schools, at least for low-cost applications. Despite economic constraints, many school districts continue to invest in interactive whiteboards (IWBs). They are cropping up in preschool and K-12 settings, and teachers are searching for more than what's currently available.

Interactive, collaborative applications are needed in fields such as health care, patient education, finance & economics, urban planning, civil engineering, travel & tourism, museums & exhibitions, special events, entertainment, and more.

Smart Technologies, the company behind SmartBoards, has a new interactive multi-touch, multi-user table designed for K-6 education, the Smart Table. Hewlett Packard has several versions of the TouchSmart PC, which can support at least dual-touch, if not multi-touch, multi-user applications. There are numerous all-in-one large-screen displays on the market that support multi-touch and multi-user interaction.

Quotes from Harry van der Veen, of Multitouch NL:

"In 10 years from now when a child walks up to a screen he expects it to be a multi-touch screen with which he can interact with by using gestures."

"...multi-touch screens will be as common as for children is the internet nowadays, as common as mobile phones are for us."


Here is a quote from a conversation I had with Spencer, who blogs at TeacherLED.

"It was interesting this week as I was in a classroom with a teacher who I've not worked with before... he had 2 students using the whiteboard who kept touching it together by mistake. The teacher, exasperated, said to himself, "Why can't they make these things to accept 2 touches without going crazy!"

Proof of the demand! I think you are right when teachers spot the limitations and then see the technology on visits to museums, that might stimulate demand."


Spencer uses Flash to create cool interactive mini-applications, mostly for math, that teachers (and students) love to use on interactive whiteboards. (He's interested in multi-touch, too.)


So what are we waiting for?!

Related:
Natural User Interface Europe AB meets Adobe
Georg's Touche Framework
NUI Group
TeacherLED
Interactive Touch-Screen Technology, Participatory Design, and "Getting It".
Hans Rosling's 2007 TED talk