
Jul 24, 2011

Video: Kinect SoundWall, links to info and code!





Here is information about the project from the KinetHacks SoundWall site:

"Kinect sound machines become prettier and easier with each development! The Kinect SoundWall is a drum beat music machine controlled by gestures and voice commands. This video by  displays this digital music machine at work and how through various gesture and voice commands, users can create awesome beats to dance to. In the video, the user gestures to to certain blocks in the screen in order to create a beat there or render the beats void. Through various voice commands, the beat can start, increase tempo, stop etc. Through the proper integration of both voice and gesture commands, the Kinect SoundWall sets the standard for a great and efficient sound machine of the Kinect!"
"For more information about the Kinect SoundWall visit the project’s website."
RELATED
Vertigo SoundWall CodePlex Project Site

Apr 29, 2011

Musical Multitouch/Gestural Interfaces by Osmosis

I've been pleasantly surprised by the increase in interesting multi-touch and gesture-based applications developed for musical interaction on large displays.  This topic is dear to my heart -  I took a computer music technology class back in 2003, and think it would have been great if this sort of thing had been available back then.  Who wants to point and click around a music app for hours on end?!


The video below shows what is new from Osmosis, a company based in NY that focuses on the development of multi-touch and gesture based software for a range of uses, including music applications.


Performance Systems for Stage/Studio from Osmosis on Vimeo.

Transparent Stage System Specs
Design
• Floating, transparent HD displays from 32"
• Haptic surface with tempered glass backing
• High gain image with wide viewing angle
• Rugged aircraft-grade aluminum build
• Enclosed projector and computer
• Minimalist style, compact footprint
• Disassembles for easy transportation


Interactivity
• Projected capacitive foil or IR bezel options
• Up to 32 simultaneous touch points
• Precise, responsive touch tracking (3mm)
• Immune to external light conditions
• Use of fingers, gloves or stylus


Technical
Display:
• DLP portable projector
• 1280×800 HD resolution
• 2500 ANSI-lumen, 1800:1 contrast
Computer:
• Mini-ITX, Core 2 Quad, 4GB RAM
• ATI Radeon X1250 graphics card
• Wireless keyboard and mouse
• Windows 7 Pro

Stuart McClean, the founder of Osmosis, shared the following information about his company:

"Osmosis is a consulting firm based in the NYC area with deep experience in interactive technology. Although we cater to a range of markets, we’re especially passionate about music production and performance. Working closely with artists, we build customized interactive systems for stage and studio. HCI technology is integrated into a range of designs including stands, desktop rigs, tables, carts and vertical screens. Interfaces are tailored to specific artist needs and combine controls, generative audio and visuals, instruments, and gestural input. Our unique and flexible systems take full advantage of multi-touch interaction and offer seamless control of Ableton Live, Traktor, or other DAWs via midi and OSC..."




For more information about the applications developed by the Osmosis team, take a look at their showcase page.  

Apr 22, 2011

Pervasive Retail Part I: Web UX Meets Retail CX - Screens Large and Small at the Mall, Revisited

If you follow my blog(s), you know that I have a passion for interactive displays in public spaces, and that I enjoy watching how various technologies converge, jump across platforms and devices, inter-operate, and re-purpose over time.  

The best places for watching this unfold, in my opinion, are airports, malls, shopping districts, and larger "big box" establishments, where the Web meets Digital Out of Home (DOOH), old-fashioned kiosks morph into multi-touch screens and gesture-based windows, and visual merchandising meets technology, digital culture, architecture, and consumer metrics. At the center of it all is the user/consumer - regular people, moms, dads, kids, teens, the elderly, the disabled, the hurried and the worried. Adding to the complexity is that an increasing number of people who are out-and-about are tethered to various mobile devices.

In scholarly tech circles, the concept of DOOH is known as "Pervasive Retail".  The explosion of mobile devices and ubiquitous screens has fueled the fire for research, and is the focus of the current issue of IEEE Pervasive Computing.

Despite the influx of technology, no one is exactly sure how to do it quite right.  (I have some ideas, which I'll save for a future post.)

If you are interested in learning more about concepts related to "pervasive retail", the Retail Customer Experience website is a treasure trove of information related to DOOH, digital signage, multi-channel retailing, in-store media, kiosks, interactive touch screens and windows, related metrics, and more, with stories about real-life technology implementation.


Mall Video
The following video, taken with my handy HTC Incredible, provides a quick sampling of the screens I encountered during a recent visit to South Park Mall, in Charlotte, N.C.  The last screens in the clip were taken in the Brookstone store, and will be included in another clip that focuses solely on all of the screens that were scattered about the retail space.  


I have a hunch that some of the smaller displays in the Brookstone store were iPads.  iPads and tablets have great potential for use for shelf-level in-store interactive visual merchandising deployments, given the right apps and mounting systems. (See iPads as Cheap Digital Signage, by Tony Hymes of DOOHSocial and the video about Premier's iPad mounts, for more information.)

Much of what you'll see in the following video, taken at the same mall in December of 2009, wasn't around during my most recent trip:
Screens Large and Small at the Mall

Interactive Coke Machine and Kid at the Mall












I was sad to see that the interactive screen on the Coke machine had been replaced by an ordinary one.  Part of the problem, I think, is that the interactive display was too busy and, as a consequence, made the goal of getting a quick drink a bit too complicated for the average thirsty customer, as seen in the video below:


Touch Screen Coke Machine at the Mall: 90 seconds to get a coke!

References and Resources (Partial List)
Ron Brunt, InTouch with Retailing Whitepaper, 1/15/06
Brian Monahan, IPG Emerging Media Blog, 4/15/11
When all the world is a screen (The video is worth taking the time to watch.)
Narayanaswami, C., Krüger, A., and Marmasse, N., "Pervasive Retail," IEEE Pervasive Computing, vol. 10, no. 2, Apr.-June 2011, pp. 16-18.
References from the Pervasive Retail article:
Mobile Retail Blueprint, Nat'l Retail Federation; www.nrf.com/modules.php?name=Pages&op=viewlive&sp_id=1268.
G. Belkin, Pervasive Retail Business Intelligence, Aberdeen Group, Apr. 2010; www.slideshare.net/AxiomConsultingAustralia/pervasive-retail-business-intelligence.
R. Wasinger, A. Krüger, and O. Jacobs, "Integrating Intra and Extra Gestures into a Mobile and Multimodal Shopping Assistant," Proc. 3rd Int'l Conf. Pervasive Computing (Pervasive), Springer, 2005, pp. 297–314.
A. Meschtscherjakov et al., "Enhanced Shopping: A Dynamic Map in a Retail Store," Proc. 10th Int'l Conf. Ubiquitous Computing (UbiComp 08), ACM Press, 2008, pp. 336–339.
C. Stahl and J. Haupert, "Taking Location Modelling to New Levels: A Map Modelling Toolkit for Intelligent Environments," Proc. Int'l Workshop Location- and Context-Awareness (LoCA), LNCS 3987, Springer, 2006, pp. 74–85.

Feb 24, 2011

Vision-Based Hand-Gesture Applications: Video from Communications of the ACM



The latest edition of Communications of the ACM, via "snail mail", was the inspiration for this post:



Vision-Based Hand-Gesture Applications
Juan Pablo Wachs, Mathias Kolsch, Helman Stern and Yael Edan

"Body posture and finger pointing are a natural modality for human-machine interaction, but first the system must know what it's seeing."


More to come!

Dec 11, 2010

Gesture "multitouch" 12 x 7 interactive video wall provides tours of I/O Data Centers' facilities

I came across this demonstration of I/O DataCenter's 12 by 7 foot interactive video wall that makes playing around with views of data center modules...interesting! The display is a gesture-based "multi-touch" system. (I'll update this post when I get more information.)



Here is the description from the Datacenter YouTube channel:


"Instead of hauling a 40-foot long modular data center to a trade show, i/o Data Centers is taking a high-tech approach to customer tours of their i/o Anywhere modular data center. The i/o team has created a 12-foot by 7-foot touchscreen video wall to provide interactive tours of the company's facilities. Selecting a "hot spot" pops up a virtual data center, complete with cross sections and product info, following the concept of the touch screens in the sci-fi movie "Minority Report.""


FYI: I/O Data Centers has an application that runs on the Surface.

UPCOMING:
Stay tuned for my upcoming posts! 


News about LM3LABS (Previous post)
Interactive Surveillance: Celine Latulipe (technologist) and Annabel Manning (artist)

Dec 3, 2010

More gesture and multi-touch interaction! Windows 7 navigation with Kinect; product browser by Immersive Labs

Here are a couple of new natural user interface videos.  The first video, by Evoluce, demonstrates gesture interaction/navigation in Windows 7 applications supported by Kinect. The second video, by Immersive Labs, shows multi-touch product browsing interaction on a large display.

Kinect Treatment of Windows 7, by Evoluce

Evoluce: Leading Surface Technologies


Immersive Labs - Multi-touch Product Browser

Immersive Labs

Nov 30, 2010

TuioKinect, by Martin Kaltenbrunner: "A simple TUIO hand gesture tracker for Kinect"

More Kinect from Martin Kaltenbrunner:


Martin Kaltenbrunner's description of TuioKinect:
"TuioKinect tracks simple hand gestures using the Kinect controller and sends control data based on the TUIO protocol. This allows the rapid creation of gesture enabled applications with any platform or environment that supports TUIO tuio.org/​ You can download the application from: code.google.com/​p/​tuiokinect/​ Music: Jabon Jabon by El Club de los Astronautas (Institut Fatima)"


I've played around with Tuio and OpenFrameworks, but it has been a while.  I can't wait until I have time to dig into this with a Kinect. I think this has great potential for supporting learning and communication among students with special needs.
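For the tech-curious: TUIO 1.1 transports cursor updates as OSC messages on the /tuio/2Dcur address, with three message types: "alive" (the session ids still on the surface), "set" (position and motion of one cursor), and "fseq" (the frame number). The dependency-free sketch below shows how a client might decode those argument lists; a real client would receive them through an OSC library on UDP port 3333, the TUIO default. The class name is my own, not part of TuioKinect.

```python
class TuioCursorTracker:
    def __init__(self):
        self.cursors = {}   # session id -> (x, y) in normalized [0..1] coords
        self.frame = -1

    def handle_2dcur(self, *args):
        """Decode the argument list of one /tuio/2Dcur OSC message."""
        kind = args[0]
        if kind == "set":
            # set: session id, x, y, x-velocity, y-velocity, acceleration
            session_id, x, y = args[1], args[2], args[3]
            self.cursors[session_id] = (x, y)
        elif kind == "alive":
            # "alive" lists the session ids still touching the surface;
            # anything missing has been lifted, so drop it here.
            alive = set(args[1:])
            self.cursors = {s: p for s, p in self.cursors.items() if s in alive}
        elif kind == "fseq":
            self.frame = args[1]


tracker = TuioCursorTracker()
tracker.handle_2dcur("alive", 7)
tracker.handle_2dcur("set", 7, 0.25, 0.75, 0.0, 0.0, 0.0)
tracker.handle_2dcur("fseq", 1)
print(tracker.cursors)        # {7: (0.25, 0.75)}
tracker.handle_2dcur("alive")  # hand lifted: no ids remain alive
print(tracker.cursors)        # {}
```

An application layer would then interpret the tracked cursor positions as gestures, which is exactly the layering TUIO is designed to enable.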

RELATED/SOMEWHAT RELATED
TuioKinect:  TUIO Hand tracker for Kinect
Martin Kaltenbrunner, Tangible Interaction Frameworks 11/27/10
Therenect: Theremin for the Kinect! (via Martin Kaltenbrunner)
Xbox Kinect Interactive Puppet Prototype, from Theo Watson and Emily Gobeille, creators of Funky Forest
Hacked Kinect Multitouch using libFreenect and libTISCH (via Florian Echtler)

Nov 13, 2010

HACKED KINECT MULTITOUCH using libFreenect and libTISCH (via Florian Echtler)

MULTI-TOUCH WITH HACKED KINECT
Here is NUI Group member Florian Echtler's proof-of-concept HD video of multitouch-like interaction using a hacked Kinect camera.  The application runs on Ubuntu Linux and was written using libfreenect, by marcan42, and Florian's own creation, libTISCH.



Florian decided to use picture-browsing interaction to demonstrate the proof of concept, so "everybody can focus on more interesting things :-)"


(I have SO many ideas for this!  I'll throw a few out there in an upcoming post....maybe someone can run with them!)


RELATED
Hacked Kinect taught to work as multitouch interface
Paul Miller, engadget, 11/11/10


FOR THE TECH-CURIOUS:
TISCH stands for Tangible Interactive Surfaces for Collaboration between Humans, and is a cross-platform, cross-device multi-touch development framework.  You can download the source package for Windows, MacOS X, and Linux from the TISCH Sourceforge website. The Ubuntu Lucid/Karmic version has "superquick installation via PPA" - the instructions can be found on the TISCH Sourceforge website.


LibFreenect- Open Source PC Drivers for Kinect
Xan Tium, XBLOG 360 11/10/10

Marcan is Hector Martin Cantero, the author of the Abort, Retry, Hack? blog.

For your convenience, I've reposted something I wrote about libTISCH back in 2009:

For techies (and the tech-curious) who like technologies that support collaboration and multi-touch interaction,  this is great news!

Florian Echtler announced the first stable release of libTISCH, a multi-touch development framework, which can be found on Sourceforge.  TISCH stands for Tangible Interactive Surfaces for Collaboration between Humans.  libTISCH, a C++ software framework, is included in this project.  It provides a means for creating GUIs based on multi-touch and/or tangible input devices.

Here is how it works:

Architecture Layers
Here is information from the libTISCH announcement:

Highlights of this release are, among others, the following features:

- ready-to-use multitouch widgets based on OpenGL
- reconfigurable, hardware-independent gesture recognition engine
- support for widely used (move, scale, rotate..), pre-defined gestures
 as well as custom-defined gestures

- hardware drivers for FTIR, DI, Wiimote, DiamondTouch..
- TUIO converters: source and sink

- cross-platform: Linux, MacOS X, Windows (32 and 64 bit)
- cross-language: C++ with bindings for C#, Java, Python

libTISCH has a lot to offer the multitouch developer. For example, the textured widgets enable rapid development of applications for many kinds of multi-touch or tangible interfaces. The separate gesture recognition engine allows the translation of a wide range of highly configurable gestures into pre-defined or custom events, which are then acted on by the widgets. While the lower layers of libTISCH provide functionality similar to tbeta, touche, etc. (you can interface existing TUIO-based software with libTISCH in both directions), it goes far beyond them.
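As a generic illustration of what such a gesture engine computes for the pre-defined move/scale/rotate gestures (this is my own sketch, not libTISCH's actual API): given the previous and current positions of two touch points, it derives the translation, scale factor, and rotation angle a widget should apply.

```python
import math

def two_finger_transform(prev, curr):
    """prev/curr: ((x1, y1), (x2, y2)) touch positions in screen coords."""
    (p1, p2), (c1, c2) = prev, curr

    def centroid(a, b):
        return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

    def span(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    pc, cc = centroid(p1, p2), centroid(c1, c2)
    move = (cc[0] - pc[0], cc[1] - pc[1])     # pan: how far the centroid shifted
    scale = span(c1, c2) / span(p1, p2)       # pinch: ratio of finger distances
    rotate = angle(c1, c2) - angle(p1, p2)    # twist: change in angle (radians)
    return move, scale, rotate


# Two fingers spread from 100px apart to 200px apart: a pure pinch-out.
move, scale, rotate = two_finger_transform(
    prev=((0, 0), (100, 0)),
    curr=((-50, 0), (150, 0)),
)
print(move, scale, rotate)   # (0.0, 0.0) 2.0 0.0
```

An engine like libTISCH's generalizes this: raw touch events flow up through the layers, are matched against configurable gesture definitions, and emerge as high-level events the widgets consume.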

More information about the library and underlying architecture can be found on http://tisch.sf.net/ and in the Sourceforge wiki at
http://sourceforge.net/apps/mediawiki/tisch/


Note:
Dr. Florian Echtler is on the scientific staff at the Technische Universität München in Germany. Be sure to check out his webpage.

I especially like the concept of the MeTaTop: "A Multi-Sensory Table Top System for Medical Procedures" that is linked from Florian's website.


MeTaTop: A Multi-Sensory Table Top System for Medical Procedures

Oct 31, 2010

Microsoft is acquiring Canesta, Inc., a developer of 3-D electronic perception technology for natural user interaction, gaming, and more.

Microsoft to Acquire 3-D Chip Firm Canesta
Michael Baron, TheStreet 10/29/10

Thanks to Harry Van Der Veen, of NUITEQ, for this link!

RELATED
The following video is from the Canesta3D YouTube channel. It demonstrates the 3D input sensor in action, with four people moving around in a living room. The chip used in the system depicted in the video was the precursor to the current chip, called the "Cobra 320x200".


Below is a demo of gesture interaction using Canesta3D technology to control and select information and content on a large display.  In my opinion, this will change the way we interact with our TVs, at least for those of us who hate using bad remotes!  Microsoft's acquisition of Canesta is good news, especially if they allow this technology to be used by the masses.  I'm pretty sure it is capable of supporting interaction with internet-ready HD TVs, and can support GoogleTV, Leanback, and Vimeo's Couch Mode.




Canesta Announces Definitive Agreement to be Acquired by Microsoft
Press Release, 10/29/10, Canesta

About Canesta (From the Canesta website)
"Canesta (www.canesta.com) is the inventor of revolutionary, low cost electronic perception technology and leading provider of single chip CMOS 3-D sensors that fundamentally change the relationship between devices and their users. This capability makes possible true 3-D perception as input to everyday devices, rather than the widely understood 3-D representational technologies as output. Canesta’s 3-D input technology, based upon tiny, CMOS 3-D imaging chips or “sensors”, enables fine-grained, 3-dimensional depth-perception in a wide range of applications. Products based on this capability can then react on sight to the actions or motions of individuals and objects in their field of view, gaining levels of functionality and ease of use that were simply not possible in an era when such devices were blind. Canesta’s focus is on mass market consumer electronics, but many applications exist in other markets as well. Canesta is located in Sunnyvale, CA. The company has filedin excess of fifty patents, 44 of which have been granted so far."


Canesta Corporate Fact Sheet (pdf)
Videos: http://canesta.com/applications/consumer-electronics/gesture-controls

I posted some videos about Canesta's technologies on the following post. There are two videos that show how Canesta's 3D depth camera works on a Hitachi flat-panel display: Interactive Displays 2009 Conference

For more information about interactive TV, GoogleTV, Leanback and Couch Mode, see the second section of my recent post:
Philipp Geist: Blending the Physical with the Digital;  Google TV/Leanback, Vimeo's new Couch Mode, oh..and ViewSonic's 3D (glasses-less) pocket camcorder...

Sep 18, 2010

Interactive 360 Degree Glass-less 3D Video Display with Gesture Sensor: Demo of Sony's RayModeler

The video below gives a demo of Sony's RayModeler, "A 360-Degree Display that doesn't require glasses". The video shows how the auto-stereoscopic 3D content is filmed. It also shows how items within the display respond to gesture interaction. The first prototype was introduced in 2009 and then brought out at the SIGGRAPH conference this summer.



According to an article written by Richard Lawler, Core77 created "Breakout" for the RayModeler, a game similar to Pong.  I'll have to think more about this technology before I form an opinion!

RELATED
Sony's 360-degree RayModeler 3D display brings its glasses-free act to LA, plays Breakout
Richard Lawler, Engadget 7/28/10

Sony's 360-degree 3D display prototype makes virtual pets more lifelike, expensive
Thomas Ricker, Engadget 10/19/09

Jul 6, 2010

Samsung Transparent OLED + Wedge Camera, Glassless 3D, Telepresence, Mid-air Interaction: Applying Science at Microsoft

The Microsoft Applied Sciences Group has been working on several projects that have the potential of changing how we interact with various displays and surfaces in the very near future.  Here's some of what came across in my RSS feeds and Google Alerts this morning:
INAVATE  July 5, 2010

According to an article in InAVate, "Microsoft has combined Samsung’s transparent OLED with a sub-two-inch camera to revolutionize the Microsoft Surface platform. The touchless telepresence screen creates a 3D gesture-control interface that tracks movement by seeing through the display. The company’s Applied Sciences Group has also added its recently revealed wedge shaped lens, that InAVate reported on last month, to deliver glasses-free 3D content...the latest breakthrough could revolutionize the Surface concept, taking touch away from the display and projecting the images in 3D." -InAVate, 7/5/2010


3D Gesture Interaction

"In this demonstration, we've placed the Microsoft Applied Science's wedge technology behind Samsung's transparent OLED display. This enables a camera to image through the display, see the user's hand above it, and alter the image based upon her gestures." -Microsoft Applied Sciences Group

3D Without the Glasses: A new type of display from Microsoft produces multiple images and tracks the viewer's eyes - Kate Greene, MIT Technology Review (6/11/2010)

According to an article in MIT's Technology Review, "the new lens, which is thinner at the bottom than at the top, steers light to a viewer's eyes by switching light-emitting diodes along its bottom edge on and off. Combined with a backlight, this makes it possible to show different images to different viewers, or to create a stereoscopic (3-D) effect by presenting different images to a person's left and right eye. "What's so special about this lens is that it allows us to control where the light goes," says Steven Bathiche, director of Microsoft's Applied Sciences Group." -Kate Greene, Technology Review

Steerable Multi-view Display

"In this demonstration, we use head tracking to determine where multiple users are. Then, with the Microsoft Applied Sciences' wedge technology, we steer completely independent images to each user. In the video, one user is seeing a sun while at the same time another is seeing a rocket. This is maintained even as the users change positions relative to each other." -Microsoft Applied Sciences Group
Transparent Display for Telepresence

"In this demonstration, we've placed the Microsoft Applied Science's wedge technology behind Samsung's transparent OLED display. This enables a camera to image directly through the display. In the video, objects held up to the screen are captured and shown to the user on the other side of the telepresence communication (the other monitor in the video), while far away from the screen, the display shows the user a view dependent image."-Microsoft Applied Sciences Group
Steerable 3D Auto Stereo Display

"In this demonstration, we use head tracking to determine where a user's eyes are. Then, with the Microsoft Applied Sciences' wedge technology, we steer different views of the scene to each eye to produce a 3D image without the need for glasses or for fixing the location of the user." -Microsoft Applied Sciences Group
Mid-air Interactive Display

"In this demonstration, we illuminate objects above the display with infrared light. We capture the reflection using the Microsoft Applied Sciences' wedge technology. This enables us to see above the display while keeping the form factor small. Seeing above the display allows us to track the interaction between direct contacts on the display. In the video, the user associates a function (color choice) with one hand and a different function (zoom/rotation) with the other hand. This tool persistence is maintained regardless of the relative positions of the hands." -Microsoft Applied Sciences Group

RELATED
About Microsoft Applied Sciences Group
"The Applied Sciences Group (ASG) is an applied research and development team dedicated to createing the next generation of computer interaction technologies.  The interdisciplinary group focuses on the synergy between optics, electronics and software to create novel human computer interfaces.  The ASG is part of the Entertainment and Devices Division at Microsoft Corp. and mainly supports projects for Microsoft Hardware, XBox, and Microsoft Surface.  It also works closely with Microsoft Research."

May 7, 2010

The atracTable is Coming Soon: Sony will launch a high-definition touch- and gesture-interactive tabletop, using Atracsys's technology!

Sony will be introducing a full high-definition interactive table, a result of a collaboration with the Swiss company Atracsys.


EXCLUSIVE: Sony atracTable to take on Microsoft Surface from June
atracTable Baselworld 2009 reference 3


(At about 2:14 in the video below, there is a demonstration of an application that recognizes facial features and expressions, which are used to control and manipulate images on the screen.)
Images from the Sony Stand at Vision 2009


Here is an "overview" video that shows a number of uses for the Attractable:



Here is a version of the atracTable, using a tangible user interface to create music:





Here is the "Nespresso" table, which provides people with information about the type of coffee that you are drinking. It makes more sense as demonstrated in the video.
Atracsys @ Baselworld 2010


beMerlin:  Interactive gesture-based application for retail:

Jan 26, 2010

There is a need for multi-touch/gesture designers/developers!

If you are a talented interactive web designer/developer, game designer/developer, traditional programmer with a creative bent, or someone who is thinking about working with technology in the future as a programmer or designer, I urge you to consider designing/developing multi-touch applications in the near future.

In my opinion, there will be a need for multi-touch web applications as well as for multi-touch education and collaboration applications for the SMART Table, Microsoft's Surface,  multi-touch tablets like the rumored iTablet from Apple, and the multi-touch laptops and all-in-ones (Dell, HP, etc.).

Below are direct links to some of my blog posts related to multi-touch applications and screens. If you are fairly new to multi-touch, I'm sure that looking through some of my blog posts will be helpful.  All of the posts have links to resources, and most have photos and video clips of multi-touch in action.

If you are new to this blog, I have a great deal of information, links, photos, and video clips of various multi-touch screens and applications. The best way to find it is to enter a keyword in this blog's search box: multitouch, touch screen, gesture, multi-touch, etc.

Also do a search on my other blog: The World Is My Interface http://tshwi.blogspot.com

Here are some links:
Do you have an HP TouchSmart, Dell Studio One or NextWindow touch-screen? NUITech's Snowflake Suite upgrade provides a multi-touch plug-in
http://bit.ly/5tdlhc

The following blog post has a video clip that shows someone from Adobe painting with a multi-touch application in development:
More Multi-Touch!: Rumor of the mobile apple iTablet; AdobeXD & Multitouch; 10-finger Mobile Multitouch: http://bit.ly/4S9Upm

Ideum's GestureWorks: http://bit.ly/4C1p7M

Interactive Walls, Interactive Projection Systems, GestureTek's Motion-Based Games: http://bit.ly/6GRGtW

Intuilab's Interfaces: Multi-touch applications/solutions for presentations, collaboration, GIS, and commerce: http://bit.ly/7RK7qN

For software developers:
How to do Multitouch with WPF 4 in Visual Studio 2010: http://bit.ly/7c4YqC

Dec 27, 2009

Touch, Multi-Touch & Gesture Responsive Web & Related Applications (helpful if you have a touch screen or IWB!)

I regularly share information about applications that work well on touch, multi-touch, and/or gesture-based screens.  Over the past few months, there have been updates and new developments that I'm still exploring. (Some of this information might be "old" news, but for many, it will be "new".)

Here's what I have to share today!

Be sure to explore the activities from the Kids section of the National Gallery of Art website, located at the end of this post.

MULTI-TOUCH FIREFOX



Multi-touch on Firefox from Felipe on Vimeo.

Code Snippets from Felipe's Demo (Includes tracking divs, drawing canvas, image resizing, image crop, & pong) Mozilla Wiki
Bringing Multi-touch to Firefox and the Web
Christopher Blizzard, Mozilla Hacks

COOL IRIS
I have a hunch that someone out there is working on a multi-touch version of Cool Iris. Until I can find out the details, take a look at the videos below:


Cool Iris Overview on Google Chrome


 Here is a short video of what Cool Iris looks like on an iPhone:



Cool Iris Links
Cool Iris and iPhone
Cool Iris and Developers
Cool Iris Blog
Cool Iris Media/Press


About Cool Iris:   "Cooliris, Inc. was founded in January 2006 with a simple mantra: "Think beyond the browser". We focus on creating products that make discovering and enjoying the Web more exciting, efficient, and personal. Our core products include Cooliris (formerly PicLens), which transforms your browser into an interactive, full-screen "cinematic" experience for web media, and CoolPreviews, which lets you preview links instantly. Headquartered in Palo Alto, CA, our team consists of seasoned developers, entrepreneurs, and Stanford computer engineers. Each of us is passionate about serving our users without compromise and seeing that our products deliver the best experience."


BUMPTOP
Bumptop Gets Multi-touch Support on Windows 7


Bumptop Website
You can download Bumptop from the Bumptop website.  Here's the description:
"BumpTop is a fun, intuitive 3D desktop that keeps you organized and makes you more productive.  Like a real desk, but better.  Now with awesome mouse and multi-touch gestures!"
Anand Agarawala's Ted Talk

"Anand Agarawala presents BumpTop, a user interface that takes the usual desktop metaphor to a glorious, 3-D extreme, transforming file navigation into a freewheeling playground of crumpled documents and clipping-covered "walls.""

Discussion about Bumptop on the TED website

NATIONAL GALLERY OF ART KIDS
I've been compiling a list of websites that offer good touch-interaction.  One site that is good for children - and children at heart - is the National Gallery of Art's Kids pages.  There are a few activities that I had fun playing with students on the new SMART Boards at one of my schools:

FACES AND PLACES - LANDSCAPE

interactive landscapes


"FACES & PLACES encourages children of all ages to create portraits and landscape paintings in the style of American naive artists. By combining visual elements borrowed from more than 100 works in the National Gallery's permanent collection, this two-part interactive activity offers an overview of American folk art of the 18th and 19th centuries.(Shockwave, 6 MB)."


This one is so fun!  You can select different characters and make them dance, run, jump, or even fall.  You can design the landscape and add buildings, trees, and animals, and even change the sky pattern.  Press "go", and your character will travel around the panorama you've created.


DUTCH DOLL HOUSE

inDutch-Studio
Dutch Dollhouse  (Shockwave, 4.6 MB)
"Mix and match colorful characters, create decorative objects, and explore the kitchen, living quarters, artist's studio, and courtyard of this interactive 17th-century Dutch House."


NGA KIDS JUNGLE
Jungle interactive


"Create a tropical jungle filled with tigers, monkeys, and other exotic creatures. Inspired by the art of Henri Rousseau, NGAkids Jungle is an interactive art activity for kids of all ages. (Shockwave, 930k)"


What I liked about the Jungle application is that each item can be easily customized.  On the SMARTBoard, as well as on my HP TouchSmart PC, it is very easy for a student who has limited fine-motor control to create beautiful pictures.


FLOW
snow flow
"Flow is a motion painting machine for children of all ages. Enjoy watching the changing patterns and colors as you mix pictures on two overlapping layers. Choose  designs from four sets of menu icons, or add to the flow by clicking the pencil tool to create your own designs."


This application is a favorite of some of the students I work with who enjoy watching things spin. (You don't have to have an autism spectrum disorder to enjoy playing with Flow!)

National Gallery of Art Student and Teacher Online Resources