Showing posts sorted by relevance for query: gesture

Dec 20, 2009

CityScape 1.8: What would it be like on an interactive touch or gesture screen? (Video and link to demo)


PixelActive has announced version 1.8 of CityScape, "the rapid urban modeling tool that allows users to build both custom and real-world environments quickly and easily."


CityScape 1.8 looks ideal for use on an interactive touch/gesture display or table.  Since it allows people to work within the same environment concurrently,  I have a hunch that it might work, perhaps with a few tweaks, in multi-touch, multi-user situations.  

As you watch the video below, imagine your finger as the pointer, creating terrain, landforms, buildings, roads, and other structures and spaces, and editing on-the-fly.  

CityScape  "Real Cities, Real Fast" (via Digital Urban & YouTube)
The above video is of version 1.7. To learn about the new features, see the "What's New in CityScape 1.8" section of the PixelActive website.

Unfortunately, it comes with a high price tag. Even so, wouldn't it be a great tool to use in middle and high school classrooms? I want to get the demo and try it out on a SMART Board. I can think of quite a few ways that it could be put to work in learning environments.

About CityScape
"CityScape was created to reduce the tedious, labor intensive process of building and modifying urban environments, empowering the interactive world builders of tomorrow."


"Using robust data importing, rapid urban modeling, Active Adjustment features, and procedural modeling, CityScape is able to achieve substantial improvements in worker efficiency over traditional object modeling tools and scene editors.
Worlds created by CityScape are compatible with a variety of file formats and optimized for real-time rendering engines."


Pictures from the PixelActive website:
CityScape's client-server model allows collaboration between co-located or distributed teams, so the environment can be modified concurrently if needed. It also lets users check regions in and out in order to work on things independently.
Client-Server Model diagram for CityScape.

Terrain Painting: Hills created with the terrain painting tool.
Imported Heightmap: Mountain range created by importing a heightmap.


Dynamic adjustment allows you to quickly change your mind and modify without headaches!
Buildings will keep their relative positions to the roads as roads are dynamically adjusted.


If you change elevations, everything will adapt automatically.
Animation of a road adapting to the change in geography as a mountain is moved under it.


Here's a picture of the traffic data generation feature:
Downtown intersection showing all available traffic splines (colorized according to current snapshot in street-light simulation).


Here is a picture of an example of CityScape's GIS data import feature:
Aerial view of La Jolla city model.


PixelActive Company Information

"PixelActive Inc. was founded in 2003 and is located in Carlsbad, California. Our mission is to develop state of the art interactive 3D technology. We believe that tools and technology should be intuitive to users regardless of its underlying complexity. Our focus is on the video game, virtual world, simulation, serious game, and GIS industries."


"The company's rapid urban modeling tool, CityScape, allows users to build both custom and real-world environments quickly and easily. With the editor, developers can focus on designing their world rather than the labor involved in creation. CityScape's unique feature set allows users to produce environments magnitudes faster than with traditional object modeling tools or scene editors."


"The PixelActive team has a track record of delivering exceptional products on time and on budget. Our core technology is focused on stability, reliability, modularity, usability, high performance and visual quality. PixelActive develops technology for a variety of PC platforms including Microsoft Windows XP and Windows Vista, and video game consoles such as the Microsoft XBox360™, Nintendo Wii™, and Sony Playstation® 3."

RELATED
You can download a demo of CityScape from the PixelActive website.

Nov 23, 2009

GestureTek & Sprint's Interactive Wall: 3D depth-sensing allows wall interaction with a cell phone.

I missed this one! The video and photos below are of the Sprint Center Interactive Wall, powered by GestureTek's 3D depth-sensing system. The media art was created by Takashi Kawashima, a designer/media artist who lives in San Francisco. He has an MFA in Design | Media Arts from UCLA.

The interactive display can be controlled by a cell phone.



YouTube description/plug:
"GestureTek's 3D depth sensing technology powers an attention-grabbing interactive digital signage system for telecom leader Sprint. The 3D depth sensing interactive display screen, with mobile phone connectivity, tracks people's body movements, and responds by sending a Sprint promotional message that follows them the entire length of the interactive billboard. The interactive motion-detecting advertising message invites users to create their own personalized interactive wall art on Sprint's gesture control screen, by calling Sprint on their mobile phone. GestureTek's 3D tracker is the heart of the system. Installation lead: Mission Electronics. Creative: Goodby Silverstein."

The Instant DJ application looks fun!  It allows you to mix the music tracks on the large display with your phone.


Phone Painter:  Sprint Center Interactive Wall



Instant DJ



Now Widget


RELATED
Sprint Uses GestureTek 3D Tracking & Control System for New Interactive Digital Signage Campaign
GestureTek Announces 3D Gesture Tracking Initiatives for Sprint and Hitachi; Shares New 3D Patent Information

SOMEWHAT RELATED
GestureFX:  Next Generation Pediatrics Business Case (interactive floor for a pediatric clinic's waiting room)
AirPoint Hand-Tracking Unit for Mouse Replacement and "Point to Control" Interactivity

Cross posted on The World Is My Interface blog

Oct 28, 2009

libTISCH, a multi-touch development framework with multi-touch widgets and more!

For techies and the tech-curious who like technologies that support collaboration and multi-touch interaction,  this is great news!

Florian Echtler announced the first stable release of libTISCH, a multi-touch development framework, which can be found on SourceForge. TISCH stands for Tangible Interaction Surfaces for Collaboration between Humans. libTISCH, the project's C++ software framework, provides a means for creating GUIs based on multi-touch and/or tangible input devices.

Here is how it works:

Architecture Layers (diagram)
Here is information from the libTISCH announcement:


Highlights of this release are, among others, the following features:

- ready-to-use multitouch widgets based on OpenGL
- reconfigurable, hardware-independent gesture recognition engine
- support for widely used, pre-defined gestures (move, scale, rotate, ...) as well as custom-defined gestures
- hardware drivers for FTIR, DI, Wiimote, DiamondTouch, ...
- TUIO converters: source and sink
- cross-platform: Linux, Mac OS X, Windows (32 and 64 bit)
- cross-language: C++ with bindings for C#, Java, Python

libTISCH has a lot to offer the multitouch developer. For example, the textured widgets enable rapid development of applications for many kinds of multi-touch or tangible interfaces. The separate gesture recognition engine allows the translation of a wide range of highly configurable gestures into pre-defined or custom events, which are then acted on by the widgets. While the lower layers of libTISCH provide functionality similar to tbeta, touche, etc. (you can interface existing TUIO-based software with libTISCH in both directions), it goes far beyond them.
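The gesture-engine idea described above, raw touch positions in, pre-defined gestures such as scale and rotate out, can be sketched in a few lines. This is a conceptual Python illustration, not libTISCH's actual API (libTISCH is C++, and the function and argument names here are invented for the example):

```python
import math

def pinch_gesture(p1_old, p2_old, p1_new, p2_new):
    """Derive scale and rotation deltas from two moving touch points.

    Each argument is an (x, y) tuple; a real engine would receive these
    from the tracking layer as 'finger moved' events and hand the result
    to a widget as a pre-defined scale/rotate gesture event.
    """
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    scale = dist(p1_new, p2_new) / dist(p1_old, p2_old)
    rotation = angle(p1_new, p2_new) - angle(p1_old, p2_old)
    return scale, rotation

# Two fingers move apart while the pair rotates a quarter turn:
scale, rot = pinch_gesture((0, 0), (100, 0), (0, 0), (0, 200))
print(round(scale, 2), round(math.degrees(rot), 1))  # 2.0 90.0
```

The point of separating this layer, as libTISCH does, is that widgets only ever see high-level events, regardless of which hardware produced the touches.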

More information about the library and underlying architecture can be found at http://tisch.sf.net/ and in the SourceForge wiki at http://sourceforge.net/apps/mediawiki/tisch/


TISCH Project Wiki

RELATED
Florian is on the scientific staff at the Technische Universität München in Germany. Be sure to check out his webpage.

I especially like the concept of the MeTaTop: "A Multi-Sensory Table Top System for Medical Procedures" that is linked from Florian's website.
MeTaTop: A Multi-Sensory Table Top System for Medical Procedures

Oct 9, 2009

WIRED's Overview of Touchscreen PCs and Interface Innovations (and some links for the tech-curious)

According to Priya Ganapati's recent article in WIRED, only 3% of all PCs and notebooks have interactive touchscreens. More are coming to market, such as Sony's VAIO L Touch HD PC, Dell's all-in-one, and Lenovo's tablet PC and ThinkPad laptop. HP came out with the TouchSmart PC and touch-enabled laptops, and will be adding updated versions soon.

This is a great opportunity for developers interested in touch, multi-touch, and gesture interaction applications.   I think there will be even more opportunity for web developers to create websites that are touch-enabled, or at least optimized for touch screen interaction.

Might the multi-touch web be right around the corner?

At the moment, there is some confusion about what designers should consider when developing applications. There is no standard system of touch or gesture interaction, and researchers are still very busy figuring things out. Some companies have rushed out and patented gestures, which, in my opinion, is like patenting how we breathe. But that is another story.

If you are a designer or developer, you might be interested in the Touch First Microsoft Surface Developer Challenge.  It is a chance to win your very own Microsoft Surface!  Hurry, the deadline is October 12th.

If you want to learn more, read WIRED's Gadget Lab:  "Touchscreen PC's Prompt Interface Innovations" for a good overview of what's happening in the Touch PC world.

If you are curious and would like to dig deeper into this topic, take some time to look at some of my blog-posts about multi-touch, touch screen news, innovations, and related musings:

The World Is My Interface:  An Introduction (and some links)
Interactive Touch-Screen Technology, Participatory Design, and "Getting It".
Multimedia and Interaction Resources (a mega-list, work-in-progress)
Bump Top 3D Desktop on a Touch Screen:  Toss Your Photos to your Facebook Icon!
Ron George's Interaction Design Toolbox
So how are people using their multi-touch all-in-ones?  Medion X9613 will be released soon..
Multi-touch, multimedia, multi-modal: Fujitsu LIFEBOOK 54310 has a multitude of possibilities
Windows Vista User Experience Guidelines: "All Windows programs should be touchable!"

Oct 3, 2009

Interactive TV Game Controllers - A variety of permutations for the present and the future (iPhone, iPod Touch, Wii mote, XBox 360 Controller, Gesture...)

Building on my previous post, The Convergence of TV, the Internet, and Interactivity: Updated and Revised, I thought I'd focus on the different ways people can interact with converging content through interactive TV, internet-connected game consoles, and so forth. Is it time someone came up with a user-friendly UNIVERSAL CONTROLLER that could handle cross-platform, cross-media interaction?

http://www.theiphoneblog.com/images/stories/2009/01/iphone_apple_tv_game_controller-337x400.jpg

How About the iPhone as Controller for Apple TV Gaming Console?
(Read the article's comments section to see what people think about this concept.)


Of course, if you have a Wii, you know about the Internet Channel:

http://www.nintendo.com/images/wii/menu/internetchannel/big_tv.png
Wii + Internet=More

Video of a real 12-year-old kid navigating through the Wii Internet and Wii News/Earth. He provides a great "think aloud" analysis of his interaction.



"They should make a video game that is geography based on something like this..."

Xbox permutations:

This:

 http://www.krunker.com/wp-content/uploads/2006/08/WindowsLiveWriter/MicrosoftXbox360WirelessControllerforWin_AE26/new%20xbox%20wireless%20controller%5B2%5D.jpg

Or maybe even this, if you blog, chat, or  tweet:

http://www.blogcdn.com/www.engadget.com/media/2007/04/qwerty-xbox-360.jpg
The following social/interactive applications can be controlled by your Xbox controller. (The photos were taken from Gizmodo's "Microsoft E3 Keynote Archive" and Zatz Not Funny!'s "Xbox 360: Welcome to the Social?")

http://www.zatznotfunny.com/wordpress/wp-content/uploads/2009/05/sky-xbox-420x315.jpg
Sky Player on Xbox 360 to Launch Mid-October

http://www.zatznotfunny.com/wordpress/wp-content/uploads/2009/06/xbox-social.jpg


With the introduction of gesture-based interaction such as Microsoft's Project Natal, no controller will be required in the future to interact with your screen, no matter the content.

http://technabob.com/blog/wp-content/uploads/2009/06/project_natal_no_controller.jpg

Project Natal for Xbox 360 could put Microsoft ahead of Wii with controller-free gaming


YOU ARE THE CONTROLLER
XBox Project Natal Website

Motion Sensing Confirmed for 2010 (Sony PlayStation 3)


RELATED
Interactive demonstration of how BBC's Red Button works for interactive TV content.  You can press the red button on the mockup, located on the lower right-hand section of the screen.
BBC Red Button Demo

BBC Red Button launches new CBeebies interactive service
Digital Television Group 9/28/09

Social Television and User Interaction
(Scholarly articles on this topic from the ACM Portal) 
Stefan Agamanolis (researcher in this area)


Interactive TV Research from UITV.INFO

Sep 15, 2009

Applications for touch, multi-touch, and gesture interaction: What's Marching to Market?

Every week or so, I hear about a new multi-touch enabled laptop. Today, I came across a review on Gizmodo of the Lenovo ThinkPad T400s, a 14.1-inch capacitive touchscreen laptop that can handle four touches at once:

Lenovo ThinkPad T400s with Multitouch Review: Finger Flicking Fun


The Multitouch ThinkPad in Action

Lenovo ThinkPad T400s with Multitouch from Gizmodo on Vimeo.

SCREENSHOT
 http://cache.gawker.com/assets/images/gizmodo/2009/09/SimpleTab.jpg
"The screen is pure beauty. Like a capacitive touchscreen phone, light taps on the display are responsive and you don't have to think about positioning your finger in a certain way to open apps or rearrange windows." - Joanna Stern, Gizmodo, 9/15/09


OTHER EXAMPLES
 
THE FUJITSU LIFEBOOK T5010

Add the Fujitsu LifeBook t5010 to the Windows 7 Multitouch Laptop List
Gizmodo, Joanna Stern, 9/15/09
http://cache.gawker.com/assets/images/gizmodo/2009/09/LifeBook_T5010_multitouch.jpg


DELL LATITUDE XT2
 Dell Latitude XT2 Tablet PC


http://i.dell.com/images/global/products/latit/latit_highlights/laptop-latitude-xt2-overview3.jpg

TOUCH DEMO- HP TOUCHSMART TX2 & WINDOWS 7


AIR HOCKEY FOR WINDOWS 7: IDENTITY MINE


FINGERTAPPS





Want to create something multi-touch?

It looks like there will be a growing need for Windows 7 application developers. Here are some code examples using Windows Presentation Foundation and the Windows 7 Multitouch API:

Windows 7 Multitouch Application Development, Part I (The Code Project: Kunal Chowdhury)


Other Developer Resources

Touchlib Multi-touch Development Kit  (NUI GROUP)

MSDN Windows Touch: Developer Resources

nTrig's Multi-Touch Download for Windows 7

NUITech's Snowflake Suite for NextWindow

TouchKit Multi-touch Development Kit


GestureWorks Flash Multitouch SDK

3M Multi-touch Developer Kit


RELATED
Touch-less Interaction from lm3labs

Create the future you want. with ubiq'window from Nicolas Loeillot on Vimeo.

More about gesture interaction and multi-touch can be found on this blog- do a search!

Aug 8, 2009

More about Project Natal: Ricochet - Great Gaming for Fitness, Johnny Chung Lee's Contribution


(Credit: CNET News)
In a recent CNET Beyond Binary post, Ina Fried reviewed her experience playing Ricochet, a 3D game developed by Microsoft for Natal, the company's new gesture-recognition, controller-less Xbox gaming system. Above is a screenshot from Fried's article, Exclusive: Getting up close and personal with Natal.

Here is the video:


Ina Fried had a chance to spend some time in Redmond, Washington to explore the games in development at Microsoft, and hang out with the people responsible for Project Natal.

In her Beyond Binary article, Fried notes that the Ricochet game provides quite a workout, and this has had a positive effect on the Natal team:


"Since I started working on this project, I've lost almost like 10 pounds," said Kudo Tsunoda, general manager of Microsoft Game Studios and the creative director for Project Natal. "We're going to have the most in-shape development team you've ever seen."


Fans of Johnny Chung Lee will be happy to know, if they haven't heard by now, that his work at Microsoft contributed to this game in some way!

Who is Johnny Chung Lee? Read my post, "I wish I could be Johnny Chung Lee for a Day!".


RELATED

Speaking of Natal, it should be out next year (Ina Fried, CNET)

Gates: Natal to bring gesture recognition to Windows too



May 26, 2009

GestureTek's "Cube": A compact Interactive Gesture-based Display System

GestureTek's "Cube"




From the GestureTek website:

"Introducing The Cube - a compact, turnkey, 'plug and play' interactive display unit that brings the power of gesture control to a variety of display spaces. Project the interactive 80” diagonal display onto almost any floor, wall, table or counter for branding, advertising, entertainment and product promotion. The Cube will engage customers, turn heads and drive business results."

Brochure
(pdf)

May 10, 2009

Future of Interaction: Microsoft Lab's Vision 2019 Montage; Future Healthcare Montage

The following video is from Microsoft Office Vision 2019. If you visit the YouTube source, you'll find an interesting discussion about this five-minute+ clip.



Video Presentation from MIX09: Untold Stories of Touch, Gesture, & NUI
Joseph Fletcher, Design Manager, Microsoft Surface


"Learn about new patterns, behaviors, and design approaches for touch and gesture interfaces from a practitioner's point of view. Learn early lessons from applied knowledge of touch applications, devices, and design methods."




LukeW's notes from Joseph Fletcher's presentation
LukeW's post: Design Principles at Microsoft

LukeW is the Senior Director of Product Ideation & Design at Yahoo! Inc.

Apr 25, 2009

How soon will we see interactive information visualization for multi-touch & gesture systems?

The field of information visualization is growing. Until recently, most visualizations were created for use on a single PC or larger screen, allowing for interaction by only one user at a time. I have a feeling that this will be changing in the very near future.

Why? Interactive duo- and multi-touch interfaces are becoming more common, and now come in all screen sizes, from the iPhone to the Surface to CNN's multi-touch "Magic Wall". People of all ages play interactive games on the Wii, often on large flat-panel displays. In my opinion, the time is right for those developing applications for the InfoViz world to think about harnessing the power of multi-touch.


Below is a picture of the front page of the Visual Complexity website. If you go to the site, you can select a visualization, and then explore it more in detail, as each picture links to a web-page that provides background information about the visualization project, the artist or team behind the project, and links to the project's website.

I took a look at a variety of the examples posted on the Visual Complexity website, and think many would be enhanced by a transformation to a multi-touch, gesture, bi-manual, or duo/multi-user system. I'm interested in learning what others think about this. If you are working on a collaborative information visualization project, feel free to add a comment and post a link.

Here is a nice quote from the website:

"Functional visualizations are more than innovative statistical analyses and computational algorithms. They must make sense to the user and require a visual language system that uses colour, shape, line, hierarchy and composition to communicate clearly and appropriately, much like the alphabetic and character-based languages used worldwide between humans."

Matt Woolman
Digital Information Graphics




Update: I did a search for "multi-touch" on the Visual Complexity website and found a couple of interactive applications:

Reactable (I've posted about this system a few times!)
(Reactable website)
Prototouch

(Wirmachenbunt Website)
Nov 13, 2008

RENCI at Duke University: Multi-Touch Collaborative Wall and Table utilizing TouchLib; More about UNC-C's Viz lab...

RENCI is a multi-disciplinary collaboration between several universities in North Carolina, with centers located at the Europa Center, Duke University, N.C. State, UNC Chapel Hill, East Carolina University, UNC-Asheville, UNC-Charlotte, and the Health Sciences Library at UNC-Chapel Hill. Many of the centers focus on visualization and collaborative technologies, and have been involved in multi-touch "surface" computing.

The pictures below are from the RENCI center at Duke University:

http://vis.renci.org/multitouch/wp-content/themes/daleri-dark-10/img/front.jpg

Duke Multi-Touch Collaborative Wall

The multi-touch wall is 13 x 5 feet, utilizes six high-definition projectors for a combined resolution of 5760 x 2160, and supports multiple users. According to information on the RENCI website, the design is scalable and applicable to non-flat surfaces. The wall system runs on Windows and Linux.

Duke Multitouch Wall. (Photo credit: Josh Coyle)

The Wall is positioned at the end of the primary collaboration space. (Photo credit: Josh Coyle)

DI, or Direct Illumination, is used for touch detection in both the wall and the table. A separate instance of Touchlib runs for each of the eight cameras used to detect touch, and each camera is handled separately for image processing and blob tracking tasks. A gesture engine interprets the information about touches on the screen as gesture events.
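To see how per-camera blob positions might be combined into one wall-sized coordinate space, here is a minimal Python sketch. The grid layout, camera resolution, and function name are all assumptions for illustration; the actual RENCI calibration (see the calibration device below) is more involved:

```python
def blob_to_wall(camera_index, x, y, cams_per_row=4, cam_w=640, cam_h=480):
    """Map a blob position from one camera's local frame into wall coordinates.

    Assumes the cameras are tiled in a simple grid with no overlap;
    a real installation would also apply per-camera calibration and
    lens-distortion correction before merging.
    """
    col = camera_index % cams_per_row
    row = camera_index // cams_per_row
    return col * cam_w + x, row * cam_h + y

# A blob at the center of camera 5 (second row, second column of a 4x2 grid):
print(blob_to_wall(5, 320, 240))  # (960, 720)
```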

Direct Illumination (DI)

Graphics from the RENCI Vis Group Multi-Touch Blog

The Duke Multi-Touch Wall System

Here is a cool picture of the "Multi-touch Calibration Device", which uses a built-in Touchlib utility.

Calibrating using the utility built into TouchLib.

Additional information can be found on the RENCI Vis Group Multi-Touch Blog.

FYI

Touchlib is a multi-touch development kit that can be found on the NUI-Group website.

"Touchlib is a library for creating multi-touch interaction surfaces. It handles tracking blobs of infrared light, and sends your programs these multi-touch events, such as 'finger down', 'finger moved', and 'finger released'. It includes a configuration app and a few demos to get you started, and will interface with most types of webcams and video capture devices. It currently works only under Windows but efforts are being made to port it to other platforms."

If you are interested in creating your own multi-touch table, the NUI-Group website and forums are a great place to start.
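The 'finger down', 'finger moved', 'finger released' events described above can be mimicked with a small tracker that diffs blob IDs between frames. This is a conceptual Python sketch of that event model, not Touchlib's actual C++ API:

```python
class TouchTracker:
    """Emit finger down / moved / released events by comparing the set of
    tracked blob IDs between consecutive camera frames."""

    def __init__(self, on_down, on_move, on_up):
        self.on_down, self.on_move, self.on_up = on_down, on_move, on_up
        self.active = {}  # blob id -> last known (x, y)

    def frame(self, blobs):
        """blobs: dict mapping blob id -> (x, y) for the current frame."""
        for bid, pos in blobs.items():
            if bid not in self.active:
                self.on_down(bid, pos)        # new blob: finger down
            elif pos != self.active[bid]:
                self.on_move(bid, pos)        # existing blob shifted
        for bid in self.active:
            if bid not in blobs:
                self.on_up(bid)               # blob vanished: finger released
        self.active = dict(blobs)

events = []
t = TouchTracker(
    on_down=lambda b, p: events.append(("down", b)),
    on_move=lambda b, p: events.append(("move", b)),
    on_up=lambda b: events.append(("up", b)),
)
t.frame({1: (10, 10)})  # finger 1 pressed
t.frame({1: (12, 11)})  # finger 1 moved
t.frame({})             # finger 1 released
print(events)  # [('down', 1), ('move', 1), ('up', 1)]
```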

Related:

If you follow my blog, you probably know that I've taken several graduate courses at UNC-Charlotte. Some of my professors and a classmate or two have been involved in some exciting visualization research over the past year. (If you are serious about multi-touch and other visually-based applications, it is worth taking some time to familiarize yourself with visualization and interaction research.)

News from the UNC-Charlotte Vis Center:

At the University of North Carolina at Charlotte, RENCI is a collaboration between the UNC Charlotte Urban Institute, the Center for Applied Geographic Information Science, and the Charlotte Visualization Center.

11/06/2008
Robert Kosara’s group wins two awards at IEEE VisWeek

Caroline Ziemkiewicz and Robert Kosara won Honorable Mention (the second highest award) at the IEEE InfoVis Conference for their paper, “The Shaping of Information by Visual Metaphors”. Also, Alex Godwin, Kosara’s student, won Best Poster for his submission, “Visual Data Mining of Unevenly-Spaced Event Sequences”.

The Vis Center is pretty fascinating, as you can see by the group of visitors at an open house.

If you are just as fascinated by this stuff as the guys in the picture, here are links to some recent papers by UNC-Charlotte faculty affiliated with the Vis Center:

The Shaping of Information by Visual Metaphors (Caroline Ziemkiewicz and Robert Kosara)

Evaluating the Relationship Between User Interaction and Financial Visual Analysis (Dong Hyun Jeong, Wenwen Dou, Felesia Stukes, William Ribarsky, Heather Richter Lipford, Remco Chang)

Visual Analytics for Complex Concepts Using a Human Cognition Model (Tera Marie Green, William Ribarsky, and Brian Fisher)

Jun 6, 2008

Emerging Interactive Technologies, Emerging Interactions, and Emerging Integrated Form Factors

Every so often I come across a blog that has been around for a while, but is a new discovery to me. If you are interested in interactive multimedia and emerging technology, and you haven't heard of Richard Banks, you will probably be pleased with his vast array of interesting posts!

After browsing around Richard Banks's blogs, I found that he works for Microsoft Research. Like me, he maintains more than one blog, and he blogs about similar topics.

Here is a view into his world:

rb.trends: tracking future technology experience
rb.work: technology, design and research
rb.log: family, photography, architecture etc.

One of Richard's recent posts brought me back to my musings about interactive surfaces:

Via Richard Banks and Gizmodo: A Touch Screen Poker Table from Hard Gaming

If it is possible to program an application to handle the metrics that support poker, then it is possible to program a multi-user application for work or education, and have this application integrated into any type display, according to need.

The form factor above would be useful for team meetings or collaborative project planning, especially during group decision-making or policy planning activities that involve data analysis or information visualization.

Could this concept be modified to fit the form of a kidney table for use in education? Think about it. We could combine the best features of surface computing with the best features of multi-user video games.
http://www.qualitychurchfurniture.com/images/activitytablecolors/lrg_48KID72LOset.jpg
Rationale? Video-game applications can handle the metrics of multi-user interaction, which would be ideal for tracking student progress and interaction. Touch interaction would give young children an easy way to access the application, or all of the information that will be available on the multi-touch web of the future, without first learning to type.

(Touch and gesture interaction could help us move towards the implementation of the Internet of Things)
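The multi-user-metrics idea above can be illustrated with a toy sketch. Everything here (the SessionTracker class, region-per-user attribution) is hypothetical, just to show what per-student interaction tallies might look like on a shared table:

```python
from collections import defaultdict

class SessionTracker:
    """Toy sketch: attribute touch events to users by screen region and
    tally interactions, the kind of metric a multi-user lesson app
    (or video game) might track. Region-per-user is a simplifying
    assumption; a real system would need better user identification."""

    def __init__(self, regions):
        self.regions = regions          # user -> (x0, y0, x1, y1)
        self.counts = defaultdict(int)  # user -> touch count

    def touch(self, x, y):
        """Record one touch; return the user it was attributed to."""
        for user, (x0, y0, x1, y1) in self.regions.items():
            if x0 <= x < x1 and y0 <= y < y1:
                self.counts[user] += 1
                return user
        return None

# Two students share an 800x600 table, split down the middle:
table = SessionTracker({"alice": (0, 0, 400, 600), "bob": (400, 0, 800, 600)})
table.touch(100, 50)   # alice's side
table.touch(500, 300)  # bob's side
table.touch(450, 10)   # bob's side
print(dict(table.counts))  # {'alice': 1, 'bob': 2}
```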

Could this concept be transformed for use by two people collaborating on an electronic drafting table?


http://www.dwvartengine.com/media_downloads/dw_draftingTable_lo.jpg

If you are a NUI Group member, think about this!


Note: I came up with the phrase "multi-touch web of the future" as I was typing this post. It turns out I didn't invent it, though: I looked it up and found Andrew Foote's post, "The Multi-Touch Web"

I'm in.


Somewhat Related.......


More for the tech-curious:

Interactive data visualization in Second Life

Idle together "Technology blog oriented towards good design and impressive web applications"

Apple Developing Full Multi-Touch Macs - Apple Insider


More for the tech-savvy:

Multi-touch gestures in the Factor UI (Apple) - by Slava Pestov. Includes code.

NSResponder Modifications: Swipe, Rotate, and Magnify, by Elliott Harris, a member of the NUI Group, I think.

Web Development Guidelines for iPhone
(lots of good info, with video clips and code)

Web 3D Consortium

Unity 3D kit for Google's Android


Multi-touch Web Consortium (in my dreams)

Gesture Web Consortium (in my dreams)