Showing posts sorted by relevance for query gesture.

Feb 15, 2013

Designing for Touch & Gesture: Tips for Apps and the Web (Updated)

In the past, our fingers did the walking, sifting through files, papers, pamphlets, and phonebooks; later, we pointed and clicked with a mouse to interact with images and text, in essence electronic imitations of the paper-based world. Traditional forms, brochures, ad inserts, and posters informed much of the design.

How much have things changed? It is 2013, but you'd think it was 1997 from the PowerPoint look and feel of many apps and web sites! Touch is everywhere, but from what I can tell, not enough designers and developers have stepped up to the plate to think more deeply about ways their applications can support human endeavors through touch and gesture interactions.

For an overview of this topic, take a look at my 2011 post, written after a number of ugly encounters with user-unfriendly applications:  Why bother switching from GUI to NUI?  

For an in-depth look into the history of multi-touch, the wisdom of Bill Buxton is well worth absorbing. He's worked with all sorts of interfaces and has been curating the history of multi-touch and gesture systems since 2007:


Multi-Touch Systems that I have Known and Loved
Bill Buxton, Microsoft Research, Updated 8/30/12



Even if you are not a designer or developer, I encourage you to explore some of the links below:

Touch Gestures for Application Design
Luke Wroblewski, 10/9/12

Common Misconceptions About Touch
Steven Hoober, 3/18/13

Designing With Tablets in Mind:  Six Tips to Remember
Connor Turnbull, Webdesign tuts+, 9/27/11

Finger-Friendly Design: Ideal Mobile Touchscreen Target Sizes
Anthony T, Smashing Magazine, 2/21/12

Best Practices: Designing Touch Tablet Experiences for Preschoolers (pdf)
Sesame Street Workshop


Are Touch Screens Accessible?
AccessIT, National Center on Accessible Information Technology in Education

iOS Human Interface Guidelines
Apple

Android User Interface Guidelines
Using Touch Gestures
Handling Multi-Touch Gestures
Android

Designing for Tablets?  We're Here to Help!
Roman Nurik, Android Developers Blog 11/26/12

Touch interaction design (Windows Store apps)
Microsoft - MSDN



Dec 15, 2011

Christian Bannister's Interactive Multi-touch and Gesture-based Subcycle Project. Use your hands to shape sound, create, and manipulate music. Wow!!

I came across a link to Christian Bannister's Subcycle Labs website when I was taking a look at the Creative Applications Network website.


If you have an interest in music technology and innovative gesture/multitouch applications, you'll appreciate the details that are shared on the Subcycle website.  In the meantime, take the time to watch a few of Christian's videos.  Enjoy!


Blip Shaper Walkthrough

Blip Shaper Walkthrough from christian bannister on Vimeo.
"a) creating percussive patterns with monome b) shaping the individual sounds that make up the patterns with multitouch gestures c) recording touchscreen gestures as automation d) storing, duplicating and navigation patterns e) recording the resulting audio to a dynamic buffer f) manipulating the buffer with a multitouch cut-up approach g) visualizing everything with dual screens"
Subcycle Walkthrough

Subcycle Blip Shaper from christian bannister on Vimeo.


The following information describing the Blip Shaper is from Christian's Subcycle website/blog:
"For the drum sounds I have Drumaxx running for synthesized sounds and Battery running for sampled sounds.  These are running in parallel so for each voice there is a separate patch running in each VST.  The Parameters are modified with the touchscreen independently but in all cases a single touch gesture on the X-Axis will cross fade between the sampled version of the sound and the synthesized version of the sound.  I love this because I have never seen this before and I can never decide which technique I like better.  The synthesized drums are more malleable and have more interesting parameters to play with but the sampled sounds seem more substantial.  I will post a detailed list of parameters and gestures in the future."   
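The single-touch crossfade Christian describes maps an X-axis position to a pair of gains, one for the sampled sound and one for the synthesized one. Here is a minimal sketch in Python; the function name and the equal-power curve are my assumptions, not details from the Subcycle code:

```python
import math

def crossfade_gains(x_norm: float) -> tuple:
    """Equal-power crossfade: x_norm = 0.0 -> all sampled, 1.0 -> all synthesized.
    Returns (sampled_gain, synth_gain); cosine/sine curves keep the perceived
    loudness roughly constant across the fade."""
    x = max(0.0, min(1.0, x_norm))  # clamp the touch position to [0, 1]
    angle = x * math.pi / 2
    return math.cos(angle), math.sin(angle)
```

A touch handler would call this on every X-axis move event and feed the two gains to the parallel drum voices.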


Christian currently uses Max for Live (ableton), and codes with Processing/Java, using Eclipse for organization.

Autopilot - Subcycle
autopilot - subcycle labs from christian bannister on Vimeo.
"sound visualization, multi-touch interface, break beat performance engine, autopilot, spacialized sound, dsp, max/MSP, fm synthesis, sonic navigation, sound storm visualization, time machine, granular, interactive sound sculpture, joystick array, more at subcycle.org"


SOMEWHAT RELATED 
Community Core Vision
NUI Group
BTW, I'm wondering if Christian or other NUI Group members would like to do something with some of my ideas that incorporate interactive music. To learn more, see my post, "It must be Voronoi: Looking for ideas for my music+art+dance+tech dream...."

Apr 19, 2011

Gesture + Multitouch Interactive Media Wall, Wisconsin Institutes for Discovery, 2011 Award of Excellence, Digital Screenmedia Association

Below is a video of an interactive media wall at the Wisconsin Institutes for Discovery, the winner of the 2011 Award of Excellence from the Digital Screenmedia Association in the category of Best Government/Education/Non-Profit Agency Deployment.


Detailed information about the Discovery Wall, including an overview of the technology, objectives of the deployment, and the positive outcomes of this project, can be found on the Digital Screenmedia website.

"Float4 Interactive is a creative technology company that develops interactive systems for entertainment, advertising and design applications."


Credits
Interactive technology: Float4 Interactive
Content: Zebra Dog
AV Integration: R2W
Concept: Sensory Interactive


DISCOVERY WALL: 10 X 4 Christie MicroTiles
5496 X 1650 resolution
Gesture + MultiTouch Interaction
2 X RealMotion Servers


ENTRANCE PORTAL 17 X 2 Christie MicroTiles
9824 X 866 resolution
Multi-Zone Gesture Interaction
2 X RealMotion Servers

RELATED/SOMEWHAT RELATED
Wisconsin Institutes for Discovery: 2011 Excellence Award, Architainment 
Live Design, 3/21/10
Architainment Examples
Christie MicroTiles Drive Fan Experience at new Nascar Hall of Fame
Nascar Hall of Fame (I live in the Charlotte area, so I'll visit the Nascar Hall of Fame soon and check out the Christie MicroTiles display!)
Miami Dolphins Choose Montreal's Arsenal Media and Float4 Interactive for Fan-Focused Interactive Wall at SunLife Stadium (BuzzWall) Arsenal Media, 12/16/10

Jan 21, 2010

Ideum's GestureWorks vs. Adobe AIR 2 and Flash Player 10.1: a comparison of multitouch and gesture support

Jim Spadaccini, of Ideum, shared information about his company's product, GestureWorks, highlighting how it provides better multi-touch and gesture support than Adobe AIR 2 and Flash Player 10.1. GestureWorks supports multiple-point drag, rotate, and scale at the same time. In the video, the application is demonstrated on an HP Touchsmart 600 and a 3M multitouch screen.
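Simultaneous drag, rotate, and scale can all be derived from just two touch points tracked across frames. Here is a rough sketch of the underlying geometry; this is my own illustration, not GestureWorks code, and all names are hypothetical:

```python
import math

def two_finger_transform(p1_old, p2_old, p1_new, p2_new):
    """Derive drag, rotation, and scale from two touch points, each given as
    (x, y) in the old and new frames. Returns (drag, rotation_radians, scale)."""
    def delta(a, b):
        return (b[0] - a[0], b[1] - a[1])
    v_old = delta(p1_old, p2_old)  # vector between the two fingers, before
    v_new = delta(p1_new, p2_new)  # ...and after the move
    # scale: ratio of finger separations; rotation: change in vector angle
    scale = math.hypot(*v_new) / math.hypot(*v_old)
    rotation = math.atan2(v_new[1], v_new[0]) - math.atan2(v_old[1], v_old[0])
    # drag: translation of the midpoint between the two fingers
    mid_old = ((p1_old[0] + p2_old[0]) / 2, (p1_old[1] + p2_old[1]) / 2)
    mid_new = ((p1_new[0] + p2_new[0]) / 2, (p1_new[1] + p2_new[1]) / 2)
    drag = delta(mid_old, mid_new)
    return drag, rotation, scale
```

A framework applies these three values to the touched object on every frame, which is what makes "drag, rotate, and scale at the same time" possible.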

Adobe AIR 2 and Flash Player 10.1  vs Gestureworks 1.0: A direct comparison of multitouch and gesture support


"A direct comparison between the built-in support for multitouch found in Adobe Flash Player 10.1 beta / Adobe AIR 2 and that of the Gestureworks multitouch framework for Flash. More about this comparison can be found on the Gestureworks website (http://www.gestureworks.com) and the Ideum website (http://www.ideum.com). There is a blog post with more about this comparison and links to all of the example files at: www.ideum.com/2010/01/true-multitouch-with-adobe-flash/ "

True Multitouch with Adobe Flash - Jim Spadaccini


GestureWorks Supported Gestures

Example of Ideum's GestureWorks multi-touch, multi-user design for an exhibit at the Vancouver Aquarium:

Jan 1, 2010

Apple iSlate, iTablet, MacBook Touch: Will it support gesture interaction & haptic feedback?

Soldier Knows Best produces great tech-oriented videos. Here's his spin on all of the rumors about the possibility of the Apple iSlate.


I just inherited a 10-month-old MacBook, installed Snow Leopard, and upgraded to iLife 2009. I'm so used to touching the screen on my HP TouchSmart PC that I found myself touching my MacBook screen from time to time, especially when I was editing video clips in iMovie. I think the latest version of iMovie was designed with touch/gesture interaction in mind!

From what I can tell, Snow Leopard and iLife 2009 will be able to support a range of touch interactions, if not gesture input as well.

Here are some rumors that have been conjured up and distributed on the web:

The Exhaustive Guide to Apple Tablet Rumors (Matt Buchanan, Gizmodo, 12/26/09)
Apple Expects to Sell 10 Million Tablets in First Year (Pete Cashmore, Mashable, 1/1/10)
iGuide Emerges as Another Potential Apple Tablet Name (Adam Ostrow, Mashable, 12/29/09)
The Tablet (John Gruber, Daring Fireball, 12/31/09)
"And so in answer to my central question, regarding why buy The Tablet if you already have an iPhone and a MacBook, my best guess is that ultimately, The Tablet is something you’ll buy instead of a MacBook."
Apple Owns iSlate.com Domain: The Mystery Deepens (Dan Nosowitz, Gizmodo, 12/25/09)
What is the Ultimate Role of the Apple Tablet? (Arnold Kim, MacRumors, 12/31/09)
iPad, iTablet, iSlate, or MacTab  (Cruz Miranda, 8/31/09)

Why am I excited about this?

I want to see if the iSlate would be good for collaborative educational games, assistive technology, augmentative communication, and alternative assessment for students who have multiple/severe disabilities.

That is a huge goal, so I'm going to start simple.  I am not giving up on Windows 7 multi-touch programming. I just have an urge to find out for myself what works, what doesn't, and what platform works best for specific "personas" and "scenarios".

I plan to make a little app for the iPhone/iPod Touch, based on "Shoes Your Battles," a game I made several years ago for a game class. I think I'd like to make this game for the Apple iTablet!

The first version of Shoes Your Battles was created with Game Maker, and the second was in Flash, back in the days of ActionScript 2.0. I started on a third version, one that could be used as an advergame for people to play while shopping during shoe sales, but it never got past the planning stage.

The idea for the third version came to me when my elderly aunt came to visit from out of town and just had to go shoe shopping the day after Thanksgiving. It was extremely difficult to figure out what was on sale, what it actually cost after the previous mark-downs were taken off, and which sale items had prices that were not yet marked down.

Adding to the confusion was the fact that there were few salespeople and herds of women. It was madness. There were pairs of shoes in the wrong boxes, boxes of shoes, and no way to quickly find out the true prices! We were in the shoe department for hours, and it wasn't as fun as you'd think. If you've been in a crowded women's shoe department to buy that special pair of shoes during a fantastic shoe sale, you'll know what I mean.

At any rate, I wanted my little "Shoes Your Battles" game to help with this dreadful scenario, by somehow incorporating a shoe shopping advisor and a means to figure out the REAL sale prices of those awesome, to-die-for shoes. Unfortunately, the technology wasn't where it needed to be at the time; I am always dreaming up things that are too d--- futuristic!

Four years later, we have iPhones and SmartPhones and 3G internet and RFID and ubiquitous WiFi and the Wii and more women who like to play games and...and... The time is ripe.

Apple better come up with the iSlate!

SOMEWHAT RELATED


Thinking about post-WIMP HCI
It is always important to re-visit wisdom from the past when thinking about new interfaces and means of technology-supported human interaction.  Here are a few resources from the field of Human-Computer Interaction found on the HCI Vistas website:
The Prism of User Experience - A nice graphic metaphor to help the conceptualization process. (Dinesh Katre, 2007)
Journal of HCI Vistas: Multi-disciplinary Perspective of Usability and HCI
Personas as part of a user-centered innovation process (Lene Nielsen, 1/08, HCI Vistas Vol-IV)
10 Steps to Personas (Lene Nielsen, 7/07, HCI Vistas Vol-III)

Oct 25, 2009

Interactive multi-touch for sound design, dj-ing, and music creation

NUI-Group member Christian Bannister, a musician, designer, and developer behind Subcycle Labs, has been experimenting with music and multi-touch technology and interactions. Here's a demonstration of what he's come up with so far:


multi-touch the storm - interactive sound visuals - subcycle labs from christian bannister on Vimeo.
"The big picture goal of this project is to bridge the gap between sound visualization and musical instrument. With multi-touch interaction it is possible to manipulate multiple characteristics of a sound—directly and visually. Right now a lot of electronic music involves staring at the back of the performers laptop. This is a shame because in many cases a lot of really interesting things are happening on the computer that the audience is completely unaware of. This project hopes to create a common visual language and experience for the electronic musician and the audience by enhancing the perception of sound and music on both sides...These sketches are built with Processing and Max/MSP networked with OSC on a single computer..." -Christian Bannister
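Processing and Max/MSP talk to each other over OSC, a simple binary message format usually carried on UDP. To show how little is on the wire, here is a hand-rolled sketch of a one-float OSC message in Python; this is illustrative only, and a real project would use an OSC library rather than encode packets by hand:

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Build a minimal OSC message: a null-padded address string, the type
    tag string ",f" (one float argument), then the float in big-endian."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated, then padded to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)
```

Sending `osc_message("/filter/cutoff", 0.5)` over a UDP socket to the port Max is listening on would be enough for a patch to pick the value up (the address here is a made-up example).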



JazzMutant Lemur
The multi-touch tech company now known as Stantum evolved from JazzMutant. JazzMutant was founded in 2002 by Guillaume Largillier and Pascal Joguet, who were joined by Julien Olivier in 2003. The original focus was to create a multi-touch controller for music applications, and the Lemur was born. It is now in version 2, with features such as a gesture object that provides three ways for people to interact with sound, extended scripting abilities, and remote control of your computer's mouse cursor or keyboard. Stantum recently developed a next-gen multi-touch screen system for use in mobile devices such as smart phones and netbooks. (I'll discuss this further in a future post.)



Mapping Ableton Live to Jazz Mutant's multi-touch Lemur for sound design:


Additional tutorials can be found on the JazzMutant YouTube channel.


Ableton Live, the software used in the above video, will include Max. Max/MSP is now known as Max 5:
"In use for over twenty years by performers, composers, artists, scientists, teachers, and students, Max is the way to make your computer do things that reflect your individual ideas and dreams. Version 5 is a dramatic transformation of Max that begins a new chapter in the program's history."

RELATED
Video:  Max for Live
Max is a product of cycling74
Cycling74 created the Make Controller Kit, which includes fully programmable controllers. The kit is network-based, is capable of working with actuators, and can read sensor information into Max.
Jamoma, a platform for interactive art-based research and performance. Jamoma is the prototyping environment for SpatDIF, the Spatial Sound Description Interchange Format, and GDIF, the Gesture Description Interchange Format.
GDIF: Gesture Description Interchange Format, a tool for music related movements, actions, and gestures 
Stantum's Mobile Phone Multi-touch Interface:  Demonstration of precise interactions on a resistive touch screen

How Stantum's Resistive Multi-Touch Screen Works


Aug 17, 2009

Microsoft's Project Natal & "Fun" Natural User Interfaces for the Office


This looks great!

I came across the picture below today, and discovered more about it in a CNET article:

"Gesture-recognition technology, such as that seen in Project Natal, has changed the face of gaming, but Microsoft's Craig Mundie (pictured) believes it will also transform the office." (Ina Fried, CNET)


This approach looks like it incorporates visual thinking and conceptualization in a nice and fluid way.

RELATED:

"Microsoft: Future desktop PC's will transform the office" (Ina Fried, CNET UK, 7/31/09).


According to Fried, the author of the linked CNET article, Craig Mundie demonstrated the set-up in the above picture at the Microsoft Research Faculty Summit this past July. The demonstration "...included hologram-like videoconferencing, a virtual digital assistant, and multiple surface computers, along with voice-, touch- and gesture-recognition technology. The desk in the demo was a multitouch surface computer, and the office's walls were also a display that could easily switch from being a virtual window or collection of digital photos to a corkboard of sticky notes or various workspaces. In one case, Mundie also used Project Natal-like depth cameras to put himself in the middle of an architectural demo, essentially putting himself inside a building that was not yet built."

Craig Mundie of Microsoft on the Future of Software: Digital Assistants, Natural User Interfaces, and Room Computing (Gregory T. Huang, Xconomy Seattle, 7/13/09)

The author of the above article was also at the Microsoft Research Faculty Summit and touched on the concept of natural user interfaces:

“All the things we talk about as natural user interfaces have been largely used one at a time as enhancements to [graphical user interfaces],” Mundie said. Gesture recognition, expressive responses, immersive 3-D virtual environments, and understanding of context—these advances in computing algorithms will lead to software that is “better at anticipating what you might want.”

Xbox's Project Natal Revamped for Offices? (J. Nicholas Hoover, 7/30/09)

When I find a video of the Project Natal/Surface/Natural User Interface Office in action, I'll be sure to post it!

Why this is important:

Natural user interfaces might have the potential to support more efficient and effective collaboration among people in the workplace, and the visual display of information may also support better decision-making and problem solving.

Mar 30, 2009

Softkinetic 3D Gesture Recognition for Games and Rehabilitative Play

Taking 3D interaction further, Softkinetic has developed middleware that uses a 3D camera to support full-body gesture interaction with games and other applications. No controllers or devices are needed!



The following video is narrated in Portuguese, I think, but you can understand the content in any language. If you love the Wii, you'll probably like this!


Here is a video that demonstrates how Softkinetic and Silverfit partnered to develop rehabilitative games for the elderly and others:


The following table is from the Silverfit website:
Game: movement trained
Puzzle: While sitting down, bend whole body left and right, and stand up. Cognitive/visual component.
Mole: Balance exercise by stepping with one leg while standing.
Catching grapes: Walking movement left and right.
Walking: Walking in place, while avoiding obstacles and thresholds. Activity of Daily Life (ADL) component.
Arm exercise: Arm stretching and reaching in all directions with one or both arms. ADL component.
Picking flowers: Walking backwards, forwards and sideways. Optionally, bending down.
Memory: Arm stretching left, right, forwards and upwards. Cognitive component.

RELATED

Softkinetic and Silverfit Introduce Senior-Targeted Gaming

(Danny Cowan, Gamasutra, 12/19/08)

Softkinetic's Gesture-Based Interactive TV Action:

Feb 15, 2009

Interactive Displays 2009 Conference: Tuesday, April 21 - Thursday, April 23, Hilton San Jose, California

The Interactive Displays Conference, sponsored by Intertech Pira, will highlight an interesting mix of existing and emerging interactive display technologies and applications. The conference will be held at the Hilton in San Jose, California, from Tuesday, April 21st through Thursday, April 23rd.

The pre-conference seminar will feature Sakuya Morimoto, of CANESTA, who will present his company's innovative single-chip 3D image sensor technology that supports gesture interaction.
Keynote speakers will be Jeff Han, of Perceptive Pixel, and Steven Bathiche, of Microsoft US.

Some Highlights:

Pre-conference Seminar: Gesture Navigation in the World of Digital Contents, Enabled by a Single-Chip 3D Image Sensor
Presenter: Sakuya Morimoto, Senior Director, Business Development in Asia, CANESTA, Japan

Related:
Hitachi at CES 2009: Use of Canesta's 3D sensor to control television and home systems using hand gestures.



"With the wave of a hand, with the shake of a hand, you can control volume, you can actually change the channels, watch your favorite program...the most exciting thing, I think, is that you can actually control your temperature and the lighting in the room, the environmental lighting. So..it is very unique technology that is out there.."

Another demonstration of Hitachi's gesture interaction using Canesta's 3D depth camera:



When a TV Remote is Just Too Much Effort, Wave
Jennifer Bergen, PC Magazine
CANESTA Corporate Fact Sheet (pdf)

How does Canesta's Electronic Perception Technology Work?
"Canesta’s electronic perception technology forms 3-D, real time moving images in a single chip through patented methods which use light photons to “range” the image, similar to radar. The silicon sensor chip develops 3-D depth maps at a rate in excess of 30 frames per second, and then performs additional processing on these depth maps to resolve the images into application specific information that can easily be processed by embedded processor(s) in the end-use device or machine. Since Canesta’s software starts with a three-dimensional view of the world, provided immediately by the hardware, it has a substantial advantage over classical image processing software that struggles to construct three-dimensional representations using complex mathematics, and images from multiple cameras or points of view. This dramatic reduction in complexity makes it possible to embed the processing software directly into the chips themselves so they may be used in the most cost-conscious applications."
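Once a sensor like this hands software a per-pixel depth map, even very simple processing becomes useful. As a toy illustration (my own sketch, not Canesta's actual pipeline), a gesture system could locate a hand in a single frame just by scanning for the closest valid reading:

```python
def nearest_point(depth_map):
    """Scan a 2-D depth map (rows of distances, 0 = no reading) and return
    (row, col, depth) of the closest valid pixel, or None if the frame is
    empty. The nearest object to the camera is often the user's hand."""
    best = None
    for r, row in enumerate(depth_map):
        for c, d in enumerate(row):
            if d > 0 and (best is None or d < best[2]):
                best = (r, c, d)
    return best
```

Tracking how that nearest point moves from frame to frame is enough to drive coarse gestures like the channel-changing hand waves in the Hitachi demo.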



I will highlight some of the featured presentations in future blog posts:

Steven Bathiche, Director of Research, Applied Sciences Group, Entertainment and Devices Division MICROSOFT, US
Guillaume Largillier, Chief Strategy Officer and Co-Founder, STANTUM, France
Jeff Han, PERCEPTIVE PIXEL, US
Mark Fihn, Publisher, VERITAS ET VISUS, US
Derek Mitchell, Conference Producer, INTERTECHPIRA, US
Vinita Jakhanwal, Principal Analyst, Small/Medium Displays, ISUPPLI CORPORATION, US
Joseph Carsanaro, President and CEO F-ORIGIN, US
Tommi Ilmonen, CEO MULTITOUCH OY, Finland
Stephen Sedaker, Director of Component Sales WACOM TECHNOLOGY CORPORATION, US
Brad Gleeson, Managing Director, Business Development TARGETPATH GLOBAL LLC., US
Henry Kaufman, President and Founder, TACTABLE, US
Christophe Ramstein, Chief Technology Officer, IMMERSION CORPORATION, US
Mary Lou Jepsen, CEO, PIXEL QI, US
John Newton, Chief Technology Officer, NEXTWINDOW, New Zealand
Herve Martin, CEO, SENSITIVE OBJECT, France
Scott Hagermoser, Gaming Business Unit Manager 3M TOUCH SYSTEMS, US
Bob Cooney, Vice President, Business Development, ECAST, US
Brent Bushnell, Chief Technology Officer UWINK, US
Stephan Durach, Head, Technology Office, BMW GROUP, US
Jeff Doerr, Senior Manager, Business Development Self Service Solutions Group, FLEXTRONICS, US
Andy Wilson, Senior Researcher, Adaptive Systems and Interaction Group, MICROSOFT, US
Mats W. Johansson, Chief Executive Officer, EON REALITY, US
Lenny Engelhardt, Vice President for Business Development, N-TRIG, Israel
Dr Paul Diefenbach, Director, RePlay Lab, DREXEL UNIVERSITY, US
Andrew Hsu, Technical Marketing and Strategic Partnerships Manager, SYNAPTICS, US
Dean LaCoe, Business Development Manager, GESTURETEK, Canada
Keith Pradhan, Global Director of Product Management, TYCO ELECTRONICS, ELO TOUCHSYSTEMS, US
Jerry Bertrand, Managing Member/Acting CEO, MICROSCENT, LLC, US
Frederic Kaplan, CEO and Co-Founder, OZWE, Switzerland


Related

Visionary Jeff Han and Microsoft's Steven Bathiche to Keynote at Interactive Displays 2009

Jan 28, 2009

Details about gesture and free-air interaction from LM3LABS and Ubiq'window

Ubiq'window, by LM3LABS, is a gesture-based system that is used for interactive show windows, interactive in-store marketing, museum installations, and more.

The slides provide details of Ubiq'window's system specifications, including a gesture recognition set. The slides also highlight "Airstrike", a system that allows for free-air, touchless interaction.



RELATED
Lm3lab's Blog

Jan 8, 2009

For Techies and the Tech Curious: Multi-Touch/Gesture from the NUI-Group

If you are a new visitor to this blog and interested in interactive multimedia, you'll need to know more about the NUI Group. The Natural User Interface (NUI) Group is an interactive media group researching and creating open-source machine-sensing techniques to benefit artistic and educational applications.
(For related information, please read my recent post, Usability, Accessibility, and User Experience in a Win7 Environment.)

Seth Sandler, of the NUI Group, sent out a great email with links and resources for people who are interested in multi-touch/gesture interaction, hardware, and/or software development. NUI-Group members who have completed projects are listed below, with links to project websites as well as related threads on the NUI-Group forum.

(The information can be found on the NUI-Group Wiki, which boasts a nice icon-based front page.)

Thanks, Seth, for organizing this wealth of information!

The following projects are divided up by type. (Links to information about the various types of multi-touch and gesture systems can be found near the end of this message.)

FTIR - Frustrated Total Internal Reflection

http://img144.imageshack.us/img144/8105/shemeftir2yu7.jpg
Name: Seth (cerupcat)
Project Name: AudioTouch
Project Website: http://ssandler.wordpress.com
Project Thread: http://nuigroup.com/forums/viewthread/1352/ http://nuigroup.com/forums/viewthread/2309/

Name: bassmang5
Project Name: Æ-table
Project Thread: http://nuigroup.com/forums/viewthread/3144/

Name: Daniel (Zin)
Project Name: Prometheus
Project Thread: http://nuigroup.com/forums/viewthread/2612/

Name: Carsten (carschdn)
Project Name : aTRACKtive
Project Thread: http://nuigroup.com/forums/viewthread/3223/

LLP Laser Light Plane

http://www.codelaboratories.com/images/LLP/HiPressure.jpg

Name: Denis Santelli(dsan)
Project Website: http://www.touchwall.fr/
Project Thread: http://nuigroup.com/forums/viewthread/3051/


LED-LP Light-Emitting Diode Laser Plane
Name: Nolan - (PeauProductions)
Project Name: PeauProductions (LCD)
Project Website: http://peauproductions.blogspot.com/
Project Thread: http://nuigroup.com/forums/viewthread/3291/

DI - Diffused Illumination

http://img359.imageshack.us/img359/1143/shemedi2bx3.jpg
Name: Seth (cerupcat)
Project Name: MTmini
Project Website: http://ssandler.wordpress.com/MTmini
Project Thread: http://nuigroup.com/forums/viewthread/1731/

Name: Fairlane
Project Name: ORION v2
Project Website: http://orionmultitouch.blogspot.com/
Project Thread: http://nuigroup.com/forums/viewthread/1709/

Name: Abdullah (EfeNDy)
Project Name: EfeNDy’s Diffused Illumination MT
Project Thread: http://nuigroup.com/forums/viewthread/3176/

Name: Sandor
Project Name: EXPO REAL 2008
Project Website: http://vimeo.com/2240537
Project Thread: http://nuigroup.com/forums/viewthread/3576/

Name: Matthew (MatthewW)
Project Name: Design Garage
Project Website: http://www.gotuasciencecenter.org/
Project Thread: http://nuigroup.com/forums/viewthread/3730/


Touchless

Name: Jimi Hertz
Project Name: Touchless Wall
Project Website: http://sassexperience.org/projettouchwall.html
(try: http://sassexperience.org/multitouch/inprogress.html )
Project Thread: http://nuigroup.com/forums/viewthread/2414/

YouTube: http://www.youtube.com/watch?v=KCFbWPf37jw

Other Types:
DSI Diffused Screen Illumination

http://iad.projects.zhdk.ch/multitouch/wp-content/uploads/2008/06/sheme_dsi_web.jpg



RELATED LINKS

Tips for Success

Resources
Thanks to all NUI-Group members who have been working so hard at this mission!

I'll end this post with a YouTube video created by Jimi Hertz, a NUI-Group member. "MULTI-TOUCHLESS WALL HOW TO?"

I especially like the music!


Nov 6, 2008

Multi-Touch News from WinHEC and PDC

I received the following videos and links from Anthony Uhrick, who happens to be at WinHEC this week and was at PDC 2008 last week. Touch screen, multi-touch, and gesture technology is starting to catch on. (Anthony is the VP of sales for NextWindow, the company that created the touch screen for the HP TouchSmart and other multi-touch enabled displays.)

Below is a video clip of a multi-touch photo presentation system running Windows 7: Gesture + Touch - has gesture and physics engines.


Apparently the application can run on Vista, Win 7, and Win 7 Touch.

Here is an HP TouchSmart PC, running a Touch Map application on Windows 7:


The following clip is of a newscaster using a multi-touch transparent screen. The display is from U-Touch Ltd., a partner of NextWindow. In my opinion, the application enhances the viewer's understanding of the various news topics, and is visually appealing as well.


The graphics engine used in this application was developed by Vizrt, the same folks who were behind CNN's video hologram. Here are a few pictures from the Vizrt website:








The workflow behind the CNN hologram (click to enlarge)

The holograph "transporter" room during setup.

For more videos using Windows 7 apps, see
creamhackered's YouTube channel. (Videos appear to be from NeoWin Net.)

Windows 7 Design Concepts and Usability Tests



Feb 1, 2009

Reflections: Need for Interactive Infoviz for the Financial Biz, Business Leaders, Government Officials, Educators and the Rest of Us...

If you follow this blog because you are interested in emerging multimedia technologies such as multi-touch and gesture-based displays and tables, you probably know that there is a huge void in terms of content-rich applications for these systems.

Most of the demos show how you can zoom, rotate, and resize photographs, sort through your "stuff", or bat things around the surface as a game.
There is so much more power behind surface technology that needs to be realized!

Here are some of my reflections...

As I write this post, leaders of the financial industry, large corporations, and governments are in Davos, Switzerland at the annual meeting of the World Economic Forum. It is interesting to note that all of these bright men and women are struggling to grasp the enormity of the world's financial crisis and come up with strategies that hopefully will work.


The graphic below depicts how much has changed in the world economy between the 2008 annual meeting of the World Economic Forum and the present. It lacks the "wow" factor that one would expect for an application running on an interactive display. With some tweaking, it could be transformed into an application that supports two people interacting with the data at the same time.


(Click above photo to link to the interactive graphic.)
Via the Wall Street Journal

Here are more examples related to the current economic crisis:


Annus Horribilis in 3D
Financial chart by artist Andreas Nicolas Fischer
via Dan Pink





Life in the Left Tail
(Click for a larger image) via Greg Mankiw's Blog:
Random Observations for Students of Economics, via
Daily Kos

"On this chart each block represents a year and each column represents a range of return on the S&P index. Over on the right side are those lucky years where the index has soared upward from 50-60%. In the middle are the more typical years, where the market has risen less than 10%. That little box on the far left? Yeah, that's this year. And hey, how many of you knew the S&P had been around since 1825?" - Devilstower of the Daily Kos
Lately I've been thinking about interactive information visualization and how it can support our understanding of the current economic crisis, inspired by what I learned in the InfoViz class I took with Dr. Robert Kosara last year. In a recent post on the Eager Eyes blog, Dr. Kosara floats the idea of establishing a "National Data Agency".

http://eagereyes.org/media/2009/nda.png

"What we need is a National Data Agency (NDA). This agency would be tasked with collecting data that all other agencies collect and produce, and making it available in a central place and in electronic, machine-readable form. There could and should be a reasonable data presentation on its website, perhaps even a National Data Dashboard (showing data of interest like debt, spending, jobless rate, etc.). But the bulk of data analysis would be left to third parties: analysts, journalists, citizens (and also aliens like me). Easily available data would make for more insightful reporting, more informed decisions, and endless business opportunities." -Robert Kosara

This makes sense.

There simply is too much data to absorb, explore, analyze, understand, and act upon. It is difficult to know if you have all of the data that you need, because some of it is difficult to access. It doesn't matter if you are a banker, a stock broker, a CEO, a CFO, a government leader, an economist, a shareholder, or a student. The current state of world economic affairs is the strongest evidence that our methods simply aren't working.
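Part of the appeal of Kosara's proposal is how little machinery a third party would need once the data is published in a plain, machine-readable form. A minimal sketch of consuming such a feed (the column names and figures below are illustrative samples of my own, not an actual agency feed):

```python
import csv
import io

# Hypothetical sample of the kind of machine-readable feed a "National
# Data Agency" might publish: one row per month, plain CSV, no scraping.
SAMPLE_FEED = """date,jobless_rate,federal_debt_billions
2008-11,6.8,10661
2008-12,7.3,10700
2009-01,7.6,10627
"""

def latest_indicator(feed_text, column):
    """Return (date, value) for the most recent row of the given column."""
    rows = list(csv.DictReader(io.StringIO(feed_text)))
    rows.sort(key=lambda r: r["date"])  # ISO-style dates sort correctly as text
    last = rows[-1]
    return last["date"], float(last[column])

date, rate = latest_indicator(SAMPLE_FEED, "jobless_rate")
print(f"Jobless rate as of {date}: {rate}%")  # prints: Jobless rate as of 2009-01: 7.6%
```

With the raw data this accessible, the "endless business opportunities" Kosara mentions - dashboards, mashups, journalism tools - become a matter of presentation rather than acquisition.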

The work of Hans Rosling provides a good example of how information visualization can help increase our understanding of large quantities of data over time. Hans Rosling is a Swedish professor of international health and one of the founders of Gapminder ("Unveiling the beauty of statistics for a fact-based world view").

The following video is Rosling's latest presentation, focused on debunking the myths regarding population growth:


What stops population growth? from Gapminder Foundation on Vimeo.

"Gapminder is a non-profit venture promoting sustainable global development and achievement of the United Nations Millennium Development Goals by increased use and understanding of statistics and other information about social, economic and environmental development at local, national and global levels. We are a modern “museum” that helps making the world understandable, using Internet."


The visual representation of economic data, if done well, packs a powerful punch. To me, images form a kernel in my memory related to the messages conveyed, and when recalled, also bring up a range of related conceptual details. It is sort of like what happens when I hear the first few notes of a tune from the past.

This doesn't seem to be the case for me when thinking about related text, or even thinking about "boring" charts and graphs.


The world needs effective and efficient data and information analysis and interactive visualization tools in order to solve problems that are on such a colossal scale.

The use of collaborative gesture and multi-touch display systems for data and information visualization is something that I believe will support better methods of decision-making in a variety of fields. Now is the time for the interactive information visualization community and related disciplines such as interactive multimedia and HCI to assist in this effort.

Here are some thoughts:


  • Those who are coding gesture-based or multi-touch programs need to understand what sort of content people will explore, and make sure that applications provide flexibility in use.
  • Human-computer interaction specialists will need to continue to study a range of interfaces and interactions in order to determine what supports human cognition of larger amounts of data and information.
  • Creators of interactive multimedia content, web developers, and others will need to re-examine their work and think about ways their content can support new ways of thinking and problem-solving within the context of "surface" computing.
  • Computer Supported Cooperative Work researchers will need to figure out what needs to be in place so that information can be effectively shared and analyzed between pairs or teams of people, and how this information can best be communicated to others within a business, agency, or organization, as well as the public.
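On the first point, even the lowest-level gesture plumbing benefits from being written in a content-agnostic way. As a toy illustration, here is a framework-free pinch-to-zoom scale computation; real touch frameworks deliver pointer events quite differently, and the function names here are my own:

```python
import math

def pinch_scale(prev_touches, curr_touches):
    """Scale factor implied by two touch points moving between frames.

    Each argument is a pair of (x, y) tuples. A factor > 1 means the
    fingers spread apart (zoom in); < 1 means they pinched (zoom out).
    """
    def spread(touches):
        (x1, y1), (x2, y2) = touches
        return math.hypot(x2 - x1, y2 - y1)

    before = spread(prev_touches)
    if before == 0:
        return 1.0  # degenerate frame: ignore rather than divide by zero
    return spread(curr_touches) / before

# Two fingers 100 px apart spreading to 150 px apart -> 1.5x zoom
factor = pinch_scale([(0, 0), (100, 0)], [(0, 0), (150, 0)])
print(factor)  # 1.5
```

Because the function knows nothing about what is being zoomed, the same logic serves a photo viewer, a financial chart, or a shared map on a tabletop display.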
One of the challenges facing this effort is that few people have an in-depth understanding of what it will take to make it happen. It will require an interdisciplinary effort, with a much higher level of communication and collaboration among people not accustomed to working within this context.

We will also need to take a "big picture" approach.


Because of the world's economic crisis, I think that interactive information/data visualization applications should target the needs of people who are working to understand the crisis and who have the power to do something constructive about it. This cannot happen if they rely on the models and data analysis techniques of our recent past.


At the same time, these tools should be available to the rest of us, via the Internet, so that we may do our part to move us forward.

Back Story:
I started keeping up with the economic crisis on a more serious level in October. I was becoming numb from information overload. My knowledge of the economic and financial fields was lacking, so I decided to create a blog entitled "Economic Sounds and Sights" as my personal on-line repository of searchable content.

The blog has lots of pictures, info-graphics, embedded video clips, and links to a wide range of web-based resources. In my quest for information, I came across interesting quotes, jokes about economists, and tales of greed and scandals. I even found one blogger who has responded to each unfolding event of our economic crisis by re-writing lyrics to popular tunes.

For an example of one of my posts, read
"Celestial Economic Sphere, Data Viz for the Finance Biz..." It is my hope that the content I've collected and shared on the blog will become part of an interactive information visualization/timeline designed to support two or more people on a large display or table.

11/4/09: Update: The economic crisis got a bit complicated, so I stopped posting. The blog still remains on-line.  Interactive Infoviz for the Health Care Biz will be the topic of an upcoming post.


RELATED

Three Mirrors of Interaction: A Holistic Approach to User Interfaces (Bill Buxton)
Andreas Nicolas Fischer (Berlin-based artist who works with data, sculpture, and code.)
Google Spreadsheets Data Visualization Gadgets
Google Motion Chart (like Gapminder)
Panopticon
Death and Taxes (Wallstats.Com: The Art of Information)
2009 Index of Economic Freedom (Wall Street Journal and the Heritage Foundation)
Visual Business Intelligence Stephen Few's Blog
Sunlight Foundation
Transparency Timeline - A History of Congressional Public Access Reform
"The Sunlight Foundation is committed to helping citizens, bloggers and journalists be their own best congressional watchdogs, by improving access to existing information and digitizing new information, and by creating new tools and Web sites to enable all of us to collaborate in fostering greater transparency."

MapLight.org "Money and Politics: Illuminating the Connection"

Free Our Data Blog (Guardian Technology campaign for free public access to data about the UK and its citizens)
2009 Death and Taxes Interactive Graphic (Click to explore.)



Via Stephen Few: Example of Horizon Graphs, developed by Panopticon. (Year's worth of prices of 50 stocks in 2005 and comparisons between them, click to enlarge)
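The horizon graph idea behind that example is simple enough to sketch: slice each value's magnitude into fixed-height bands, stack the bands at a shared baseline (darker for higher bands), and mirror negative values in a second hue, so fifty stock series fit in the space of a few ordinary line charts. A minimal band-decomposition sketch (the naming is my own, not Panopticon's implementation):

```python
def horizon_bands(values, band_height, n_bands=3):
    """Decompose a series into horizon-graph layers.

    Returns, for each data point, a (sign, fills) pair where fills[b]
    in [0, 1] is how much of band b the value's magnitude covers.
    A renderer draws the fills stacked at one baseline, with negative
    values (sign == -1) mirrored upward in a contrasting hue.
    """
    result = []
    for v in values:
        sign = 1 if v >= 0 else -1
        mag = abs(v)
        fills = []
        for b in range(n_bands):
            covered = min(max(mag - b * band_height, 0.0), band_height)
            fills.append(covered / band_height)
        result.append((sign, fills))
    return result

# A +2.5% move with 1%-tall bands fills bands 0 and 1 fully, band 2 halfway.
print(horizon_bands([2.5, -0.5], band_height=1.0))
```

The compression comes from overplotting: every band occupies the same vertical strip, so height that would be wasted on rare extremes is reclaimed for the typical range.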

Mark Lombardi
Take the time to listen to NPR's Lynn Neary's interview with Robert Hobbs, curator of an exhibit of the late Lombardi's "conspiracy" art/visualizations linking global finance and international terrorism. Lombardi's background as an archivist and reference librarian served him well in his art depicting interesting large-scale networks. Although his art was not interactive, his techniques have inspired the development of computer-based interactive information visualizations.

FYI:
To satisfy my curiosity about Mark Lombardi, I followed a link to "Obsessive-Generous": Toward a Diagram of Mark Lombardi, by Frances Richard, posted in the 2001-02 section of the WBURG website.

The examples below are of Lombardi's work connecting the relationships between George W. Bush, Harken Energy, and Jackson Stephens:



George W. Bush, Harken Energy and Jackson Stephens
c. 1979-90, 5th Version
1999

Enlarged Version


Close-up of network detail



Close up depicting a profit made by Bush, 2 weeks before Saddam Hussein invaded Kuwait
via Frances Richard

"...though he possessed the instincts of a private eye and the acumen of a systems-analyst, Lombardi was of course an artist, and from the raw material of wire-service reports and books by political correspondents, he drew not only chronicles of covert, high-stakes trade, but technically pristine and sensually compelling visual forms" - Frances Richard


Update:
Lombardi's Narrative Structures and Other Mappings of Power Relations
Fosco Lucarelli, SOCKS, 8/22/13

Learning from Lombardi
Ben Fry, 9/2009