
Jan 10, 2013

Gesture Markup Language (GML) for Natural User Interaction and Interfaces

Quick post:
"GML is an extensible markup language used to define gestures that describe interactive object behavior and the relationships between objects in an application.  Gesture Markup Language has been designed to enhance the development of multiuser multi-touch and other HCI device driven applications." -Gesture ML Wiki
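Since GML is XML-based, gesture definitions are just markup you can inspect with ordinary tools. As a rough, hand-made sketch (the element and attribute names below are simplified stand-ins, not the exact GestureML schema), here is what reading a gesture definition might look like in Python:

```python
# Parse a simplified, GML-style gesture definition.
# NOTE: the element and attribute names below are illustrative only,
# not the exact GestureML schema documented on the GestureML wiki.
import xml.etree.ElementTree as ET

GML_SNIPPET = """
<GestureMarkupLanguage>
  <Gesture id="two-finger-rotate" type="rotate">
    <match>
      <action>
        <initial>
          <cluster point_number_min="2" point_number_max="2"/>
        </initial>
      </action>
    </match>
  </Gesture>
</GestureMarkupLanguage>
"""

def list_gestures(xml_text):
    """Return (id, type) pairs for each <Gesture> element."""
    root = ET.fromstring(xml_text)
    return [(g.get("id"), g.get("type")) for g in root.iter("Gesture")]

print(list_gestures(GML_SNIPPET))  # [('two-finger-rotate', 'rotate')]
```

The appeal of an approach like this is that gestures become data rather than code: an application can swap or tune its gesture vocabulary without recompiling.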

GestureML was created and maintained by Ideum. 

More information to come!
The Pano

Photo credit: Ideum

RELATED
Ideum Blog

OpenExhibits: Free multitouch and multiuser software initiative for museums, education, nonprofits, and students

GestureWorks: Multi-touch authoring for Windows 8 & Windows 7



Dec 12, 2010

LM3LAB's Useful Map of Interactive Gesture-Based Technologies: Tracking fingers, bodies, faces, images, movement, motion, gestures - and more

Nicolas Loeillot, of LM3LABS, has been ahead of the natural user interaction/interface game for many years as his company has expanded. He's done quite a bit of deep thinking about the work of his company, and has used this wisdom to create a nice concept map that describes how LM3LABS' solutions fit into the world of gesture-based control and interaction:




In my opinion, this chart would make a great template for mapping out other natural interaction applications and products!


Here is the description of the concepts outlined in the chart:


"If all of them belong to the “gesture control” world, the best segmentation is made from 4 categories:
  • Finger tracking: precise finger tracking, it can be single touch or multi-touch (this latest not always being a plus). Finger tracking also encompasses hand tracking which comes, for LM3LABS products, as a gestures.
  • Body tracking: using one’s body as a pointing device. Body tracking can be associated to “passive” interactivity (users are engaged without their decision to be) or “active” interactivity like 3D Feel where “players” use their body to interact with content.
  • Face tracking: using user face as a pointing device. It can be mono user or multiple users. Face tracking is a “passive” interactivity tool for engaging user in an interactive relationship with digital content.
  • Image Tracking: Augmented Reality (AR) lets users use images (flyers, real products, t-shirts, faces,…) to interact with digital content. AR can be markerless or marker-based. Markerless technology has advantages but marker-based AR is easier for users to understand. (Please note here that Markerless AR is made in close collaboration with AR leader Total Immersion)."  -LM3LABS
If you are interested in this subject and want to view some good examples of off-the-desktop interfaces and interactions, take a look at the LM3LABS blog, as well as Nicolas Loeillot's Vimeo channel.  Also take a look at the sample of posts I've written about LM3LABS over the last few years - the links are at the end of this post.

I love LM3LABS' Interactive Balloon:

Interactive balloons from Nicolas Loeillot on Vimeo.


Interactive Balloons v lm3 labs v2 (SlideShare)



Background
I first discovered LM3LABS when I was taking a VR class and researching interactive, immersive large displays in 2005 or 2006.  Back then, there wasn't much information about this sort of technology.  A lot has changed since then!


I've learned quite a bit from watching LM3LABS (and others) grow, given my passion for postWIMP interactive technology and my commitment to blogging about this subject.   Nicolas has really worked hard in this arena.  As early as 2005, LM3LABS was working with Scala to provide "smart" interactive displays, and his company's applications have been supported by computer vision technologies for many years, allowing for gesture-based, or "touch-less" interaction, as demonstrated by the Catchyoo Interactive Table.  This application caught my eye back in early 2007, when I was working on projects for large interactive displays for my HCI and Ubicomp classes, and was thinking about creating a table-top application.


My hunch is that LM3LABS has set the foundation for further growth in the future, given the lessons they've learned by taking risks with postWIMP technologies over the past few years!


Previous Blog Posts Related to LM3LABS:
Interactive Retail Book: Celebrating the History of Christian Dior, 1948-2010 (video)
Ubiq Motion Sensor Display at Future Ready Singapore (video)
Interactive Virtual DJ on a Transparent Pane, by LM3LABS and Brief Ad
LM3LABS' Catchyoo Interactive Koi Pond: Release of ubiq'window 2.6 Development Kit and Reader
A Few Things from LM3LABS
LM3LABS, Nicolas Loeillot, and Multi-touch
More from LM3LABS: Ubiq'window and Reactor.cmc's touch screen shopping catalog, Audi's touch-less showroom screen, and the DNP Museum Lab.


About LM3LABS
"Founded in 2003 by a team of passionate researchers, engineers, designers, and marketers from various international backgrounds, focused on fast transformation of innovation into unique products, LM3LABS is a recognized pioneer in computer vision-based interactivity solutions. Keeping a strong customer focus, LM3LABS' team of unique people pioneers new directions, explores new concepts, new technologies and new interactions.  Engaging, playful and magic, LM3LABS' products and solutions are always scalable and reliable"

info@lm3labs.com

Note to readers:
Over the past couple of years there has been an explosion of postWIMP technologies and applications, and with this pace, it has been difficult for me to keep abreast of it all. There is quite a bit I miss, given my full time job and daily life!

I welcome information about postWIMP interactive technologies and applications from my readers.  Due to time constraints, not interest, I am not always able to post about a topic as soon as I'd like.  That is OK, as my intention is not to be the first blogger to spread the latest tech news.  I like to dig in deep when I can and make connections between innovative, interesting technologies and the people and ideas behind them. 




May 29, 2010

Preview: Update on Touch & Multitouch Technologies, Websites, and Touch-Interactive Multimedia Apps

It is about time for an update about touch- and gesture-interactive technologies.

I've been researching the latest in "touch" screens and new developments in interactive multi-media content.  In just one year, a multitude of websites have been transformed from static to interactive. 

Although the initial objective for some of these websites was to optimize the interface and navigation for people accessing websites via touch-screen cell phones,  some are ideal for use on touch-enabled slates, the iPad, and even larger touch screen displays and surfaces.   

Convergence seems to be the buzz word of the day.   Interactive TV.  Game sets with Internet access.  Movies on your cell phone.  Touch screen Coke machines displaying movie trailers.  What's happening now, and what is next?

I welcome input from my readers in the form of links to websites, university labs with grad students and professors who are obsessed with emerging interactive technologies, proof-of-concept video clips, video clips of related technologies that are new-to-market, etc.   

I will add video clips to the following playlist:


FYI: I'm also in the middle of writing a series of posts about 3D television technologies for the Innovative Interactivity blog, and welcome input from my readers about this topic.


RELATED (Previous posts)
(the above post includes links to various multi-touch developer kits and resources)




Jan 23, 2010

More interactivity: Interactive Walls, Interactive Projection Systems, GestureTek's Motion-based Game

I recently discovered that Accenture's website has a few interactive web pages that provide information about the company's interactive wall technology.  What I liked about the site is that I could interact with it by touching the screen of my HP TouchSmart PC, and it worked!  (I'm always on the look-out for interactive websites that are good for touch-screen interaction.)


Below are screen shots of the on-line semi-functional demo of Accenture's Strategic Decision Interface:



(The website worked through touch-interaction via my HP TouchSmart PC!)







For more information:  Interactive Wall Technology:  Seeing the Big Picture


Newfangled Projector Systems:
New Projectors Make Any Wall an Interactive Whiteboard: Epson, Boxlight unveil potentially game-changing technology
-Meris Stansbury, eSchool News 1/13/10



"In a move that could shake up the interactive whiteboard (IWB) market, two projector manufacturers have just released new products that can turn virtually any surface into an IWB...The development means schools no longer have to buy separate hardware to enjoy the benefits of IWBs, whose interactive surface and ability to engage students have made them quite popular in classrooms."
The article highlights Epson's BrightLink 450i ultra short-throw projector, which eliminates most shadows and can project images anywhere from 59 to 96 inches diagonally at WXGA resolution.  The system requires an infrared pen. 


Another system is the ProjectoWrite2/W from Boxlight, which is a short-throw LCD projector with XGA resolution that can project up to 80 inches diagonally.


GestureTek


I've written a few blog posts in the past about GestureTek.  I wonder if their technology would work with the projection systems mentioned in the eSchool News article. Below are a few examples of what GestureTek's been doing lately:


GestureTek's Video Game Wall at the Child's Play Activity Center (Las Vegas)


The above pictures of the Child's Play Activity Center show how GestureTek's WallFX interactive display system can be used to create a fun environment for children. The system includes a ceiling projector and a camera that can capture full-body motion.  The system provides 25 games and special effects.  Wouldn't this concept be great for interactive and fun educational games?


For details about this system:
GestureTek's video game wall shows where gesture-based games can go
-Dean Takahashi, GamesBeat, 8/25/09


GestureTek's Immersive Multi-platform Game:  Head-butting Interactive Soccer

"Video gesture control pioneer GestureTek., unveiled its new Momo™ Software Development Kit for game developers and original equipment manufacturers at the 2010 Consumer Electronics Show. Gesture recognition software tracks motion and objects such as faces and hands and brings immersive, gesture-based interactivity to multiple platforms, such as PCs, laptops, mobile phones, toys and other devices. The video is a demonstration of a head-butting soccer game."
GestureTek Interactive City Flight Simulator Game

Jun 6, 2009

Information about Touch Screens, Multi-Touch, & Gesture Interaction is Spreading

Since the news about Windows 7 multi-touch capabilities has spread, I haven't had enough time to keep up with all of the information related to multi-touch interaction. Fortunately, there are a few bloggers out there who are doing a great job filling in the gaps.

The Touch User Interface blog has a wealth of information in the form of pictures, video clips, slides, and links that I'd like to share.

The following slideshow/videos were highlighted in the Touch User Interface blog post, "Touch UI: HCI Viewpoint":

Untold Stories of Touch, Gesture, & NUI

Joe Fletcher, Design Manager, Microsoft Surface

Touch and Gesture Computing, What You Haven't Heard
Dan Saffer



Other posts of interest on the Touch User Interface blog:
Touch screens and vision impairment
Link: Designing the Palm Pre: An Interview with Michelle Koh

Touch User Interface Overview

I've updated some additional information about UX, interactive multimedia, multi-touch, and gesture interaction on my Multimedia and Interaction Resources page, which is a work in progress.


Apr 8, 2009

Joel Eden's Informative Post: Designing for Multi-Touch, Multi-User and Gesture-Based Systems

Joel Eden is a User Experience Consultant at Infragistics. He recently wrote a detailed article in the Architecture & Design section of Dr. Dobb's Portal, "Designing for Multi-Touch, Multi-User and Gesture Based Systems". I thought I'd share the link, since I've been writing on the same topic.

In his article, Joel explains the differences between traditional WIMP (Window, Icon, Menu, Pointer) interaction and gesture, multi-touch, and multi-user systems. These systems are also known as Natural User Interfaces, or NUI. He recommends that "rather than trying to come up with new complicated ways to interact with digital objects, your first goal should be to try to leverage how people already interact with objects and each other when designing gesture based systems."

Joel goes on to outline UX (User Experience), IxD (Interaction Design), and HCI (Human-Computer Interaction) concepts that designers should consider when developing new systems: Affordances, Engagement, Feedback, and "Don't Make Us Think," which he summarizes in the conclusion of his article.

I especially liked Joel's references:

Clark, Andy. Supersizing the Mind: Embodiment, Action, and Cognitive Extension

Few, Stephen. Information Dashboard Design: The Effective Visual Communication of Data

Gibson, James J. The Ecological Approach to Visual Perception

Krug, Steve. Don't Make Me Think: A Common Sense Approach to Web Usability, Second Edition

Norman, Don. The Design of Everyday Things

Norman, Don. Things That Make Us Smart: Defending Human Attributes In The Age Of The Machine

I would also add the following references:
Bill Buxton
Multi-touch Systems I have Known and Loved
(Regularly updated!)
Sketching User Experiences: Getting the Design Right and the Right Design

"Our lack of attention to place, time, function, and human considerations means these fancy new technologies fail to deliver their real potential to real people." - Bill Buxton

Dan Saffer
Designing for Interaction: Creating Smart Applications and Clever Devices
Designing Gestural Interfaces

SAP
Touchscreen Usability in Short
(Summary by Gerd Waloszek of the SAP Design Guild)
SAP Design Guild Resources (User-Centered Design, User Experience, Usability, UI Guidelines, Visual Design, Accessibility)
Kevin Arthur (Synaptics)
Touch Usability
Bruce "Tog" Tognazzini
Ask Tog: Interaction Design Solutions for the Real World
Inclusive Design, Part I
First Principles of Interaction Design
John M. Carroll
Human Computer Interaction (HCI) (History of HCI)
Bill Moggridge
Designing Interactions
Ben Shneiderman
Leonardo's Laptop: Human Needs and the New Computing Technologies
Edward Tufte

Visual Explanations
Beautiful Evidence
The Visual Display of Quantitative Information
Envisioning Information
Rudolf Arnheim (Gestalt)
Art and Visual Perception: A Psychology of the Creative Eye

Update: A great reading list on general HCI. Some of the authors were involved in the early days of touch, bi-manual, and multi-touch interaction.

Jan's Top Ten List of Books on Human-Computer Interaction


FYI: If you know much about Windows Presentation Foundation, you probably know that Josh Smith, WPF guru, also works at Infragistics.


Jan 11, 2009

Usability, Accessibility, and User Experience in a Win7 Environment

Microsoft's newest operating system, Win7 (beta), allows for multi-touch interaction and application development. From what I can tell from the news from the Consumer Electronic Show (CES 2009), touch and gesture interaction holds quite a bit of interest among consumers and application developers alike.

Hopefully people will think carefully about user experience and usability factors when developing the applications we'll be using in the not too distant future!

Because of my background as a school psychologist, I'm interested in how touch/gesture applications can address accessibility and universal usability issues.

I wasn't able to locate user experience or usability information for Win7. Here are a few links from the Microsoft website that I think are worth reading:

Touch Interaction Guidelines

How to Design a Great User Experience

Designing with Windows Presentation Foundation

Accessibility

According to the research from Microsoft, ”more than half of all computer users experience difficulties or impairments related to accessibility, and are likely to benefit from the use of accessible technology. Moreover, approaching software design with the flexibility and inclusiveness that are the hallmarks of accessibility often results in overall improved usability and customer satisfaction.”

Regarding touch/gesture interaction, know that “19% (24.4 million) of computer users have a mild dexterity difficulty or impairment, and 5% (6.8 million) of computer users have a severe dexterity difficulty or impairment.”

Christian Moore, from the NUI-Group, shared this nice graphic that depicts the various fields that intersect or converge with interface design, which I think is useful when thinking about how we will design, develop, and use emerging technologies:



Concepts such as accessibility and universal usability can fit into this design!

For more information regarding multi-touch, see my previous post, "For Techies and the Tech-Curious: Multi-touch/Gesture from the NUI Group"

Nov 26, 2008

For the Tech Curious: Multi-Gesture Net: A Multi-touch and Multi-gesture Research Blog

Laurence Muller, M.Sc., is a scientific programmer at the Universiteit van Amsterdam who develops scientific software for multi-touch devices. His blog, Multigesture.Net, provides good information regarding multi-touch and gesture interaction hardware and software applications.


Laurence links to the DIY tabletop computing bootcamp that was held at IEEE Tabletops and Interactive Surfaces 2008. From there, you can find a linked list of the organizers of the events, and additional information.

The picture below is from the MTC Multi-touch Console:

Here is a link to the group's libavg wiki that includes open-source code and "how-to" instructions.

If you are interested in multi-touch and multi-gesture computing from an academic point of view, Florian Echtler, of the Technische Universität München, has a series of publications listed on his website. Here is the abstract of one of his papers. He is on the right track. I especially like the fact that he's thought about widget layers. (I have, too, but they are only sketches in my idea book.)

TISCH: Tangible Interactive Surfaces for Collaboration between Humans (SourceForge website, with links to libTISCH).

F. Echtler, G. Klinker
A Multitouch Software Architecture
NordiCHI 2008: Using Bridges, 18-22 October, Lund, Sweden. (bib)

"In recent years, a large amount of software for multitouch interfaces with various degrees of similarity has been written. In order to improve interoperability, we aim to identify the common traits of these systems and present a layered software architecture which abstracts these similarities by defining common interfaces between successive layers. This provides developers with a unified view of the various types of multitouch hardware. Moreover, the layered architecture allows easy integration of existing software, as several alternative implementations for each layer can co-exist. Finally, we present our implementation of this architecture, consisting of hardware abstraction, calibration, event interpretation and widget layers."
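To make the layering concrete, here's a toy Python sketch of the pipeline the abstract describes (hardware abstraction, calibration, event interpretation, widgets). The class and method names are my own invention for illustration, not libTISCH's actual interfaces:

```python
# Toy sketch of a layered multitouch architecture in the spirit of
# Echtler & Klinker's paper. Names are illustrative, not from libTISCH.
from dataclasses import dataclass

@dataclass
class RawTouch:
    """Hardware abstraction layer output: normalized 0..1 sensor coords."""
    x: float
    y: float

class Calibration:
    """Maps sensor space to screen space."""
    def __init__(self, width, height):
        self.width, self.height = width, height
    def to_screen(self, t: RawTouch):
        return (t.x * self.width, t.y * self.height)

class Interpreter:
    """Turns positioned touches into higher-level gesture events."""
    def interpret(self, points):
        if len(points) == 1:
            return ("tap", points[0])
        return ("multi", points)

class Widget:
    """Widget layer: consumes interpreted events."""
    def __init__(self):
        self.last_event = None
    def on_event(self, event):
        self.last_event = event

# Wire the layers together; each layer only knows the one below it,
# so alternative implementations of any layer can be swapped in.
cal = Calibration(1920, 1080)
interp = Interpreter()
widget = Widget()
raw = [RawTouch(0.5, 0.5)]
widget.on_event(interp.interpret([cal.to_screen(t) for t in raw]))
print(widget.last_event)  # ('tap', (960.0, 540.0))
```

The point of the layered design, as the abstract says, is that each layer presents a common interface, so different cameras, calibration schemes, or gesture recognizers can coexist behind it.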

Feb 15, 2013

Designing for Touch & Gesture: Tips for Apps and the Web (Updated)

In the past, our fingers did the walking, sifting through files, papers, pamphlets, and phonebooks, and then pointing and clicking with a mouse to interact with images and text, in essence, electronic imitations of the paper-based world. Traditional forms, brochures, ad inserts, and posters informed much of the design. 

How much have things changed?   It is 2013, but you'd think it was 1997 from the PowerPoint look and feel of many apps and web sites!   Touch is everywhere, but from what I can tell, not enough designers and developers have stepped up to the plate to think more deeply about ways their applications can support human endeavors through touch and gesture interactions.  

For an overview of this topic, take a look at my 2011 post, written after a number of ugly encounters with user-unfriendly applications:  Why bother switching from GUI to NUI?  

For an in-depth look into the history of multi-touch, the wisdom of Bill Buxton is well worth absorbing.  He's worked with all sorts of interfaces, and has been curating the history of multi-touch and gesture systems since 2007:


Multi-Touch Systems that I have Known and Loved
Bill Buxton, Microsoft Research, Updated 8/30/12



Even if you are not a designer or developer, I encourage you to explore some of the links below:

Touch Gestures for Application Design
Luke Wroblewski, 10/9/12

Common Misconceptions About Touch
Steven Hoober, 3/18/13

Designing With Tablets in Mind:  Six Tips to Remember
Connor Turnbull, Webdesign tuts+, 9/27/11

Finger-Friendly Design: Ideal Mobile Touchscreen Target Sizes
Anthony T, Smashing Magazine, 2/21/12

Best Practices: Designing Touch Tablet Experiences for Preschoolers (pdf)
Sesame Street Workshop


Are Touch Screens Accessible?
AccessIT, National Center on Accessible Information Technology in Education

iOS Human Interface Guidelines
Apple

Android User Interface Guidelines
Using Touch Gestures
Handling Multi-Touch Gestures
Android

Designing for Tablets?  We're Here to Help!
Roman Nurik, Android Developers Blog 11/26/12

Touch interaction design (Windows Store apps)
Microsoft - MSDN



Dec 15, 2011

Christian Bannister's Interactive Multi-touch and Gesture-based Subcycle Project. Use your hands to shape sound, create, and manipulate music. Wow!!

I came across a link to Christian Bannister's Subcycle Labs website when I was taking a look at the Creative Applications Network website.


If you have an interest in music technology and innovative gesture/multitouch applications, you'll appreciate the details that are shared on the Subcycle website.  In the meantime, take the time to watch a few of Christian's videos.  Enjoy!


Blip Shaper Walkthrough

Blip Shaper Walkthrough from christian bannister on Vimeo.
"a) creating percussive patterns with monome b) shaping the individual sounds that make up the patterns with multitouch gestures c) recording touchscreen gestures as automation d) storing, duplicating and navigation patterns e) recording the resulting audio to a dynamic buffer f) manipulating the buffer with a multitouch cut-up approach g) visualizing everything with dual screens"
Subcycle Walkthrough

Subcycle Blip Shaper from christian bannister on Vimeo.


The following information describing the Blip Shaper is from Christian's Subcycle website/blog:
"For the drum sounds I have Drumaxx running for synthesized sounds and Battery running for sampled sounds.  These are running in parallel so for each voice there is a separate patch running in each VST.  The Parameters are modified with the touchscreen independently but in all cases a single touch gesture on the X-Axis will cross fade between the sampled version of the sound and the synthesized version of the sound.  I love this because I have never seen this before and I can never decide which technique I like better.  The synthesized drums are more malleable and have more interesting parameters to play with but the sampled sounds seem more substantial.  I will post a detailed list of parameters and gestures in the future."   
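The crossfade Christian describes is a classic mapping from touch position to mixer gains. Here's a small illustrative Python version; the equal-power curve is my own choice of blend (it avoids a volume dip at the midpoint), since Christian hasn't published his exact mapping:

```python
# Sketch of an X-axis crossfade like the one Christian describes:
# a normalized touch position (0.0 = far left, 1.0 = far right) blends
# a sampled drum hit with its synthesized counterpart.
# The equal-power curve below is an assumption, not Subcycle's mapping.
import math

def crossfade_gains(touch_x: float):
    """Return (sampled_gain, synth_gain) for a normalized touch position."""
    touch_x = min(max(touch_x, 0.0), 1.0)   # clamp to the screen
    angle = touch_x * math.pi / 2
    # Equal-power: gains trace a quarter circle, so the summed power
    # (sampled_gain**2 + synth_gain**2) stays constant at 1.0.
    return (math.cos(angle), math.sin(angle))

# Far left is all sampled; far right is all synthesized.
print(crossfade_gains(0.0))   # (1.0, 0.0)
```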


Christian currently uses Max for Live (Ableton), and codes with Processing/Java, using Eclipse for organization.

Autopilot - Subcycle
autopilot - subcycle labs from christian bannister on Vimeo.
"sound visualization, multi-touch interface, break beat performance engine, autopilot, spacialized sound, dsp, max/MSP, fm synthesis, sonic navigation, sound storm visualization, time machine, granular, interactive sound sculpture, joystick array, more at subcycle.org"


SOMEWHAT RELATED 
Community Core Vision
NUI Group
BTW, I'm wondering if Christian or other NUI Group members would like to do something with some of my ideas that incorporate interactive music. To learn more, see my post, "It must be Voronoi: Looking for ideas for my music+art+dance+tech dream...."
...


....

Apr 19, 2011

Gesture + Multitouch Interactive Media Wall, Wisconsin Institutes for Discovery, 2011 Award of Excellence, Digital Screenmedia Association

Below is a video of an interactive media wall at the Wisconsin Institutes for Discovery, the winner of the 2011 Award of Excellence from the Digital Screenmedia Association in the category of Best Government/Education/Non-Profit Agency Deployment.  


Detailed information about the Discovery Wall, including an overview of the technology, objectives of the deployment, and the positive outcomes of this project, can be found on the Digital Screenmedia website.   

"Float4 Interactive is a creative technology company that develops interactive systems for entertainment, advertising and design applications."


Credits
Interactive technology: Float4 Interactive
Content: Zebra Dog
AV Integration: R2W
Concept: Sensory Interactive


DISCOVERY WALL: 10 X 4 Christie MicroTiles
5496 X 1650 resolution
Gesture + MultiTouch Interaction
2 X RealMotion Servers


ENTRANCE PORTAL 17 X 2 Christie MicroTiles
9824 X 866 resolution
Multi-Zone Gesture Interaction
2 X RealMotion Servers

RELATED/SOMEWHAT RELATED
Wisconsin Institutes for Discovery: 2011 Excellence Award, Architainment 
Live Design, 3/21/10
Architainment Examples
Christie MicroTiles Drive Fan Experience at new Nascar Hall of Fame
Nascar Hall of Fame (I live in the Charlotte area, so I'll visit the Nascar Hall of Fame soon and check out the Christie MicroTiles display!)
Miami Dolphins Choose Montreal's Arsenal Media and Float4 Interactive for Fan-Focused Interactive Wall at SunLife Stadium (BuzzWall) Arsenal Media, 12/16/10

Jan 21, 2010

Ideum's GestureWorks vs Adobe AIR 2 and Flash Player 10.1: A comparison of multitouch and gesture support

Jim Spadaccini, of Ideum, shared information about his company's product, GestureWorks, highlighting how it provides better multi-touch and gesture support than Adobe AIR 2 and Flash Player 10.1. GestureWorks supports multiple-point drag, rotate, and scale at the same time. In the video, the application is demonstrated on an HP TouchSmart 600 and a 3M multitouch screen.
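For the curious, here's a generic sketch of how simultaneous drag, rotate, and scale can be derived from just two touch points. This is a standard geometric construction for illustration, not GestureWorks' actual API:

```python
# Derive simultaneous drag, rotate, and scale from two touch points
# moving from (p1, p2) to (q1, q2). Generic illustration only; this is
# not GestureWorks' implementation or API.
import math

def two_touch_transform(p1, p2, q1, q2):
    """Return (drag_dx, drag_dy, rotation_radians, scale_factor)."""
    # Drag: motion of the midpoint between the two touches.
    mid_before = ((p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2)
    mid_after = ((q1[0] + q2[0]) / 2, (q1[1] + q2[1]) / 2)
    drag = (mid_after[0] - mid_before[0], mid_after[1] - mid_before[1])
    # Rotate: change in the angle of the line joining the touches.
    angle_before = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    angle_after = math.atan2(q2[1] - q1[1], q2[0] - q1[0])
    # Scale: ratio of the distances between the touches.
    scale = math.dist(q1, q2) / math.dist(p1, p2)
    return (drag[0], drag[1], angle_after - angle_before, scale)

# Two fingers spread apart horizontally to twice their separation:
print(two_touch_transform((0, 0), (2, 0), (-1, 0), (3, 0)))
# (0.0, 0.0, 0.0, 2.0)
```

Because all three quantities come out of one pair of touch updates, a framework can apply them to an object in a single step, which is what makes the combined manipulation feel fluid.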

Adobe AIR 2 and Flash Player 10.1  vs Gestureworks 1.0: A direct comparison of multitouch and gesture support


"A direct comparison between the built-in support for multitouch found in Adobe Flash Player 10.1 beta / Adobe AIR 2 and that of the Gestureworks multitouch framework for Flash. More about this comparison can be found on the Gestureworks website (http://www.gestureworks.com) and the Ideum website (http://www.ideum.com)  There is a blog post with more about this comparison and links to all of the example files at: www.ideum.com/2010/01/true-multitouch-with-adobe-flash/ "

True Multitouch with Adobe Flash - Jim Spadaccini


GestureWorks Supported Gestures

Example of Ideum's GestureWorks multi-touch, multi-user design for an exhibit at the Vancouver Aquarium:

Jan 1, 2010

Apple iSlate, iTablet, MacBook Touch: Will it support gesture interaction & haptic feedback?

Soldier Knows Best produces great tech-oriented videos. Here's his spin on all of the rumors about the possibility of the Apple iSlate.


I just inherited a 10 month-old MacBook, installed Snow Leopard and upgraded to iLife 2009. I'm so used to touching the screen on my HP TouchSmart PC that I found myself touching my MacBook screen from time to time, especially when I was editing video clips in iMovie. I think the latest version of iMovie was designed with touch/gesture interaction in mind!

From what I can tell, Snow Leopard and iLife 2009 will be able to support a range of touch interactions, if not gesture input as well.

Here are some rumors that have been conjured up and distributed on the web:

The Exhaustive Guide to Apple Tablet Rumors (Matt Buchanan, Gizmodo, 12/26/09)
Apple Expects to Sell 10 Million Tablets in First Year (Pete Cashmore, Mashable, 1/1/10)
iGuide Emerges as Another Potential Apple Tablet Name (Adam Ostrow, Mashable, 12/29/09)
The Tablet (John Gruber, Daring Fireball, 12/31/09)
"And so in answer to my central question, regarding why buy The Tablet if you already have an iPhone and a MacBook, my best guess is that ultimately, The Tablet is something you’ll buy instead of a MacBook."
Apple Owns iSlate.com Domain: The Mystery Deepens (Dan Nosowitz, Gizmodo, 12/25/09)
What is the Ultimate Role of the Apple Tablet? (Arnold Kim, MacRumors, 12/31/09)
iPad, iTablet, iSlate, or MacTab  (Cruz Miranda, 8/31/09)

Why am I excited about this?

I want to see if the iSlate would be good for collaborative educational games, assistive technology, augmentative communication, and alternative assessment for students who have multiple/severe disabilities.

That is a huge goal, so I'm going to start simple.  I am not giving up on Windows 7 multi-touch programming. I just have an urge to find out for myself what works, what doesn't, and what platform works best for specific "personas" and "scenarios".

I plan to make a little app for the iPhone/iPod Touch, based on "Shoes Your Battles," a game I made several years ago for a game class. I think I'd like to make this game for the Apple iTablet!

The first version of Shoes Your Battles was created with Game Maker, and the second version was in Flash, back in the days of ActionScript 2.0.  I started on a third version, one that could be used as an advergame for people to play while shopping for shoes during shoe sales, but it never got past the planning stage.  

The idea for the third version came to me when my elderly aunt came to visit from out-of-town and just had to go shoe shopping on the day after Thanksgiving.  It was extremely difficult to figure out what was on sale, what things actually cost after the previous mark-downs, and which sale items had prices that were not yet marked down.  

Adding to the confusion was the fact that there were few salespeople and herds of women.   It was madness.  There were pairs of shoes in the wrong boxes, and boxes of shoes with no way to quickly find out the true prices!   We were in the shoe department for hours, and it wasn't as fun as you'd think. If you've been in a crowded women's shoe department trying to buy that special pair of shoes during a fantastic shoe sale, you'll know what I mean.

At any rate, I wanted my little "Shoes Your Battles" game to help with this dreadful scenario, by somehow incorporating a shoe shopping advisor and a means to figure out the REAL sales prices of those awesome, to-die-for shoes. Unfortunately, the technology wasn't where it needed to be at the time - I am always dreaming up things that are too d--- futuristic!

4 years later, we have iPhones and SmartPhones and 3G internet and RFID and ubiquitous WiFi and the Wii and more women who like to play games and...and... The time is ripe.

Apple better come up with the iSlate!

SOMEWHAT RELATED


Thinking about post-WIMP HCI
It is always important to re-visit wisdom from the past when thinking about new interfaces and means of technology-supported human interaction.  Here are a few resources from the field of Human-Computer Interaction found on the HCI Vistas website:
The Prism of User Experience  -A nice graphic metaphor to help the conceptualization process. (Dinesh Katre, 2007)
Journal of HCI Vistas: Multi-disciplinary Perspective of Usability and HCI
Personas as part of a user-centered innovation process (Lene Nielsen, 1/08, HCI Vistas Vol-IV)
10 Steps to Personas (Lene Nielsen, 7/07, HCI Vistas Vol-III)

Oct 25, 2009

Interactive multi-touch for sound design, dj-ing, and music creation

NUI Group member Christian Bannister, the musician, designer, and developer behind Subcycle Labs, has been experimenting with music and multi-touch technology and interactions. Here's a demonstration of what he's come up with so far:


multi-touch the storm - interactive sound visuals - subcycle labs from christian bannister on Vimeo.
"The big picture goal of this project is to bridge the gap between sound visualization and musical instrument. With multi-touch interaction it is possible to manipulate multiple characteristics of a sound—directly and visually. Right now a lot of electronic music involves staring at the back of the performers laptop. This is a shame because in many cases a lot of really interesting things are happening on the computer that the audience is completely unaware of. This project hopes to create a common visual language and experience for the electronic musician and the audience by enhancing the perception of sound and music on both sides...These sketches are built with Processing and Max/MSP networked with OSC on a single computer..." -Christian Bannister
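Bannister mentions that his sketches network Processing and Max/MSP with OSC (Open Sound Control). OSC is a simple binary protocol, usually carried over UDP, in which every message is a 4-byte-aligned sequence of an address pattern, a type-tag string, and the arguments. As a rough illustration of what travels between the sketches and Max, here is a minimal OSC 1.0 message encoder in plain Python; the address `/touch/1/xy` and port 7400 are hypothetical examples of my own, not details from the Subcycle project:

```python
import socket
import struct

def osc_string(s: str) -> bytes:
    """Encode an OSC string: ASCII bytes, NUL-terminated, padded to 4 bytes."""
    data = s.encode("ascii") + b"\x00"
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address: str, *args) -> bytes:
    """Build an OSC 1.0 message: address, type-tag string, big-endian args."""
    tags, payload = ",", b""
    for arg in args:
        if isinstance(arg, float):
            tags += "f"
            payload += struct.pack(">f", arg)   # 32-bit big-endian float
        elif isinstance(arg, int):
            tags += "i"
            payload += struct.pack(">i", arg)   # 32-bit big-endian int
        else:
            tags += "s"
            payload += osc_string(str(arg))
    return osc_string(address) + osc_string(tags) + payload

if __name__ == "__main__":
    # A touch position as two floats; Max could pick this up with [udpreceive 7400].
    msg = osc_message("/touch/1/xy", 0.25, 0.75)
    socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, ("127.0.0.1", 7400))
```

In practice you would lean on a ready-made library (oscP5 in Processing, the [udpsend]/[udpreceive] objects in Max), but for simple messages the encoding above is the entire wire format.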



JazzMutant Lemur
The multi-touch tech company now known as Stantum evolved from JazzMutant. JazzMutant was founded in 2002 by Guillaume Largillier and Pascal Joguet, who were joined by Julien Olivier in 2003. The original focus was to create a multi-touch controller for music applications, and the Lemur was born. It is now in version 2, with features such as a gesture object that provides three ways for people to interact with sound, extended scripting abilities, and remote control of your computer's mouse cursor or keyboard.  Stantum recently developed a next-generation multi-touch screen system for use in mobile devices such as smartphones and netbooks.  (I'll discuss this further in a future post.)



Mapping Ableton Live to JazzMutant's multi-touch Lemur for sound design:


Additional tutorials can be found on the JazzMutant YouTube channel.


Ableton Live, the software used in the above video, will soon include Max for Live. Max/MSP is now known as Max 5:
"In use for over twenty years by performers, composers, artists, scientists, teachers, and students, Max is the way to make your computer do things that reflect your individual ideas and dreams. Version 5 is a dramatic transformation of Max that begins a new chapter in the program's history."

RELATED
Video:  Max for Live
Max is a product of cycling74
Cycling74 created the Make Controller Kit, which includes fully programmable controllers. The kit is network-based, is capable of working with actuators, and can read sensor information into Max.
Jamoma, a platform for interactive art-based research and performance. Jamoma is the prototyping environment for SpatDIF, the Spatial Sound Description Interchange Format, and GDIF, the Gesture Description Interchange Format.
GDIF: Gesture Description Interchange Format, a tool for music related movements, actions, and gestures 
Stantum's Mobile Phone Multi-touch Interface:  Demonstration of precise interactions on a resistive touch screen

How Stantum's Resistive Multi-Touch Screen Works


Aug 17, 2009

Microsoft's Project Natal & "Fun" Natural User Interfaces for the Office


This looks great!

I came across the picture below today, and discovered more about it in a CNET article:

"Gesture-recognition technology, such as that seen in Project Natal, has changed the face of gaming, but Microsoft's Craig Mundie (pictured) believes it will also transform the office." (Ina Fried, CNET)


This approach looks like it incorporates visual thinking and conceptualization in a nice and fluid way.

RELATED:

"Microsoft: Future desktop PC's will transform the office" (Ina Fried, CNET UK, 7/31/09).


According to Fried, author of the linked CNET article, Craig Mundie demonstrated the set-up in the above picture at the Microsoft Research Faculty Summit this past July. The demonstration "...included hologram-like videoconferencing, a virtual digital assistant, and multiple surface computers, along with voice-, touch- and gesture-recognition technology. The desk in the demo was a multitouch surface computer, and the office's walls were also a display that could easily switch from being a virtual window or collection of digital photos to a corkboard of sticky notes or various workspaces. In one case, Mundie also used Project Natal-like depth cameras to put himself in the middle of an architectural demo, essentially putting himself inside a building that was not yet built."

Craig Mundie of Microsoft on the Future of Software: Digital Assistants, Natural User Interfaces, and Room Computing (Gregory T. Huang, Xconomy Seattle, 7/13/09)

The author of the above article was also at the Microsoft Research Faculty Summit and touched on the concept of natural user interfaces:

“All the things we talk about as natural user interfaces have been largely used one at a time as enhancements to [graphical user interfaces],” Mundie said. Gesture recognition, expressive responses, immersive 3-D virtual environments, and understanding of context—these advances in computing algorithms will lead to software that is “better at anticipating what you might want.”

Xbox's Project Natal Revamped for Offices? (J. Nicholas Hoover, 7/30/09)

When I find a video of the Project Natal/Surface/Natural User Interface Office in action, I'll be sure to post it!

Why this is important:

Natural user interfaces have the potential to support more efficient and effective collaboration among people in the workplace, and the visual display of information may also support better decision-making and problem solving.