
Mar 11, 2013

Leap Motion: My Dev Kit Arrived - Now What?! Thoughts About "NUI" Child-Computer-Tech-Interaction - and More



My Leap Motion developer kit arrived last week. I carefully unboxed the small device and tried out the demo apps that came with the SDK.  I'm doing more looking than leaping at this point.

I'd like to create a simple cause-and-effect music, art and movement application for my 2-year-old grandson, knowing that he'll be turning three near the end of this year.  It would be nice if my app could provide young children with enough scaffolding to support gameplay and learning over a few years of development.
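
To make the idea concrete, here is the kind of minimal sketch I have in mind - assuming the Python bindings that ship with the Leap Motion developer SDK, and with the sound stubbed out as a print statement. The note range and height thresholds are placeholders of my own:

```python
# A minimal cause-and-effect sketch, assuming the Leap Motion SDK's
# Python bindings (the Leap module from the v1 developer kit). Palm
# height maps to a pitch; a real app would play the note, not print it.
import time

import Leap  # ships with the Leap Motion developer SDK


def palm_height_to_note(y_mm, lo=80.0, hi=400.0):
    """Map palm height above the device (millimeters) to a MIDI-style note."""
    y = max(lo, min(hi, y_mm))
    return int(48 + (y - lo) / (hi - lo) * 36)


controller = Leap.Controller()
while True:
    frame = controller.frame()
    for hand in frame.hands:
        note = palm_height_to_note(hand.palm_position.y)
        print("hand at %.0f mm -> note %d" % (hand.palm_position.y, note))
    time.sleep(0.05)  # ~20 polls per second is plenty for a toddler's toy
```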

Now that I'm a grandmother, I've spent some time thinking about what the evolution of NUI will mean for young children like my grandson. Family and friends captured his first moments after birth with iPhones and shared them across the Internet. Born into the iWorld, he knows how to use an iPad or smartphone to view his earlier digital self on YouTube, without ever touching a mouse or a physical keyboard.

The little guy is pretty creative in his method of interacting with technology, as I've informally documented on video. He was seven months old when he first encountered my iPad. It was fingers-and-toes interaction from the start.

In the first picture below, he's playing with NodeBeat. In the second picture, he's 27 months old, experimenting with hand and foot interaction on a variety of apps.

My grandson is new to motion-control applications, so I'm just beginning to learn what he likes and what he is capable of doing. A couple of weeks ago, we played River Rush, from the Kinect Adventures game. He loved jumping up and down as he tried to hit the adventure pins. Most of the time, he kept jumping right out of the raft! (I think next time we'll try Kinect Sesame Street TV or revisit Kinectimals.)


One of the steps I'm taking to prepare for my Leap Motion adventure is to take a look at what people have done with it so far. At least 12,000 developer kits have been released, so hopefully there will be some interesting apps to go along with the retail version of Leap Motion when it arrives at Best Buy on May 19th of this year.

One app I really like is Adam Somers' AirHarp, featured in the video clip below:


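Watching the video, I can guess at the core trick, though this is only my speculation and not Adam's actual code: virtual strings sit at fixed positions, and a fingertip sweeping across them triggers notes.

```python
# A guessed AirHarp-style core - NOT Adam Somers' actual implementation.
# Six virtual strings at fixed x positions (mm); a fingertip moving
# between two sampled positions "strums" every string it crossed.
STRINGS = [(-100 + i * 40, 60 + i) for i in range(6)]  # (x position, note)


def strums(prev_x, cur_x):
    """Return the notes whose strings lie between two x samples."""
    lo, hi = sorted((prev_x, cur_x))
    return [note for x, note in STRINGS if lo <= x <= hi]


# A fingertip sweeping from -90 mm to +30 mm crosses three strings:
print(strums(-90.0, 30.0))  # -> [61, 62, 63]
```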
I also like the idea behind the following app, developed by undergraduate students:

Social Sign: Multi-User sign language gesture translator using the Leap Motion Controller (git.to/socialSign)
 
"Built at the PennApps Spring 2013 hackathon, Social Sign is a friendly tool for learning sign language! By using the Leap Motion device, the BadApples team implemented a rudimentary machine learning algorithm to track and identify American Sign Language from a user's hand gestures."

"Social Sign visualizes these hand gestures and broadcasts them in textual and visual representations to other signers in a signing room. In a standard chat room fashion, the interface permits written communication but with the benefit of enhanced learning in mind. It's all about learning a new way to communicate."-BadApples Team



There are a few NUI-focused tech companies that have experimented with Leap Motion. Today, I received a link to the following video clip from Joanna Taccone, of IntuiLab, featuring their most recent work:
Gesture recognition with Leap Motion using IntuiFace Presentation

"Preview of our work with the Leap Motion controller. In the same spirit as our support for Microsoft Kinect, we have encoded true gesture support, not just mouse emulation, for the creation of interactive applications by non-programmers. The goal is to hide complexity from designers using our product, IntuiFace Presentation (IP). Through the use of IP's trigger/action syntax, designers simply select a gesture as a trigger - Swipe Left, Swipe Right, Point, etc. - and associate that gesture with an action like "turn the page" or "rotate the carousel". As you can see in this video, it works quite well. :-) We will offer Leap support as soon as it ships." -IntuiLab



Below is a demonstration of a few guys playing Dropchord, a collaboration between Leap Motion and Double Fine. From the video, you can tell that they had a blast!

Here is an excerpt from the chatter:  "The thing is that everyone just looks cool..Yeah, I know, it doesn't matter what you are doing...it's got the right amount of speed-up-slow-down stutter-y stuff...it is like a blend of art and science.."

According to the website, Dropchord is "a music-driven score challenge game for the Leap Motion controller, coming soon for PC, Mac, & iOS from the creators of Kinect Party."

The following video is a demonstration of the use of Leap Motion to control an avatar and other interaction in Second Life:



Below are a few more videos featuring Leap Motion:


Control Your Computer With a Chopstick: Leap Motion Hands On (Mashable)


The Leap Motion Experience at SXSW 2013


LEAP Motion demo: Visualizer, Windows 8, Fruit Ninja, and More...



RELATED
Air Harp for Leap Motion, Responsive Interaction
Leap Motion and Double Fine team on Dropchord, give air guitar skills an outlet
John Fingas, Engadget, 3/7/13
Leap Motion Controller Set To Ship May 13 for Global Pre-Orders, In Best Buy Stores May 19.
Hands on With Leap Motion's Controller
Lance Ulanoff, Mashable, 3/10/13
Leap Motion website
Social Sign
IntuiLab
Leap Motion: Low Cost Gesture Control for Your Computer Display

SOMEWHAT RELATED
Kinect for Windows Academic: Kaplan Early Learning
"3 years & up. Hands-on play with a purpose -- the next generation way. This unique learning tool uses your body as the game controller making it a great opportunity to combine active play and learning all in one. Use any surface to actively engage kinesthetic, visual, and audio learners. Bundle includes the following software: Word Pop, Directions, Patterns, and Shapes."

Comment:
I've been an enthusiastic supporter of natural user interfaces and interaction for years - back in 2007 I worked on touch-screen applications for large displays as a graduate student and became an early member of the NUI Group. I'm also a school psychologist, and from my experience, I understand how NUI-based applications and technologies, such as interactive whiteboards and touch tablets like the iPad, can support the learning, communication, and leisure needs of students who have significant special needs. It looks like Leap Motion and similar technologies have the potential to support a wide range of applications that target special populations of all ages.

May 13, 2010

Gesture Vocabulary from N-Trig: "N-act Hands-on"

N-Trig, founded in 1999, provides pen and multi-touch solutions that integrate into LCDs and other devices, giving independent software vendors (ISVs) and original equipment manufacturers (OEMs) opportunities to create new interactive and hands-on computing experiences, according to the company's profile. The latest news about N-Trig's interactive capabilities was outlined in a recent article by Dana Wollman in Laptop:


I found the following video from N-Trig on YouTube, released on 5/11/10, that shows the company's new gesture set:




The N-act Gesture Set (depicted in the video above; a toy dispatch sketch in code follows the list)
N-act3SideSweep - sweep three fingers together to browse
N-act2+1 - select from a displayed menu
N-act3Tap - displays open windows in a 3D carousel
N-act3Hold - rotates the 3D carousel
N-act2Scroll - scroll through a document
N-act2Tap - minimizes the open window, displays the desktop
N-act1Touch - select an item on the screen
N-act4Tap - displays a customized, relevant list of web page icons; selected text/item is pasted into the chosen app
N-act4Zoom - magnifies a movable selected area of the screen
N-act4Select - selects an area and opens a context-sensitive menu
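
Here is the toy dispatch sketch mentioned above - my own illustration, not N-Trig's API. It keys each gesture on a finger count plus a motion name and maps it to a handler:

```python
# A toy dispatcher for the N-act vocabulary above - the handler bodies
# are hypothetical; only the gesture meanings come from N-Trig's video.
HANDLERS = {
    (3, "side-sweep"): lambda: print("browse"),
    (2, "+1 tap"):     lambda: print("select from displayed menu"),
    (3, "tap"):        lambda: print("show open windows in 3D carousel"),
    (3, "hold"):       lambda: print("rotate the 3D carousel"),
    (2, "scroll"):     lambda: print("scroll through document"),
    (2, "tap"):        lambda: print("minimize window, show desktop"),
    (1, "touch"):      lambda: print("select on-screen item"),
    (4, "tap"):        lambda: print("show list of web page icons"),
    (4, "zoom"):       lambda: print("magnify selected area"),
    (4, "select"):     lambda: print("open context-sensitive menu"),
}


def dispatch(fingers, motion):
    handler = HANDLERS.get((fingers, motion))
    if handler is not None:
        handler()


dispatch(3, "tap")  # -> "show open windows in 3D carousel"
```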

Here is the promotional information from the YouTube video:
"This video demonstrates the N-trig N-act Gesture Vocabulary, a set of true multi-touch gestures for two plus one, three- and four-fingers, enabling users to perform an action directly on the screen, and providing a rich set of hand movements that enhance the overall user experience, enabling a whole new approach to how we interact with our computing devices, for a true Hands-on computing experience."


RELATED

Dana Wollman, 5/1/10, Laptop

www.n-trig.com
N-trig DuoSense Technology
The Future is Now:  Creating and Developing a Touch-Enabled World (pdf)
N-trig N-act Hands-On Gesture Vocabulary (N-Trig website)
Better Multi-Touch Displays Coming 
Mike Miller, Forward Thinking Blog, PC Mag (3/3/10)
DuoSense: Creating a Multi-touch Enabled World (November 2009)

Jan 10, 2013

Gesture Markup Language (GML) for Natural User Interaction and Interfaces

Quick post:
"GML is an extensible markup language used to define gestures that describe interactive object behavior and the relationships between objects in an application.  Gesture Markup Language has been designed to enhance the development of multiuser multi-touch and other HCI device driven applications." -Gesture ML Wiki

GestureML was created and is maintained by Ideum.
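
To show the flavor of the idea, here's a made-up, GML-like snippet and the few lines needed to read it back - the element names are mine, not the official schema, so check the GestureML wiki for the real thing:

```python
# A made-up, GML-flavored example - these element names are NOT the
# official GestureML schema; they only illustrate gestures-as-markup.
import xml.etree.ElementTree as ET

GML = """
<GestureSet>
  <Gesture id="two-finger-drag" type="drag">
    <match fingers="2"/>
    <action target="object" property="position"/>
  </Gesture>
</GestureSet>
"""

root = ET.fromstring(GML)
for gesture in root.iter("Gesture"):
    print(gesture.get("id"), gesture.get("type"),
          "fingers=" + gesture.find("match").get("fingers"))
```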

More information to come!
The Pano (photo credit: Ideum)

RELATED
Ideum Blog

OpenExhibits - Free multitouch and multiuser software initiative for museums, education, nonprofits, and students

GestureWorks - Multi-touch authoring for Windows 8 & Windows 7



Dec 12, 2010

LM3LAB's Useful Map of Interactive Gesture-Based Technologies: Tracking fingers, bodies, faces, images, movement, motion, gestures - and more

Nicolas Loeillot, of LM3LABS, has been ahead of the natural user interaction/interface game for many years as his company has expanded. He's done quite a bit of deep thinking about the work of his company, and has used this wisdom to create a nice concept map that describes how LM3LABS' solutions fit into the world of gesture-based control and interaction:




In my opinion, this chart would make a great template for mapping out other natural interaction applications and products!


Here is the description of the concepts outlined in the chart:


"If all of them belong to the “gesture control” world, the best segmentation is made from 4 categories:
  • Finger tracking: precise finger tracking, it can be single touch or multi-touch (this latest not always being a plus). Finger tracking also encompasses hand tracking which comes, for LM3LABS products, as gestures.
  • Body tracking: using one’s body as a pointing device. Body tracking can be associated to “passive” interactivity (users are engaged without their decision to be) or “active” interactivity like 3D Feel where “players” use their body to interact with content.
  • Face tracking: using user face as a pointing device. It can be mono user or multiple users. Face tracking is a “passive” interactivity tool for engaging user in an interactive relationship with digital content.
  • Image Tracking: Augmented Reality (AR) lets users use images (flyers, real products, t-shirts, faces,…) to interact with digital content. AR can be markerless or marker-based. Markerless technology has advantages but marker-based AR is easier for users to understand. (Please note here that Markerless AR is made in close collaboration with AR leader Total Immersion)."  -LM3LABS
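
The four categories are also a handy little data model. A tiny sketch, with my own (hypothetical) tagging of the products mentioned in this post:

```python
# A tiny data-structure sketch of LM3LABS' four-way segmentation.
# The product tags below are my own guesses, purely for illustration.
from enum import Enum


class Tracking(Enum):
    FINGER = "finger"
    BODY = "body"
    FACE = "face"
    IMAGE = "image"  # i.e., augmented reality


PRODUCTS = {
    "ubiq'window": {Tracking.FINGER},
    "3D Feel": {Tracking.BODY},
    "Catchyoo": {Tracking.FINGER, Tracking.BODY},
}

for name, categories in PRODUCTS.items():
    print(name, "->", sorted(c.value for c in categories))
```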
If you are interested in this subject and want to view some good examples of off-the-desktop interfaces and interactions, take a look at the LM3LABS blog, as well as Nicolas Loeillot's Vimeo channel. Also take a look at the sample of posts I've written about LM3LABS over the last few years - the links are at the end of this post.

I love LM3LABS' Interactive Balloon:

Interactive balloons from Nicolas Loeillot on Vimeo.


Interactive Balloons v lm3 labs v2 (SlideShare)



Background
I first discovered LM3LABS when I was taking a VR class and researching interactive, immersive large displays in 2005 or 2006.  Back then, there wasn't much information about this sort of technology.  A lot has changed since then!


I've learned quite a bit from watching LM3LABS (and others) grow, given my passion for postWIMP interactive technology and my commitment to blogging about this subject.   Nicolas has really worked hard in this arena.  As early as 2005, LM3LABS was working with Scala to provide "smart" interactive displays, and his company's applications have been supported by computer vision technologies for many years, allowing for gesture-based, or "touch-less" interaction, as demonstrated by the Catchyoo Interactive Table.  This application caught my eye back in early 2007, when I was working on projects for large interactive displays for my HCI and Ubicomp classes, and was thinking about creating a table-top application.


My hunch is that LM3LABS has set the foundation for further growth in the future, given the lessons they've learned by taking risks with postWIMP technologies over the past few years!


Previous Blog Posts Related to LM3LABS:
Interactive Retail Book (Celebrating the history of Christian Dior from 1948-2010) (video)
Ubiq Motion Sensor Display at Future Ready Singapore (video)
Interactive Virtual DJ on a Transparent Pane, by LM3LABS and Brief Ad
LM3LABS' Catchyoo Interactive Koi Pond: Release of ubiq'window 2.6 Development Kit and Reader
A Few Things from LM3LABS
LM3LABS, Nicolas Leoillot, and Multi-touch
More from LM3LABS: Ubiq'window and Reactor.cmc's touch screen shopping catalog, Audi's touch-less showroom screen, and the DNP Museum Lab.


About LM3LABS
"Founded in 2003 by a team of passionate researchers, engineers, designers, and marketers from various international backgrounds, focused on fast transformation of innovation into unique products, LM3LABS is a recognized pioneer in computer vision-based interactivity solutions. Keeping a strong customer focus, LM3LABS' team of unique people pioneers new directions, explores new concepts, new technologies and new interactions.  Engaging, playful and magic, LM3LABS' products and solutions are always scalable and reliable"

info@lm3labs.com

Note to readers:
Over the past couple of years there has been an explosion of postWIMP technologies and applications, and with this pace, it has been difficult for me to keep abreast of it all. There is quite a bit I miss, given my full time job and daily life!

I welcome information about postWIMP interactive technologies and applications from my readers.  Due to time constraints, not interest, I am not always able to post about a topic as soon as I'd like.  That is OK, as my intention is not to be the first blogger to spread the latest tech news.  I like to dig in deep when I can and make connections between innovative, interesting technologies and the people and ideas behind them. 




May 29, 2010

Preview: Update on Touch & Multitouch Technologies, Websites, and Touch-Interactive Multimedia Apps

It is about time for an update about touch- and gesture-interactive technologies.

I've been researching the latest in "touch" screens and new developments in interactive multi-media content.  In just one year, a multitude of websites have been transformed from static to interactive. 

Although the initial objective for some of these websites was to optimize the interface and navigation for people accessing websites via touch-screen cell phones,  some are ideal for use on touch-enabled slates, the iPad, and even larger touch screen displays and surfaces.   

Convergence seems to be the buzz word of the day.   Interactive TV.  Game sets with Internet access.  Movies on your cell phone.  Touch screen Coke machines displaying movie trailers.  What's happening now, and what is next?

I welcome input from my readers in the form of links to websites, university labs with grad students and professors who are obsessed with emerging interactive technologies, proof-of-concept video clips, video clips of related technologies that are new-to-market, etc.   

I will add video clips to the following playlist:


FYI: I'm also in the middle of writing a series of posts about 3D television technologies for the Innovative Interactivity blog, and welcome input from my readers about this topic.


RELATED (Previous posts)
(the above post includes links to various multi-touch developer kits and resources)




Jan 23, 2010

More interactivity: Interactive Walls, Interactive Projection Systems, GestureTek's Motion-based Game

I recently discovered that Accenture's website has a few interactive web pages that provide information about the company's interactive wall technology.  What I liked about the site is that I could interact with it by touching the screen of my HP TouchSmart PC, and it worked!  (I'm always on the look-out for interactive websites that are good for touch-screen interaction.)


Below are screen shots of the on-line semi-functional demo of Accenture's Strategic Decision Interface:



(The website worked through touch-interaction via my HP TouchSmart PC!)







For more information:  Interactive Wall Technology:  Seeing the Big Picture


Newfangled Projector Systems:
New Projectors Make Any Wall an Interactive Whiteboard: Epson, Boxlight unveil potentially game-changing technology
-Meris Stansbury, eSchool News, 1/13/10



"In a move that could shake up the interactive whiteboard (IWB) market, two projector manufacturers have just released new products that can turn virtually any surface into an IWB...The development means schools no longer have to buy separate hardware to enjoy the benefits of IWBs, whose interactive surface and ability to engage students have made them quite popular in classrooms."
The article highlights Epson's BrightLink 450i ultra-short-throw projector, which eliminates most shadows and can project images anywhere from 59 to 96 inches diagonally at WXGA resolution. The system requires an infrared pen.


Another system is the ProjectoWrite2/W from Boxlight, which is a short-throw LCD projector with XGA resolution that can project up to 80 inches diagonally.


GestureTek


I've written a few blog posts in the past about GestureTek. I wonder if their technology would work with the projection systems mentioned in the eSchool News article. Below are a few examples of what GestureTek's been doing lately:


GestureTek's Video Game Wall at the Child's Play Activity Center (Las Vegas)


The above pictures of the Child's Play Activity Center show how GestureTek's WallFX interactive display system can be used to create a fun environment for children. The system includes a ceiling projector and a camera that can capture full-body motion.  The system provides 25 games and special effects.  Wouldn't this concept be great for interactive and fun educational games?


For details about this system:
GestureTek's video game wall shows where gesture-based games can go
-Dean Takahashi, GamesBeat, 8/25/09


GestureTek's Immersive Multi-platform Game:  Head-butting Interactive Soccer

"Video gesture control pioneer GestureTek., unveiled its new Momo™ Software Development Kit for game developers and original equipment manufacturers at the 2010 Consumer Electronics Show. Gesture recognition software tracks motion and objects such as faces and hands and brings immersive, gesture-based interactivity to multiple platforms, such as PCs, laptops, mobile phones, toys and other devices. The video is a demonstration of a head-butting soccer game."
GestureTek Interactive City Flight Simulator Game

Jun 6, 2009

Information about Touch Screens, Multi-Touch, & Gesture Interaction is Spreading

Since the news about Windows 7's multi-touch capabilities has spread, I haven't had enough time to keep up with all of the information related to multi-touch interaction. Fortunately, there are a few bloggers out there who are doing a great job filling in the gaps.

The Touch User Interface blog has a wealth of information in the form of pictures, video clips, slides, and links that I'd like to share.

The following slideshow/videos were highlighted in the Touch User Interface blog post, "Touch UI: HCI Viewpoint":

Untold Stories of Touch, Gesture, & NUI

Joe Fletcher, Design Manager, Microsoft Surface

Touch and Gesture Computing, What You Haven't Heard
Dan Saffer



Other posts of interest on the Touch User Interface blog:
Touch screens and vision impairment
Link: Designing the Palm Pre: An Interview with Michelle Koh

Touch User Interface Overview

I've added some additional information about UX, interactive multimedia, multi-touch, and gesture interaction to my Multimedia and Interaction Resources page, which is a work in progress.


Apr 8, 2009

Joel Eden's Informative Post: Designing for Multi-Touch, Multi-User and Gesture-Based Systems

Joel Eden is a User Experience Consultant at Infragistics. He recently wrote a detailed article/post in the Architecture & Design section of Dr. Dobb's Portal, "Designing for Multi-Touch, Multi-User and Gesture-Based Systems". I thought I'd share the link, since I've been writing on the same topic.

In his article, Joel explains the differences between traditional WIMP (Window, Icon, Menu, Pointer) interaction and gesture, multi-touch, and multi-user systems. These systems are also known as Natural User Interfaces, or NUI. He recommends that "rather than trying to come up with new complicated ways to interact with digital objects, your first goal should be to try to leverage how people already interact with objects and each other when designing gesture based systems."

Joel goes on to outline UX (User Experience), IxD (Interaction Design), and HCI (Human-Computer Interaction) concepts that designers should consider when developing new systems - Affordances, Engagement, Feedback, and "Don't Make Us Think" - which he summarizes in the conclusion of his article.

I especially liked Joel's references:

Clark, Andy. Supersizing the Mind: Embodiment, Action, and Cognitive Extension

Few, Stephen. Information Dashboard Design: The Effective Visual Communication of Data

Gibson, James J. The Ecological Approach to Visual Perception

Krug, Steve. Don't Make Me Think: A Common Sense Approach to Web Usability, Second Edition

Norman, Don. The Design of Everyday Things

Norman, Don. Things That Make Us Smart: Defending Human Attributes In The Age Of The Machine

I would also add the following references:
Bill Buxton
Multi-touch Systems I have Known and Loved
(Regularly updated!)
Sketching User Experiences: Getting the Design Right and the Right Design

"Our lack of attention to place, time, function, and human considerations means these fancy new technologies fail to deliver their real potential to real people." - Bill Buxton

Dan Saffer
Designing for Interaction: Creating Smart Applications and Clever Devices
Designing Gestural Interfaces

SAP
Touchscreen Usability in Short
(Summary by Gerd Waloszek of the SAP Design Guild)
SAP Design Guild Resources (User-Centered Design, User Experience, Usability, UI Guidelines, Visual Design, Accessibility)
Kevin Arthur (Synaptics)
Touch Usability
Bruce "Tog" Tognazzini
Ask Tog: Interaction Design Solutions for the Real World
Inclusive Design, Part I
First Principles of Interaction Design
John M. Carroll
Human Computer Interaction (HCI) (History of HCI)
Bill Moggridge
Designing Interactions
Ben Shneiderman
Leonardo's Laptop: Human Needs and the New Computing Technologies
Edward Tufte

Visual Explanations
Beautiful Evidence
The Visual Display of Quantitative Information
Envisioning Information
Rudolf Arnheim (Gestalt)
Art and Visual Perception: A Psychology of the Creative Eye

Update: A great reading list on general HCI. Some of the authors were involved in the early days of touch, bi-manual, and multi-touch interaction.

Jan's Top Ten List of Books on Human-Computer Interaction


FYI: If you know much about Windows Presentation Foundation, you probably know that Josh Smith, WPF guru, also works at Infragistics.


Jan 11, 2009

Usability, Accessibility, and User Experience in a Win7 Environment

Microsoft's newest operating system, Win7 (beta), allows for multi-touch interaction and application development. From what I can tell from the news from the Consumer Electronics Show (CES 2009), touch and gesture interaction holds quite a bit of interest among consumers and application developers alike.

Hopefully people will think carefully about user experience and usability factors when developing the applications we'll be using in the not too distant future!

Because of my background as a school psychologist, I'm interested in how touch/gesture applications can address accessibility and universal usability issues.

I wasn't able to locate user experience or usability information for Win7. Here are a few links from the Microsoft website that I think are worth reading:

Touch Interaction Guidelines

How to Design a Great User Experience

Designing with Windows Presentation Foundation

Accessibility

According to the research from Microsoft, “more than half of all computer users experience difficulties or impairments related to accessibility, and are likely to benefit from the use of accessible technology. Moreover, approaching software design with the flexibility and inclusiveness that are the hallmarks of accessibility often results in overall improved usability and customer satisfaction.”

Regarding touch/gesture interaction, know that “19% (24.4 million) of computer users have a mild dexterity difficulty or impairment, and 5% (6.8 million) of computer users have a severe dexterity difficulty or impairment.”
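
A quick back-of-envelope check on those figures - dividing each count by its percentage recovers the implied total number of computer users, and the two statistics imply slightly different totals:

```python
# Sanity-checking Microsoft's figures: count / percentage = implied
# total population of computer users.
print(24.4e6 / 0.19)  # 19% -> roughly 128 million users implied
print(6.8e6 / 0.05)   # 5%  -> roughly 136 million users implied
```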

Christian Moore, from the NUI-Group, shared this nice graphic that depicts the various fields that intersect or converge with interface design, which I think is useful when thinking about how we will design, develop, and use emerging technologies:



Concepts such as accessibility and universal usability can fit into this design!

For more information regarding multi-touch, see my previous post, "For Techies and the Tech-Curious: Multi-touch/Gesture from the NUI Group"

Nov 26, 2008

For the Tech Curious: Multi-Gesture Net: A Multi-touch and Multi-gesture Research Blog

Laurence Muller, M.Sc., is a scientific programmer at the Universiteit van Amsterdam who develops scientific software for multi-touch devices. His blog, Multigesture.net, provides good information regarding multi-touch and gesture-interaction hardware and software applications.


Laurence links to the DIY tabletop computing bootcamp that was held at IEEE Tabletops and Interactive Surfaces 2008. From there, you can find a linked list of the organizers of the events, along with additional information.

The picture below is from the MTC Multi-touch Console:

Here is a link to the group's libavg wiki, which includes open-source code and "how-to" instructions.

If you are interested in multi-touch and multi-gesture computing from an academic point of view, Florian Echtler, of the Technische Universität München, has a series of publications listed on his website. Here is the abstract of one of his papers. He is on the right track. I especially like the fact that he's thought about widget layers. (I have, too, but they are only sketches in my idea book.)

TISCH: Tangible Interactive Surfaces for Collaboration between Humans (SourceForge website, with links to libtisch)

F. Echtler, G. Klinker
A Multitouch Software Architecture
NordiCHI 2008: Using Bridges, 18-22 October, Lund, Sweden. (bib)

"In recent years, a large amount of software for multitouch interfaces with various degrees of similarity has been written. In order to improve interoperability, we aim to identify the common traits of these systems and present a layered software architecture which abstracts these similarities by defining common interfaces between successive layers. This provides developers with a unified view of the various types of multitouch hardware. Moreover, the layered architecture allows easy integration of existing software, as several alternative implementations for each layer can co-exist. Finally, we present our implementation of this architecture, consisting of hardware abstraction, calibration, event interpretation and widget layers."

Feb 15, 2013

Designing for Touch & Gesture: Tips for Apps and the Web (Updated)

In the past, our fingers did the walking - sifting through files, papers, pamphlets, and phone books - and then we pointed and clicked with a mouse to interact with images and text, in essence, electronic imitations of the paper-based world. Traditional forms, brochures, ad inserts, and posters informed much of the design.

How much have things changed? It is 2013, but you'd think it was 1997 from the PowerPoint look and feel of many apps and web sites! Touch is everywhere, but from what I can tell, not enough designers and developers have stepped up to the plate to think more deeply about ways their applications can support human endeavors through touch and gesture interactions.

For an overview of this topic, take a look at my 2011 post, written after a number of ugly encounters with user-unfriendly applications:  Why bother switching from GUI to NUI?  

For an in-depth look into the history of multi-touch, the wisdom of Bill Buxton is well worth absorbing. He's worked with all sorts of interfaces, and has been curating the history of multi-touch and gesture systems since 2007:


Multi-Touch Systems that I have Known and Loved
Bill Buxton, Microsoft Research, Updated 8/30/12



Even if you are not a designer or developer, I encourage you to explore some of the links below:

Touch Gestures for Application Design
Luke Wroblewski, 10/9/12

Common Misconceptions About Touch
Steven Hoober, 3/18/13

Designing With Tablets in Mind:  Six Tips to Remember
Connor Turnbull, Webdesign tuts+, 9/27/11

Finger-Friendly Design: Ideal Mobile Touchscreen Target Sizes
Anthony T, Smashing Magazine, 2/21/12 (a quick target-size sketch in code follows this link list)

Best Practices: Designing Touch Tablet Experiences for Preschoolers (pdf)
Sesame Street Workshop


Are Touch Screens Accessible?
AccessIT, National Center on Accessible Information Technology in Education

iOS Human Interface Guidelines
Apple

Android User Interface Guidelines
Using Touch Gestures
Handling Multi-Touch Gestures
Android

Designing for Tablets?  We're Here to Help!
Roman Nurik, Android Developers Blog 11/26/12

Touch interaction design (Windows Store apps)
Microsoft - MSDN

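
Here is the quick target-size sketch promised above. The ~9 mm minimum touch target is a commonly cited rule of thumb (exact recommendations vary across the articles listed), and the conversion to pixels is just millimeters-to-inches times DPI:

```python
# Back-of-envelope touch-target math. The 9 mm minimum is a commonly
# cited rule of thumb; the articles above differ on exact numbers.
def target_px(target_mm=9.0, dpi=160.0):
    """Physical target size -> pixels at a given screen density."""
    return round(target_mm / 25.4 * dpi)


for dpi in (160, 240, 326):
    print("%d dpi -> %d px for a 9 mm target" % (dpi, target_px(dpi=dpi)))
```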


Dec 15, 2011

Christian Bannister's Interactive Multi-touch and Gesture-based Subcycle Project. Use your hands to shape sound, create, and manipulate music. Wow!!

I came across a link to Christian Bannister's Subcycle Labs website when I was taking a look at the Creative Applications Network website.


If you have an interest in music technology and innovative gesture/multitouch applications, you'll appreciate the details that are shared on the Subcycle website.  In the meantime, take the time to watch a few of Christian's videos.  Enjoy!


Blip Shaper Walkthrough

Blip Shaper Walkthrough from christian bannister on Vimeo.
"a) creating percussive patterns with monome b) shaping the individual sounds that make up the patterns with multitouch gestures c) recording touchscreen gestures as automation d) storing, duplicating and navigation patterns e) recording the resulting audio to a dynamic buffer f) manipulating the buffer with a multitouch cut-up approach g) visualizing everything with dual screens"
Subcycle Walkthrough

Subcycle Blip Shaper from christian bannister on Vimeo.
"a) creating percussive patterns with monome b) shaping the individual sounds that make up the patterns with multitouch gestures c) recording touchscreen gestures as automation d) storing, duplicating and navigation patterns e) recording the resulting audio to a dynamic buffer f) manipulating the buffer with a multitouch cut-up approach g) visualizing everything with dual screens"


The following information describing the Blip Shaper is from Christian's Subcycle website/blog:
"For the drum sounds I have Drumaxx running for synthesized sounds and Battery running for sampled sounds.  These are running in parallel so for each voice there is a separate patch running in each VST.  The Parameters are modified with the touchscreen independently but in all cases a single touch gesture on the X-Axis will cross fade between the sampled version of the sound and the synthesized version of the sound.  I love this because I have never seen this before and I can never decide which technique I like better.  The synthesized drums are more malleable and have more interesting parameters to play with but the sampled sounds seem more substantial.  I will post a detailed list of parameters and gestures in the future."   


Christian currently uses Max for Live (Ableton), and codes with Processing/Java, using Eclipse for organization.

Autopilot - Subcycle
autopilot - subcycle labs from christian bannister on Vimeo.
"sound visualization, multi-touch interface, break beat performance engine, autopilot, spacialized sound, dsp, max/MSP, fm synthesis, sonic navigation, sound storm visualization, time machine, granular, interactive sound sculpture, joystick array, more at subcycle.org"


SOMEWHAT RELATED 
Community Core Vision
NUI Group
BTW, I'm wondering if Christian or other NUI Group members would like to do something with some of my ideas that incorporate interactive music. To learn more, see my post, "It must be Voronoi: Looking for ideas for my music+art+dance+tech dream...."
...


....