Showing posts sorted by date for query gesture.

Feb 15, 2013

Designing for Touch & Gesture: Tips for Apps and the Web (Updated)

In the past, our fingers did the walking: we sifted through files, papers, pamphlets, and phonebooks, and then we pointed and clicked with a mouse to interact with images and text, in essence electronic imitations of the paper-based world. Traditional forms, brochures, ad inserts, and posters informed much of the design. 

How much have things changed? It is 2013, but you'd think it was 1997 from the PowerPoint look and feel of many apps and web sites! Touch is everywhere, but from what I can tell, not enough designers and developers have stepped up to the plate to think more deeply about ways their applications can support human endeavors through touch and gesture interactions.  

For an overview of this topic, take a look at my 2011 post, written after a number of ugly encounters with user-unfriendly applications:  Why bother switching from GUI to NUI?  

For an in-depth look into the history of multi-touch, the wisdom of Bill Buxton is well worth absorbing. He's worked with all sorts of interfaces, and has been curating the history of multi-touch and gesture systems since 2007:


Multi-Touch Systems that I have Known and Loved
Bill Buxton, Microsoft Research, Updated 8/30/12



Even if you are not a designer or developer, I encourage you to explore some of the links below:

Touch Gestures for Application Design
Luke Wroblewski, 10/9/12

Common Misconceptions About Touch
Steven Hoober, 3/18/13

Designing With Tablets in Mind:  Six Tips to Remember
Connor Turnbull, Webdesign tuts+, 9/27/11

Finger-Friendly Design: Ideal Mobile Touchscreen Target Sizes
Anthony T, Smashing Magazine, 2/21/12

Best Practices: Designing Touch Tablet Experiences for Preschoolers (pdf)
Sesame Workshop


Are Touch Screens Accessible?
AccessIT, National Center on Accessible Information Technology in Education

iOS Human Interface Guidelines
Apple

Android User Interface Guidelines
Using Touch Gestures
Handling Multi-Touch Gestures
Android

Designing for Tablets?  We're Here to Help!
Roman Nurik, Android Developers Blog 11/26/12

Touch interaction design (Windows Store apps)
Microsoft - MSDN



Jan 17, 2013

Xbox Kinect in the OR: Kinect supports gesture interaction with 3D imaging of the patient during surgery.

Here's an interesting use of technology for health - the Xbox Kinect in the OR!

Thanks to Harry van der Veen for the link!


RELATED
Kinect sensor poised to leap into everyday life
Niall Firth, NewScientist, 1/17/13

For the tech-curious:
PrimeSense (Company that developed the 3D depth sensor that powers the Kinect, the sensor in Ava, a healthcare robot by iRobot, and more.)

OpenNI (Framework for the development of 3D sensing middleware libraries and applications.)

NiTE: Natural Interface Technology for End User (Perception algorithms layer for 3D computer vision; supports hand locating and tracking, scene analysis, and skeleton joint tracking.)
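To make "hand locating" from a depth sensor concrete, here is a toy sketch. This is not NiTE's actual algorithm (NiTE is closed-source middleware); it simply treats the nearest valid point in a depth frame as the hand candidate, on the assumption that an outstretched hand is the closest surface to the camera.

```python
# Toy illustration of depth-based hand locating (NOT NiTE's real algorithm):
# pick the pixel closest to the camera, assuming an outstretched hand is
# the nearest surface in the frame.

def locate_hand(depth_frame):
    """Return (row, col, depth_mm) of the nearest valid pixel.
    depth_frame: 2D list of depth readings in millimetres; 0 = no reading."""
    best = None
    for r, row in enumerate(depth_frame):
        for c, d in enumerate(row):
            if d > 0 and (best is None or d < best[2]):
                best = (r, c, d)
    return best

# Synthetic 4x4 frame: a "hand" around 600 mm in front of an 1800 mm wall.
frame = [
    [1800, 1800, 1800, 1800],
    [1800,  600,  620, 1800],
    [1800,  610,  630, 1800],
    [   0, 1800, 1800, 1800],  # 0 = sensor dropout, ignored
]
print(locate_hand(frame))  # nearest point: (1, 1, 600)
```

Real middleware such as NiTE does far more (segmentation, tracking over time, skeleton fitting), but the nearest-surface heuristic captures the basic idea.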

Jan 10, 2013

Gesture Markup Language (GML) for Natural User Interaction and Interfaces

Quick post:
"GML is an extensible markup language used to define gestures that describe interactive object behavior and the relationships between objects in an application.  Gesture Markup Language has been designed to enhance the development of multiuser multi-touch and other HCI device driven applications." -Gesture ML Wiki

GestureML was created and is maintained by Ideum. 
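To give a feel for what consuming a gesture definition like this might look like, here is a small sketch using Python's standard-library XML parser. The element and attribute names in the fragment below are my own illustrative guesses, not the official GestureML schema; consult the GestureML wiki for the real definitions.

```python
# Parse a hypothetical GML-style gesture definition.
# The XML fragment is illustrative only, not the official GestureML schema.
import xml.etree.ElementTree as ET

gml = """
<GestureMarkupLanguage>
  <Gesture id="two-finger-rotate" type="rotate">
    <match>
      <point_event touch="2"/>
    </match>
  </Gesture>
</GestureMarkupLanguage>
"""

root = ET.fromstring(gml)
gestures = {
    g.get("id"): {
        "type": g.get("type"),
        "touches": int(g.find("./match/point_event").get("touch")),
    }
    for g in root.iter("Gesture")
}
print(gestures)  # {'two-finger-rotate': {'type': 'rotate', 'touches': 2}}
```

The point of a declarative format like GML is exactly this: an application can load gesture behavior from data rather than hard-coding it.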

More information to come!
The Pano
Photo credit: Ideum

RELATED
Ideum Blog

OpenExhibits: Free multitouch and multiuser software initiative for museums, education, nonprofits, and students

GestureWorks: Multi-touch authoring for Windows 8 & Windows 7



Nov 17, 2012

Human Computer Interaction + Informal Science Education Conference (NUI News)

I recently learned of the HCI + ISE conference, funded by the National Science Foundation and organized by Ideum and Independent Exhibitions, which will lay the groundwork for the future design and development of interactive computer-based science exhibits.
Science museums have a long history of interactivity, well suited to groups of "explorers", such as families or students visiting on a field trip.  

What is really exciting is that new interactive applications and technologies have the power to transform the way people learn and understand science in a collaborative and social way. Innovations in the field of HCI (human-computer interaction), such as multi-touch and gesture interaction, are well suited to meet the goals of science education for all, beyond the school doors and wordy textbooks. 

Below is a screenshot of the conference website, a description of the conference quoted from the site, and some related resources.



About the HCI+ISE Conference
"HCI technologies, such as motion capture, multitouch, augmented reality, RFID, and voice recognition are beginning to change the way computer-based science exhibits are designed and developed. Human Computer Interaction in Informal Science Education (HCI+ISE) is a first-of-its-kind gathering to explore and disseminate effective practices in developing a new generation of digital exhibits that are more intuitive, interactive, and social than their predecessors."
"The HCI+ISE Conference, to be held in Albuquerque, New Mexico June 11-14 2013, will bring together 60 museum exhibit designers and developers, learning researchers, and technology industry professionals to share effective practices, and to explore both the enormous potential and possible pitfalls that these new technologies present for exhibit development in informal science education settings."
"HCI+ISE will focus on the practical considerations of implementing new HCI technologies in educational settings with an eye on the future. Along with a survey of how HCI is shaping the museum world, participants will be challenged to envision the museum experience a decade into future. The conference results will provide a concrete starting point for exhibit developers and informal science educators who are just beginning to investigate these emerging technologies and design challenges in creating these new types of exhibits."
Why HCI+ISE?
"Since the mid-1980s informal educational venues have increasingly incorporated computer-based exhibits into their science communication offerings in an effort to keep pace with public expectations and make use of the expanding opportunities these technologies provide. The advent and popularity of once novel HCI technologies are becoming commonplace: the Wii and Microsoft Kinect now allow for motion capture video games, tablet PCs have multitouch interaction, and smart phones and other devices come standard with voice recognition. Yet many museums are still developing single-touch and trackball-driven, single-user computer kiosks."
"Science museums have a long history of championing hands-on, physical, and inquiry-based activities and exhibits. This vast experience has only just begun to be applied to interactive computer interfaces. Along with seasoned science exhibit developers, the Conference will draw upon individuals outside of ISE who will provide fresh insight into the technologies, design issues, and audience expectations that these visitor experiences present."
Involvement and Findings
"HCI+ISE will bring together a diverse group of practitioners and other professionals to discuss (and in some cases share and prototype) new design approaches utilizing emerging HCI technology. Please see our Apply page to learn how you can participate. Conference news and findings will be distributed through a variety of ISE and museum websites, including this one."
"We welcome your questions and comments about the HCI+ISE Conference."
CONTACTS
Kathleen McLean of Independent Exhibitions
& Jim Spadaccini of Ideum
HCI+ISE Co-chairs
"Open Exhibits is a multitouch, multi-user tool kit that allows you to create custom interactive exhibits."
CML: Creative Markup Language
GML: Gesture Markup Language
GestureWorks
Ideum

Jul 29, 2012

Blast from the 2009 past: News, Videos, and Links about Multi-touch and Screen Technologies

One of the things I like to do is share updates about the world of multimedia, multi-touch, gesture, screen, surface, and interactive technologies, focusing on off-the-desktop applications and systems. When I started this blog, I had to put forth quite a bit of effort just to FIND interesting things to blog about.  


These days, there are so many sources that focus on emerging (and now commonplace) interactive technologies that my main challenge is to filter the noise. Where do I begin?


My archives are vast. I randomly picked the year 2009 and came across one of my previous posts, "News, Videos, and Links about Multitouch and Screen Technologies." The post is long, and contains a number of videos and links that will probably be of value to a future curator of the history of technology.


I welcome comments from readers who might be able to help me update information about various applications and systems I've featured on this blog in the past. 

The pictures are screenshots from the results of an image search for "interactivemultimediatechnology". Over the past 6 years, I've posted quite a few!








Jul 21, 2012

Musings about NUI, Perceptive Pixel and Microsoft, Rapid Creative Prototyping (Lots of video and links) Revised

It just might be the right time for everyone to brush up on 21st century tech skills. iPads and touch-phones are ubiquitous. Touch-enabled interactive whiteboards and displays are in schools and boardrooms. With Microsoft's Windows 8 and the news that the company recently acquired Jeff Han's company, Perceptive Pixel, I think that there will be good support - and more opportunities - for designers and developers interested in moving from GUI to NUI.    


In the video below, from CES 2012, Jeff Han provides a good overview of where things are headed. We are in a post-WIMP world and there is a lot of catching up to do!

CES 2012  Perceptive Pixel and the Future of Multitouch (IEEE Spectrum YouTube Channel)



During the video clip, Jeff explains how far things have come during the past few years:
 "Five and 1/2 years ago I had to explain to everybody what multi-touch was and meant. And then, frankly, we've seen some great products from folks like Apple, and really have executed so brilliantly, that everyone really sees what a good implementation can be, and have come to expect it.  I also think though, that the explosion of NUI is less about just multi-touch, but an awareness that finally people have that you don't have to use a keyboard and mouse, you can demand something else beside that.  People are now willing to say, "Oh, this is something I can try, you know, touch is something I can try as my friendlier interface"."

Who wouldn't want to interact with a friendlier interface? Steve Ballmer doesn't curb his enthusiasm about Windows 8 and Perceptive Pixel. Jeff Han is pleased with how designs created in Windows 8 scale for use on screens large and small. He explains how Windows 8 can support collaboration. The Story Board application (7:58) on the large touchscreen display looks interesting.

I continue to be frustrated by the poor usability of many web-based and desktop applications. I like my iPad, but only because so many dedicated souls have given some thought to the user experience when creating their apps. I often meet with disappointment when I encounter interactive displays while I'm out and about during the day. It is 2012, and it seems that there are a lot of application designers and developers who have never read Don Norman's The Design of Everyday Things!



I enjoy making working prototypes and demo apps, but my skill set is stuck in 2008, the last year I took a graduate-level computer course.  I was thinking about taking a class next semester, something hands-on, creative, and also practical, to move me forward. I can only do so much when I'm in the DIY mode alone in my "lab" at home.  I need to explore new tools, alongside like-minded others.  


There ARE many more tools available to designers and developers than there were just four years ago. Some of them are available online, free, or for a modest fee. I was inspired by a link posted by my former HCI professor, Celine Latulipe, to her updated webpage devoted to Rapid Prototyping tools. The resources on her website look like a good place to start for people who are interested in creating applications for the "NUI" era. (Celine has worked on many interesting projects that explore how technology can support new and creative interaction, such as Dance.Draw.) Below is her description of her updated HCI resources:

"New HCI resource to share: I have created a few pages on my web site devoted to Rapid Prototyping tools, books, and methods. These pages contain reviews of various digital tools, including 7 different desktop prototyping apps, and including 8 different iPad apps for wireframing/prototyping. I hope it's useful to others. Feel free to share... and please send me comments and suggestions if you find anything inaccurate, or if you think there is stuff that I should be adding. I will be continuing to update this resource." -http://www.celinelatulipe.com (click on the rapid prototyping link at the top)



IDEAS
Below are just a few of my ideas that I'd like to implement in some way. I can't claim ownership of these ideas; they are mash-ups of what comes to me in my dreams, usually after reading scholarly publications from the ACM or IEEE, or attending tech conferences. 
  • An interactive timeline, (multi-dimensional, multi-modal, multimedia) for off-the-desktop interaction, collaboration, data/info analysis exploration.  It might be useful for medical researchers, historians, genealogists, or people who are into the "history of ideas".  Big Data folks would love it, too. It would handle data from a variety of sources, including sensor networks. It would be beautiful to use.
  • A web-based system of delivering seamless interactive, multi-modal, immersive experiences, across devices, displays, and surfaces. The system would support multi-user, collaborative interaction.  The system would provide an option for tangible interaction.
  • A visual/auditory display interface that presents network activity, including potential intrusions, malfunctions, or anything that needs immediate attention that would be likely to be missed under present monitoring methods. 
  • Interactive video tools for creation, collaboration, storytelling.  (No bad remote controllers needed.)
  • A "wearable" that provides new ways for people to express and communicate creatively, through art, music, dance, with wireless capability. (It can interact with wireless sensor networks.)*
  • A public health application designed to provide information useful in understanding and preventing sepsis. This application would utilize the timeline concept described at the top of this list. The concept could also be useful in analyzing other medical puzzles, such as autism.
Most of these ideas could translate nicely to educational settings, and the focus on natural user interaction and multi-modal i/o aligns with the principles of Universal Design for Learning, something that is important to consider, given the number of "at-risk" learners and young people who have disabilities.

I welcome comments from readers who are working on similar projects, or who know of similar projects. I also encourage graduate students and researchers who are interested in natural user interfaces to move forward with an off-the-desktop NUI project. I hope that my efforts can play a part in helping people make the move from GUI to NUI!  



Below are a few videos of some interesting projects, along with a list of a few references and links.


SMALLab (Multi-modal embodied immersive learning)


PUPPET PARADE: Interactive Kinect Puppets(CineKid 2011)



MEDIA FACADES: When Buildings Start to Twitter

HUMANAQUARIUM (CHI 2012)

 

NANOSCIENCE NRC Cambridge (Nokia's Morph project)






 
Examples: YouTube Playlists
POST WIMP EXPLORERS' CLUB
POST-WIMP EXPLORER'S CLUB II

Web Resources
Celine Latulipe's Rapid Prototyping Resources 
Creative Applications
NUI Group: Natural User Interface Group
OpenFrameworks and Interactive Multimedia: Funky Forest Installation for CineKid
SMALLab Learning
OpenExhibits: Free multi-touch + multiuser software initiative for museums, education, nonprofits, and students.
OpenSense Wiki 
CINEKID 2012 Website 
Multitouch Systems I Have Known and Loved (Bill Buxton)
Windows 8
Perceptive Pixel
Books
Natural User Interfaces in .NET: WPF 4, Surface 2, and Kinect (Josh Blake, Manning Publications)
Chapter 1 pdf (Free)
Brave NUI World: Designing Natural User Interfaces for Touch and Gesture (Daniel Wigdor and Dennis Wixon)
Designing Gestural Interfaces (Dan Saffer)
Posts
Bill Snyder, ReadWrite Web, 7/20/12

I noticed some interesting tools on the Chrome web store - I plan to devote a few more posts to NUI tools in the future.

Jul 12, 2012

TechCrunch Charlotte Highlights: T1 Vision; inTouch Collaborative Software


Yesterday evening I attended a meetup of TechCrunch and Charlotte-area techies, held at the uptown Packard Place entrepreneurial center. It was jam-packed with people abuzz with tech start-up fever, fueled by awesome food provided by Zen Fusion. Although my main purpose for attending the TechCrunch meetup was to learn more about innovative technology start-ups in my region, I was also hoping to capture a few shots of interesting people. I like to keep an eye out for tee-shirt slogans, and one worn by a young gentleman caught my eye, proclaiming that he'd seen the future, and it is in his browser. On the back of his tee-shirt was a bright HTML5 logo, something that is dear to my heart, as I am moving from HTML4 to HTML5. He was polite and agreed to pose for a couple of photographs:
 






It turned out that the HTML5 guy was at the TechCrunch event with one of his colleagues from T1 Visions, a social touchscreen solutions company that I've featured previously on this blog.  They caught me up on the growth of this start-up company, which now has 15 employees and has broadened its reach beyond table-top restaurant applications to the healthcare, education, corporate, retail, and broader hospitality sectors.

What I like about table-top systems is that they provide support for "natural user interaction". They allow for multiple modes of interaction with, and presentation of, multimedia content. Over the past several years, these systems have proven useful to a wide range of people and settings. Interfaces that support touch and gesture interaction are no longer viewed as novelties, given the pervasiveness of touch-phones and tablets and their ease of use for most people.

A useful product from T1 Visions is the T1 Collaboration Table. It supports touch-screen interaction and can also handle up to four simultaneously connected laptops. The table system provides a media viewer that supports sharing of photos across screens, devices, and surfaces. It also contains a web browser, a presentation viewer, and a whiteboard that is compatible with video conferencing. The company provides customized applications for its clients. In the Charlotte area, some of the tables can be found in restaurants, such as the Mellow Mushroom, Cowfish, and Harpers. A few were recently installed in the Atkins library at UNC-Charlotte, to support group work among students.

To learn more about what T1 Visions has to offer, take a few minutes to view the following videos and follow the links at the end of this post!






Demonstration of how the collaboration table can work within a business environment:


Demonstration of the T1 Visions touch wall:
RELATED
T1 Visions Gallery
T1 Visions: Social Touchscreen Solutions
Interactive tabletops bring people together
Marty Minchin, Charlotte Observer, South Charlotte News, 2/20/12
Interactive Technology in the Carolinas: T-1 Visions Update

NOTE:
TechCrunch is a technology media group founded in 2005 that focuses on innovative technologies. This summer, a group of TechCrunchers are visiting cities in the South that were previously not on their radar, such as Savannah and my home region, Charlotte, N.C. The Charlotte TechCrunch meetup was held on Wednesday, July 11, 2012. I plan to devote a few more blog posts to share what I learned.

Jul 8, 2012

PO-MO, a creative group that combines digital art, interaction, movement, and play to create engaging surfaces and spaces.

I recently learned more about PO-MO, a relatively new start-up tech company based in Winnipeg, Canada. According to the company's information, PO-MO "specializes in interactive digital display solutions, including gesture and motion based interactivity, interactive display content creation and management, and large interactive display and projection services for advertisers, educators, and events."  


Po-Motion was a finalist in an elevator pitch video contest last fall. It has several advantages over potential competitors: the system is easy to use, and it is priced within reach of schools, museums, and other cost-conscious groups who would like to provide technology-supported immersive interactive experiences for people of all ages. The Po-Motion software designed for interactive floors and walls starts at $39.99, and works on any computer, using any USB web camera and a projector. Other applications make use of Kinect sensors.


I especially like one of PO-MO's recent projects, the Impossible Animals museum exhibit, created using Unity 3D for the Manitoba Children's Museum. How does it work? Children create a colored egg using crayons and paper, which is then scanned and digitally embedded into the system, which includes an interactive wall and floor. When the egg is touched, it hatches and becomes a motion-reactive animal. The environment includes things like water, landscapes, and even a spaceship. The system has a "reset world" button for museum staff to use when needed.  

Impossible Animals Exhibit

Impossible Animals Interactive Museum Installation from PO-MO Inc. on Vimeo.



The following video explains how the PO-MO system works:


PO-MO is also involved in promotional projects, assisting retailers, ad agencies, and brand managers with creative ways to engage customers and clients:
Ragpickers Kinect-based Window Display

Ragpickers Kinect Window Display from PO-MO Inc. on Vimeo.


The following video provides a scrolling description about PO-MO's work, including promising data collected during implementation:

Other products and services provided by PO-MO include mobile app development. I especially like the augmented reality business card depicted in the following video clip:

Augmented Reality Business Card from PO-MO Inc. on Vimeo


Imagine if your local shopping centers, museums, libraries, or even schools offered this level of immersive interaction on a regular basis!

RELATED
The Po-Motion system has a wide range of uses. It is currently used in an educational setting, in a sensory room for students with special needs, something that I'd like to try out in the near future with students at Wolfe School. I plan to share more about this in another post.


PO-MO Case Studies


PO-MO Bios:
Meghan Athavale – Director/CEO, PO-MO Inc.
"Meghan has been a professional designer and animator since graduating from Red River College in 1997. After graduation, she moved to Calgary, where she spent almost two years directing projects at Aurenya Studios, a start-up animation company. In 2001, Meghan was engaged by Community Connections to support community-based IT development projects in rural Manitoba and in Winnipeg’s inner city.  In 2008, Meghan joined Manlab, developing educational interactive games and resources for Immigrate Manitoba. She also launched Meghan PO-MO Project, a sole proprietorship which provided sound reactive visuals for DJs and venues across Canada. In 2009, Meghan was contracted as the User Experience Designer at Tipping Canoe, a multinational internet marketing company.

In 2010, Meghan formed PO-MO Inc. in partnership with Curtis Wachs. She began working exclusively for the company in December, 2010. Today, Meghan is the driving force behind PO-MO Inc."


Curtis Wachs – Technical Director/COO, PO-MO Inc.
"Curtis graduated from Assiniboine Community College in 2003 where he studied object oriented programming. Directly upon graduating, Curtis was hired by Assiniboine Community College to help design and develop software for online classes. Curtis relocated to Winnipeg in 2006 to create interactive training material for sales staff at E.H. Price. During the course of his work, Curt was apprenticed in 3D modelling and animation by Liem Ngyuen, a former Frantic Films resident. In 2008, Curtis joined Manlab, where he created online educational games for Travel Manitoba, Immigrate Manitoba, and other clients. In 2010, Curtis formally joined PO-MO Project, and the company became a partnership. In June 2010, PO-MO Inc. was founded.

Curtis is currently the technical director at PO-MO Inc., overseeing the project management and workflow of contracted and R&D development projects."


May 21, 2012

Leap Motion: Low Cost Gesture Control for Your Computer Display

Jessica Vascellaro, of the Wall Street Journal, reports on gesture, motion, and even object control for computers, highlighting the work of Leap Motion and Flutter.




Apparently the Leap Motion sensor is less expensive than Microsoft's Kinect. It can track finger and hand movements down to 1/100 of a millimeter, within an interaction space of about eight cubic feet.


Below is a video from the Leap Motion website:






RELATED
Leap FAQs
Leap Motion Developer Kit Application
Leap Motion: 3D hands-free motion control, unbound
Daniel Terdiman, CNET, 5/20/12
FYI:  Do a search and you'll find many more articles and posts about Leap Motion!

Feb 4, 2012

Razorfish Gesture and Touch Platform for the "Retail Experience"


Razorfish Connected Retail Experience Platform (codename "5D") from Razorfish - Emerging Experiences on Vimeo.


The above video is an overview of the "5D" connected retail experience platform by Razorfish Emerging Experiences. This concept looks like it was designed for me - someone who loves tech and has a high need for hassle-free shopping. Someday I hope I will have the ultimate technology-supported shopping experience : )




RELATED
Razorfish Press Release
Razorfish


SOMEWHAT RELATED 
Previous posts:
Interactive Visual Merchandising
Another close encounter with in-store digital display marketing at Best Buy...
Interactions (ACM) Cover Article - "Proxemic Interactions: The New Ubicomp?" Plus - Close encounters with displays at the airport and JC Penney
Pervasive Retail Part 1: Web UX  Meets Retail CX - Screens Large and Small at the Mall, Revisited
Interactive Displays in Public Spaces
Interactive Display with QR Tag: Close Encounter at the Orlando Airport

Other:
Retail Customer Experience website
Pervasive Retail

GestureTek: Retail Marketing Solutions: Interactive Screen and Window Display Systems for Advertising in Stores, Malls and Shopping Centers
JC Penney Remodel  Interactive Video
Window Shopping Goes High-Tech With Motion-Sensing Interactive Displays
Bridgette Meinhold, Ecouterre, 9/22/11

Dec 15, 2011

Christian Bannister's Interactive Multi-touch and Gesture-based Subcycle Project. Use your hands to shape sound, create, and manipulate music. Wow!!

I came across a link to Christian Bannister's Subcycle Labs website when I was taking a look at the Creative Applications Network website.


If you have an interest in music technology and innovative gesture/multitouch applications, you'll appreciate the details that are shared on the Subcycle website.  In the meantime, take the time to watch a few of Christian's videos.  Enjoy!


Blip Shaper Walkthrough

Blip Shaper Walkthrough from christian bannister on Vimeo.
"a) creating percussive patterns with monome b) shaping the individual sounds that make up the patterns with multitouch gestures c) recording touchscreen gestures as automation d) storing, duplicating and navigation patterns e) recording the resulting audio to a dynamic buffer f) manipulating the buffer with a multitouch cut-up approach g) visualizing everything with dual screens"
Subcycle Walkthrough

Subcycle Blip Shaper from christian bannister on Vimeo.


The following information describing the Blip Shaper is from Christian's Subcycle website/blog:
"For the drum sounds I have Drumaxx running for synthesized sounds and Battery running for sampled sounds.  These are running in parallel so for each voice there is a separate patch running in each VST.  The Parameters are modified with the touchscreen independently but in all cases a single touch gesture on the X-Axis will cross fade between the sampled version of the sound and the synthesized version of the sound.  I love this because I have never seen this before and I can never decide which technique I like better.  The synthesized drums are more malleable and have more interesting parameters to play with but the sampled sounds seem more substantial.  I will post a detailed list of parameters and gestures in the future."   
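The X-axis crossfade Christian describes can be sketched generically as an equal-power crossfade, a common technique for blending two sources at roughly constant perceived loudness. This is an illustrative sketch of the idea, not his actual Max/MSP patch; the function names are mine.

```python
# Generic equal-power crossfade, illustrating the idea of mapping a
# normalized touch X position to a blend of two drum voices.
import math

def crossfade_gains(x):
    """Map x in [0, 1] (e.g. a normalized touch X position) to
    (sampled_gain, synthesized_gain) on an equal-power curve."""
    x = min(max(x, 0.0), 1.0)
    return math.cos(x * math.pi / 2), math.sin(x * math.pi / 2)

def mix(sampled, synthesized, x):
    """Blend two equal-length sample buffers according to position x."""
    g_samp, g_syn = crossfade_gains(x)
    return [g_samp * a + g_syn * b for a, b in zip(sampled, synthesized)]

# At x=0 only the sampled voice sounds; at x=1 only the synthesized one.
print(crossfade_gains(0.0))  # (1.0, 0.0)
```

A linear crossfade (gains x and 1-x) would dip in loudness at the midpoint; the cosine/sine curves keep the summed power constant, which is why equal-power fades are the usual choice for blending two live sources.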


Christian currently uses Max for Live (Ableton), and codes with Processing/Java, using Eclipse for organization.

Autopilot - Subcycle
autopilot - subcycle labs from christian bannister on Vimeo.
"sound visualization, multi-touch interface, break beat performance engine, autopilot, spacialized sound, dsp, max/MSP, fm synthesis, sonic navigation, sound storm visualization, time machine, granular, interactive sound sculpture, joystick array, more at subcycle.org"


SOMEWHAT RELATED 
Community Core Vision
NUI Group
BTW, I'm wondering if Christian or other NUI Group members would like to do something with some of my ideas that incorporate interactive music. To learn more, see my post, "It must be Voronoi: Looking for ideas for my music+art+dance+tech dream...."

Dec 13, 2011

Kinect in Education! (kinectEDucation)

Although I'm currently exploring the world of interactive HTML5, interactive video, etc., I think I just might make "kinecteducation" the focus of my tech hobbies. I have some experience with game programming (one of my computer courses required a project using XNA) and I know quite a bit about gesture and multi-touch, multi-user interaction, so it wouldn't be too much of a stretch.


My motivation?

As a school psychologist, my main assignment is a school/program for students with disabilities, including about 40 or so who have autism spectrum disorders. Yesterday, the principal of the school attended a demonstration of the Kinect and requested that our school be considered for piloting it. One of my other assignments is a magnet high school for technology and the arts, and rumor has it that it will be offering a game programming curriculum.  I'd love to co-sponsor an after-school game club and encourage the students to program educational apps for the Kinect sometime in the near future! 


I'm also working as a client, in collaboration with some of my educator colleagues, with a team of university students who are creating a communication/social skills game suite geared for students with autism and related disabilities....


I'm inspired by the possibilities!


We have large SMARTboards in each classroom and in other locations around the building, and we have a Wii set up in the large therapy room adjacent to my office. The Wii has proven to be very useful in helping the students develop social and leisure skills that they can use in and outside of the school settings, but some of the students have difficulty manipulating the buttons on the controllers.


You can get Kinect-based apps from the Kinect Education website! Below are selected links from the website:

You can also get additional information from the Microsoft in Education "Kinect in the Classroom" website.

Below are a few videos to give you an overview of how open-source applications designed for the Kinect can be used in education: 






Nov 28, 2011

FlatFrog Multitouch Videos: Point Separation, Multi-input, Multi-user input

FlatFrog Multitouch is a company based in Sweden, founded by Ola Wassvik and Christer Fåhraeus. The technology supports 20+ simultaneous touches and recognizes object size, a useful feature. FlatFrog screens can be optimized for a wide range of light conditions. FlatFrog's multi-touch and gesture interaction is featured in the short video clips below.


FlatFrog is gearing up for commercial release. According to the FAQ on the website, "all sizes are possible, from 5" to 100" and upward." Promethean is one of the company's investors. There is a volume manufacturing agreement with Kortek Corporation, known for industrial and gaming displays.




Thanks to Touch User Interface for sharing this information! (Touch User Interface is the blog of Sensible UI, known for the ArduMT, aka the Arduino Multi-touch Development Kit)

Nov 13, 2011

Is the answer Voronoi? Looking for possible solutions to an art+dance+music+tech idea from a recurring dream....

If you are a long-time reader of this blog, you probably know that I sometimes have some unusual dreams about technology.  I don't blog about my dreams very often, but last night, I had another technology dream, a continuation of a dream I had one night last week.  
Voronoi Diagram (Wikipedia)

I'm pretty sure that the last two dreams were sparked by playing with an online interactive demonstration of a Voronoi application before going to sleep one night, and also by reading an article about "extracting ordered patterns from a triangular mesh surface" in the November/December issue of IEEE Potentials magazine before turning in last night.
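For readers new to the concept: a Voronoi diagram partitions space so that every point belongs to its nearest "seed" site, and the interactive demos linked throughout this post all boil down to that nearest-neighbor test. Here is a minimal brute-force sketch in Python (the grid size and seed coordinates are purely illustrative; real-time demos typically use faster methods such as Fortune's sweep-line algorithm):

```python
def voronoi_cells(seeds, width, height):
    """Brute-force Voronoi: label each grid point with the index
    of its nearest seed, by squared Euclidean distance."""
    def nearest(px, py):
        return min(range(len(seeds)),
                   key=lambda i: (seeds[i][0] - px) ** 2 +
                                 (seeds[i][1] - py) ** 2)
    # grid[y][x] holds the winning seed index for point (x, y)
    return [[nearest(x, y) for x in range(width)] for y in range(height)]
```

Coloring each grid point by its label paints the familiar cell pattern; animating the seeds (say, tracking dancers or touch points) makes the cells flow and morph.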

The dance probably was influenced by my recent viewing of the North Carolina Dance Theater's performance of Innovative Works with my mom, someone who encouraged my study of music, art, and dance at an early age.

Some of my tech dreams are sort of...practical. For example, in one recurring dream, I find myself coding for a flexible mesh/grid application. Sometimes the mesh/grid has something to do with wireless sensor networks on curved terrain, perhaps related to something like the Smart Grid, and sometimes I find myself working on an application that analyzes streaming data from a variety of sources, for security prediction purposes.  At other times, I'm coding for something more artistic, my preference.

Last night, my dream focused on creating a flexible mesh fabric that was used in a multimedia dance/graphic arts/music performance. I was coding for this performance using a Voronoi-like algorithm.


This is the best I can do to explain this: The fabric is carried by the dancers, and is both reactive and generative. In essence, the fabric is intertwined/embedded in the dance, the music, and the graphics. In my dream, everything looked/sounded/felt awesome and otherworldly, and the music that merged and morphed during the dance was so beautiful, not only the melodies, but the sounds.  (In a previous dream, the mesh contained a "nanotechnology" component, but I'll save that quest for the future.)


I thought I'd look at some of my web bookmarks and search a bit more for information related to this topic. For now, here is the "brain dump". I have more to add to this post, and plan to port it to a reference page for this blog in the future. I hope that this post will be useful to some of my art/music/dance/tech readers!

RELATED AND SOMEWHAT RELATED


Update: Right after I uploaded this post, I came across a link to a WebGL demo for a 3D music video of pop singer Ellie Goulding's song, 'Lights', by HelloJoy. Visitors to the webpage can click to interact with the environment. If you keep the button pressed, you fly faster. If you tweet the link, you'll see your name crop up as you fly around in the soundspace. For more information about the making of 'Lights', take a look at Behind the scenes of 'Lights': the latest WebGL sensation! (Carlos Ulloa, 11/9/11)


After I watched the 'Lights' video, I recalled Radiohead's 'House of Cards' video, which I wrote about back in 2008: 
 "We were rolling computers all day"...The Making of Radioheads House of Cards using imaging and info visualization software.   The process behind the making of the House of Cards video was described in detail in Chapter 10 of the book, Beautiful Data.

Bradley, E., Capps, D., Luftig, J, & Stuart, J.M. Toward Stylistic Consonance in Human Movement Synthesis.(pdf)  The Open Artificial Intelligence Journal, 2010, 4, 1-19
Bradley, E., Stuart, J.  Using Chaos to Generate Variations on Movement Sequences (pdf) Chaos, 8:800-807 (1998)
Bradley, E., Stuart, J.  Learning the Grammar of Dance.(pdf)  Proceedings Fifteenth International Conference on Machine Learning, Madison, WI, 1998
E. Bradley, D. Capps, and A. Rubin, "Can computers learn to dance?," Proceedings International Dance & Technology (IDAT), Tempe AZ, Feb 1999.
Chaotic Dance: Using mathematics to generate choreographic variations
Schedl, M., Hoglinger, C., Knees, P. Large-Scale Music Exploration in Hierarchically Organized Landscapes Using Prototypicality Information (pdf)
Fournel, N. Procedural Audio for Video Games: Are we there yet? (pdf) GDC 2010


Voronoi Cells, created by Nathan Nifong.  A version of this interactive work was used in a DanceDraw performance
Patterns in the Noise (Nathan Nifong's site - FYI, Nathan worked with Celine Latulipe on the Dance.Draw project while completing his bachelor's degree in computer science at UNC-Charlotte)
Voronoi Dance (Christian Gross, using openFrameworks)
Voronoi art: Slow Trip (Oktalist/Mat)


The above video, by Mat/Oktal, was inspired by his viewing of Thomas Ruff's Substrat images. 
Scott Snibbe Studio (Interactive art, music, and animation for iPhone, iPad, iPod, and Mac)

Interactive Voronoi Diagram Generator with WebGL (Alex Beutel)
 
The above video was found in Alex Beutel's blog post, "Interactive Voronoi Diagrams with WebGL"

Posts about DanceDraw and related work at UNC-Charlotte:
News from the HCI lab at UNC-Charlotte- Creative Interactions (Videos)
Exploring the Design Space in Technology-Augmented Dance at CHI 2010:  Celine Latulipe's team from UNC-Charlotte
Interactive Surveillance:  Live digital art installation by Annabel Manning and Celine Latulipe

SIGCHI Digital Arts and Interaction Community: Building Bridges
The Interdisciplinary World of Dance and Interactive Technology

HTML5Voronoi  (HTML5Code website)

HTML5 Voronoi, Live Version
Update to code to compute Voronoi diagrams (Raymond Hill, 5/22/11)
William Forsythe's "Synchronous Objects-One Flat Thing, Reproduced" - Multidisciplinary online interactive project: Translating choreography into new forms.
BCS HCI 2011 Workshop: When Words Fail:  What can Music Interaction tell us about HCI?
Woven Sound (Alex McLean)
Real DJs Code Live (Robert Andrews, Wired, 7/3/06)
Visualization of Live Code (Alex McLean)
Voronoi diagrams of music (pdf)  (Alex McLean, 2006)
WebGL
Sylvester: Vector and Matrix Math for JavaScript
Generative Art Links (Mikael Hvidtfeldt Christensen)
Schacher, J.C. Motion to Gesture to Sound:  Mapping For Interactive Dance (pdf) Proceedings of the 2010 Conference on New Interfaces for Musical Expression (NIME 2010), Sydney, Australia
Code & Form:  Computational Aesthetics (Marius Watz)
Werghi, N. Extracting ordered patterns from a triangular mesh surface.  IEEE Potentials, Nov/Dec 2011
Last night I dreamt about haptic touch-screen overlays
Hyun-Seok Kim's 'Voronoi' dragonfly-wing-inspired superyacht

Hyun-Seok Kim's Voronoi Yacht