Showing posts sorted by date for query gesture.

Dec 27, 2009

Touch, Multi-Touch & Gesture Responsive Web & Related Applications (helpful if you have a touch screen or IWB!)

I regularly share information about applications that work well on touch, multi-touch, and/or gesture-based screens.  Over the past few months, there have been updates and new developments that I'm still exploring. (Some of this information might be "old" news, but for many, it will be "new".)

Here's what I have to share today!

Be sure to explore the activities from the Kids section of the National Gallery of Art website, located at the end of this post.

MULTI-TOUCH FIREFOX



Multi-touch on Firefox from Felipe on Vimeo.

Code Snippets from Felipe's Demo (Includes tracking divs, drawing canvas, image resizing, image crop, & pong) Mozilla Wiki
Bringing Multi-touch to Firefox and the Web
Christopher Blizzard, Mozilla Hacks
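If you'd like to play with the math behind demos like Felipe's image resizing, here's a quick sketch (in Python, just to show the geometry - the browser demos themselves are JavaScript) of how the start and current positions of two touch points translate into a scale and a rotation:

```python
import math

def pinch_transform(p0, p1, q0, q1):
    """Given the start (p0, p1) and current (q0, q1) positions of two
    touch points, return the (scale, rotation_degrees) the pinch
    implies -- the math behind multi-touch image resizing."""
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])
    def angle(a, b):
        return math.degrees(math.atan2(b[1] - a[1], b[0] - a[0]))
    scale = dist(q0, q1) / dist(p0, p1)
    rotation = angle(q0, q1) - angle(p0, p1)
    return scale, rotation

# Two fingers move apart to twice their original distance: scale doubles.
print(pinch_transform((0, 0), (100, 0), (0, 0), (200, 0)))  # (2.0, 0.0)
```

Apply the resulting scale and rotation to the image on every touch-move event and you have the familiar two-finger resize.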

COOL IRIS
I have a hunch that someone out there is working on a multi-touch version of Cool Iris. Until I can find out the details, take a look at the videos below:


Cool Iris Overview on Google Chrome


 Here is a short video of what Cool Iris looks like on an iPhone:



Cool Iris Links
Cool Iris and iPhone
Cool Iris and Developers
Cool Iris Blog
Cool Iris Media/Press


About Cool Iris: "Cooliris, Inc. was founded in January 2006 with a simple mantra: "Think beyond the browser". We focus on creating products that make discovering and enjoying the Web more exciting, efficient, and personal. Our core products include Cooliris (formerly PicLens), which transforms your browser into an interactive, full-screen "cinematic" experience for web media, and CoolPreviews, which lets you preview links instantly. Headquartered in Palo Alto, CA, our team consists of seasoned developers, entrepreneurs, and Stanford computer engineers. Each of us is passionate about serving our users without compromise and seeing that our products deliver the best experience."


BUMPTOP
Bumptop Gets Multi-touch Support on Windows 7


Bumptop Website
You can download Bumptop from the Bumptop website.  Here's the description:
"BumpTop is a fun, intuitive 3D desktop that keeps you organized and makes you more productive.  Like a real desk, but better.  Now with awesome mouse and multi-touch gestures!"
Anand Agarawala's TED Talk

"Anand Agarawala presents BumpTop, a user interface that takes the usual desktop metaphor to a glorious, 3-D extreme, transforming file navigation into a freewheeling playground of crumpled documents and clipping-covered "walls.""

Discussion about Bumptop on the TED website

NATIONAL GALLERY OF ART KIDS
I've been compiling a list of websites that offer good touch interaction. One site that is good for children - and children at heart - is the National Gallery of Art's Kids pages. Here are a few activities that I had fun exploring with students on the new SMARTboards at one of my schools:

FACES AND PLACES - LANDSCAPE

interactive landscapes


"FACES & PLACES encourages children of all ages to create portraits and landscape paintings in the style of American naive artists. By combining visual elements borrowed from more than 100 works in the National Gallery's permanent collection, this two-part interactive activity offers an overview of American folk art of the 18th and 19th centuries. (Shockwave, 6 MB)"


This one is so fun!  You can select different characters and make them dance, run, jump, or even fall.  You can design the landscape and add buildings, trees, and animals, and even change the sky pattern.  Press "go", and your character will travel around the panorama you've created.


DUTCH DOLL HOUSE

Dutch Dollhouse  (Shockwave, 4.6 MB)
"Mix and match colorful characters, create decorative objects, and explore the kitchen, living quarters, artist's studio, and courtyard of this interactive 17th-century Dutch House."


NGA KIDS JUNGLE
Jungle interactive


"Create a tropical jungle filled with tigers, monkeys, and other exotic creatures. Inspired by the art of Henri Rousseau, NGAkids Jungle is an interactive art activity for kids of all ages. (Shockwave, 930k)"


What I liked about the Jungle application is that each item can be easily customized.  On the SMARTBoard, as well as on my HP TouchSmart PC, it is very easy for a student who has limited fine-motor control to create beautiful pictures.


FLOW
"Flow is a motion painting machine for children of all ages. Enjoy watching the changing patterns and colors as you mix pictures on two overlapping layers. Choose  designs from four sets of menu icons, or add to the flow by clicking the pencil tool to create your own designs."


This application is a favorite of some of the students I work with who enjoy watching things spin. (You don't have to have an autism spectrum disorder to enjoy playing with Flow!)

National Gallery of Art Student and Teacher Online Resources

Dec 25, 2009

Flowing Data's Best Visualizations of 2009 (Nathan Yau): "It was a huge year for data"

"Data has been declared sexy, and the rise of the data scientist is here."  -Nathan Yau, Flowing Data
I started following Nathan Yau's Flowing Data blog nearly 2 years ago, when I was in Robert Kosara's information visualization/visual communication class at UNC-C.  What a great blog!  


Here is the short list - visit Flowing Data for the pics and details:
Honorable Mention:  MTV VMA Tweet Tracker
Honorable Mention:  Crisis of Credit Visualized (animation)
5.  Microsoft's Photosynth
4. New York Times' The Jobless Rate for People Like You
3. Open Street Map: A Year of Edits
2. Stanford University's  Protovis
1. Ben Fry's On the Origin of Species: The Preservation of Favored Traces


RELATED
Ben Fry's On the Origin of Species: The Preservation of Favored Traces

Protovis
NYT's Jobless Rate for People Like You
Photosynth Website (Christmas lights featured on main page today 12/25/09  "Mintz Family Christmas")

Open Street Map
Crisis of Credit, Visualized
MTV VMA Tweet Tracker


SOMEWHAT RELATED
Dr. Kosara's EagerEyes site is a great resource to follow - I especially like his link to the History of Visual Communication website.
The Three Sexy Skills of Data Geeks (Michael E. Driscoll Dataspora Blog, 5/27/09)
Michael Driscoll's Book Recommendation:
The Elements of Statistical Learning:  Data Mining, Inference, and Prediction, Second Edition

Stanford University Viz Group


NOTICE
I'm gathering information about good examples of user-friendly multi-user interactive information visualization applications, preferably on multi-touch or gesture-based systems. Leave a comment and a link if you can point me in the right direction!

Dec 20, 2009

CityScape 1.8: What would it be like on an interactive touch or gesture screen? (Video and link to demo)


PixelActive has announced version 1.8 of CityScape, "the rapid urban modeling tool that allows users to build both custom and real-world environments quickly and easily."


CityScape 1.8 looks ideal for use on an interactive touch/gesture display or table.  Since it allows people to work within the same environment concurrently,  I have a hunch that it might work, perhaps with a few tweaks, in multi-touch, multi-user situations.  

As you watch the video below, imagine your finger as the pointer, creating terrain, landforms, buildings, roads, and other structures and spaces, and editing on-the-fly.  

CityScape  "Real Cities, Real Fast" (via Digital Urban & YouTube)
The above video is of version 1.7. To learn about the new features, see the "What's New in CityScape 1.8" section of the Pixelactive website.

Unfortunately, it comes with a high price tag. Even so, wouldn't it be a great tool to use in middle and high school classrooms? I want to get the demo and try it out on a SMARTboard... I can think of quite a few ways that it could be put to work in learning environments.

About CityScape
"CityScape was created to reduce the tedious, labor intensive process of building and modifying urban environments, empowering the interactive world builders of tomorrow."


"Using robust data importing, rapid urban modeling, Active Adjustment features, and procedural modeling, CityScape is able to achieve substantial improvements in worker efficiency over traditional object modeling tools and scene editors.
Worlds created by CityScape are compatible with a variety of file formats and optimized for real-time rendering engines."


Pictures from the Pixelactive website:
CityScape's client-server model allows collaboration between co-located or distributed teams so that the environment can be modified concurrently if needed. It also lets users check regions in and out in order to work on things independently.
Client-Server Model diagram for CityScape.

Terrain Painting: Hills created with the terrain painting tool.
Imported Heightmap: Mountain range created by importing a heightmap.


Dynamic adjustment allows you to quickly change your mind and modify without headaches!
Buildings will keep their relative positions to the roads as roads are dynamically adjusted.


If you change elevations, everything will adapt automatically.
Animation of a road adapting to the change in geography as a mountain is moved under it.
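To make the dynamic-adjustment idea concrete, here's a toy sketch (my own illustration, not PixelActive's actual code): if each building stores its offset from a road anchor point, moving the road simply re-applies the offsets, so buildings keep their relative positions.

```python
def reposition_buildings(old_road, new_road, buildings):
    """Illustrative sketch of dynamic adjustment: when a road anchor
    moves, every attached building is shifted by the same delta so
    its position relative to the road is preserved."""
    dx = new_road[0] - old_road[0]
    dy = new_road[1] - old_road[1]
    return [(bx + dx, by + dy) for (bx, by) in buildings]

# Road anchor moves 10 units east; the buildings follow.
print(reposition_buildings((0, 0), (10, 0), [(2, 5), (-3, 1)]))
# [(12, 5), (7, 1)]
```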


Here's a picture of the traffic data generation feature:
Downtown intersection showing all available traffic splines (colorized according to current snapshot in street-light simulation).


Here is a picture of an example of CityScape's GIS data import feature:
Aerial view of La Jolla city model.


Pixelactive Company Information

"PixelActive Inc. was founded in 2003 and is located in Carlsbad, California. Our mission is to develop state of the art interactive 3D technology. We believe that tools and technology should be intuitive to users regardless of its underlying complexity. Our focus is on the video game, virtual world, simulation, serious game, and GIS industries."


"The company's rapid urban modeling tool, CityScape, allows users to build both custom and real-world environments quickly and easily. With the editor, developers can focus on designing their world rather than the labor involved in creation. CityScape's unique feature set allows users to produce environments magnitudes faster than with traditional object modeling tools or scene editors."


"The PixelActive team has a track record of delivering exceptional products on time and on budget. Our core technology is focused on stability, reliability, modularity, usability, high performance and visual quality. PixelActive develops technology for a variety of PC platforms including Microsoft Windows XP and Windows Vista, and video game consoles such as the Microsoft XBox360™, Nintendo Wii™, and Sony Playstation® 3."

RELATED
You can download a demo of CityScape from the Pixelactive website.

Dec 14, 2009

Missed this one: Microsoft Surface in the Mix Lounge on Holland America Cruise Ships

They didn't have this on the Veendam when I was on the cruise! This video doesn't show anything new or exciting if you are familiar with tabletop/multi-touch/gesture based computing. I'm sure it will be eye opening to the average cruise ship passenger.



Microsoft Surface is New Coolness on Veendam Holland America Blog (5/4/09)


Press Release:
Seattle, Wash., December 7, 2009 —
"As part of its ongoing Signature of Excellence enhancements, Holland America Line will install five Microsoft® Surface® units in Mix, its newest lounge venue with three specialty bars — Martinis, Champagne, and Spirits & Ales. Already in full operation on ms Veendam, Mix and Surface units will be completed on ms Rotterdam by mid-December, followed by three additional ships."


"An exciting new entertainment space, Mix is a destination lounge where guests can meet and mingle throughout the day and night. Mix is an open and inviting space anchored by a pianist providing entertainment for the three distinctly themed specialty bars."


More:  Holland America Line Features Microsoft Surface in New Mix Lounge on Several Ships





Dec 5, 2009

Interactive Mobile Multimedia

A study in January of 2005 by Nokia indicated that there was a demand for interactive mobile multimedia services. That study was conducted about five years ago, before anyone had heard of the iPhone! The technology to support interactive mobile multimedia has come a long way since then, and many of the new applications support multi-touch, or at least dual-touch, interactivity.

I'm very much interested in figuring out how to design web-based interactive content (and apps) that can be optimized for touch (and multi-touch/gesture) screens of various sizes, from SmartPhones/iPhones to the large interactive whiteboards that are now in a multitude of classrooms.

From my experience as a school psychologist, I know that many teens have graduated from the traditional cell phone to the next level. If they don't have an iPhone, they have a smartphone. I don't have the statistics on this, but my personal observations tell me that even teens from families of lower economic status somehow manage to own 3G smartphones.

What a great opportunity to provide casual interactive multimedia educational games to support student learning!  The games and activities could be assigned as homework from time to time, and with the appropriate LMS (Learning Management System), the teacher would have instant access to student progress.  In addition, the students would be provided with immediate feedback about their "work", which we know is an important factor in learning.

It is difficult to figure out the best path to forge, since nearly every week someone announces a new platform, technology, and programming approach!

At any rate, here are a few interesting things related to this topic that I'd like to share.  Many of these concepts are in the experimental phase, but are worth some attention.


SciLor's Open-Source Programs
SciLor's HD2/ Leo Multi-touch Demo v2 12/3/09 using VB.Net and Windows Mobile

SciLor's Comments: 
I have manged it to get Multitouch running in an vb.net app :)
There are still some bugs, which have to be resolved:
-Stop auto alignment!
-Identify the "Touches"



Google I/O 2009: Mastering the Android Media Framework


AT&T Interactive Mobile Website


Satellite-Terrestrial Network Delivering Mobile Video with Interactive Services - ICO mim





RELATED
Adding Multi-Touch to Your Windows Mobile Application's User Interface
Wei-Meng Lee,  DevX.com 3/24/08
Previous Post:  The new 3G iPhone:  Expanding the Possibilities of Interactive Multimedia Communication (Interactive Multimedia Technology,  6/9/08)
Note: I just skimmed the following articles - when I have more time, I'll post more of my reflections related to this set of topics.



Daniel Stewart, Nitya Narasimhan,  Position Paper, CMPPC Workshop, Pervasive 2007
Interactive Mobile Multimedia Needs IP and Circuits
Brough Turner,  Internet Telephony, 9/09
Primetime for Mobile Television:  Extending the entertainment concept by bringing together the best of both worlds (pdf)  IBM Institute for Business Value
A Holistic Approach to Enhance Universal Usability in m-Learning
Vlado Glavinic, Sandi Ljubic, Mihael Kukec, 2008 The Second International Conference on Mobile Ubiquitous Computing, Systems, Services and Technologies
Seamless Mobility: A Continuity of Experiences across Domains, Devices, and Networks (pdf) 2005
Raghu Rau, Senior Vice President of Marketing, Motorola

Ambient Networks:  Cooperative Mobile Networking for the Wireless World
Norbert Niebert, John Wiley and Sons LTD  4/07 
Mobility management challenges and issues in 4G heterogeneous networks (link to ACM pdf)
Sadia Hussain, Zara Hamid and Naveed S. Khattak, InterSense '06.  Proceedings of the first International Conference on Integrated Internet Ad hoc and Sensor Networks
Mobile Multimedia: Tune in to Digital Convergence (pdf)
DVB-Scene (a trade magazine) 3/2008
HP OpenCall Media Plafform: a cost-effective, agile IP media server (pdf)
Whitepaper, Hewlett-Packard Development Company, L.P.  2/2009
A survey of wireless multimedia sensor networks (pdf)
Ian F. Akyildiz, Tommaso Melodia, Kaushik R. Chowdhury, 2006  Science Direct  Elsevier


Mobile Multimedia Companies/R&D
Stantum  (UMPC)
Movidity
MobIME: Mobile Internet for Media and Entertainment


Nov 23, 2009

GestureTek & Sprint's Interactive Wall: 3D depth-sensing allows wall interaction with a cell phone.

I missed this one! The video and photos below are of the Sprint Center Interactive Wall, powered by GestureTek's 3D depth-sensing system. The media art was created by Takashi Kawashima, a designer/media artist who lives in San Francisco. He has an MFA in Design | Media Arts from UCLA.

The interactive display can be controlled by a cell phone.



YouTube description/plug:
"GestureTek's 3D depth sensing technology powers an attention-grabbing interactive digital signage system for telecom leader Sprint. The 3D depth sensing interactive display screen, with mobile phone connectivity, tracks people's body movements, and responds by sending a Sprint promotional message that follows them the entire length of the interactive billboard. The interactive motion-detecting advertising message invites users to create their own personalized interactive wall art on Sprint's gesture control screen, by calling Sprint on their mobile phone. GestureTek's 3D tracker is the heart of the system. Installation lead: Mission Electronics. Creative: Goodby Silverstein."

The Instant DJ application looks fun!  It allows you to mix the music tracks on the large display with your phone.


Phone Painter:  Sprint Center Interactive Wall



Instant DJ



Now Widget


RELATED
Sprint Uses GestureTek 3D Tracking & Control System for New Interactive Digital Signage Campaign
GestureTek Announces 3D Gesture Tracking Initiatives for Sprint and Hitachi; Shares New 3D Patent Information

SOMEWHAT RELATED
GestureFX:  Next Generation Pediatrics Business Case (interactive floor for a pediatric clinic's waiting room)
AirPoint Hand-Tracking Unit for Mouse Replacement and "Point to Control" Interactivity

Cross posted on The World Is My Interface blog

Morning Tech News: LED "Tattoos"; Sixth Sense Wearable Displays

Since I am usually crunched for time, I thought I'd try posting "morning tech news" on this blog in a brief format, and return to the topic later - hopefully later in the day or at the most, within the week.

If you are familiar with this blog, what I consider "news" is sometimes new to me. It might be something that crossed my path a while ago that I never posted. It might be something that I missed. It doesn't even have to be "news", if it is something that is unique, catches my fancy, or is an important innovation that I think should be followed and shared.

Today's news I caught from Wired, which linked to an article in MIT's Technology Review, "Implantable Silicon-Silk Electronics: Biodegradable circuits could enable better neural interfaces and LED tattoos", written by Katherine Bourzac.

"By building thin, flexible silicon electronics on silk substrates, researchers have made electronics that almost completely dissolve inside the body. So far the research group has demonstrated arrays of transistors made on thin films of silk. While electronics must usually be encased to protect them from the body, these electronics don't need protection, and the silk means the electronics conform to biological tissue. The silk melts away over time and the thin silicon circuits left behind don't cause irritation because they are just nanometers thick."


RELATED
WIRED's Gadget Lab: The Illustrated Man: How LED Tattoos Could Make Your Skin a Screen, Charlie Sorrel, 11/20/09
"The silk substrate onto which the chips are mounted eventually dissolves away inside the body, leaving just the electronics behind. The silicon chips are around the length of a small grain of rice — about 1 millimeter, and just 250 nanometers thick. The sheet of silk will keep them in place, molding to the shape of the skin when saline solution is added.
These displays could be hooked up to any kind of electronic device, also inside the body. Medical uses are being explored, from blood-sugar sensors that show their readouts on the skin itself to neurodevices that tie into the body’s nervous system — hooking chips to particular nerves to control a prosthetic hand, for example."
Tattoo You: Silicon LEDs can act as photonic tattoos that can show blood-sugar readings
-Surfdaddy Orca, h+ Magazine 11/17/09


"Brian Litt, associate professor of neurology and bioengineering at the University of Pennsylvania, is working with researchers from Beckman Institute at the University of Illinois and Tufts University to develop medical applications for the new transistors. Their silk-silicon LEDs can act as photonic tattoos that can show blood-sugar readings, as well as arrays of conformable electrodes that might interface with the nervous system."


Litt Lab : Translational NeuroEngineering 
(Brian Litt's lab.)


SOMEWHAT RELATED
I've been thinking about flexible touch-screen applications, and it never occurred to me that the concept might be something that would transfer to human skin!  Here are a few of my posts related to this topic:

Last night I dreamt about haptic touch-screen overlays...
Rhizome 2009:  A Lovely Interactive Multi-touch App on a Flexible Lycra Screen
Impress:  A cool flexible interface project  by Silke Hilsing


More about this "somewhat related topic" to come: 
Latest SixthSense demo features paper "laptop" camera gestures
Nilay Patel, Engadget  11/18/09
Adding a "SixthSense" to your Cellphone
Vikas Bajaj, Bits, New York Times 11/6/09
Pattie Maes TED Talk:  Sixth Sense-  Mobile Wearable Interface and Gesture Interaction (for the price of a cell phone!) - my post from 3/2009

Nov 19, 2009

Multi-touch & Gesture Interaction News: NUITEQ's Snowflake Suite 1.7 compatible with Windows 7 and 3M Touch Systems, N-trig and Lumio



"This video demonstrates the N-trig DuoSense true multi-touch solution utilizing up to four fingers. The video features various multi-touch enabled applications, including how to pan and rotate using up to four fingers on Google Earth, a demonstration of how to play various onscreen musical instruments using the Snowflake Suite Music application, and a new hands-on way to play Sudoku. The Corel Paint it!™ application shows how existing images can be transformed using multi-touch, and a 3D desktop organizer application from BumpTop demonstrates new and innovative ways in which to organize your desktop using up to four fingers"   -avitaintrig's YouTube description

Snowflake Suite and NextWindow Plugin Information
NUITEQ in the media

3M Touch Systems

N-trig

Lumio

Bumptop

NextWindow
(SnowFlake Suite 1.7 works on NextWindow screens.)

I'll post more news and information about the natural interface/interaction biz very soon!

Oct 31, 2009

Sensory-Minds' Ring Wall, an interactive multi-touch wall you don't even have to touch!

Sensory-Minds is a small company in Germany that is focused on research and design in the field of Natural User Interfaces. If you visit the SENSORY-MINDS website, you'll find that it has been designed for touch interaction.



ring°wall from SENSORY-MINDS on Vimeo.

Information about the Ring Wall from Sensory-Minds' Vimeo site:

"...The two-piece ring°wall consists of a LED display and a multitouch information-wall and impresses by its size: a total surface of 425 square meters, which equals more than 6000 computer displays, is the biggest of its kind. An interactive World emerges out of 34 million pixels generated by 15 high definition projectors and is supported by 30 directional speakers.


Multitouch sensors basing on laser technology, convert the usage of the natural user interface into an experience. By direct touching, more than 80 users can simultaneously get informed about news and activities around the ringworld.


The interactive wall is not only a central information system, but also an innovative advertising tool and medium for public viewings."

Heiko Hoffman of Sensory-Minds recently joined the NUI-Group.  Here is a response to a question on the NUI-group forum about the way the system's sensors work:


"The sensor works like spinning radar gun, or like a wiper in a car, that means like a radar gun you get the distance to the object.  It’s not quiet difficult when you know the position of the sensor and the degree from the beam to get the X,Y position.  Yes, you don’t need to touch the surface but we arranged it that the radar beam is very close to the screen so it seems that you have to touch the surface.

At the moment the system (has) limitations because each sensor (gets) coordinates and this means that you got shadows. This is no problem because the people interact very fast. We are working on a system where the data from the sensors were put together and that would be the solution for this problem."
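In other words, each reading from the spinning sensor is a beam angle plus a measured distance, and recovering the touch position is just polar-to-Cartesian math. A little sketch (the sensor position and the angle convention here are my assumptions, not Sensory-Minds' actual code):

```python
import math

def beam_to_xy(sensor_x, sensor_y, angle_deg, distance):
    """Convert one laser/radar reading (beam angle + distance) into an
    X,Y position on the wall, as described in the NUI-group answer.
    Assumes 0 degrees points along the +X axis."""
    rad = math.radians(angle_deg)
    return (sensor_x + distance * math.cos(rad),
            sensor_y + distance * math.sin(rad))

# A beam at 90 degrees hitting a hand 100 units away:
x, y = beam_to_xy(0, 0, 90, 100)
print(round(x, 6), round(y, 6))  # 0.0 100.0
```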



Note:

I'll be writing a few more posts updating the work of past and present NUI-group members as well as news from the commercial multi-touch & gesture community.  Be sure to check back soon, or subscribe to this blog!

Oct 28, 2009

libTISCH, a multi-touch development framework with multi-touch widgets and more!

For techies and the tech-curious who like technologies that support collaboration and multi-touch interaction,  this is great news!

Florian Echtler announced the first stable release of libTISCH, a multi-touch development framework, which can be found on Sourceforge. TISCH stands for Tangible Interaction Surfaces for Collaboration between Humans. libTISCH, a C++ software framework, is included in this project. It provides a means for creating GUIs based on multi-touch and/or tangible input devices.

Here is how it works:

Architecture Layers

Here is information from the libTISCH announcement:


Highlights of this release are, among others, the following features:

- ready-to-use multitouch widgets based on OpenGL
- reconfigurable, hardware-independent gesture recognition engine
- support for widely used, pre-defined gestures (move, scale, rotate...) as well as custom-defined gestures
- hardware drivers for FTIR, DI, Wiimote, DiamondTouch...
- TUIO converters: source and sink
- cross-platform: Linux, MacOS X, Windows (32 and 64 bit)
- cross-language: C++ with bindings for C#, Java, Python

libTISCH has a lot to offer for the multitouch developer. For example, the textured widgets enable rapid development of applications for many kinds of multi-touch or tangible interfaces. The separate gesture recognition engine allows the translation of a wide range of highly configurable gestures into pre-defined or custom events which are then acted on by the widgets. While the lower layers of libTISCH provide functionality similar to tbeta, touche etc. (you can interface existing TUIO-based software with libTISCH in both directions), it goes far beyond.
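I haven't used libTISCH's actual C++ API, but the widget-plus-gesture-engine idea it describes can be sketched in a few lines of Python: the recognition engine turns raw input into a named gesture event, and a widget that registered a handler for that gesture reacts to it.

```python
class GestureWidget:
    """Illustrative sketch of the layered idea (not libTISCH's real
    API): a widget registers handlers for named gestures, and the
    recognition engine dispatches recognized gestures as events."""
    def __init__(self):
        self.handlers = {}
        self.scale = 1.0
    def on(self, gesture_name, handler):
        # Register a callback for a named, pre- or custom-defined gesture.
        self.handlers[gesture_name] = handler
    def dispatch(self, gesture_name, **params):
        # Called by the engine once a gesture has been recognized.
        if gesture_name in self.handlers:
            self.handlers[gesture_name](**params)

w = GestureWidget()
w.on("scale", lambda factor: setattr(w, "scale", w.scale * factor))
w.dispatch("scale", factor=2.0)   # a recognized two-finger pinch
print(w.scale)  # 2.0
```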

More information about the library and underlying architecture can be found on http://tisch.sf.net/ and in the Sourceforge wiki at http://sourceforge.net/apps/mediawiki/tisch/


TISCH Project Wiki

RELATED
Florian is on the scientific staff at the Technische Universität München in Germany. Be sure to check out his webpage.

I especially like the concept of the MeTaTop: "A Multi-Sensory Table Top System for Medical Procedures" that is linked from Florian's website.
MeTaTop A Multi Sensory Table Top System for Medical Procedures

Oct 25, 2009

Interactive multi-touch for sound design, dj-ing, and music creation

NUI-Group member Christian Bannister, a musician, designer, and developer behind Subcycle Labs, has been experimenting with music and multi-touch technology and interactions. Here's a demonstration of what he's come up with so far:


multi-touch the storm - interactive sound visuals - subcycle labs from christian bannister on Vimeo.
"The big picture goal of this project is to bridge the gap between sound visualization and musical instrument. With multi-touch interaction it is possible to manipulate multiple characteristics of a sound—directly and visually. Right now a lot of electronic music involves staring at the back of the performers laptop. This is a shame because in many cases a lot of really interesting things are happening on the computer that the audience is completely unaware of. This project hopes to create a common visual language and experience for the electronic musician and the audience by enhancing the perception of sound and music on both sides...These sketches are built with Processing and Max/MSP networked with OSC on a single computer..." -Christian Bannister
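If you're curious about the OSC plumbing that networks Processing and Max/MSP in Christian's sketches, an OSC message is simple enough to assemble by hand. A minimal sketch of the OSC 1.0 binary format (the address "/filter/cutoff" is just a made-up example):

```python
import struct

def osc_message(address, *floats):
    """Build a bare OSC 1.0 message: a null-padded address string,
    a type-tag string (',' plus one 'f' per argument), then each
    argument as a 32-bit big-endian float."""
    def pad(b):
        # OSC strings are null-terminated and padded to 4-byte chunks.
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(floats)).encode("ascii"))
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

m = osc_message("/filter/cutoff", 440.0)
print(len(m) % 4 == 0, m[:14])  # True b'/filter/cutoff'
```

In practice you'd hand this byte string to a UDP socket; libraries on both the Processing and Max sides do exactly this under the hood.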



JazzMutant Lemur
The multi-touch tech company now known as Stantum evolved from JazzMutant. JazzMutant was founded in 2002 by Guillaume Largillier and Pascal Joguet, who were joined by Julien Olivier in 2003. The original focus was to create a multi-touch controller for music applications, and the Lemur was born. It is now in version 2, with features such as a gesture object that provides three ways for people to interact with sound, extended scripting abilities, and remote control of your computer's mouse cursor or keyboard. Stantum recently developed a next-gen multi-touch screen system for use in mobile devices such as smart phones and netbooks. (I'll discuss this further in a future post.)



Mapping Ableton Live to Jazz Mutant's multi-touch Lemur for sound design:


Additional tutorials can be found on the JazzMutant YouTube channel.


Ableton Live, the software used in the above video, will include Max. Max/MSP is now known as Max 5:
"In use for over twenty years by performers, composers, artists, scientists, teachers, and students, Max is the way to make your computer do things that reflect your individual ideas and dreams. Version 5 is a dramatic transformation of Max that begins a new chapter in the program's history."

RELATED
Video:  Max for Live
Max is a product of cycling74
Cycling74 created the Make Controller Kit, which includes fully programmable controllers. The kit is network-based. It is capable of working with actuators and can read sensor information into Max.
Jamoma, a platform for interactive art-based research and performance. Jamoma is the prototyping environment for SpatDIF, the Spatial Sound Description Interchange Format, and GDIF, the Gesture Description Interchange Format.
GDIF: Gesture Description Interchange Format, a tool for music related movements, actions, and gestures 
Stantum's Mobile Phone Multi-touch Interface:  Demonstration of precise interactions on a resistive touch screen

How the Stantum's Resistive Multi-Touch Screen Works
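I don't know the details of Stantum's firmware, but the basic idea behind a matrix resistive screen - scan every row/column intersection and treat local pressure maxima above a threshold as touch points - can be sketched like this (my illustration, not their actual algorithm):

```python
def find_touches(grid, threshold=0.5):
    """Scan a grid of measured pressures (one value per row/column
    intersection) and report cells that cross the threshold and are
    local maxima -- each one a separate simultaneous touch."""
    touches = []
    rows, cols = len(grid), len(grid[0])
    for r in range(rows):
        for c in range(cols):
            v = grid[r][c]
            if v < threshold:
                continue
            neighbors = [grid[rr][cc]
                         for rr in range(max(0, r - 1), min(rows, r + 2))
                         for cc in range(max(0, c - 1), min(cols, c + 2))
                         if (rr, cc) != (r, c)]
            if all(v >= n for n in neighbors):
                touches.append((r, c))
    return touches

grid = [[0.0, 0.1, 0.0, 0.0],
        [0.1, 0.9, 0.1, 0.0],
        [0.0, 0.1, 0.0, 0.7],
        [0.0, 0.0, 0.1, 0.1]]
print(find_touches(grid))  # [(1, 1), (2, 3)]
```

Because every intersection is measured independently, a matrix scan like this can report many precise, unambiguous touch points at once - which is what makes resistive multi-touch attractive for small screens.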


GDIF: Gesture Description Interchange Format, a tool for music-related movements, actions, and gestures.

There has been a flurry of work in the computer music technology world that relates to what has been going on with interactive display technology, multi-touch & gesture interaction. I came across a link to the GDIF website when I was searching for information about interactive music and the use of multi-touch technologies for a future blog post.   

So what is GDIF? Gesture Description Interchange Format.

"The Gesture Description Interchange Format (GDIF) is being developed as a tool for streaming and storing data of music-related movements, actions, and gestures. Current general purpose formats developed within the motion capture industry and biomechanical community (e.g. C3D) focus mainly on describing low-level motion of body joints. We are more interested in describing gesture qualities, performer-instrument relationships, and movement-sound relationships in a coherent and consistent way. A common format will simplify working with different software, platforms and devices, and allow for sharing data between institutions."  (The Jamoma environment is used to prototype GDIF.)
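To make the "layered description" idea concrete, here's a toy sketch of what one frame of multi-level gesture data might look like - the field names below are mine, invented for illustration, not GDIF's actual namespace:

```python
# One moment of a performance, described at several abstraction levels:
# raw sensor data, body/joint data, and qualitative gesture data.
frame = {
    "time": 12.84,                                   # seconds into recording
    "raw":     {"accel": (0.1, -9.7, 0.3)},          # device-level data
    "body":    {"right_wrist": (0.42, 1.31, 0.08)},  # joint position (meters)
    "gesture": {"type": "strike", "energy": 0.8},    # qualitative layer
}

def layers(frame):
    """List which description layers a frame carries."""
    return [k for k in frame if k != "time"]

print(layers(frame))  # ['raw', 'body', 'gesture']
```

The point of a common format is exactly this: software that only cares about gesture qualities can ignore the raw layer, while motion-capture tools can ignore the qualitative one, and both can still exchange the same files.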


Alexander Refsum Jensenius initiated the GDIF project. He has written a variety of articles about music, gestures, movement, and emerging technologies.


Here's Alexander's bio: "Alexander (BA, MA, MSc, PhD) is a music researcher and research musician working in the fields of embodied music cognition and new interfaces for musical expression (NIME) at the University of Oslo and at the Norwegian Academy of Music. He studied informatics, mathematics, musicology, music performance and music technology at UiO, Chalmers, UC Berkeley and McGill. Alexander is active in the international computer music community through a number of collaborative projects, and as the initiator of GDIF. He performs on keyboard instruments and live electronics in various constellations, including the Oslo Laptop Orchestra (OLO)."




Related Publications
Godoy, R. I., E. Haga, and A. R. Jensenius (2006). Playing 'air instruments': Mimicry of sound-producing gestures by novices and experts. In S. Gibet, N. Courty, and J.-F. Kamp (Eds.), Gesture in Human-Computer Interaction and Simulation, GW 2005, Volume LNAI 3881, pp. 256–267. Berlin: Springer-Verlag.
Jensenius, A. R. (2009). Motion capture studies of action-sound couplings in sonic interaction. STSM COST Action SID report. fourMs lab, University of Oslo.
Jensenius, A. R. (2007). Action - Sound: Developing Methods and Tools to Study Music-related Body Movement. PhD thesis. Department of Musicology, University of Oslo, Norway.
Jensenius, A. R., K. Nymoen, and R. I. Godoy (2008). A Multilayered GDIF-Based Setup for Studying Coarticulation in the Movements of Musicians. Proceedings of the International Computer Music Conference, 24-29 August 2008, Belfast.
Jensenius, A. R., T. Kvifte, and R. I. Godoy (2006). Towards a gesture description interchange format. In N. Schnell, F. Bevilacqua, M. Lyons, and A. Tanaka (Eds.), NIME '06: Proceedings of the 2006 International Conference on New Interfaces for Musical Expression, pp. 176–179. Paris: IRCAM - Centre Pompidou.
Kvifte, T. and A. R. Jensenius (2006). Towards a coherent terminology and model of instrument description and design. In N. Schnell, F. Bevilacqua, M. Lyons, and A. Tanaka (Eds.), Proceedings of New Interfaces for Musical Expression, NIME 06, IRCAM - Centre Pompidou, Paris, France, June 4-8, pp. 220–225. Paris: IRCAM - Centre Pompidou. [PDF]
Marshall, M. T., N. Peters, A. R. Jensenius, J. Boissinot, M. M. Wanderley, and J. Braasch (2006). On the development of a system for gesture control of spatialization. In Proceedings of the 2006 International Computer Music Conference, 6-11 November, New Orleans. [PDF]

RELATED
"Sonic Interaction Design is the exploitation of sound as one of the principal channels conveying information, meaning, and aesthetic/emotional qualities in interactive contexts."
SID Action has four working groups:
WG1: Perceptual, cognitive, and emotional study of sonic interactions
WG2: Product sound design
WG3: Interactive art and music
WG4: Sonification



"SoundHack was my main thing for a long time, and I poured a lot of effort into it. It was the place I put my ideas. I did have something of a mission with SoundHack. I wanted to take some computer music techniques that were only used in academia, and get them out there so that all types of musicians could use them." -Tom Erbe, SoundHack Spectral Shapers


Csound Blog "Old School Computer Music"
"Csound is a sound and music synthesis system, providing facilities for composition and performance over a wide range of platforms. It is not restricted to any style of music, having been used for many years in the creation of classical, pop, techno, ambient, experimental, and (of course) computer music, as well as music for film and television."-Csound on Sourceforge
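Csound renders audio from a text "orchestra" (instrument definitions) and a "score" (a list of notes with start times, durations, and parameters). To give a flavor of that score-driven, render-to-file workflow without installing Csound, here is a small Python sketch; this is not Csound code, and the score values are made up for illustration.

```python
import math
import struct
import wave

# A tiny "score": (start_sec, dur_sec, freq_hz, amplitude 0..1),
# loosely echoing Csound's note-list approach.
score = [(0.0, 0.5, 440.0, 0.5), (0.5, 0.5, 660.0, 0.4)]

RATE = 44100
total = max(start + dur for start, dur, _, _ in score)
samples = [0.0] * int(RATE * total)

for start, dur, freq, amp in score:
    for n in range(int(RATE * dur)):
        t = n / RATE
        env = amp * (1.0 - t / dur)  # linear fade-out so notes don't click
        samples[int(RATE * start) + n] += env * math.sin(2 * math.pi * freq * t)

# Write the mix as a 16-bit mono WAV file, the classic "render to disk" step.
with wave.open("beep.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)       # 16-bit PCM
    w.setframerate(RATE)
    w.writeframes(b"".join(
        struct.pack("<h", int(max(-1.0, min(1.0, s)) * 32767)) for s in samples))
```

In "old school" computer music this offline render could take far longer than the audio's duration; the separation of instrument (the synthesis loop) from score (the note list) is the idea Csound generalizes.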


Quote from Dr. Richard Boulanger (author of The Csound Book):
"For me, music is a medium through which the inner spiritual essence of all things is revealed and shared. Compositionally, I am interested in extending the voice of the traditional performer through technological means to produce a music which connects with the past, lives in the present and speaks to the future. Educationally, I am interested in helping students see technology as the most powerful instrument for the exploration, discovery, and realization of their essential musical nature - their inner voice."


Upcoming post about innovations at Stantum:
In my next post, I'll focus on Stantum and its music and media technologies division, JazzMutant. It is interesting to note that Stantum's co-founders, Guillaume Largillier and Pascal Joguet, have backgrounds in electronic music. Guillaume specializes in multi-modal user interfaces and human-machine interface technologies; Pascal has a background in physics and electronics and has worked as a sound engineer.


My music back-story:



The very first computer-related course I took was Computer Music Technology (in 2003), since I play an electronic MIDI/digital keyboard and had previously tried to teach myself a few things, long before computers and related technologies were "easy" for me to figure out.  During the mid-'90s, I tried my hand at Csound and tried to acquaint myself with tools from Cycling '74, but I gave up.  Not long after that, I bought the first version of MOTU's FreeStyle, which worked nicely on my Performa 600 hooked up to my Ensoniq 32, after the nice people at MOTU sent me an update compatible with my set-up.  Later on, I came across Tom Erbe's SoundHack freeware.


A lot has changed since then!