Showing posts sorted by date for query: gesture.

Nov 23, 2009

GestureTek & Sprint's Interactive Wall: 3D depth-sensing allows wall interaction with a cell phone.

I missed this one! The video and photos below are of the Sprint Center Interactive Wall, powered by GestureTek's 3D depth-sensing system.  The media art was created by Takashi Kawashima, a designer/media artist who lives in San Francisco.  He has an MFA in Design | Media Arts from UCLA.

The interactive display can be controlled by a cell phone.



YouTube description/plug:
"GestureTek's 3D depth sensing technology powers an attention-grabbing interactive digital signage system for telecom leader Sprint. The 3D depth sensing interactive display screen, with mobile phone connectivity, tracks people's body movements, and responds by sending a Sprint promotional message that follows them the entire length of the interactive billboard. The interactive motion-detecting advertising message invites users to create their own personalized interactive wall art on Sprint's gesture control screen, by calling Sprint on their mobile phone. GestureTek's 3D tracker is the heart of the system. Installation lead: Mission Electronics. Creative: Goodby Silverstein."

The Instant DJ application looks fun!  It allows you to mix the music tracks on the large display with your phone.


Phone Painter:  Sprint Center Interactive Wall



Instant DJ



Now Widget


RELATED
Sprint Uses GestureTek 3D Tracking & Control System for New Interactive Digital Signage Campaign
GestureTek Announces 3D Gesture Tracking Initiatives for Sprint and Hitachi; Shares New 3D Patent Information

SOMEWHAT RELATED
GestureFX:  Next Generation Pediatrics Business Case (interactive floor for a pediatric clinic's waiting room)
AirPoint Hand-Tracking Unit for Mouse Replacement and "Point to Control" Interactivity

Cross posted on The World Is My Interface blog

Morning Tech News: LED "Tattoos"; Sixth Sense Wearable Displays

Since I am usually crunched for time, I thought I'd try posting "morning tech news" on this blog in a brief format, then return to the topic later - hopefully later in the day or, at most, within the week.

If you are familiar with this blog, you know that what I consider "news" is sometimes just new to me.  It might be something that crossed my path a while ago and I never posted, or something that I missed.  It doesn't even have to be "news" if it is unique, catches my fancy, or strikes me as an important innovation that should be followed and shared.

I caught today's news from Wired, which linked to an article in MIT's Technology Review, "Implantable Silicon-Silk Electronics: Biodegradable circuits could enable better neural interfaces and LED tattoos," written by Katherine Bourzac.

"By building thin, flexible silicon electronics on silk substrates, researchers have made electronics that almost completely dissolve inside the body. So far the research group has demonstrated arrays of transistors made on thin films of silk. While electronics must usually be encased to protect them from the body, these electronics don't need protection, and the silk means the electronics conform to biological tissue. The silk melts away over time and the thin silicon circuits left behind don't cause irritation because they are just nanometers thick."


RELATED
WIRED's Gadget Lab:  The Illustrated Man:  How LED Tattoos Could Make Your Skin a Screen, Charlie Sorrel 11/20/09
"The silk substrate onto which the chips are mounted eventually dissolves away inside the body, leaving just the electronics behind. The silicon chips are around the length of a small grain of rice — about 1 millimeter, and just 250 nanometers thick. The sheet of silk will keep them in place, molding to the shape of the skin when saline solution is added.
These displays could be hooked up to any kind of electronic device, also inside the body. Medical uses are being explored, from blood-sugar sensors that show their readouts on the skin itself to neurodevices that tie into the body’s nervous system — hooking chips to particular nerves to control a prosthetic hand, for example."
Tattoo You:  Silicon LEDs can act as photonic tattoos that can show blood-sugar readings
-Surfdaddy Orca,  h+ Magazine 11/17/09


"Brian Litt, associate professor of neurology and bioengineering at the University of Pennsylvania, is working with researchers from Beckman Institute at the University of Illinois and Tufts University to develop medical applications for the new transistors. Their silk-silicon LEDs can act as photonic tattoos that can show blood-sugar readings, as well as arrays of conformable electrodes that might interface with the nervous system."


Litt Lab : Translational NeuroEngineering 
(Brian Litt's lab.)


SOMEWHAT RELATED
I've been thinking about flexible touch-screen applications, and it never occurred to me that the concept might be something that would transfer to human skin!  Here are a few of my posts related to this topic:

Last night I dreamt about haptic touch-screen overlays...
Rhizome 2009:  A Lovely Interactive Multi-touch App on a Flexible Lycra Screen
Impress:  A cool flexible interface project  by Silke Hilsing


More about this "somewhat related topic" to come: 
Latest SixthSense demo features paper "laptop" camera gestures
Nilay Patel, Engadget  11/18/09
Adding a "SixthSense" to your Cellphone
Vikas Bajaj, Bits, New York Times 11/6/09
Pattie Maes TED Talk:  Sixth Sense-  Mobile Wearable Interface and Gesture Interaction (for the price of a cell phone!) - my post from 3/2009

Nov 19, 2009

Multi-touch & Gesture Interaction News: NUITEQ's Snowflake Suite 1.7 compatible with Windows 7 and 3M Touch Systems, N-trig and Lumio



"This video demonstrates the N-trig DuoSense true multi-touch solution utilizing up to four fingers. The video features various multi-touch enabled applications, including how to pan and rotate using up to four fingers on Google Earth, a demonstration of how to play various onscreen musical instruments using the Snowflake Suite Music application, and a new hands-on way to play Sudoku. The Corel Paint it!™ application shows how existing images can be transformed using multi-touch, and a 3D desktop organizer application from BumpTop demonstrates new and innovative ways in which to organize your desktop using up to four fingers"   -avitaintrig's YouTube description
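Two-finger pan/rotate/scale gestures like the ones in this video boil down to simple vector math: compare the segment between the two touch points before and after they move. Here is a minimal sketch of that idea (illustrative only; this is not N-trig's or NUITEQ's actual implementation):

```python
import math

# Sketch of how a two-finger gesture maps to scale and rotation:
# compare the vector between the fingers before and after they move.
# (Illustrative math, not any vendor's implementation.)
def pinch_transform(p1_old, p2_old, p1_new, p2_new):
    ox, oy = p2_old[0] - p1_old[0], p2_old[1] - p1_old[1]
    nx, ny = p2_new[0] - p1_new[0], p2_new[1] - p1_new[1]
    scale = math.hypot(nx, ny) / math.hypot(ox, oy)
    rotation = math.degrees(math.atan2(ny, nx) - math.atan2(oy, ox))
    return scale, rotation

# Fingers spread from 100 px apart to 200 px apart, no rotation:
print(pinch_transform((0, 0), (100, 0), (0, 0), (200, 0)))  # (2.0, 0.0)
```

The same two numbers (plus the midpoint of the two touches for panning) are all an application like Google Earth needs to pan, zoom, and rotate the view.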

Snowflake Suite and NextWindow Plugin Information
NUITEQ in the media

3M Touch Systems

N-trig

Lumio

Bumptop

NextWindow
(SnowFlake Suite 1.7 works on NextWindow screens.)

I'll post more news and information about the natural interface/interaction biz very soon!

Oct 31, 2009

Sensory-Minds' Ring Wall, an interactive multi-touch wall you don't even have to touch!

Sensory-Minds is a small company in Germany that is focused on research and design in the field of Natural User Interfaces. If you visit the SENSORY-MINDS website, you'll find that it has been designed for touch interaction.



ring°wall from SENSORY-MINDS on Vimeo.

Information about the Ring Wall from Sensory-Minds' Vimeo site:

".....The two-piece ring°wall consists of a LED display and a multitouch information-wall and impresses by its size: a total surface of 425 square meters, which equals more than 6000 computer displays, is the biggest of its kind. An interactive World emerges out of 34 million pixels generated by 15 high definition projectors and is supported by 30 directional speakers.


Multitouch sensors basing on laser technology, convert the usage of the natural user interface into an experience. By direct touching, more than 80 users can simultaneously get informed about news and activities around the ringworld.


The interactive wall is not only a central information system, but also an innovative advertising tool and medium for public viewings."

Heiko Hoffman of Sensory-Minds recently joined the NUI-Group.  Here is a response to a question on the NUI-group forum about the way the system's sensors work:


"The sensor works like spinning radar gun, or like a wiper in a car, that means like a radar gun you get the distance to the object.  It’s not quiet difficult when you know the position of the sensor and the degree from the beam to get the X,Y position.  Yes, you don’t need to touch the surface but we arranged it that the radar beam is very close to the screen so it seems that you have to touch the surface.

At the moment the system (has) limitations because each sensor (gets) coordinates and this means that you got shadows. This is no problem because the people interact very fast. We are working on a system where the data from the sensors were put together and that would be the solution for this problem."
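In other words, each reading is a distance along a beam at a known sweep angle, and converting that to a screen position is basic trigonometry. A minimal sketch (the sensor origin and angle convention below are my assumptions, not details from Sensory-Minds):

```python
import math

# A minimal sketch of the polar-to-Cartesian conversion the quote
# describes: each beam reading gives a distance and a sweep angle,
# which map to an X,Y position on the wall plane.
# (Illustrative only; sensor origin and angle convention are assumed.)
def beam_to_xy(distance, angle_deg, sensor_x=0.0, sensor_y=0.0):
    angle = math.radians(angle_deg)
    x = sensor_x + distance * math.cos(angle)
    y = sensor_y + distance * math.sin(angle)
    return x, y

# A touch 1.5 m along a beam swept to 30 degrees:
x, y = beam_to_xy(1.5, 30.0)
```

The "shadow" limitation mentioned in the quote follows directly from this model: one spinning sensor only reports the first object each beam hits, so anything behind a hand is invisible until data from several sensors is merged.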



Note:

I'll be writing a few more posts updating the work of past and present NUI-group members as well as news from the commercial multi-touch & gesture community.  Be sure to check back soon, or subscribe to this blog!

Oct 28, 2009

libTISCH, a multi-touch development framework with multi-touch widgets and more!

For techies and the tech-curious who like technologies that support collaboration and multi-touch interaction,  this is great news!

Florian Echtler announced the first stable release of libTISCH, a multi-touch development framework, which can be found on Sourceforge.  TISCH stands for Tangible Interaction Surfaces for Collaboration between Humans.  libTISCH, a C++ software framework, is included in this project.  It provides a means for creating GUIs based on multi-touch and/or tangible input devices.

Here is how it works:

Architecture Layers
Here is information from the libTISCH announcement:


Highlights of this release are, among others, the following features:

- ready-to-use multitouch widgets based on OpenGL
- reconfigurable, hardware-independent gesture recognition engine
- support for widely used, pre-defined gestures (move, scale, rotate..) as well as custom-defined gestures

- hardware drivers for FTIR, DI, Wiimote, DiamondTouch..
- TUIO converters: source and sink

- cross-platform: Linux, MacOS X, Windows (32 and 64 bit)
- cross-language: C++ with bindings for C#, Java, Python

libTISCH has a lot to offer the multitouch developer. For example, the textured widgets enable rapid development of applications for many kinds of multi-touch or tangible interfaces. The separate gesture recognition engine allows the translation of a wide range of highly configurable gestures into pre-defined or custom events, which are then acted on by the widgets. While the lower layers of libTISCH provide functionality similar to tbeta, touché, etc. (you can interface existing TUIO-based software with libTISCH in both directions), it goes far beyond them.

More information about the library and underlying architecture can be found on http://tisch.sf.net/ and in the Sourceforge wiki at http://sourceforge.net/apps/mediawiki/tisch/
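Since libTISCH can act as both a TUIO source and sink, it helps to see what those cursor messages look like. Below is a minimal, dependency-free sketch of tracking TUIO 1.1 "2Dcur" updates; the transport plumbing (how OSC messages actually arrive) is omitted:

```python
# Minimal sketch of a TUIO 1.1 "2Dcur" cursor tracker, the message
# profile libTISCH's TUIO source/sink converters exchange with other
# multi-touch software. A "set" message carries a session id and a
# normalized position; an "alive" message lists the ids still touching.
class CursorTracker:
    def __init__(self):
        self.cursors = {}  # session id -> (x, y) in normalized [0,1] coords

    def on_message(self, address, *args):
        if address != "/tuio/2Dcur" or not args:
            return
        if args[0] == "set":
            sid, x, y = args[1], args[2], args[3]
            self.cursors[sid] = (x, y)
        elif args[0] == "alive":
            # Drop any cursor not listed as alive (finger lifted).
            alive = set(args[1:])
            self.cursors = {s: p for s, p in self.cursors.items() if s in alive}

tracker = CursorTracker()
tracker.on_message("/tuio/2Dcur", "set", 7, 0.25, 0.5, 0.0, 0.0, 0.0)
tracker.on_message("/tuio/2Dcur", "alive", 7)
print(tracker.cursors)  # {7: (0.25, 0.5)}
```

A gesture recognition engine like libTISCH's sits one layer above this: it watches how these tracked points move over time and turns them into events the widgets can act on.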


TISCH Project Wiki

RELATED
Florian is on the scientific staff at the Technische Universität München in Germany. Be sure to check out his webpage.

I especially like the concept of the MeTaTop: "A Multi-Sensory Table Top System for Medical Procedures" that is linked from Florian's website.
MeTaTop A Multi Sensory Table Top System for Medical Procedures

Oct 25, 2009

Interactive multi-touch for sound design, dj-ing, and music creation

NUI-Group member Christian Bannister, a musician, designer, and developer behind Subcycle Labs has been experimenting with music and multi-touch technology and interactions. Here's a demonstration of what he's come up with so far:


multi-touch the storm - interactive sound visuals - subcycle labs from christian bannister on Vimeo.
"The big picture goal of this project is to bridge the gap between sound visualization and musical instrument. With multi-touch interaction it is possible to manipulate multiple characteristics of a sound—directly and visually. Right now a lot of electronic music involves staring at the back of the performers laptop. This is a shame because in many cases a lot of really interesting things are happening on the computer that the audience is completely unaware of. This project hopes to create a common visual language and experience for the electronic musician and the audience by enhancing the perception of sound and music on both sides...These sketches are built with Processing and Max/MSP networked with OSC on a single computer..." -Christian Bannister



JazzMutant Lemur
The multi-touch tech company now known as Stantum evolved from JazzMutant, which was founded in 2002 by Guillaume Largillier and Pascal Joguet, who were joined by Julien Olivier in 2003. The original focus was to create a multi-touch controller for music applications, and the Lemur was born. It is now in version 2, with features such as a gesture object that provides three ways for people to interact with sound, extended scripting abilities, and remote control of your computer's mouse cursor or keyboard.  Stantum recently developed a next-gen multi-touch screen system for use in mobile devices such as smart phones and netbooks.  (I'll discuss this further in a future post.)



Mapping Ableton Live to Jazz Mutant's multi-touch Lemur for sound design:


Additional tutorials can be found on the JazzMutant YouTube channel.


Ableton Live, the software used in the above video, will include Max. Max/MSP is now known as Max 5:
"In use for over twenty years by performers, composers, artists, scientists, teachers, and students, Max is the way to make your computer do things that reflect your individual ideas and dreams. Version 5 is a dramatic transformation of Max that begins a new chapter in the program's history."

RELATED
Video:  Max for Live
Max is a product of cycling74
Cycling74 created the Make Controller Kit, which includes fully programmable controllers. The kit is network-based, is capable of working with actuators, and can read sensor information into Max.
Jamoma, a platform for interactive art-based research and performance. Jamoma is the prototyping environment for SpatDIF, the Spatial Sound Description Interchange Format, and GDIF, the Gesture Description Interchange Format.
GDIF: Gesture Description Interchange Format, a tool for music related movements, actions, and gestures 
Stantum's Mobile Phone Multi-touch Interface:  Demonstration of precise interactions on a resistive touch screen

How the Stantum's Resistive Multi-Touch Screen Works


GDIF: Gesture Description Interchange Format, a tool for music-related movements, actions, and gestures.

There has been a flurry of work in the computer music technology world that relates to what has been going on with interactive display technology, multi-touch & gesture interaction. I came across a link to the GDIF website when I was searching for information about interactive music and the use of multi-touch technologies for a future blog post.   

So what is GDIF?  Gesture description interchange format

"The Gesture Description Interchange Format (GDIF) is being developed as a tool for streaming and storing data of music-related movements, actions, and gestures.  Current general purpose formats developed within the motion capture industry and biomechanical community (e.g. C3D) focus mainly on describing low-level motion of body joints.  We are more interested in describing gesture qualities, performer-instrument relationships, and movement-sound relationships in a coherent and consistent way.  A common format will simplify working with different software, platforms and devices, and allow for sharing data between institutions."  (The Jamoma environment is used to prototype GDIF.)
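To make the layering idea concrete, here is a toy sketch that derives a higher-level gesture descriptor from a raw sensor value. The raw-versus-descriptive layering is GDIF's idea; the address names and threshold below are purely illustrative, not taken from the format:

```python
# Toy sketch of layered, GDIF-style gesture data: raw device values
# on one layer, derived gesture qualities on another. The address
# strings and the 0.1 threshold are illustrative assumptions, not
# part of the GDIF specification.
def describe(accel_magnitude):
    """Derive a coarse 'quantity of motion' descriptor from raw data."""
    return "moving" if accel_magnitude > 0.1 else "still"

raw = {"/gesture/raw/accel": 0.42}
cooked = {"/gesture/descriptor/motion": describe(raw["/gesture/raw/accel"])}
print(cooked)  # {'/gesture/descriptor/motion': 'moving'}
```

The point of a shared format is that both layers travel together, so one institution's raw motion-capture stream can feed another's higher-level analysis tools.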


Alexander Refsum Jensenius is the man who initiated the GDIF project.  He's written a variety of articles about music, gestures, movement, and emerging technologies.  


Here's Alexander's bio: "Alexander (BA, MA, MSc, PhD) is a music researcher and research musician working in the fields of embodied music cognition and new interfaces for musical expression (NIME) at the University of Oslo and at the Norwegian Academy of Music. He studied informatics, mathematics, musicology, music performance and music technology at UiO, Chalmers, UC Berkeley and McGill. Alexander is active in the international computer music community through a number of collaborative projects, and as the initiator of GDIF. He performs on keyboard instruments and live electronics in various constellations, including the Oslo Laptop Orchestra (OLO)."




Related Publications
Godoy, R. I., E. Haga, and A. R. Jensenius (2006b). Playing 'air instruments': Mimicry of sound-producing gestures by novices and experts. In S. Gibet, N. Courty, and J.-F. Kamp (Eds.), Gesture in Human-Computer Interaction and Simulation, GW 2005, Volume LNAI 3881, pp. 256-267. Berlin: Springer-Verlag.
Jensenius, A. R. (2009). Motion capture studies of action-sound couplings in sonic interaction. STSM COST Action SID report. fourMs lab, University of Oslo.
Jensenius, A. R. (2007). Action - Sound: Developing Methods and Tools to Study Music-related Body Movement. PhD thesis. Department of Musicology, University of Oslo, Norway.
Jensenius, A. R., K. Nymoen and R. I. Godoy (2008). A Multilayered GDIF-Based Setup for Studying Coarticulation in the Movements of Musicians. Proceedings of the International Computer Music Conference, 24-29 August 2008, Belfast.
Jensenius, A. R., T. Kvifte, and R. I. Godoy (2006). Towards a gesture description interchange format. In N. Schnell, F. Bevilacqua, M. Lyons, and A. Tanaka (Eds.), NIME '06: Proceedings of the 2006 International Conference on New Interfaces for Musical Expression, Paris, pp. 176-179. Paris: IRCAM - Centre Pompidou.
Kvifte, T. and A. R. Jensenius (2006). Towards a coherent terminology and model of instrument description and design. In N. Schnell, F. Bevilacqua, M. Lyons, and A. Tanaka (Eds.), Proceedings of New Interfaces for Musical Expression, NIME 06, IRCAM - Centre Pompidou, Paris, France, June 4-8, pp. 220-225. Paris: IRCAM - Centre Pompidou. [PDF]
Marshall, M. T., N. Peters, A. R. Jensenius, J. Boissinot, M. M. Wanderley, and J. Braasch (2006). On the development of a system for gesture control of spatialization. In Proceedings of the 2006 International Computer Music Conference, 6-11 November, New Orleans. [PDF]

RELATED
"Sonic Interaction Design is the exploitation of sound as one of the principal channels conveying information, meaning, and aesthetic/emotional qualities in interactive contexts."
SID Action has four working groups:
WG1: Perceptual, cognitive, and emotional study of sonic interactions
WG2: Product sound design
WG3: Interactive art and music
WG4: Sonification



    "SoundHack was my main thing for a long time, and I poured a lot of effort into it. It was the place I put my ideas. I did have something of a mission with SoundHack. I wanted to take some computer music techniques that were only used in academia, and get them out there so that all types of musicians could use them."-Tom Erbe  SoundHack Spectral Shapers


Csound Blog "Old School Computer Music"
"Csound is a sound and music synthesis system, providing facilities for composition and performance over a wide range of platforms. It is not restricted to any style of music, having been used for many years in the creation of classical, pop, techno, ambient, experimental, and (of course) computer music, as well as music for film and television."-Csound on Sourceforge


Quote from Dr. Richard Boulanger (Father of Csound):
"For me, music is a medium through which the inner spiritual essence of all things is revealed and shared. Compositionally, I am interested in extending the voice of the traditional performer through technological means to produce a music which connects with the past, lives in the present and speaks to the future. Educationally, I am interested in helping students see technology as the most powerful instrument for the exploration, discovery, and realization of their essential musical nature - their inner voice."


Upcoming post about innovations at Stantum:
I'll be focusing on Stantum and its music and media technologies division, JazzMutant, in my next post. It is interesting to note that the co-founders of Stantum, Guillaume Largillier and Pascal Joguet, have backgrounds in electronic music.  Guillaume specializes in multi-modal user interfaces and human-machine interface technologies. Pascal has a background in physics and electronics, and has worked as a sound engineer.


My music back-story:



The very first computer-related course I took was Computer Music Technology (in 2003), since I play an electronic MIDI/digital keyboard and had previously tried to teach myself a few things, long before computers and related technologies were "easy" for me to figure out.  During the mid-90's, I tried my hand at Dr. Richard Boulanger's Csound, and tried to acquaint myself with tools from Cycling '74, but I gave up.  Not long after that, I bought the first version of MOTU's Freestyle, which worked nicely on my Performa 600, hooked up to my Ensoniq 32, after the nice people at MOTU sent me an update that was compatible with my set-up.  Later on, I came across Tom Erbe's SoundHack freeware.


A lot has changed since then! 




Oct 23, 2009

Two good articles by Bill Buxton: The Mad Dash Towards Touch Technology; The Long Nose of Innovation

I came across links to a couple of interesting articles on the Putting People First blog. The articles were written by Microsoft Research principal scientist Bill Buxton.  If you've never heard of Bill Buxton, he's the guy who was doing multi-touch research way back in the 1980's.

The Mad Dash Toward Touch Technology
Bill Buxton, Business Week, 10/21/09
"True innovators need to know as much about when, why, and how not to use trendy technology as when to use it."

The Long Nose of Innovation
Bill Buxton,  Business Week, 1/2/08
"The bulk of innovation is low-amplitude and takes place over a long period. Companies should focus on refining existing technologies as much as on creation." 

RELATED
Updated!

Multi-Touch Systems that I Have Known and Loved
(Bill Buxton)


I came across Bill Buxton's Multi-Touch website in early 2007 when I was taking HCI and Ubicomp.  I was searching for information about large touch-screen displays and applications for a couple of class projects.  The website was the answer to my graduate student prayers.  On the site, you'll find a fantastic overview of the history of "multi-touch", including gesture recognition and related surface technologies. 

If you have the time, take a look at Buxton's main website: http://www.billbuxton.com/. You'll find loads of interesting links. I especially like the links to his Business Week articles.



Bill Buxton is the author of "Sketching User Experiences:  Getting the design right and the right design", a book that I own and recommend.

Oct 15, 2009

Interactive Motion Graphics Showreel from Filmview Services - great content!

Here is a showreel from Filmview Services that simulates how tech-usability in an interactive gesture/touch world should be!



Here is a quote from the Filmview Services blog:


What Are Screen Graphics?

"...So it works out more cost effective for the films to actually have someone put the graphics on the screens for real. It also greatly enhances the performance of the actors. You only have to watch any of the Star Wars Eps 1-3 to see how wooden acting is when you don’t actually know what is in front of you. Actors love to be able push buttons and bang touch screens during their scenes. Having to actually do it in a certain order can stretch their capabilities mind you, and I am pretty gob smacked at how absolutely computer illiterate some of them are. Don’t they use email?


Anyway, due to this diminished ability to hit and bang things in any certain order, it is our job to make it impossible to mess things up. That’s why they are all genius typers. We make it so they can type any old thing and the letters still come out the way they are meant to each time. We also put little locking codes into our programming so they can’t accidentally escape the graphic mid job. It’s amazing how many of them can type the Esc button when they are meant to be spelling LOGIN."

Thanks, Tim!

SOMEWHAT RELATED
Coincidentally, when I was visiting the NUI-Group forums this morning, I came across a link to Jakob Nielsen's "Usability in the Movies -- Top 10 Bloopers", which is worth taking a look at. I've posted the list, but you'll need to go to Nielsen's web page to read the descriptions. You'll smile.

1. The Hero Can Immediately Use Any UI
2. Time Travelers Can Use Current Designs
3. The 3D UI
4. Integration is Easy, Data Interoperates
5. Access Denied/Access Granted
6. Big Fonts
7. Star Trek's Talking Computer
8. Remote Manipulators (Waldo Controls)
9. You've Got Mail is Always Good News
10."This is Unix, It's Easy"

Oct 9, 2009

WIRED's Overview of Touchscreen PC's and Interface Innovations (and some links for the tech-curious)

According to Priya Ganapati's recent article in WIRED, only 3% of all PCs and notebooks have interactive touchscreens.  More are coming to market, such as Sony's Vaio L Touch HD PC, Dell's all-in-one, and Lenovo's tablet PC and ThinkPad laptop.  HP came out with the TouchSmart PC and touch-enabled laptops, and will be adding updated versions soon.

This is a great opportunity for developers interested in touch, multi-touch, and gesture interaction applications.   I think there will be even more opportunity for web developers to create websites that are touch-enabled, or at least optimized for touch screen interaction.

Might the multi-touch web be right around the corner?

At the moment, there is some confusion about what designers should consider when developing applications.  There is no standard system of touch or gesture interaction, and researchers are still very busy figuring things out.  Some companies have rushed out and patented gestures, which in my opinion is like patenting how we breathe. But that is another story.

If you are a designer or developer, you might be interested in the Touch First Microsoft Surface Developer Challenge.  It is a chance to win your very own Microsoft Surface!  Hurry, the deadline is October 12th.

If you want to learn more, read WIRED's Gadget Lab:  "Touchscreen PC's Prompt Interface Innovations" for a good overview of what's happening in the Touch PC world.

If you are curious and would like to dig deeper into this topic, take some time to look at some of my blog-posts about multi-touch, touch screen news, innovations, and related musings:

The World Is My Interface:  An Introduction (and some links)
Interactive Touch-Screen Technology, Participatory Design, and "Getting It".
Multimedia and Interaction Resources (a mega-list, work-in-progress)
Bump Top 3D Desktop on a Touch Screen:  Toss Your Photos to your Facebook Icon!
Ron George's Interaction Design Toolbox
So how are people using their multi-touch all-in-ones?  Medion X9613 will be released soon..
Multi-touch, multimedia, multi-modal:  Fujitsu LIFEBOOK 54310 has a multitude of possibilities
Windows Vista User Experience Guidelines: "All Windows programs should be touchable!"

Oct 8, 2009

More Multi-Touch & Urban Screens: SoTouch, onedotzero, the Aurora Mindstorm table, and collaboration at the Adventures in Motion festival in London

SoTouch and onedotzero collaborated to create a multi-touch, multi-user application that ran on a Mindstorm Aurora table at the onedotzero Adventures in Motion Festival at the BFI Southbank in London in September 2009. The interactive application provided festival participants with enjoyable ways to filter and search the festival program. The art director for this project was Alex Le Guillou.



(Music: Raindrops, by Basement Jaxx)

How does it work?
According to information provided by SoTouch, users can search by dragging and dropping keywords into personal search discs.  Physics rules incorporated into the SoTouch application allow the manipulated objects to bump and react to each other, encouraging social interaction among the people at the table.  For this particular application, each event could be viewed in a description panel, including timing, description, pictures, and videos, enabled for multi-touch gesture interaction. Additionally, people could use the table to send emails to friends, using a scalable keyboard.
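The "bump and react" behavior can be illustrated with a toy one-dimensional collision response: when two dragged discs overlap, push each away by half the overlap. This is illustrative physics only, not SoTouch's actual engine:

```python
# Toy 1-D sketch of the "bump and react" behavior described above:
# when two discs of the same radius overlap, push them apart
# symmetrically. (Illustrative only; not SoTouch's physics engine.)
def separate(x1, x2, radius):
    overlap = 2 * radius - abs(x2 - x1)
    if overlap > 0:
        push = overlap / 2
        if x1 < x2:
            return x1 - push, x2 + push
        return x1 + push, x2 - push
    return x1, x2  # no contact, positions unchanged

print(separate(0.0, 1.0, 0.75))  # (-0.25, 1.25)
```

A real tabletop application runs a check like this every frame for every pair of nearby objects, which is what makes dragged discs appear to jostle each other.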

The application was designed for flexibility, using the TUIO protocol, and allowing it to be customized for use at other festivals or shows through xml.


Detail
So touch / onedotzero / Martin Senyszak by So touch.

About SoTouch "Intelligent Surfaces"
"We deliver complete solutions to turn passive environment into active business contributor in retail, show room and trade show, leveraging multi-touch and touch-less technology."

Julien Lescure, CEO, and Florian Bernard, CTO, founded SoTouch, an advanced digital agency,  in early 2009.  The company is based in the U.K.

About onedotzero
"Onedotzero  is a contemporary, digital arts organisation with a remit to promote innovation across all forms of moving image and motion arts, activities encompass public events, artist + content development, publishing projects, education, production, creative direction, and related consultancy services ... onedotzero was conceived at the start of the desktop digital revolution in the mid-1990s out of a desire to explore moving image across single screen, interactive and live audio-visual work. today, onedotzero remains committed to providing a home for visionary moving image experimentation and contemporary creative collisions."



Interactive Urban Screen

The interactive urban screen installation, also known as the 2009 onedotzero identity, was made interactive by a Nokia n900 mobile phone, as shown in the pictures and video clips below.
The 2009 Identity by I am Martin.

http://farm3.static.flickr.com/2430/3906095157_7a76138444.jpg


onedotzero interactive identity powered by Nokia N900 from onedotzero on Vimeo.

Nokia n900 onedotzero Installation- Behind the Scenes


Onedotzero - Watch the interaction with an urban screen with a Nokia N900 (behind-the-scenes story).


RELATED
N900 to appear in London next week!
(JBC, Nokia Conversations, 9/4/09)
According to the article,  the application for the installation was developed by digital artists/computational designers Karsten Schmidt and Gary Birkett. (I think the ad agency Wieden + Kennedy London was also involved with this project, since Nokia is their client.)



More about Mindstorm   
                        
Simon Cowell and the X Factor judges use a Mindstorm interactive touchscreen to decide who goes through. Paul Milligan, AVInteractive, 10/6/09

The four judges from ITV's X Factor show, Simon Cowell, Louis Walsh, Dannii Minogue and Cheryl Cole used a Mindstorm multi-touch interactive surface to help them whittle down the contestants at the bootcamp stage of the series.


http://www.mindstorm.com/CMS_images/home-aurora-large.jpg
Mindstorm Aurora Table

Oct 7, 2009

We Need Collaborative Surfaces: The Shared IT Project & Other Research Touched by Yvonne Rogers and/or her Colleagues

Today I came across Doug Clow's blog post, "Tangibles, tabletops or mobiles: Which is best for collaborative learning?".  In essence, the post was a "liveblog", written during Yvonne Rogers' review of her work of the past 10 years at an IET Technology Coffee Morning session held at the Open University in London.

I've been following Yvonne's research for some time now, and I thought I'd take this opportunity to highlight her work. (I've provided additional information for those of you who are new to this blog.)

Background (from my perspective)


Back in the early 1990's, I ran a "paired learning" study skills/social skills group three mornings a week in a computer lab, for a crew of fourth grade students referred to me (the school psychologist) for support for behavior, learning, and social skills problems.  Since many of the students needed to work on social skills such as waiting turns, cooperating, and working nicely with others, I decided to pair the students up, two to a computer.

Some of the applications were designed for two students taking turns, but others were designed for only one person. For those applications, the students were encouraged to forget about the "rules" and help each other out; in other words, one student would control the mouse, and the other would be the "mouse-seat driver". I spent some time teaching "mouse-sharing" skills. One of the students tried to grab the mouse when it wasn't his turn, and his partner whacked him over the head with it!

Moving on...


When I first touched an interactive whiteboard, in 2002 or 2003, it was a SMART Board, and I was fascinated by the possibilities it held. Since I used it as a tool for group counseling and study skills in my role as a school psychologist, I couldn't help but hope that a new version would come out that would allow at least two students to interact with the screen at the same time.

Better yet, wouldn't it be cool to have the group table offer the same sort of functionality? I realize that when people envision a learning environment, they are likely to think of students seated at single desks. While this is the case in many classrooms, many teachers manage to focus on group work, such as group projects, collaborative/cooperative learning lessons, or science experiments... even if they DON'T have tables. They push the desks together to form a "table".

In my life as a school psychologist, particularly during my days at an elementary school, I spent a good part of my day sitting at a table with 3-6 students for group counseling, social skills training, and so forth. (The speech and language therapist worked with groups of children around a table, too.) You can imagine that something like the SMART Table or the Surface would have been a godsend, provided it came with a wealth of therapeutic activities and templates that could be easily customized to meet the needs of the students!

Interactive displays and whiteboards are now common in our schools and workplaces. A few classrooms have interactive tables, and these tables are also found in museums and visitor centers. But there is much work yet to be done, in terms of content development, hardware design, interaction design, and research.

No one discipline "owns" this problem-space.  Better said, no one owns this "solution-space"!

(For more information about this topic, read "Don Norman's Keynote at the 21st Transmedia Symposium: Transmedia Design Challenge:  Co-creation" and "Transcending Disciplinary Boundaries in Interaction Design", by Eli Blevis & Erik Stolterman)


Yvonne Rogers has been a key player in this arena. She is focused, but at the same time takes a big-picture, multi-faceted point of view, something I find lacking in much of the computer science research that crosses my path. Yvonne's approach is consistent with that of other forward-minded thinkers in the field.

When she was a professor at Indiana University, Yvonne's team produced a short video outlining the types of problems groups of people encounter when they share information and collaborate on projects.

What I like about this video is that it works towards solutions.  Although the video is from 2006, it still rings true.   



If you have ever worked with students in small groups forced to share one mouse and computer screen, some of the footage will bring a smile to your face.


More about Dr. Rogers and her colleagues:

Dr. Rogers is the lead researcher of the ShareIT Project, an interdisciplinary endeavor exploring how new technologies, such as multi-touch tables, gesture-based systems, interactive wall displays, and interactive tangibles, can support collaborative activities among small groups of people in a range of situations and settings.

Some of the people affiliated with Dr. Rogers and the Open University Pervasive Interaction Lab are highlighted in the video clip below. The video was filmed at an event held on June 23, 2009 at the Science Museum's Dana Centre (UK), in collaboration with the London Knowledge Lab. The video highlights the work of researchers, artists, educators, and technologists, all exploring collaborative tangible, interactive technologies and surfaces. Some of the questions asked: What sort of new interfaces can support "collective creativity"? How can these technologies support interactions between people who are in different places around the world?

Surface Tension:  Interactives and Workshops (June 23, 2009)

Surface Tension: Interactives and workshops from Dreamtime Film on Vimeo.


Yvonne was one of the researchers involved in the Dynamo Project, the topic of one of my posts a couple of years ago: Revisiting promising projects: Dynamo, an application for sharing information on large interactive displays in public spaces (9/16/07).

Doug Clow provides a good overview of some of the ShareIT projects in his April 2009 post, "Low-hanging fruit: interactive tables for collaborative learning". If you are curious and want to dig deeper, below is a link to the ShareIT publications web page, along with an assortment of articles related to Yvonne Rogers's work.

ShareIT Publications

Hornecker, Eva (2008) "I don't understand it either, but it is cool" - Visitor Interactions with a Multi-Touch Table in a Museum. (pdf) IEEE Tabletop 2008


Rick, J, and Rogers, Y. (2008) From DigiQuilt to DigiTile: Adapting Educational Technology to a Multi-Touch Table. IEEE Tabletops and Interactive Surfaces. 79-86.

Rogers, Y., Lim, Y., Hazlewood, W. R. and Marshall, P. (2008) Equal Opportunities: Do Shareable Interfaces Promote More Group Participation than Single User Displays? To appear in Human-Computer Interaction. PDF version


Related publications of interest:


Rogers, Y. (2006) Moving on from Weiser's vision of calm computing: engaging UbiComp experiences. In: P. Dourish and A. Friday (Eds.) Ubicomp 2006 Proceedings, LNCS 4206, pp. 404-421, Springer-Verlag. PDF version

Rick, J., Harris, A., Marshall, P., Fleck, R., Yuill, N. and Rogers, Y. (2009) Children designing together on a multi-touch tabletop: An analysis of spatial orientation and user interactions. Proceedings of Interaction Design and Children (IDC '09), 106-114

Harris, A., Rick, J., Bonnett, V., Yuill, N., Fleck, R., Marshall, P. and Rogers, Y. (2009) Around the Table: Are multiple-touch surfaces better than single-touch for children's collaborative interactions? Proceedings of CSCL 2009

Rick, J., Rogers, Y., Haig, C. and Yuill, N. (2009) Learning by Doing with Shareable Interfaces. Children, Youth & Environments, In Press.

Marshall, P., Fleck, R., Harris, A. Rick, J., Hornecker, E., Rogers, Y., Yuill, N. and Dalton, N. S. (2009) Fighting for Control: Children's Embodied Interactions When Using Physical and Digital Representations. Proceedings of CHI'09, 2149-2152. PDF version


Rogers, Y. and Price, S. (2009) How mobile technologies are changing the way children learn. In A. Druin, (Ed.) On the Move: Children, Learning and Technology. Elsevier. 3-22. PDF version

Rogers, Y. (2008) A comparison of how animation has been used to support formal, informal and playful learning. Learning with Animation, edited by R. Lowe and W. Schnotz, CUP, Cambridge. 286-303.

Rogers, Y. and Price, S. (2008) The role of mobile devices in facilitating collaborative inquiry in situ. Research and Practice in Technology Enhanced Learning. 3 (3), 209-229. PDF version


Rogers, Y. (2008) Using External Visualizations to Extend and Integrate Learning in Mobile and Classroom Settings. In J. Gilbert (ed.), Visualization: Theory and Practice in Science Education. CUP. 89-102.

Rogers, Y. and Muller, H. (2006) A framework for designing sensor-based interactions to promote exploration and reflection. International Journal of Human-Computer Studies, 64 (1), 1-15. PDF version
  
Brignull, H., Izadi, S., Fitzpatrick, G., Rogers, Y. and Rodden, T. (2004) The Introduction of a Shared Interactive Surface into a Communal Space. In Proc. of CSCW 2004, Conference on Computer Supported Cooperative Work, November 6-10, 2004, Chicago, USA: ACM Press. 49-58. PDF version
  
Randell, C., Phelps, T. and Rogers, Y. (2003) Ambient Wood: Demonstration of a digitally enhanced field trip for school children. In Adjunct Proc. UbiComp 2003, 100-104. PDF version

Brignull, H. and Rogers, Y (2003) Enticing people to interact with large public displays in public spaces. In Proceedings of INTERACT'03, Zurich, 17-24. PDF version


Rogers, Y., Brignull, H. and Scaife, M. (2002) Designing Dynamic Interactive Visualisations to Support Collaboration and Cognition. In First International Symposium on  Collaborative Information Visualization Environments, IV 2002, London,  July 10-12, 2002, IEEE, 39-50. PDF version

Of course, there are many other researchers who are exploring collaborative technologies, and I've featured a good number on this blog.  I'll be sure to feature more, from a range of disciplines, in future posts.