Oct 29, 2009

UX of ITV: The User Experience and Interactive TV (or Let's Stamp Out Bad Remote Controls)

I prefer to watch our flat-panel HDTV in the dark, and usually I watch something I've DVR'd.  Because of the nature of my remote control, I often have to interrupt the immersive experience, turn on the light, poke at the buttons, and start over again.


Not long ago, I had the flu and I thought I'd try out the interactive channel from my satellite TV provider.  My interactive experience was about the same as interacting with the DVR!

What the satellite TV company offered was not really ITV. It was BIRC: Bad Interactive Remote Control. You know what I'm talking about. In the era of the Wiimote, most of us still have to interact with our TV systems as if it were 1982.


There is hope!
A good number of researchers are working hard to make a good user experience for ITV, including mobile TV, a reality. If you are reading this post, you will most likely appreciate some of the articles and links below:

Konstantinos Chorianopoulos. Research Methods in Interactive TV (pdf)
(Konstantinos Chorianopoulos is a lecturer and Marie Curie Fellow in the Department of Informatics at the Ionian University, Corfu, Greece. He founded UITV.INFO, a site with research and information about interactive television.)
Enhancing Social Sharing of Videos: Fragment, Annotate, Enrich, and Share (Proceedings of the 16th ACM international conference on Multimedia) Pablo Cesar, Dick C.A. Bulterman, David Geerts, Jack Jansen, Hendrik Knoche and William Seager (This research paper includes a discussion of the concept of "personal secondary screens" on mobile devices that display information that can be shared and annotated while watching content on a large display.)


Sample of presentations and workshops at EuroITV 2009:
The Connected Home Redefines the TV Experience  Jan Van Bogaert (Alcatel-Lucent) 
The Internet Revolution Will Be Televised  Rich Ezekiel (Yahoo! Connected TV)
Needs, emotions, experience!  Marc Hassenzahl (Folkwang University)
Marian F. Ursu, Pablo Cesar, and Doug Williams. Enhancing Social Communication and Belonging by Integrating TV Narrativity and Game Play
Rodrigo Laiola Guimaraes. Telling Stories and Commenting on Media: The Next Generation of Multimedia Authoring Tools (pdf)
Ana Vitoria Joly.  Designing iTV Interfaces for Preschool Children
Claus Knudsen and Roel Puijk. Television and Presence: Experiments in interaction and mediation in a digital environment


Marianna Obrist, Hendrik Knoche, Damien Alliez. Tutorial: User-Experience in TV-centric Services: What to Consider in the Design and Evaluation? (pdf)
David Geerts. Tutorial: Designing and Evaluating the Sociability of Interactive Television (pdf)
Artur Lugmayr. Tutorial: Ambient Media - An Introduction by Case-Studies (pdf)
Janez Zaletelj, Mladen Savic and Marko Meza. Real-time Viewer Feedback in the iTV production.
Skylla Janssen. Interactive Television Format Development – Could Participatory Design Bridge the Gap?
Jan Hess and Volker Wulf. Explore Social Behaviour around Rich-Media: A Structured Diary Study 
Dimitri Schuurman, Tom Evens and Lieven De Marez. A living lab research approach for mobile TV

RELATED 
Ana Vitoria Joly. Design and Evaluation of Interactive Cross-platform Applications for Pre-literate Children.  IDC 2007 Proceedings: Doctoral Consortium
Ana Vitoria Joly. Interactive Cross-platform Environments for Young Children (pdf)
C. Hesselman, W. Derks, J. Broekens, H. Eertink, M. Guelbahar, and R. Poortinga, "Facilitating an Open Ecosystem to Enhance Interactive TV Experiences", Workshop on Sharing Content and Experiences with Social Interactive Television, co-located with the European Interactive TV Conference (EuroITV2008), Salzburg, Austria, July 2008
R. Kernchen, P. Cesar, S. Meissner, M. Boussard, K. Moessner, C. Hesselman, and I. Vaishnavi, "Intelligent Multimedia Presentation Delivery in Ubiquitous Multi-Device Scenarios," IEEE MultiMedia (IEEE MM), 17(2), April-June, 2010 [in print]

P. Cesar, D.C.A. Bulterman, and J. Jansen, "Leveraging the User Impact: An Architecture for Secondary Screens Usage in an Interactive Television Environment," in Springer/ACM Multimedia Systems Journal (MSJ), 15(3): 127-142, 2009



P. Cesar, D.C.A. Bulterman, and Luiz Fernando Gomes Soares, "Introduction to special issue: Human-centered television: directions in interactive digital television research," ACM Transactions on Multimedia Computing, Communications, and Applications, October 2008

Somewhat Related
My preliminary thoughts about Adobe's Open Screen Project

Oct 28, 2009

libTISCH, a multi-touch development framework with multi-touch widgets and more!

For techies and the tech-curious who like technologies that support collaboration and multi-touch interaction,  this is great news!

Florian Echtler announced the first stable release of libTISCH, a multi-touch development framework, which can be found on SourceForge. TISCH stands for Tangible Interaction Surfaces for Collaboration between Humans. libTISCH itself is a C++ software framework that provides a means for creating GUIs based on multi-touch and/or tangible input devices.

Here is how it works:

Architecture Layers
Here is information from the libTISCH announcement:


Highlights of this release are, among others, the following features:

- ready-to-use multitouch widgets based on OpenGL
- reconfigurable, hardware-independent gesture recognition engine
- support for widely used, pre-defined gestures (move, scale, rotate..) as well as custom-defined gestures

- hardware drivers for FTIR, DI, Wiimote, DiamondTouch..
- TUIO converters: source and sink

- cross-platform: Linux, MacOS X, Windows (32 and 64 bit)
- cross-language: C++ with bindings for C#, Java, Python
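The TUIO converters mentioned in the feature list speak a small OSC-based protocol that most multi-touch trackers understand. As a rough sketch (in Python, and not libTISCH code; the helper function and its zeroed velocity defaults are made up for illustration), a TUIO 1.1 cursor frame bundles three kinds of messages:

```python
# Illustrative sketch of the TUIO 1.1 "2Dcur" cursor profile. A real
# implementation serializes these messages as OSC over UDP; here we just
# build the message structure as Python lists to show the shape of a frame.

def tuio_cursor_bundle(frame_id, cursors):
    """cursors: dict of session_id -> (x, y), coordinates normalized to [0, 1]."""
    bundle = []
    # 'alive' lists every session id currently touching the surface
    bundle.append(["/tuio/2Dcur", "alive", *sorted(cursors)])
    # one 'set' message per cursor: id, position, velocity, acceleration
    for sid, (x, y) in sorted(cursors.items()):
        vx, vy, accel = 0.0, 0.0, 0.0  # a real tracker derives these per frame
        bundle.append(["/tuio/2Dcur", "set", sid, x, y, vx, vy, accel])
    # 'fseq' closes the frame so receivers can detect dropped packets
    bundle.append(["/tuio/2Dcur", "fseq", frame_id])
    return bundle

frame = tuio_cursor_bundle(42, {7: (0.25, 0.5), 3: (0.8, 0.1)})
for msg in frame:
    print(msg)
```

A "source" converter emits bundles like this from libTISCH's own trackers; a "sink" consumes them from other TUIO software.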

libTISCH has a lot to offer for the multitouch developer. For example, the textured widgets enable rapid development of applications for many kinds of multi-touch or tangible interfaces. The separate gesture recognition engine allows the translation of a wide range of highly configurable gestures into pre-defined or custom events which are then acted on by the widgets. While the lower layers of libTISCH provide functionality similar to tbeta, touche etc. (you can interface existing TUIO-based software with libTISCH in both directions), it goes far beyond them.
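That layered split between gesture recognition and widgets can be pictured with a toy dispatcher (hypothetical names in Python, not libTISCH's real C++ API): an engine compares frames of touch points, recognizes named gestures, and emits events that widgets subscribe to.

```python
# Toy sketch of a gesture-engine-to-widget pipeline: raw touch frames go in,
# named gesture events come out, and "widgets" are just subscribers.
import math

class GestureEngine:
    def __init__(self):
        self.handlers = {}  # gesture name -> list of callbacks

    def on(self, gesture, callback):
        self.handlers.setdefault(gesture, []).append(callback)

    def feed(self, prev_touches, touches):
        """Compare two frames of (x, y) touch points and emit gestures."""
        if len(prev_touches) == len(touches) == 2:
            # two fingers whose distance changes -> a 'scale' gesture
            d0 = math.dist(prev_touches[0], prev_touches[1])
            d1 = math.dist(touches[0], touches[1])
            if abs(d1 - d0) > 1e-9:
                self._emit("scale", d1 / d0)
        elif len(prev_touches) == len(touches) == 1:
            # one finger that moves -> a 'move' gesture
            dx = touches[0][0] - prev_touches[0][0]
            dy = touches[0][1] - prev_touches[0][1]
            if (dx, dy) != (0, 0):
                self._emit("move", (dx, dy))

    def _emit(self, gesture, value):
        for cb in self.handlers.get(gesture, []):
            cb(value)

events = []
engine = GestureEngine()
engine.on("scale", lambda factor: events.append(("scale", factor)))
engine.on("move", lambda delta: events.append(("move", delta)))

engine.feed([(0, 0), (10, 0)], [(0, 0), (20, 0)])  # pinch out: finger distance doubles
engine.feed([(5, 5)], [(8, 9)])                    # one-finger drag
print(events)  # [('scale', 2.0), ('move', (3, 4))]
```

The point of the separation, as the announcement describes it, is that the recognition rules can be reconfigured without touching the widget code.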

More information about the library and underlying architecture can be found at http://tisch.sf.net/ and in the Sourceforge wiki at http://sourceforge.net/apps/mediawiki/tisch/


TISCH Project Wiki

RELATED
Florian is on the scientific staff at the Technische Universität München in Germany. Be sure to check out his webpage.

I especially like the concept of MeTaTop, "A Multi-Sensory Table Top System for Medical Procedures," which is linked from Florian's website.
MeTaTop A Multi Sensory Table Top System for Medical Procedures

Interactive Multimedia Across Platforms and Screens: Adobe's Open Screen Project; MEX Mobile User Experience Manifesto.... (Please don't annoy the user!)

I'm not sure what I think about ubiquitous Flash 10.1 and Adobe's Open Screen Project. I like the idea of anything that is seamlessly cross-platform, but I shudder to think that this might unleash a wave of unwanted or annoying "push" advertising on on-the-go screens of all sizes. I'm assuming web developers, along with TV ad producers, will jump on this train without fully thinking about how their applications and designs will play out in the off-the-desktop, digital-out-of-home world.

I decided to take a look, drill down through the hype, and share a few links related to this topic.

Adobe Pushes for a Flash-ier Mobile Web
Rob Pegoraro, Faster Forward, Washington Post (10/5/09)
"Are you anxious to bring Flash to the mobile Web, even if it means being subjected to some over-eager Web coder's song-and-dance routine? Or would you rather do without it on the go, even if that means having to switch to a "real" computer to use some Web sites' features?"

Hopefully the "over-eager web coders" will heed the MEX Manifesto:

MEX:  Mobile User Experience 2009 Manifesto (pdf)
"The Manifesto sets out our beliefs as to how user-centred design principles can enhance the experience of multi-platform digital services."

A framework for user journeys in a multi-platform world: Marek Pawlowski, founder of MEX

MEX: User experience journeys in a multi-platform environment from Marek Pawlowski on Vimeo.

"User experiences are evolving into increasingly complex sets of interactions between multiple devices.  In this video presentation, Marek Pawlowski of the MEX Mobile User Experience strategy forum, shows how a framework can be used to map user journeys through the multi-platform environment."

"Unencumbered by wires, information is flowing into every corner of our world at an ever increasing rate and through an ever increasing range of digital platforms. The single greatest challenge facing digital industries is understanding how this explosion of data will be woven into the fabric of consumers' lives." -- Marek Pawlowski, founder of MEX.

MEX Blog 

OPEN SCREEN VIDEO

Open Screen Project from Vyshak V on Vimeo.
"The Open Screen Project is an industry-wide initiative, led by Adobe and backed by other industry leaders who all share one clear vision: Enable consumers to engage with rich Internet experiences seamlessly across any device, anywhere. Partners in the Open Screen Project are working together to provide a consistent runtime environment for open web browsing and standalone applications — taking advantage of Adobe® Flash® Player and, in the future, Adobe® AIR®. This consistent runtime environment will remove barriers to publishing content and applications across desktops, mobile phones, televisions, and other consumer electronics." Learn more

Reinventing Storytelling in the Digital Age Across Platforms, Across Screens

NAB 2009 presentation by Shantanu Narayen of Adobe and A.D. Albers of Disney Interactive Media Group
Adobe and NVIDIA Deliver Rich Web Experiences on Netbooks and Mobile Devices
Reuters (10/5/09)
"At Adobe MAX, Adobe's worldwide developer conference, Adobe Systems Incorporated and NVIDIA Corporation...announced that both companies are bringing uncompromised browsing of rich Web content to netbooks, smartphones and smartbooks built with NVIDIA GPUs. The companies have been working closely together as part of the Open Screen Project to optimize and dramatically improve performance of Flash Player 10.1 by taking advantage of GPU video and graphics acceleration on a wide range of mobile Internet devices. NVIDIA customers embracing Flash Player 10.1 for their new devices include HP, Lenovo, Samsung, Acer, Asus and more..."


RIM Joins Open Screen Project  Reuters (10/4/09)


Honey I Shrunk the Flash Player  Simon Bisson and Mary Branscombe, ZDNet (10/12/09)
Teaming up with Adobe and the Open Screen Project -Google Blog   (10/5/09)


Paramount Digital Entertainment Launches Interactive Thriller on MySpace  Tracy Swedlow, InteractiveTV Today (10/28/09)
"The company says that it can deliver the show's multiple interactive elements to viewers across devices using Adobe Flash Player and Adobe AIR, "because of efforts by the Open Screen Project, an industry-wide initiative led by Adobe and supported by PDE and close to 50 other industry leaders, to enable people to engage with rich Internet experiences across any device, anywhere."

Oct 26, 2009

The Designer's Accord and Sustainability

Goal of the Designers Accord:


"The goal of the Designers Accord is to bring the creative community together as a social and cultural force to create positive impact. We will do this by educating ourselves and our teams about environmental social justice issues, and talking to each and every client and customer about the consequences and opportunities provided by our work. We will create sustainable products and services, and share information about how to do that with each other."


According to an article in TIME, over 170,000 design firms, corporations, and colleges around the world have adopted the Designers Accord document over the course of two years. Valerie Casey, the designer who started the initiative, became concerned about the impact of her work on the environment. At the time, she was designing products such as diapers with sensors, and realized that many of her creations were not in line with environmental sustainability.

Although I try to conserve energy and I'm conscious of environmental issues on a personal level, and I'm fortunate to work in a "Green"-certified school a couple of days a week, I hadn't thought much about what designers can do to make a difference on a global scale. The design field is vast; it includes the people who design product packaging, and every day tons of well-designed packaging is carted away to a landfill somewhere on the planet. Every day, large electronic components, in the form of interactive displays, computers, and so forth, are shipped around the planet in large amounts of packaging, and the components themselves are often designed in ways that needlessly waste energy and resources, either during manufacturing or through the large amounts of electricity they draw in use. When these components are no longer needed, they do not decompose into the earth.


The Designers Accord recently sponsored the Global Summit on Design Education & Sustainability on October 23-24, 2009 in San Francisco. Some of the thought leaders at the conference included Piper Kujac, of the UC Berkeley Extension, Barbara Sudick, California State University, Chico, Nathan Shedroff, California College of the Arts, Ricardo Gomes, San Francisco State University, Scott Boylston, Savannah College of Art & Design, Jeremy Faludi, Worldchanging, Heidrun Mumper-Drumm, Art Center College of Design, and Cindy Gilbert, of the Biomimicry Institute. (A complete list of the educational institutes that participated in the conference can be found on the Designers Accord website.)

The Designers Accord encourages members to organize Town Hall events with others in their local design communities. Information about the Town Hall process can be found in the Events section of the Designers Accord website. For more information about the Designers Accord, follow the links below.

The Designers Accord

The Designers Accord Educational and Corporate Adopters

Designers Accord Design Firms (Core77 Design Directory)
(Under the search box, select 'search for designers accord adopters only')

Oct 25, 2009

Interactive multi-touch for sound design, dj-ing, and music creation

NUI Group member Christian Bannister, the musician, designer, and developer behind Subcycle Labs, has been experimenting with music and multi-touch technology and interactions. Here's a demonstration of what he's come up with so far:


multi-touch the storm - interactive sound visuals - subcycle labs from christian bannister on Vimeo.
"The big picture goal of this project is to bridge the gap between sound visualization and musical instrument. With multi-touch interaction it is possible to manipulate multiple characteristics of a sound—directly and visually. Right now a lot of electronic music involves staring at the back of the performers laptop. This is a shame because in many cases a lot of really interesting things are happening on the computer that the audience is completely unaware of. This project hopes to create a common visual language and experience for the electronic musician and the audience by enhancing the perception of sound and music on both sides...These sketches are built with Processing and Max/MSP networked with OSC on a single computer..." -Christian Bannister



JazzMutant Lemur
The multi-touch tech company now known as Stantum evolved from JazzMutant. JazzMutant was founded in 2002 by Guillaume Largillier and Pascal Joguet, who were joined by Julien Olivier in 2003. The original focus was to create a multi-touch controller for music applications, and the Lemur was born. It is now in version 2, with features such as a gesture object that provides three ways for people to interact with sound, extended scripting abilities, and remote control of your computer's mouse cursor or keyboard. Stantum recently developed a next-gen multi-touch screen system for use in mobile devices such as smartphones and netbooks. (I'll discuss this further in a future post.)



Mapping Ableton Live to JazzMutant's multi-touch Lemur for sound design:


Additional tutorials can be found on the JazzMutant YouTube channel.


Ableton Live, the software used in the above video, will include Max. Max/MSP is now known as Max 5:
"In use for over twenty years by performers, composers, artists, scientists, teachers, and students, Max is the way to make your computer do things that reflect your individual ideas and dreams. Version 5 is a dramatic transformation of Max that begins a new chapter in the program's history."

RELATED
Video:  Max for Live
Max is a product of Cycling '74
Cycling '74 created the Make Controller Kit, which includes fully programmable controllers. The kit is network-based, is capable of working with actuators, and can read sensor information into Max.
Jamoma, a platform for interactive art-based research and performance. Jamoma is the prototyping environment for SpatDIF, the Spatial Sound Description Interchange Format, and GDIF, the Gesture Description Interchange Format.
GDIF: Gesture Description Interchange Format, a tool for music related movements, actions, and gestures 
Stantum's Mobile Phone Multi-touch Interface:  Demonstration of precise interactions on a resistive touch screen

How Stantum's Resistive Multi-Touch Screen Works


GDIF: Gesture Description Interchange Format, a tool for music-related movements, actions, and gestures.

There has been a flurry of work in the computer music technology world that relates to what has been going on with interactive display technology and multi-touch & gesture interaction. I came across a link to the GDIF website when I was searching for information about interactive music and the use of multi-touch technologies for a future blog post.

So what is GDIF? GDIF stands for Gesture Description Interchange Format:

"The Gesture Description Interchange Format (GDIF) is being developed as a tool for streaming and storing data of music-related movements, actions, and gestures.  Current general purpose formats developed within the motion capture industry and biomechanical community (e.g. C3D) focus mainly on describing low-level motion of body joints.  We are more interested in describing gesture qualities, performer-instrument relationships, and movement-sound relationships in a coherent and consistent way.  A common format will simplify working with different software, platforms and devices, and allow for sharing data between institutions."  (The Jamoma environment is used to prototype GDIF.)
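One way to picture the layered approach GDIF describes, with device-specific raw data stored alongside derived, device-independent descriptors, is with OSC-style addresses. The addresses and helper function below are made up for this illustration, not the official GDIF namespace:

```python
# Illustrative sketch of a multilayered gesture-data frame: the same moment
# of a performance is described at two levels at once, so analysis tools can
# work with whichever layer they need.

def gdif_frame(t, accel_xyz, quantity_of_motion):
    """Bundle raw device data and derived descriptors for one time step t."""
    return [
        # raw layer: untouched sensor samples, device-specific
        ("/gdif/raw/accelerometer", t, *accel_xyz),
        # cooked layer: device-independent quantities derived from the raw data
        ("/gdif/cooked/quantity-of-motion", t, quantity_of_motion),
    ]

frame = gdif_frame(0.04, (0.1, -0.2, 9.8), 0.37)
for address, *args in frame:
    print(address, args)
```

Because both layers share one timeline, a recording can later be replayed against either the original sensors or the higher-level movement qualities.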


Alexander Refsum Jensenius initiated the GDIF project. He's written a variety of articles about music, gestures, movement, and emerging technologies.


Here's Alexander's bio: "Alexander (BA, MA, MSc, PhD) is a music researcher and research musician working in the fields of embodied music cognition and new interfaces for musical expression (NIME) at the University of Oslo and at the Norwegian Academy of Music. He studied informatics, mathematics, musicology, music performance and music technology at UiO, Chalmers, UC Berkeley and McGill. Alexander is active in the international computer music community through a number of collaborative projects, and as the initiator of GDIF. He performs on keyboard instruments and live electronics in various constellations, including the Oslo Laptop Orchestra (OLO)."




Related Publications
Godoy, R. I., E. Haga, and A. R. Jensenius (2006b). Playing 'air instruments': Mimicry of sound-producing gestures by novices and experts. In S. Gibet, N. Courty, and J.-F. Kamp (Eds.), Gesture in Human-Computer Interaction and Simulation, GW 2005, Volume LNAI 3881, pp. 256-267. Berlin: Springer-Verlag.
Jensenius, A. R. (2009). Motion capture studies of action-sound couplings in sonic interaction. STSM COST Action SID report. fourMs lab, University of Oslo.
Jensenius, A. R. (2007). Action - Sound: Developing Methods and Tools to Study Music-related Body Movement. PhD thesis, Department of Musicology, University of Oslo, Norway.
Jensenius, A. R., K. Nymoen, and R. I. Godoy (2008). A Multilayered GDIF-Based Setup for Studying Coarticulation in the Movements of Musicians. Proceedings of the International Computer Music Conference, 24-29 August 2008, Belfast.
Jensenius, A. R., T. Kvifte, and R. I. Godoy (2006). Towards a gesture description interchange format. In N. Schnell, F. Bevilacqua, M. Lyons, and A. Tanaka (Eds.), NIME '06: Proceedings of the 2006 International Conference on New Interfaces for Musical Expression, Paris, pp. 176-179. Paris: IRCAM - Centre Pompidou.
Kvifte, T. and A. R. Jensenius (2006). Towards a coherent terminology and model of instrument description and design. In N. Schnell, F. Bevilacqua, M. Lyons, and A. Tanaka (Eds.), Proceedings of New Interfaces for Musical Expression, NIME 06, IRCAM - Centre Pompidou, Paris, France, June 4-8, pp. 220–225. Paris: IRCAM - Centre Pompidou. [PDF]
Marshall, M. T., N. Peters, A. R. Jensenius, J. Boissinot, M. M. Wanderley, and J. Braasch (2006). On the development of a system for gesture control of spatialization. In Proceedings of the 2006 International Computer Music Conference, 6-11 November, New Orleans. [PDF]

RELATED
"Sonic Interaction Design is the exploitation of sound as one of the principal channels conveying information, meaning, and aesthetic/emotional qualities in interactive contexts."
The Sonic Interaction Design (SID) COST Action has four working groups:
WG1: Perceptual, cognitive, and emotional study of sonic interactions
WG2: Product sound design
WG3: Interactive art and music
WG4: Sonification



"SoundHack was my main thing for a long time, and I poured a lot of effort into it. It was the place I put my ideas. I did have something of a mission with SoundHack. I wanted to take some computer music techniques that were only used in academia, and get them out there so that all types of musicians could use them." - Tom Erbe, SoundHack Spectral Shapers


Csound Blog "Old School Computer Music"
"Csound is a sound and music synthesis system, providing facilities for composition and performance over a wide range of platforms. It is not restricted to any style of music, having been used for many years in the creation of classical, pop, techno, ambient, experimental, and (of course) computer music, as well as music for film and television."-Csound on Sourceforge


Quote from Dr. Richard Boulanger (father of Csound):
"For me, music is a medium through which the inner spiritual essence of all things is revealed and shared. Compositionally, I am interested in extending the voice of the traditional performer through technological means to produce a music which connects with the past, lives in the present and speaks to the future. Educationally, I am interested in helping students see technology as the most powerful instrument for the exploration, discovery, and realization of their essential musical nature - their inner voice."


Upcoming post about innovations at Stantum:
I'll be focusing on Stantum and its music and media technologies division, JazzMutant, in my next post. It is interesting to note that the co-founders of Stantum, Guillaume Largillier and Pascal Joguet, have backgrounds in electronic music. Guillaume specializes in multi-modal user interfaces and human-machine interface technologies. Pascal has a background in physics and electronics, and has worked as a sound engineer.


My music back-story:



The very first computer-related course I took was Computer Music Technology (in 2003), since I play an electronic MIDI/digital keyboard and had previously tried to teach myself a few things, long before computers and related technologies were "easy" for me to figure out. During the mid-90's, I tried my hand at Dr. Richard Boulanger's Csound, and tried to acquaint myself with tools from Cycling '74, but I gave up. Not long after that, I bought the first version of MOTU's Freestyle, which worked nicely on my Performa 600, hooked up to my Ensoniq 32, after the nice people at MOTU sent me an update that was compatible with my set-up. Later on, I came across Tom Erbe's SoundHack freeware.


A lot has changed since then!