Showing posts sorted by date for query gesture. Sort by relevance Show all posts

Nov 13, 2011

Is the answer Voronoi? Looking for possible solutions to an art+dance+music+tech idea from a recurring dream....

If you are a long-time reader of this blog, you probably know that I sometimes have some unusual dreams about technology.  I don't blog about my dreams very often, but last night, I had another technology dream, a continuation of a dream I had one night last week.  
Voronoi Diagram (Wikipedia)

I'm pretty sure that the last two dreams were sparked by playing with an online interactive demonstration of a Voronoi application before going to sleep one night, and by reading an article about "extracting ordered patterns from a triangular mesh surface" in the November/December IEEE Potentials magazine before turning in last night.

The dance probably was influenced by my recent viewing of the North Carolina Dance Theatre's performance of Innovative Works with my mom, someone who encouraged my study of music, art, and dance at an early age.

Some of my tech dreams are sort of...practical. For example, in one recurring dream, I find myself coding for a flexible mesh/grid application. Sometimes the mesh/grid has something to do with wireless sensor networks on curved terrain, perhaps related to something like the Smart Grid, and sometimes I find myself working on an application that analyzes streaming data from a variety of sources, for security prediction purposes.  At other times, I'm coding for something more artistic, my preference.

Last night, my dream focused on creating a flexible mesh fabric to be used in a multimedia dance/graphic arts/music performance.  I was coding for this performance using a Voronoi-like algorithm.


This is the best I can do to explain this: The fabric is carried by the dancers, and is both reactive and generative. In essence, the fabric is intertwined/embedded in the dance, the music, and the graphics. In my dream, everything looked/sounded/felt awesome and otherworldly, and the music that merged and morphed during the dance was so beautiful, not only the melodies, but the sounds.  (In a previous dream, the mesh contained a "nanotechnology" component, but I'll save that quest for the future.)
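For readers who haven't played with Voronoi diagrams before, here's a tiny Python sketch of the core idea (just an illustration I put together, not the algorithm from my dream!): every point belongs to the "cell" of whichever seed it is closest to.

```python
import math

def voronoi_cells(seeds, points):
    """Assign each point to the cell of its nearest seed.
    Brute-force nearest-neighbor; fine for small, illustrative inputs."""
    cells = {i: [] for i in range(len(seeds))}
    for p in points:
        nearest = min(range(len(seeds)),
                      key=lambda i: math.dist(seeds[i], p))
        cells[nearest].append(p)
    return cells

# Three seeds; classify the points of a small 5x5 grid.
seeds = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
grid = [(x / 4, y / 4) for x in range(5) for y in range(5)]
cells = voronoi_cells(seeds, grid)
```

Real Voronoi libraries (like the one behind the interactive demo I played with) compute the actual cell boundaries with cleverer algorithms, but the nearest-seed rule above is the whole definition.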


I thought I'd look through some of my web bookmarks and search a bit more for information related to this topic.  For now, here is the "brain dump".  I have more to add to this post, and plan to port it to a reference page for this blog in the future.  I hope that this post will be useful to some of my art/music/dance/tech readers!

RELATED AND SOMEWHAT RELATED


Update:  Right after uploading this post, I came across a link to a WebGL demo for a 3D music video of pop singer Ellie Goulding's song, 'Lights', by HelloJoy.   Visitors to the webpage can click to interact with the environment. If you keep the button pressed, you fly faster.  If you tweet the link, you'll see your name crop up as you fly around in the soundspace.  For more information about the making of 'Lights', take a look at Behind the scenes of 'Lights': the latest WebGL sensation! (Carlos Ulloa, 11/9/11)


After I watched the 'Lights' video, I recalled Radiohead's 'House of Cards' video, which I wrote about back in 2008: 
"We were rolling computers all day"...The Making of Radiohead's 'House of Cards' using imaging and info visualization software.   The process behind the making of the 'House of Cards' video was described in detail in Chapter 10 of the book Beautiful Data.

Bradley, E., Capps, D., Luftig, J., & Stuart, J.M. Toward Stylistic Consonance in Human Movement Synthesis (pdf). The Open Artificial Intelligence Journal, 2010, 4, 1-19
Bradley, E., Stuart, J. Using Chaos to Generate Variations on Movement Sequences (pdf). Chaos, 8:800-807, 1998
Bradley, E., Stuart, J. Learning the Grammar of Dance (pdf). Proceedings of the Fifteenth International Conference on Machine Learning, Madison, WI, 1998
Bradley, E., Capps, D., & Rubin, A. "Can computers learn to dance?" Proceedings of International Dance & Technology (IDAT), Tempe, AZ, Feb 1999
Chaotic Dance: Using mathematics to generate choreographic variations
Schedl, M., Hoglinger, C., Knees, P. Large-Scale Music Exploration in Hierarchically Organized Landscapes Using Prototypicality Information (pdf)
Fournel, N. Procedural Audio for Video Games: Are We There Yet? (pdf) GDC 2010


Voronoi Cells, created by Nathan Nifong.  A version of this interactive work was used in a DanceDraw performance

Patterns in the Noise (Nathan Nifong's site - FYI, Nathan worked with Celine Latulipe on the Dance.Draw project while completing his bachelor's degree in computer science at UNC-Charlotte)
Voronoi Dance (Christian Gross, using openFrameworks)
Voronoi art: Slow Trip (Oktalist/Mat)


The above video, by Mat/Oktal, was inspired by his viewing of Thomas Ruff's Substrat images. 
Scott Snibbe Studio (Interactive art, music, and animation for iPhone, iPad, iPod, and Mac)

Interactive Voronoi Diagram Generator with WebGL (Alex Beutel)
 
The above video was found in Alex Beutel's blog post, "Interactive Voronoi Diagrams with WebGL"

Posts about DanceDraw and related work at UNC-Charlotte:
News from the HCI lab at UNC-Charlotte- Creative Interactions (Videos)
Exploring the Design Space in Technology-Augmented Dance at CHI 2010:  Celine Latulipe's team from UNC-Charlotte
Interactive Surveillance:  Live digital art installation by Annabel Manning and Celine Latulipe

SIGCHI  Digital Arts and Interaction Community:  Building Bridges
The Interdisciplinary World of Dance and Interactive Technology

HTML5Voronoi  (HTML5Code website)

HTML5 Voronoi, Live Version
Update to code to compute Voronoi diagrams (Raymond Hill, 5/22/11)
William Forsythe's "Synchronous Objects-One Flat Thing, Reproduced" - Multidisciplinary online interactive project: Translating choreography into new forms.
BCS HCI 2011 Workshop: When Words Fail:  What can Music Interaction tell us about HCI?
Woven Sound (Alex McLean)
Real DJs Code Live (Robert Andrews, Wired, 7/3/06)
Visualization of Live Code (Alex McLean)
Voronoi diagrams of music (pdf)  (Alex McLean, 2006)
WebGL
Sylvester: Vector and Matrix Math for JavaScript
Generative Art Links (Mikael Hvidtfeldt Christensen)
Schacher, J.C. Motion to Gesture to Sound:  Mapping For Interactive Dance (pdf) Proceedings of the 2010 Conference on New Interfaces for Musical Expression (NIME 2010), Sydney, Australia
Code & Form:  Computational Aesthetics (Marius Watz)
Werghi, N. Extracting ordered patterns from a triangular mesh surface.  IEEE Potentials, Nov/Dec 2011
Last night I dreamt about haptic touch-screen overlays
Hyun-Seok Kim's 'Voronoi' dragonfly-wing-inspired superyacht

Hyun-Seok Kim's Voronoi Yacht

Nov 11, 2011

Revisiting Good Blogs: IDEUM update

Jim Spadaccini is the director of Ideum and is responsible for the Ideum blog, which is well worth reading if you are interested in creative use of multi-touch, gesture, and interactive multimedia applications.  Jim also serves as the principal investigator of the Open Exhibits project, a non-profit open source, multi-touch, multiuser software initiative, which includes a free software development kit (SDK) for museums, students, nonprofits, and educational use.


Here are a few links to posts from the Ideum blog:
Building a High-Resolution Multi-touch Wall: Part 1, Part 2, Part 3,  Part 4 and Part 5.
Ideum Blog
Events at The Tech and the Maxwell Museum
Open Exhibits, Year Two

The Open Exhibits Research and Design blog is worth reading, as it often includes Jim Spadaccini's words of wisdom and some interesting video clips.


More about IDEUM
Ideum will be involved in Building Interfaces for the New Decade, an upcoming day-long conference at the Tech Museum in San Jose, California, on Tuesday, November 15, 2011.

Ideum is working on interconnectivity between devices, as shown below, where people are connecting their iPhones to a multi-touch table:
Heist at the British Museum

Here is a video about Ideum's Tiny Drifters exhibit, a 7-foot multi-touch wall at the Monterey Bay Aquarium:




Here is a promotional video about Ideum's rugged MT-55 multi-touch table. It holds up to the interaction of thousands of museum visitors.  


Sep 16, 2011

MindHabits Online Demo: Useful Serious Game for Social Skills Group Activities



I'd like to share the online demo of MindHabits' suite of serious games, which I've found useful in my work with teens and young adults who need support in the area of social-emotional skills.


What I like about the online demo is that it adjusts to the player's responses. This feature made it fun to use during the last few social skills groups I facilitated at work, since it could be played by students with a range of cognitive abilities. I had students take turns playing the game using a SMARTboard, and found that all of the students paid attention to what was going on. In my opinion, using the interactive whiteboard supported "off-the-shoulder" learning among the students who were not at the board. 


MindHabits is available for Windows and Mac. The full version is just $19.99 and provides 100 game levels, tracks progress, and includes four games.


Here's some information from the company's website:  
"Based on social intelligence research conducted at McGill University, these stress busting, confidence boosting games use simple, fun-to-play exercises designed to help players develop and maintain a more positive state of mind." 
 "Based on the principles of social intelligence: Inhibition - uses game mechanics to promote positive habits; Association - connects personal info to positive feedback; Activation - uses personal references"






Jul 24, 2011

Video: Kinect SoundWall, links to info and code!





Here is information about the project from the KinetHacks SoundWall site:

"Kinect sound machines become prettier and easier with each development! The Kinect SoundWall is a drum beat music machine controlled by gestures and voice commands. This video by  displays this digital music machine at work and how through various gesture and voice commands, users can create awesome beats to dance to. In the video, the user gestures to to certain blocks in the screen in order to create a beat there or render the beats void. Through various voice commands, the beat can start, increase tempo, stop etc. Through the proper integration of both voice and gesture commands, the Kinect SoundWall sets the standard for a great and efficient sound machine of the Kinect!"
"For more information about the Kinect SoundWall visit the project’s website."
RELATED
Vertigo SoundWall CodePlex Project Site

Jul 7, 2011

Multimedia, Multi-Touch, and Visitor Participation at the Levine Museum of the New South

I had the pleasure of visiting the Levine Museum of the New South with my daughter, a history buff, and my 7-month-old grandbaby. It was his first trip to a museum, and he loved it - the pictures, the hands-on exhibits, and of course, the multimedia technology. Future historian?  Museum curator?  Digital archivist?  We'll see...


We had a chance to explore the Cotton Fields to Skyscrapers exhibit and Whirlwind of Growth, updated to take a look at the past decade and how recent events, such as rapid population growth in the region (69% between 1990, the year I moved to Charlotte, and 2010), the recession, and the financial crisis, have impacted the Charlotte region and the Carolinas.


I especially liked the New South TalkBack part of the exhibit. I think the TalkBack feature was originally part of the Changing Places exhibit, a multi-part project that focuses on how people in the Charlotte region deal with the growing cultural diversity and steady stream of newcomers who come from just about everywhere, and how newcomers experience their transition to the area.  The exhibit was unveiled in February of 2009, and due to positive feedback, was extended well past the planned ending date.  The exhibit's impact was extended to the virtual world through  the Changing Places Project website, as well as the museum's YouTube channel. Both are great places to explore. 


I'd like to share a bit more about the Levine Museum of the New South "experience" from content related to the Changing Places: From Black and White to Technicolor® exhibit:


This overview video of the Changing Places exhibit at the Levine Museum provides a good dose of civic pride.  I've lived in the Charlotte area since 1990 and have come to appreciate the rich tapestry that makes up our community, reflected in the short stories (and songs!) shared in the visitors' video clips.



Changing Places "Our History":

(I especially liked the uplifting background music in this video.)


Changing Places: "The Out of Towners" is a mashup of clips created by visitors to the Levine Museum who come from other places in the US and around the globe:



For more videos from Levine Museum visitors, take a look at the Levine Museum's YouTube channel.   The videos are much better than what I took using my cell phone video cam, below:


New South Talkback, Levine Museum of the New South




What I didn't like so much...


Below is a video of a multi-touch picture/info display, part of the exhibit.  I didn't like it very much, as the screen was very cluttered and the content was "jumpy" at the slightest touch. Given my interest in large displays, multi-touch, and gesture interaction, my expectations were pretty high:


REMIX HISTORY


Perhaps it is my civic duty to volunteer with a few other multi-touch techies to create a "redo" of this application!  


RELATED
The following links are to some of the organizations/people who work or have worked in partnership with the Levine Museum of the New South:
Darcie Fohrman, Exhibit Developer and Designer, Changing Places
Irene Morris , Graphic Design, Changing Places
Dr. Pamela Grundy, curator, Changing Places
Dr. Tom Hanchett, assistant curator, Changing Places
Sarah Bryan, researcher, Changing Places
New Granada Productions, edited video footage from the Talkback Booth at the Levine Museum, spanning the course of 14 months.
Studio Displays, Inc.  (Changing Places)
Brad Larson Media "Using technology to encourage family learning in museums, zoos, and other public places" (Changing Places)
Community Building Initiative
Emulsion Arts, Film production, Changing Places
Luquire George Andrews, Changing Places (PR, brand strategy, media planning, digital solutions, located in Charlotte, NC)
Nancy Pierce, Photographer, Changing Places
Michael Daul, Kaplan & Associates Cultural Resources, Online Curator, Website Design and Development, Changing Places


Video preview of the new section of the exhibit, including some shots of "Remix History":

Jul 6, 2011

Revisiting CHI 2011: Videos of Interactive Touch, Gesture, Large Surface, and Mobile Apps with Potential for Use in Education (CHI = Computer Human Interaction)

One of my interests is how the power and potential of post-WIMP interactive technologies can be harnessed for formal and informal education purposes, including life-long collaborative learning.


In May, I had a chance to meet with a number of like-minded people during the CHI 2011 conference at the 2nd Workshop on UI Technologies and Impact on Educational Pedagogy.  I was impressed with the depth and breadth of the presentations at the workshop.   Since then, I've been looking through other papers and videos from CHI 2011 to find interesting applications that hold potential for use in educational settings.  


I've come across a good number of interesting applications and prototypes, so be sure to check back for future posts on this topic.  For now, here are a few applications that I'd like to share.  


Below are a few videos from Phillip Chi-Wing Fu.  (He doesn't know it yet, but I've admitted his videos into the Post-WIMP Explorers' Club.)


Interactive Multi-touch Sketching Interface for Diffusion Curves

"A novel multi-touch sketching interface enabling interactive and practical design with 2D diffusion curves is proposed; featured interaction techniques include simultaneous sketching of multiple diffusion curves and at-the-spot colors tuning."


Distinguishing Multiple Smart-Phone Interactions on a Multi-touch Wall Display using Tilt Correlation

"This paper proposes a novel matching technique, called tilt correlation, which employs the built-in tilt sensor on smart-phones to identify their concurrent contacts on a common multi-touch wall display."


WYSIWYF: Exploring and Annotating Volume Data with a Tangible Handheld Device (CHI 2011)


"Integration of a multi-touch wall display with a tangible handheld device with multi-touch and tilt sensing capabilities to provide intuitive what-you-see-is-what-you-feel visual exploration and annotation of volume data."


The following videos were uploaded by alucero:


Pass-Them-Around: Collaborative Use of Mobile Phones for Photo Sharing (CHI 2011)

"Pass-Them-Around is a phone-based application that allows a small group of collocated people to share photos using the metaphor of passing paper photos around. The prototype encourages people to share their devices and use them interchangeably while discussing photos face-to-face. The prototype supports ad-hoc photo sharing in different contexts by taking into account the spatial arrangement of users around a table, measured with sensors embedded in their mobile phones."


The next video was part of MobileHCI '10:
MindMap: Collaborative Use of Mobile Phones for Brainstorming


Jun 17, 2011

In case you missed this: Microsoft Releases Kinect SDK Beta for PC

Kinect for Windows SDK Beta!   IT IS TRUE!!!!!


My Kinect and PC are waiting for my summer project.    What a great opportunity to "practice" programming over my 5-week summer break..... I already know C#, and I've done a little game programming (i.e., AI for Game Development, using XNA Game Studio Express - it has been a while).
Skeleton tracking image
-Photo credit: Microsoft Research


I have some cool ideas for basic games that might be good for the students I work with who have autism spectrum disorders... and some ideas that might be fun for my grand-baby.  I can't wait to have time to code again!   


Here's some info from the Microsoft Kinect for Windows SDK Beta website:


"The Kinect for Windows SDK beta is a programming toolkit for application developers. It enables the academic and enthusiast communities easy access to the capabilities offered by the Microsoft Kinect device connected to computers running the Windows 7 operating system."


"The Kinect for Windows SDK beta includes drivers, rich APIs for raw sensor streams and human motion tracking, installation documents, and resource materials. It provides Kinect capabilities to developers who build applications with C++, C#, or Visual Basic by using Microsoft Visual Studio 2010."


This SDK includes the following features:


Raw sensor streams
Access to raw data streams from the depth sensor, color camera sensor, and four-element microphone array enables developers to build upon the low-level streams that are generated by the Kinect sensor.
Skeletal tracking
The capability to track the skeleton image of one or two people moving within the Kinect field of view makes it easy to create gesture-driven applications.
Advanced audio capabilities
Audio processing capabilities include sophisticated acoustic noise suppression and echo cancellation, beam formation to identify the current sound source, and integration with the Windows speech recognition API.
Sample code and documentation
The SDK includes more than 100 pages of technical documentation. In addition to built-in help files, the documentation includes detailed walkthroughs for most samples provided with the SDK.
Easy installation
The SDK installs quickly, requires no complex configuration, and the complete installer size is less than 100 MB. Developers can get up and running in just a few minutes with a standard standalone Kinect sensor unit (widely available at retail outlets)."
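To give a flavor of what gesture-driven code on top of skeletal tracking might look like, here's a toy Python sketch. The joint names, coordinate convention, and data layout are all made up for illustration - the real SDK delivers skeleton frames through its C++/C#/Visual Basic APIs:

```python
def hands_above_head(joints):
    """Return True when both hand joints are higher than the head joint.
    `joints` maps a joint name to an (x, y, z) tuple; this sketch assumes
    y increases upward (the real SDK defines its own skeleton space)."""
    head_y = joints["head"][1]
    return (joints["hand_left"][1] > head_y and
            joints["hand_right"][1] > head_y)

# One made-up skeleton frame: a raised-arms pose.
frame = {"head": (0.0, 1.6, 2.0),
         "hand_left": (-0.3, 1.8, 2.0),
         "hand_right": (0.3, 1.9, 2.0)}
print(hands_above_head(frame))  # True
```

A real gesture recognizer would look at joint positions over many frames (and smooth out sensor noise), but even a per-frame pose test like this could drive a simple game for my students.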



Nicholas Kolakowski, Application Development News, 6/16/11


Apr 29, 2011

Musical Multitouch/Gestural Interfaces by Osmosis

I've been pleasantly surprised by the increase of interesting multi-touch and gesture-based applications developed for musical interaction on large displays.  This topic is dear to my heart -  I took a computer music technology class back in 2003, and think it would have been great if this sort of thing was available back then.  Who wants to point and click around a music app for hours on end?!


The video below shows what is new from Osmosis, a company based in NY that focuses on the development of multi-touch and gesture based software for a range of uses, including music applications.


Performance Systems for Stage/Studio from Osmosis on Vimeo.

Transparent Stage System Specs
Design
• Floating, transparent HD displays from 32"
• Haptic surface with tempered glass backing
• High gain image with wide viewing angle
• Rugged aircraft-grade aluminum build
• Enclosed projector and computer
• Minimalist style, compact footprint
• Disassembles for easy transportation


Interactivity
• Projected capacitive foil or IR bezel options
• Up to 32 simultaneous touch points
• Precise, responsive touch tracking (3mm)
• Immune to external light conditions
• Use of fingers, gloves or stylus


Technical
Display:
• DLP portable projector
• 1280×800 HD resolution
• 2500 ANSI-lumen, 1800:1 contrast
Computer:
• Mini-ITX, Core 2 Quad, 4GB RAM
• ATI Radeon X1250 graphics card
• Wireless keyboard and mouse
• Windows 7 Pro

Stuart McClean, the founder of Osmosis, shared the following information about his company:

"Osmosis is a consulting firm based in the NYC area with deep experience in interactive technology. Although we cater to a range of markets, we’re especially passionate about music production and performance. Working closely with artists, we build customized interactive systems for stage and studio. HCI technology is integrated into a range of designs including stands, desktop rigs, tables, carts and vertical screens. Interfaces are tailored to specific artist needs and combine controls, generative audio and visuals, instruments, and gestural input. Our unique and flexible systems take full advantage of multi-touch interaction and offer seamless control of Ableton Live, Traktor, or other DAWs via midi and OSC..."




For more information about the applications developed by the Osmosis team, take a look at their showcase page.  

Apr 26, 2011

Multi-touch and Gesture Interaction News and Updates You Might Have Missed (Part I)

Over the past couple of months, I've come across many interesting links related to multi-touch and gesture interaction, but I haven't had time to devote a thoughtful post to each one.  Part I is a nice collection of experimental, commercial, and non-commercial efforts by a variety of creative technologists, with a smattering of industry news that might be of interest to IMT readers.


Ideum's MT55 HD Multitouch Table 4/19/11

New MT55 HD Multitouch Table Now Shipping,  Jim Spadaccini, Ideum Blog 4/11/11

Smithsonian American Art Museum to Open Education Center, Sara Beladi, NBC Washington News, 4/4/11 (Rumor has it that the Smithsonian American Art Museum might include touch and multi-touch displays in its plans for a new education center.  The center was funded by an anonymous $8 million gift.)

Bill Buxton, Microsoft Research, 4/7/11 - Includes lots of pictures, links to videos, and more information about what might be the first touch-screen.  Also see Bill Buxton's companion website, Multi-Touch Systems that I have Known and Loved, updated on 3/21/11.  Bill Buxton knows all (almost!)


"The MTbiggie uses the "Front Diffused Illumination" multitouch technique, with ambient infrared light and a DIY infrared webcam. The MTbiggie is similar to the MTmini, but includes a projected image and infrared webcam (rather than a normal webcam)...The MTbiggie isn’t the most stable and robust setup, but it is the easiest to build. To see other methods of building more stable multitouch displays, view the full multitouch display list." -Seth Sandler

(Also check out NodeBeat, a multi-touch music/audio sequencer/generator app by Seth Sandler and Justin Windle)

Intuilab, 4/13/11
"IntuiLab, a global leader in surface computing software applications, today announced support for the revolutionary Microsoft Kinect device across its full line of IntuiFace products and solutions including IntuiFace Presentation and IntuiFace Commerce...Microsoft Kinect brings distant gesture control to interactive solutions. These gesture controls allow users to interact with displayed digital assets from a distance at their own pace and path – for example, browsing through a large quantity of products in a store catalog or manipulating 3D models (such as a mobile phone) – all without having to actually touch the screen..."  -IntuiLab (Take a look at the IntuiLab team- an interactive page!)




Sparkon:  Videos and links related to multi-touch and gesture-based applications



Official Kinect SDK to be Open Source, Josh Blake, Deconstructing the NUI, 4/18/11
(This bit of news excited me, but don't get your hopes up. If anyone knows what will happen with the Kinect SDK, please leave a comment.)
"Update 4/18 7:34pm: Mary Jo Foley picks up this story, but the Microsoft spokesperson she talked to denied that the Kinect SDK will be open source. As she notes, Microsoft has pulled 180's before regarding Kinect. After spokespeople initially were hostile to the idea of Kinect hacking, Xbox executives later embraced the idea that people are using Kinect for non-gaming purposes on the PC. Let's hope Microsoft stays open to this idea." -Josh Blake

Kenrick Kin, Tom Miller, Bjoern Bollensdorff, Tony DeRose, Bjoern Hartmann, Maneesh Agrawala (Pixar Online Library)

Flight Race Game on 3DFeel, lm3Labs, 4/18/11


JazzMutant Lemur Version 2: "The only multi-touch and modular controller for sequencers, synthesizers, virtual instruments, vjing and lights, now even better."


Harry van der Veen's Multitouch Blog (NUITEQ)


Stantum "Unlimited Multi-Touch" Latest News

At Immersive Labs, Ads Watch Who Looks at Them, Amy Lee, Huffington Post, 4/26/11

Immersive Labs

Hard Rock Cafe International Using NextWindow Touch Screens:  "Rock Wall Solo displays enhance music lovers' experience in Seattle, Dallas, Detroit and Berlin" 4/12/11 (Full press release pdf)
Music on Touch Screens (NextWindow)

Razorfish: Thoughts on MIX 11, James Ashley, Razorfish Blog, 4/20/11.  Also see: Razorfish Lab's Prototypes




"The multitouch microscope brings new dimensions into teaching and research. Researchers at the Institute for Molecular Medicine Finland (FIMM) and Multitouch Ltd have created a hand and finger gesture controlled microscope. The method is a combination between two technologies: web-based virtual microscopy and a giant size multitouch display."
"The result is an entirely new way of performing microscopy: by touching a table- or wall-sized screen the user can navigate and zoom within a microscope sample in the same way as in a conventional microscope. Using the touch control it is possible to move from the natural size of the sample to a 1000-fold magnification, at which cells and even subcellular details can be seen."  -Multitouchfi  Also see the Multitouch website.



Big Size Multitouch Display Turned into a Microscope, Microscopy-News, 3/28/11
Mac OS X 10.7 Lion: new multi-touch gestures, Dock integration for Expose, Launchpad, Mission Control, AppleInsider, 4/14/11


Vectorform App featured in Royal Caribbean's Video Promotion: James Brolin, Dean Cain get hands-on with Vectorform app Alison Weber, Vectorform Blog, 3/3/11


3M Touch Systems's YouTube Channel

Social Mirror 3D Gestural Display, Now Using Kinect:  SnibbeInteractive




Apr 22, 2011

Pervasive Retail Part I: Web UX Meets Retail CX - Screens Large and Small at the Mall, Revisited

If you follow my blog(s), you know that I have a passion for interactive displays in public spaces, and that I enjoy watching how various technologies converge, jump across platforms and devices, inter-operate, and re-purpose over time.  

The best places for watching this unfold, in my opinion, are airports, malls, shopping districts, and larger "big box" establishments, where the Web meets Digital Out of Home (DOOH), old-fashioned kiosks morph into multi-touch screens and gesture-based windows, and visual merchandising meets technology, digital culture, architecture, and consumer metrics.  At the center of it all is the user/consumer - regular people, moms, dads, kids, teens, the elderly, the disabled, the hurried and the worried.  Adding to the complexity is that an increasing number of people who are out-and-about are tethered to various mobile devices.

In scholarly tech circles, the concept of DOOH is known as "Pervasive Retail".  The explosion of mobile devices and ubiquitous screens has fueled the fire for research, and is the focus of the current issue of IEEE Pervasive Computing.

Despite the influx of technology, no-one is exactly sure how to do it quite right.  (I have some ideas, which I'll save for a future post.)

If you are interested in learning more about concepts related to "pervasive retail", the Retail Customer Experience website is a treasure trove of information related to DOOH, digital signage, multi-channel retailing, in-store media, kiosks, interactive touch screens and windows, related metrics, and more, with stories about real-life technology implementation.


Mall Video
The following video, taken with my handy HTC Incredible, provides a quick sampling of the screens I encountered during a recent visit to South Park Mall, in Charlotte, N.C.  The last screens in the clip were taken in the Brookstone store, and will be included in another clip that focuses solely on all of the screens that were scattered about the retail space.  


I have a hunch that some of the smaller displays in the Brookstone store were iPads.  iPads and tablets have great potential for use for shelf-level in-store interactive visual merchandising deployments, given the right apps and mounting systems. (See iPads as Cheap Digital Signage, by Tony Hymes of DOOHSocial and the video about Premier's iPad mounts, for more information.)

Much of what you'll see in the following video, taken at the same mall in December of 2009, wasn't around during my most recent trip:
Screens Large and Small at the Mall

Interactive Coke Machine and Kid at the Mall












I was sad to see that the interactive screen on the Coke machine had been replaced by an ordinary one.  Part of the problem, I think, is that the interactive display was too busy and, as a consequence, made the goal of getting a quick drink a bit too complicated for the average thirsty customer, as seen in the video below:


Touch Screen Coke Machine at the Mall: 90 seconds to get a coke!

RELATED

Previous Posts:

References and Resources (Partial List)
Ron Brunt, InTouch with Retailing Whitepaper, 1/15/06
Brian Monahan, IPG Emerging Media Blog, 4/15/11
When all the world is a screen (The video is worth taking the time to watch.)
Narayanswami, C., Kruger, A., Marmasse, N. Pervasive Retail, IEEE Pervasive Computing, April-June 2011 (Vol. 10, No. 2), pp. 16-18
References from the Pervasive Retail article:
Mobile Retail Blueprint, Nat'l Retail Federation; www.nrf.com/modules.php?name=Pages&op=viewlive&sp_id=1268
G. Belkin, Pervasive Retail Business Intelligence, Aberdeen Group, Apr. 2010; www.slideshare.net/AxiomConsultingAustralia/pervasive-retail-business-intelligence
R. Wasinger, A. Krüger, and O. Jacobs, "Integrating Intra and Extra Gestures into a Mobile and Multimodal Shopping Assistant," Proc. 3rd Int'l Conf. Pervasive Computing (Pervasive), Springer, 2005, pp. 297-314.
A. Meschtscherjakov et al., "Enhanced Shopping: A Dynamic Map in a Retail Store," Proc. 10th Int'l Conf. Ubiquitous Computing (UbiComp 08), ACM Press, 2008, pp. 336-339.
C. Stahl and J. Haupert, "Taking Location Modelling to New Levels: A Map Modelling Toolkit for Intelligent Environments," Proc. Int'l Workshop Location- and Context-Awareness (LoCA), LNCS 3987, Springer, 2006, pp. 74-85.