I had the pleasure of visiting the Levine Museum of the New South with my daughter, a history buff, and my 7 month old grandbaby. It was his first trip to a museum, and he loved it - the pictures, the hands-on exhibits, and of course, the multimedia technology. Future historian? Museum curator? Digital archivist? We'll see...
We had a chance to explore the Cotton Fields to Skyscrapers exhibit and its Whirlwind of Growth section, updated to take a look at the past decade and at how recent events, such as the region's rapid population growth (69% between 1990, the year I moved to Charlotte, and 2010) and the recession and financial crisis, have affected the Charlotte region and the Carolinas.
I especially liked the New South TalkBack part of the exhibit. I think the TalkBack feature was originally part of the Changing Places exhibit, a multi-part project that focuses on how people in the Charlotte region deal with the growing cultural diversity and steady stream of newcomers who come from just about everywhere, and how newcomers experience their transition to the area. The exhibit was unveiled in February of 2009, and due to positive feedback, was extended well past the planned ending date. The exhibit's impact was extended to the virtual world through the Changing Places Project website, as well as the museum's YouTube channel. Both are great places to explore.
I'd like to share a bit more about the Levine Museum of the New South "experience" from content related to the Changing Places: From Black and White to Technicolor® exhibit:
This overview video of the Changing Places exhibit at the Levine Museum provides a good dose of civic pride. I've lived in the Charlotte area since 1990 and have come to appreciate the rich tapestry that makes up our community, reflected in the short stories (and songs!) shared in the visitors' video clips.
Changing Places "Our History":
(I especially liked the uplifting background music in this video.)
Changing Places: "The Out of Towners" is a mashup of clips created by visitors to the Levine Museum who come from other places in the US and around the globe:
For more videos from Levine Museum visitors, take a look at the Levine Museum's YouTube channel. The videos are much better than what I took using my cell phone video cam, below:
New South Talkback, Levine Museum of the New South
What I didn't like so much...
Below is a video of a multi-touch picture/info display, part of the exhibit. I didn't like it much: the screen was cluttered, and the content was "jumpy" at the slightest touch. Given my interest in large displays, multi-touch, and gesture interaction, my expectations were pretty high:
REMIX HISTORY
Perhaps it is my civic duty to volunteer with a few other multi-touch techies to create a "redo" of this application!
RELATED
The following links are to some of the organizations/people who work or have worked in partnership with the Levine Museum of the New South:
Darcie Fohrman, Exhibit Developer and Designer, Changing Places
Irene Morris, Graphic Design, Changing Places
Dr. Pamela Grundy, curator, Changing Places
Dr. Tom Hanchett, assistant curator, Changing Places
Sarah Bryan, researcher, Changing Places
New Granada Productions, edited video footage from the Talkback Booth at the Levine Museum, spanning 14 months.
Studio Displays, Inc. (Changing Places)
Brad Larson Media "Using technology to encourage family learning in museums, zoos, and other public places" (Changing Places)
Community Building Initiative
Emulsion Arts, Film production, Changing Places
Luquire George Andrews, Changing Places (PR, brand strategy, media planning, digital solutions, located in Charlotte, NC)
Nancy Pierce, Photographer, Changing Places
Michael Daul, Kaplan & Associates Cultural Resources, Online Curator, Website Design and Development, Changing Places
Video preview of the new section of the exhibit, including some shots of "Remix History":
Jul 7, 2011
Multimedia, Multi-Touch, and Visitor Participation at the Levine Museum of the New South
Posted by
Lynn Marentette
Jul 6, 2011
Revisiting CHI 2011: Videos of Interactive Touch, Gesture, Large Surface, and Mobile Apps with Potential for Use in Education (CHI = Computer Human Interaction)
One of my interests is how the power and potential of post-WIMP interactive technologies can be harnessed for formal and informal education, including life-long collaborative learning.
In May, I had a chance to meet with a number of like-minded people during the CHI 2011 conference at the 2nd Workshop on UI Technologies and Impact on Educational Pedagogy. I was impressed with the depth and breadth of the presentations at the workshop. Since then, I've been looking through other papers and videos from CHI 2011 to find interesting applications that hold potential for use in educational settings.
I've come across a good number of interesting applications and prototypes, so be sure to check back for future posts on this topic. For now, here are a few applications that I'd like to share.
Below are a few videos from Phillip Chi-Wing Fu. (He doesn't know it yet, but I've admitted his videos into the Post-WIMP Explorers' Club.)
Interactive Multi-touch Sketching Interface for Diffusion Curves
"A novel multi-touch sketching interface enabling interactive and practical design with 2D diffusion curves is proposed; featured interaction techniques include simultaneous sketching of multiple diffusion curves and at-the-spot colors tuning."
Distinguishing Multiple Smart-Phone Interactions on a Multi-touch Wall Display using Tilt Correlation
"This paper proposes a novel matching technique, called tilt correlation, which employs the built-in tilt sensor on smart-phones to identify their concurrent contacts on a common multi-touch wall display."
WYSIWYF: Exploring and Annotating Volume Data with a Tangible Handheld Device (CHI 2011)
"Integration of a multi-touch wall display with a tangible handheld device with multi-touch and tilt sensing capabilities to provide intuitive what-you-see-is-what-you-feel visual exploration and annotation of volume data."
The following videos were uploaded by alucero:
Pass-Them-Around: Collaborative Use of Mobile Phones for Photo Sharing (CHI 2011)
"Pass-Them-Around is a phone-based application that allows a small group of collocated people to share photos using the metaphor of passing paper photos around. The prototype encourages people to share their devices and use them interchangeably while discussing photos face-to-face. The prototype supports ad-hoc photo sharing in different contexts by taking into account the spatial arrangement of users around a table, measured with sensors embedded in their mobile phones."
The next video was part of MobileHCI '10:
MindMap: Collaborative Use of Mobile Phones for Brainstorming
Posted by
Lynn Marentette
Labels:
#CHI2011,
CHI 2011,
education,
gesture,
HCI,
interaction,
learning,
mobile,
multitouch,
NUI,
pedagogy,
post-WIMP,
teaching,
UI technology
Jun 17, 2011
In case you missed this: Microsoft Releases Kinect SDK Beta for PC
Kinect for Windows SDK Beta! It's true!
My Kinect and PC are waiting for my summer project. What a great opportunity to "practice" programming over my 5-week summer break. I already know C#, and I've done a little game programming (e.g., working through AI for Game Development using XNA Game Studio Express, though it has been a while).

-Photo credit: Microsoft Research
I have some cool ideas for basic games that might be good for the students I work with who have autism spectrum disorders... and some ideas that might be fun for my grandbaby. I can't wait to have time to code again!
Here's some info from the Microsoft Kinect for Windows SDK Beta website:
"The Kinect for Windows SDK beta is a programming toolkit for application developers. It enables the academic and enthusiast communities easy access to the capabilities offered by the Microsoft Kinect device connected to computers running the Windows 7 operating system."
"The Kinect for Windows SDK beta includes drivers, rich APIs for raw sensor streams and human motion tracking, installation documents, and resource materials. It provides Kinect capabilities to developers who build applications with C++, C#, or Visual Basic by using Microsoft Visual Studio 2010."
This SDK includes the following features:
Raw sensor streams
Access to raw data streams from the depth sensor, color camera sensor, and four-element microphone array enables developers to build upon the low-level streams that are generated by the Kinect sensor.
Skeletal tracking
The capability to track the skeleton image of one or two people moving within the Kinect field of view make it easy to create gesture-driven applications.
Advanced audio capabilities
Audio processing capabilities include sophisticated acoustic noise suppression and echo cancellation, beam formation to identify the current sound source, and integration with the Windows speech recognition API.
Sample code and documentation
The SDK includes more than 100 pages of technical documentation. In addition to built-in help files, the documentation includes detailed walkthroughs for most samples provided with the SDK.
Easy installation
The SDK installs quickly, requires no complex configuration, and the complete installer size is less than 100 MB. Developers can get up and running in just a few minutes with a standard standalone Kinect sensor unit (widely available at retail outlets)."
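Skeletal tracking is the feature that makes gesture-driven applications so approachable. As a flavor of what a skeleton stream enables, here's a tiny sketch in Python of a "hand raised" check over joint positions. The joint names and coordinates are invented for illustration; this is not the SDK's actual API, just the shape of the logic you'd write on top of it.

```python
# Toy "hand raised" gesture check over skeletal-tracking-style joint data.
# Joint names and coordinates here are invented for illustration; the real
# SDK exposes a stream of tracked joints per player.

def hand_raised(joints, hand="hand_right"):
    """A hand counts as raised when it is above the head joint.
    Screen-style coordinates: smaller y means higher on screen."""
    return joints[hand][1] < joints["head"][1]

skeleton = {
    "head":       (0.50, 0.20),
    "hand_right": (0.65, 0.10),  # above the head, so raised
    "hand_left":  (0.35, 0.60),  # down by the waist
}
print(hand_raised(skeleton))                    # True
print(hand_raised(skeleton, hand="hand_left"))  # False
```

Simple threshold checks like this are a good starting point for the kinds of cause-and-effect games I have in mind for my students.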
Nicholas Kolakowski, Application Development News, 6/16/11
Posted by
Lynn Marentette
Apr 29, 2011
Musical Multitouch/Gestural Interfaces by Osmosis
I've been pleasantly surprised by the increase in interesting multi-touch and gesture-based applications developed for musical interaction on large displays. This topic is dear to my heart: I took a computer music technology class back in 2003, and think it would have been great if this sort of thing had been available back then. Who wants to point and click around a music app for hours on end?!
The video below shows what is new from Osmosis, a company based in NY that focuses on the development of multi-touch and gesture-based software for a range of uses, including music applications.
Performance Systems for Stage/Studio from Osmosis on Vimeo.
Transparent Stage System Specs
Design
• Floating, transparent HD displays from 32"
• Haptic surface with tempered glass backing
• High gain image with wide viewing angle
• Rugged aircraft-grade aluminum build
• Enclosed projector and computer
• Minimalist style, compact footprint
• Disassembles for easy transportation
Interactivity
• Projected capacitive foil or IR bezel options
• Up to 32 simultaneous touch points
• Precise, responsive touch tracking (3mm)
• Immune to external light conditions
• Use of fingers, gloves or stylus
Technical
Display:
• DLP portable projector
• 1280×800 HD resolution
• 2500 ANSI-lumen, 1800:1 contrast
Computer:
• Mini-ITX, Core 2 Quad, 4GB RAM
• ATI Radeon X1250 graphics card
• Wireless keyboard and mouse
• Windows 7 Pro
Stuart McClean, the founder of Osmosis, shared the following information about his company:
"Osmosis is a consulting firm based in the NYC area with deep experience in interactive technology. Although we cater to a range of markets, we’re especially passionate about music production and performance. Working closely with artists, we build customized interactive systems for stage and studio. HCI technology is integrated into a range of designs including stands, desktop rigs, tables, carts and vertical screens. Interfaces are tailored to specific artist needs and combine controls, generative audio and visuals, instruments, and gestural input. Our unique and flexible systems take full advantage of multi-touch interaction and offer seamless control of Ableton Live, Traktor, or other DAWs via midi and OSC..."



For more information about the applications developed by the Osmosis team, take a look at their showcase page.
Posted by
Lynn Marentette
Labels:
creative,
gesture,
HCI,
interactive,
multi-touch,
music,
NUI,
Osmosis,
performance systems,
Stage/Studio,
Stuart McClean,
technology,
video
Apr 26, 2011
Multi-touch and Gesture Interaction News and Updates You Might Have Missed (Part I)
Over the past couple of months, I've come across many interesting links related to multi-touch and gesture interaction, but I haven't had time to devote a thoughtful post to each one. "Part I" is a nice collection of experimental, commercial, and non-commercial efforts by a variety of creative technologists, with a smattering of industry news that might be of interest to IMT readers.
Ideum's MT55 HD Multitouch Table 4/19/11
New MT55 HD Multitouch Table Now Shipping, Jim Spadaccini, Ideum Blog 4/11/11
Smithsonian American Art Museum to Open Education Center Sara Beladi, NBC Washington News, 4/4/11 (Rumor has it that the Smithsonian American Art Museum might include touch and multi-touch displays in its plans for a new education center. The center was funded by an anonymous $8 million gift.)
"The MTbiggie uses the "Front Diffused Illumination" multitouch technique, with ambient infrared light and a DIY infrared webcam. The MTbiggie is similar to the MTmini, but includes a projected image and infrared webcam (rather than a normal webcam)...The MTbiggie isn’t the most stable and robust setup, but it is the easiest to build. To see other methods of building more stable multitouch displays, view the full multitouch display list." -Seth Sandler
Mac OS X 10.7 Lion: new multi-touch gestures, Dock integration for Expose, Launchpad, Mission Control Appleinsider, 4/14/11
Vectorform App featured in Royal Caribbean's Video Promotion: James Brolin, Dean Cain get hands-on with Vectorform app Alison Weber, Vectorform Blog, 3/3/11
3M Touch Systems' YouTube Channel
Social Mirror 3D Gestural Display, Now Using Kinect: SnibbeInteractive
Bill Buxton, Microsoft Research, 4/7/11 - Includes lots of pictures, links to videos, and more information about what might be the first touch screen. Also see Bill Buxton's companion website, Multi-Touch Systems that I have Known and Loved, updated 3/21/11. Bill Buxton knows all (almost!)
How to make a Multitouch Biggie (Seth Sandler)
"The MTbiggie uses the "Front Diffused Illumination" multitouch technique, with ambient infrared light and a DIY infrared webcam. The MTbiggie is similar to the MTmini, but includes a projected image and infrared webcam (rather than a normal webcam)...The MTbiggie isn’t the most stable and robust setup, but it is the easiest to build. To see other methods of building more stable multitouch displays, view the full multitouch display list." -Seth Sandler
(Also check out NodeBeat, a multi-touch music/audio sequencer/generator app by Seth Sandler and Justin Windle)
Sparkon: Videos and links related to multi-touch and gesture-based applications
"The multitouch microscope brings new dimensions into teaching and research. Researchers at the Institute for Molecular Medicine Finland (FIMM) and Multitouch Ltd have created a hand and finger gesture controlled microscope. The method is a combination between two technologies: web-based virtual microscopy and a giant size multitouch display."
"The result is an entirely new way of performing microscopy: by touching a table- or wall-sized screen the user can navigate and zoom within a microscope sample in the same way as in a conventional microscope. Using the touch control it is possible to move from the natural size of the sample to a 1000-fold magnification, at which cells and even subcellular details can be seen." -Multitouchfi Also see the Multitouch website.
Big Size Multitouch Display Turned into a Microscope, Microscopy-News, 3/28/11
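The 1x-to-1000x zoom range on that multitouch microscope is easy to picture as a pinch gesture mapped multiplicatively onto magnification. Here's a toy sketch (my own illustration, not FIMM's code): each pinch scales the current magnification by the ratio of finger spreads, clamped to the range described above.

```python
# Toy pinch-to-zoom magnification control (my illustration, not FIMM's
# code). Each pinch scales magnification by the ratio of finger spreads,
# clamped to the 1x..1000x range described in the quote above.

def apply_pinch(magnification, spread_before, spread_after,
                min_mag=1.0, max_mag=1000.0):
    new_mag = magnification * (spread_after / spread_before)
    return max(min_mag, min(max_mag, new_mag))

mag = 1.0
for before, after in [(100, 300), (120, 480), (50, 400)]:  # three pinches
    mag = apply_pinch(mag, before, after)
print(mag)  # 3.0 * 4.0 * 8.0 = 96.0
```

The multiplicative mapping is what makes the interaction feel like a conventional microscope: each pinch changes magnification by the same factor no matter what level you start from.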
Intuilab, 4/13/11
"IntuiLab, a global leader in surface computing software applications, today announced support for the revolutionary Microsoft Kinect device across its full line of IntuiFace products and solutions including IntuiFace Presentation and IntuiFace Commerce...Microsoft Kinect brings distant gesture control to interactive solutions. These gesture controls allow users to interact with displayed digital assets from a distance at their own pace and path – for example, browsing through a large quantity of products in a store catalog or manipulating 3D models (such as a mobile phone) – all without having to actually touch the screen..." -IntuiLab (Take a look at the IntuiLab team- an interactive page!)
EON Reality Launches EON Interact - Interact with 3D Data by Using Your Bare Hands Animation xPress, 3/31/11
Official Kinect SDK to be Open Source, Josh Blake, Deconstructing the NUI, 4/18/11
(This bit of news excited me, but don't get your hopes up. If anyone knows what will happen with the Kinect SDK, please leave a comment.)
"Update 4/18 7:34pm: Mary Jo Foley picks up this story, but the Microsoft spokesperson she talked to denied that the Kinect SDK will be open source. As she notes, Microsoft has pulled 180’s before regarding Kinect. After spokespeople initially were hostile to the idea of Kinect hacking, Xbox executives later embraced the idea that people are using Kinect for non-gaming purposes on the PC. Let’s hope Microsoft stays open to this idea." -Josh Blake
New Channel 9 Video: Josh Blake on NaturalShow Multi-touch in WPF, Pete Brown, 10rem.net, 4/22/11
Kenrick Kin, Tom Miller, Bjoern Bollensdorff, Tony DeRose, Bjoern Hartmann, Maneesh Agrawala (Pixar Online Library)
Flight Race Game on 3DFeel lm3Labs, 4/18/11
JazzMutant Lemur Version 2: "The only multi-touch and modular controller for sequencers, synthesizers, virtual instruments, vjing and lights, now even better."
Harry van der Veen's Multitouch Blog (NUITEQ)
Stantum "Unlimited Multi-Touch" Latest News
At Immersive Labs, Ads Watch Who Looks at Them Amy Lee, Huffington Post, 4/26/11
Immersive Labs
Hard Rock Cafe International Using NextWindow Touch Screens: "Rock Wall Solo displays enhance music lovers' experience in Seattle, Dallas, Detroit and Berlin" 4/12/11 (Full press release pdf)
Music on Touch Screens (NextWindow)
Razorfish: Thoughts on MIX 11, James Ashley, Razorfish Blog, 4/20/11 Also see: Razorfish Lab's Prototypes
"The multitouch microscope brings new dimensions into teaching and research. Researchers at the Institute for Molecular Medicine Finland (FIMM) and Multitouch Ltd have created a hand and finger gesture controlled microscope. The method is a combination between two technologies: web-based virtual microscopy and a giant size multitouch display."
"The result is an entirely new way of performing microscopy: by touching a table- or wall-sized screen the user can navigate and zoom within a microscope sample in the same way as in a conventional microscope. Using the touch control it is possible to move from the natural size of the sample to a 1000-fold magnification, at which cells and even subcellular details can be seen." -Multitouchfi Also see the Multitouch website.
Big Size Multitouch Display Turned into a Microscope, Microscopy-News, 3/28/11
Posted by
Lynn Marentette
Labels:
3M touch,
ideum,
immersive labs,
Intuilab,
Josh Blake,
kinect,
lm3labs,
microscope,
multitouch,
MultitouchFi,
NextWindow,
nodebeat,
NUI,
razorfish,
vectorform
Apr 22, 2011
Pervasive Retail Part I: Web UX Meets Retail CX - Screens Large and Small at the Mall, Revisited
If you follow my blog(s), you know that I have a passion for interactive displays in public spaces, and that I enjoy watching how various technologies converge, jump across platforms and devices, inter-operate, and re-purpose over time.

The best places for watching this unfold, in my opinion, are airports, malls, shopping districts, and larger "big box" establishments, where the Web meets Digital Out of Home (DOOH), old-fashioned kiosks morph into multi-touch screens and gesture-based windows, and visual merchandising meets technology, digital culture, architecture, and consumer metrics. At the center of it all is the user/consumer - regular people, moms, dads, kids, teens, the elderly, the disabled, the hurried and the worried. Adding to the complexity is that an increasing number of people who are out-and-about are tethered to various mobile devices.
In scholarly tech circles, the concept of DOOH is known as "Pervasive Retail". The explosion of mobile devices and ubiquitous screens has fueled the fire for research, and it is the focus of the current issue of IEEE Pervasive Computing.

Despite the influx of technology, no one is exactly sure how to do it quite right. (I have some ideas, which I'll save for a future post.)
If you are interested in learning more about concepts related to "pervasive retail", the Retail Customer Experience website is a treasure trove of information related to DOOH, digital signage, multi-channel retailing, in-store media, kiosks, interactive touch screens and windows, related metrics, and more, with stories about real-life technology implementation.
Mall Video
The following video, taken with my handy HTC Incredible, provides a quick sampling of the screens I encountered during a recent visit to South Park Mall, in Charlotte, N.C. The last screens in the clip were taken in the Brookstone store, and will be included in another clip that focuses solely on all of the screens that were scattered about the retail space.
I have a hunch that some of the smaller displays in the Brookstone store were iPads. iPads and tablets have great potential for use for shelf-level in-store interactive visual merchandising deployments, given the right apps and mounting systems. (See iPads as Cheap Digital Signage, by Tony Hymes of DOOHSocial and the video about Premier's iPad mounts, for more information.)
Much of what you'll see in the following video, taken at the same mall in December of 2009, wasn't around during my most recent trip:
Screens Large and Small at the Mall
Interactive Coke Machine and Kid at the Mall
I was sad to see that the interactive screen on the Coke machine had been replaced by an ordinary one. Part of the problem, I think, is that the interactive display was too busy and, as a consequence, made the simple goal of getting a quick drink a bit too complicated for the average thirsty customer, as seen in the video below:
Touch Screen Coke Machine at the Mall: 90 seconds to get a coke!
RELATED
Previous Posts:
Window Shopping in the Web Outside: Interactive Window Displays (videos and links)
Video: Gesture Interactive Window Display at Repetto Store in France, by the Marcel Agency
Interactions (ACM) Cover "Proximic Interactions: The New Ubicomp?"; Plus - Close encounters with displays at the airport and JC Penney. (Briefly touches on the topic of accessibility/usability/universal design of displays in public spaces, an area in need of further research.)
References and Resources (Partial List)
Ron Brunt, InTouch with Retailing Whitepaper, 1/15/06
Brian Monahan, IPG Emerging Media Blog, 4/15/11
When all the world is a screen (The video is worth taking the time to watch.)
Narayanaswami, C., Krüger, A., and Marmasse, N., "Pervasive Retail," IEEE Pervasive Computing, vol. 10, no. 2, April-June 2011, pp. 16-18.
References from the Pervasive Retail article:
Mobile Retail Blueprint, Nat'l Retail Federation; www.nrf.commodules.php?name=Pages&op=viewlive&sp_id=1268 .
G. Belkin, Pervasive Retail Business Intelligence, Aberdeen Group, Apr. 2010; www.slideshare.net/AxiomConsultingAustralia pervasive-retail-business-intelligence .
R. Wasinger, A. Krüger, and O. Jacobs, "Integrating Intra and Extra Gestures into a Mobile and Multimodal Shopping Assistant," Proc. 3rd Int'l Conf. Pervasive Computing (Pervasive), Springer, 2005, pp. 297–314.
A. Meschtscherjakov et al., "Enhanced Shopping: A Dynamic Map in a Retail Store," Proc. 10th Int'l Conf. Ubiquitous Computing (UbiComp 08), ACM Press, 2008, pp. 336–339.
C. Stahl and J. Haupert, "Taking Location Modelling to New Levels: A Map Modelling Toolkit for Intelligent Environments," Proc. Int'l Workshop Location- and Context-Awareness (LoCA), LNCS 3987, Springer, 2006, pp. 74–85.
Cross-posted: The World Is My Interactive Interface blog
Posted by Lynn Marentette
Labels:
digital signage,
displays,
DOOH,
in-store marketing,
interactive,
mall,
media,
multimedia,
pervasive retail,
resources,
Retail CX,
screens,
technology,
UX,
video
Apr 19, 2011
Gesture + Multitouch Interactive Media Wall, Wisconsin Institutes for Discovery, 2011 Award of Excellence, Digital Screenmedia Association
Below is a video of an interactive media wall at the Wisconsin Institutes for Discovery, winner of the 2011 Award of Excellence from the Digital Screenmedia Association in the category of Best Government/Education/Non-Profit Agency Deployment.
Detailed information about the Discovery Wall, including an overview of the technology, objectives of the deployment, and the positive outcomes of this project, can be found on the Digital Screenmedia website.
"Float4 Interactive is a creative technology company that develops interactive systems for entertainment, advertising and design applications."
Credits
Interactive technology: Float4 Interactive
Content: Zebra Dog
AV Integration: R2W
Concept: Sensory Interactive
DISCOVERY WALL: 10 X 4 Christie MicroTiles
5496 X 1650 resolution
Gesture + MultiTouch Interaction
2 X RealMotion Servers
ENTRANCE PORTAL 17 X 2 Christie MicroTiles
9824 X 866 resolution
Multi-Zone Gesture Interaction
2 X RealMotion Servers
RELATED/SOMEWHAT RELATED
Wisconsin Institutes for Discovery: 2011 Excellence Award, Architainment
Live Design, 3/21/10
Architainment Examples
Christie MicroTiles Drive Fan Experience at new Nascar Hall of Fame
Nascar Hall of Fame (I live in the Charlotte area, so I'll visit the Nascar Hall of Fame soon and check out the Christie MicroTiles display!)
Miami Dolphins Choose Montreal's Arsenal Media and Float4 Interactive for Fan-Focused Interactive Wall at SunLife Stadium (BuzzWall) Arsenal Media, 12/16/10
Apr 9, 2011
"Playful Interface Cultures" exhibit video showreel, just released (Ars Electronica Festival 2010)
Playful Interface Cultures from Interface Culture on Vimeo.
"This video showreel provides a walkthrough through the "Playful Interface Cultures" exhibition at the Ars Electronica Festival 2010. The works shown were created within the Master and Ph.D. program at the Interface Culture Lab, University of Art and Industrial Design in Linz, Austria." http://www.interface.ufg.ac.at/
"The exhibition design has been based on traditional Austrian tobacco brands in reference to the historic venue of the former tobacco factory built by the German designer and architect Peter Behrens." -Interface Culture Lab
Instructors:
Georg Russegger
Martin Kaltenbrunner
Michaela Ortner
Varvara Guljajeva
Design
Vesela Milhaylova
Henning Schulze
Production
Reinhard Gupfinger
Technical Support
Mar Canet
Anika Hirt
University of Art and Industrial Design, Linz
SOMEWHAT RELATED
TuioKinect, by Martin Kaltenbrunner: "A simple Tuio hand gesture tracker for Kinect"
Reactable
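For readers curious about what a "Tuio hand gesture tracker" actually sends: TUIO is an OSC-based protocol, and trackers like TuioKinect broadcast each tracked hand as a 2D cursor message on the /tuio/2Dcur address, per the TUIO 1.1 specification ("alive" lists active session ids, "set" carries a session id, normalized x/y position, x/y velocity, and motion acceleration, and "fseq" carries the frame number). Here is a minimal sketch of decoding those argument lists; the helper name `parse_2dcur` is my own, not part of any TUIO library:

```python
# Sketch: decode the argument list of one TUIO 1.1 /tuio/2Dcur OSC message,
# the profile a Kinect-based hand tracker would use for tracked hands.
# Field layout follows the TUIO 1.1 spec; function name is hypothetical.

def parse_2dcur(args):
    """Turn a /tuio/2Dcur argument tuple into a structured dict."""
    kind = args[0]
    if kind == "alive":
        # Remaining args are the session ids of currently tracked cursors.
        return {"type": "alive", "session_ids": list(args[1:])}
    if kind == "set":
        # session id, position (0..1 normalized), velocity, motion accel.
        s, x, y, vx, vy, m = args[1:7]
        return {"type": "set", "session_id": s,
                "pos": (x, y), "vel": (vx, vy), "accel": m}
    if kind == "fseq":
        # Frame sequence number, used to group messages into one update.
        return {"type": "fseq", "frame": args[1]}
    raise ValueError(f"unknown /tuio/2Dcur message: {kind!r}")
```

In a real client these tuples would arrive from an OSC listener; the sketch only shows the message structure that tools like TuioKinect and the Reactable's reacTIVision tracker share.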
Apr 3, 2011
INTERACTIVE TABLETOPS AND SURFACES 2011 CALL FOR PAPERS AND NOTES (6th Annual ACM ITS 2011 - Nov. 13-16)
6th Annual ACM Conference on Interactive Tabletops and Surfaces 2011
ITS 2011
November 13-16, 2011
Portopia Hotel, Kobe, Japan
The Interactive Tabletops and Surfaces 2011 Conference (ITS) is a premier venue for presenting research in the design and use of new and emerging tabletop and interactive surface technologies. As a new community, we embrace the growth of the discipline in a wide variety of areas, including innovations in ITS hardware, software, and design, and projects expanding our understanding of the design considerations of ITS technologies and of their applications.
Building on its success in previous years, ITS again features Papers and Notes presentations, as well as tutorials, posters, a demonstrations track, and a doctoral symposium. ITS 2011 will also include workshops.
ITS 2011 will bring together top researchers and practitioners who are interested in both the technical and human aspects of ITS technology. On behalf of the conference organizing committee, we invite you to begin planning your submissions and participation for this year's conference!
CALL FOR PARTICIPATION: PAPERS AND NOTES
The use of interactive surfaces is an exciting and emerging research area. Display technologies, such as projectors, LCD and OLED flat panels, and even flexible display substrates, coupled with input sensors capable of enabling direct interaction, make it reasonable to envision a not-so-distant future in which many of the common surfaces in our environment will function as digital interactive displays. ITS brings together researchers and practitioners from a variety of backgrounds and interests, such as camera and projector based systems, new display technologies, multi-touch sensing, user interface technologies, augmented reality, computer vision, multimodal interaction, novel input and sensing technologies, computer supported cooperative work (CSCW), and information visualization.
The intimate size of this single-track symposium provides an ideal venue for leading researchers and practitioners to exchange research results and experiences. We encourage submissions on (but not limited to) the following topic areas as they relate to interactive tabletops and surfaces:
* Gesture-based interfaces
* Multi-modal interfaces
* Tangible interfaces
* Novel interaction techniques
* Data handling/exchange on large interactive surfaces
* Data presentation on large interactive surfaces
* Software engineering methods
* Computer supported collaborative work
* Middleware and network support
* Virtual reality and augmented reality
* Social protocols
* Information visualizations
* Hardware, including sensing and input technologies with novel capabilities
* Human-centered design & methodologies
* Applications
CONFERENCE COMMITTEE
Conference Co-Chairs
Jun Rekimoto, The University of Tokyo / SonyCSL, Japan
Hideki Koike, University of Electro-Communications, Japan
Kentaro Fukuchi, Meiji University, Japan
Program Co-Chairs
Yoshifumi Kitamura, Tohoku University, Japan
Daniel Wigdor, University of Toronto, Canada
SUBMISSIONS
Papers/Notes: We invite paper submissions of two kinds: Papers (10 pages) and Notes (4 pages). Papers must present original, highly innovative, prospective and forward-looking research, possibly in one or more of the themes given above. Notes must also report novel and complete research, but where the scope and scale of the contribution is more focused and succinct than papers. Submissions must be a single PDF file in the ACM format, uploaded through the submission system. A template for submissions can be found on the ITS website (http://its2011.jp/).
All accepted submissions will be presented at ITS 2011, will appear in the ITS digital proceedings, and will be archived in the ACM Digital Library.
IMPORTANT DATES
Paper/Note Submissions: June 30, 2011
Paper/Note Author Notifications: September 2, 2011
Paper/Note Camera-Ready Deadline: October 21, 2011
RELATED
ITS 2010 Awards
Best Paper: Proxemic Interaction: Designing for a Proximity and Orientation-Aware Environment, by Till Ballendat, Nicolai Marquardt, and Saul Greenberg
Best Note: MudPad: Tactile Feedback and Haptic Texture Overlay for Touch Surfaces, by Yvonne Jansen, Thorsten Karrer, and Jan Borchers
Best Poster: A Multi-Touch Alignment Guide for Interactive Displays, by Mathias Frisch, Ricardo Langner, Sebastian Kleinau, and Raimund Dachselt
Best Demo: MudPad - A Tactile Memory Game, by Yvonne Jansen, Thorsten Karrer, and Jan Borchers
ITS 2010 Conference General Chairs
ITS 2010 Organization Committee
Daniel Wigdor, one of the ITS 2011 Program Co-chairs, co-authored the following book with Dennis Wixon. The book has not yet been released.
Brave NUI World: Designing User Interfaces for Touch and Gesture