Showing posts with label interaction. Show all posts

Oct 2, 2013

LUMO, by PO-Motion: Projection system for children that transforms kids' designs into interactive creations (Kickstarter)

I'd like to give a shout-out to Meghan Athavale, Curtis Wachs and the other interactive minds at PO-MO for their Kickstarter project.  LUMO is an interactive system that allows any room - not just a kid's room - to be transformed into an environment that encourages play, interaction, and creativity.

From what I know about PO-MO's other creative projects, LUMO is deserving of financial backing. Although this is the company's first venture into a hardware solution, it looks promising.  If it can be produced on a larger scale, children - as well as kids at heart - will have a new opportunity to use technology to create interactive mixed reality play-scapes, right in their homes.

Watch the video, and then head over to LUMO on Kickstarter and make your pledge!

"LUMO turns a kid's room into an interactive playground, encouraging movement and imaginative play in children of any age."


Screen shot of the creation tools for children to use when creating their LUMO interactive effects:

Credit: PO-MO

ABOUT PO-MO

The following information was taken from the Kickstarter website:


PO-MO Inc. creates interactive experiences for museums, retail environments and events.
They also offer SaaS platform Po-motion.com, a website which enables people all over the world to create their own interactive floor and wall installations.
In 2010, Meghan Athavale and her business partner Curtis Wachs decided to quit their day jobs and design interactive experiences for a living. They launched PO-MO Inc. in July 2010. Within a few months, Keith Otto and Matt Gillies joined PO-MO. Meghan and the PO-MO team launched Po-motion interactive floor and wall projection software as a beta in early 2011. In 2012, the full version of Po-motion was launched and Meghan won many awards at investment pitch contests, was one of 12 people chosen to participate in Tech Women Canada in Silicon Valley, and closed the year with an amazing advisory board, another member of the team (Joss Le Leannec) and a profitable, global company.
Meghan Athavale (CEO) and partner Curtis Wachs (CTO), along with Matt Gillies, Keith Otto, Jocelyne Le Leannec and Dave Kelly spend their days making, playing with, and dreaming of interactive surfaces.
RELATED
PO-MO Website
LUMOPlay
PO-MO interactive effects for special events and parties:
PO-MO Inc. interactive visuals for parties from PO-MO Inc. on Vimeo.



Jun 6, 2013

Interactive Displays and "Billboards" in Public Spaces; Pervasive Displays 2013

The 2013 International Symposium on Pervasive Displays (PerDis 2013) recently convened in Mountain View, California.  Since I couldn't attend this conference, I was happy to learn from Albrecht Schmidt that the conference proceedings were recently uploaded to the ACM Digital Library.  There are many exciting things going on in this interdisciplinary field!

Researchers involved with the Instant Places project, described in the video below, presented their work at PerDis 2013. The Instant Places project was part of PD-Net, a series of research efforts exploring the future of pervasive display networks in Europe. (See the "Related" section for additional references and links.)


Instant Places: Tools and Practices for Situated Publication in Display Networks

Below is information from the Instant Places video and website:
"The video describes a novel screen media system that explores new practices for individual publication and identity projection in public digital displays." 

"Instant Places has been developed by the Ubicomp group of the Information Systems Department, at the University of Minho, and has been funded within the scope of pd-net: Towards Future Pervasive Display Networks, by the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 244011."

Saul Greenberg was the keynote speaker at PerDis 2013.  His keynote, "Proxemic Interactions: Displays and Devices that Respond to Social Distance", highlights how far off-the-desktop our digital/physical lives have become, and how this has influenced recent research in human-computer interaction. Saul is a professor at the University of Calgary and leads research in Human Computer Interaction, Computer Supported Cooperative Work, and Ubiquitous Computing.

Although the video of Saul Greenberg's presentation below is not from PerDis 2013, it touches on the same topics and is worth taking an hour to watch.  In this video, Greenberg presents an overview of the history of human-computer interaction. He also offers a discussion of how an understanding of social theory, the perception of spatial relationships, and embodied interaction can be applied to the design of natural user interfaces and interactive systems.  Useful examples of interaction design explorations, within an ecological context, are provided later in the video.
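To make the "social distance" idea concrete, here is a toy sketch (my own, not Greenberg's code) of how a display application might switch interaction modes using Edward T. Hall's proxemic distance zones, the social-distance model this line of research builds on. The thresholds are Hall's commonly cited values in metres; the function and mode comments are invented for illustration:

```python
# Hypothetical sketch: classify a sensed user-to-display distance into
# Edward T. Hall's proxemic zones. Thresholds are Hall's commonly cited
# values (in metres); the zone-to-UI-mode mapping is my own invention.

def proxemic_zone(distance_m: float) -> str:
    """Map a sensed user-to-display distance to a proxemic zone."""
    if distance_m < 0.46:
        return "intimate"   # close enough for direct touch interaction
    elif distance_m < 1.2:
        return "personal"   # detailed UI, personal content
    elif distance_m < 3.7:
        return "social"     # larger widgets, ambient information
    else:
        return "public"     # attract-loop / signage mode

# A proxemics-aware display could switch modes as someone approaches:
for d in [5.0, 2.0, 0.8, 0.3]:
    print(d, proxemic_zone(d))
```

The point of the sketch is that distance itself becomes an input channel, which is exactly the off-the-desktop shift the keynote describes.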

Proxemic Interactions: the New Ubicomp?




RELATED


My Backstory
Regular readers of this blog know that the subject of interactive displays in public spaces holds my interest. When I was taking computer courses during the mid 2000s, I focused some of my energy on projects designed for large interactive displays, inspired by reading articles like "Physically Large Displays Improve Performance on Spatial Tasks" (Desney S. Tan, Darren Gergle, Peter Scupelli, and Randy Pausch) and "Dynamo: public interactive surface supporting the cooperative sharing and exchange of media" (Shahram Izadi, Harry Brignull, Tom Rodden, Yvonne Rogers, Mia Underwood).

Jeff Han's 2006 TED talk was another inspiration. I remember my excitement as I watched his demonstration of an interactive multi-touch screen the size of a drafting board, before the iPhone/iPad was born.  Another inspiration was Hans Rosling's TED Talk about health statistics, with his animated interactive data visualizations presented on a huge screen.

The following year, I stumbled upon the NUI Group while searching for information about multi-touch displays, and was inspired by many of the early members of the group.  I also became acquainted with a world-wide network of people who share similar interests, such as Albrecht Schmidt and his team of researchers at the University of Stuttgart. This busy group recently presented at PerDis 2013 and at CHI 2013 and is involved in a wide range of ongoing projects.

INTERACTIVE DISPLAYS
Alt, F., Sahami, A., Kubitza, T., Schmidt, A. Interaction Techniques for Creating and Exchanging Content with Public Displays. In: Proceedings of the 2013 ACM Annual Conference on Human Factors in Computing Systems
Hinrichs, U., Carpendale, S., Valkanova, N., Kuikkaniemi, K., Jacucci, G., Vande Moere, A. Interactive Public Displays. IEEE Computer Graphics and Applications, 33(2), 25-27
PerDis 2013 Program
Sample Papers:
Otero, N., Muller, M., Alissandrakis, A., and Milrad, M. Exploring video-based interactions around digital public displays to foster curiosity about science in the schools. PerDis 2013 (pdf)
Alt, F., Schneegass, S., Girgis, M., Schmidt, A. Cognitive Effects of Interactive Public Display Applications. Proceedings of the 2nd ACM International Symposium on Pervasive Displays. 2013
Langheinrich, M., Schmidt, A., Davies, N., and Jose, R. A practical framework for ethics: the PD-net approach to supporting ethics compliance in public display studies. Proceedings of the 2nd ACM International Symposium on Pervasive Displays. 139-143

Note:  Members of ACM have access to all of the proceedings of PerDis 2013 in the ACM Digital Library. Non-members have access to the abstracts.

PD-NET
PD-Net 
PD-NET Publications - a great reference list, with links to many papers
Reading List on Pervasive Public Displays
About Instant Places
About the Living Lab for Screens Set

DOOH-DIGITAL OUT-OF-HOME
Daily Digital Out of Home post "Billboards That Look Back" : Could miniature cameras embedded in ads lead to Big Brother at the mall? The World Is My Interactive Interface, 5/28/08
J. Müller et al., "Looking Glass: A Field Study on Noticing Interactivity on a Shop Window," Proc. 2012 SIGCHI Conf. Human Factors in Computing Systems (CHI 12), ACM, 2012, pp. 297–306
Michelis, D., Meckel, M. Why Do We Want to Interact With Electronic Billboards in Public Space?  First Workshop on Pervasive Advertising, Pervasive 2009, 5/11/09
The Rage of Interactive Billboards
The Print Innovator, 11/28/12
10 Brilliant Interactive Billboards (Videos)
Amy-Mae Elliot, Mashable, 8/21/11


SOME INTERESTING EARLIER WORK
Jeff Han's 2006 TED Talk (This is worth revisiting, as it came out before the iPhone, iPad, etc.)


Tan, D.S., Gergle, D., Scupelli, P., Pausch, R. Physically large displays improve performance on spatial tasks. ACM Transactions on Computer-Human Interaction, V13(1) 2006 (71-99)

Revisiting promising projects: Dynamo an application for sharing information on large interactive displays in public spaces (blog post)
Lynn Marentette, Interactive Multimedia Technology, 09/16/07

Brignull, H., Izadi, S., Fitzpatrick, G., Rogers, Y., Rodden,  T. The introduction of a shared interactive surface into a communal space. Proceedings of the 2004 ACM conference on Computer supported cooperative work (CSCW'04), Chicago, ACM Press, 2004 (pdf)


Izadi, S., Brignull, H., Rodden, T., Rogers, Y. and Underwood,M. Dynamo: public interactive surface supporting the cooperative sharing and exchange of media. In Proc. User
Interfaces and Software Technologies (UIST’03), Vancouver, ACM Press, 2003, 159-168. (pdf)

Proxemics (Wikipedia)


Why Do We Want to Interact With Electronic Billboards in Public Space? 


May 21, 2013

Xbox One and Kinect 2 for the Playground of the Future


The big news in tech today is the unveiling of the new Xbox One/Kinect 2 system.  For now, the video below might be the closest you'll get to the system.  Wired's senior editor, Peter Rubin, had a chance to interview Scott Evans of Microsoft as he demonstrated the fascinating technical details in a family-room type setting.

Wired's interview of Scott Evans and demo of the new Xbox One and Kinect 2, using Active IR technology.



From what I learned, the new Kinect sensor has six times the fidelity of the previous version. Paired with the new Xbox One, it can do amazing things.  Engineers from around the world collaborated on this project, providing expertise in facial recognition, digital signal processing, speech recognition, machine learning, and computer vision.  The Xbox One is fueled by an 8-core x86 processor, supported by 8GB of RAM, which is sure to handle the most demanding gamer's needs. It also includes a 500GB hard drive and a Blu-ray player.


The new system was designed to enhance the gaming/user experience. The 1080p camera provides a field of view that is 60 degrees larger than its predecessor's, and can handle a high level of detail.  It provides a better means of interpreting movement and orientation, and it processes skeleton and hand movements more precisely.  The system features "muscle man", a human-based physics model that is layered over the skeleton and depth map. It senses and calculates the forces the player uses while moving in a game.

What I find interesting is that the camera can detect the player's pulse by measuring subtle changes in the skin that can't be perceived by the naked eye.  It can also quickly identify each player (it handles up to six) and recognize facial expressions.  The active IR (infrared) system gives the system better accuracy than the original Kinect.
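Microsoft hasn't published how the Kinect measures pulse, but the general technique, remote photoplethysmography, is well known: the average brightness of skin pixels fluctuates slightly with each heartbeat, and the dominant frequency in a plausible heart-rate band gives the pulse. Below is a toy sketch of that idea on an invented synthetic signal; none of this is Microsoft's code, and the sampling values are my own assumptions:

```python
import math

# Hedged sketch of remote photoplethysmography (rPPG), the general family
# of techniques for camera-based pulse detection. Per-frame mean skin
# brightness varies subtly with each heartbeat; the strongest frequency
# between 0.7 and 3.0 Hz (42-180 bpm) is taken as the pulse.

def estimate_bpm(samples, fps):
    """Estimate heart rate from a list of per-frame mean brightness values."""
    n = len(samples)
    mean = sum(samples) / n
    centered = [s - mean for s in samples]
    best_f, best_power = 0.0, 0.0
    f = 0.7
    while f <= 3.0:  # brute-force DFT over candidate heart-rate frequencies
        re = sum(c * math.cos(2 * math.pi * f * i / fps) for i, c in enumerate(centered))
        im = sum(c * math.sin(2 * math.pi * f * i / fps) for i, c in enumerate(centered))
        power = re * re + im * im
        if power > best_power:
            best_f, best_power = f, power
        f += 0.02
    return best_f * 60.0

# Synthetic test signal: a 1.2 Hz (72 bpm) fluctuation sampled at 30 fps
fps = 30
signal = [100 + 0.5 * math.sin(2 * math.pi * 1.2 * i / fps) for i in range(300)]
print(round(estimate_bpm(signal, fps)))  # ~72 with this synthetic input
```

A real system would also need face/skin detection and motion compensation; the frequency analysis above is only the last step.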

I wasn't able to find out much information regarding privacy issues with this system.  This is a concern, since it can sense your physiological responses, movement patterns, and facial expressions.  Over time, a good deal of very personal information would be gathered about each user. I shudder to think about the consequences if the data fell into the wrong hands.  

Possibilities for Special Needs Populations

I can see that the Xbox One + Kinect 2 system has the potential for games and other interactive applications for use in physical rehabilitation and fitness.  Since it can interpret facial expressions, it could also provide a way to support social skills learning among children and teens who have autism spectrum disorders.

RELATED

Microsoft devotes a good deal of attention to proof-of-concept projects that may or may not become part of a commercial product.  Below is an example of IllumiRoom:


Hrvoje Benko, of Microsoft Research, discusses the IllumiRoom concept during an interview at CHI 2013.


Xbox One Website
The new Xbox One Kinect tracks your heart rate, happiness, hands and hollers
Matthew Panzarino, The Next Web, 5/22/13
Kinect 2 Full Video Walkthrough: The Xbox Sees You Like Never Before
Kyle Wagner, Gizmodo, 5/21/13
Hands-on with prototypes of the Xbox One and New Kinect Sensor
Ben Gilbert, engadget, 5/21/13
Efficient Human Pose Estimation from Single Depth Images
Shotton, J., Girshick, R., Fitzgibbon, A., Sharp, T., Cook, M., Finocchio, M., Moore, R., Kohli, P., Criminisi, A., Kipman, A., Blake, A.   Video
Consumer Depth Cameras for Computer Vision:  Research Topics and Applications
Fossati, A., Gall, J., Grabner, H., Ren, X., Konolige, K. (Eds.)
Xbox One: Microsoft's supergeeks reveal what's inside the hardware
Dean Takahashi, VentureBeat, 5/21/13
Next Xbox Will Face New Array of Rivals
Nick Wingfield, New York Times, 5/21/13

Mar 16, 2013

UPDATE: What's New for Kinect? Fusion, real-time 3D digitizing, design considerations, and more.

The Evolution of Microsoft Kinect

I've been following the evolution of Microsoft's Kinect, and recently discovered a few interesting videos that show how far the system has come. According to Josh Blake, the founder of the OpenKinect community and author of the Deconstructing the NUI blog,  the Kinect for Windows SDK v1.7 will be released on Monday, March 18th, from http://www.kinectforwindows.com.  More details about this version can be found on Josh's blog as well as the official Kinect for Windows blog.


It is possible to create applications for desktop systems that work with the Kinect in interesting ways, as you'll see in the following videos. I think there is potential here for use in education/edutainment!

Below is a video of Toby Sharp, of Microsoft Research, Cambridge, demonstrating Kinect Fusion.  The software allows you to use a regular Kinect camera to reconstruct the world in 3D.
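As I understand it, the core of Kinect Fusion is folding each depth frame into a voxel grid of truncated signed distances, averaged over time so sensor noise cancels out and a clean surface emerges. Below is a deliberately tiny 1-D sketch of that fusion step (voxels along a single camera ray); the truncation distance and all readings are made up for illustration, and this is not Microsoft's code:

```python
# Hedged 1-D sketch of truncated-signed-distance fusion, the technique
# KinectFusion-style reconstruction is built on. Each noisy depth reading
# contributes a truncated signed distance to every voxel along the ray,
# fused as a weighted running average. TRUNC and all values are invented.

TRUNC = 0.1  # truncation distance in metres (assumed value)

def fuse(tsdf, weights, voxel_depths, measured_depth):
    """Fold one depth measurement into the volume."""
    for i, vz in enumerate(voxel_depths):
        sdf = measured_depth - vz          # + in front of surface, - behind
        if sdf < -TRUNC:
            continue                       # voxel far behind surface: no update
        d = max(-TRUNC, min(TRUNC, sdf))   # truncate the distance
        tsdf[i] = (tsdf[i] * weights[i] + d) / (weights[i] + 1)
        weights[i] += 1

voxel_depths = [i * 0.05 for i in range(20)]   # voxels at 0.00..0.95 m along a ray
tsdf = [0.0] * 20
weights = [0] * 20
for noisy in [0.52, 0.49, 0.51, 0.50]:         # noisy readings of a surface at 0.5 m
    fuse(tsdf, weights, voxel_depths, noisy)

# The zero crossing of the fused TSDF marks the recovered surface (~0.5 m),
# smoother than any single noisy reading.
```

The real system does this over millions of voxels on the GPU, with camera tracking to register each frame, but the averaging idea is the same.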



KinEtre: A Novel Way to Bring Computer Animation to Life
According to information from the YouTube description, "KinÊtre is a research project from Microsoft Research Cambridge that allows novice users to scan physical objects and bring them to life in seconds by using their own bodies to animate them. This system has a multitude of potential uses for interactive storytelling, physical gaming, or more immersive communications."




The following videos are quite long, so feel free to re-visit this post when you have time to relax and take it all in!

Kinect Design Considerations
This video covers Microsoft's Human Interface Guidelines, scenarios for interaction and use, and best practices for user interactions.  It also includes a preview of the next major version of the Kinect SDK. 


Kinect for Windows Programming Deep Dive
This video discusses how to build Windows Desktop apps and experiences with the Kinect, and also previews some future work.




RELATED
Kinect for Windows Developer Downloads
Kinect for Windows Blog
Deconstructing the NUI Blog (Josh Blake)
Microsoft Kinect Learns to Read Hand Gestures, Minority Report-Style Interface Now Possible
Celia Gorman, IEEE Spectrum, 3/13/13
Kinect hand recognition due soon, supports pinch-to-zoom and mouse click gestures.
Tom Warren, The Verge, 3/6/13
Microsoft's KinEtre Animates Household Objects
Samuel K. Moore, IEEE Spectrum, 8/8/12
Kinect Fusion Lets You Build 3-D Models of Anything
Celia Gorman, IEEE Spectrum, 3/6/13
Description of Kinect sessions at Build 2012
Kinect for every developer!
Tom Kerhove, Kinecting for Windows, 2/15/13
Kinect in the Classroom
Kinect Education

Note: Although I recently received my developer kit for Leap Motion, another gesture-based interface, I haven't lost interest in following news for Kinect.

Feb 20, 2013

Disney Research: Touche, Touch and Gesture Sensing

The following video is a demonstration of a technique called Swept Frequency Capacitive Sensing, which recognizes various configurations of the hands and body during interactions.  This system differs from conventional capacitive touch sensing in that it senses a range of frequencies to develop a capacitive profile, which provides a significant amount of data that can be analyzed and utilized in an application.

At 1:23, the SFCS is demonstrated on a table, sensing body posture or body configuration. It is a wireless system and can be used on smaller touch screens, such as mobile devices.  It can recognize interactions in liquids.
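Here is a toy sketch of the Swept Frequency Capacitive Sensing idea: instead of one reading at one frequency, the system sweeps many frequencies, records the response at each to form a capacitive profile, and classifies that profile against trained examples. The Touché paper classifies with an SVM; this sketch substitutes simple nearest-neighbour matching, and the profiles below are entirely made up:

```python
import math

# Hedged sketch of SFCS-style recognition. A capacitive profile is the
# response amplitude at each swept frequency; classification matches an
# observed profile against trained ones. Labels and values are fabricated.

def distance(profile_a, profile_b):
    """Euclidean distance between two capacitive profiles."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(profile_a, profile_b)))

def classify(profile, training):
    """Return the label of the closest trained capacitive profile."""
    return min(training, key=lambda label: distance(profile, training[label]))

# Fabricated example profiles (response amplitude per swept frequency)
training = {
    "one_finger": [0.9, 0.7, 0.4, 0.2, 0.1],
    "full_grasp": [0.8, 0.9, 0.9, 0.7, 0.5],
    "no_touch":   [0.1, 0.1, 0.1, 0.1, 0.1],
}

observed = [0.85, 0.72, 0.45, 0.25, 0.12]
print(classify(observed, training))  # closest profile: "one_finger"
```

The payoff of sweeping is visible even in this toy: a single-frequency reading could not distinguish "one_finger" from "full_grasp", but the profile shapes differ clearly.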


Touché was awarded Best Paper at ACM CHI 2012:

RELATED
Touche: Touch and Gesture Sensing for the Real World
Disney Research
Sato, M., Poupyrev, I, and Harrison, C. Touché: Enhancing Touch Interaction on Humans, Screens, Liquids, and Everyday Objects. In Proceedings of CHI’12. 2012. ACM.
Paper [PDF, 10Mb]
Touche with Arduino
Swept Frequency Capacitive Sensing (SFCS)
Audrey Cropp, Responsive Landscapes, 2/18/13

SOMEWHAT RELATED Synthetic Ecologies Course Reading List
Responsive Environments Course
Allen Sayegh, Harvard Graduate School of Design



AirHarp for Leap Motion, a Responsive Musical Natural User Interface

I like this demonstration of Adam Somers' AirHarp music application for use with the Leap Motion 3D controller:


AirHarp is being developed in C++ using Adam Somers' audio processing toolkit, MusKit.  This looks interesting!  Things have changed since I last took a computer music technology course (back in 2003).
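For readers curious about what an instrument like this involves, here's a rough sketch (not Adam Somers' actual code, which is on GitHub) of the kind of mapping an in-air harp needs: quantize the controller's reported hand position to a "string", and give each string a pitch from a pentatonic scale so free-form strums always sound consonant. The ranges, string count, and scale choice are my own assumptions:

```python
# Hypothetical sketch of an AirHarp-style position-to-pitch mapping.
# All parameters (span, string count, base pitch, scale) are invented.

PENTATONIC_SEMITONES = [0, 2, 4, 7, 9]  # major pentatonic offsets

def string_for_x(x_mm: float, span_mm: float = 300.0, strings: int = 10) -> int:
    """Quantize a hand x-position (0..span_mm) to a string index."""
    i = int(x_mm / span_mm * strings)
    return max(0, min(strings - 1, i))

def string_frequency(index: int, base_hz: float = 220.0) -> float:
    """Pitch of a string: pentatonic steps above a base note."""
    octave, step = divmod(index, len(PENTATONIC_SEMITONES))
    semitones = 12 * octave + PENTATONIC_SEMITONES[step]
    return base_hz * 2 ** (semitones / 12)

# Sweeping a hand across the sensed region plucks ascending strings:
for x in (10.0, 150.0, 290.0):
    s = string_for_x(x)
    print(s, round(string_frequency(s), 1))
```

A real implementation would trigger a plucked-string synth (Karplus-Strong or similar) on each string crossing rather than printing frequencies.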

Adam Somers is a senior software engineer at Universal Audio.  He has a graduate degree in music technology from Stanford, and a background in computer science, electronics, human-computer interaction, and signal processing.

Leap Motion is a motion-control software and hardware start-up company located in San Francisco, California. According to promotional information from the website, the company's first product, the Leap Motion controller, is 200 times more sensitive than existing technologies.  It will be interesting to see how this plays out.  (I'm still waiting for my pre-order.)

RELATED
AirHarp (links to GitHub)
Leap FAQs
Leap Motion Website
Leap Motion Developer Portal
Leap Motion Leadership Team
Leap Motion goes retail: Motion controller sold exclusively at Best Buy
Michael Gorman, engadget, 1/16/13

Leap Motion: Low Cost Gesture Control for your Computer Display
Asus partners up with Leap Motion, PCs with 3D motion control to debut in 2013
Michael Gorman, engadget, 1/3/13
Stanford Center for Computer Research in Music and Acoustics


Feb 15, 2013

Designing for Touch & Gesture: Tips for Apps and the Web (Updated)

In the past, our fingers did the walking, sifting through files, papers, pamphlets, and phonebooks, and later pointing and clicking with a mouse to interact with images and text, in essence, electronic imitations of the paper-based world. Traditional forms, brochures, ad inserts, and posters informed much of the design.

How much have things changed?   It is 2013, but you'd think it was 1997 from the PowerPoint look and feel of many apps and web sites!   Touch is everywhere, but from what I can tell, not enough designers and developers have stepped up to the plate to think more deeply about ways their applications can support human endeavors through touch and gesture interactions.
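One concrete design point the touch-design articles this post links to keep returning to: touch targets should be sized in physical units, not pixels, because finger pads are roughly 7 to 10 mm wide regardless of screen density. Here is a minimal sketch of the conversion a designer needs, assuming the display's DPI is known (the 9 mm example value is just a commonly cited guideline, not a standard):

```python
# Sketch: convert a physical touch-target size to device pixels.
# The ~7-10 mm guideline comes from finger-pad width; exact minimums
# vary by platform, so 9 mm here is an illustrative choice.

MM_PER_INCH = 25.4

def target_px(target_mm: float, dpi: float) -> int:
    """Minimum touch-target side in device pixels for a physical size."""
    return round(target_mm / MM_PER_INCH * dpi)

# The same 9 mm target needs very different pixel counts across densities:
for dpi in (160, 326, 480):
    print(dpi, target_px(9, dpi))
```

This is why a button that was comfortable on a low-density screen becomes untappable when the same pixel dimensions are reused on a high-density phone.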

For an overview of this topic, take a look at my 2011 post, written after a number of ugly encounters with user-unfriendly applications:  Why bother switching from GUI to NUI?  

For an in-depth look into the history of multi-touch, the wisdom of Bill Buxton is well-worth absorbing.  He's worked with all sorts of interfaces, and has been curating the history of multi-touch and gesture systems since 2007:


Multi-Touch Systems that I have Known and Loved
Bill Buxton, Microsoft Research, Updated 8/30/12



Even if you are not a designer or developer, I encourage you to explore some of the links below:

Touch Gestures for Application Design
Luke Wroblewski, 10/9/12

Common Misconceptions About Touch
Steven Hoober, 3/18/13

Designing With Tablets in Mind:  Six Tips to Remember
Connor Turnbull, Webdesign tuts+, 9/27/11

Finger-Friendly Design: Ideal Mobile Touchscreen Target Sizes
Anthony T, Smashing Magazine, 2/21/12

Best Practices: Designing Touch Tablet Experiences for Preschoolers (pdf)
Sesame Street Workshop


Are Touch Screens Accessible?
AccessIT, National Center on Accessible Information Technology in Education

iOS Human Interface Guidelines
Apple

Android User Interface Guidelines
Using Touch Gestures
Handling Multi-Touch Gestures
Android

Designing for Tablets?  We're Here to Help!
Roman Nurik, Android Developers Blog 11/26/12

Touch interaction design (Windows Store apps)
Microsoft - MSDN



Feb 14, 2013

Affinity+: Semi-Structured Brainstorming on Large Displays, from Pacific Northwest National Laboratory

The Affinity+ concept has the potential to be useful in educational settings such as schools, museums, and libraries. Although it was designed to support collaborative activities among software designers/developers, it could support a wide range of collaborative project-based learning activities. The clearly narrated video below was produced by a team from the Pacific Northwest National Laboratory.




"Affinity diagraming is a powerful method for encouraging and capturing lateral thinking in a group environment. The Affinity+ Concept was designed to improve the collaborative brainstorm process through the use of large display surfaces in conjunction with mobile devices like smart phones and tablets. The system works by capturing the ideas digitally and allowing users to sort and group them on a large touch screen manually. Additionally, Affinity+ incorporates theme detection, topic clustering, and other processing algorithms that help bring structured analytic techniques to the process without requiring explicit leadership roles and other overhead typically involved in these activities." -PNNL
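PNNL doesn't spell out its theme-detection and topic-clustering algorithms, but the general idea of grouping brainstorm cards by textual similarity can be sketched in a few lines. The Jaccard word-overlap measure, the threshold, and the sample cards below are all my own invention, a crude stand-in for real theme detection:

```python
# Hedged sketch of clustering idea cards by word overlap. Cards whose
# Jaccard similarity to any member of a group meets the threshold join
# that group; otherwise they seed a new one. All values are illustrative.

def similarity(a: str, b: str) -> float:
    """Jaccard overlap between the word sets of two idea cards."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def cluster(cards, threshold=0.2):
    groups = []
    for card in cards:
        for group in groups:
            if any(similarity(card, member) >= threshold for member in group):
                group.append(card)
                break
        else:
            groups.append([card])  # no group was similar enough: start a new one
    return groups

cards = [
    "bigger touch targets on the wall display",
    "touch targets need larger spacing",
    "sync ideas to phones",
    "push ideas from phones to the wall",
]
for g in cluster(cards):
    print(g)
```

A production system would use real topic modeling rather than raw word overlap, but the output is the same in spirit: suggested groupings the participants can then rearrange by hand on the big screen.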



RELATED

Affinity+ Semi-Structured Brainstorming on Large Displays
Russ Burtner, Richard May, Randy Scarberry, Ryan LaMothe, Alex Endert
Pacific Northwest National Laboratory

Information Visualization Core Area:  Natural User Interactions
Information Visualization Core Area:  User Experience
Pacific Northwest National Laboratory

Large Displays: Will it ever be enough? (pdf)

Richard May, Jim Thomas, Pacific Northwest National Laboratory

Although this paper is from 2006, it contains a discussion of the "Top Ten Research Challenges" associated with  large high-resolution displays:
A Survey of Large High-Resolution Display Technologies, Techniques, and Applications (pdf)
Tao Ni, Greg S. Schmidt, Oliver G. Staadt, Mark A Livingston, Robert Ball, Richard May
IEEE Virtual Reality Conference 2006, pp. 223-226, Virginia Tech, 2006

Advanced Visualization and Interaction Techniques for Large High-Resolution Displays (pdf)

Sebastian Thelen (in Ariane Middel, Inga Scheler, and Hans Hagen (eds.), Visualization of Large and Unstructured Data Sets - Applications in Geospatial Planning, Modeling and Engineering (IRTG 1131 Workshop), VLUDS 2010, March 19-21, 2010, Bodega Bay, CA, USA DOI: 10.4230/OASIcs.VLUDS.2010.73

Affinity Diagraming

Usability Net


Jan 17, 2013

XBox Kinect in the OR: Kinect supports gesture interaction with 3D imaging of the patient, while operating.

Here's an interesting use of technology for health - the Xbox Kinect in the OR!

Thanks to Harry van der Veen for the link!


RELATED
Kinect sensor poised to leap into everyday life
Niall Firth, NewScientist, 1/17/13

For the tech-curious:
PrimeSense (Company that developed the 3D depth sensor that powers the Kinect, the sensor in Ava, a healthcare robot by iRobot, and more.)

OpenNI (Framework for the development of 3D sensing middleware libraries and applications.)

NiTE: Natural Interface Technology for End User (Perception algorithms layer for 3D computer vision, allows for hand locating, tracking, analyzing scenes, and tracking skeleton joints.)

Nov 23, 2012

Interactive TV Design Discussion - Dale Herigstad: Surface Space, Content Space, and Interactive Space

The following video is a presentation by Dale Herigstad, Chief Interaction Officer at Possible Worldwide, from the 2012 TV of Tomorrow Show.  If you are interested in interactivity and UX across screens and spaces, it is worth taking the time to watch!

WHO OWNS THE NEW SCREEN EXPERIENCES?

Special Presentation: Who Owns the New Screen Experiences? from TV of Tomorrow Show on Vimeo.

The "new world" combines TV and the internet, and poses new opportunities for design. Dale encourages flexibility and fearlessness, and setting goals for the long term. This means that organizations must move away from the "silo" approach, and create teams that can integrate across screens, platforms, content, and input methods, moving from flat space to interactive, dynamic, layered, 3D, and augmented space.

BTW, Dale Herigstad was one of the people commissioned by Steven Spielberg to work on the interface design for Minority Report.  

RELATED
Possible Worldwide
Tutorials: Dale Herigstad & Schematic, Interactive Design Agency
The Minority Report Interface
Luke Wroblewski, 9/23/04




Oct 23, 2012

Tablet and Mobile Day (or Era): Lots of upcoming news about tablets and mobile devices. New Chromebook, too.

I will be devoting a few upcoming posts to tablets, larger smartphones, and other mobile computing devices. I'll be hearing from the folks at Stantum later today.

This morning I woke up and turned on the TV as I was getting ready for work, and the first thing I saw on my local station was this cute family-focused ad for the new Google Chromebook, which, at $249, is affordable:

I especially like the scene where a toddler stomps on a Chromebook left on the floor.

Oct 13, 2012

Lenovo's Touchy-Twisty Tablet-Laptops (Video)

I plan to get my hands on the IdeaPad Yoga and ThinkPad Twist someday soon. I'm curious to see how Windows 8 plays out on the new touch tablets.

In the following video, David Schmook, of Lenovo, demonstrates the new IdeaPad Yoga and Twist, running Windows 8.  


IdeaPad Yoga and ThinkPad Twist: Windows 8 Computers Bend and Twist
Joanna Stern, ABC News, 10/09/12


FYI: Lenovo will be manufacturing the "Think" family of computer products in my home state of North Carolina starting in January 2013.

PHOTO: Lenovo's Windows 8 IdeaPad Yoga, ThinkPad Twist and Lynx convertibles have touchscreens and keyboards.
Photo Credit: Lenovo

Jul 29, 2012

Blast from the 2009 past: News, Videos, and Links about Multi-touch and Screen Technologies

One of the things I like to do is share updates about the world of multimedia, multi-touch, gesture, screen, surface, and interactive technologies, focusing on off-the-desktop applications and systems. When I started this blog, I had to put forth quite a bit of effort just to FIND interesting things to blog about.  


These days, there are so many sources that focus on emerging - and now commonplace- interactive technologies, my main challenge is to filter the noise.  Where do I begin?


My archives are vast.   I randomly picked the year 2009 and came across one of my previous posts, "News, Videos, and Links about Multitouch and Screen Technologies."   The post is long, and contains a number of videos and links that probably will be of value to a future curator of the history of technology.


I welcome comments from readers who might be able to help me update information about various applications and systems I've featured on this blog in the past. 

The pictures are screenshots from the results of an image search for "interactivemultimediatechnology".  Over the past 6 years, I've posted quite a few!








Jul 21, 2012

Musings about NUI, Perceptive Pixel and Microsoft, Rapid Creative Prototyping (Lots of video and links) Revised

It just might be the right time for everyone to brush up on 21st century tech skills. iPads and touch-phones are ubiquitous. Touch-enabled interactive whiteboards and displays are in schools and boardrooms.  With Microsoft's Windows 8 and the news that the company recently acquired Jeff Han's company, Perceptive Pixel, I think that there will be good support - and more opportunities - for designers and developers interested in moving from GUI to NUI.


In the video below, from CES 2012, Jeff Han provides a good overview of where things are moving in the future.  We are in a post-WIMP world and there is a lot of catching up to do!

CES 2012  Perceptive Pixel and the Future of Multitouch (IEEE Spectrum YouTube Channel)



During the video clip, Jeff explains how far things have come during the past few years:
 "Five and 1/2 years ago I had to explain to everybody what multi-touch was and meant. And then, frankly, we've seen some great products from folks like Apple, and really have executed so brilliantly, that everyone really sees what a good implementation can be, and have come to expect it.  I also think though, that the explosion of NUI is less about just multi-touch, but an awareness that finally people have that you don't have to use a keyboard and mouse, you can demand something else beside that.  People are now willing to say, "Oh, this is something I can try, you know, touch is something I can try as my friendlier interface"."

Who wouldn't want to interact with a friendlier interface?  Steve Ballmer doesn't curb his enthusiasm about Windows 8 and Perceptive Pixel.  Jeff Han is happy with how a design created in Windows 8 scales for use on screens large and small. He explains how Windows 8 can support collaboration. The Story Board application (7:58) on the large touchscreen display looks interesting.

I continue to be frustrated by the poor usability of many web-based and desktop applications.  I like my iPad, but only because so many dedicated souls have given some thought to the user experience when creating their apps.  I often meet with disappointment when I encounter interactive displays while I'm out and about during the day.  It is 2012, and it seems that there are a lot of application designers and developers who have never read Don Norman's The Design of Everyday Things!



I enjoy making working prototypes and demo apps, but my skill set is stuck in 2008, the last year I took a graduate-level computer course.  I was thinking about taking a class next semester, something hands-on, creative, and also practical, to move me forward. I can only do so much when I'm in the DIY mode alone in my "lab" at home.  I need to explore new tools, alongside like-minded others.  


There ARE many more tools available to designers and developers than there were just four years ago.  Some of them are available online, free, or for a modest fee.  I was inspired by a link posted by my former HCI professor, Celine Latulipe, to her updated webpage devoted to Rapid Prototyping tools. The resources on her website look like a good place to start for people who are interested in creating applications for the "NUI" era.  (Celine has worked on many interesting projects that explore how technology can support new and creative interaction, such as Dance.Draw.) Below is her description of her updated HCI resources:

"New HCI resource to share: I have created a few pages on my web site devoted to Rapid Prototyping tools, books, and methods. These pages contain reviews of various digital tools, including 7 different desktop prototyping apps, and including 8 different iPad apps for wireframing/prototyping. I hope it's useful to others. Feel free to share... and please send me comments and suggestions if you find anything inaccurate, or if you think there is stuff that I should be adding. I will be continuing to update this resource." -http://www.celinelatulipe.com (click on the rapid prototyping link at the top)



IDEAS
Below are just a few of the ideas I'd like to implement in some way. I can't claim sole ownership of these ideas; they are mash-ups of what comes to me in my dreams, usually after reading scholarly publications from the ACM or IEEE, or attending tech conferences.
  • An interactive timeline (multi-dimensional, multi-modal, multimedia) for off-the-desktop interaction, collaboration, and data/info analysis and exploration.  It might be useful for medical researchers, historians, genealogists, or people who are into the "history of ideas".  Big Data folks would love it, too. It would handle data from a variety of sources, including sensor networks. It would be beautiful to use.
  • A web-based system of delivering seamless interactive, multi-modal, immersive experiences, across devices, displays, and surfaces. The system would support multi-user, collaborative interaction.  The system would provide an option for tangible interaction.
  • A visual/auditory display interface that presents network activity, including potential intrusions, malfunctions, or anything needing immediate attention that would likely be missed under present monitoring methods.
  • Interactive video tools for creation, collaboration, storytelling.  (No bad remote controllers needed.)
  • A "wearable" that provides new ways for people to express and communicate creatively, through art, music, and dance, with wireless capability. (It can interact with wireless sensor networks.)*
  • A public health application designed to provide information useful in understanding and preventing sepsis. This application would utilize the timeline concept described at the top of this list. The concept could also be useful in analyzing other medical puzzles, such as autism.
Most of these ideas could translate nicely to educational settings, and the focus on natural user interaction and multi-modal I/O aligns with the principles of Universal Design for Learning, something that is important to consider, given the number of "at-risk" learners and young people who have disabilities.
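As a rough illustration of the network-monitoring idea above, here is a minimal sketch in Python. The event types, thresholds, and cue values are hypothetical choices of mine, just to show the shape of mapping network events to visual and auditory cues by severity:

```python
# Minimal sketch: map network events to visual/auditory cues by severity.
# Event names, thresholds, and cue values are hypothetical, for illustration only.

SEVERITY_CUES = {
    "info":     {"color": "green",  "tone_hz": 220, "interrupt": False},
    "warning":  {"color": "yellow", "tone_hz": 440, "interrupt": False},
    "critical": {"color": "red",    "tone_hz": 880, "interrupt": True},
}

def classify_event(event):
    """Assign a severity level to a raw network event (a dict)."""
    if event.get("type") == "intrusion_attempt":
        return "critical"
    if event.get("packet_loss", 0.0) > 0.05:  # more than 5% loss: warn
        return "warning"
    return "info"

def cue_for(event):
    """Return the visual/auditory cue for an event."""
    return SEVERITY_CUES[classify_event(event)]

if __name__ == "__main__":
    events = [
        {"type": "heartbeat", "packet_loss": 0.0},
        {"type": "link_degraded", "packet_loss": 0.12},
        {"type": "intrusion_attempt", "source": "10.0.0.99"},
    ]
    for e in events:
        print(e["type"], "->", cue_for(e)["color"])
```

A real system would of course attach this mapping to live network data and an actual audio/display layer; the point of the sketch is only that routine events stay ambient while critical ones interrupt.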

I welcome comments from readers who are working on similar projects, or who know of similar projects.  I also encourage graduate students and researchers who are interested in natural user interfaces to move forward with off-the-desktop NUI projects.  I hope that my efforts can play a part in helping people make the move from GUI to NUI!



Below are a few videos of some interesting projects, along with a list of a few references and links.


SMALLab (Multi-modal embodied immersive learning)


PUPPET PARADE: Interactive Kinect Puppets (CineKid 2011)



MEDIA FACADES: When Buildings Start to Twitter

HUMANAQUARIUM (CHI 2012)

 

NANOSCIENCE NRC Cambridge (Nokia's Morph project)






 
Examples: YouTube Playlists
POST WIMP EXPLORERS' CLUB
POST-WIMP EXPLORER'S CLUB II

Web Resources
Celine Latulipe's Rapid Prototyping Resources 
Creative Applications
NUI Group: Natural User Interface Group
OpenFrameworks and Interactive Multimedia: Funky Forest Installation for CineKid
SMALLab Learning
OpenExhibits: Free multi-touch + multiuser software initiative for museums, education, nonprofits, and students.
OpenSense Wiki 
CINEKID 2012 Website 
Multitouch Systems I Have Known and Loved (Bill Buxton)
Windows 8
Perceptive Pixel
Books
Natural User Interfaces in .NET: WPF 4, Surface 2, and Kinect (Josh Blake, Manning Publications)
Chapter 1 pdf (Free)
Brave NUI World: Designing Natural User Interfaces for Touch and Gesture (Daniel Wigdor and Dennis Wixon)
Designing Gestural Interfaces (Dan Saffer)
Posts
Bill Snyder, ReadWrite Web, 7/20/12

I noticed some interesting tools on the Chrome web store - I plan to devote a few more posts to NUI tools in the future.