
Jan 28, 2011

"Microsoft is Imagining a NUI Future". You can, too!

Microsoft is Imagining a NUI Future
Steve Clayton, Next at Microsoft Blog, 1/26/11


"Our research shows that the vast majority of people polled in both developed and emerging markets see great potential for NUI applications beyond entertainment. This is especially true in China and India, where 9 out of 10 respondents indicate they are likely to use NUI technology across a range of lifestyle areas – from work, education and healthcare, to social connections, entertainment and the environment. We believe that taking technology to the next billion can be aided by NUI – making technology more accessible and more intuitive to a wider audience". - Steve Clayton, Microsoft


The people at Microsoft don't own the concept!  I've been a member of the NUI Group since May 2007, and I also belong to SparkOn.  Both are on-line communities where you can find people who live and breathe NUI, learn about their work, and even share designs and code. If you are intrigued by NUI as a designer, developer, or user, please join us.


Note: 
I've been an evangelist and cheerleader for the NUI cause for many years.  If you search this blog for "post-WIMP", "NUI", "multi-touch", "gesture", "off-the-desktop", "natural user interaction", "natural user interface", or even "DOOH", you'll find an abundance of posts that include videos, photographs, and links to NUI-related resources, including scholarly articles.  There is a small-but-growing number of people from many disciplines quietly working on NUI-related projects.


RELATED
Microsoft Plans a Natural Interface Future Full of Gestures, Touchscreens, and Haptics
Kit Eaton, Fast Company, 1/26/11
Rethinking Computing (video)
Craig Mundie, Microsoft
Interactive Touch-Screen Technology, Participatory Design, and "Getting It" - Revised
Touch Screen Interaction in Public Spaces:  Room for Improvement, if "every surface is to be a computer".

Dec 12, 2010

LM3LAB's Useful Map of Interactive Gesture-Based Technologies: Tracking fingers, bodies, faces, images, movement, motion, gestures - and more

Nicolas Loeillot, of LM3LABS, has been ahead of the natural user interaction/interface game for many years as his company has expanded. He's done quite a bit of deep thinking about his company's work, and he has used this experience to create a nice concept map that describes how LM3LABS' solutions fit into the world of gesture-based control and interaction:




In my opinion, this chart would make a great template for mapping out other natural interaction applications and products!


Here is the description of the concepts outlined in the chart:


"If all of them belong to the “gesture control” world, the best segmentation is made from 4 categories:
  • Finger tracking: precise finger tracking, it can be single touch or multi-touch (this latest not always being a plus). Finger tracking also encompasses hand tracking which comes, for LM3LABS products, as a gestures.
  • Body tracking: using one’s body as a pointing device. Body tracking can be associated to “passive” interactivity (users are engaged without their decision to be) or “active” interactivity like 3D Feel where “players” use their body to interact with content.
  • Face tracking: using user face as a pointing device. It can be mono user or multiple users. Face tracking is a “passive” interactivity tool for engaging user in an interactive relationship with digital content.
  • Image Tracking: Augmented Reality (AR) lets users use images (flyers, real products, t-shirts, faces,…) to interact with digital content. AR can be markerless or marker-based. Markerless technology has advantages but marker-based AR is easier for users to understand. (Please note here that Markerless AR is made in close collaboration with AR leader Total Immersion)."  -LM3LABS
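To make the face-tracking idea above concrete, here is a small illustrative sketch (my own, using the open-source OpenCV library, not LM3LABS code) that finds a face in a webcam frame with OpenCV's bundled Haar cascade and maps its center to a normalized pointer position - the essence of "using the user's face as a pointing device":

# Illustrative sketch only: face position as a pointer, via OpenCV's bundled Haar cascade.
# This is not LM3LABS' implementation; it just demonstrates the general idea.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)  # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = faces[0]              # follow the first face found
        px = (x + w / 2) / frame.shape[1]  # normalized pointer position, 0..1
        py = (y + h / 2) / frame.shape[0]
        print(f"pointer at ({px:.2f}, {py:.2f})")
    cv2.imshow("face tracking", frame)
    if cv2.waitKey(1) == 27:               # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()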
If you are interested in this subject and want to view some good examples of off-the-desktop interfaces and interactions, take a look at the LM3LABS blog, as well as Nicolas Loeillot's Vimeo channel. Also take a look at the sample of posts I've written about LM3LABS over the last few years - the links are at the end of this post.

I love LM3LABS' Interactive Balloon:

Interactive balloons from Nicolas Loeillot on Vimeo.


Interactive Balloons v lm3 labs v2 (SlideShare)



Background
I first discovered LM3LABS when I was taking a VR class and researching interactive, immersive large displays in 2005 or 2006.  Back then, there wasn't much information about this sort of technology.  A lot has changed since then!


I've learned quite a bit from watching LM3LABS (and others) grow, given my passion for postWIMP interactive technology and my commitment to blogging about this subject.   Nicolas has really worked hard in this arena.  As early as 2005, LM3LABS was working with Scala to provide "smart" interactive displays, and his company's applications have been supported by computer vision technologies for many years, allowing for gesture-based, or "touch-less" interaction, as demonstrated by the Catchyoo Interactive Table.  This application caught my eye back in early 2007, when I was working on projects for large interactive displays for my HCI and Ubicomp classes, and was thinking about creating a table-top application.


My hunch is that LM3LABS has set the foundation for further growth in the future, given the lessons they've learned by taking risks with postWIMP technologies over the past few years!


Previous Blog Posts Related to LM3LABS:
Interactive Retail Book: Celebrating the History of Christian Dior from 1948-2010 (video)
Ubiq Motion Sensor Display at Future Ready Singapore (video)
Interactive Virtual DJ on a Transparent Pane, by LM3LABS and Brief Ad
LM3LABS' Catchyoo Interactive Koi Pond: Release of ubiq'window 2.6 Development Kit and Reader
A Few Things from LM3LABS
LM3LABS, Nicolas Loeillot, and Multi-touch
More from LM3LABS: Ubiq'window and Reactor.cmc's touch screen shopping catalog, Audi's touch-less showroom screen, and the DNP Museum Lab.


About LM3LABS
"Founded in 2003 by a team of passionate researchers, engineers, designers, and marketers from various international backgrounds, focused on fast transformation of innovation into unique products, LM3LABS is a recognized pioneer in computer vision-based interactivity solutions. Keeping a strong customer focus, LM3LABS' team of unique people pioneers new directions, explores new concepts, new technologies and new interactions.  Engaging, playful and magic, LM3LABS' products and solutions are always scalable and reliable"

info@lm3labs.com

Note to readers:
Over the past couple of years there has been an explosion of post-WIMP technologies and applications, and at this pace, it has been difficult for me to keep abreast of it all. There is quite a bit I miss, given my full-time job and daily life!

I welcome information about postWIMP interactive technologies and applications from my readers.  Due to time constraints, not interest, I am not always able to post about a topic as soon as I'd like.  That is OK, as my intention is not to be the first blogger to spread the latest tech news.  I like to dig in deep when I can and make connections between innovative, interesting technologies and the people and ideas behind them. 




Mar 29, 2011

SIFTEO, the next-gen Siftables! (Tangible User Interfaces for All)

Despite my enthusiasm for TUIs, I somehow missed the news about the transformation of Siftables into a commercial version, Sifteo:

Sifteo Inc. Debuts Sifteo™ Cubes - A New Way To Play (PDF)



"Sifteo cubes are 1.5 inch computers with full-color displays that sense their motion, sense each other, and wirelessly connect to your computer. You, your friends, and your family can play an ever-growing array of interactive games that get your brain and body engaged.
Sifteo’s initial collection of titles includes challenging games for adults, fun learning puzzles for kids, and games people can play together." -Sifteo website
For more information, see the Sifteo website, blog, and YouTube channel. If you can't wait to get your own set, take a look at Josh Blake's Sifteo Cube Unboxing Video!

RELATED
About two years ago, I was interviewed about my thoughts on the interactive, hands-on, programmable cubes, then called Siftables, for an article published in IEEE's Computing Now magazine: Siftables Offer New Interaction Mode (James Figueroa, Computing Now, 3/2009).

For those of you who'd like more information about tangible user interfaces (TUIs) and the development of Siftables, I've copied my 2009 post, Tangible User Interfaces, Part I: Siftables, below:

TANGIBLE USER INTERFACES, PART I: SIFTABLES (2009)
In 1997, the vision of tangible user interfaces, also known as TUIs, was outlined by Hiroshi Ishii and Brygg Ullmer of the Tangible Media Group at MIT, in their paper, "Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms" (pdf).   According to this vision, "the goal of Tangible Bits is to bridge the gaps between both cyberspace and the physical environment, as well as the foreground and background of human activities." This article is a must-read for anyone interested in "new" interactive technologies.

The pictures in the article of the metaDesk, transBoard, activeLENS, and ambientRoom, along with the references, are worth a look, for those interested in this seminal work.

Another must-read is Hiroshi Ishii's 2008 article, Tangible Bits: Beyond Pixels (pdf). In this article, Ishii provides a good overview of TUI concepts as well as the contributions of his lab to the field since the first paper was written.

Related to Tangible User Interface research is the work of the Fluid Interfaces Group at MIT. The Fluid Interfaces Group was formerly known as the Ambient Intelligence Group, and many of the group's projects incorporate concepts related to TUI and ambient intelligence. 



According to the Fluid Interfaces website, the goal of this research group is to "radically rethink the human-machine interactive experience. By designing interfaces that are more immersive, more intelligent, and more interactive we are changing the human-machine relationship and creating systems that are more responsive to people's needs and actions, and that become true "accessories" for expanding our minds."

The Siftables project is an example of how TUI and fluid interface (FI) interaction can be combined. Siftables is the work of David Merrill and Pattie Maes, in collaboration with Jeevan Kalanithi, and was brought to popular attention through David Merrill's recent TED talk:

David Merrill's TED Talk: Siftables - Making the digital physical
-Grasp Information Physically

"Siftables aims to enable people to interact with information and media in physical, natural ways that approach interactions with physical objects in our everyday lives. As an interaction platform, Siftables applies technology and methodology from wireless sensor networks to tangible user interfaces. Siftables are independent, compact devices with sensing, graphical display, and wireless communication capabilities. They can be physically manipulated as a group to interact with digital information and media. Siftables can be used to implement any number of gestural interaction languages and HCI applications....
Siftables can sense their neighbors, allowing applications to utilize topological arrangement..No special sensing surface or cameras are needed."
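That neighbor-sensing capability is what makes gestural arrangements (grouping, ordering, sequencing) possible. As a purely hypothetical illustration - the cube IDs and the report format below are invented, and real Siftables/Sifteo cubes expose neighbor events through their own SDK - here is how an application might turn per-cube adjacency reports into a left-to-right ordering:

# Hypothetical sketch: turning neighbor reports from tangible cubes into an ordering.
# The cube IDs and the "reports" data structure are invented for illustration only.

# Each entry: cube_id -> id of the cube it senses on its RIGHT side (or None).
reports = {
    "cube_A": "cube_C",
    "cube_C": "cube_B",
    "cube_B": None,
}

def left_to_right_order(right_neighbors):
    """Return cube IDs ordered left to right from single-sided adjacency reports."""
    has_left_neighbor = set(right_neighbors.values()) - {None}
    # The leftmost cube is the one that no other cube points to.
    start = next(c for c in right_neighbors if c not in has_left_neighbor)
    order, current = [], start
    while current is not None:
        order.append(current)
        current = right_neighbors.get(current)
    return order

print(left_to_right_order(reports))  # ['cube_A', 'cube_C', 'cube_B']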





Siftables Music Sequencer from Jeevan Kalanithi on Vimeo.



More about Siftables:
Rethinking display technology (Scott Kirsner, Boston Globe, 7/27/08)
TED: Siftable Computing Makes Data Physical
Siftables: Toward Sensor Network User Interfaces (pdf)

It seems that people either really like the Siftables concept, or they don't see the point. I found the following humorous critique of Siftables on YouTube:

"Imagine if all the little programs you had on your iphone were little separate chicklets in your pocket.
You'd lose em.
Your cat would eat em.
You'd vacuum them up.
They'd fall down in the sofa.
They'd be all over the car floor.
You'd throw them away by mistake..."

In my opinion, it is exciting to learn that perhaps some of this technology has the potential to become mainstream.


Dec 26, 2009

DIY multi-touch...

If you follow this blog, you know I like to share what people are doing with multi-touch and related natural user interfaces/interaction. In this post, I'd like to share an article about two students who decided to build and market a multi-touch table. The article below explains the story in depth, and the video shows the nuts and bolts.


Enterprising roommates build multi-touch LCD, market their business to West Coast
Walter Valencia, Collegiate Times 12/1/09



According to the above article, Aaron Bitler and Brady Simpson were inspired by CNN's Magic Wall during the 2008 election.  Bitler and Simpson learned more about natural user interface/interaction during a presentation in a business class that featured a video about the Microsoft Surface table and natural user interface technologies.  They formed a company, 3M8, to build and market multi-touch displays/tables.


Vision x32 from Aaron Bitler on Vimeo.


From what I can tell, it looks like Bitler and Simpson relied on the DIY information and support from the NUI-group website to carry out their ideas. Bitler and Simpson met with representatives of 22Miles, a company located in San Jose that provides interactive solutions, including multi-touch, for web, mobile, and touch screen implementations.

I'll post more about 22Miles in an upcoming post.

Until then, take a look at 22Miles' promo video, featuring a huge 3D interactive multi-touch heart:

Oct 12, 2010

Update on Josh Blake, newly designated Microsoft Surface MVP

Josh Blake is the Tech Lead of the InfoStrat Advanced Technology Group in DC.  He has been creating multi-touch applications for Microsoft's Surface multi-user table-tops for a while. Recently, his team built a suite of applications designed for use by young children at a museum.  Below is a video demonstration of some of this work. It really looks exciting!


Microsoft Surface and Magical Object Interaction

Josh Blake's blog is called Deconstructing the NUI. For those of you new to this blog, NUI stands for Natural User Interface (also known as Natural User Interaction).  See his post, Microsoft Surface and Magical Object Interaction, for more information!

RELATED
Here is a plug for Josh Blake's book, "Multitouch on Windows"

Book Ordering Information

FYI:  InfoStrat  is hiring  WPF experts as well as Microsoft CRM and Microsoft SharePoint experts.


Microsoft Surface MVPs
Dr. Neil Roodyn
Dennis Vroegop
Rick Barraza
Joshua Blake





Apr 22, 2009

From the NUITEQ (Natural User Interface) Gallery, via Harry van der Veen

Kids take to multi-touch interaction naturally!

The following photos are from Harry van der Veen's Multi-touch blog. (Harry was one of the founding members of the NUI-Group, and is also the CEO of NUITEQ - Natural User Interface.)

The last two pictures are of the HP TouchSmart running NUI Suite Snowflake software, developed by Natural User Interface Europe AB (NUITEQ) for thin LCD, plasma, and TFT displays.






May 19, 2008

More Multi-Touch from members of the NUI group!

It is always exciting to see what members of the NUI group are doing!

Here is a new video of a multi-touch creation by some of the members of the NUI group. Although this is a proof-of-concept example, it is fun to see how it is played out, using the little iPhone-like touch-pad widgets as a navigation tool for the large screen.


Read the "Multi-touch Goodness" article in Gizmodo of an interview with Christian Moore about this demo and his Lux open-source framework. (Christian is a colleague of Harry van der Veen, both members of the NUI group.)

Here is an excerpt from the interview:
"JD: Why Flash?
CM: Because it's fast to prototype in. However, the software is broken into several segments. One C++ application that tracks hands that talks to Flash... WPF... or another C++ app... and basically everything you can imagine. You can enable multitouch in any environment, like Cocoa."

High-resolution screen shots and additional information can be found on the nuiman website.

For my tech-minded readers:
I'm pretty sure that the C++ application that tracks hands and fingers in the video demo uses Touchlib, a library for creating multi-touch interaction. Touchlib can work with TUIO, a protocol for tabletop tangible user interfaces. Applications such as Flash and Processing support TUIO. For more information about TUIO, read "TUIO: A Protocol for Table-Top Tangible User Interfaces".
(Information from the NUI group website mentions that OpenCV, the Open Computer Vision Library, found on SourceForge, can support blob detection and tracking.)
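Since TUIO is just a convention layered on top of OSC, you can watch the messages a tracker like Touchlib sends with only a few lines of code. Here is a minimal sketch using the third-party python-osc package (my choice for the example, not something the NUI group prescribes); it listens on TUIO's default UDP port 3333 and prints 2D cursor updates:

# Minimal sketch of a TUIO listener using the python-osc package (pip install python-osc).
# TUIO trackers such as Touchlib send OSC messages to UDP port 3333 by default.
from pythonosc import dispatcher, osc_server

def on_2d_cursor(address, *args):
    # /tuio/2Dcur bundles carry "alive", "set", and "fseq" sub-messages.
    if args and args[0] == "set":
        session_id, x, y = args[1], args[2], args[3]  # x and y are normalized to 0..1
        print(f"cursor {session_id}: x={x:.3f} y={y:.3f}")

d = dispatcher.Dispatcher()
d.map("/tuio/2Dcur", on_2d_cursor)

server = osc_server.BlockingOSCUDPServer(("0.0.0.0", 3333), d)
print("Listening for TUIO cursors on UDP 3333...")
server.serve_forever()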

The people behind TUIO are from the Reactable project, of the Music Technology Group at Pompeu Fabra University in Barcelona.

Here is my plug for the NUI group, once again!

"The NUI group, or Natural User Interface Group, is an interactive media group researching and creating open source machine sensing techniques to benefit artistic and educational applications.

We offer a collaborative environment for developers that are interested in learning and sharing new HCI (Human Computer Interaction) methods and concepts. This may include topics such as: voice/handwriting/gesture recognition, touch computing, computer vision, and information visualization."


FYI
I came across Harry van der Veen and the NUI group in early 2007 when I was working on touch-screen projects for my HCI and Ubicomp classes, and I'm inspired by all of the creativity I've seen coming from this group.

If you'd like to see more demos, visit the Natural User Interface website, a commercial outgrowth of Harry and his colleagues' work, where you can view a reel that includes a few touch-screen games. I love the vision statement on this site:

"Technology should enable us to interact with computers, in the same way we interact with the real world; in a way which is natural to us, namely through gestures, expressions, movements, and manipulations. Our vision is to change the way people interact with computers."

May 10, 2009

Michael Haller Discusses Multi-touch, Interactive Surfaces, and Emerging Technologies for Learning

I came across an excellent overview of interactive display technologies that hold promise for education. The link below is a research article written by Michael Haller for Becta, the British Educational Communications and Technology Agency.

Emerging Technologies for Learning: Interactive Displays and Next Generation Interfaces (pdf)
Michael Haller, Becta Research Report, Volume 3 (2008)


"Multi-touch and interactive surfaces are becoming more interesting, because they allow a natural and intuitive interaction with the computer system.

These more intuitive and natural interfaces could help students to be more
actively involved in working together with content and could also help improve whole-class teaching activities. As these technologies develop, the barrier of having to learn and work with traditional computer interfaces may diminish.

It is still unclear how fast these interfaces will become part of our daily life and
how long it will take for them to be used in every classroom. However, we strongly believe that the more intuitive the interface is, the faster it will be accepted and used. There is a huge potential in these devices, because they allow us to use digital technologies in a more human way." -Michael Haller

Michael Haller works at the department of Digital Media of the Upper Austria University of Applied Sciences (Hagenberg, Austria), where he is the head of the Media Interaction Lab.

Michael co-organized the Interaction Tomorrow course at SIGGRAPH 2007, along with Chia Shen, of the Mitsubishi Electric Research Laboratories (MERL). Lecturers included Gerald Morrison, of Smart Technologies, Bruce H. Thomas, of the University of South Australia, and Andy Wilson, of Microsoft Research. The course materials from Interaction Tomorrow are available on-line, and include videos, slides, and course notes.

Below is an excerpt from the description of the Interaction Tomorrow SIGGRAPH 2007 course:

"Conventional metaphors and underlying interface infrastructure for single-user desktop systems have been traditionally geared towards single mouse and keyboard-based WIMP interface design, while people usually meet around a table, facing each other. A table/wall setting provides a large interactive visual surface for groups to interact together. It encourages collaboration, coordination, as well as simultaneous and parallel problem solving among multiple people.

In this course, we will describe particular challenges and solutions for the design of direct-touch tabletop and interactive wall environments. The participants will learn how to design a non-traditional user interface for large horizontal and vertical displays. Topics include physical setups (e.g. output displays), tracking, sensing, input devices, output displays, pen-based interfaces, direct multi-touch interactions, tangible UI, interaction techniques, application domains, current commercial systems, and future research."

It is worth taking the time to look over Haller's other publications. Here are a few that would be good to read:

M. Haller, C. Forlines, C. Koeffel, J. Leitner, and C. Shen, "Tabletop Games: Platforms, Experimental Games and Design Recommendations," Springer, 2009. In press. [bibtex]

A. D. Cheok, M. Haller, O. N. N. Fernando, and J. P. Wijesena, "Mixed Reality Entertainment and Art," International Journal of Virtual Reality, vol. X, p. X, 2009. In press. [bibtex]

J. Leitner, C. Köffel, and M. Haller, "Bridging the gap between real and virtual objects for tabletop games," International Journal of Virtual Reality, vol. X, p. X, 2009. In press. [bibtex]

M. Haller and M. Billinghurst, "Interactive Tables: Requirements, Design Recommendations, and Implementation," IGI Publishing, 2008. [bibtex]

D. Leithinger and M. Haller, "Improving Menu Interaction for Cluttered Tabletop Setups with User-Drawn Path Menus," Second Annual IEEE International Workshop on Horizontal Interactive Human-Computer Systems (TABLETOP '07), pp. 121-128, 2007. [bibtex]

J. Leitner, J. Powell, P. Brandl, T. Seifried, M. Haller, B. Doray, and P. To, "Flux: a tilting multi-touch and pen based surface," in CHI EA '09: Proceedings of the 27th International Conference Extended Abstracts on Human Factors in Computing Systems, New York, NY, USA, 2009, pp. 3211-3216. [bibtex]

P. Brandl, J. Leitner, T. Seifried, M. Haller, B. Doray, and P. To, "Occlusion-aware menu design for digital tabletops," in CHI EA '09: Proceedings of the 27th International Conference Extended Abstracts on Human Factors in Computing Systems, New York, NY, USA, 2009, pp. 3223-3228. [bibtex]


References from the BECTA paper:

Elrod, S., Bruce, R., Gold, R., Goldberg, D., Halasz, F., Janssen, W., Lee, D., McCall, K., Pedersen, E., Pier, K., Tang, J., and Welch, B., "Liveboard: A Large Interactive Display Supporting Group Meetings, Presentations, and Remote Collaboration," CHI '92, ACM Press, New York, NY, USA, 1992, pp. 599-607.

Morrison, G., "A Camera-Based Input Device for Large Interactive Displays," IEEE Computer Graphics and Applications, vol. 25, no. 4, pp. 52-57, Jul/Aug 2005.

Albert, A. E., "The Effect of Graphic Input Devices on Performance in a Cursor Positioning Task," Proceedings of the Human Factors Society 26th Annual Meeting, Santa Monica, CA: Human Factors Society, 1982, pp. 54-58.

Dietz, P. H., and Leigh, D. L., "DiamondTouch: A Multi-User Touch Technology," ACM Symposium on User Interface Software and Technology (UIST), ISBN 1-58113-438-X, pp. 219-226, November 2001.

Rekimoto, J., "SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces," CHI 2002, 2002.

Kakehi, Y., Iida, M., Naemura, T., Shirai, Y., Matsushita, M., and Ohguro, T., "Lumisight Table: Interactive View-Dependent Tabletop Display Surrounded by Multiple Users," IEEE Computer Graphics and Applications, vol. 25, no. 1, pp. 48-53, 2005.

Streitz, N., Prante, T., Röcker, C., van Alphen, D., Magerkurth, C., and Stenzel, R., "Ambient Displays and Mobile Devices for the Creation of Social Architectural Spaces: Supporting Informal Communication and Social Awareness in Organizations," in Public and Situated Displays: Social and Interactional Aspects of Shared Display Technologies, Kluwer Publishers, 2003, pp. 387-409.

Ishii, H., Underkoffler, J., Chak, D., Piper, B., Ben-Joseph, E., Yeung, L., and Zahra, K., "Augmented Urban Planning Workbench: Overlaying Drawings, Physical Models and Digital Simulation," IEEE and ACM International Symposium on Mixed and Augmented Reality, ACM Press, Darmstadt, Germany.

Han, J. Y., "Low-Cost Multi-Touch Sensing Through Frustrated Total Internal Reflection," UIST '05, ACM Press, New York, 2005, pp. 115-118.

Hull, J., Erol, B., Graham, J., Ke, Q., Kishi, H., Moraleda, J., and Olst, D., "Paper-Based Augmented Reality," in Proceedings of the 17th International Conference on Artificial Reality and Telexistence (ICAT '07), Esbjerg, Denmark, November 28-30, 2007, IEEE, pp. 205-209.

Haller, M., Leithinger, D., Leitner, J., Seifried, T., Brandl, P., Zauner, J., and Billinghurst, M., "The Shared Design Space," in SIGGRAPH '06: ACM SIGGRAPH 2006 Emerging Technologies, p. 29, ACM Press, New York, NY, USA, 2006.

Research email: emtech@becta.org.uk

Main email: becta@becta.org.uk
URL: www.becta.org.uk

(This was also posted on the TechPsych blog.)

Mar 18, 2009

More for Multi-touch: NextWindow Plug-in for Natural User Interface's Snowflake Multi-touch Software - and more.



Those of you who have an HP TouchSmart, a Dell Studio One PC, or a NextWindow display might be interested in the new NUI plug-in that supports the NUI Suite Snowflake software. Here are the features of the plug-in, according to information from the Natural User Interface website:
  • Detailed user manual included with FAQ
  • Developed on fast and reliable C++ platform
  • Intuitive
  • Customizable
  • Gesture recognition library (see the sketch after this list)
  • TUIO/OSC (Open Sound Control) support (sending and receiving events)
  • Low level API
  • Hardware accelerated rendering
  • Support for wide variety of media types
  • Advanced window handler that supports scaling and rotation
  • Suitable for Windows® XP and Windows® Vista (Mac OSX and Linux can be developed on request)
  • Audio support
  • Single, dual support
  • Multi-threaded resource handler (For fast data visualization)
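A feature like "gesture recognition library" can sound abstract, so here is the kind of arithmetic such a library performs under the hood. This is an illustrative sketch of my own (not code from the Snowflake or NextWindow SDKs) that derives a pinch-zoom scale factor and a rotation angle from two touch points tracked across frames:

# Illustrative sketch of pinch-zoom/rotate math from two tracked touch points.
# Generic gesture arithmetic; not code from the Snowflake or NextWindow SDKs.
import math

def pinch_update(prev_pts, curr_pts):
    """prev_pts/curr_pts: ((x1, y1), (x2, y2)) for the same two touches, one frame apart.
    Returns (scale, rotation_radians) to apply to the manipulated object."""
    (px1, py1), (px2, py2) = prev_pts
    (cx1, cy1), (cx2, cy2) = curr_pts

    prev_dist = math.hypot(px2 - px1, py2 - py1)
    curr_dist = math.hypot(cx2 - cx1, cy2 - cy1)
    scale = curr_dist / prev_dist if prev_dist > 0 else 1.0

    prev_angle = math.atan2(py2 - py1, px2 - px1)
    curr_angle = math.atan2(cy2 - cy1, cx2 - cx1)
    rotation = curr_angle - prev_angle
    return scale, rotation

# Example: fingers move apart and twist slightly between two frames.
print(pinch_update(((0.40, 0.50), (0.60, 0.50)),
                   ((0.35, 0.48), (0.65, 0.52))))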

"NUI has partnered up with NextWindow™, an international leader in the development of optical multi-touch technology and the manufacturer of optical multi-touch screens, overlays and OEM touch components."

"NextWindow™'s integrated technology allows for natural and intuitive interaction of digital content on flat TFT, LCD and Plasma solutions."

"The NUI NextWindow™ plug-in can be used with any programming language that supports TUIO, i.e. C/C++/C#, Java, Flash, Python, VVVV etc, meaning that software developers can run their own applications on NextWindow™, utilizing the NUI NextWindow™ plug-in."

Comment:
I became a fan of NextWindow touch-screen displays in early 2007 when I worked on a couple of touch-screen projects in my HCI and Ubicomp classes at UNC-Charlotte.


I've been using my HP TouchSmart PC at work with students with disabilities. I'm experimenting with the NUI Suite Snowflake on my TouchSmart, and I've found that interacting with the Particles application delighted students with severe autism. The activities provided opportunities to establish joint attention. I also noticed an increase in the number of vocalizations and/or verbalizations among the students. Of course, this was NOT a scientific study.

RELATED
Definition of Joint Attention from UConn:

"Joint Attention is the process of sharing one’s experience of observing an object or event, by following gaze or pointing gestures. It is critical for social development, language acquisition, cognitive development…"



Establishing joint attention is an important step in the development of social interaction skills among young people who have autism spectrum disorders.

More about joint attention:

Joint Attention Study Has Implication for Understanding Autism
Science Daily, 9/29/07

Asperger-Advice: Joint Attention

Autism Games: Joint Attention and Reciprocity

Why is joint attention a pivotal skill in autism?
Tony Charman
Philos Trans R Soc Lond B Biol Sci. 2003 February 28; 358(1430): 315–324.
doi: 10.1098/rstb.2002.1199.

Aug 9, 2009

Surface Flight Tracker Video from fboweb labs / flightwise.com, with background music by Art of Noise for your NUI pleasure.



This flight-tracker application for the Surface looks fun to use. As I watched the video, I realized that it wasn't the application itself that I liked. It was the music that accompanied the video: a track from the '80s synth-pop band Art of Noise.

Since I'm a music lover, the music got me thinking.

Wouldn't it be great if productivity/work-related applications like Flight Tracker could be developed to provide a means for incorporating a soundtrack?


Several thoughts and ideas flashed into my mind:

  • Surface and related natural user interface/interaction (NUI) applications have the potential to transform routine, ho-hum work tasks into activities that are a bit more pleasant. Since people often listen to music while they work, it stands to reason that NUI productivity applications should incorporate a music component, at least as an option.
  • To support a user-centered music platform for NUI applications, the application could incorporate a "smart" music library within the system, with the capability of effortlessly integrating music libraries and playlists from users' mobile devices, as well as the web. (Of course, there are privacy/security and firewall issues to address, but that is another story.)
  • Users could have a choice of listening to their own music playlists (including a shuffle option), selecting from a variety of presets, or going for something like the iTunes Genius effect: listening to music generated from an algorithm that takes into account music preferences and user interaction with the productivity application over time.
  • Since many Surface/NUI applications are designed to support collaborative work and interaction between two or more people, the music situation could get a bit complicated, since people have differing tastes. If co-workers disagreed about the music selection, the program would automatically default to generic elevator music, or silence (a toy sketch of this fallback rule follows the list).
  • NUI applications might even pave the way for a new genre of music. This concept isn't too far-fetched. Think of all the music we've come to love over the years that was composed for movies and even video games!
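Here is the promised toy sketch of that fallback rule. Everything in it (user names, genres, the neutral default) is invented purely for illustration: play something from the intersection of everyone's preferences, and drop to a neutral default when there is no overlap:

# Toy sketch of the "shared soundtrack" fallback rule described above.
# User names and genre preferences are invented for illustration.
def pick_soundtrack(user_preferences, neutral_default="ambient instrumental"):
    """user_preferences: dict mapping each user to a set of acceptable genres."""
    if not user_preferences:
        return neutral_default
    shared = set.intersection(*user_preferences.values())
    # If the collaborators agree on at least one genre, use it; otherwise fall back.
    return sorted(shared)[0] if shared else neutral_default

solo = {"alice": {"synth-pop", "jazz"}}
pair = {"alice": {"synth-pop", "jazz"}, "bob": {"jazz", "metal"}}
clash = {"alice": {"synth-pop"}, "bob": {"metal"}}

print(pick_soundtrack(solo))   # first acceptable genre alphabetically: jazz
print(pick_soundtrack(pair))   # jazz (the only genre both users share)
print(pick_soundtrack(clash))  # ambient instrumental (no overlap, so fall back)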

If you know of anyone that is working on this concept, or would like to collaborate with me sometime in the future on this concept, please let me know. I'm slowly working on an interactive timeline prototype, and I have some ideas about adding a music/sound track component.

RELATED
Art of Noise - Close To The Edit (Version 1):

Dec 13, 2012

Connecting: Exploration of the Future of Interaction Design and User Experience - Good for promoting CSEd!

I've been looking for a relatively short video about human-computer interaction and related fields to include in a presentation I'm planning for high school students. The presentation is my small part to promote Computer Science Education Week (CSEd Week).

One of the goals of CSEd Week is to spread the word that computer science education is much more than learning how to program. Technical and computational thinking skills are important to develop, but young people also need to know what sort of things they can do with these skills as they become adults in our technological society. As stated on the CSEd Week website: "Computing professionals work on creative teams to develop cutting-edge products and solutions that save lives, solve health problems, improve the environment, and keep us connected."

Coincidentally, I was pleasantly surprised by a tweet I received today that linked to Connecting, a well-produced 18-minute video about interaction and user experience design. This video would be great to share with high school students.


Connecting (Full Film) from Bassett & Partners on Vimeo.

The video features a number of well-spoken, creative professionals who are passionate about their work, people, and the future.  Although the video is a bit techno-centric, it depicts people who live and breathe technology in a favorable light.  It also inspires some degree of thought and reflection on the part of the viewer.

Although much of what is discussed in Connecting is futuristic, the seeds were planted years ago.  If you are new to the HCI/UX/ID/UCD world, it might help to read Mark Weiser's article, The Computer for the 21st Century, published in Scientific American in 1991, before viewing the video.

After viewing the video, I encourage you to take the time to read some of the comments on the Vimeo website.  Also read  Marc Rettig's comments, posted on the IxDA website:  "A film about interaction design: what it says about us".  

Near the end of the video, there is a discussion about where we might be headed, as interconnected, technically enhanced, augmented humans.  Hopefully, we will not create, and then be assimilated into, a Borg-like collective, or live out our days in a Matrix-like disembodied state.

In the wrong hands, what might happen?

Is resistance futile?!

FYI: Connecting was produced by Microsoft, Windows Phone Design Studio: Mike Kruzeniski (now at Twitter), Kat Holmes, and Albert Shum, and featured interviews with the following people:

Matt Jones, BERG London
Raphael Grignani, Method
Liz Danzico, School of Visual Arts, New York
Blaise Aguera y Arcas, Architect of Bing Mobile and Bing Maps
Helen Walters,  Writer, Editor, Researcher at Doblin/Monitor
Younghee Jung, Research Leader, Nokia
Massimo Banzi, Co-Founder, Arduino
Jennifer Bove, Co-Founder, Managing Director, Kicker Studio
Robert Murdock, Principal, Method (Artefact)
Jonas Lowgren, Professor of IxD, Malmo University, Sweden
Eric Rodenbeck, CEO, Founder, Creative Director, Stamen Design
Robert Fabricant, VP of Creative, Frog Design
Andrei Herasimchuck, Twitter 

The video was first screened in Seattle, Washington, last April, with a panel discussion that included Rob Girling and Gavin Kelly, of Artefact, Bill Buxton, of Microsoft, and Scott Nazarian, of Frog Design.

Description of the "Connecting" video, from Bassett & Partners' Vimeo site:

"The 18 minute "Connecting" documentary is an exploration of the future of Interaction Design and User Experience from some of the industry's thought leaders. As the role of software is catapulting forward, Interaction Design is seen to be not only increasing in importance dramatically, but also expected to play a leading role in shaping the coming "Internet of things." Ultimately, when the digital and physical worlds become one, humans along with technology are potentially on the path to becoming a "super organism" capable of influencing and enabling a broad spectrum of new behaviors in the world." -Bassett & Partners

Selected Quotes:

Liz Danzico:
"It's understanding that ecosystem, where the human in the center, and understanding that network of things, and how they all work together, rather than of your device or thing being in the center."

Younghee Jung:
"... you can not necessarily foresee the consequences when people adopt what you designed..to see something completely different from what you created. .it is like throwing a stone in the water, and you don't know what it will cause."

Blaise Aguera y Arcas:
"....these are all augmentations of abilities as humans. And when the augmentation really works, then that extension of yourself feels natural, and beautiful and does what you want, and doesn't get in the way....The use of voice, and the use of natural gestures... you are removing the extraneous, you are removing the artificial."

Massimo Banzi:
"...Something that can do it's own thing, disappearing in the background, is correct"  (nod to Weiser)

Jennifer Bove:
"...it is really important to look at what the consequences are of putting these products into the world when we think about things like the phone...the way it has changed our behavior, it can be enabling, and also disrupting...for these things to change our lives for the better, or enable for them to let us do things we couldn't do before.. they have to feel natural, and feels like a conversation." 

Robert Murdock:
"How you actually design and enact a living system in UX is something that is quite challenging...you have to think about patterns of desired outcomes and behaviors you want to achieve, instead of moving a user through one flow in an experience."

Jonas Lowgren: 
"...back in the day.. it was one user, one task, one computer,  its all gone now, its is much more like you are setting the stage, really,  for other people to perform, but you can never tell them what to do."

Eric Rodenbeck:
"....the map is like a living thing, that is being made up of everything we got. The idea that it is different in the morning than what it was in the evening, is a really good idea to stay connected to the idea that the world is changing."

Helen Walters:
"What we need is for designers to be embedded in the topics that are really, really important right now, so there can be a better synergy between design, and business design, and social change design, and entrepreneurship."

Andrei Herasimchuck
"That is where the future lies with us. There will be software in everything..You can take all of those (digital) pieces, and you can design all kind of things around it. People are now actually entering their lives and what is going around them, into a digital format, and so we will start do things with that in the future, and I think it will be exciting."

Robert Fabricant:
"The network is sampling the world, and knowing what is cropping up where, being able to match and find patterns...and anticipate outbreaks of diseases. ..  We are trying  now to collect from the periphery a much richer set of what is going on the world so we can learn as a society and optimize and evolve the right systems and services".

SOMEWHAT RELATED
IxDA
Experientia: Putting People First 
What's the Difference- IXD, IA, UXD, HCI, UCD, UX (Jon Karpoff)