Showing posts sorted by relevance for query "natural user interface". Sort by date Show all posts

Sep 26, 2009

More Multi-touch and Gesture-based Natural User Interfaces: Bamboo Wacom Tablet; Multi-touch PresTop Kiosk and Snowflake Suite software

Wacom Tablets Get Multi-Touch, Gestures
(Charlie Sorrel, Wired, 9/24/09)
"For the tech-curious, the new tablets have 512 pressure levels in the pen tip and the active area of the tablet is 5.8 x 3.6 inches, and all lose the in-pack mouse (for obvious reasons). The Touch and the Pen models are both $70, and the Pen & Touch is $100. Also, if you were thinking of buying Photoshop Elements 7 for the same price, get a tablet instead — Elements comes in the box."




http://www.wired.com/images_blogs/gadgetlab/2009/09/cth460k_3-660x371.jpg

Official Wacom Video

"Bamboo Touch is new type of computer input device by Wacom that lets you navigate and perform commands like zoom, scroll, rotate and more with a series of simple finger taps and hand gestures. Bamboo Touch brings Multi-Touch capability to your Mac or PC"

Video from a Wacom user:

A nice alternative to a mouse.  I'm going to get one for my laptop!


Multi-touch Kiosks!
Press release:  Dutch touchscreen supplier PresTop partners with Natural User Interface (NUITEQ)
 
http://prestop.nl/images/gallery/products/st_UU_zuil_wit.png
http://prestop.nl/images/gallery/products/st_DSC02106.png

RELATED

I couldn't find any video clips of PresTop's multi-touch interaction. From what I can tell, PresTop multi-touch screens will be using SnowFlake Suite from Natural User Interface Technologies AB.

How-to:SnowFlake Suite Flash multi-touch Interactable component (NUIversity)

Without a single line of code, you can do quite a bit with Snowflake Suite

"This video covers how to make a rotatable and scalable image. The beauty about this is, that we have developed a Flash mouse input simulator, so that there is no need for multi-touch hardware in order to develop your applications. Simply simulate multiple mouse inputs for multi-touch. This project is still in alpha phase and a download will become available with the next release of Snowflake Suite 1.7 for the NextWindow platform and camera based multi-touch solutions."
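For the tech-curious: the rotate-and-scale math behind a two-finger interaction like the one in this video is surprisingly small. Here is my own Python sketch of the idea (an illustration, not NUITEQ's actual code) - given where two fingers started and where they are now, it returns the scale and rotation to apply to the image under them:

```python
import math

def two_finger_transform(p1, p2, q1, q2):
    """Given the start positions (p1, p2) and the current positions
    (q1, q2) of two touch points, return (scale, rotation_degrees)
    to apply to the image being manipulated."""
    # Vector between the two fingers, before and after the move
    vx0, vy0 = p2[0] - p1[0], p2[1] - p1[1]
    vx1, vy1 = q2[0] - q1[0], q2[1] - q1[1]
    # Scale is the ratio of the finger distances
    scale = math.hypot(vx1, vy1) / math.hypot(vx0, vy0)
    # Rotation is the change in the angle of that vector
    rotation = math.degrees(math.atan2(vy1, vx1) - math.atan2(vy0, vx0))
    return scale, rotation
```

For example, two fingers that start at (0, 0) and (1, 0) and end at (0, 0) and (0, 2) give a scale of 2.0 and a rotation of 90 degrees - a pinch-out combined with a quarter turn.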


Below is a video of single-touch interaction for PresTop, from Omnivision:


PresTop offers interactive hardware and software solutions that can be used indoors as well as in outdoor environments.

Dec 26, 2009

DIY multi-touch...

If you follow this blog, you know I like to share what people are doing with multi-touch and related natural user interfaces/interaction. In this post, I'd like to share an article about two students who decided to build and market a multi-touch table. The article below explains the story in depth, and the video shows the nuts and bolts.


Enterprising roommates build multi-touch LCD, market their business to West Coast*
Walter Valencia, Collegiate Times 12/1/09



According to the above article, Aaron Bitler and Brady Simpson were inspired by CNN's Magic Wall during the 2008 election. Bitler and Simpson learned more about natural user interface/interaction during a presentation in a business class that featured a video about the Microsoft Surface table and natural user interface technologies. They formed a company, 3M8, to build and market multi-touch displays/tables.


Vision x32 from Aaron Bitler on Vimeo.


From what I can tell, it looks like Bitler and Simpson relied on DIY information and support from the NUI Group website to carry out their ideas. Bitler and Simpson met with representatives of 22Miles, a company located in San Jose that provides interactive solutions, including multi-touch, for web, mobile, and touch screen implementations.

I'll post more about 22Miles in an upcoming post.

Until then, take a look at 22Miles' promo video, featuring a huge 3D interactive multi-touch heart:

May 19, 2008

More Multi-Touch from members of the NUI group!

It is always exciting to see what members of the NUI group are doing!

Here is a new video of a multi-touch creation by some of the members of the NUI group. Although this is a proof-of-concept example, it is fun to see how it is played out, using the little iPhone-like touch-pad widgets as a navigation tool for the large screen.


Read the "Multi-touch Goodness" article in Gizmodo, an interview with Christian Moore about this demo and his Lux open-source framework. (Christian is a colleague of Harry van der Veen; both are members of the NUI group.)

Here is an excerpt from the interview:
"JD: Why Flash?
CM: Because it's fast to prototype in. However, the software is broken into several segments. One C++ application that tracks hands that talks to Flash... WPF... or another C++ app... and basically everything you can imagine. You can enable multitouch in any environment, like Cocoa."

High-resolution screen shots and additional information can be found on the nuiman website.

For my tech-minded readers:
I'm pretty sure that the C++ application that tracks hands and fingers in the video demo uses Touchlib, a library for creating multi-touch interaction. Touchlib can work with TUIO, a protocol for tabletop tangible user interfaces. Applications such as Flash and Processing support TUIO. For more information about TUIO, read
"TUIO: A Protocol for Table-Top Tangible User Interfaces".
(Information from the NUI group website mentions that OpenCV, or Open Computer Vision Library, found on SourceForge, can support blob detection and tracking.)
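If you're curious what "blob detection and tracking" boils down to, here is a toy Python sketch of the tracking half: matching this frame's blob centroids to last frame's by nearest neighbor, so each touch keeps a stable ID from frame to frame. (This is my own illustration of the general technique, not Touchlib's actual code.)

```python
import math

def track_blobs(prev, current, max_dist=50.0):
    """Assign stable IDs to blob centroids.
    prev: dict {blob_id: (x, y)} from the last frame.
    current: list of (x, y) centroids detected this frame.
    Returns a new dict {blob_id: (x, y)}."""
    tracked = {}
    unclaimed = dict(prev)           # previous blobs not yet matched
    next_id = max(prev, default=0) + 1
    for cx, cy in current:
        # Find the closest unclaimed blob from the previous frame
        best_id, best_d = None, max_dist
        for bid, (px, py) in unclaimed.items():
            d = math.hypot(cx - px, cy - py)
            if d < best_d:
                best_id, best_d = bid, d
        if best_id is not None:
            del unclaimed[best_id]   # matched: the touch keeps its ID
            tracked[best_id] = (cx, cy)
        else:
            tracked[next_id] = (cx, cy)  # no match: a new touch appeared
            next_id += 1
    return tracked
```

Any blob left unclaimed after a frame is a touch that lifted off; real trackers also smooth positions and tolerate one or two dropped frames, but the nearest-neighbor matching above is the core idea.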

The people behind TUIO are from the Reactable project, of the Music Technology Group at Pompeu Fabra University in Barcelona:

Here is my plug for the NUI group, once again!

"The NUI group, or Natural User Interface Group, is an interactive media group researching and creating open source machine sensing techniques to benefit artistic and educational applications.

We offer a collaborative environment for developers that are interested in learning and sharing new HCI (Human Computer Interaction) methods and concepts. This may include topics such as: voice/handwriting/gesture recognition, touch computing, computer vision, and information visualization."


FYI
I came across Harry van der Veen of the NUI group in early 2007 when I was working on touch-screen projects for my HCI and Ubicomp classes, and I'm inspired by all of the creativity I've seen coming from this group.

If you'd like to see more demos, visit the Natural User Interface website, a commercial outgrowth of the work of Harry and his colleagues, where you can view a reel that includes a few touch-screen games. I love the vision statement on this site:

"Technology should enable us to interact with computers, in the same way we interact with the real world; in a way which is natural to us, namely through gestures, expressions, movements, and manipulations. Our vision is to change the way people interact with computers."

Dec 9, 2010

Interested in the OpenNI Initiative? OpenKinect? To learn more, read Josh Blake's Interview of Tamir Berliner of PrimeSense




Josh Blake, Deconstructing the NUI, 12/9/10



Josh Blake recently interviewed Tamir Berliner, one of the founders of PrimeSense. If you haven't heard, Microsoft's Kinect is based on technology licensed from PrimeSense. PrimeSense provides consumer electronics with natural user interaction capabilities. The good news is that the company recently released open-source middleware for natural interaction along with depth-camera drivers. It will be interesting to see how this plays out in the near future!




In the interview, Tamir discussed a number of topics related to post-WIMP technologies. He also announced the newly created OpenNI, "an industry-led, not-for-profit organization formed to certify compatibility and interoperability of Natural Interaction (NI) devices, applications, and middleware." It is good to see this level of support for the cause!


Here is a quote from the interview that I especially liked:

"I believe that till today the devices we’ve been using, made us learn greatly lot about them before we could use them and gain their value. I’m pretty sure everyone who is reading this has got at least 3 remotes sitting on his living room table, and at least once a week needs to help someone use their computer/media center/phone/etc. It’s time for that to change and it’s up to us, the technologists to make this revolution happen, it’s time for the devices to take the step of understanding what we want and making sure we get that, even without asking if it’s a trivial task as opening a door when we approach, closing the lights when we leave the room, even making sure we have hot water to shower with when we return from work or wake up in the morning, depends on what we normally do." -Tamir


RELATED
Here are a couple of videos from the OpenNI website that demonstrate OpenNI-compliant applications:

OpenNI-compliant real time skeleton tracking by PrimeSense


OpenNI-compliant real time SceneAnalyzer by PrimeSense



FYI: 
Josh Blake is the author of the Deconstructing the NUI blog. Over the past couple of years, he's explored natural user interfaces and interactions through his work on applications designed for Microsoft Surface and Win7 with Windows Presentation Foundation.
About a month ago, Josh organized OpenKinect, an on-line community to support collaboration among people interested in exploring ways to use Kinect with PCs and other devices.  An example of this effort is the open source code, libfreenect, which includes drivers and libraries for Windows, Linux, and OS X. 
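For the tech-minded: once libfreenect hands you a frame of 11-bit raw depth values, you still have to convert them to real-world units. Below is one first-order approximation that has circulated in the OpenKinect community; the coefficients are community-measured estimates, not official PrimeSense calibration, so treat this as illustrative:

```python
def raw_depth_to_meters(raw):
    """Approximate conversion of an 11-bit Kinect raw depth value to
    meters, using coefficients measured by the OpenKinect community
    (an illustrative estimate, not official calibration)."""
    if raw >= 1084:
        # Beyond the formula's useful range; the Kinect reports 2047
        # when it has no depth reading at all.
        return float('inf')
    return 1.0 / (raw * -0.0030711016 + 3.3309495161)
```

The relationship is inverse, so the sensor's depth resolution is much finer up close than at the far end of its range - one reason Kinect interaction design favors the one-to-three-meter zone.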


The Natural User Interface Revolution
Josh Blake, 1/5/09


Kinect for Xbox 360: The inside story of Microsoft's secret 'Project Natal'  (long, but worth reading) David Rowan, Wired UK, 10/29/10


People of libfreenect

OpenNI User Guide (pdf)

Mar 18, 2009

More for Multi-touch: NextWindow Plug-in for Natural User Interface's Snowflake Multi-touch Software - and more.



Those of you who have an HP TouchSmart, Dell Studio One PC, or NextWindow display might be interested in the new NUI plug-in that supports NUI Suite Snowflake software. Here are the features of the plug-in, according to information from the Natural User Interface website:
  • Detailed user manual included with FAQ
  • Developed on fast and reliable C++ platform
  • Intuitive
  • Customizable
  • Gesture recognition library
  • TUIO/OSC (Open Sound Control) support (sending and receiving events)
  • Low level API
  • Hardware accelerated rendering
  • Support for wide variety of media types
  • Advanced window handler that supports scaling and rotation
  • Suitable for Windows® XP and Windows® Vista (Mac OSX and Linux can be developed on request)
  • Audio support
  • Single, dual support
  • Multi-threaded resource handler (For fast data visualization)

"NUI has partnered up with NextWindow™, an international leader in the development of optical multi-touch technology and the manufacturer of optical multi-touch screens, overlays and OEM touch components."

"NextWindow™'s integrated technology allows for natural and intuitive interaction of digital content on flat TFT, LCD and Plasma solutions."

"The NUI NextWindow™ plug-in can be used with any programming language that supports TUIO, i.e. C/C++/C#, Java, Flash, Python, VVVV etc, meaning that software developers can run their own applications on NextWindow™, utilizing the NUI NextWindow™ plug-in."
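For the curious, the reason any TUIO-capable language can talk to the plug-in is that TUIO rides on top of OSC (Open Sound Control). Here is a toy Python encoder that builds a TUIO cursor "set" message by hand, just to show what actually goes over the wire - my own illustration; real projects would use an existing OSC library:

```python
import struct

def _osc_string(s):
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    b = s.encode('ascii') + b'\x00'
    return b + b'\x00' * (-len(b) % 4)

def osc_message(address, *args):
    """Encode a minimal OSC message supporting int, float, and string
    arguments (big-endian, per the OSC 1.0 spec)."""
    tags = ','
    payload = b''
    for a in args:
        if isinstance(a, int):
            tags += 'i'; payload += struct.pack('>i', a)
        elif isinstance(a, float):
            tags += 'f'; payload += struct.pack('>f', a)
        elif isinstance(a, str):
            tags += 's'; payload += _osc_string(a)
        else:
            raise TypeError('unsupported OSC argument: %r' % (a,))
    return _osc_string(address) + _osc_string(tags) + payload

# A TUIO 2D-cursor "set" message: session ID, position (x, y),
# velocity (X, Y), and motion acceleration (m)
msg = osc_message('/tuio/2Dcur', 'set', 7, 0.25, 0.5, 0.0, 0.0, 0.0)
```

A TUIO tracker bundles messages like this with "alive" and "fseq" messages and sends them over UDP (port 3333 by convention), which is why a Flash, C++, or Python client can all consume the same stream.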

Comment:
I became a fan of NextWindow touch-screen displays in early 2007 when I worked on a couple of touch-screen projects in my HCI and Ubicomp classes at UNC-Charlotte.


I've been using my HP TouchSmart PC at work with students with disabilities. I've been experimenting with the NUI Suite SnowFlake on my TouchSmart, and found that interacting with the Particles application delighted students with severe autism. The activities provided opportunities to establish joint attention. I also noticed an increase in the number of vocalizations and/or verbalizations among the students. Of course, this was NOT a scientific study.

RELATED
Definition of Joint Attention from UConn:

"Joint Attention is the process of sharing one’s experience of observing an object or event, by following gaze or pointing gestures. It is critical for social development, language acquisition, cognitive development…"

http://eigsti.psy.uconn.edu/jt_attn.JPG


Establishing joint attention is an important step in the development of social interaction skills among young people who have autism spectrum disorders.

More about joint attention:

Joint Attention Study Has Implication for Understanding Autism
Science Daily, 9/29/07

Asperger-Advice: Joint Attention

Autism Games: Joint Attention and Reciprocity

Why is joint attention a pivotal skill in autism?
Tony Charman
Philos Trans R Soc Lond B Biol Sci. 2003 February 28; 358(1430): 315–324.
doi: 10.1098/rstb.2002.1199.

Jan 20, 2009

More Multi-touch Multimedia: Video demonstration of applications created with Snowflake and Flash



This video showcases the work of Natural User Interface-AB, using NUI Suite 1.0 Snowflake and Flash.

Here is the plug from the company's website:
"Natural User Interface (NUI) is a Swedish innovative emerging technology company specializing in commercially available advanced multi-touch software, hardware and service solutions. NUI's solutions can convert an ordinary surface into an interactive, appealing and intelligent display that creates a stunning user experience."

For more information and links:

For Techies and the Tech Curious: Multi-touch/Gesture from the NUI-Group

Search this blog!

Mar 17, 2008

Look, touch, listen, and play: Seth Sandler's Interactive Audio Touch Table Video; NUI Group and Google's Summer of Code

Seth Sandler's most recent video of the Audio Touch interactive table provides a good demonstration of how multi-touch on a table can work with music.



Seth is a member of the NUI group (Natural User Interface). He is finishing a Bachelors degree in Interdisciplinary Computing and the Arts, with an emphasis on Music, at the University of California, San Diego. His research and development work centers around multi-touch, multi-user musical interfaces.

Here is an update about the NUI group:

"Natural User Interface or ~ NUI Group is an interactive media group researching and creating open source machine sensing techniques to benefit artistic and educational applications."

"We offer a collaborative environment for developers that are interested in learning and sharing new HCI (Human Computer Interaction) methods and concepts. This may include topics such as: voice/handwriting/gesture recognition, touch computing, computer vision, and information visualization."

The NUI group has been selected as a mentoring organization for Google's Summer of Code, for those of you who are interested in working on open-source code for multi-touch systems. The student application process begins Monday, March 24th, 2008, and ends Monday, March 31st, 2008.

NUI group's project ideas page outlines the requirements for the application, which include a 7500-word project proposal. The project page has a long list of ideas to spark some thinking for potential Summer of Code applicants.

For those of you who aren't into coding, I encourage you to take a look at the NUI Group's project ideas page just to get an idea of the interesting ideas that are being explored. The page has a list of links to other good resources.

Share the word with anyone who might be interested in the NUI Group's projects for the Summer of Code. We need to get more people interested in STEM careers, and the project ideas outlined by the NUI group look enticing.


Dec 31, 2009

Josh Blake's Nice Multi-touch and Natural User Interface Applications for Surface (Cross-Post)



Information from Josh's YouTube channel:
"This is a video of some of the cool multi-touch and Natural User Interface (NUI) applications I designed and developed for Surface and Windows 7." The InfoStrat.VE map control for WPF and Surface is available for free at http://virtualearthwpf.codeplex.com.
I especially like the moving ring-menu concept, as it facilitates smoother collaboration between people on an interactive table or surface, where flexible orientation control is important.

At 3:15, the demonstration of Josh's ink-shape recognition begins. This is a feature that would be great to incorporate in my applications for children with disabilities who have some fine-motor limitations.
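Ink-shape recognition can start from very simple geometric cues. Here is a toy Python classifier - my own sketch, not Josh's actual implementation - that tells a straight line from a closed shape by comparing the gap between a stroke's endpoints to its total path length:

```python
import math

def classify_stroke(points):
    """Toy ink-shape classifier for a stroke given as (x, y) points:
    'closed shape' if the stroke ends near where it began,
    'line' if it barely deviates from its endpoint-to-endpoint segment,
    'open curve' otherwise."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    # Total length of the drawn path
    path_len = sum(math.hypot(bx - ax, by - ay)
                   for (ax, ay), (bx, by) in zip(points, points[1:]))
    # Straight-line distance between the stroke's endpoints
    end_gap = math.hypot(x1 - x0, y1 - y0)
    if end_gap < 0.2 * path_len:
        return 'closed shape'
    if end_gap > 0.9 * path_len:
        return 'line'
    return 'open curve'
```

A real recognizer (the kind used for sketch interfaces) resamples and normalizes the stroke and compares it against stored templates, but even crude heuristics like these can be forgiving of shaky input - which is exactly why the feature appeals to me for children with fine-motor limitations.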


Josh's Blog:  Deconstructing the NUI
Josh's Recent Post about post-WIMP concepts:
Metaphors and OCGM
Josh works at InfoStrat

Aug 12, 2009

Do you have an HP TouchSmart, Dell Studio One, or NextWindow touch-screen? NUITech's Snowflake Suite upgrade provides a multi-touch plug-in.

If you have a NextWindow touch screen display, HP TouchSmart PC, or Dell Studio One PC, you might be interested to know that NUITech has upgraded their multi-touch software, Snowflake Suite. There is an evaluation download available on the NUITech website created for the NextWindow platform. (The TouchSmart and Studio One have NextWindow touchscreens.)

Here is a short clip of Snowflake 1.6 in action:

Snowflake Suite 1.6 lets users replace the standard content delivered with the software - images, videos, 3D models, and backgrounds - so it can be customized as needed.

RELATED

Natural User Interface AB has adopted new branding and a new name, Natural User Interface Technologies AB, or NUITech.

Press Release (8/12/09)

Evaluation version of Snowflake Suite for NextWindow systems (including TouchSmart, Studio One)

Snowflake Suite running on multiple NextWindow 2150 overlays for 22-inch screens:

(The previous version of Snowflake, depicted in the above video, was a finalist for the 2009 Stevie Award, "Best New Product or Service of the Year - Media & Entertainment".)



Singapore's Public Utility Board display, Singapore International Water Week, featuring NUITech systems and software.

SOMEWHAT RELATED
Press Release: Next Window Earns Coveted Windows 7 Logo Certification pdf

May 14, 2008

Multi-touch Crayon Physics

Also posted on the TSHWI blog:
Watch how you can draw simple shapes that can instantly turn into a game!


Multitouch Crayon Physics from multitouch-barcelona on Vimeo.
For a better version of this video, see http://www.vimeo.com/980528. For more information, see the RXSurface blog post.

On May 18, Multi-touch Crayon Physics will be released as alpha/beta open source!

The people behind RXSurface are members of the Natural User Interface (NUI) group:
"Natural User Interface or ~ NUI Group is an interactive media group researching and creating open source machine sensing techniques to benefit artistic and educational applications."



May 10, 2009

Michael Haller Discusses Multi-touch, Interactive Surfaces, and Emerging Technologies for Learning

I came across an excellent overview of interactive display technologies that hold promise for education. The link below is a research article written by Michael Haller for BECTA, formally known as the British Educational Communications and Technology Agency.

Emerging Technologies for Learning: Interactive Displays and Next Generation Interfaces (pdf)
Becta Research Report, Volume 3 (2008), Michael Haller


"Multi-touch and interactive surfaces are becoming more interesting, because they allow a natural and intuitive interaction with the computer system.

These more intuitive and natural interfaces could help students to be more
actively involved in working together with content and could also help improve whole-class teaching activities. As these technologies develop, the barrier of having to learn and work with traditional computer interfaces may diminish.

It is still unclear how fast these interfaces will become part of our daily life and
how long it will take for them to be used in every classroom. However, we strongly believe that the more intuitive the interface is, the faster it will be accepted and used. There is a huge potential in these devices, because they allow us to use digital technologies in a more human way." -Michael Haller

Michael Haller works at the department of Digital Media of the Upper Austria University of Applied Sciences (Hagenberg, Austria), where he is the head of the Media Interaction Lab.

Michael co-organized the Interaction Tomorrow course at SIGGRAPH 2007, along with Chia Shen of the Mitsubishi Electric Research Laboratories (MERL). Lecturers included Gerald Morrison of Smart Technologies, Bruce H. Thomas of the University of South Australia, and Andy Wilson of Microsoft Research. The course materials from Interaction Tomorrow are available on-line, and include videos, slides, and course notes.

Below is an excerpt from the description of the Interaction Tomorrow SIGGRAPH 2007 course:

"Conventional metaphors and underlying interface infrastructure for single-user desktop systems have been traditionally geared towards single mouse and keyboard-based WIMP interface design, while people usually meet around a table, facing each other. A table/wall setting provides a large interactive visual surface for groups to interact together. It encourages collaboration, coordination, as well as simultaneous and parallel problem solving among multiple people.

In this course, we will describe particular challenges and solutions for the design of direct-touch tabletop and interactive wall environments. The participants will learn how to design a non-traditional user interface for large horizontal and vertical displays. Topics include physical setups (e.g. output displays), tracking, sensing, input devices, output displays, pen-based interfaces, direct multi-touch interactions, tangible UI, interaction techniques, application domains, current commercial systems, and future research."

It is worth taking the time to look over Haller's other publications. Here are a few that would be good to read:

M. Haller, C. Forlines, C. Koeffel, J. Leitner, and C. Shen, 2009. "Tabletop Games: Platforms, Experimental Games and Design Recommendations." Springer, 2009. In press. [bibtex]

A. D. Cheok, M. Haller, O. N. N. Fernando, and J. P. Wijesena, 2009.
"Mixed Reality Entertainment and Art," International Journal of Virtual Reality, vol. X, p. X, 2009. in press [bibtex]

J. Leitner, C. Köffel, and M. Haller, 2009. "Bridging the gap between real and virtual objects for tabletop games," International Journal of Virtual Reality, vol. X, p. X, 2009. in press [bibtex]


M. Haller and M. Billinghurst, 2008.
"Interactive Tables: Requirements, Design Recommendations, and Implementation." IGI Publishing, 2008. [bibtex]

D. Leithinger and M. Haller, 2007. "Improving Menu Interaction for Cluttered Tabletop Setups with User-Drawn Path Menus," Horizontal Interactive Human-Computer Systems, 2007. TABLETOP 07. Second Annual IEEE International Workshop on, pp. 121-128, 2007. [bibtex]


J. Leitner, J. Powell, P. Brandl, T. Seifried, M. Haller, B. Dorray, and P. To, 2009."Flux: a tilting multi-touch and pen based surface," in CHI EA 09: Proceedings of the 27th international conference extended abstracts on Human factors in computing systems, New York, NY, USA, 2009, pp. 3211-3216. [bibtex]

P. Brandl, J. Leitner, T. Seifried, M. Haller, B. Doray, and P. To, 2009. "Occlusion-aware menu design for digital tabletops," in CHI EA 09: Proceedings of the 27th international conference extended abstracts on Human factors in computing systems, New York, NY, USA, 2009, pp. 3223-3228. [bibtex]


References from the BECTA paper:

Elrod, S., Bruce, R., Gold, R., Goldberg, D., Halasz, F., Janssen, W., Lee, D., McCall, K., Pedersen, E., Pier, F., Tang, J., and Welch, B., Liveboard: a large interactive display supporting group meetings, presentations, and remote collaboration, CHI ’92 (New York, NY, USA), ACM Press, 1992, pp. 599–607.

Morrison, G., ‘A Camera-Based Input Device for Large Interactive Displays’, IEEE Computer Graphics and
Applications, vol. 25, no. 4, pp. 52-57, Jul/Aug, 2005.

Albert, A. E. The effect of graphic input devices on performance in a cursor positioning task. Proceedings of the Human Factors Society 26th Annual Meeting, Santa Monica, CA: Human Factors Society, 1982, pp. 54-58.

Dietz, P.H., Leigh, D.L., DiamondTouch: A Multi-User Touch Technology, ACM Symposium on User
Interface Software and Technology (UIST), ISBN: 1-58113-438-X, pp. 219-226, November 2001.

Rekimoto, J., SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces,

CHI 2002, 2002.

Kakehi, Y., Iida, M., Naemura, T., Shirai, Y., Matsushita, M.,
Ohguro, T., ‘Lumisight Table: Interactive View-Dependent Tabletop Display Surrounded by Multiple Users’, In IEEE Computer
Graphics and Applications, vol. 25, no.1, pp 48 – 53, 2005.

Streitz, N., Prante, P., Röcker, C., van Alphen, D., Magerkurth, C.,
Stenzel, R., ‘Ambient Displays and Mobile Devices for the Creation of Social Architectural Spaces: Supporting informal communication and social awareness in organizations’ in Public and Situated Displays: Social and Interactional Aspects of Shared Display Technologies, Kluwer Publishers, 2003. pp. 387-409.


Ishii, H., Underkoffler, J., Chak, D., Piper, B., Ben-Joseph, E.,
Yeung, L. and Zahra, K., Augmented Urban Planning Workbench: Overlaying Drawings, Physical Models and Digital Simulation. IEEE and ACM International Symposium on Mixed and Augmented Reality ACM Press, Darmstadt, Germany.

Han, Y., Low-cost multi-touch sensing through frustrated total internal reflection, UIST ’05 (New York), ACM
Press, 2005, pp. 115–118.

Hull, J., Erol, B., Graham, J., Ke, Q., Kishi, H., Moraleda, J., Olst, D., Paper-Based Augmented Reality. In Proceedings of the 17th International Conference on Artificial Reality and Telexistence (Esbjerg, Denmark, November 28-30, 2007). ICAT ’07. IEEE, 205-209.

Haller, M., Leithinger, D., Leitner, J., Seifried, T., Brandl, P., Zauner, J., Billinghurst, M., The shared design space. In SIGGRAPH ’06: ACM SIGGRAPH 2006 Emerging technologies, page 29, New York, NY,USA, 2006. ACM Press.

Research email: emtech@becta.org.uk

Main email: becta@becta.org.uk
URL: www.becta.org.uk

(This was also posted on the TechPsych blog.)

Sep 26, 2010

Essential Interaction Design Essays and Articles: Dan Saffer's Lists, Don Norman, and Interactions Magazine

I came across a link to Dan Saffer's recent post, Essential Interaction Design Essays and Articles. Equally important is Dan Saffer's list: Top Ten Essential Interaction Design Books


Dan Saffer is one of my "important influences".  When I was taking HCI and Ubiquitous Computing courses, I bought the first edition of his book,  Designing for Interaction:  Creating Innovative Applications and Devices.  In today's world of technical convergence, it is an important read, as Saffer's content crosses a number of disciplines.

Thoughts:
It doesn't surprise me to learn that the #1 book on Saffer's Essential Interaction Design Books list is Don Norman's The Design of Everyday Things. According to Saffer, "there’s no getting around it: this is the book. Affordances, mental models, and other bits that have all become part of the general lexicon all started with The Don’s book. A must read."

Don Norman's book was required reading in the Human-Computer Interaction class I took a few years ago.  As I read through the book, I sensed a familiar tone.  I later learned that Don Norman was the co-author of a required textbook for one of the psychology courses I took when I was a university student the first time around.    



Don Norman's thinking has influenced me for decades - he continues to be an influence, because he writes articles for one of my favorite publications, Interactions Magazine:



It brightens up my day when I open up my mailbox - the one at the end of my real-life driveway - and find my Interactions magazine, in all of its well-designed, well-written, semi-glossy-paged glory, waiting for me to open up and read. The September/October 2010 issue includes articles on topics related to authenticity in new media, the complexity of "advancement", design and usability, and the politics of development.


A must-read is Gestural Interfaces: A Step Backwards in Usability, co-authored by Don Norman and his collaborator, Jakob Nielsen.


Here is an excerpt from the article, which highlights some of the problems of rushing to get products with natural-user interfaces out to market:
"Why are we having trouble? Several reasons:
  • The lack of established guidelines for gestural control
  • The misguided insistence by companies (e.g., Apple and Google) to ignore established conventions and establish ill-conceived new ones.
  • The developer community’s apparent ignorance of the long history and many findings of HCI research, which results in their feeling empowered to unleash untested and unproven creative efforts upon the unwitting public"
(Interactions Magazine is a publication of ACM CHI - the Association for Computing Machinery's Computer-Human Interaction interest group.)


Other articles by Don Norman, published in Interactions Magazine:
The Research-Practice Gap: The Need for Translational Developers 
Natural User Interfaces are not Natural 
The Transmedia Design Challenge: Technology that is Pleasurable and Satisfying
Technology First, Needs Last: The Research-Product Gulf
To be published, available on the jnd website:
Systems Thinking:  A Product is More Than The Product  


SOMEWHAT RELATED
My resource pages:
RESOURCES: Natural User Interaction, InfoViz, Multi-touch, Blog roll, and More - a huge mega-list of links! 
Conferences, Research, Resources page


Living with Complexity
Donald Norman, to be released in October 2010
Living with Complexity


Interactions Archives


Here is a list of books/articles suggested by Dan Saffer's readers:


Designing for Interaction – Saffer, D. (2nd Edition; 2009)
Thoughts on Interaction Design – Kolko, J. (2009)
The Humane Interface – Raskin, J.
Digital Ground – McCullough, M.
Inmates are running the Asylum – Cooper, A
Designing Interactions – Moggridge, B (ed.)
Everyware – Greenfield, A.
Designing Social Interfaces – Crumlish & Malone
Emotional Design – Norman, D.
Invisible Computer – Norman, D.
Persuasive Technology – Fogg, BJ
Thoughtful Interaction Design: A Design Perspective on Information Technology by Jonas Lowgren and Erik Stolterman (Paperback – Mar 30, 2007)

Designing Visual Interfaces by Mullet/Sano
Steve Krug – Don’t Make Me Think: A Common Sense Approach to Web Usability
Design Research: Methods and Perspectives edited by Brenda Laurel 
Information Architecture (“The Polar Bear Book”) by Peter Morville.


Thanks to Putting People First for the link to Dan Saffer's list!

Jan 28, 2011

"Microsoft is Imagining a NUI Future". You can, too!

Microsoft is Imagining a NUI Future
Steve Clayton, Next at Microsoft Blog, 1/26/11


"Our research shows that the vast majority of people polled in both developed and emerging markets see great potential for NUI applications beyond entertainment. This is especially true in China and India, where 9 out of 10 respondents indicate they are likely to use NUI technology across a range of lifestyle areas – from work, education and healthcare, to social connections, entertainment and the environment. We believe that taking technology to the next billion can be aided by NUI – making technology more accessible and more intuitive to a wider audience". - Steve Clayton, Microsoft


The people at Microsoft don't own the concept! I've been a member of the NUI Group since May 2007, and I also belong to SparkOn. Both are on-line communities where you can find people who live and breathe NUI, learn about their work, and even share designs and code. If you are intrigued by NUI - as a designer, developer, or user - please join us.


Note: 
I've been an evangelist and cheerleader for the NUI cause for many years.  If you search this blog for "post-WIMP", "NUI", "multi-touch", "gesture", "off-the-desktop", "natural user interaction", "natural user interface", or even "DOOH", you'll find an overwhelming number of posts that include videos, photographs, and links to NUI-related resources, including scholarly articles.  There is a small but growing number of people from many disciplines quietly working on NUI-related projects.


RELATED
Microsoft Plans a Natural Interface Future Full of Gestures, Touchscreens, and Haptics
Kit Eaton, Fast Company, 1/26/11
Rethinking Computing (video)
Craig Mundie, Microsoft
Interactive Touch-Screen Technology, Participatory Design, and "Getting It" - Revised
Touch Screen Interaction in Public Spaces:  Room for Improvement, if "every surface is to be a computer".

Mar 29, 2011

SIFTEO, the next-gen Siftables! (Tangible User Interfaces for All)

Despite my enthusiasm for TUIs, I somehow missed the news about the transformation of Siftables into a commercial version, Sifteo:

Sifteo Inc. Debuts Sifteo™ Cubes - A New Way To Play (PDF)



"Sifteo cubes are 1.5 inch computers with full-color displays that sense their motion, sense each other, and wirelessly connect to your computer. You, your friends, and your family can play an ever-growing array of interactive games that get your brain and body engaged.
Sifteo’s initial collection of titles includes challenging games for adults, fun learning puzzles for kids, and games people can play together." -Sifteo website
For more information, see the Sifteo website,  blog, and YouTube  channel.  If you can't wait to get your own set,  take a look at Josh Blake's Sifteo Cube Unboxing Video!

RELATED
About two years ago, I was interviewed about my thoughts on the interactive, hands-on, programmable cubes, then called Siftables, for an article published in IEEE's Computing Now magazine:  Siftables Offer New Interaction Mode (James Figueroa, Computing Now, 3/2009).

For those of you who'd like more information about tangible user interfaces (TUIs) and  the development of Siftables, I've copied my 2009 post,   Tangible User Interfaces, Part I:  Siftables,  below:

TANGIBLE USER INTERFACES, PART I: SIFTABLES (2009)
In 1997, the vision of tangible user interfaces, also known as TUIs, was outlined by Hiroshi Ishii and Brygg Ullmer of the Tangible Media Group at MIT, in their paper, "Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms" (pdf).   According to this vision, "the goal of Tangible Bits is to bridge the gaps between both cyberspace and the physical environment, as well as the foreground and background of human activities." This article is a must-read for anyone interested in "new" interactive technologies.

The pictures in the article of the metaDesk, transBoard, activeLENS, and ambientRoom, along with the references, are worth a look, for those interested in this seminal work.

Another must-read is Hiroshi Ishii's 2008 article, Tangible Bits: Beyond Pixels (pdf). In this article, Ishii provides a good overview of TUI concepts as well as the contributions of his lab to the field since the first paper was written.

Related to Tangible User Interface research is the work of the Fluid Interfaces Group at MIT. The Fluid Interfaces Group was formerly known as the Ambient Intelligence Group, and many of the group's projects incorporate concepts related to TUI and ambient intelligence. 



According to the Fluid Interfaces website, the goal of this research group is to "radically rethink the human-machine interactive experience. By designing interfaces that are more immersive, more intelligent, and more interactive we are changing the human-machine relationship and creating systems that are more responsive to people's needs and actions, and that become true "accessories" for expanding our minds."

The Siftables project is an example of how TUI and fluid interface (FI) interaction can be combined. Siftables is the work of David Merrill and Pattie Maes, in collaboration with Jeevan Kalanithi, and was brought to popular attention through David Merrill's recent TED talk:

David Merrill's TED Talk: Siftables - Making the digital physical
-Grasp Information Physically

"Siftables aims to enable people to interact with information and media in physical, natural ways that approach interactions with physical objects in our everyday lives. As an interaction platform, Siftables applies technology and methodology from wireless sensor networks to tangible user interfaces. Siftables are independent, compact devices with sensing, graphical display, and wireless communication capabilities. They can be physically manipulated as a group to interact with digital information and media. Siftables can be used to implement any number of gestural interaction languages and HCI applications....
Siftables can sense their neighbors, allowing applications to utilize topological arrangement... No special sensing surface or cameras are needed."
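
The neighbor-sensing idea in the quote above is easy to picture in code. Here is a minimal, hypothetical Python sketch (not the actual Siftables/Sifteo SDK; the class and function names are my own) showing how an application might consume the topological arrangement of cubes, with each cube reporting which neighbor touches each of its sides:

```python
# Hypothetical sketch of Siftables-style neighbor sensing (not the real SDK).
# Each cube knows its grid position; an app asks which cube touches each side.

class Cube:
    def __init__(self, name, x, y):
        self.name, self.x, self.y = name, x, y

def neighbors(cube, cubes):
    """Return a mapping from side name to the adjacent cube's name, if any."""
    sides = {"left": (-1, 0), "right": (1, 0), "top": (0, -1), "bottom": (0, 1)}
    found = {}
    for side, (dx, dy) in sides.items():
        for other in cubes:
            if other is not cube and (other.x, other.y) == (cube.x + dx, cube.y + dy):
                found[side] = other.name
    return found

cubes = [Cube("A", 0, 0), Cube("B", 1, 0), Cube("C", 1, 1)]
print(neighbors(cubes[1], cubes))  # B touches A on its left and C below
```

In the real devices the adjacency comes from hardware sensors rather than stored coordinates, but an application-level "gestural language" (stacking, lining up, pairing cubes) would be built on exactly this kind of topology query.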





Siftables Music Sequencer from Jeevan Kalanithi on Vimeo.

http://web.media.mit.edu/~dmerrill/images/music-against-wood-320x213.jpg


More about Siftables:
Rethinking display technology (Scott Kirsner, Boston Globe, 7/27/08)
TED: Siftable Computing Makes Data Physical
Siftables: Toward Sensor Network User Interfaces (pdf)

It seems that people either really like the Siftables concept or they don't see the point. I found the following humorous critique of Siftables on YouTube:

"Imagine if all the little programs you had on your iphone were little separate chicklets in your pocket.
You'd lose em.
Your cat would eat em.
You'd vacuum them up.
They'd fall down in the sofa.
They'd be all over the car floor.
You'd throw them away by mistake..."

In my opinion, it is exciting to learn that some of this technology has the potential to become mainstream.