
Dec 31, 2009

The Post-WIMP Explorers' Club: Update of the Updates, Morning of 12/31/09

What is the Post-WIMP Explorers' Club?
I came up with the name of this semi-fictional club as a way to organize my thoughts (and blog posts) regarding the development of a new metaphor for post-WIMP applications and technologies, related specifically to natural user interfaces, natural user interaction design, and off-the-desktop user experience.  


Update, morning of 12/31/09:
Josh Blake, author of the blog "Deconstructing the NUI", posted Metaphors and OCGM this morning. It fleshes out post-WIMP concepts, addressing metaphors and interfaces. The premise is that NUI metaphors will be less complex than GUI (WIMP) metaphors. My feeling is that on the surface this will hold true, especially for consumers/users and people creating lightweight applications and software widgets.


Underneath the surface, where designers' and developers' brains spend more time than users' and consumers', things might be more complex. Why? The technology to support the required wizardry is more complex. With convergence, the creation of new technologies, applications, communication systems, and even electronic entertainment is now dependent upon the work and thinking of people from a wider range of disciplines. Each discipline brings to the table a set of terms rooted in its own theory and research practices.


Update,  late afternoon, 12/30/09:
Richard Monson-Haefel's response to Ron George's "Part 2". The concept of OCGM might be growing on him now... OCGM: George's Razor: "If Ron George can explain how OCGM encompasses Affordances and Feedback than I'll be convinced that OCGM works for NUI. Otherwise, I think OCGM is a great start that would benefit from an added "A" and "F"." -Richard
  • OCGM relates to Occam's Razor. It is helpful to read a bit about it if you are interested in the post-WIMP conversations. (The link is to an article from "How Stuff Works", via Richard Monson-Haefel.)
UPDATE 12/30/09 -- This post is part of a discussion between several different bloggers, and was written before Ron George wrote his latest post, Welcome to the OCGM Generation! Part 2, which I recommend that you read now, or within the same time frame as this post. Since I'm not ready to write "Part 2" of this post, I tweaked what I had and added links to a handful of my previous posts that touch on this and related topics. The links can be found at the bottom of this post.




START HERE FOR THE "ORIGINAL" POST FROM  12/29 & 12/20/09:


Background
About a year ago I responded to a conversation between Johnathan Brill, Josh Blake, and Richard Monson-Haefel discussing "post-WIMP" conceptualization regarding natural user interfaces and interaction, otherwise known as NUI.  The focus of the discussion was on Johnathan's post, "New Multi-touch Interface Conventions". At the time, we were reading Dan Saffer's book, Designing Gestural Interfaces, and contemplating new ways that technology can support human interaction and activities in a more natural, enjoyable, and intuitive manner.  

A few days later, I shared some of the concepts from the discussion on a post on this blog, "Why "new" ways of interaction?".  The post includes video of Johnathan Brill discussing PATA, a post-WIMP analogue to assist with multi-touch/gesture based application development, which he describes as follows:
Places "Lighting, focus, and depth, simplified searching and effecting hyperlinked content."
Animation "Using animation to subtly demonstrate what applications do and how to use them is a better solution than using icons. Animations makes apps easier to learn."
Things "Back in the days of floppy disks, objects helped us organize our content. This limitation was forced by arcane technology, but it did have one huge advantage. We used our spatial memory to help us navigate content. Things will help us organize content and manipulate controllers across a growing variety of devices."

Auras "Auras will help us track what we are tracking and when an interaction has been successful."
(For reference, I've copied some of my responses to the first discussion near the end of this post.)


A year later....
What has changed? Everything post-WIMP has been covered like a blanket by the NUI-word. "NUI" now functions as a generic term for anything that is not exactly WIMP. There is a sense of urgency now to figure out how best to conceptualize post-WIMP interfaces and interactions. Newer, affordable technologies enable us to interact with friends and family while we are on the go. Netbooks, e-readers, smartphones, large touch-screen displays, interactive HDTV, and new devices with multi-modal I/O abound. Our grandparents are on Facebook and Twitter from their iPhones. Our world no longer requires us to be slaves to the WIMP mentality.


So what is the problem?
The technology has moved along so fast that application designers and developers have not had a chance to catch up. (The iPhone is an exception.)  The downturn in the economy has made it difficult for many to take the leap from traditional software or web development and gain new skill sets.  On top of it all, most of us over the age of 15 have been brainwashed from years of working within the constraints of WIMP. It doesn't matter if we are users, consumers, students, designers, or developers.


Even the folks least likely to have difficulty expanding into the post-WIMP world have had some difficulty. If you've had training in HCI (Human-Computer Interaction), you were inadvertently brainwashed with the best. The bulk of the theory and research you contemplated was launched at a time when WIMP was king, even as the Web expanded. Many of the principles held dear by traditional HCI folks have been shattered, and no one has come up with a "theory of everything" that covers all of the human actions and interactions supported or guided by new technologies.


The problem, in part, is that letting go of WIMP is hard to do, as illustrated by the following post from the Ars Technica website:  Light Touch:  A Design Firm Grapples with Microsoft Surface  (Matthew Braga, 6/29/09) "Ditching the mouse and keyboard means a whole lot more than just doing without two common peripherals.  As those who have worked with Microsoft Surface have found out, you have to jettison decades of GUI baggage and start with a whole new mindset...In actuality, few multi-touch gestures are really anything like what we experience in the physical world. There is no situation in which we pull on the corners of an image to increase its size, or swipe in a direction to reveal more content. So, in the context of real-world interaction, these types of gestures are far from natural...gestures should not only feel natural, but logical; the purpose that gestures like these serve, after all, is to replace GUI elements to the end of making interaction a more organic process."   (Be sure to read the comments.)

Now that the Surface is taking root in more places, and touch-screen all-in-one PCs and tablets are starting to multiply, more people are giving "NUI" some thought. Ron George, an interaction and product designer with experience working with Microsoft's Surface team, has contributed to the post-WIMP discussion and spent some time sharing ideas with Josh Blake, a .NET, SharePoint, and Microsoft Surface consultant for InfoStrat and author of the Deconstructing the NUI blog. The outcome of this discussion was Ron George's December 28th blog post, "OCGM (pronounced Occam['s Razor]) is the replacement for WIMP", and Josh Blake's post, "WIMP is to GUI as OCGM (Occam) is to NUI". (Be sure to read the comments for both of these posts!)



OCGM (as conceptualized by Ron George)


Objects "are the core of the experience. They can have a direct correlation with something physical, or they can just be objects in the interface."


Containers "will be the “grouping” of the objects. This can manifest itself in whatever the system sees fit to better organize or instruct the user on interactions. They do not have to be, nor should they be, windows. They can be any sort of method of presentation or relationship gathering as seen fit."


Gestures "I went into detail about the differences in Gestures and Manipulations in a previous post [check it out for a refresher]. Gestures are actions performed by the user that initiate a function after its completion and recognition by the system. This is an indirect action on the system because it needs to be completed before the system will react to it."


Manipulations "are the direct influences on an object or a container by the user. These are immediate and responsive. They are generally intuitive and mimic the physical world in some manner. The results are expected and should be non-destructive. These are easily performed and accidental activations should be expected and frequent."
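The distinction between gestures (indirect, recognized only after completion) and manipulations (direct, immediate, responsive) is the most concrete part of OCGM. Here is a minimal illustrative sketch of that distinction as two styles of touch handling. This is my own hypothetical code, not from Ron George or any real framework; all class and method names are invented for illustration.

```python
class Manipulation:
    """Direct, immediate influence on an object: the system responds to
    every touch update as it arrives (e.g., dragging a photo)."""
    def __init__(self, obj):
        self.obj = obj

    def on_touch_move(self, dx, dy):
        # Immediate and responsive: the object updates on every event.
        self.obj["x"] += dx
        self.obj["y"] += dy


class GestureRecognizer:
    """Indirect action: the stroke must be completed and recognized
    before the system reacts (e.g., a swipe that triggers a function)."""
    def __init__(self, on_swipe):
        self.points = []
        self.on_swipe = on_swipe

    def on_touch_move(self, x, y):
        self.points.append((x, y))  # nothing happens yet

    def on_touch_up(self):
        # Only now, after the gesture is complete, does the system act.
        if len(self.points) >= 2:
            dx = self.points[-1][0] - self.points[0][0]
            if abs(dx) > 50:  # arbitrary swipe threshold for the sketch
                self.on_swipe("right" if dx > 0 else "left")
        self.points = []
```

In use, the manipulation moves the object with every `on_touch_move`, while the recognizer stays silent until `on_touch_up` fires the swipe, which is exactly the direct-vs-indirect split Ron George describes.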

To illustrate a point regarding the validity of the OCGM analogy proposed by Ron George, Josh Blake shares the following video of a presentation from REMIX 2009, in which August de los Reyes, the Principal Director of User Experience for Surface Computing at Microsoft, briefly discusses the TOCA (Touch, Objects, Containers, and Actions) concept, proposed as a replacement for WIMP:

The video wouldn't embed, so go to the following link:


Predicting the Past: A Vision for Microsoft Surface
"Natural User Interface (NUI) is here. New systems of interaction require new approaches to design. Microsoft Surface stands at the forefront of this product space. This presentation looks at one of the richest sources for inventing the future: the past. By analyzing preceding inflection points in user interface, we can derive some patterns that point to the brave NUI world." 


The concepts outlined in the presentation are similar to Microsoft's Vision for 2019.


Richard Monson-Haefel added his thoughts on the OCGM discussion in his recent blog post, "What is NUI's WIMP?" Richard disagrees with the OCGM concept, as he feels it doesn't encompass some important interactions, such as speech/direct voice input. He'd probably agree that NUI is NOT WIMP 2.0.



Post-NUI, Activity Theory, and Off-the-Desktop Interaction Design:
As I was reading the recent posts and discussions regarding NUI/OCGM, I also contemplated some of what I've been reading over my holiday break, "Acting with Technology: Activity Theory and Interaction Design", written by Victor Kaptelinin and Bonnie A. Nardi. Victor Kaptelinin is the co-editor of "Beyond the Desktop Metaphor: Designing Integrated Digital Work Environments" (MIT Press, 2007), and has an interest in computer-supported cooperative work. Bonnie Nardi brings to the IT world her background in anthropology, and is the co-author of "Information Ecologies: Using Technology with Heart" (MIT Press, 1999). The authors know what they are talking about.


It is important to note that activity theory-based interaction design is viewed as "post-cognitivist", and is informed by some of what I studied in psychology, education, and social science years ago. Within the field of activity theory there are some important differences, which I'll save for a future post.


Below are some concepts taken from the book. I am still mulling them over through the prism of NUI, post-WIMP, PATA, TOCA, OCGM, etc. That's why there will be a "Part II", with specific examples.


"Means and ends, the extent to which the technology facilitates and constrains attaining user's goals and the impact of the technology on provoking or resolving conflicts between different goals


Social and physical aspects of the environment - integration of target technology with requirements, tools, resources, and social rules of the environment
Learning, cognition, and articulation,  internal vs external components of activity and support of their mutual transformations with target technology


Development -Developmental transformation of the above components as a whole" 
"Taken together, these sections cover various aspects of the way the target technology supports, or is intended to support, human actions".  (page 270)


I especially like the activity checklist included in the appendix of the book, as well as the concept of tool mediation. "The Activity Checklist is intended to be used at early phases of system design or for evaluating existing systems.  Accordingly, there are two slightly different versions of the Checklist, the "evaluation version" and the "design version".  Both versions are implemented as organized sets of items covering the contextual factors that can potentially influence the use of computer technology in real-life settings.  It is assumed that the Checklist can help to identify the most important issues, for instance, potential trouble spots that designers can address". (page 269)


"The Checklist covers a large space.  It is intended to be used first by examining the whole space for areas of interest, then focusing on the identified areas of interest in as much depth as possible...there is a heavy emphasis on the principle of tool mediation"  (page 270).


Other Thoughts
What is missing from this picture is a Universal Design component, something that I think holds up across time and technologies. Following the principles of Universal Design doesn't mean dumbing down or relying on simplicity. It is a multi-faceted approach that relies on constructing flexibility in use, one of the key concepts of Universal Design. I'd like to see this concept embedded in the post-WIMP conceptualization somehow.


Because of my background in education/psychology/special education, I try to follow the principles of Universal Design for Learning (UDL) when I work on technology projects. I've spent some time thinking about how the principles of UDL could be realized through new interaction/interface systems. Although this approach focuses on the educational technology domain, it is important to consider, given that a good percentage of our population - potential users, clients, consumers - has a temporary or permanent disability of one kind or another.


Components of Universal Design for Learning:
Multiple Means of Representation
  • Provide options for perception
  • Provide options for language and symbols
  • Provide options for comprehension
Multiple Means of Action and Expression
  • Provide options for physical action
  • Provide options for expressive skills and fluency
  • Provide options for executive functions
Multiple Means of Engagement
  • Provide options for recruiting interest
  • Provide options for sustaining effort and persistence
  • Provide options for self-regulation
-Adapted from the UDL Guidelines/Educator Checklist, which breaks down the components into more specific details.


Note:  The concept of Universal Design for Learning shares historical roots with some of the work behind Activity Theory and Interaction Design. Obviously, there is still much to contemplate regarding OCGM and other permutations of post-WIMP concepts!   


Here are my comments to the discussion on Johnathan Brill's blog from January 2009:
Thoughts: I'm seeing a trend to "optimize" WIMP-type applications so they can be manipulated by gesture and touch. People who program kiosks, ATMs, and POS touch screens are examples of what I'm talking about. Touch and hold, two-fingered touch, and double-tap are just a slight transformation of the WIMP world-view, and in my opinion, are still WIMP (wimpy!). The mouse-interaction "pretenders" are fine for using legacy productivity applications, OK in the short run.

For example, I have an HP TouchSmart, but I don't use the touch screen as often as I'd hoped. Try using Visual Studio to code something on a touch screen. There is so much more that can be done! I know from the touch-screen prototypes/demos I've worked on in various classes that applications that support collaboration and decision-making are important, and not just for work.

What do people DO, really? First of all, we are social beings, most of us. Think of what we share and discuss with others, and think about what sort of interactions on a display might best correspond with this interaction. Here are some of the things I've been DOING recently that involved some sort of technology and communication/collaboration with others:



---Travel planning - I recently went on a cruise, and with various family members, selected activities I wanted to do on the ship as well as planned my shore excursions (a complicated process).


---Picture sharing - I came back from the cruise with lots of pictures that I uploaded to Flickr. Related to this process: picture annotating, tagging, choosing/comparing, and editing. It would be SO cool if I could use two sliders to enhance my pictures just so!


---Talking on the phone and responding to e-mail with friends and family members about the pictures, and what they wanted to see on Flickr - "you know, the ones of the dogs in Jamaica"... of which I had about 68!


---Financial planning with my husband. (I took info-viz last semester, so I know the possibilities are there.)


---Using the touch screen to check in at my eye doctor's office: This was a user-unfriendly experience. Such a nice little screen, yet I was provided with a WIMPy PowerPoint-like interface which was confusing to use - and time consuming!


---Shopping at the new Super Wal-Mart: I asked the greeter if there was a map, and he said, "Food is over there, and the rest of the stuff is that way." Flat-panel displays were all over the store, but of course, they weren't interactive. I had no idea where anything was, and the few items I was looking at had no bar-code tags. There wasn't a clerk in sight. Wal-Mart TV rolled on and on via the display above my head. If I could only harness the display and have the talking head answer my questions! I gave up on my shopping trip when I was in the facial lotion/potion section. Too many choices, and too much fine print to read.

---Shopping at the mall with my young-adult daughters... I'll have to hold my thoughts on that one for now!

Some suggestions:
I think the artist/designers, (even dancers,) who are interested in multi-touch and gesture interaction have some interesting things to consider. (I linked to some of my previous posts.)


Again:
I am still mulling things over through the prism of NUI, post-WIMP, PATA, TOCA, OCGM, etc. So that is why there will be a "Part II". With specific examples!


RELATED
Multimedia, Multi-touch, Gesture, and Interaction Resources


My thoughts:
2007 Letter to the Editor, Pervasive Computing
Useful Usability Studies (pdf)
2007 Blog Post
Usability/Interaction Hall of Shame (In a Hospital)
2008 Blog Posts

Emerging Interactive Technologies, Emerging Interactions, and Emerging Integrated Form Factors
Interactive Touch-Screen Technology, Participatory Design, and "Getting It"
An Example of Convergence: Interactive TV: uxTV 2008
2009 Blog Posts

Why "new" ways of interaction?
Microsoft: Are You Listening?  Cool Cat Teacher (Vicki Davis) Tries out Microsoft's Multi-touch Surface Table
Haptic/Tactile Interface:  Dynamically Changeable Physical Buttons
The Convergence of TV, the Internet, and Interactivity:  Update
UX of ITV:  The User Experience and Interactive TV (or Let's Stamp Out Bad Remote Controls)
Digital Convergence and Interactive Television;  Boxee and Digital Convergence 

ElderGadget Blog: Useful Tech and Tools


Other People's Thoughts
Ron George's blog, OCGM (pronounced Occam['s Razor]) is the replacement for WIMP  12/28/09
Ron George: Welcome to the OCGM Generation! Part 2 
Stephen, Microsoft Kitchen: OCGM, A New Windows User Experience
Richard Monson-Haefel's blog, Multi-touch and NUI:  What is NUI's WIMP?
Richard Monson-Haefel:  OCGM: George's Razor
Josh Blake's blog,  Deconstructing the NUI: WIMP is to GUI as OCGM (Occam) is to NUI
Bill Buxton: Gesture Based Interaction (pdf) (Updated 5/2009)
Bill Buxton: "Surface and Tangible Computing, and the "Small" Matter of People and Design" (pdf) - ISSCC 2008
Dan Saffer, Designing for Gestural Interfaces: Touchscreens and Interactive Devices
Dan Saffer, Designing for Interaction 
Mark Weiser, The Computer for the 21st Century, Scientific American, September 1991
Touch User Interface:  Readings in Touch Screen, Multi-Touch, and Touch User Interface
Jacob O. Wobbrock, Meredith Ringel Morris, and Andrew D. Wilson, User-Defined Gestures for Surface Computing, CHI 2009, April 4-9, 2009, Boston, Massachusetts, USA.

Nov 2, 2009

Adobe's Resources: Contextual Solutions, Rich internet applications across devices, and Open Screen.

I thought I'd share a few interesting resources from Adobe.  The company is focused on the future, and I plan to watch what unfolds.  Adobe believes that the next generation of the company's products will move us forward to a world of seamless access to content, across screens of all sizes, everywhere we go.

I'm hoping that this will not lead to an environment where we have difficulty escaping "skip intro" ads everywhere we turn.  Hopefully it won't be too bad!

I found a few gems of wisdom in the Adobe white paper,   Designing Contextual Solutions and Applications.  The paper provides a conceptual framework that I think would be helpful to web designers/developers who are interested in off-the-desktop application development.


Here is an excerpt from the introduction of the white paper:
"With its platforms and products that are inherently multichannel and multidevice, Adobe envisions a new class of “contextual solutions” taking hold in the marketplace. These solutions are broader in scope than a device, a campaign, or a single service; rather, they encapsulate the various contexts in which the end user exists, interacts, thinks, consumes, and purchases. And when content owners understand the different types of users and how they differ in their contexts, they are in a much better position to design experiences that are compelling and sticky. At the heart of every contextual solution is at least one or more contextual applications—an emerging type of application that spans multiple devices and uses system, user, and network contexts to provide an optimized user experience and deliver improved business results.

This white paper provides an overview of the types of contexts that compose a contextual solution and application. With examples given along the way, it culminates with a framework for considering all appropriate contexts while designing and developing a new contextual solution or an application that is part of an existing solution."


Another Adobe white paper is quite useful - Rich Internet Applications across devices: The emergence of contextual applications. It provides examples of RIAs across devices, such as Amazon Video On Demand, Demandbase, Atlantic Records: Fanbase, Finetune, and Times Reader 2.0.

Additional Adobe Resources
Contextual Applications
Open Screen Project:  "Singular experience, multiple devices"
Adobe Flash Platform

Oct 28, 2009

Interactive Multimedia Across Platforms and Screens: Adobe's Open Screen Project; MEX Mobile User Experience Manifesto.... (Please don't annoy the user!)

I'm not sure what I think about ubiquitous Flash 10.1 and Adobe's Open Screen Project. I like the idea of anything that is seamlessly cross-platform, but I shudder to think that this might let out a wave of unwanted or annoying "push" advertising on on-the-go screens of all sizes. I'm assuming web developers, along with TV ad producers, will be jumping on this train without fully thinking about how their applications and designs will play out in the off-the-desktop, digital-out-of-home world.

I decided to take a look, drill down through the hype, and share a few links related to this topic.

Adobe Pushes for a Flash-ier Mobile Web
Rob Pegoraro, Faster Forward, Washington Post (10/5/09)
"Are you anxious to bring Flash to the mobile Web, even if it means being subjected to some over-eager Web coder's song-and-dance routine? Or would you rather do without it on the go, even if that means having to switch to a "real" computer to use some Web sites' features?"

Hopefully the "over-eager web coders" will heed the MEX Manifesto:

MEX:  Mobile User Experience 2009 Manifesto (pdf)
"The Manifesto sets out our beliefs as to how user-centred design principles can enhance the experience of multi-platform digital services."

A framework for user journeys in a multi-platform world: Marek Pawlowski, founder of MEX

MEX: User experience journeys in a multi-platform environment from Marek Pawlowski on Vimeo.

"User experiences are evolving into increasingly complex sets of interactions between multiple devices.  In this video presentation, Marek Pawlowski of the MEX Mobile User Experience strategy forum, shows how a framework can be used to map user journeys through the multi-platform environment."

"Unencumbered by wires, information is flowing into every corner of our world at an ever increasing rate and through an ever increasing range of digital platforms. The single greatest challenge facing digital industries is understanding how this explosion of data will be woven into the fabric of consumers' lives." -- Marek Pawlowski, founder of MEX.

MEX Blog 

OPEN SCREEN VIDEO

Open Screen Project from Vyshak V on Vimeo.
"The Open Screen Project is an industry-wide initiative, led by Adobe and backed by other industry leaders who all share one clear vision: Enable consumers to engage with rich Internet experiences seamlessly across any device, anywhere. Partners in the Open Screen Project are working together to provide a consistent runtime environment for open web browsing and standalone applications — taking advantage of Adobe® Flash® Player and, in the future, Adobe® AIR®. This consistent runtime environment will remove barriers to publishing content and applications across desktops, mobile phones, televisions, and other consumer electronics." Learn more

Reinventing Storytelling in the Digital Age Across Platforms, Across Screens

NAB 2009 presentation by Shantanu Narayen of Adobe and A.D. Albers, of Disney Interactive Media Group, from NAB 2009
Adobe and NVIDIA Deliver Rich Web Experiences on Netbooks and Mobile Devices
Reuters (10/5/09)
"At Adobe MAX, Adobe's worldwide developer conference, Adobe Systems Incorporated and NVIDIA Corporation..announced that both companies are bringing uncompromised browsing of rich Web content to netbooks, smarphones and smartbooks built with NVIDIA GPUs. The companies have been working closely together as part of the Open Screen Project to optimize and dramatically improve performance of Flash Player 10.1 by taking advantage of GPU video and graphics acceleration on a wide range of mobile Internet devices. NVIDIA customers embracing Flash Player 10.1 for their new devices include HP, Lenovo, Samsung, Acer, Asus and more..."


RIM Joins Open Screen Project  Reuters (10/4/09 )


Honey I Shrunk the Flash Player, Simon Bisson and Mary Branscombe, ZDNet (10/12/09)
Teaming up with Adobe and the Open Screen Project -Google Blog   (10/5/09)


Paramount Digital Entertainment Launches Interactive Thriller on MySpace  Tracy Sedlow, InteractiveTV Today (10/28/09)
"The company says that it can deliver the show's multiple interactive elements to viewers across devices using Adobe Flash Player and Adobe AIR, "because of efforts by the Open Screen Project, an industry-wide initiative led by Adobe and supported by PDE and close to 50 other industry leaders, to enable people to engage with rich Internet experiences across any device, anywhere." -

Oct 6, 2009

I want to play with mice! Microsoft's Multiple Multi-touch Mice Preview

Hot off the press from Microsoft's Applied Sciences Group at UIST 2009!
Mouse 2.0: Multi-touch Meets the Mouse

"In this paper we present novel input devices that combine the standard capabilities of a computer mouse with multi-touch sensing. Our goal is to enrich traditional pointer-based desktop interactions with touch and gestures. To chart the design space, we present five different multi-touch mouse implementations. Each explores a different touch sensing strategy, which leads to differing form-factors and hence interactive possibilities. In addition to the detailed description of hardware and software implementa-tions of our prototypes, we discuss the relative strengths, limitations and affordances of these novel input devices as informed by the results of a preliminary user study."

The following video is courtesy of Microsoft's Applied Sciences Group:



Music: "Motion Blur", by Björn Hartmann. (Björn is an HCI researcher and electronic musician.)

The researchers on the team: 
Nicolas Villar, Shahram Izadi, Dan Rosenfeld, Hrvoje Benko, John Helmes, Jonathan Westhues, Steve Hodges, Eyal Ofek, Alex Butler, Xiang Cao and Billy Chen


Here is a video preview/demo of the multi-touch mice prototypes from Microsoft's Applied Sciences Group Lab, courtesy of CrunchGear:





(Image: "meese" - CrunchGear)

Sep 19, 2009

The World Is My Interface

THE WORLD IS MY INTERFACE is the new title of my Technology-Supported Human-World Interaction blog. The name was just too long!

The first post written under the new title has a variety of links about ubiquitous computing, off-the-desktop interaction design, use of technology in public spaces, and so forth.  There are a few links to scholarly research and related projects, too.

Sep 3, 2009

The Future of Interactive Technology: Links to VentureBeat's series of articles

Put your finger on it: The future of interactive technology
Matt Marshall, 8/12/09

"Touch" technology for desktop computing finally taking off
Al Monro (CEO, NextWindow) 8/28/09


Speech, touchscreen--been there, done that. What's the user interface of tomorrow?
9/2/09 Mary Czerwinski (Microsoft)
How phones emerged as main computing devices, and why user interface will improve

Andrew Hsu (Synaptics) 8/24/09

SOMEWHAT RELATED

Playing Tetris could be good for your brain, study says



Mind Research Network
BRAIN IMAGING SHOWS PLAYING TETRIS LEADS TO BOTH BRAIN EFFICIENCY AND THICKER CORTEX

"Over a three-month period, adolescent girls practiced Tetris, a computer game requiring a combination of cognitive skills. The girls who practiced showed greater brain efficiency, consistent with earlier studies. Compared to controls, the girls that practiced also had a thicker cortex, but not in the same brain areas where efficiency occurred."

"We showed that practice on a challenging visuospatial task has an impact on the structure of the cortex, which is in keeping with a growing body of scientific evidence showing that the brain can change with stimulation and is in striking contrast with the pervasive and only-recently outmoded belief that our brain’s structure is fixed,” said Dr. Sherif Karama, a co-investigator at the Montreal Neurological Institute. This study, published in the open-access journal BMC Research Notes, is one of the first to investigate the effects of practice in the brain using two imaging techniques. The girls completed both structural and functional MRI scans before and after the three-month practice period, as did girls in the control group who did not play Tetris. A structural MRI was used to assess cortical thickness, and a functional MRI was used to assess efficient activity."

The World is My Web Browser: Interactive Technology in Public Spaces

Off-the-desktop interactive technologies are spreading quickly, across many platforms. It seems that within the next 3-5 years we will have instant access to a social and dynamic web, with information and entertainment at our fingertips, anywhere we go.


The "world-as-web-browser" concept is one I've touched upon in the past.  I'd like to focus on this topic in future posts.

I came across some of the information in this post on the Future of Media Blog, maintained by the folks at the Interpublic Emerging Media Lab, based in California. This blog is a must-read!


The Saatchi & Saatchi ad agency in the UK created interactive games that provide people waiting in bus shelters the opportunity to pass the time by bashing digital falling eggs.


A more utilitarian approach to interactive touch screens is the EyeStop bus stop, a concept created by researchers at MIT's SENSEable City Lab, along with the city of Florence and the city's transit service:


http://senseable.mit.edu/eyestop/01.jpg
MIT



MIT researchers unveil the EyeStop: "Bus stop of the future" can sense its surroundings (pdf)



SOMEWHAT RELATED

IPG Future of Media Blog

Jul 16, 2009

Convergence: Mobile content, Interactive Displays, Digital Out of Home, & Danoo

Danoo: "Location scouting and scalable web technologies combined to create a targeted DOOH experience".

If you are tech-savvy and work in the digital signage/interactive display/AV end of things, you probably know what this means. If not, you should. Things are pairing, merging, and converging in the world of "out and about" technology.

Technology, especially digital signage, is ubiquitous, and has targeted you and your neighborhood.


So what's the fuss? Danoo, a company from China, is ramping things up in the Digital Signage/DOOH (Digital Out of Home) arena.
Fast.

Danoo has rapidly partnered with existing companies, such as National CineMedia, Blue Bite, Electronic Arts, and the HISTORY channel, to create and push off-the-desktop and off-the-TV content and interactive user experiences. Bill Gerba of Wirespring, Manolo Almagro, CTO of Show + Tell in NYC (think Times Square digital signage experiences), and others have recently written about this phenomenon, right after Danoo released the following press announcement:

Danoo Inc Announces a Breakthrough in Mobile Content Delivery (7/11/09, Business Wire)
"Danoo partnered with mobile technology company Blue Bite to enable the campaigns, and is actively rolling out its mobile interactivity capability to all of its venues. The first 200 Danoo locations will be live in Los Angeles and New York by July 1st. At full deployment, Danoo will give advertisers the ability to get their content into the hands of more than 200,000 consumers in an average two-week campaign. In addition to content downloads, Danoo offers multiple ways to pair its screens with mobile interactivity to maximize campaign effectiveness, such as SMS call-to-actions, social gaming and mobile application promotion."

"Visitors to Danoo locations viewed video content on Danoo’s digital screens accompanied by an on-screen prompt to download exclusive content such as sneak peeks and ringtones from their Bluetooth or Wi-Fi-enabled devices via the Danoo network."



Here are a few related articles:
How Significant is the Danoo-IdeaCast-National CineMedia Deal?
(7/14/09, Bill Gerba, Wirespring)
A Watershed Moment for DOOH Media
(7/12/09, Bill Collins, Daily DOOH)

Pictures from Danoo's Website:

Danoo's IdeaCast - "Captive TV".
http://www.danoo.com/images/danooideacast.jpg
http://www.danoo.com/images/danoopic2.jpg

I've probably encountered Danoo's technologies numerous times, judging from my archive of pictures of displays and related technologies that have crossed my path. The TV screen on the treadmill looks just like the one I saw on a treadmill in the fitness center of a Princess cruise ship last year. The picture in the middle looks like it was taken in an airport. The picture on the right is of a system in a coffeehouse. According to information on the Danoo website, the system is interactive and includes "content downloads, social gaming, mobile couponing and more via SMS, Bluetooth and Wi-Fi".


Blue Bite's website is worth looking at. It demonstrates their concept very well through animation:

Blue Bite: "Take the Ad With You"



Things are ramping up in my region, too.
T1 Tapas, a restaurant north of Charlotte, N.C., in Birkdale Village in Huntersville, has majority owners with a technology background. Mike Feldman and Jim Morris started up Digital Optics Corporation, which focused on optics for computers and imaging; after they sold that company, they teamed up with Denise Feldman to establish T1 Visions.

T1 Connection Booth with Multi-User Touch Screen, HD TV, Sound System, Computer, & more:

"T1 Connection Booth seating gives you access to music, photos, and videos through built-in touchscreen tabletops, brilliant monitors, speakers, and computers." -Picture and text taken from the T1 Tapas website

The restaurant serves as a test bed for T1 Visions to try out software and hardware designed to enhance the digitally connected dining experience. The restaurant was featured in May of this year on CNN:

Here is the link to the video: "High Tech Tapas"

Interactive Multimedia Technology Themes - Update on Travel Technologies

Over the next month or so I will be re-organizing this blog. I'll be analyzing the various themes that have emerged since I started this on 4/11/06, over three years ago, as part of an assignment for a class about distance education and on-line communication tools.

My first topic was "Games, Simulations, and Virtual Worlds". Although I continue to cover those themes, I now mostly focus on off-the-desktop interactive, collaborative, and emerging technologies that support interaction and activities in public spaces.

One theme that interests me is technology that supports travel experiences. Since I've had the opportunity to travel a great deal (before the economy started to go downhill), I've had a chance to explore this arena as a participant-observer*, and have documented my findings through photographs and video.

It is a joke in my family that if I disappear from the tribe, I can usually be found nearby, poking at an interactive touch screen, photographing something related to technology, sneaking in a few shots of other people interacting with technology, or even talking to strangers as they use it. (I usually ask permission to take pictures of people who are in my viewfinder, but sometimes they just happen to be in my line of sight.)

It is amazing what an earful you can get about technology as a fellow traveller!

I came across the work of Nanonation when I was on a Royal Caribbean cruise ship, and was a little disappointed with the touch-screen content and interaction around the ship. From what I can tell from the Nanonation website, the applications have been improved somewhat, especially the way-finding application on the Freedom of the Seas:

Wayfinding Application, Freedom of the Seas

Nanonation was also involved in the development of a "Discovery Wall" at Umpqua Bank. The system incorporates tangible, RFID-enabled icons that sit on a shelf near the Discovery Wall; placing an icon triggers an interactive Flash presentation on the screen. The icons represent various bank products.

Discovery Wall, Umpqua Bank
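The event flow behind a tangible-icon system like this is easy to sketch. Here is a minimal, hypothetical Python sketch of the idea - note that the tag IDs, file names, and the `on_tag_read` callback are all invented for illustration; a real RFID reader would deliver tag events through its own serial or USB API:

```python
# Hypothetical sketch: mapping RFID tag reads to interactive content.
# Tag IDs and content names are invented; a real reader delivers tag
# events via its own driver API rather than a simple function call.

PRESENTATIONS = {
    "04:A1:2B": "checking_accounts.swf",   # tangible icon for checking products
    "04:C7:9E": "home_loans.swf",          # tangible icon for loan products
    "04:F0:11": "small_business.swf",      # tangible icon for business banking
}

def on_tag_read(tag_id, play=print):
    """Look up the presentation for a tag and launch it (default: just print)."""
    content = PRESENTATIONS.get(tag_id)
    if content is None:
        return None        # unknown or stray read: ignore it
    play(f"Launching {content}")
    return content
```

The point is simply that the "wall" is an event loop: each icon is just a stable identifier, and all the richness lives in the content it triggers.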

Back to the topic of cruise ship/travel technology:
When I was on the Ruby Princess cruise in December of 2008, I was impressed with the "Movies Under the Stars" set-up. At night, the sunning decks are transformed into outdoor movie-viewing spots, where you can lounge around, basking under the stars as you watch the gigantic silver screen, backed by an excellent sound system.

During the day, the system displays the Wii games that people are playing, which gives the non-playing sunbathers additional entertainment.

I recently learned that the Movies Under the Stars system was developed and installed by FUNA, an international company that focuses on marine-related industries, as well as land-based industries.

Take a look at my "Wii-OOH" Flickr set slideshow to see the Wii in action on the large screen of the Ruby Princess, and on smaller screens in the food court of the Concord Mills (NC) mall:

(Note, the mall pictures were taken with my cell phone.)

I want to go back!

HCI Note
*I was trained in the use of participant observation long ago, when I was studying social science
and psychology at the University of Michigan. It is a method that was developed early on by anthropologists
and sociologists, and adopted later by researchers in other fields. Some human-computer interaction
researchers use this method, and related techniques, such as ethnography, in their work.

May 24, 2008

Dance.Draw Project: Exquisite Interaction - A Collaboration between the Software and Information Systems (HCI) and Dance Departments at UNC-Charlotte


DANCE.DRAW: EXQUISITE INTERACTION
(Updated)

"The movement of the visualizations are artifacts in real-time of the movements of the dancers. They draw while they dance, they dance together and they draw together. Every performance generates a new visual imprint." -DanceDraw website


Interactive multimedia technology, blended with the arts!

Dr. Celine Latulipe, from UNC-Charlotte's Software and Information Systems Department, Professor Sybil Huskey, from the Dance Department, dance students, and others collaborated to create an amazing performance that I had the opportunity to see during the Visualization in the World Symposium in April 2008.

If you look closely, you will see that each dancer holds two wireless mice, one in each hand. The mice trigger the visualization that is projected in the background. Dr. Latulipe has focused some of her research on two-handed computer interaction, and it is interesting to see how that work has been applied to this beautiful "off-the-desktop" application.
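The core idea - many input devices contributing simultaneous trails to one shared canvas - can be sketched in a few lines. This is a toy Python sketch, not the project's actual code; the device names and event format are invented for illustration:

```python
from collections import defaultdict

class SharedCanvas:
    """Toy sketch: accumulate per-device trails from pointer events.

    Each dancer contributes two event streams (one mouse per hand),
    so every performance generates a different set of trails.
    """
    def __init__(self):
        self.trails = defaultdict(list)   # device id -> list of (x, y) points

    def on_move(self, device_id, x, y):
        self.trails[device_id].append((x, y))

    def trail(self, device_id):
        return list(self.trails[device_id])

canvas = SharedCanvas()
# One dancer, two mice: each hand traces its own path.
canvas.on_move("dancer1-left", 10, 20)
canvas.on_move("dancer1-right", 90, 20)
canvas.on_move("dancer1-left", 12, 25)
```

A real implementation would render the trails in real time, but the data model - independent streams keyed by device, merged on one display - is the heart of it.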

Dance.Draw is a work in progress- visit the following links for more information:

Website (Updated)
Movie
Technical Info
Dr. Kosara's Eager Eyes post about Dance.Draw

Note:
Dr. Latulipe was my HCI professor; Dr. Kosara was my Visualization/Visual Communication professor.



Nov 17, 2007

About: Programming for interactive multimedia applications - WPF, Silverlight, EduSim, NeuroVR...

Part I Musings: Learning about application development and programming at mid-life:

I thought I'd write on a more personal level this time.

I'm a school psychologist, and in 2003, my motivation for taking computer classes at mid-life stemmed from my desire to create engaging interactive multimedia games for learning: games that could be played on hand-held devices as well as on the interactive whiteboards that I noticed were inching into my schools.


It was difficult for me to figure out how to get from Point A to Point B.

I shouldn't have been shocked to learn that most introductory programming classes provide instruction, along with endless lab assignments, geared toward people who want to make business forms and manipulate business-related data, build e-commerce websites, or create relational databases for... banks!

I can now make mortgage-calculator forms that adjust for various scenarios and provide cute error messages in beginning Visual Basic .NET, C#, and Java. I can create a database that will let users look up part numbers for all sorts of widgets, in all sorts of combinations, and ensure that client data can be easily accessed in a nice-looking form.

Why should I learn all of the old stuff when there are so many new avenues to explore?

Over the past few years, I've been fortunate to take a variety of classes that were not readily available just 8-10 years ago:

Computer/Internet Multimedia. Computer Music Technology. Game Design/Development. AI for Games. Ubiquitous Computing. Web Development Tools. Virtual Reality for Education and Training. These courses have motivated me to learn more about programming. The traditional programming courses had the opposite effect.


PART II
Keeping up

I recently attended a day-long code camp at Central Piedmont Community College to learn more about Windows Presentation Foundation (WPF) and Silverlight. WPF and Silverlight provide the foundation for applications that can run on Microsoft's Surface, and they give designers and developers more efficient ways of developing visualization and interactive multimedia applications.

The architecture behind WPF and Silverlight seems to align more with the way people think and envision, which, in my opinion, gives hope to those of us who have toyed with the idea of programming but were frightened off the first time we opened a traditional programming textbook.
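What makes this style feel natural is that the interface is *described* rather than constructed step by step, the way a XAML tree describes a window. Here is a toy illustration of that declarative idea in Python (this is not WPF or XAML; the element names and the `render` walker are invented for illustration):

```python
# Toy illustration of declarative UI (not real WPF/XAML):
# the interface is described as data, and generic code interprets it,
# much as the XAML loader turns a markup tree into live controls.

ui = ("StackPanel", [
    ("TextBlock", {"text": "Hello"}),
    ("Button",    {"content": "Click Me"}),
])

def render(node, depth=0):
    """Walk the description and emit one indented line per element."""
    kind, body = node
    if isinstance(body, list):                 # container element: recurse
        lines = ["  " * depth + kind]
        for child in body:
            lines.extend(render(child, depth + 1))
        return lines
    props = ", ".join(f"{k}={v!r}" for k, v in body.items())
    return ["  " * depth + f"{kind}({props})"]
```

The appeal for non-traditional programmers is that the `ui` value reads like a sketch of the screen, while the fiddly imperative work lives in one generic interpreter.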


Part III Visualization and Interactive Multimedia

For more information about WPF, Silverlight, design, etc., read the recent blog post by Sam Batterman, a Microsoft evangelist: Some thoughts about WPF and Data Visualization

Here are some pictures and text that I lifted from Sam's blog that will give you a picture of what I'm talking about:


"Here's an application that one of our partners built in a few weeks - all WPF and actually, not complicated code...getting that heart rendered was probably less than 100 lines of code. This app is used in a hospital for documenting heart surgery procedures. You can draw and annotate the 3D surface, rotate the heart, etc."


Angiographer


Can you imagine how kids would love to manipulate something like this in a science class?! This would be great on an interactive whiteboard or display.

This approach focuses on user experience much more than applications of the past did. For more information about designing for user experience, take a look at the link to Bill Buxton's book, "Sketching User Experiences."


Bill Buxton's website, "Multi-Touch Systems that I Have Known and Loved", is a good resource for those of you who'd like to get a better picture of multi-touch systems and interaction.

(I'll write more about WPF and Silverlight after I get my laptop repaired and have a chance to experiment some more.)

Croquet and EduSim


Right now, I've been experimenting with Croquet, which uses Squeak, and EduSim, which is powered by Croquet, to put together some learning activities for students. The best part is that Croquet is open-source, and all of the EduSim applications are free.

Here are some pictures that link to short video clips from the Greenbush EduSim website:







NeuroVR


Another application that I'm working with is NeuroVR.

NeuroVR is a free virtual environment that was designed for use in clinical settings. If you don't have access to virtual-reality hardware, you can still use NeuroVR on a desktop or large-screen display. Available 3D environments include an office, a supermarket, a park, a classroom, a poolside setting, and a home.

Andrea Gaggioli, Ph.D., is the Chief Technical Officer of this project. He's also behind the Positive Technology blog, which is a great resource. Andrea is a researcher working at the intersections of psychology, neuroscience, and emerging technologies.

NeuroVR allows the clinician (or educator) to easily insert pictures, objects, and videoclips into each virtual world. Doors open and close, and you can move items around in the environment. For example, fruit set on a table can be moved over to a counter.
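The scene-editing workflow described above boils down to placing named objects at locations and moving them around. Here is a minimal, hypothetical Python sketch of that state model - the `VirtualRoom` class and its methods are invented for illustration and are not NeuroVR's actual API:

```python
class VirtualRoom:
    """Toy sketch of editable scene state: place and move named objects."""
    def __init__(self, name):
        self.name = name
        self.objects = {}          # object name -> location label

    def place(self, obj, location):
        """Insert an object (picture, prop, video clip) at a location."""
        self.objects[obj] = location

    def move(self, obj, new_location):
        """Relocate an object that is already in the scene."""
        if obj not in self.objects:
            raise KeyError(f"{obj!r} is not in the {self.name}")
        self.objects[obj] = new_location

room = VirtualRoom("kitchen")
room.place("fruit bowl", "table")
room.move("fruit bowl", "counter")   # the fruit-to-counter example from the text
```

For a clinician or educator, the win is that editing a session is just manipulating this kind of simple registry, with the 3D rendering handled by the environment.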

In my opinion, NeuroVR has potential not only in the area of rehabilitation and therapy, but in special education as well, particularly for students who have multiple special needs, including severe autism. A variety of "how-to" videos are posted on DaevornLi's YouTube channel.


Here are some video clips to give you a better picture of the application: