Showing posts with label HCI. Show all posts

Oct 22, 2010

Quick Link: 3M Invests in Perceptive Pixel, Jeff Han's Multitouch Tech Company

3M Invests in Perceptive Pixel


"3M, through its 3M New Ventures business, has invested in Perceptive Pixel Inc., a developer of advanced multi-touch solutions based in New York City. Terms of the transaction were not disclosed.Founded by multi-touch pioneer Jeff Han in 2006, Perceptive Pixel is dedicated to the research, development and production of multi-touch interfaces for the knowledge worker. The company's hardware and software products enable users to manipulate complex datasets through a new class of intuitive, powerful and visually rich interface techniques. The combination of its technologies with those of 3M will create incredible new opportunities for both companies."

"To see Perceptive Pixel multi-touch solutions in action on 3M Projected Capacitive Technology, see the video at http://www.3m.com/touchPPI. For more information about 3M MicroTouch products, visit www.3M.com/touch. For an overview of popular touch technologies and terminology, visit www.touchtopics.com."

Catching up with multitouch pioneer Jeff Han, Ina Fried, CNET, 10/22/10

Jeff Han's 2006 TED Talk



Jeff Han, 2007



Thanks to Seth Sandler for the link!

Oct 18, 2010

Words of Wisdom (and more) from Harry Brignull: UX Roots in Psychology, Design, Info Architecture...and so much more!

Harry Brignull is a User Experience Consultant at Madgex in Brighton, England. According to his "about" page, his work involves "building experiences by blending User Research, Interaction Design, and process consultancy." Harry's 90 Percent of Everything blog is a wellspring of information and inspiration.


Back Story
I came across Harry's work in 2004 or 2005, when I was taking a VR class (Virtual Reality for Education and Training) and working on an assignment about large-screen displays. At the time, Harry was a Ph.D. researcher in the Interact Lab at the University of Sussex, working on the Dynamo project in collaboration with researchers from the Mixed Reality Lab at the University of Nottingham.

I revisited this work again in early 2007 when I was studying HCI and Ubiquitous Computing, and researching information about collaborative interaction on large displays in public spaces.  The following research article inspired me at the time, and looking back, I consider the work of this team to be seminal, and worth revisiting once again.

Izadi, S., Brignull, H., Rodden, T., Rogers, Y., Underwood, M. (UIST'03)
Dynamo: A public interactive surface supporting the cooperative sharing and exchange of media (pdf)  



(The picture was taken from the Dynamo project's website, and shared on my 2007 blog post, Revisiting promising projects, Dynamo, an application for sharing information on large interactive displays in public spaces.)


Links to a few of Harry's useful blog posts:

UX as Applied Psychology:
Clear Reporting & Critical Thinking:  Why User Experience Needs to Remember its Roots in Psychology (10/4/10)

"There was a time, back in the early 1990s, when almost everyone involved with UX research had a background in Psychology.  Back in those days, the term "User Experience" didn't really exist, and the nearest discipline was Human-Computer Interaction (HCI)..."

Comment:
As a school psychologist, I'm well-steeped in the process of research, observation, and data collection. I also know that the fundamentals of applied psychology are a very important ingredient in UX work. Don Norman, one of the "grandfathers" of UX/HCI, was a cognitive psychologist. He was the co-author of one of my psychology textbooks when I was a university student the first time around. Coincidentally, Norman's book, The Design of Everyday Things, was required reading for my HCI graduate class.

Mobile Usability Testing for Low Budgets
Mobile Usability Testing Tip: Recording from Two Webcams
In this post, Harry discusses quick and cheap methods of using two webcams for mobile usability testing. This method could be used in other situations, such as developing presentations. (It might also be applicable for use in therapeutic and special education settings.)

Image: Nick Bowmast


UX Brighton Presentation on Dark Patterns: User Interfaces Designed to Trick People
My Presentation on Out of Box Experience Design (Harry Brignull)
David Ogilvy: We Sell or Else


RELATED
Links to Harry's Blog Posts, By Topic


SOMEWHAT RELATED

Oct 5, 2010

Light Space: Interaction with digital content across all sorts of surfaces! (Demo video: Hrvoje Benko and Andy Wilson, Microsoft Research)

Hrvoje Benko and Andy Wilson from Microsoft Research demonstrate a system that uses 3D depth cameras to track and interpret interactions between people. It also allows users to manipulate digital content across a variety of surfaces.

Information Visualization Meets Augmented Reality?

Watch the video to find out:

Oct 1, 2010

Child-Computer Interaction: A Featured Community at the Upcoming CHI 2011 Conference!

Last year, I attended CHI 2010 and participated in a workshop about the next generation of HCI and education. It was a wonderful opportunity to share ideas with people from all over the world who are interested in emerging technologies, kids, and education. I plan to attend CHI 2011 in Vancouver, Canada next May 7-11, and even though the conference is months away, I can barely wait. The good news is that the Child-Computer Interaction community will have an important presence at the 2011 ACM CHI conference. I wanted to share a little bit about this development on this blog.

During CHI 2010, I signed up for the  "Designing for the iChild" course.  In one afternoon, I learned more than I had expected, especially the technique called "Layered Elaboration", a collaborative design strategy that involves inter-generational teams of children and adults.  
One of the leaders of this course was Allison Druin, Associate Professor and director of the Human-Computer Interaction Lab at the University of Maryland.  Dr. Druin's focus is in the area of child-computer interaction and how children can be meaningfully involved as partners in the design process.  

The quote below, found on the HCIL Children as Design Partners website, explains why this is so important:

"We have a chance to change technology, but more importantly we have a chance to change the life of a child. Every time a new technology enables a child to do something they never dreamed of, there are new possibilities for the future."  -Allison Druin

In my work as a school psychologist, I use technology with students quite often, especially when I'm at Wolfe, a program for students who have more complex disabilities, including severe autism.  I have been fortunate to have a new SMARTBoard at my fingertips, and access to the school's SMARTtable.  I learn from my students every day.


I believe that we are only at the "tip of the iceberg" with this sort of technology- and related applications such as the iPad and similar devices.   In my experience, well designed technologies and applications can open up a meaningful window to the world for children, teens, and others with disabilities.

Most of the information below was taken from the CHI 2011 conference website:

About the Child-Computer Interaction (CCI) community:


"At CHI, the CCI community will want to attract papers and contributions that represent real advances in the understanding of, or development and refinement of methods for, child computer interaction. It will also seek to unearth groundbreaking innovations addressing the needs, capabilities and preferences of children that have the potential to become reference works for developments in this field."


"By its very nature, The CCI community will have to be divergent in its thinking at CHI; it must also be about two of the mainstream CHI communities – engineering and design, but will potentially also be concerned with many of the communities of technologies (Smart devices, surfaces, mobile), of experiences (Play, Learning, Communication) and of methods (participatory design, evaluation)." ....


"Child Computer Interaction is a new community for CHI. It is a place for contributions where a method or a design is proposed that is especially suited to children and that could not sensibly be easily adapted for adults.  

We are keen to have contributions to all the usual CHI tracks but are also offering four special tracks for our own extra special community. These are:

Child Partnership Projects (CPP): A design competition for teams that include children.
Participatory Papers: Scholarly publications that are disseminated for children readers. (i.e. written in a different way)
Lessons from the Trenches: Targeting industrial cases and experiences. A lively venue where experiences can be exchanged, and researchers can be exposed to the realities of industrial practice in this domain.
Theatre pieces: High quality video contributions, available in a library after the conference, of methods that can be re used and learned from."

Child-Computer Interaction Chairs:
Janet C. Read
University of Central Lancashire
Panos Markopoulos
Eindhoven University of Technology
Allison Druin
University of Maryland
childcomputerinteraction@chi2010

RELATED
Walsh, G., Druin, A., Guha, M.L., Foss, B., Golub, E., Hatley, L. (2009). Layered Elaboration: A New Technique for Co-Design with Children. ACM CHI 2009. [PDF]

Sep 8, 2010

iPad Apps: Supporting Communication for Young People with Autism (& Links to Moms with Apps)

Have you ever watched a kid pick up an iPad for the first time and just go with it? 


Some parents have found that the same is true for their children who have autism. The iPad is a great way to support learning and communication in that it is user-friendly and easy for children, and parents, to understand.


Here are two videos that were shared with me by Liz Ditz, author of the "I Speak of Dreams" blog. The first video is of a boy with autism. The second video is of his little sister, interacting with the iPad to watch a "home-made" social story about a family trip to a coffee shop, created in the "Stories to Learn" app.



YOUTUBE INFO:

"Another too-cool app for his iPad, this one all about writing as well as spelling (http://itunes.apple.com/app/iwritewords-handwriting-game/id307025309?mt=8). For each letter in the word, the app says the letter, has him trace it in super-easy guided steps, and his written letter replaces the original letter in the word (at top of screen). When he finishes, the word appears in his handwriting. The program then spells it and pronounces it, displays a kid's illustration of the word -- and Leo gets to shake the original letters into a hole in the corner of the screen. LOVE IT."

STORIES TO LEARN



YOUTUBE INFO:
"Our family (and a friend) worked together to make this social story about visiting our local coffee shop for my son with autism. We used Stories2Learn (http://itunes.apple.com/us/app/stories2learn/id348576875?mt=8), on our iPad. It was so easy! And he loves it. In fact he loves it so much I had to have his sister demo it, as he loves to play the voiceovers over and over again. :)"


RESOURCES
Moms with Apps
Moms with Apps for Special Needs
Moms with Apps:  Characteristics of great apps for kids with autism


    Jul 19, 2010

    Multi-touch Parallel Coordinates for Interactive InfoVis (video and Info) via Dr. Robert Kosara

    Dr. Robert Kosara's area of research is information and data visualization. He is an assistant professor in the Department of Computer Science, College of Computing and Informatics, at the University of North Carolina at Charlotte, and also a member of the Charlotte Visualization Center.   He's fascinated with parallel sets/categorical data visualization.

    Dr. Kosara is known for his deep thoughts about information visualization, as well as persuading others that they should think deep thoughts about the subject, too! His latest work demonstrates multi-touch interaction with parallel coordinates, using the multi-touch trackpads commonly found on laptops such as the MacBook Pro.

    Indirect Multi-touch Interaction for Brushing in Parallel Coordinates:

    Indirect Multi-Touch Interaction for Brushing in Parallel Coordinates from Robert Kosara on Vimeo.


    BELOW IS INFORMATION FROM THE VIMEO WEBSITE:

    "This is the companion video for a paper we submitted recently. It describes a technique for interacting with parallel coordinates using the multi-touch trackpad found on laptops like Apple's MacBook Pro. 


    Below is the abstract from that paper:
    Interaction in visualization is often complicated and tedious. Brushing data in a visualization such as Parallel Coordinates allows the user to select data points according to certain criteria. Modifying a brush or combining it with another one usually requires a lot of additional effort and mode switches.
    We propose the use of multi-touch interaction to provide fast and convenient interaction with Parallel Coordinates. By using a multi-touch trackpad rather than the screen directly, the user's hands do not obscure the visualization during interaction. Using one, two, three, or four fingers, the user can easily and quickly perform complex selections. Being able to change the selections rapidly, the user can explore the data set more easily and effectively, and can focus on the data rather than the interaction."
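For readers new to the idea, "brushing" can be made concrete with a tiny sketch. This is not the authors' code; it just illustrates what the abstract describes: each brush selects the rows whose value on one axis falls within a range, and multiple brushes combine by intersection (the multi-touch gestures in the video map onto creating and adjusting these ranges). All data and names below are made up.

```python
def brush(rows, axis, lo, hi):
    """Return the set of row indices whose value on `axis` is in [lo, hi]."""
    return {i for i, row in enumerate(rows) if lo <= row[axis] <= hi}

def combine(*selections):
    """AND-combine several brushes: a row must satisfy all of them."""
    result = selections[0]
    for s in selections[1:]:
        result &= s
    return result

# Example: three axes (mpg, weight, horsepower) for four cars.
cars = [
    (30, 2200, 90),
    (18, 3500, 150),
    (25, 2800, 110),
    (15, 4000, 200),
]
light = brush(cars, axis=1, lo=0, hi=3000)     # weight under 3000
efficient = brush(cars, axis=0, lo=24, hi=40)  # mpg of at least 24
selected = combine(light, efficient)           # rows matching both brushes
```

In an interface like Kosara's, adjusting a brush just changes `lo`/`hi` and the selection is recomputed instantly, which is why rapid gesture-based editing makes exploration feel fluid.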


    RELATED
    Robert Kosara's info about parallel sets
    EagarEyes Shorts
    IEEE VizWeek 2010 (21st IEEE Visualization Conference; 16th IEEE InfoVis Conference; 5th IEEE VAST Conference)  October 24-29, 2010, Salt Lake City, Utah


    A plug for a couple of Dr. Kosara's recent publications:


    Caroline Ziemkiewicz, Robert Kosara,
    Implied Dynamics in Information Visualization,
    Proceedings Advanced Visual Interfaces (AVI), pp. 215–222, 2010. (acceptance rate 20%)
    Robert Kosara,
    Turning a Table into a Tree: Growing Parallel Sets into a Purposeful Project,
    in Steele, Iliinsky (eds), Beautiful Visualization, pp. 193–204, O'Reilly Media, 2010.


    Caroline Ziemkiewicz, Robert Kosara,
    Embedding Information Visualization Within Visual Representation,
    in Ras, Ribarsky (eds), Advances in Information and Intelligent Systems, Studies in Computational Intelligence, Vol. 251, pp. 307–326, Springer Verlag, 2010.





    Jul 18, 2010

    Interactive Technology in the Carolinas: Discovery Place Science Center

    I recently visited the Discovery Place science center in Charlotte, N.C.  It was my first visit without kids, and I thought it would be fun to explore all of the exhibits at my own pace. I was excited to play with the interactive applications created for exhibits running on Microsoft's Surface table-top computers. There were so many kids and teens at Discovery Place that every exhibit was in use. Fortunately, I obtained permission from parents to videotape/photograph some of the action.


    Next time, I'll make sure that I visit at a less-busy time!

    The following displays were developed by the Microsoft Surface team at Quatrefoil.  Quatrefoil is a company located in Maryland that develops immersive interactive experiences, primarily for museum exhibits and related projects.

    Midwife Toad App on a Microsoft Surface


    Project Build Exhibit video, from the Quatrefoil YouTube website, helps people understand the architectural design process:


    THEM exhibit, from the Quatrefoil YouTube site:

    "This health-related exhibition at Discovery Place in Charlotte, NC, explores how the human body is an ecosystem with a myriad of flora and fauna. Visitors are presented with an exciting and educational experience. The content is delivered in a way that carefully walks the line between gross-out information and the need to keep the context within the human body...Quatrefoil Associates designed the exhibit and produced creative media including Attack of the Superbugs, a video about antibiotic-resistant bacteria." -Quatrefoil Associates

    The Reactable at Discovery Place

    Jul 15, 2010

    Interactions Magazine: Cover story by Dennis Littky, looking at the UX of high schools and colleges.

    I'm out and about with no time to blog. Even so, I had to post a link to one of my favorite magazines, ACM's Interactions. Here is the introduction to the most recent issue, which comes with membership in ACM SIGCHI:
    interactions, XVII.4



    July / August, 2010



    The cover story of a recent issue of Interactions Magazine is written by Dennis Littky, who focuses on ways to improve the success of our high schools and colleges, from a UX perspective:
    Cover Story
    Time Goes By, Everything Looks the Same. 

    Full article for Free! 



    More:
    interactions: subtlety and change

    Full article for Free! 
    "There are some strange changes under way in our world. We constantly hear the refrain of the massive chaos around us, yet the allure of such a large, looming flux may distract us from something more important: the countless tiny, nuanced, and fundamental ways in which our culture and society are advancing. This issue of interactions describes these subtleties and teases them out of the greater topics that we've grown accustomed to discussing: environmental change, the role of education and government in a technological society, and the nature of behavior."

    Jul 14, 2010

    Collaborative Travel Planning App for Microsoft Surface, by Object Consulting



    Information from the Microsoft Surface YouTube Channel:
    "Object Explore is a collaborative travel planner for use by travel consultants and consumers on Microsoft Surface. Explore deep levels of interactive content through intuitive navigation, ensuring the customer doesn't get lost in the process. All content is tailored to the current client and can be taken away through mobile or email. Object Explore is built upon Object Consulting's Object Zoom framework, which brings intuitive Zoom exploration to sectors including retail, finance and museums."Object Consulting


    SOMEWHAT RELATED
    I worked on a collaborative travel planning application a few years ago for an HCI class project, before the Surface was unveiled by Microsoft. It would have been fun to implement it in a Surface environment, since applications on the Surface can easily interface with mobile devices.

    Below is a screenshot of a working prototype of "navSmart", a mash-up using photo-overlays linking to travel-related information, including related YouTube travel videos. You spin the globe around to explore possible destinations and vacation activities. This concept worked best on a large touch-screen display.
    PhotoGlobe and YouTube Travel Videos, using Google Earth.

    Jun 30, 2010

    Kids and Tangible Tabletop Interaction: The NIKVision Project

    NIKVision is an interactive tangible tabletop designed for young children, but suitable for "kids" of all ages. Javier Marco Rubio shared his team's work on the NIKVision system on the TechSpark website, and I thought I'd share some of the videos on this post. What I like about NIKVision is that the team encouraged the participation of children as design partners, which is a very special approach.

    The NIKVision team is part of the GIGA Group in the Computing Department at the University of Zaragoza (Spain). The team collaborates with the ChiCI Group at the University of Central Lancashire, Preston (UK).


    NIKVision Tangible Tabletop Demo

    NIKVision Tangible Tabletop Demo from Javier Marco on Vimeo.


    "NIKVision is a Tangible tabletop in which toys control the games. NIKVision has been created by Javier MMarco, from GIG Group in University of Zaragoza, Spain." http://webdiis.unizar.es/~jmarco
    Researchers for this project included Javier Marco, Eva Cerezo, and Sandra Baldassarri.

    NIKVision Toys & Games

    NIKVision Toys & Games from Javier Marco on Vimeo.

    NIKVision, highlighted in the news in Spain (in Spanish)

    May 8, 2010

    Revisiting Razorfish: Emerging Experiences, RockstAR application, and more...

    I've written a few posts about Razorfish in the past. What is Razorfish?


    "The Razorfish Emerging Experiences team is a dedicated group of highly experienced professionals focused solely on emerging experiences and technologies. "Effective innovation" is our multifaceted approach to concepting and delivering pioneering solutions for our clients."
    Razorfish has forged ahead into very interesting (and fun) territory. Here is a video of the RockstAR application. It combines multi-touch technology and augmented reality, utilizing the Razorfish Vision Framework (RVT), integrated with the Razorfish Touch Framework.

    RockstAR (Augmented Reality) Experience Demo from Razorfish - Emerging Experiences on Vimeo.


    A recent post on the Razorfish Emerging Experiences blog, The Technology Behind RockstAR, provides a detailed account of the technology that was pulled together to make it happen. The application is integrated with Twitter and Flickr.
    RockstAR
    -Razorfish Emerging Experiences Blog
    "For the RockstAR experience, we are analyzing each frame coming from an infrared camera to determine if faces are found in the crowd. Once a face is detected, it is assigned a unique ID and tracked. Once receive a lock on the face, we can pass position and size information to the experience where we can augment animations and graphics on top of the color camera feed."


    RELATED
    One of my previous posts includes a video of the Razorfashion application, which highlights the Razorfish Touch Framework:


    Razorfish's Touch Framework "Razorfashion" - A lot like my idea for an in-home FashionMirrorAdvisor...


    I'm still hoping to work on my FashionMirrorAdvisor - but with a twist. Now that I have a smartphone, I want to incorporate a mobile app into the concept. Guys probably just wouldn't understand.  (However, something like this would make a nice gift for a guy who is a bit lacking in the fashion department.)


    Below is a remix of my previous post


    RAZORFISH'S TOUCH FRAMEWORK:  RAZORFASHION - A LOT LIKE MY IDEA FOR AN IN-HOME FASHIONMIRRORADVISOR (5/23/09)


    Razorfish recently unveiled the Razorfashion application, designed to provide shoppers with an engaging retail experience within the "multi-channel shopping ecosystem". I'm not the "shop till you drop" type of gal, but I can see that this concept could be useful in other situations, after a few tweaks.



    As soon as I saw this Razorfish Touch "Fashion" demo video, it touched a nerve. I've been playing around with a similar idea, but for my personal use, in the form of an RFID-enabled system. I'd call it something like "FashionMirrorAdvisor".


    Instead of showing skinny fashion models like the Razorfashion application, I'd harness the power of a built-in webcam and mirror my own image on the screen. My mirror would dress me up in the morning when I'm way too foggy to think about matching colors and accessories.
         
    My FashionMirrorAdvisor would be my friend. My "smart" friend, since all of my clothes would be RFID-tagged, along with my shoes, jewelry, and other accessories. My make-up, too. It would be a no-brainer. I really could use this application - just ask my husband!


    More often than not, most mornings I find myself staring at the clothes in my closet, frozen in time, unable to formulate a fashion thought. I might set my eyes on a favorite blouse, but blank out when I try to think about the rest of the steps I need to pull my look together.
         
    I know I can't wear my reddish-pink camisole with my dusty-orange/brown slacks, but at 5:15 A.M., who has the time to think about this little detail? My friend the FashionMirrorAdvisor would prevent me from making this fashion faux pas.
         
    No problem.
         
    My FashionMirrorAdvisor would show me a few outfits, and dress my real-time moving image on the screen. Since she knows all things, she'd show me ONLY the articles of clothing that were clean, since my RFID system would keep up with all of that. It would be much more functional than a "virtual wardrobe" application. I could try out different earrings without having to get them out.
         
    If I couldn't find something, the RFID system would take care of this detail. My FashionMirrorAdvisor would know where I misplaced my clothes, accessories, and even my keys, since they would all be tagged. The mirror application would provide me with a nice little map of my house and car, and highlight the location of the item.
         
    My FashionMirrorAdvisor would keep track of my laundry, too. This would be a great feature. So if my dirty laundry was piling up, and I wanted to wear outfit X, Y, or Z over the next few days, I'd receive a gentle reminder that I'd need to do some laundry first!
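Just for fun, the "only suggest clean outfits" rule above is easy to sketch: every RFID-tagged item carries a laundry status, and outfit suggestions are filtered so nothing in the hamper is ever proposed. All names, fields, and items below are hypothetical, of course.

```python
from itertools import product

# A toy wardrobe: each RFID-tagged item records whether it is clean.
wardrobe = {
    "tops":    [{"name": "red camisole", "clean": False},
                {"name": "white blouse", "clean": True}],
    "bottoms": [{"name": "brown slacks", "clean": True},
                {"name": "black skirt",  "clean": True}],
}

def suggest_outfits(wardrobe):
    """Return (top, bottom) pairs built only from clean items."""
    clean = {category: [item["name"] for item in items if item["clean"]]
             for category, items in wardrobe.items()}
    return list(product(clean["tops"], clean["bottoms"]))
```

Here the red camisole is in the laundry pile, so no suggested outfit includes it; the laundry-reminder feature would simply watch how short these clean lists are getting.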


    Another practical feature:
         
    My FashionMirrorAdvisor would also serve as my health consultant, keeping track of my weight and BMI. This data, along with information gained from the webcam, would be combined so that my advisor would NEVER suggest an outfit that would be too...snug.


    I could program the system to provide me with gentle reminders if my weight was an issue. My FashionMirrorAdvisor would show me images of myself "before" and "after", outfits included.

    Information about the "after" outfits could be fed to the system from the web-catalogs of my favorite fashion retailers, and once I lost those 10 darned pounds, I'd find a nice parcel delivered to my door. Thanks to my FashionMirrorAdvisor, I know that the outfit would be just right.


    UPDATE 5/8/10:  The FashionMirrorAdvisor would be integrated with a mobile app - since I now have a smartphone, this would be quite useful in planning shopping trips centered around the purchase of new clothes, shoes, accessories, and coordinating cosmetics!  I created a little game  that I think would be ideal for this sort of thing, too.   I still want to work on this....someday. Too many ideas, too little time!


    ALSO RELATED
    From the Razorfish site:
    "The Razorfish Emerging Experiences team is a dedicated group of highly experienced professionals focused solely on emerging experiences and technologies. "Effective innovation" is our multifaceted approach to concepting and delivering pioneering solutions for our clients"

    "Founded in 2008, Razorfish Emerging Experiences is a cross-functional team composed of strategists, artists, experience designers, and technologists. We’re part of the Razorfish Strategy & Innovation practice led by Shannon Denton. Jonathan Hull is the managing director of the team, Steve Dawson is the technology lead and Luke Hamilton is the creative lead."

    Razorfish
    Razorfish Emerging Experiences Portfolio
    Razorfish Emerging Experiences Blog
    Razorfish Emerging Experiences on Vimeo


    RELATED 5/8/10
    Razorfish Health (Fun music on the home page!)
    Razorfish Establishes Cloud Computing Practice
    Douglas Quenqua, ClickZ 4/15/10
    The Razorfish 5: Five Technologies that Will Change Your Business
    Razorfish Whitepapers


    If you are looking for a job, you might be interested in the openings at Razorfish. Before applying, take a look at what is expected:
    "You dream in digital. You're fluent in the technologies that define our world and passionate about the way they're shaping our future.  You're a communicator. A creator. You understand how the Web connects us, and you want to shape the conversation. You're a restless innovator.  you're not only waiting for the next big idea to happen, you're making it happen.  You're a unique talent, a visionary, an experimenter, and you're looking for an environment that lets you shine. In other words, you're just our type...."


    FYI
    When I visited the Razorfish website, I noticed that the background appeared to be a live feed of the offices. Since today is Saturday, it makes sense that the only person busy at the office was a custodian. Below is the screenshot: