Showing posts with label HCI. Show all posts

Jan 10, 2013

Gesture Markup Language (GML) for Natural User Interaction and Interfaces

Quick post:
"GML is an extensible markup language used to define gestures that describe interactive object behavior and the relationships between objects in an application.  Gesture Markup Language has been designed to enhance the development of multiuser multi-touch and other HCI device driven applications." -Gesture ML Wiki

GestureML was created and is maintained by Ideum. 
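Since GML is plain XML, a few lines of Python can read a gesture definition with a standard parser. The snippet below is illustrative only: the element and attribute names are my own guesses at a GML-like structure, not the official schema (see the GestureML wiki for the real gesture definitions).

```python
import xml.etree.ElementTree as ET

# Hypothetical GML-style gesture definition (illustrative only;
# consult the GestureML wiki for the actual schema).
GML_SNIPPET = """
<GestureMarkupLanguage>
  <Gesture id="n-drag" type="drag">
    <comment>Move an object with one or more touch points</comment>
    <match>
      <action>
        <initial>
          <cluster point_number_min="1" point_number_max="5"/>
        </initial>
      </action>
    </match>
  </Gesture>
</GestureMarkupLanguage>
"""

def list_gestures(gml_text):
    """Return (id, type) pairs for every Gesture element in a GML document."""
    root = ET.fromstring(gml_text)
    return [(g.get("id"), g.get("type")) for g in root.iter("Gesture")]

print(list_gestures(GML_SNIPPET))
```

An application framework would match incoming touch events against definitions like this one, so interaction behavior can be changed by editing markup rather than code.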

More information to come!
The Pano
Photo credit: Ideum

RELATED
Ideum Blog

OpenExhibits: Free multitouch and multiuser software initiative for museums, education, nonprofits, and students

GestureWorks: Multi-touch authoring for Windows 8 & Windows 7



Dec 13, 2012

Connecting: Exploration of the Future of Interaction Design and User Experience - Good for promoting CSEd!

I've been looking for a relatively short video about human-computer interaction and related fields to include in a presentation I'm planning for high school students. The presentation is my small part to promote Computer Science Education Week (CSEd Week).

One of the goals of CSEd Week is to spread the word that computer science education is much more than learning how to program. Technical and computational thinking skills are important to develop, but young people also need to know what sorts of things they can do with these skills as they become adults in our technological society. As stated on the CSEd Week website: "Computing professionals work on creative teams to develop cutting-edge products and solutions that save lives, solve health problems, improve the environment, and keep us connected."

Coincidentally, I was pleasantly surprised by a tweet I received today that linked to Connecting, a well-produced 18-minute video about interaction and user experience design. This video would be great to share with high school students.


Connecting (Full Film) from Bassett & Partners on Vimeo.

The video features a number of well-spoken, creative professionals who are passionate about their work, people, and the future.  Although the video is a bit techno-centric, it depicts people who live and breathe technology in a favorable light.  It also inspires some degree of thought and reflection on the part of the viewer.

Although much of what is discussed in Connecting is futuristic, the seeds were planted years ago. If you are new to the HCI/UX/ID/UCD world, it might help to read Mark Weiser's article "The Computer for the 21st Century," published in Scientific American in 1991, before viewing the video.

After viewing the video, I encourage you to take the time to read some of the comments on the Vimeo website. Also read Marc Rettig's comments, posted on the IxDA website: "A film about interaction design: what it says about us".

Near the end of the video, there is a discussion about where we might be headed as interconnected, technically enhanced, augmented humans. Hopefully we will not create, and then be assimilated into, a Borg-like collective, or live out our days in a Matrix-like disembodied state.

In the wrong hands, what might happen?

Is resistance futile?!

FYI: Connecting was produced by Microsoft, Windows Phone Design Studio: Mike Kruzeniski (now at Twitter), Kat Holmes, and Albert Shum, and featured interviews with the following people:

Matt Jones, BERG London
Raphael Grignani, Method
Liz Danzico, School of Visual Arts, New York
Blaise Aguera y Arcas, Architect of Bing Mobile and Bing Maps
Helen Walters,  Writer, Editor, Researcher at Doblin/Monitor
Younghee Jung, Research Leader, Nokia
Massimo Banzi, Co-Founder, Arduino
Jennifer Bove, Co-Founder, Managing Director, Kicker Studio
Robert Murdock, Principal, Method (Artefact)
Jonas Lowgren, Professor of IxD, Malmo University, Sweden
Eric Rodenbeck, CEO, Founder, Creative Director, Stamen Design
Robert Fabricant, VP of Creative, Frog Design
Andrei Herasimchuk, Twitter 

The video was first screened in Seattle, Washington, last April, with a panel discussion that included Rob Girling and Gavin Kelly, of Artefact, Bill Buxton, of Microsoft, and Scott Nazarian, of Frog Design.

Description of the "Connecting" video, from Bassett & Partners' Vimeo site:

"The 18 minute "Connecting" documentary is an exploration of the future of Interaction Design and User Experience from some of the industry's thought leaders. As the role of software is catapulting forward, Interaction Design is seen to be not only increasing in importance dramatically, but also expected to play a leading role in shaping the coming "Internet of things." Ultimately, when the digital and physical worlds become one, humans along with technology are potentially on the path to becoming a "super organism" capable of influencing and enabling a broad spectrum of new behaviors in the world." -Bassett & Partners

Selected Quotes:

Liz Danzico:
"It's understanding that ecosystem, where the human [is] in the center, and understanding that network of things, and how they all work together, rather than your device or thing being in the center."

Younghee Jung:
"... you cannot necessarily foresee the consequences when people adopt what you designed... to see something completely different from what you created... it is like throwing a stone in the water, and you don't know what it will cause."

Blaise Aguera y Arcas:
"....these are all augmentations of abilities as humans. And when the augmentation really works, then that extension of yourself feels natural, and beautiful and does what you want, and doesn't get in the way....The use of voice, and the use of natural gestures... you are removing the extraneous, you are removing the artificial."

Massimo Banzi:
"...Something that can do its own thing, disappearing in the background, is correct"  (nod to Weiser)

Jennifer Bove:
"...it is really important to look at what the consequences are of putting these products into the world. When we think about things like the phone... the way it has changed our behavior, it can be enabling, and also disrupting... for these things to change our lives for the better, or enable them to let us do things we couldn't do before... they have to feel natural, and feel like a conversation." 

Robert Murdock:
"How you actually design and enact a living system in UX is something that is quite challenging...you have to think about patterns of desired outcomes and behaviors you want to achieve, instead of moving a user through one flow in an experience."

Jonas Lowgren: 
"...back in the day... it was one user, one task, one computer. It's all gone now; it is much more like you are setting the stage, really, for other people to perform, but you can never tell them what to do."

Eric Rodenbeck:
"....the map is like a living thing, that is being made up of everything we got. The idea that it is different in the morning than what it was in the evening, is a really good idea to stay connected to the idea that the world is changing."

Helen Walters:
"What we need is for designers to be embedded in the topics that are really, really important right now, so there can be a better synergy between design, and business design, and social change design, and entrepreneurship."

Andrei Herasimchuk:
"That is where the future lies with us. There will be software in everything... You can take all of those (digital) pieces, and you can design all kinds of things around it. People are now actually entering their lives and what is going on around them into a digital format, and so we will start to do things with that in the future, and I think it will be exciting."

Robert Fabricant:
"The network is sampling the world, and knowing what is cropping up where, being able to match and find patterns... and anticipate outbreaks of diseases... We are trying now to collect from the periphery a much richer set of what is going on in the world so we can learn as a society and optimize and evolve the right systems and services."

SOMEWHAT RELATED
IxDA
Experientia: Putting People First 
What's the Difference? IxD, IA, UXD, HCI, UCD, UX (Jon Karpoff)

Dec 5, 2012

Augmented Human Conference '13 (ACM SIGCHI), March 7th and 8th; CFP paper deadline Jan 8, 2013

Looks like a fascinating conference!

ACM SIGCHI 4th Augmented Human International Conference

Call for Papers
The 4th Augmented Human (AH) International Conference, in cooperation with ACM SIGCHI, will be held in Stuttgart, Germany, on March 7–8, 2013, focusing on augmenting human capabilities through technology for increased well-being and an enjoyable human experience. 

As in previous years, the conference proceedings will be published in the ACM Digital Library as a volume in its International Conference Proceedings Series with ISBN. 

Topics 
  • Wearable Computing and Ubiquitous Computing 
  • Bionics, Biomechanics, and Exoskeletons 
  • Brain-Computer Interfaces, Muscle Interfaces, Implanted Interfaces 
  • Sensors and Hardware 
  • Smart Artifacts and Smart Textiles 
  • Augmented Sport, Health, & Well-being, Training/Rehabilitation Technology 
  • Augmented and Mixed Reality, Tourism and Games and Context-Awareness 
  • Augmented Fashion and Art 
  • Trust, Privacy, and Security of Augmented Human Technology 
PROGRAM COMMITTEE

Submission Categories for Papers 

Full papers 8 pages, anonymized, 30 minutes presentation 
Short papers 4 pages, anonymized, 15 minutes presentation 
Demonstration papers 2 pages, anonymized, demonstration at conference 
Poster papers 2 pages, anonymized, presented at conference 
Art pieces 1–2 pages, not published, exhibited at conference 

The four paper categories will be published in the ACM digital library and follow the ACM paper format. We encourage authors to submit supporting video material in addition to the PDF submission. 

Important Dates 
  • January 8, 2013      paper submission deadline 
  • February 5, 2013    author notification 
  • February 12, 2013  camera-ready and ACM copyright form due 
  • March 7–8, 2013    scientific conference in Stuttgart 

Art and Exhibition 

Augmented Human 2013 will feature contributions by art researchers and practitioners. Artists participating and exhibiting at Augmented Human 2013 will have to be self-funded to attend the conference. 

All art pieces will be included in a video to be published on the Augmented Human YouTube channel. Additionally, an exhibition catalog will be published on the Augmented Human website, including full-page pictures and the descriptions provided by the authors. 

Submission of Art Pieces 

Authors may choose the format in which to present their art pieces for submission. The submission should specify requirements for space, lighting, electricity, and equipment.

Organizing Committee

General Chair:  Albrecht Schmidt, University of Stuttgart, Germany
Program Co-Chairs: Andreas Bulling, University of Cambridge, UK; Christian Holz, Hasso Plattner Institute, Germany

Dec 3, 2012

Musings about still-popular Interactive Multimedia Technology blog posts from the past...

I have been blogging for over six years now, and from time to time I like to take a look at my Google Analytics stats. A large chunk of the visitors to this blog come from the US.


I've noticed that the following blog posts still get hits, even though they were written a while ago. I'm not sure why, but I think it would be helpful to revisit each one, clean up any broken links, and provide new information related to each topic.  It might take me a while.

Reader input is welcome!

Games to Lift Stress Away: Flower, flOw (and Cloud), from thatgamecompany (2009)
A number of people who search for "Flower" or "Flower Pictures" find this blog post every day.

Interactive multimedia for social skills, understanding feelings, relaxation and coping strategies, etc. (2006)
This post gets a lot of hits from people who work in K-12 settings, looking for "free online interactive social skills games" or something similar.  

Interactive Touch-Screen Technology, Participatory Design, and "Getting It". (2008)
This is one of my rare "brain dump" posts. I noticed that in an attempt to update the post, I linked it to another post, written in 2010. 

The Children's Interactive Library: User Experience Design and the Library! (2009)
The Children's Interactive Library was a collaboration between Interactive Spaces, the Department of Computer Science, University of Aarhus, the Department for Design, Aarhus School of Architecture, and others. 

Yellowbird 6 lens 360 degree video camera creates web-based interactive videos. (2009)
I got the link from Harry Brignull, who blogs at 90 Percent of Everything.

The 3D Evolution: Part I, Introduction (2010) Hmmmm 3D TV, where are we now? 
To find out, check out the 3D Guy blog.


Teliris InterAct TouchTable and TouchWall: Immersive Collaboration and Telepresence, DVE's Holographic Tele-Immersion Room
Since I wrote this post in 2008, it has received over 7,000 page views from Malaysia. Is it possible that I have a secret fan club there?

Window Shopping in the Web Outside (2011)

William Forsythe's "Synchronous Objects - One Flat Thing, Reproduced" Multidisciplinary online interactive project: Translating choreography into new forms. (2009)
(I like this stuff.)

Dec 2, 2012

EpiCollect: A mobile app, useful for photo + data-collection "in the wild".

EpiCollect is an open-source project developed at Imperial College London, funded by the Wellcome Trust.  According to information posted on the project's website, "EpiCollect is a generic data collection tool that allows you to collect and submit geotagged data forms (along with photos) to a central project website (hosted using Google's App Engine) from suitable mobile phones (Android or iPhone). For example, questionnaires, surveys, etc.  All data synchronised (ie a copy sent from the phone) from multiple phones can then be viewed/charted/filtered at the project website using Google Maps/Earth or downloaded. Furthermore, data can be requested and viewed/filtered from the project website directly on your phone using Google Maps." -EpiCollect

EpiCollect Overview  epicollect.net
(Credit:  EpiCollect Website)

EpiCollect makes use of web APIs such as Google Maps, Google Charts, and Google Talk, the KML specification, and JavaScript libraries such as jQuery, script.aculo.us, ExtJS, and Mapstraction. It runs on Google App Engine, and is available for Android and iPhone.
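To give a concrete sense of the kind of data involved, here is a small Python sketch of a geotagged, photo-bearing record like the ones EpiCollect synchronises to a project website. The field names and the validation rule are my own assumptions for illustration; EpiCollect projects define their own form schemas, and this is not its actual wire format.

```python
import json

def make_record(form_name, lat, lon, answers, photo_file=None):
    """Build a hypothetical geotagged data record (illustrative only;
    not EpiCollect's real submission format)."""
    if not (-90 <= lat <= 90 and -180 <= lon <= 180):
        raise ValueError("latitude/longitude out of range")
    record = {
        "form": form_name,
        "location": {"lat": lat, "lon": lon},
        "answers": answers,      # free-form question/answer pairs
        "photo": photo_file,     # photo filename uploaded alongside the form
    }
    return json.dumps(record)

# A student's bird-survey entry from a class project:
entry = make_record("bird_survey", 35.22, -80.84,
                    {"species": "Carolina wren", "count": 2},
                    photo_file="wren.jpg")
print(entry)
```

Because each record carries its own coordinates, a central server can drop submissions from many phones straight onto a shared map, which is what makes the classroom and field-research uses below so appealing.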

I think that EpiCollect would be a useful interactive tool in education, K-12 and above. It would be ideal for students working on group projects, such as an environmental study. For young children, a simple assignment might include collecting pictures and data about birds, animals, trees, cloud formations, or even litter, as part of a class project. Since the data includes photographs, the students could create an end product in the form of an interactive multimedia presentation, available for other students, as well as parents, to view on the web from any web-enabled device.

HCI research teams could use these tools when observing people using various technologies in public spaces such as malls, airports, and special events, as well as in stores, eateries, and entertainment settings.

I would be interested in learning more about the use of this application in HCI and K-12 education!

RELATED
EpiCollect Website
EpiCollect Instructions
EpiCollect Instructions (pdf)
The Sight of Road Kill Makes a Pretty, Data-Rich Picture (NPR All Tech Considered)
Note: Audio from the above December 2, 2012 episode can be found on the NPR Weekend Edition Sunday website after 12:00 PM ET on 12/2/12
Mobile app sees science go global  (BBC article)
App for Android Puts Laboratories on Your Phone (Tree Hugger article)
Scientific Data Collection Goes Mobile (Discovery News article)

Paper: EpiCollect: Linking Smartphones to Web Applications for Epidemiology, Ecology and Community Data Collection (PLoS ONE 4(9), 2009)

David M. Aanensen, Derek M. Huntley, Edward J. Feil, Fada'a al-Own, Brian G. Spratt
Conclusion from the above paper:
"Data collection frameworks utilising mobile phones with data submission to and from central databases are widely applicable and can give a field worker similar display and analysis tools on their mobile phone that they would have if viewing the data in their laboratory via the web. We demonstrate their utility for epidemiological data collection and display, and briefly discuss their application in ecological and community data collection. Furthermore, such frameworks offer great potential for recruiting ‘citizen scientists’ to contribute data easily to central databases through their mobile phone."

Nov 17, 2012

Human Computer Interaction + Informal Science Education Conference (NUI News)

I recently learned of the HCI + ISE conference, funded by the National Science Foundation and organized by Ideum and Independent Exhibitions. The conference will provide the groundwork for the future design and development of interactive computer-based science exhibits.
Science museums have a long history of interactivity, well suited to groups of "explorers", such as families or students visiting on a field trip.  

What is really exciting is that new interactive applications and technologies have the power to transform the way people learn and understand science in a collaborative and social way. Innovations in the field of HCI (human-computer interaction), such as multi-touch and gesture interaction, are well suited to meet the goal of science education for all, beyond the school doors and wordy textbooks. 

Below is a screen-shot of the conference website, a description about the conference, quoted from the site, and some related resources.



About the HCI+ISE Conference
"HCI technologies, such as motion capture, multitouch, augmented reality, RFID, and voice recognition are beginning to change the way computer-based science exhibits are designed and developed. Human Computer Interaction in Informal Science Education (HCI+ISE) is a first-of-its-kind gathering to explore and disseminate effective practices in developing a new generation of digital exhibits that are more intuitive, interactive, and social than their predecessors."
"The HCI+ISE Conference, to be held in Albuquerque, New Mexico June 11-14 2013, will bring together 60 museum exhibit designers and developers, learning researchers, and technology industry professionals to share effective practices, and to explore both the enormous potential and possible pitfalls that these new technologies present for exhibit development in informal science education settings."
"HCI+ISE will focus on the practical considerations of implementing new HCI technologies in educational settings with an eye on the future. Along with a survey of how HCI is shaping the museum world, participants will be challenged to envision the museum experience a decade into future. The conference results will provide a concrete starting point for exhibit developers and informal science educators who are just beginning to investigate these emerging technologies and design challenges in creating these new types of exhibits."
Why HCI+ISE?
"Since the mid-1980s informal educational venues have increasingly incorporated computer-based exhibits into their science communication offerings in an effort to keep pace with public expectations and make use of the expanding opportunities these technologies provide. The advent and popularity of once novel HCI technologies are becoming commonplace: the Wii and Microsoft Kinect now allow for motion capture video games, tablet PCs have multitouch interaction, and smart phones and other devices come standard with voice recognition. Yet many museums are still developing single-touch and trackball-driven, single-user computer kiosks."
"Science museums have a long history of championing hands-on, physical, and inquiry-based activities and exhibits. This vast experience has only just begun to be applied to interactive computer interfaces. Along with seasoned science exhibit developers, the Conference will draw upon individuals outside of ISE who will provide fresh insight into the technologies, design issues, and audience expectations that these visitor experiences present."
Involvement and Findings
"HCI+ISE will bring together a diverse group of practitioners and other professionals to discuss (and in some cases share and prototype) new design approaches utilizing emerging HCI technology. Please see our Apply page to learn how you can participate. Conference news and findings will be distributed through a variety of ISE and museum websites, including this one."
"We welcome your questions and comments about the HCI+ISE Conference."
CONTACTS
Kathleen McLean of Independent Exhibitions
& Jim Spadaccini of Ideum
HCI+ISE Co-chairs
RELATED
Open Exhibits: "Open Exhibits is a multitouch, multi-user tool kit that allows you to create custom interactive exhibits."
CML: Creative Markup Language
GML: Gesture Markup Language
GestureWorks
Ideum

Nov 4, 2012

CFP for Special Issue of Personal and Ubiquitous Computing on Educational Interfaces, Software, and Technology (EIST) -Extended Deadline: December 9, 2012

Overview 
One of the primary goals of teaching is to prepare learners for life in the real world. In this ever-changing world of technologies such as mobile interaction, cloud computing, natural user interfaces, and gestural interfaces like the Nintendo Wii and Microsoft Kinect, people have a greater selection of tools for the task at hand. Given the potential of these new interfaces, software, and technologies as learning tools, as well as the ubiquitous application of interactive technology in formal and informal learning environments, there is a growing need to explore how next-generation technologies will impact education in the future. 

As a community of Human-Computer Interaction (HCI) and educational researchers, we need to theorize and discuss how new technologies should be integrated into the classrooms and homes of the future. In the last three years, three CHI workshops have provided a forum to discuss key issues of this sort, particularly in the context of next-generation education. The aim of this special issue of Personal and Ubiquitous Computing is to summarize the potential design challenges and perspectives on how the community should handle next-generation technologies in the education domain for both teachers and students. 


We invite authors to present position papers about potential design challenges and perspectives on how the community should handle the next generation of HCI in education. Topics of interest include but are not limited to: 

  • Gestural input, multitouch, large displays 
  • Mobile devices, response systems (clickers) 
  • Tangible, VR, AR & MR, multimodal interfaces 
  • Console gaming, 3D input devices 
  • Co-located interaction, presentations 
  • Educational pedagogy, learner-centric, child computer interaction 
  • Empirical methods, case studies 
  • Multi-display interaction 
  • Wearable educational media 
Important Dates 

  • Full papers due: December 9, 2012 
  • Initial reviews to authors: January 18, 2013 
  • Revised papers due: March 15, 2013 
  • Final reviews to authors: April 26, 2013 
  • Final papers due: June 14, 2013 


Submission Guidelines 

Submissions should be prepared according to the Word template located at the bottom of this page. All manuscripts are subject to peer review. Manuscripts must be submitted as a PDF to the EasyChair submission system. Submissions should be no more than 8,000 words in length. 

Guest Editors and Contact Information 

  • Syed Ishtiaque Ahmed, Cornell University 
  • Quincy Brown, Bowie State University 
  • Jochen Huber, Technische Universität Darmstadt 
  • Si Jung “Jun” Kim, University of Central Florida 
  • Lynn Marentette, Union County Public Schools, Wolfe School 
  • Max Mühlhäuser, Technische Universität Darmstadt 
  • Alexander Thayer, University of Washington 
  • Edward Tse, SMART Technologies 

Contact: eistjournal2012@easychair.org 

Information about the Journal of Personal and Ubiquitous Computing 


Submission Template: PUC_EIST_article_template.docx  (59k)

Oct 19, 2012

Link to "Who Works with Creative Coders", by Tim Stutts


Who Works With "Creative Coders"?
Tim Stutts, Interaction Designer/Programmer + Founder at PushPopDesign


RELATED
Tim Stutts' DataGlam Flickr Set; "An on-going series of data-inspired graphic and video works programmed in OpenFrameworks, Processing and Three.js"

CreativeApplications.Net
"CreativeApplications.Net reports innovation and catalogues projects, tools and platforms at the intersection of art, media and technology"

Sep 20, 2012

Thinking about a Kurio 7 Tablet for your kid? Here is a start!

I haven't had a chance to play with the Kurio, a 7-inch Android tablet designed for children and their families, so I haven't formed an opinion about the device or the applications it runs. I thought I'd share the promotional video and related information/links:


Kurio Tablet from CIDE on Vimeo.

Parents can view a number of "how-to" videos to get the tablet up and running. Developers can apply to be part of the Kurio store. From what I can see, the Kurio is in need of some creative, child-friendly apps.

Below is a hands-on demo:




RELATED
Kurio at Toys'R'Us
Toon Goggles Partners with Techno Source on New Kurio7 Android Tablet for Families
PR Web 7/9/12
Kurio's Features

Toys 'R' Us has its own tablet, the Tabeo:

Jul 21, 2012

Musings about NUI, Perceptive Pixel and Microsoft, Rapid Creative Prototyping (Lots of video and links) Revised

It just might be the right time for everyone to brush up on 21st-century tech skills. iPads and touch phones are ubiquitous. Touch-enabled interactive whiteboards and displays are in schools and boardrooms. With Microsoft's Windows 8 and the news that the company recently acquired Jeff Han's company, Perceptive Pixel, I think there will be good support, and more opportunities, for designers and developers interested in moving from GUI to NUI.


In the video below, from CES 2012, Jeff Han provides a good overview of where things are heading. We are in a post-WIMP world and there is a lot of catching up to do!

CES 2012: Perceptive Pixel and the Future of Multitouch (IEEE Spectrum YouTube Channel)



During the video clip, Jeff explains how far things have come during the past few years:
 "Five and 1/2 years ago I had to explain to everybody what multi-touch was and meant. And then, frankly, we've seen some great products from folks like Apple, and really have executed so brilliantly, that everyone really sees what a good implementation can be, and have come to expect it.  I also think though, that the explosion of NUI is less about just multi-touch, but an awareness that finally people have that you don't have to use a keyboard and mouse, you can demand something else beside that.  People are now willing to say, "Oh, this is something I can try, you know, touch is something I can try as my friendlier interface"."

Who wouldn't want to interact with a friendlier interface? Steve Ballmer doesn't curb his enthusiasm about Windows 8 and Perceptive Pixel. Jeff Han is happy with how designs created in Windows 8 scale for use on screens large and small. He explains how Windows 8 can support collaboration. The Story Board application (7:58) on the large touchscreen display looks interesting.

I continue to be frustrated by the poor usability of many web-based and desktop applications. I like my iPad, but only because so many dedicated souls have given some thought to the user experience when creating their apps. I often meet with disappointment when I encounter interactive displays while out and about during the day. It is 2012, and it seems that there are a lot of application designers and developers who have never read Don Norman's The Design of Everyday Things!



I enjoy making working prototypes and demo apps, but my skill set is stuck in 2008, the last year I took a graduate-level computer course.  I was thinking about taking a class next semester, something hands-on, creative, and also practical, to move me forward. I can only do so much when I'm in the DIY mode alone in my "lab" at home.  I need to explore new tools, alongside like-minded others.  


There ARE many more tools available to designers and developers than there were just four years ago. Some of them are available online, free, or for a modest fee. I was inspired by a link posted by my former HCI professor, Celine Latulipe, to her updated webpage devoted to rapid prototyping tools. The resources on her website look like a good place to start for people interested in creating applications for the "NUI" era. (Celine has worked on many interesting projects that explore how technology can support new and creative interaction, such as Dance.Draw.) Below is her description of her updated HCI resources:

"New HCI resource to share: I have created a few pages on my web site devoted to Rapid Prototyping tools, books, and methods. These pages contain reviews of various digital tools, including 7 different desktop prototyping apps, and including 8 different iPad apps for wireframing/prototyping. I hope it's useful to others. Feel free to share... and please send me comments and suggestions if you find anything inaccurate, or if you think there is stuff that I should be adding. I will be continuing to update this resource." -http://www.celinelatulipe.com (click on the rapid prototyping link at the top)



IDEAS
Below are just a few of my ideas that I'd like to implement in some way. I can't claim ownership of these ideas; they are mash-ups of what comes to me in my dreams, usually after reading scholarly publications from ACM or IEEE, or attending tech conferences. 
  • An interactive timeline (multi-dimensional, multi-modal, multimedia) for off-the-desktop interaction, collaboration, and data/info analysis and exploration. It might be useful for medical researchers, historians, genealogists, or people who are into the "history of ideas". Big Data folks would love it, too. It would handle data from a variety of sources, including sensor networks. It would be beautiful to use.
  • A web-based system of delivering seamless interactive, multi-modal, immersive experiences, across devices, displays, and surfaces. The system would support multi-user, collaborative interaction.  The system would provide an option for tangible interaction.
  • A visual/auditory display interface that presents network activity, including potential intrusions, malfunctions, or anything that needs immediate attention that would be likely to be missed under present monitoring methods. 
  • Interactive video tools for creation, collaboration, storytelling.  (No bad remote controllers needed.)
  • A "wearable" that provides new ways for people to express and communicate creatively, through art, music, dance, with wireless capability. (It can interact with wireless sensor networks.)*
  • A public health application designed to provide information useful in understanding and preventing sepsis. This application would utilize the timeline concept described at the top of this list. The concept could also be useful in analyzing other medical puzzles, such as autism.
Most of these ideas could translate nicely to educational settings, and the focus on natural user interaction and multi-modal I/O aligns with the principles of Universal Design for Learning, something that is important to consider, given the number of "at-risk" learners and young people who have disabilities.

I welcome comments from readers who are working on similar projects, or who know of similar projects.  I also encourage graduate students and researchers who are interested in natural user interfaces to move forward with an off-the-desktop NUI project.  I hope that my efforts can play a part in helping people make the move from GUI to NUI!  



Below are a few videos of some interesting projects, along with a list of a few references and links.


SMALLab (Multi-modal embodied immersive learning)


PUPPET PARADE: Interactive Kinect Puppets (CineKid 2011)



MEDIA FACADES: When Buildings Start to Twitter

HUMANAQUARIUM (CHI 2012)

 

NANOSCIENCE NRC Cambridge (Nokia's Morph project)






 
Examples: YouTube Playlists
POST WIMP EXPLORERS' CLUB
POST-WIMP EXPLORER'S CLUB II

Web Resources
Celine Latulipe's Rapid Prototyping Resources 
Creative Applications
NUI Group: Natural User Interface Group
OpenFrameworks and Interactive Multimedia: Funky Forest Installation for CineKid
SMALLab Learning
OpenExhibits: Free multi-touch + multiuser software initiative for museums, education, nonprofits, and students.
OpenSense Wiki 
CINEKID 2012 Website 
Multitouch Systems I Have Known and Loved (Bill Buxton)
Windows 8
Perceptive Pixel
Books
Natural User Interfaces in .NET: WPF 4, Surface 2, and Kinect (Josh Blake, Manning Publications)
Chapter 1 PDF (free)
Brave NUI World: Designing Natural User Interfaces for Touch and Gesture (Daniel Wigdor and Dennis Wixon)
Designing Gestural Interfaces (Dan Saffer)
Posts
Bill Snyder, ReadWrite Web, 7/20/12

I noticed some interesting tools on the Chrome web store - I plan to devote a few more posts to NUI tools in the future.

Jul 14, 2012

Cute NAO robot performs "Evolution of Dance" and is an active participant in research with young people who have autism spectrum disorders.

I came across a cute video of a NAO robot performing the Evolution of Dance, and as I smiled, I remembered that the robot was used in some research about interventions for young people with autism. 


The technology behind the NAO robot was developed by Aldebaran Robotics, and more details can be found on the company's website, along with the video and links I've provided below. (Aldebaran Robotics is hiring, btw.)


Enjoy the dance performance!

Evolution of Dance by NAO Robot 


DEPCO NAO Robot and Notre Dame Autism Research 



NAO Next Gen: The New Robot of Aldebaran Robotics



New Robot Helps Autistic Children Interact (UConn) Research with Tim Gifford, CEO of Movia Robotics, and UConn professor Anjana Bhat 


(Above) Bruno Maisonnier of Aldebaran Robotics Highlights Therapeutic Uses of the NAO Robot 

RELATED 
Aldebaran Robotics NAO Developer Website 
Psychologist explores effective treatment options for children with autism disorders 
Susan Guibert, Notre Dame News, 4/16/10 
Robot Speaks the Language of Kids 
Beth Krane, UConn Today, 8/5/10 
Movia Robotics: Systems for Learning, Training, Education and Therapy 
Timothy Gifford and Anjana Bhat on Using Robots to Help Autistic Children 
Rachel Z. Arndt, FastCompany, 4/1/11 
Anjana N. Bhat, University of Connecticut 
Timothy Gifford 
Social story PowerPoint for children with autism who participate in research at the FUN Lab at Notre Dame (ppt)

Jul 13, 2012

Update: Video of My PlayHome App and 19-Month-Old Toddler


This little guy "plays" iPad about once or twice a week. The video shows him with the My PlayHome app. It is interesting to see how much he remembered from the previous week!


In the near future, I plan to write a few posts about the apps that I've used with students with special needs.  I will also touch on a few apps that are great for toddlers and "twos".

Jul 12, 2012

Quick post: Design for Emotion, co-authored by Trevor van Gorp and Edie Adams

I have been waiting for Design for Emotion to come out, and I look forward to reading it soon.  Below is information about the book from the Affective Design website:



"After seven years of research and almost one and a half years of writing, I’m very pleased to announce that the book I’ve co-authored with Microsoft’s Edie Adams on designing for emotion and personality is available on Amazon."
"Drawing on our combined experience of over 30 years in graphic, interactive and industrial design, human factors, and product management, Design for Emotion explores the what, when, where, why and how of designing emotion and personality. We define and model emotion and personality in a way that relates directly to design practice." -Trevor van Gorp



RELATED
Affective Design Website

CFP for Special Issue of Personal and Ubiquitous Computing on Educational Interfaces, Software, and Technology (EIST)



Overview 
One of the primary goals of teaching is to prepare learners for life in the real world. In this ever-changing world of technologies such as mobile interaction, cloud computing, natural user interfaces, and gestural interfaces like the Nintendo Wii and Microsoft Kinect, people have a greater selection of tools for the task at hand. Given the potential of these new interfaces, software, and technologies as learning tools, as well as the ubiquitous application of interactive technology in formal and informal learning environments, there is a growing need to explore how next-generation technologies will impact education in the future. 


As a community of Human-Computer Interaction (HCI) and educational researchers, we need to theorize and discuss how new technologies should be integrated into the classrooms and homes of the future. In the last three years, three CHI workshops have provided a forum to discuss key issues of this sort, particularly in the context of next-generation education. The aim of this special issue of Personal and Ubiquitous Computing is to summarize the potential design challenges and perspectives on how the community should handle next-generation technologies in the education domain for both teachers and students. 

We invite authors to present position papers about potential design challenges and perspectives on how the community should handle the next generation of HCI in education. Topics of interest include but are not limited to: 

  • Gestural input, multitouch, large displays 
  • Mobile devices, response systems (clickers) 
  • Tangible, VR, AR & MR, multimodal interfaces 
  • Console gaming, 3D input devices 
  • Co-located interaction, presentations 
  • Educational pedagogy, learner-centric, child computer interaction 
  • Empirical methods, case studies 
  • Multi-display interaction 
  • Wearable educational media
Important Dates
  • Full papers due: November 9, 2012
  • Initial reviews to authors: January 18, 2013
  • Revised papers due: March 15, 2013
  • Final reviews to authors: April 26, 2013
  • Final papers due: June 14, 2013
Submission Guidelines
Submissions should be prepared according to the Word template located at the bottom of this page. All manuscripts are subject to peer review. Manuscripts must be submitted as a PDF via the EasyChair submission system. Submissions should be no more than 8,000 words in length.

Guest Editors and Contact Information
  • Syed Ishtiaque Ahmed, Cornell University
  • Quincy Brown, Bowie State University
  • Jochen Huber, Technische Universität Darmstadt
  • Si Jung “Jun” Kim, University of Central Florida
  • Lynn Marentette, Union County Public Schools, Wolfe School
  • Max Mühlhäuser, Technische Universität Darmstadt
  • Alexander Thayer, University of Washington 
  • Edward Tse, SMART Technologies

Information about the Journal of Personal and Ubiquitous Computing