Showing posts with label ACM. Show all posts

Nov 18, 2015

Highlights from UIST 2015 - Animatronics + Digital Storytelling and More!

I recently attended the 28th ACM User Interface Software and Technology Symposium (UIST2015), held in Charlotte, N.C.   Although the focus of the conference was not on educational interfaces, a good number of the presentations and demos were related to education in some way.


I've been looking at ways newer technologies can take a bigger role in education, something that is important, given the spread of STEAM (Science, Technology, Engineering, Arts and Mathematics) in K-12 settings.  The STEM/STEAM approach requires students to work collaboratively through project-based learning activities designed to support a variety of skills that they will need as university students-- and beyond.

The focus of this year's UIST Student Innovation Contest  session was storytelling and animatronics!

According to info from the UIST website: "Schools and museums are starting to offer experimental Animatronics classes where kids build furry robotic animals and use them to tell amazing stories. These courses blur traditional lines - between art and engineering, between the virtual and physical worlds - all while demonstrating the universality of creativity across disciplines. Everybody learns how to write stories, build mechanisms, program motions, provide voice performances, etc. In this way, kids see how it all works together, and it has a way of inspiring kids to see new career possibilities."

I was delighted to see so many kid-friendly interfaces.  The first one that caught my eye was "ZOOO".

The ZOOO platform supports digital storytelling through the creation and use of an electronic stage, computer-assisted animated puppets, and an easy-to-use visual programming environment.  ZOOO would appeal to teachers who are implementing project-based STEAM activities with their students.

ZOOO poster, UIST 2015
ZOOO programming interface
I liked the visual design of the programming interface, which can be accessed through a touch screen.  ZOOO was created by a team of students in the Interdisciplinary Program of Information Art & Design, Academy of Arts & Design, Tsinghua University, China.

Another demo that caught my eye was Show-Flow. Show-Flow is a software tool designed to create a non-linear animatronic puppet show.  In the following video, the story is entitled "Zee's Mystery": 


The Show-Flow software was created by Nurit Kirshenbaum, and is available for download on GitHub.

So where is all of this headed?

In today's K-12 classrooms following the Common Core Standards, STEM/STEAM subjects are often integrated into the English/Language Arts curriculum.  Reading instruction is very important, but new ways of teaching and learning are taking hold, such as project-based learning.  Projects are interdisciplinary in nature and require careful planning on the part of teachers -- and also require plenty of space for students to communicate, collaborate, brainstorm, design, and share ideas.

In some schools, teachers are finding that laptops, productivity software, and multimedia resources on the web are good for supporting project-based learning activities, but there are limitations. Teachers can't always see what is going on with the students, and laptop screens don't allow for more than two children, at best, to easily see content and collaborate.

What is exciting is that many schools are thinking beyond 2015.  Tired of the cost of replacing projector bulbs and "touchy" interactive whiteboards, some schools are embracing solutions that are likely to support authentic collaboration -- for example, the picture below shows young children on a Clear Touch display running NUITEQ's Snowflake MultiTeach software.  


Picture of children interacting on a Clear Touch multi-touch table and NUITEQ Snowflake MultiTeach Software.
In a classroom setting, NUITEQ's Snowflake MultiTeach software can support students working individually or in collaborative groups.  Teachers can create a variety of game-like lessons using the versatile Lessons App.    With the multi-touch, multi-user  Clear Touch display, teachers can orient the screen to meet the needs of the students - upright in an interactive presentation mode, at an angle for design and brainstorming activities, or horizontally, in a table mode.  

All of this is great for supporting science lab activities-- and also student collaboration and planning for STEAM projects. 

I'll be posting more about UIST 2015 highlights soon!

REFERENCES/RESOURCES

UIST 2015
Conference Website

ZOOO
毛成鹏 Chengpeng Mao
清华大学美术学院 信息艺术设计系
Interdisciplinary Program of Information Art & Design, Academy of Arts & Design, Tsinghua University
Mobile  (+86)155-011-21372
Beijing• China  100084
Email  adrainmao@gmail.com

SHOW-FLOW
Nurit Kirshenbaum

NUITEQ 
MultiTeach Pinterest
MultiTeach Lessons Community

CLEAR TOUCH
Clear Touch Interactive Experience




Disclosure: I am a K-12 Technology Specialist with NUITEQ.

Sep 30, 2015

Looking forward to attending UIST in Charlotte NC!

I am happy to share that I will be attending UIST 2015, the 28th ACM User Interface Software and Technology Symposium, November 8-11. It will be held in my hometown, Charlotte, N.C.

I look forward to sharing what I learn with my colleagues at NUITEQ and with educators who are interested in learning more about the technologies that have the potential to impact the future.

According to the conference website, "The ACM Symposium on User Interface Software and Technology (UIST) is the premier forum for innovations in human-computer interfaces. Sponsored by ACM special interest groups on computer-human interaction (SIGCHI) and computer graphics (SIGGRAPH), UIST brings together people from diverse areas including graphical & web user interfaces, tangible & ubiquitous computing, virtual & augmented reality, multimedia, new input & output devices, and CSCW. The intimate size and intensive program make UIST an ideal opportunity to exchange research results and ideas."

I am excited to see what new technologies are making their way into K-12 educational settings -- for example, the following video from the UIST 2015 conference website highlights software and hardware innovations for animatronics. High school students work on creative projects that support storytelling, while learning about technology and engineering concepts.





There are many opportunities for young children to gain a range of skills.  In the Charlotte-Mecklenburg Schools,  Oakhurst Elementary School was renovated and re-envisioned as a STEAM magnet program. 

What is STEAM?   

Science, Technology, Engineering, Arts, and Math.

The following video provides an overview of the exciting things happening at Oakhurst:
Jan 22, 2014

Curt "World is Open" Bonk is on SlideShare! - Plus Emerging Interactive Tech Links

If you follow trends in technology and education, you've probably come across the work of Curt Bonk, an instructional systems technology professor at Indiana University and an adjunct in the School of Informatics. His book, The World is Open: How Web Technology is Revolutionizing Education, came out in 2011 and is still a must-read in the field.

Zipping through my FB feed today, I learned that Curt Bonk once mentioned that people follow him on Pinterest and SlideShare, even though he had nothing there.

Things change.  

Although he's much more fun in person, Curt's slides provide historical overviews that put things in perspective, and in a few minutes, you'll know a bit more of something that you didn't know you didn't know.  

If you have a few more minutes, take a look at the links below about emerging interactive technologies.

Enjoy!




EMERGING INTERACTIVE TECH LINKS

How Do You Feel?  Your Computer Knows
Tom Geller, Communications of the ACM, 1/14
Brain Games: Move Objects With Your Mind To Find Inner Calm?
Amy Standen, All Tech Considered, 1/21/14
Coming soon:  Control your computer with your brain via open source
Serdar Yegulalp, InfoWorld, 1/13/14
Touchless Interaction in Surgery
Communications of the ACM, 1/14
Why Wearable Devices Will Never Be As Disruptive As Smartphones
Kevin McCullagh, Fast Company, 1/21/14
Apple envisions trackpad with sensors instead of click buttons
Lance Whitney, CNET, 1/21/14
Robots test their own world wide web, dubbed RoboEarth
BBC, 1/14/14
Seeing things: A new transparent display system could provide heads-up data
David L. Chandler, MIT News, 1/21/14

I'll be writing a few blog posts about future tech and interactive technology news soon.

Jun 6, 2013

Interactive Displays and "Billboards" in Public Spaces; Pervasive Displays 2013

The 2013 International Symposium on Pervasive Displays (PerDis 2013), recently convened  in Mountain View, California.  Since I couldn't attend this conference, I was happy to learn from Albrecht Schmidt that the conference proceedings were recently uploaded to the ACM Digital library.  There are many exciting things going on in this interdisciplinary field!

Researchers involved with the Instant Places project, described in the video below, presented their work at PerDis 2013. The Instant Places project was part of PD-Net, a series of research efforts exploring the future of pervasive display networks in Europe. (See the "Related" section for additional references and links.)


Instant Places: Tools and Practices for Situated Publication in Display Networks

Below is information from the Instant Places video and website:
"The video describes a novel screen media system that explores new practices for individual publication and identity projection in public digital displays." 

"Instant Places has been developed by the Ubicomp group of the Information Systems Department, at the University of Minho, and has been funded within the scope of pd-net: Towards Future Pervasive Display Networks, by the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 244011."

Saul Greenberg was the keynote speaker at PerDis 2013.  His keynote, "Proxemic Interactions: Displays and Devices that Respond to Social Distance", highlights how far off-the-desktop our digital/physical lives have become, and how this has influenced recent research in human-computer interaction. Saul is a professor at the University of Calgary and leads research in Human Computer Interaction, Computer Supported Cooperative Work, and Ubiquitous Computing.

Although the video of Saul Greenberg's presentation below is not from PerDis 2013, it touches on the same topics and is worth taking an hour to watch.  In this video, Greenberg presents an overview of the history of human-computer interaction. He also offers a discussion of how an understanding of social theory, the perception of spatial relationships, and embodied interaction can be applied to the design of natural user interfaces and interactive systems.  Useful examples of interaction design explorations, within an ecological context, are provided later in the video.

Proxemic Interactions: the New Ubicomp?




RELATED


My Backstory
Regular readers of this blog know that the subject of interactive displays in public spaces holds my interest. When I was taking computer courses during the mid-2000s, I focused some of my energy on projects designed for large interactive displays, inspired by reading articles like "Physically Large Displays Improve Performance on Spatial Tasks" (Desney S. Tan, Darren Gergle, Peter Scupelli, and Randy Pausch) and "Dynamo: public interactive surface supporting the cooperative sharing and exchange of media" (Shahram Izadi, Harry Brignull, Tom Rodden, Yvonne Rogers, Mia Underwood).

Jeff Han's 2006 TED talk was another inspiration. I remember my excitement as I watched his demonstration of an interactive multi-touch screen the size of a drafting board, before the iPhone and iPad were born.  Another inspiration was Hans Rosling's TED talk about health statistics, with his animated interactive data visualizations presented on a huge screen.

The following year, I stumbled upon the NUI Group while searching for information about multi-touch displays, and was inspired by many of the early members of the group.  I also became acquainted with a world-wide network of people who share similar interests, such as Albrecht Schmidt and his team of researchers at the University of Stuttgart. This busy group recently presented at PerDis 2013 and at CHI 2013 and is involved in a wide range of ongoing projects.

INTERACTIVE DISPLAYS
Alt, F., Sahami, A., Kubitza, T., Schmidt, A. Interaction Techniques for Creating and Exchanging Content with Public Displays. Proceedings of the 2013 ACM Annual Conference on Human Factors in Computing Systems
Hinrichs, U., Carpendale, S., Valkanova, N., Kuikkaniemi, K., Jacucci, G., Vande Moere, A. Interactive Public Displays. IEEE Computer Graphics and Applications, 33(2), 2013, 25-27
PerDis 2013 Program
Sample Papers:
Otero, N., Muller, M., Alissandrakis, A., and Milrad, M. Exploring video-based interactions around digital public displays to foster curiosity about science in the schools. PerDis 2013 (pdf)
Alt, F., Schneegass, S., Girgis, M., Schmidt, A. Cognitive Effects of Interactive Public Display Applications. Proceedings of the 2nd ACM International Symposium on Pervasive Displays. 2013
Langheinrich, M., Schmidt, A., Davies, N., and José, R. A practical framework for ethics: the PD-net approach to supporting ethics compliance in public display studies. Proceedings of the 2nd ACM International Symposium on Pervasive Displays. 2013, 139-143

Note:  Members of ACM have access to all of the proceedings of PerDis 2013 in the ACM Digital Library. Non-members have access to the abstracts.

PD-NET
PD-Net 
PD-NET Publications - a great reference list, with links to many papers
Reading List on Pervasive Public Displays
About Instant Places
About the Living Lab for Screens Set

DOOH-DIGITAL OUT-OF-HOME
Daily Digital Out of Home post "Billboards That Look Back": Could miniature cameras embedded in ads lead to Big Brother at the mall? The World Is My Interactive Interface, 5/28/08
J. Müller et al., "Looking Glass: A Field Study on Noticing Interactivity on a Shop Window," Proc. 2012 SIGCHI Conf. Human Factors in Computing Systems (CHI 12), ACM, 2012, pp. 297–306
Michelis, D., Meckel, M. Why Do We Want to Interact With Electronic Billboards in Public Space?  First Workshop on Pervasive Advertising, Pervasive 2009, 5/11/09
The Rage of Interactive Billboards
The Print Innovator, 11/28/12
10 Brilliant Interactive Billboards (Videos)
Amy-Mae Elliot, Mashable, 8/21/11


SOME INTERESTING EARLIER WORK
Jeff Han's 2006 TED Talk (This is worth revisiting, as it came out before the iPhone, iPad, etc.)


Tan, D.S., Gergle, D., Scupelli, P., Pausch, R. Physically large displays improve performance on spatial tasks. ACM Transactions on Computer-Human Interaction, 13(1), 2006, 71-99

Revisiting promising projects: Dynamo, an application for sharing information on large interactive displays in public spaces (blog post)
Lynn Marentette, Interactive Multimedia Technology, 09/16/07

Brignull, H., Izadi, S., Fitzpatrick, G., Rogers, Y., Rodden,  T. The introduction of a shared interactive surface into a communal space. Proceedings of the 2004 ACM conference on Computer supported cooperative work (CSCW'04), Chicago, ACM Press, 2004 (pdf)


Izadi, S., Brignull, H., Rodden, T., Rogers, Y. and Underwood, M. Dynamo: public interactive surface supporting the cooperative sharing and exchange of media. In Proc. ACM Symposium on User Interface Software and Technology (UIST '03), Vancouver, ACM Press, 2003, 159-168. (pdf)

Proxemics (Wikipedia)




Dec 5, 2012

Augmented Human Conference '13 (ACM CHI) March 7th and 8th; CFP paper deadline Jan 8, 2013

Looks like a fascinating conference!

ACM SIGCHI 4th Augmented Human International Conference

Call for Papers
The 4th Augmented Human (AH) International Conference, in cooperation with ACM SIGCHI, will be held in Stuttgart, Germany, on March 7–8, 2013, focusing on augmenting human capabilities through technology for increased well-being and an enjoyable human experience.

As in previous years, the conference proceedings will be published in the ACM Digital Library as a volume in its International Conference Proceedings Series with ISBN. 

Topics 
  • Wearable Computing and Ubiquitous Computing 
  • Bionics, Biomechanics, and Exoskeletons 
  • Brain-Computer Interfaces, Muscle Interfaces, Implanted Interfaces 
  • Sensors and Hardware 
  • Smart Artifacts and Smart Textiles 
  • Augmented Sport, Health, & Well-being, Training/Rehabilitation Technology 
  • Augmented and Mixed Reality, Tourism and Games and Context-Awareness 
  • Augmented Fashion and Art 
  • Trust, Privacy, and Security of Augmented Human Technology 
PROGRAM COMMITTEE

Submission Categories for Papers 

Full papers 8 pages, anonymized, 30 minutes presentation 
Short papers 4 pages, anonymized, 15 minutes presentation 
Demonstration papers 2 pages, anonymized, demonstration at conference 
Poster papers 2 pages, anonymized, presented at conference 
Art pieces 1–2 pages, not published, exhibited at conference 

The four paper categories will be published in the ACM digital library and follow the ACM paper format. We encourage authors to submit supporting video material in addition to the PDF submission. 

Important Dates 
  • January 8, 2013      paper submission deadline 
  • February 5, 2013    author notification 
  • February 12, 2013  camera-ready and ACM copyright form due 
  • March 7–8, 2013    scientific conference in Stuttgart 

Art and Exhibition 

Augmented Human 2013 will feature contributions by art researchers and practitioners. Artists participating and exhibiting at Augmented Human 2013 will have to be self-funded to attend the conference. 

All art pieces will be included in a video to be published on the Augmented Human YouTube channel. Additionally, an exhibition catalog will be published on the Augmented Human website, including full-page pictures and the descriptions provided by the authors. 

Submission of Art Pieces 

Authors may choose the format to present their art pieces for submission. The submission should include the requirements of space, light, electricity, and equipment.

Organizing Committee

General Chair:  Albrecht Schmidt, University of Stuttgart, Germany
Program Co-Chairs: Andreas Bulling, University of Cambridge, UK; Christian Holz, Hasso Plattner Institute, Germany

Jan 14, 2012

You Know You've Secretly Wanted to Learn To Code! (Info, links, video!)

You know you've secretly wanted to learn to code. Just do it! 


"Make your New Year's resolution learning to code. Sign up on Code Year to get a new interactive programming lesson sent to you each week and you'll be building apps and web sites before you know it." -Code Year  http://codeyear.com/


BTW, coding skills are needed beyond the world of apps and websites. Take a look at some of the posts and links on this blog - your imagination just might be sparked!  If you already know how to code, why not commit to learning something new?  


COMMENT:  There is much more to coding than what you'll learn through Codecademy's Code Year program. If you are serious about learning more about coding, computer science, and software systems, take an introductory course at your local community college or university extension program, preferably with a friend. Ask the instructor if some of the assignments can be done using the "pair programming" technique.  It is more fun and social than the traditional way of coding!


Of all the textbooks, videos, and coding/programming self-help books I've come across, the series that has made the most sense to me is the brain-friendly "Head First" series from O'Reilly.  It explains things well for beginners. Although it contains text and code, there are many pictures, diagrams, and humorous visual representations of basic concepts that are much more engaging than traditional "learn-to-code" tomes.


It is not too late to learn to code! 
There are so many great resources available to us now, in 2012, that there is no excuse to ignore your inner geek.  If some (or all) of your hair is grey, why do crossword puzzles or Sudoku when you can be creative with code?


I took my first programming class about 8 years ago, when my youngest daughter was in high school.  It was daunting at first, because the textbook was dry, the programming labs were tedious, and some of my mostly-male classmates already knew how to code.  In my case, I was motivated to learn to code because I wanted to create games at the time, and this got me over the hump.  I soon learned that coding is both a science and an art, and learning to code opens up a whole new way of thinking. (See the video of Jeannette Wing's presentation about computational thinking, at the end of this post.)


RELATED
Why your 2012 New Year's Resolution Should Be Learning to Code
Sarah J., SPOTLIGHT on Digital Media and Learning Blog 1/9/12
PLAYBACK:  Pedagogy, Coding and Teaching Kids to Think Deeply
Sarah J., SPOTLIGHT on Digital Media and Learning 1/13/12
Center for Computational Thinking
Computational Thinking (pdf) (Jeannette Wing)
Codecademy
http://codeyear.com/
Head First Labs
Coding4Fun
Coding4Fun Blog
Coding4Fun KinectToolkit
Kinect for Windows SDK
CSTA:  ACM K-12 Computer Science Model Curriculum, 2nd Edition
CS Model Curriculum, 2nd Edition (pdf)
Jeannette M. Wing's Vision: "Computational thinking will be a fundamental skill used by everyone in the world by the middle of the 21st Century"

For 2012, my goal is to brush up on my previously learned coding skills and learn a few new ones related to the Kinect. I also want to become comfortable with HTML5.


Dec 12, 2011

UPDATE POST: Educational Interfaces, Software, and Technology: 2012 ACM-CHI Workshop Call for Papers/Presentations

There is still time left to submit your paper!


CALL FOR PAPERS
EDUCATIONAL INTERFACES, SOFTWARE, AND TECHNOLOGY 2012
3rd Workshop on UI Technologies and Educational Pedagogy
May 5-6 2012
in conjunction with ACM-CHI 2012, Austin, Texas

This will be our third annual workshop in conjunction with CHI 2012.



One of the primary goals of teaching is to prepare learners for life in the real world. In this ever changing world of technologies such as mobile interaction, cloud computing, natural user interfaces, and gestural interfaces like the Nintendo Wii and Microsoft Kinect, people have a greater selection of tools for the task at hand. Teachers and students can leverage these tools to improve learning outcomes. Educational interfaces and software are needed to ensure that new technologies serve a clear purpose in the classrooms and homes of the future.



Since teachers are always looking for creative ways to engage 21st century learners, there needs to be an academic venue for researchers to discuss novel educational tools and their role in improving learning outcomes. This workshop aims to fill this void: combining the pedagogical expertise of the cooperative learning and learning sciences communities with the technical creativity of the CHI, UIST, and interactive surfaces communities. The objective of this workshop is to become a conference within two years.


We invite authors to present position papers about potential design challenges and perspectives on how the community should handle the next generation of HCI in education. 




Topics of interest include:

  • Gestural input, multitouch, large displays
  • Mobile Devices, response systems (clickers)
  • Tangible, VR, AR & MR, Multimodal interfaces
  • Console gaming, 3D input devices
  • Co-located interaction, presentations
  • Educational Pedagogy, learner-centric, Child Computer Interaction
  • Empirical methods, case studies
  • Multi-display interaction
  • Wearable educational media
Submission:  The deadline for workshop paper submissions is Dec 20, 2011. Interested researchers should submit a 4-page position paper in the ACM CHI adjunct proceedings style to the workshop management system. Acceptance notifications will be sent out February 20, 2012. The workshop will be held May 5-6, 2012 in Austin, Texas. Please note that at least one author of an accepted position paper must register for the workshop and for one or more days of the CHI 2012 conference.

Website: http://smarttech.com/eist2012
Contact: Edward Tse, SMART Technologies, edwardtse@smarttech.com


RELATED
Educational Interfaces, Software, and Technology Workshop Organizers
Edward Tse, SMART Technologies
Lynn V. Marentette, Union County Public Schools
Syed Ishtiaque Ahmed, Cornell University
Alex Thayer, University of Washington
Jochen Huber, Technische Universität Darmstadt
Max Mühlhäuser, Technische Universität Darmstadt
Si Jung "Jun" Kim, University of Central Florida
Quincy Brown, Bowie State University

Apr 3, 2011

INTERACTIVE TABLETOPS AND SURFACES 2011 CALL FOR PAPERS AND NOTES (6th Annual ACM ITS 2011 - Nov. 13-16)

Thanks to Johannes Schöning for sharing this information!

CALL FOR PAPERS AND NOTES
6th Annual ACM Conference on Interactive Tabletops and Surfaces 2011
ITS 2011
November 13-16, 2011
Portopia Hotel, Kobe, Japan
The Interactive Tabletops and Surfaces 2011 Conference (ITS) is a premier venue for presenting research in the design and use of new and emerging tabletop and interactive surface technologies. As a new community, we embrace the growth of the discipline in a wide variety of areas, including innovations in ITS hardware, software, design, and projects expanding our understanding of the design considerations of ITS technologies and of their applications.

Building on its success in previous years, ITS again features Papers and Notes presentations, as well as tutorials, posters, demonstration tracks, and a doctoral symposium. ITS 2011 will also include workshops.

ITS 2011 will bring together top researchers and practitioners who are interested in both the technical and human aspects of ITS technology. On behalf of the conference organizing committee, we invite you to begin planning your submissions and participation for this year's conference!

CALL FOR PARTICIPATION: PAPERS AND NOTES
The use of interactive surfaces is an exciting and emerging research area. Display technologies, such as projectors, LCD and OLED flat panels, and even flexible display substrates, coupled with input sensors capable of enabling direct interaction, make it reasonable to envision a not-so-distant future in which many of the common surfaces in our environment will function as digital interactive displays. ITS brings together researchers and practitioners from a variety of backgrounds and interests, such as camera and projector based systems, new display technologies, multi-touch sensing, user interface technologies, augmented reality, computer vision, multimodal interaction, novel input and sensing technologies, computer supported cooperative work (CSCW), and information visualization.

The intimate size of this single-track symposium provides an ideal venue for leading researchers and practitioners to exchange research results and experiences. We encourage submissions on (but not limited to) the following topic areas as they relate to interactive tabletops and surfaces:

* Gesture-based interfaces
* Multi-modal interfaces
* Tangible interfaces
* Novel interaction techniques
* Data handling/exchange on large interactive surfaces
* Data presentation on large interactive surfaces
* Software engineering methods
* Computer supported collaborative work
* Middleware and network support
* Virtual reality and augmented reality
* Social protocols
* Information visualizations
* Hardware, including sensing and input technologies with novel capabilities
* Human-centered design & methodologies
* Applications

CONFERENCE COMMITTEE
Conference Co-Chairs
Jun Rekimoto, The University of Tokyo / SonyCSL, Japan
Hideki Koike, University of Electro-Communications, Japan
Kentaro Fukuchi, Meiji University, Japan

Program Co-Chairs
Yoshifumi Kitamura, Tohoku University, Japan
Daniel Wigdor, University of Toronto, Canada

SUBMISSIONS
Papers/Notes: We invite paper submissions of two kinds: Papers (10 pages) and Notes (4 pages). Papers must present original, highly innovative, prospective and forward-looking research, possibly in one or more of the themes given above. Notes must also report novel and complete research, but where the scope and scale of the contribution is more focused and succinct than papers. Submissions must be submitted as a single PDF file in the ACM format through the submission system. A template for submissions can be found on the ITS 2011 website.

All accepted submissions will be presented at ITS 2011 and appear in the ITS digital proceedings and be archived in the ACM digital library.

IMPORTANT DATES
Paper/Note Submissions: June 30, 2011
Paper/Note Author Notifications: September 2, 2011
Paper/Note Camera-Ready Deadline: October 21, 2011

The calls for the posters, demos, tutorials, workshops, and the doctoral symposium will be published soon; please check our website for further information.

RELATED
ITS 2010 Awards

Best Paper: Proxemic Interaction: Designing for a Proximity and Orientation-Aware Environment by Till Ballendat, Nicolai Marquardt, Saul Greenberg
Best Note: MudPad: Tactile Feedback and Haptic Texture Overlay for Touch Surfaces by Yvonne Jansen, Thorsten Karrer, Jan Borchers
Best Poster: A Multi-Touch Alignment Guide for Interactive Displays by Mathias Frisch, Ricardo Langner, Sebastian Kleinau, Raimund Dachselt
Best Demo: MudPad - A Tactile Memory Game by Yvonne Jansen, Thorsten Karrer, Jan Borchers




Who's Who
ITS 2010 Conference General Chairs
Johannes Schöning, DFKI GmbH
Antonio Krüger, DFKI GmbH 

ITS 2010 Organization Committee

Daniel Wigdor, one of the ITS 2011 Program Co-Chairs, co-authored the following book with Dennis Wixon. The book has not yet been released.
 
Brave NUI World: Designing User Interfaces for Touch and Gesture



Note: This post includes many links; please let me know of anything I should correct.

Feb 24, 2011

Vision-Based Hand-Gesture Applications: Video from Communications of the ACM



The latest edition of Communications of the ACM, via "snail mail", was the inspiration for this post:



Vision-Based Hand-Gesture Applications
Juan Pablo Wachs, Mathias Kolsch, Helman Stern and Yael Edan

"Body posture and finger pointing are a natural modality for human-machine interaction, but first the system must know what it's seeing."


More to come!

Feb 3, 2011

Interactions (ACM) Cover Article- "Proxemic Interactions: The New Ubicomp?" - Plus - Close encounters with displays at the airport and JC Penney

Cover Image

Today I came home and found one of my favorite publications, Interactions, on my kitchen counter.  Imagine my excitement when I saw that the cover article for the Jan-Feb issue was "Proxemic Interactions: The New Ubicomp?" (pdf).  The authors of this article are Saul Greenberg, Nicolai Marquardt, Till Ballendat, Rob Diaz-Marino, and Miaosen Wang, from the University of Calgary. (A list of some of the articles from the current issue of Interactions can be found at the end of this post.)


On the topic of proxemic interactions... I welcome the research that is germinating in this area, and agree with the authors of Interactions' cover article that there is much work to be done in this space!


I've been searching for great examples of interactive displays in public spaces. Since there is much room for improvement in this field, much of what I share on this blog is a bit disheartening. Most of my user-unfriendly encounters with large displays happen when I'm in an airport, mall, or health care facility, or on a cruise ship. Things are improving, but at a slow pace. The best work I've seen so far has been in museums.


Info display at the Orlando airport with a QR tag.

A few weeks ago I wrote a post about my nearly comical interaction with a large display that featured a QR tag that simply would NOT cooperate with my smartphone's tag reader. What were they thinking when they decided that the best place for a QR tag on a large display should be at knee level?
Interactive Display with QR Tag:  Close Encounter at the Orlando Airport

There is more to this story! On my way home, I had a few hours to kill at the Orlando airport. I stopped by a convenience store and discovered the non-static display below. I hadn't noticed the display the first time I was in the store that day.


I spotted another QR tag!

There were a few problems with this set-up. First, the display was located on a wall close to the ceiling, making it very inconvenient to scan with my phone. The page with the QR tag didn't stay up long enough for me to attempt to scan it, although I waited around until it cycled back into view - three times. I gave up and took a picture instead, standing on my tip-toes with my arms extended over my head, the same position required to scan the QR tag, if it would only stay put!


Another problem was that the display was perched above a high-traffic area, right above the refrigerated drinks. The only place to scan the QR tag was in a narrow aisle, behind the throngs of thirsty travelers making a bee-line to the cold drink section of the store. I was in the way of a number of people in that bee-line.


What were they thinking when they decided that the best place for this large display featuring a QR tag should be several feet above eye-level in this particular location?

The displays in front of the store were located on either side of the doorway, at "body" level. This is a logical place to put a display.  Potential customers can view the displays without getting in the way of other travelers.


I didn't notice QR tags on these displays.  QR tags linking to web-based reviews of books I'd like to purchase would be useful in this scenario.


Non-static displays outside of the store:





PART II: Interactive Touch Screen Display Encounter at JC Penney

Interactive displays are cropping up everywhere. The display featured below is part of the Find More at JC Penney campaign. The following is a quote from a JC Penney press release:

"JC Penney has announced the launch of a new mobile initiative that connects mobile commerce with social and instore media applications that include location-based check-in offers and enhancements to its mobile apps. JCPenney's mobile initiatives are part of the company's investment in its overall digital platform, which also includes jcp.com, social media, and its in-store FindMore™ smart fixture."


"The smart fixture integrates JCPenney's online and offline stores by providing customers with a 52-inch interactive touch screen experience that allows them to view and purchase items from jcp.com while in a JCPenney store. Customers can also scan a bar code to get product details, access recommendations, or email data about an item of interest to themselves or friends."

Can you find the large interactive touch screen display? After shopping around in the mall, I decided to go to JC Penney and check out the sales. Below is what I passed as I entered the store. I went up and down the main aisle a few times before I noticed a large interactive display along a side aisle.



The stack of red holiday boxes obstructed my view:



I didn't notice the display the first few times I walked in the other direction. From the back, the display looked like a poster. 
  
When I finally discovered the display, I was impressed with the slick graphics and stylish models. 



I wasn't so impressed by the placement of the display. To interact with the screen, I had to stand in a narrow aisle, in the path of a steady stream of slow-walking shoppers, often moving in small groups, including parents with babies in strollers. 


I wasn't impressed by the confusing array of input methods on the display, either. I noticed that the light blue square in the lower left-hand corner of the display is designed to provide access for people with disabilities, but I couldn't figure out how to use it.

Accessibility:  There is a need for research in this area!

The accessibility feature was confusing to use. The virtual trackball activated a cursor on the screen, but the cursor did not accurately track the virtual trackball's movements. This section of the screen was located in the lower left-hand corner, which would be convenient for someone in a wheelchair to use. However, the placement of the "accessibility window" was too low for someone who might want to interact with the screen in an accessible manner but doesn't happen to need a wheelchair.


I shot video of my interaction with the screen, but found that I was in the way of shoppers, so I stopped. Not to worry. Below is a video taken by Mike Cearly, shared on his 11th Screen blog post, "Out and About: JC Penney's "Find More" Touch Screen". (Mike's impressions of this display were similar to mine and are worth reading if this is an area that interests you.)




Mike uses his 11th Screen Scorecard to evaluate the technology he encounters in public spaces. I am updating a similar rubric to use when evaluating displays in public spaces. It is important to look at multiple dimensions, such as the physical placement of the display, weather and lighting conditions, how the display interoperates with mobile devices or a system of displays, how the content is presented and accessed across channels and devices, and how users share the experience with others.

Components of the 11th Screen Scorecard - Mike Cearly
Purpose: "What is the purpose of the solution? Is it to drive awareness? Acquisition?  Loyalty?  What is the brand trying to accomplish in this medium?"
Drama: "Does the solution make a big impact on the user? Does it make them stop and interact?"
Usability: "Can the user navigate through the experience with ease? Are the paths to information intuitive?  There's also an element of functions, too, but I think that is much more subjective.  Do the functions enhance the user experience?"
Interactivity:  "How does the user interact with it?  Is it gesture based? Is it touch-based? Can the user interact with it through any other enabling technology?"
Information: "How much and what kind of content is available for the user to interact with?  Generally speaking, the more information and the different formats of information, the better."
Personalization: "What level of personalization does the experience provide?"


COMMENT
I've been obsessed with large displays and the amazing potential they hold for supporting all sorts of interactions - collaboration, planning, creating, information sharing, playing, viewing, communicating, learning, shopping, way-finding, and artistic expression.

I'm also drawn to smaller screens and how we use our mobile devices - smartphones, iPhones, e-readers, digital cameras, etc. - to interact with the displays we encounter in public spaces. We are just beginning to figure out how technology can effectively - and seamlessly - support interaction between people who are co-located as well as those located in different places.

INTERACTIONS Volume XVIII.1 
Here is a list of some articles from the current issue of Interactions:
The Cloud (Yue Pan, Eli Blevis)