Showing posts with label UI. Show all posts

Mar 29, 2011

Program- CHI 2011 Child Computer Interaction: 2nd Workshop on UI Technologies and Educational Pedagogy

CHILD COMPUTER INTERACTION: 2ND WORKSHOP ON UI TECHNOLOGIES AND THEIR IMPACT ON EDUCATIONAL PEDAGOGY
May 7, 2011
In conjunction with CHI 2011, Vancouver, Canada
PROGRAM (Draft)



9:00am – 9:05am       Introduction


9:05am - 10:05am        Paper Session I (8min presentation, 4min Discussion)


Evaluating Pervasive Classrooms


• Son Do-Lenh et al.:   Classroom-experience evaluation: An ecological approach to evaluating pervasive technologies in classroom
• Kathryn Rounding et al.:   Evaluating Interfaces with Children
• Dan Morris et al.:   Using Machine Learning to Scaffold Artistic Creativity in Education
• Tamara Polajnar et al.:   Enabling Picture-based Querying and Learning with the JuSe Interface
• Hamed Alavi et al.:   Lantern 2.0: A Pervasive Scripting Tool


10:05am - 11:00am  Coffee Break and Preparation time for short talks


11:00am - 12:10pm  Paper Session II (8min presentation, 4min Discussion)


Collaboration in Math: Fears, Myths, and Insights


• Nicola Yuill et al.: Pass the iPad: Comparing collaboration on paper and screen
• Veronika Irvine et al.: Math Education: A Creative Approach


Group talk: Math and Embodied Agents
a.  Kristina Richter et al.: Bridging Diagnosis and Learning for Mathematics Education in Classroom Setting
b.  Lisa Anthony et al.: Handwriting Interaction for Math Tutors: Lessons for HCI in Education
c.  Andrew Jensen et al.: Using Embodied Pedagogical Agents and Direct Instruction to Augment Learning for Young Children with Special Needs


Group talk: Dispelling Myths About the Next Generation Classroom
a.   Alex Thayer et al.: The Myth of the Paperless School: Replacing Printed Texts with E-readers
b.  Sharon Oviatt: Designing Digital Tools for Thinking, Adaptive Learning and Cognitive Evolution
c.  Alexandra Dunn et al.: Designing Classroom Technology to Meet the Needs of All


Group talk: Games, Wearables and Fun Learning
a.  Christiane Moser et al.:    Child-Centered Game Development
b.  Lizbeth Escobedo et al.:  Blue’s Clues: An Augmented Reality Positioning System
c.  Si Jung Kim et al.:   Wearable Story: A Story Telling Jacket for Young Children to Improve Their Independent Physical and Learning Activities


12:10pm - 1:30pm Lunch


1:30pm - 2:30pm Paper Session III (10min presentation, 5min Discussion)


Tangibles and Tabletops
•   Izdihar Jamil et al.: Communication Patterns in Collaborative Peer Learning around Interactive Table
•   Aura Pon et al.: Graspable Music and Vuzik: Music Learning and Creativity using an Interactive Surface
•   Ahmed Kharrufa et al.: Learning at interactive surfaces and designing for reflection
•   Cristina Sylla et al.: TOK – a Tangible Interface for Storytelling


Group talk: Tables and Tangibles
a.  Robert Sheehan: Constructionism, Programming and Touch Interfaces
b.  Orit Shaer et al.: Making the Invisible Tangible: Learning Biological Engineering in Kindergarten
c.  Sebastien Kubicki: Evaluation of an interactive table with tangible objects: Application with children in a classroom
d.  Cristina Emilia Costa et al.: I-Theatre: developing narratives skills in kindergarten children
e.  Wooi Boon Goh et al.: Potential Challenges in Collaborative Game Design for Inclusive Settings


2:30pm - 3:00pm  Coffee Break


3:00pm - 4:00pm  Breakout Discussion (What are future challenges? Next year's workshop?)


4:00pm - 4:45pm  Closing Keynote  TBA


4:45pm - 5:00pm  Closing Remarks


7:00pm   Optional Workshop Dinner


RELATED
Workshop Website
ACM-CHI 2011 Conference Website



Jan 12, 2011

Child-Computer Interaction Workshop on May 7th at CHI 2011: Deadline Extended to February 18th!

The deadline for submissions to the CHI 2011 Child Computer Interaction: 2nd Workshop on UI Technologies and their Impact on Educational Pedagogy has been extended to February 18, 2011.

The workshop will be held on May 7, 2011 in Vancouver, Canada, in conjunction with ACM's CHI 2011 conference. Interested researchers should submit a 4-page position paper in the ACM CHI adjunct proceedings style to the workshop management system. Note that the best 3 papers will be pre-accepted for an upcoming theme issue of the Springer journal Personal and Ubiquitous Computing on “Child Computer Interaction”.
IMPORTANT DATES
Submission Deadline: February 18th, 2011
Author Notification: March 15th, 2011
Final Version Deadline: April 1st, 2011 
Workshop Date: May 7th, 2011

DETAILS 
2011 Workshop Website
Call for Papers
Workshop Management System

Note:  Deadlines for several other CHI 2011 workshops have also been extended to February 18th. (Pictures and links from the CHI 2010 workshop are located at the end of this post.)


CALL FOR PAPERS
IN CONJUNCTION WITH CHI 2011
Child Computer Interaction: Workshop on UI Technologies and Educational Pedagogy
in conjunction with CHI 2011, Vancouver
May 7th, 2011
Topic: Given the emergence of Child Computer Interaction and the ubiquitous application of interactive technology as an educational tool, there is a need to explore how next generation HCI will impact education in the future. Educators are depending on the interaction communities to deliver technologies that will improve and adapt learning to an ever-changing world. In addition to novel UI concepts, the HCI community needs to examine how these concepts can be matched to contemporary paradigms in educational pedagogy. The classroom is a challenging environment for evaluation, so new techniques need to be established to prove the value of new HCI interactions in the educational space. This workshop provides a forum to discuss key HCI issues facing next generation education.

We invite authors to present position papers about potential design challenges and perspectives on how the community should handle the next generation of HCI in education. Topics of interest include:
• Gestural input, multitouch, large displays, multi-display interaction, response systems
• Mobile Devices/mobile & pervasive learning
• Tangible, VR, AR & MR, Multimodal interfaces, universal design, accessibility
• Console gaming, 3D input devices, 3D displays
• Co-located interaction, presentations, tele-presence, interactive video
• Child Computer Interaction, Educational Pedagogy, learner-centric, adaptive “smart” applications
• Empirical methods, case studies, linking of HCI research with educational research methodology
• Usable systems to support learning and teaching: ecology of learning, anywhere, anytime (UX of cloud computing to support teaching and learning)

Submission: The deadline for workshop paper submissions is February 18, 2011. Interested researchers should submit a 4-page position paper in the ACM CHI adjunct proceedings style to the workshop management system. Acceptance notifications will be sent out March 15, 2011. The workshop will be held May 7, 2011 in Vancouver, Canada. Please note that at least one author of an accepted position paper must register for the workshop and for one or more days of the CHI 2011 conference.

The best 3 papers will be pre-accepted for an upcoming theme issue of the Springer journal Personal and Ubiquitous Computing on “Child Computer Interaction”. Other authors are also invited to submit/resubmit their work. We will send a separate call for papers for this special issue soon.
Contact: Edward Tse, SMART Technologies, edwardtse@smarttech.com

CHI 2010
NEXT GENERATION OF HCI AND EDUCATION
Join an interdisciplinary, international group of like-minded peers and colleagues.


Brainstorm with thought leaders.

After the workshop, relax, dine, and converse with interesting people.

Dec 3, 2010

Workshop on Mobile and Personal Projection: Call for Papers, CHI 2011, May 8, 2011, Vancouver, CA

I can't wait to attend CHI 2011!  There will be lots to learn about emerging technologies and interactions at the conference. Here's another call for papers/participation for a workshop session at the conference, via Markus Löchtefeld.



CALL FOR PAPERS: MP²: Workshop on Mobile and Personal Projection, a workshop to be held at CHI 2011, Vancouver, CA. May 8,  2011


Objectives

The workshop will provide an open forum to share information, results, and ideas on current research on mobile and personal projection. The participants will explain, demonstrate and discuss their current research with others in order to receive feedback, criticism and ideas for future work. Concrete selected questions, ideas and concepts will be addressed in various group sessions in which the participants will work on topics such as a design space for mobile and personal projection; user interface, interaction design and application sketches; paper prototypes; or ad-hoc studies using the provided mobile and personal projector hardware. The results of these group sessions will be discussed with all workshop participants. Finally, we will discuss future research areas, challenges and the potential for mobile and personal projection in order to lay the foundations for a research agenda in this field.

Workshop Topics

The workshop looks for contributions on the following and related topics:
  • Applications and interaction techniques for mobile and wearable projection.
  • Personal projection in augmented reality.
  • Interaction with projected interfaces.
  • Projector phones and wearable projectors.
  • Multi-user interactions and applications.
  • Multimodal and personalized (mobile) interfaces.
  • New application areas of mobile projection.
  • Social implications when interacting with projected interfaces.
  • Artistic and unusual ways to utilize mobile projection.
  • New forms of interaction with the environment.

Research Questions

Mobile and personal projection is at a relatively early stage of research. Reflecting this state, the workshop specifically addresses the following fundamental research questions:
  • What are the unique properties and affordances of mobile and personal projection? What are suitable interaction metaphors?
  • What are core application domains that benefit the most from the usage of mobile and personal projection? What are the application contexts and usage requirements that support mobile and personal projection?
  • What are suitable interaction techniques for mobile and personal projection? How can gestures be incorporated? How should visualizations be structured? How can the projected virtual and real images of objects coexist? What is the role of augmented and mixed reality?
  • What is the social impact of mobile and personal projection technologies? How can users manage privacy when using mobile and personal projectors? How does public behavior change with the introduction of mobile and personal projection technologies?
  • How can spontaneous co-located collaboration be supported by mobile and personal projection technologies? How can designs support the exchange of media items between mobile projector phones?
  • What are suitable strategies and methodologies for evaluating mobile and personal projection interfaces? What aspects impact the user experience?

Submission

We ask for papers that address one or more of the research questions mentioned above, or that describe findings that relate to these research questions based on systems the authors have built. We welcome position papers (2 pages) as well as papers reporting novel concepts, (first) prototypes, studies, applications or interaction concepts (up to 4 pages). All submissions should be prepared according to the standard HCI Archive format.
Each paper will receive at least two reviews. All accepted papers will be made available online and will be published at Sun SITE Central Europe (CEUR) Workshop Proceedings.
INFORMATION:

Mobile and personal projection interfaces are no longer fiction and have received considerable attention recently. Integrated pico-projectors in mobile and wearable devices could make mobile projection ubiquitous within the next few years. Walls, desks, floors, ceilings, t-shirts or palms will act as projection surfaces for these kinds of new devices.
These technological developments offer new opportunities and challenges for novel forms of interaction. Virtual displays can extend beyond physical device boundaries and augment existing objects. There are also new opportunities for spontaneous multi-user interaction. However, issues such as lighting conditions, privacy, and social acceptability also come into play.
We will bring together researchers and practitioners who are concerned with design, development, and implementation of new applications and services using personal mobile and wearable projectors in their user interfaces.

Important Dates

  • January 10, 2011 - Submission Deadline
  • February 4, 2011 - Acceptance Notification
  • March 11, 2011 - Revised Manuscript Due
  • May 8, 2011 - Workshop Date
Organizers

Otto-von-Guericke-Universität Magdeburg (Germany)
Nokia Research Center, Tampere (Finland)
Swansea University (UK)
DFKI (Germany)
University of Munich (Germany)
University of Duisburg Essen (Germany) & Lancaster University (UK)

Nov 23, 2010

Light Touch Interactive Projector; Holographic Laser Projection (HLP) "How it Works": Update on Light Blue Optics (Videos, links)

It has been about a year since I wrote about Light Blue Optics, "a privately-funded company developing and supplying miniature projection systems for use in high volume applications in markets including automotive, digital signage and consumer electronics." Light Blue Optics is located in Cambridge, UK, and has a development facility in Colorado Springs.


Light Touch Interactive Projector


Holographic Laser Projection (HLP): How it Works




RELATED
A Touch Screen Table
Brendan O'Brian, QSR 11/23/10

"Light Blue Optics, which rolled out the Light Touch in January, is working with several restaurant chains to put its technology on tables...“You can project menus onto the table so the customer can sit down and order their meal,” says Tamara Roukaerts, director of marketing communications at Light Blue Optics. “They can also watch videos of the chef preparing their meal through a live video feed.”"
Light Blue Optics turns KFC tables into touch screens
Roland Gribben, The Telegraph 10/11/10


HLP technology, and how it can be used for practical purposes, is further explained in the following white papers:
Buckley, E., Lacoste, L., Stindt, D. Rear-view virtual image displays. SID (Society for Information Display), Vehicles and Photons - 16th Annual Symposium on Vehicle Displays, 10/15/09
Abstract: "Light Blue Optics holographic laser projection technology can be utilised to create a virtual image display which, with a volume enclosing less than 700cc, exhibits a form-factor consistent with integration into a rear-view mirror. By combining the visual accommodation and concomitant reaction time benefits of a head-up display with the ability to present high resolution safety-critical information in a rear-view off-axis configuration with large eyebox, significant potential safety benefits can result."


Buckley, E., Stindt, D., Isele, R. Novel Human-Machine Interface (HMI) Design Enabled by Holographic Laser Projection. SID 2009 Symposium, 6/2/09

Abstract: "Despite the current proliferation of in-car flat panel displays, designers continue to investigate alternatives to flat and rectangular thin-film transistor (TFT) panels – principally to obtain differentiation by freedom of design using, for example, free-form shapes, round displays, flexible displays or mechanical 3D solutions. A perfect demonstration was provided at the 2008 Paris Motor Show by the BMW Mini Center Globe, a novel instrument cluster design which combines lighting, a circular flat panel and a holographic laser projector provided by Light Blue Optics (LBO) to redefine the state of the art in human-machine interface (HMI)...In this paper, the authors will show how the incorporation of LBO’s holographic laser projection technology can allow the construction of a unique display technology like the Mini Center Globe, and how such a combination of technologies represents a significant advance in the current state of the art in automotive displays."
The Story Behind this Post
I was having one of my occasional vivid "technology dreams" just before my dog woke me up in the middle of the night tonight.  I was driving around in a futuristic car that had all sorts of cool technologies, including a holographic side-view mirror, similar to the one I blogged about in a 2009 post about Light Blue Optics.  This inspired me to take a quick look at what the company is doing now.


The dream that entertained me tonight was probably triggered by what I read just before I went to sleep: a call for papers posted by Albrecht Schmidt on Facebook, "Call for Papers - Theme Issue on Automotive User Interfaces", for an upcoming edition of Personal and Ubiquitous Computing.  If you are curious, here's an example of one of my blog posts that was inspired by one of my geek-tech-dreams: "Last Night I Dreamt about Haptic Touch Screen Overlays".

Nov 13, 2010

HACKED KINECT MULTITOUCH using libFreenect and libTISCH (via Florian Echtler)

MULTI-TOUCH WITH HACKED KINECT
Here is NUI-Group member Florian Echtler's proof-of-concept HD video of using a hacked Kinect camera for multitouch-like interaction.  The application was built on Ubuntu Linux using libfreenect, written by marcan42, and Florian's own creation, libTISCH.



Florian decided to use picture-browsing interaction to demonstrate the proof of concept, so "everybody can focus on more interesting things :-)"
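For the curious: Florian's actual code isn't shown here, but the core trick behind depth-camera "multitouch" is easy to sketch. You capture a reference depth image of the empty surface, keep only the pixels that sit a few millimetres closer to the camera than that surface (i.e., fingertips pressing down), and cluster those pixels into touch points. The following is a minimal, hypothetical Python/NumPy illustration — the function name, thresholds, and blob-finding approach are my own assumptions, not libfreenect or libTISCH code:

```python
import numpy as np

def detect_touches(depth, background, near=8, far=30, min_area=4):
    """Find touch points in a depth frame from an overhead depth camera.

    A pixel counts as 'touching' when it is slightly closer to the camera
    than the empty surface (the background frame): fingertips pressed on
    or hovering just above the surface fall in the [near, far] mm band.
    Returns a list of (x, y) touch centroids.
    """
    diff = background.astype(int) - depth.astype(int)
    mask = (diff >= near) & (diff <= far)

    # Naive connected-component search (iterative flood fill) to group
    # touching pixels into blobs; real systems would use an optimized pass.
    visited = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    touches = []
    for y in range(h):
        for x in range(w):
            if mask[y, x] and not visited[y, x]:
                stack, blob = [(y, x)], []
                visited[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    blob.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                # Discard tiny blobs (noise); report the blob centroid.
                if len(blob) >= min_area:
                    ys, xs = zip(*blob)
                    touches.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return touches
```

The resulting centroids could then be fed to any gesture layer (libTISCH, TUIO clients, etc.) exactly as if they came from a camera-based touch table.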


(I have SO many ideas for this!  I'll throw a few out there in an upcoming post....maybe someone can run with them!)


RELATED
Hacked Kinect taught to work as multitouch interface
Paul Miller, engadget, 11/11/10


FOR THE TECH-CURIOUS:
TISCH stands for Tangible Interactive Surfaces for Collaboration between Humans, and is a cross-platform, cross-device multi-touch development framework.  You can download the source package for Windows, MacOS X, and Linux from the TISCH Sourceforge website. The Ubuntu Lucid/Karmic version has "superquick installation via PPA" - the instructions can be found on the TISCH Sourceforge website.


LibFreenect- Open Source PC Drivers for Kinect
Xan Tium, XBLOG 360 11/10/10

Marcan is Hector Martin Cantero, the author of the Abort, Retry, Hack? blog.

For your convenience, I've reposted something I wrote about libTISCH back in 2009:

For techies (and the tech-curious) who like technologies that support collaboration and multi-touch interaction,  this is great news!

Florian Echtler announced the first stable release of libTISCH, a multi-touch development framework, which can be found on Sourceforge.  TISCH stands for Tangible Interactive Surfaces for Collaboration between Humans.  libTISCH, a C++ software framework, is included in this project.  It provides a means for creating GUIs based on multi-touch and/or tangible input devices.

Here is how it works:

Architecture Layers (diagram)

Here is information from the libTISCH announcement:

Highlights of this release are, among others, the following features:

- ready-to-use multitouch widgets based on OpenGL
- reconfigurable, hardware-independent gesture recognition engine
- support for widely used (move, scale, rotate..), pre-defined gestures as well as custom-defined gestures

- hardware drivers for FTIR, DI, Wiimote, DiamondTouch..
- TUIO converters: source and sink

- cross-platform: Linux, MacOS X, Windows (32 and 64 bit)
- cross-language: C++ with bindings for C#, Java, Python

libTISCH has a lot to offer for the multitouch developer. For example, 
the textured widgets enable rapid development of applications for many
kinds of multi-touch or tangible interfaces. The separate gesture
recognition engine allows the translation of a wide range of highly
configurable gestures into pre-defined or custom events which are then
acted on by the widgets. While the lower layers of libTISCH provide
functionality similar to tbeta, touche etc. (you can interface existing
TUIO-based software with libTISCH in both directions), it goes far
beyond.
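Under the hood, a gesture recognition engine like libTISCH's ultimately reduces the touch stream to geometry. As a hedged illustration (my own sketch of the classic computation, not libTISCH code), the familiar two-finger manipulation — pan, zoom, rotate — falls out of the positions of two fingers in consecutive frames:

```python
import math

def two_finger_transform(p0, p1, q0, q1):
    """Derive pan / zoom / rotate parameters from a two-finger move.

    (p0, p1) are the two touch points at the start of a frame and
    (q0, q1) are the same fingers one frame later.  Returns the
    translation (dx, dy), a scale factor, and a rotation in radians.
    """
    # Translation: how far the midpoint between the fingers moved.
    cx0, cy0 = (p0[0] + p1[0]) / 2, (p0[1] + p1[1]) / 2
    cx1, cy1 = (q0[0] + q1[0]) / 2, (q0[1] + q1[1]) / 2
    translation = (cx1 - cx0, cy1 - cy0)

    # Scale: ratio of the finger-to-finger distances.
    d0 = math.hypot(p1[0] - p0[0], p1[1] - p0[1])
    d1 = math.hypot(q1[0] - q0[0], q1[1] - q0[1])
    scale = d1 / d0

    # Rotation: change in the angle of the line joining the fingers.
    a0 = math.atan2(p1[1] - p0[1], p1[0] - p0[0])
    a1 = math.atan2(q1[1] - q0[1], q1[0] - q0[0])
    rotation = a1 - a0
    return translation, scale, rotation
```

A widget layer then applies these deltas to whatever object the fingers landed on, which is why the same engine can drive photo browsing, map navigation, or tangible-object tracking without per-application gesture code.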

More information about the library and underlying architecture can be found on http://tisch.sf.net/ and in the Sourceforge wiki at
http://sourceforge.net/apps/mediawiki/tisch/


Note:
Dr. Florian Echtler is on the scientific staff at the Technische Universität München in Germany. Be sure to check out his webpage.

I especially like the concept of the MeTaTop: "A Multi-Sensory Table Top System for Medical Procedures" that is linked from Florian's website.


MeTaTop: A Multi-Sensory Table Top System for Medical Procedures

Sep 15, 2010

For Multi-Touch Techies and the Tech Curious - Touch and Response article, with code samples, by Charles Petzold

Thanks to Josh Blake, I came across a good article by Charles Petzold in the September 2010 issue of MSDN Magazine:
Touch and Response


In this article, Charles Petzold continues his discussion of the multi-touch support in version 4 of Windows Presentation Foundation (WPF).  I like the introduction to his article:
"Programming is an engineering discipline rather than a science or a branch of mathematics, so rarely does there exist a single correct solution to a problem. Varieties and variations are the norm, and often it’s illuminating to explore these alternatives rather than focus on one particular approach." -Charles Petzold
I also appreciate Petzold's discussion about smooth Z transitions, his thoughtful code samples and related links.

RELATED
Multi-touch Manipulation in WPF 
Charles Petzold (August, 2010, MSDN Magazine, UI Frontiers)
"Just within the past few years, multi-touch has progressed from a futuristic sci-fi film prop to a mainstream UI. Multi-touch displays are now standard on new models of smartphones and tablet computers. Multi-touch is also likely to become ubiquitous on computers in public spaces, such as kiosks or the table computer pioneered by Microsoft Surface."
"The only real uncertainty is the popularity of multi-touch on the conventional desktop computer. Perhaps the greatest impediment is the fatigue known as “gorilla arm” associated with moving fingers on vertical screens for long periods of time. My personal hope is that the power of multi-touch will actually provoke a redesign of the desktop display. I can envision a desktop computer with a display resembling the configuration of a drafting table, and perhaps almost as large." -Charles Petzold
Exploring Multi-Touch Support in Silverlight
Charles Petzold (March 2010, MSDN Magazine, Finger Style)


Comment:
I agree with Charles about the need for a re-design of desktop displays.  I like the drafting table as a form factor.  Here are a couple of my previous posts related to this topic:
Emerging Interactive Technologies, Emerging Interactions, and Emerging Integrated Form Factors (2008)
DigiBoard Multi-Touch Mixed Reality Game; Ideas for future design of a flexible, adjustable multi-touch surface (June 2008)