Showing posts with label interaction. Show all posts

Nov 30, 2010

Call for Participation - Large Displays in Urban Life: From Exhibition Halls to Media Facades (CHI 2011 Workshop)

Large Displays in Urban Life: From Exhibition Halls to Media Facades 
CHI 2011 Workshop, May 7 or 8, 2011 (final date to be announced)


Call for Participation
Large interactive displays are now common in public urban life. Museums, libraries, public plazas, and architectural facades already take advantage of interactive technologies for visual and interactive information presentation. Researchers and practitioners from such varied disciplines as art, architecture, design, HCI, and media theory have started to explore the potential and impact of large display installations in public urban settings.


This workshop aims to provide a platform for researchers and practitioners from disciplines such as art, architecture, design, HCI, the social sciences, and media theory to exchange insights on current research questions in the area. The workshop will focus on the following topics: how to design large interactive display installations that promote engaging experiences and go beyond playful interaction, how different interaction models shape people’s experience in urban spaces, and how to evaluate their impact.


Workshop Goals & Topics
The goal of this one-day CHI 2011 workshop is to cross-fertilize insights from different disciplines, to establish a more general understanding of large interactive displays in public urban contexts, and to develop an agenda for future research directions in this area. Rather than focusing on paper presentations, this workshop aims to trigger active and dynamic group discussions around the following topics:


Beyond Playful Interaction
A number of studies have found that large display installations invite playful interaction but often fail to convey meaningful experiences related to content. This raises the following questions:
  • How can we design installations that sustain people’s attention beyond the initial novelty effect and direct their interest toward the content?
  • What design strategies can be applied to promote active individual and social exploration and discussion of the presented information?
Character of Interaction
A number of interaction techniques have been explored for large displays in public spaces, ranging from interaction via cell phones to direct-touch and full-body interaction. We would like to discuss:
  • How do different interaction methods shape people’s experience of large display installations in urban spaces?
  • How do interaction methods differ from each other in terms of triggering interaction and engagement with the presented content?
Evaluation
Different quantitative and qualitative methods have been applied to evaluate people’s experience and use of large display installations in public spaces. During the workshop we would like to discuss:
  • How can we evaluate the "success" of large display installations in urban spaces?
  • How can particular aspects of public large display installations such as engagement be evaluated?
  • What kinds of evaluation methods are most effective at different stages of progress (design phase/installation phase)?
We see this workshop as an opportunity to begin developing a general framework that can inform the design and evaluation of large interactive displays in different urban contexts. With a diverse research community present at the workshop, we hope to arrive at an agenda for future research directions in this area.

For more details on the workshop please refer to our extended abstract and workshop proposal.

Submission Details
Submit a position paper (maximum 4 pages) to largedisplaysinurbanlife@gmail.com by January 14, 2011, using the CHI extended abstract format. The paper should describe experiences, works in progress, or theories around designing and/or evaluating large interactive displays in public urban settings. We plan to explore approaches and insights from different disciplines, so submissions from art, architecture, design, HCI, media theory, and social science are highly encouraged. We welcome all methodological approaches and techniques centered around the topic of large interactive displays in urban life.


At least one author of each accepted position paper needs to register for the workshop and for one or more days of the CHI conference itself.


Important Dates
Submission Deadline: January 14, 2011
Notification of acceptance: February 11, 2011
Workshop: May 7 or 8, 2011 (final date to be announced)

WORKSHOP ORGANIZERS
Uta Hinrichs is a PhD candidate in computational media design at the Innovations in Visualization (InnoVis) research group of the University of Calgary, Canada, under the supervision of Sheelagh Carpendale. Her research focuses on the design and study of large display interfaces to support lightweight information exploration in walk-up-and-use scenarios.
Nina Valkanova is pursuing her PhD in the interaction group of the Universitat Pompeu Fabra (UPF) in Barcelona, Spain, under the supervision of Ernesto Arroyo. Her research focuses on the design of urban media facades, exploring the intersections between scientific and artistic design knowledge.
Kai Kuikkaniemi is a project manager at the Helsinki Institute for Information Technology. He is currently leading a national research project focusing on public displays. His earlier research explored novel multiplayer game designs ranging from pervasive gaming to biosignal-adaptive gaming.
Giulio Jacucci is a professor at the Dept. of Computer Science of the University of Helsinki and director of the Network Society Programme at the Helsinki Institute for Information Technology. He leads several international projects on interaction design and ubiquitous computing, and is a co-founder of MultiTouch Ltd., a company commercializing products for multi-touch screens.
Sheelagh Carpendale is a Professor at the University of Calgary where she holds a Canada Research Chair: Information Visualization and an NSERC/iCORE/SMART Industrial Research Chair: Interactive Technologies. She directs the Innovations in Visualization (InnoVis) research group and her research focuses on information visualization, collaborative visualization, and large interactive displays.
Ernesto Arroyo holds an associate teaching position at the Dept. of Information and Communication Technologies of the Universitat Pompeu Fabra (UPF) in Barcelona, Spain. He earned his PhD at the MIT Media Lab in 2007. His research at the Interactive Technologies Group focuses on interaction design, visualization, and user-centered interfaces that enable and preserve the fluency of user engagement.

Thanks to Uta Hinrichs for sending this my way!

Therenect: Theremin for the Kinect! (via Martin Kaltenbrunner)

Yet another reason why I need to get a Kinect!

Martin Kaltenbrunner's video demonstrates how the Kinect can be transformed into a virtual Theremin.


Therenect - Kinect Theremin from Martin Kaltenbrunner on Vimeo.

Here's Martin's description of the Therenect:

"The Therenect is a virtual Theremin for the Kinect controller. It defines two virtual antenna points, which allow controlling the pitch and volume of a simple oscillator. The distance to these points can be controlled by freely moving the hand in three dimensions or by reshaping the hand, which allows gestures that are quite similar to playing an actual Theremin."

"This musical instrument has been developed by Martin Kaltenbrunner at the Interface Culture Lab at the University of Art and Industrial Design in Linz, Austria. The software has been developed using the Open Frameworks and OpenKinect libraries."
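The mapping the description implies can be sketched in a few lines: the distance from each hand to a fixed virtual antenna point controls pitch and volume. This is a rough illustration of the concept, not the Therenect's actual code; the antenna positions, frequency range, and function names are my own assumptions.

```python
import math

# Hypothetical antenna positions in Kinect camera space (meters).
# These are illustrative values, not taken from the Therenect source.
PITCH_ANTENNA = (0.5, 0.0, 1.5)
VOLUME_ANTENNA = (-0.5, 0.0, 1.5)

def distance(p, q):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def hand_to_oscillator(pitch_hand, volume_hand):
    """Map two hand positions to (frequency_hz, amplitude), theremin-style:
    moving a hand closer to the pitch antenna raises the pitch; moving the
    other hand closer to the volume antenna lowers the volume."""
    # Clamp each distance to an assumed playable range of 5 cm .. 1 m.
    d_pitch = min(max(distance(pitch_hand, PITCH_ANTENNA), 0.05), 1.0)
    d_vol = min(max(distance(volume_hand, VOLUME_ANTENNA), 0.05), 1.0)
    freq = 220.0 + (1.0 - d_pitch) * (880.0 - 220.0)  # 220 Hz .. 880 Hz
    amp = d_vol                                        # near antenna = quiet
    return freq, amp
```

The oscillator itself would then be driven at `freq`/`amp` each frame; reshaping the hand simply shifts its tracked position slightly, which is why small gestures work much like on a physical Theremin.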

Nov 8, 2010

RockMelt? Watch the video interview with the co-founders to learn more about the world's newest web browser!

A quick glance at my morning newspaper (yes, the kind that involves paper and ink) brought to my attention that today is the first day that the RockMelt browser meets the world.


So what is special about RockMelt? According to the RockMelt blog, the browser is built on Chromium, the same source code that fuels Google's Chrome. The biggest investment in RockMelt was $10 million, from Marc Andreessen's venture capital firm. Andreessen also sits on RockMelt's board of directors. RockMelt is tightly integrated with Facebook, which is understandable, since Andreessen is on that company's board as well. (Marc Andreessen is the guy behind Netscape, one of the first popular browsers in the mid-1990s.)


Below is a video interview with the co-founders of RockMelt, who explain why browsers need to be re-invented to address social networking, "the cloud", and more.





"Wherever you go on the Internet, RockMelt makes the Web a personal experience. Because RockMelt is the first browser you log into, it unlocks  your Web experience with your Facebook friends, your feeds, your favorite services, even your bookmarks and preferences. RockMelt is also the first browser to be fully backed by the cloud. This means you can access your personal browsing experience from anywhere, and you get quick updates from the people and sites that are important to you."  -RockMelt Blog


Promo Video:  RockMelt- Your Browser. Re-Imagined







For more information, visit the RockMelt website, where you can also sign up to get the browser. The browser might be a little buggy, as it is a beta version still in development.

Nov 2, 2010

EyeTube for YouTube! Eye-gaze interaction software, free and downloadable from GazeGroup

Gaze interaction systems provide many people with disabilities access to computers and the rich content now available on the web. Unfortunately, commercial gaze-tracking systems are very expensive and, at times, difficult to calibrate. There is hope!


Following up on my recent post, "Open-source Eye-tracking: The ITU Gaze Tracker 2.0 Beta", I thought I'd share the GazeGroup's EyeTube for YouTube interface.  


What is great about EyeTube for YouTube is that it provides two different interfaces. The simplified, icon-based version looks good for younger children or people with cognitive disorders. The second version is appropriate for people who can navigate more complex visual representations of content.


EyeTube currently requires a Windows-based system and .NET 3.5. It can be downloaded from the GazeGroup website. If you plan to download the application, make sure you also have a YouTube account. To get the application up and running, you'll need to change the settings file (EyeTubeSettings.xml) to match your account. (If you don't know much about changing settings or XML, ask someone you know who works in IT.)


Below is the icon-based version of the eye-gaze interface for YouTube:
EyeTube - Gaze Interaction for YouTube (simplified version)


Feature-rich version of the EyeTube interface for YouTube:
EyeTube - Gaze Interaction for YouTube

From the GazeGroup site:

"The EyeTube prototype offers a feature-rich eye-controlled interface for the popular YouTube service. Instead of emulating a mouse pointer and interacting with a web browser, the EyeTube interface is especially designed to be driven by gaze input. It offers a wide range of features such as keyword searching, popular video feeds, favorites, and social aspects such as subscriptions, friends, and commenting on videos. The highly optimized interface allows for streamlined interaction that alleviates the Midas Touch problem. In most previous gaze interfaces, selection is made by a dwell-time activator, e.g., fixate a button for a specific amount of time and it will execute the function. In the EyeTube interface, a fixation on a UI element will highlight it, and a second fixation on the activation button is required to execute the function. This removes the stress of having to constantly move the eyes to avoid unintentional activation."
"The EyeTube also exists in another, simplified incarnation developed for users who are distracted by a larger number of options. It supports basic features such as browsing categories, optional keyword searching, and favorites."
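The two selection schemes the quote contrasts can be sketched as simple loops over a gaze log. This is an illustrative sketch of the general technique, not EyeTube's actual code; the function names, dwell threshold, and confirm-button name are assumptions of mine.

```python
# Illustrative contrast of two gaze-selection schemes:
# 1) classic dwell-time activation (prone to the Midas Touch problem), and
# 2) a highlight-then-confirm scheme in the spirit of EyeTube, where looking
#    at an element only highlights it, and a second fixation on a dedicated
#    confirm button is required to execute it.

DWELL_TIME = 0.6  # seconds; an assumed threshold

def dwell_select(fixations):
    """Dwell-time activation: any element fixated longer than DWELL_TIME
    is executed immediately -- looking *is* clicking."""
    activated = []
    for element, duration in fixations:
        if duration >= DWELL_TIME:
            activated.append(element)
    return activated

def highlight_confirm_select(fixations, confirm_button="activate"):
    """Highlight-then-confirm: a fixation only highlights an element; the
    action runs only when the next relevant fixation lands on the confirm
    button. Merely browsing the screen never triggers anything."""
    activated = []
    highlighted = None
    for element, duration in fixations:
        if element == confirm_button:
            if highlighted is not None:
                activated.append(highlighted)
                highlighted = None
        else:
            highlighted = element  # highlight only, never execute
    return activated
```

For a gaze path that lingers on `video_a`, then `video_b`, then the confirm button, dwell selection fires on everything looked at, while the confirm scheme activates only `video_b` — which is exactly the Midas Touch relief the quote describes.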

RELATED
The GazeGroup
(The individuals mentioned below may currently be working elsewhere, but are still involved in gaze research in some way.)

GazeGroup Research Areas

COGAIN (Communication by Gaze Interaction)

ACM CHI Conference Articles
San Agustin, J., Skovsgaard, H., Hansen, J. P., and Hansen, D. W. 2009. Low-cost gaze interaction: ready to deliver the promises. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 4453-4458. DOI= http://doi.acm.org/10.1145/1520340.1520682
San Agustin, J., Hansen, J. P., Hansen, D. W., and Skovsgaard, H. 2009. Low-cost gaze pointing and EMG clicking. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 3247-3252. DOI= http://doi.acm.org/10.1145/1520340.1520466 
Tall, M., Alapetite, A., San Agustin, J., Skovsgaard, H. H., Hansen, J. P., Hansen, D. W., and Møllenbach, E. 2009. Gaze-controlled driving. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 4387-4392. DOI= http://doi.acm.org/10.1145/1520340.1520671

UPDATE

Eye-controlled games and leisure applications from the COGAIN wiki: http://www.cogain.org/wiki/Leisure_Applications
  • EyeArt - EyeArt eye-drawing program, developed by Andre Meyer and Markus Dittmar, Technical University of Dresden, Applied Cognitive Research Unit, Germany.
  • GazeTrain - Gaze-controlled action oriented puzzle game, developed by Lasse Farnung Laursen, Technical University of Denmark
  • Puzzle - Simple puzzle game that can be played with eye movements, developed by Vytautas Vysniauskas, Siauliai University, Lithuania
  • Road to Santiago - Gaze-controlled adventure game (full game), developed by Javier Hernandez Sanchiz, Universidad Publica de Navarra, Spain
  • Snap Clutch - An application that uses eye gaze data to generate key and mouse events for playing games such as World of Warcraft and Second Life.
  • ASE: Accessible Surfing Extension for Firefox - Follow this link to access ASE, an Accessible Surfing Extension for Firefox, developed by Emiliano Castellina and Fulvio Corno at Politecnico di Torino. (Note that this is a beta version.)
  • Eye Gaze Music (SAW Selection Sets) - Point and Play – eye gaze (direct pointing) musical activities, developed by DART. Please note that the SAW (Special Access to Windows) framework application is needed to play these 15 music selection sets. SAW is available for free at http://www.oatsoft.org/Software/SpecialAccessToWindows
  • EyeTube - Gaze interaction for YouTube - Follow this link to get more information and download EyeTube at ITU GazeGroup's web pages
  • Eye3D and other head eye mouse software - Eye3D for education, and a collection of links to free software that works with head or eye mouse. Includes links to downloads and original sites.
  • Gaze-controlled Breakout - Follow this link to access a modified version of the LBreakout2 game which can be operated by an SMI eye tracker, developed by Michael Dorr et al. at University of Luebeck
  • Oleg Spakov's Freeware games for MyTobii - Follow this link to access MyTobii compatible games developed by Oleg Spakov, University of Tampere, Finland
  • Free ITU Gaze Tracker and applications - Download a webcam based open-source gaze tracker and several applications that work with it, developed at IT University of Copenhagen
  • GameBase - Check out the Eye-Gaze Games category at the SpecialEffect GameBase!
  • More information about Gaze-Controlled Games - Follow this link to see a list of online information resources on using gaze for the control of games and other leisure applications

Nov 1, 2010

Open-source Eye-tracking: The ITU Gaze Tracker 2.0 Beta Via Martin Tall, NUI-Group Member

I came across the first version of the open-source ITU Gaze Tracker on the NUI Group forum in April of 2009 and played around with it a bit.  I was impressed.  I'm happy to say that the new version looks even better, although I haven't had the time to try it out.  Below are two recent videos that will give you a better understanding about gaze tracking.  


For the tech-curious, make sure you take the time to view the second video!  Links to info & code are below.


GT2 High speed remote eye tracking "Pushing the limits"


Technical Demonstration


Info about  the ITU Gaze Tracker 2.0 Beta from the NUI Group Forum, posted by Martin Tall:



Introducing the ITU Gaze Tracker 2.0 Beta
"We’ve made great progress since the initial release, and today we open the doors for version 2.0. Internally we’ve rewritten major parts of the platform to gain flexibility and higher performance. The first version was DIY playtime; this version is nothing short of a screamer. High performance, very accurate tracking. People are telling us we are crazy giving it away, but we’re dedicated to the mission: Accessible eye tracking for all, regardless of nationality and means. We’re making it happen."
Important highlights for GT2.0b:
- Supports three modes of operation, head-mounted, remote mono/binocular
- Vastly improved performance, +500fps head mounted, +170fps remote binocular (both eyes)
- Awesome accuracy, avg. 0.3 - 0.7 degrees of visual angle (remote binocular)
- New U.I, looks so.. 2010
- Automatic tuning (optimization of algorithms parameters)
- Relatively low CPU-utilization and memory footprint (12%, 170Mb, core i7 860 win7-64)
- Many enhancements, bug-fixes etc.

Oct 18, 2010

Words of Wisdom (and more) from Harry Brignull: UX Roots in Psychology, Design, Info Architecture...and so much more!

Harry Brignull is a User Experience Consultant at Madgex in Brighton, England. According to his "about" page, his work involves "building experiences by blending User Research, Interaction Design, and process consultancy." Harry's 90 Percent of Everything blog is a wellspring of information and inspiration.


Back Story
I came across Harry's work in 2004 or 2005, when I was taking a VR class (Virtual Reality for Education and Training) and working on an assignment about large-screen displays. At the time, Harry was a Ph.D. researcher working in the Interact Lab at the University of Sussex on the Dynamo project, in collaboration with researchers from the Mixed Reality Lab at the University of Nottingham.

I revisited this work again in early 2007 when I was studying HCI and Ubiquitous Computing, and researching information about collaborative interaction on large displays in public spaces.  The following research article inspired me at the time, and looking back, I consider the work of this team to be seminal, and worth revisiting once again.

Izadi, S., Brignull, H., Rodden, T., Rogers, Y., Underwood, M. (UIST'03)
Dynamo: A public interactive surface supporting the cooperative sharing and exchange of media (pdf)  



(The picture was taken from the Dynamo project's website, and shared on my 2007 blog post, Revisiting promising projects, Dynamo, an application for sharing information on large interactive displays in public spaces.)


Links to a few of Harry's useful blog posts:

UX as Applied Psychology:
Clear Reporting & Critical Thinking:  Why User Experience Needs to Remember its Roots in Psychology (10/4/10)

"There was a time, back in the early 1990s, when almost everyone involved with UX research had a background in Psychology.  Back in those days, the term "User Experience" didn't really exist, and the nearest discipline was Human-Computer Interaction (HCI)..."

Comment:
As a school psychologist, I'm well-steeped in the process of research, observation, and data collection. I also know that the fundamentals of applied psychology are a very important ingredient in UX work. Don Norman, one of the "grandfathers" of UX/HCI, was a cognitive psychologist. He was the co-author of one of my psychology textbooks when I was a university student the first time around. Coincidentally, Norman's book, The Design of Everyday Things, was required reading for my HCI graduate class.


Mobile Usability Testing for Low Budgets
Mobile Usability Testing Tip: Recording from Two Webcams
In this post, Harry discusses quick and cheap methods of using two webcams for mobile usability testing. This method could be used in other situations, such as developing presentations. (It might also be applicable for use in therapeutic and special education settings.)

Image: Nick Bowmast


UX Brighton Presentation on Dark Patterns: User Interfaces Designed to Trick People
My Presentation on Out of Box Experience Design (Harry Brignull)
David Ogilvy: We Sell or Else


RELATED
Links to Harry's Blog Posts, By Topic


SOMEWHAT RELATED

Sep 22, 2010

PD-NET Project: "Exploring... large scale networks of pervasive public displays..."

I love the concept of interactive, networked public displays!   The PD-NET project has a fairly new website and a Facebook page.  If this interests you, take the time to learn about the PD-NET project, a collaborative effort between researchers from several different universities in Europe. I'd like to see this take hold in the US.  


PD-NET Project Objectives (From the PD-NET Website):
  • To create enabling technologies for large-scale pervasive display networks through the design, development, and evaluation of a robust, scalable, distributed and open platform for interconnecting displays and their sensors.
  • To establish Europe as the international centre for work on pervasive display networks.
  • To address key scientific challenges that may inhibit the widespread adoption of pervasive display network technology: tensions between privacy and personalization, situated displays, business and legislative requirements, and user interaction.
Here is a list of participating universities and researchers, taken from the PD-NET website:


Participants from Lancaster University (UK):
Participants from University of Duisburg-Essen (Germany):
Participants from University of Lugano (Italy):
Participants from University of Minho (Portugal):
Publication
J. Müller, F. Alt, D. Michelis, and A. Schmidt, "Requirements and Design Space for Interactive Public Displays," in Proceedings of the 18th annual ACM international conference on Multimedia (Multimedia 2010), to appear, 2010.

COMMENT
If you are familiar with this blog, you know that I am passionate about interactive displays, especially in public spaces.  I've devoted numerous posts to this topic on this blog, and also on  The World Is My Interactive Interface, a blog I started a few years ago when I was working on projects for large interactive displays.  


I am interested in how interactive displays, of any size, can be accessible and universally designed, especially those that can inter-operate with mobile devices, including devices that support communication for people with special needs.


SOMEWHAT RELATED
Here are links to some of my previous blogposts related to the topic of interactive displays in public spaces.  Some have links to scholarly publications.  
What is DOOH and Why Should We Care? (DOOH- Digital Out-Of-Home)
Interactive Displays and Interaction (Presentation via Daniel Michelis)
Thoughts about technology on a cruise ship, and other reflections
Multi-touch and gesture interaction out-and-about
UPDATED: More News, Videos, and Links about Multi-Touch and Screen Technologies
Technology-Supported Shopping and Entertainment User Experience at Ballantyne Village:  "A" for concept, "D" for touch screen usability.
Usability/Interaction Hall of Shame (In a Hospital)
Think Globally, Act Locally:  Exploring the Problem Space - Top-down, bottom-up, local and the global...

Sep 13, 2010

For the iPad: Reactable Multi-touch Mobile



More details later!

RELATED
History of the Reactable


How it works (Info from the Reactable website)
"The way the Reactable works is deeply inspired by modular analogue synthesizers such as those developed by Bob Moog in the early 60s and enhances them with new concepts of sampling and digital effects processing."

"While with modular synthesizers one typically had to connect these different modules with patch cables in a complex and error-prone process, with the Reactable this is attained in a much easier and more intuitive way, since connections between the blocks are managed automatically based on their types and affinities and on their proximity."
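The automatic patching idea from the quote can be sketched as a small matching routine: each tangible block advertises what it outputs and what it accepts, and a connection forms to the nearest compatible block within reach. This is an illustration of the concept only, not the Reactable's implementation; the class, type names, and distance threshold are my own assumptions.

```python
import math
from dataclasses import dataclass, field

@dataclass
class Block:
    """A tabletop object with a position and audio 'affinities'.
    The signal-type names here are invented for illustration."""
    name: str
    x: float
    y: float
    outputs: str                      # kind of signal produced, e.g. "audio"
    accepts: set = field(default_factory=set)  # signal kinds taken as input

def auto_connect(blocks, max_dist=0.3):
    """Connect each block's output to the nearest block that accepts its
    signal type and lies within max_dist (meters) -- mimicking how the
    Reactable patches modules by type and proximity instead of cables."""
    connections = []
    for src in blocks:
        best, best_d = None, max_dist
        for dst in blocks:
            if dst is src or src.outputs not in dst.accepts:
                continue  # skip self and type-incompatible blocks
            d = math.hypot(src.x - dst.x, src.y - dst.y)
            if d <= best_d:
                best, best_d = dst, d
        if best is not None:
            connections.append((src.name, best.name))
    return connections
```

Because `auto_connect` is re-run whenever a block moves, sliding an oscillator closer to a different filter re-patches the sound graph automatically, with no cables involved.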


Martin Kaltenbrunner (Co-founder of Reactable)
"His research concentrates on tangible user interfaces and the development of novel human computer interaction concepts within open tools for creative production. As co-founder of Reactable Systems he had been mainly working on the interaction design of the Reactable - an electronic musical instrument with a tangible user interface. He is author of the open source tangible interaction framework reacTIVision and the related TUIO protocol, which have been widely adopted for the realization of tangible tabletop applications."


SOMEWHAT RELATED
Soundythingie for the iPad
iPad Apps for Making Music: What's Coming Later
SurfaceDJ iPhone App (Vectorform)

Aug 21, 2010

Reflections About Interactivity in my Present World

Background: For those of you who are new to this blog, I work as a school psychologist in a public school district in North Carolina. During the mid-2000s, I took a variety of computer courses, leading up to a decision to work part-time while taking graduate courses in topics such as human-computer interaction, games, information visualization, ubiquitous computing, and VR. I returned to work full-time in 2008, due to the economic downturn.

I continue to maintain my passion for technology that supports natural user interaction, and share my interests on this blog and to a lesser extent, two other blogs.  I initially started blogging because it was a requirement for one of my courses, and found that my blogs served me well as on-line filing cabinets.

Update:  I don't usually post much about my work on this blog, but I have some exciting things to share.

New School Year, New SMARTBoard: I am excited about the upcoming school year because my "home" school, Wolfe, a program that serves mostly teens and young adults up to age 22 who have more severe disabilities (including autism), now has an interactive whiteboard in every classroom.

School started for Wolfe students on July 26th, and three more whiteboards were installed. One of the new SMARTboards is in the speech and language room, along with the multi-touch SMARTTable the school received last April; one is in the Art/Work Adjustment room; and one is in the room I use with small groups of students, which is also used for music one day a week.


Although I have used interactive whiteboards in my work with students for several years, this is the first year that I have had one available to me most of the time in a school. Since most of the teachers received their IWBs over the past year, there is a pressing need to figure out what works, why it works, and how to create useful interactive content that is appropriate for the needs of students who have severe or multiple disabilities. We really can't wait.  

Why is this important to me?

Observations of positive changes:
Students who previously had limited interest in computer applications, or a limited attention span for such activities, demonstrate a high degree of interest and engagement with the IWB activities we've tried so far. Many students who do not attend to learning materials, or attend only for brief periods of time, actively look at the IWB screen, interact with it, and smile. Some students who rarely (or never) speak sometimes vocalize more often during IWB activities! Some are making connections between learning materials and what they interact with on the screen, something that simply did not happen previously. Some students who rarely established eye contact with others or demonstrated joint attention do so during IWB activities, something I discussed in a post on the TechPsych blog.

Technology and my role as a school psychologist:
Designing effective interventions is part of my job as a school psychologist, and mixing interactive whiteboards and interventions is fairly uncharted territory.  I know that there are probably a few other school psychologists, counselors, social workers, and speech/language therapists out there who are wondering what to make of this technology and the students they serve directly, or indirectly through consultation.  There isn't much literature about this topic, so it is difficult to know what is truly "evidence based".

We are in the process of discovery. Within a school that has an IWB in every classroom, serving the students with the most complex needs in the school district, how can I best follow professional school psychology practices (pdf) regarding the provision of direct and indirect services? Some of these expectations are listed below:

"School psychologists promote the development and maintenance of learning environments that support resilience and academic growth, promote high rates of academic engaged time, and reduce negative influences on learning and behavior."

"School psychologists use information and assistive technology resources to enhance students’ cognitive and academic skills."

"School psychologists work with other school personnel to develop, implement, and evaluate effective interventions for increasing the amount of time students are engaged in learning."

"School psychologists facilitate design and delivery of curricula to help students develop effective behaviors, such as self-regulation and self-monitoring, planning/ organization, empathy, and healthy decision-making."

Working towards solutions:
I work closely with my colleague Kelly Cross, a speech and language therapist who serves Wolfe school and two classrooms of students with severe autism at a "regular" elementary school. Kelly also serves as the assistive technology and augmentative communication consultant for our school district. She's used interactive applications and web resources for many years, along with more traditional "hands-on" materials, but like me, has had limited access to IWBs until recently. She now has a SMARTboard in her room, along with the portable multi-touch SMARTTable Wolfe received last April. We work with many of the same students, so with the influx of IWBs in our school, we've stepped up our collaboration.

One of Kelly's challenges is to figure out ways for teachers to integrate assistive technology and/or augmentative communication systems into their work with students during activities that involve interactive whiteboards. Most of the research in the area of assistive technology/aug com focuses on the use of technology to access applications as they are displayed on laptops or computer monitors, or on factors related to the use of individual communication devices. The vSked project, led by Dr. Gillian Hayes at the University of California, Irvine, is on the right track in that it incorporates a large interactive display used with students with autism, along with smaller hand-held communication systems, which I've discussed in a previous post.


vSked in action

Workshop Presentation
This past week, Kelly and I presented at a workshop held in the Arlington, VA school district that focused on the use of interactive whiteboards and related applications and web resources with students with special needs. We shared some of the exciting things we've noticed with our students, along with "before and after" examples. Included in the presentation were some of the video clips I quickly put together during the first days of school, which had an impressive positive impact when used during interactive whiteboard activities. (Some of these video clips can be found in a previous post: Video clips that help students with autism learn and feel calm at the same time!)

During the workshop, we discussed a few guiding principles, such as the "least dangerous assumption" and the concept of Universal Design for Learning. We also provided a sampling of resources previously used on the "small screen" that appear to be powerful when used with students interacting on the larger whiteboard screens. Below are a few of the resources we shared:

Clicker5 is an application to support reading and writing that provides natural voice output and a child-friendly word processor. It works well on interactive whiteboards and can incorporate several communication symbol systems. It is a good tool for informal assessment of students with communication and other related needs.

Sensory World provides an intuitive navigation system for students to use as they explore various areas of a house and engage in learning and related activities. The site map for all of the areas of the house, as well as the activities, is graphical.

We shared resources from the National Gallery of Art's Kids Zone, a great website that offers plenty of interactive content:


For teachers who want a limited browsing environment for their students, the Zac Browser is great for use on IWBs and the new large all-in-one touch-screen displays:
Zac Browser

Another website that is good for visual-based browsing is Symbol World:

"The intention is that symbol readers, teachers and carers will contribute, and we are delighted at the number of contributors that have already sent us their work"

We shared a recent find, Taptu, that can be used on an IWB. Although Taptu focuses on the mobile touchable web, accessed through smartphones with internet access, it provides links to a wide range of websites designed with intuitive touch interaction in mind, perfect for students with special needs. In many cases, Taptu web navigation resources work fine on the larger screen of an IWB. The advantage is that an icon/visual-based touch navigation system supports students who can't type, as well as those who have very limited reading skills.


Sports Navigation for Taptu

Below is a slideshow about the touchable web:
Taptu: Virtual Roundtable


Update: SMARTTable
At Wolfe, we'd like to create a conversation wheel with related activities for the SMARTTable that coordinate with SMARTBoard activities. We'll probably experiment with the conversation wheel concept on the SMARTboard first.


Considerations:
One thing we've learned over the past few months is that it is a bit more complicated to create content and activities for the SMARTTable than for a SMARTBoard notebook or Promethean flipchart. We hope to customize it to support scaffolded classroom activities for students with special needs, but it might take more time than we had anticipated. The school's immediate need is for touchable interactive content, suitable for our students, for all of the classroom IWBs.

Complications:
Programming for the SMARTTable, for those who have 64-bit computers, requires Windows 7, Visual Studio 2008, and Expression Blend 3. My home computer, an HP TouchSmart, is 64-bit but runs Vista. My school-issued laptop is also 64-bit but runs Windows XP. To upgrade it, I would have to send it to the tech department for several weeks. It will be slow going, since I have to plan for my day-to-day activities and evaluations with students first.

Possible SMARTTable Solutions:
I'm hoping that some students from UNC-Charlotte might want to try their hand at multi-touch programming and help us out!  We'd welcome volunteers from other universities as well.   Wolfe's principal, Mary Jo Breckenridge, is very supportive of the use of innovative technologies with students with special needs, and would figure out a way to make a collaboration happen.


Upcoming Interactive Multimedia Technology Posts
I have about 6 posts in draft form about interesting interactive technologies, research, people, and companies involved with multi-touch, interactive multimedia, and natural user interface/interaction applications. I'll get them up as soon as possible.

One post is about 3M Touch Systems. Another is about an interesting multi-touch start-up company, Osmosis. I revisited Ballantyne Village for an update on its use of interactive touch-screen technology, something I discussed in detail, along with photos and video clips, in a 2008 post, Technology Supported Shopping and Entertainment User Experience at Ballantyne Village: "A" for concept, "D" for touch-screen usability. I got some inside scoop about the rationale behind the changes at the upscale center from a seasoned owner of a high-end audio-visual store whom I interviewed about 3D TV. I also have some interesting information about current research in "glasses-less" 3D displays: Sharp will be coming out with a 3D cell-phone camera AND a glasses-less cell-phone display in the not-too-distant future.

I'm preparing for posts that highlight a few of my favorite blogs. Until then, take a look at the recent posts on Tracy Boyer's Innovative Interactivity blog, and also InteractiveTV Today.

RELATED & SOMEWHAT RELATED
"The AAC-RERC conducts a comprehensive program of research, development, training, and dissemination activities that address the NIDRR priorities and seek to improve technologies for individuals who rely on augmentative and alternative communication (AAC) technologies. The mission of the AAC-RERC is to assist people who rely on augmentative and alternative communication to achieve their goals by advancing and promoting AAC technologies and supporting the individuals who use, manufacture, and recommend them."



Janice Light, Kathy Drager, Penn State University




Jeff Higginbotham's Bibliography:  Viewing AAC Through Authentic Social Interactions (pdf)

ProLoquo2Go (iPhone/iPod Touch/iPad communication system)


Also posted on the TechPsych blog