Sep 18, 2010

Interactive Tabletops and Surfaces: 2010 ACM Conference, Nov. 7-10, Saarbrücken, Germany. Wish I could go!

If you are new to this blog, you should know that I'm passionate about interactive tables and surfaces of all sizes!   Although this technology has been around for a while, it is still a new concept to most people.  The researchers and practitioners involved in the upcoming 2010 Interactive Tabletops and Surfaces Conference have had an important influence on the way people think about interacting with technology, and have made significant contributions to this emerging field over the past several years.   It hasn't been an easy road, given that most of us have been brainwashed by years of forced keyboard-and-mouse interaction and traditional WIMP (Windows, Icons, Menus, Pointers) interfaces.

I first learned about the inaugural Interactive Tabletops conference, held in 2006, when I came across it in early 2007.  At the time, I was working on projects for my HCI and Ubiquitous Computing classes, trying to learn everything I could about natural user interaction, large touch-screen displays, tabletop computing, and multi-touch.   I was inspired by the interesting work going on in this field.  This was before the first iPhone was introduced, before Microsoft's multi-touch Surface was unveiled, and three years before Apple broke out with the iPad.

Many of the people involved with the 2010 Interactive Tabletops and Surfaces Conference are (or have been) affiliated with the NUI Group.  NUI stands for Natural User Interface, or Natural User Interaction - the NUI Group is "a global research community focused on the open discovery of natural user interfaces."  I joined the NUI Group in 2007 when I was looking for more information about the nuts and bolts of multi-touch programming and systems, and have been encouraged to see how things have evolved since then.

Members of another group, Sparkon, are also participating in the Interactive Tabletops and Surfaces conference.  Sparkon is an on-line community for people involved with interactive technologies, including tabletop and surface computing. "On sparkon, you'll find projects demonstrating the latest interactive techniques, applications, software frameworks, case studies, and blog articles relating to creative and emergent technology."  (I'm also a member of Sparkon.)


Here's the information from the conference website:

ACM Interactive Tabletops and Surfaces, Saarbrücken, Germany:  7-10 November, 2010
"ITS 2010 is a premier venue for presenting research in the design and use of new and emerging tabletop and interactive surface technologies. As a young community, we embrace the growth of the discipline in a wide variety of areas, including innovations in ITS hardware, software, interaction design, and studies expanding our understanding of design considerations of ITS technologies and of their applications in modern society. ITS 2010 will bring together top researchers and practitioners who are interested in both the technical and human aspects of interactive tabletop and surface technologies. It is our hope that we will be able to achieve increased synergy of approaches between the disciplines engaged in the research in the area of interactive tabletops and surfaces, Design, HCI, UbiComp, Psychology, MobileHCI and other related fields. More directly, we intend to encourage immediate interdisciplinary collaboration on future research topics. Young scholars and Ph.D. students are especially encouraged to submit papers and participate in the doctoral colloquium."


Johannes Schöning, DFKI GmbH
Antonio Krüger, DFKI GmbH
Conference General Chairs



KEYNOTE SPEAKER:  W. Bradford Paley

"Bio: W. Bradford Paley uses computers to create visual displays with the goal of making readable, clear, and engaging expressions of complex data. He did his first computer graphics in 1973, founded Digital Image Design Incorporated in 1982, and started doing financial & statistical data visualization in 1986. He has exhibited at the Museum of Modern Art; he created TextArc.org; he is in the ARTPORT collection of the Whitney Museum of American Art; has received multiple grants and awards for both art and design, and his designs are at work every day in the hands of brokers on the floor of the New York Stock Exchange. He is an adjunct associate professor at Columbia University, and is director of Information Esthetics: a fledgling interdisciplinary group exploring the creation and interpretation of data representations that are both readable and esthetically satisfying."


SAMPLE TOPICS



  • Applications
  • Gesture-based interfaces
  • Multi-modal interfaces
  • Tangible interfaces
  • Novel interaction techniques
  • Data handling/exchange on large interactive surfaces
  • Data presentation on large interactive surfaces
  • User-interface technology
  • Computer supported collaborative systems
  • Middleware and network support
  • Augmented reality
  • Social protocols
  • Information visualizations
  • Interactive surface hardware, including sensing and input technologies with novel capabilities
  • Human-centered design & methodologies





RELATED
Previous Conferences








PLUGS
From the conference website - links to the conference sponsors:




We appreciate the generous support of the following sponsors, without whom this conference would not be possible. Click on the logos to learn more about our generous supporters, and let us know if you are interested in becoming a sponsor.

[Sponsor logos from the conference site: Champions, Benefactors, Donors, Contributors, and Academic Sponsors]

The iPod Nano Watch? (via Wired Gadgets)

Watch the video!:


RELATED

Apple's Newest Watch Is.... Wait, What? It's an iPod Nano? Dylan Tweney, 9/17/10, Wired Gadgets

Interactive 360 Degree Glass-less 3D Video Display with Gesture Sensor: Demo of Sony's RayModeler

The video below gives a demo of Sony's RayModeler, "A 360-Degree Display that doesn't require glasses". The video shows how the auto-stereoscopic 3D content is filmed. It also shows how items within the display respond to gesture interaction. The first prototype was introduced in 2009 and then brought out at the SIGGRAPH conference this summer.



According to an article written by Richard Lawler, Core77 created "Breakout" for the RayModeler, a game similar to Pong.  I'll have to think more about this technology before I form an opinion!

RELATED
Sony's 360-degree RayModeler 3D display brings its glasses-free act to LA, plays Breakout
Richard Lawler, Engadget 7/28/10

Sony's 360-degree 3D display prototype makes virtual pets more lifelike, expensive
Thomas Ricker, Engadget 10/19/09

Sep 15, 2010

Link to TechPsych Post: Laying the Groundwork for Interactive Video Activities for Students with Special Needs: Community Places Road Trip

If you follow the link, you can see the first "draft" of the video.  My goal is to transform the linear video into an interactive video, with hot spots and links:


Laying the Groundwork for Interactive Video: Activities for Students with Special Needs: Community Places Road Trip

Note:  This was my first time trying to shoot video of buildings and signs from a convertible car with the top down ; )

For Multi-Touch Techies and the Tech Curious - Touch and Response article, with code samples, by Charles Petzold

Thanks to Josh Blake, I came across a good article by Charles Petzold  in the September 2010 issue of MSDN Magazine:
Touch and Response


In this article,  Charles Petzold continues his discussion of the multi-touch support in version 4 of Windows Presentation Foundation (WPF).  I like the introduction to his article:
"Programming is an engineering discipline rather than a science or a branch of mathematics, so rarely does there exist a single correct solution to a problem. Varieties and variations are the norm, and often it’s illuminating to explore these alternatives rather than focus on one particular approach." -Charles Petzold
I also appreciate Petzold's discussion about smooth Z transitions, his thoughtful code samples and related links.

RELATED
Multi-touch Manipulation in WPF 
Charles Petzold (August, 2010, MSDN Magazine, UI Frontiers)
"Just within the past few years, multi-touch has progressed from a futuristic sci-fi film prop to a mainstream UI. Multi-touch displays are now standard on new models of smartphones and tablet computers. Multi-touch is also likely to become ubiquitous on computers in public spaces, such as kiosks or the table computer pioneered by Microsoft Surface."
"The only real uncertainty is the popularity of multi-touch on the conventional desktop computer. Perhaps the greatest impediment is the fatigue known as “gorilla arm” associated with moving fingers on vertical screens for long periods of time. My personal hope is that the power of multi-touch will actually provoke a redesign of the desktop display. I can envision a desktop computer with a display resembling the configuration of a drafting table, and perhaps almost as large." -Charles Petzold
Exploring Multi-Touch Support in Silverlight
Charles Petzold (March 2010, MSDN Magazine, Finger Style)


Comment:
I agree with Charles about the need for a re-design of desktop displays.  I like the drafting table as a form factor.  Here are a couple of my previous posts related to this topic:
Emerging Interactive Technologies, Emerging Interactions, and Emerging Integrated Form Factors (2008)
DigiBoard Multi-Touch Mixed Reality Game; Ideas for future design of a flexible, adjustable multi-touch surface (June 2008)



Sep 14, 2010

Sep 13, 2010

For the iPad: Reactable Multi-touch Mobile



More details later!

RELATED
History of the Reactable


How it works (Info from the Reactable website)
"The way the Reactable works is deeply inspired by modular analogue synthesizers such as those developed by Bob Moog in the early 60s and enhances them with new concepts of sampling and digital effects processing."

"While with modular synthesizers one typically had to connect these different modules with patch cables in a complex and error-prone process, with the Reactable this is attained in a much easier and more intuitive way, since connections between the blocks are managed automatically based on their types and affinities and on their proximity."
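The automatic, proximity-based patching described above is simple enough to sketch in code. The following Python toy is my own illustration of the general idea, not Reactable's actual algorithm: each block is linked to its nearest neighbor that accepts one of its output signal types, within a distance threshold.

```python
import math

def auto_connect(blocks, max_dist=0.3):
    """Link each block to its nearest compatible neighbor, in the
    spirit of the Reactable's automatic, proximity-based patching.

    blocks: list of dicts with 'id', 'pos' (x, y in normalized table
    coordinates), and 'outputs'/'inputs' (sets of signal types).
    Returns a list of (source_id, target_id) connections.
    """
    links = []
    for src in blocks:
        best = None  # (distance, target id) of the closest compatible block
        for dst in blocks:
            # A connection needs a shared signal type (e.g. audio out -> audio in).
            if dst is src or not (src["outputs"] & dst["inputs"]):
                continue
            d = math.dist(src["pos"], dst["pos"])
            if d <= max_dist and (best is None or d < best[0]):
                best = (d, dst["id"])
        if best is not None:
            links.append((src["id"], best[1]))
    return links
```

Moving an oscillator block near a filter block would create a link automatically; sliding it away past the threshold would break the link, with no patch cables involved.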


Martin Kaltenbrunner (Co-founder of Reactable)
"His research concentrates on tangible user interfaces and the development of novel human computer interaction concepts within open tools for creative production. As co-founder of Reactable Systems he had been mainly working on the interaction design of the Reactable - an electronic musical instrument with a tangible user interface. He is author of the open source tangible interaction framework reacTIVision and the related TUIO protocol, which have been widely adopted for the realization of tangible tabletop applications."
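The TUIO protocol mentioned in Kaltenbrunner's bio sends tracker updates as OSC messages; in TUIO 1.1, for example, a 2D cursor (touch point) update is a `/tuio/2Dcur` `set` message carrying a session ID, normalized position, velocity, and motion acceleration. Ignoring the OSC transport layer, decoding that payload might look like this small sketch:

```python
def parse_2dcur_set(args):
    """Decode the argument list of a TUIO 1.1 '/tuio/2Dcur set'
    message: session id, normalized x/y position (0.0-1.0),
    x/y velocity, and motion acceleration."""
    session, x, y, vx, vy, accel = args
    return {
        "session": int(session),
        "pos": (float(x), float(y)),
        "velocity": (float(vx), float(vy)),
        "accel": float(accel),
    }
```

A tabletop application would feed each decoded update into its gesture recognizer, keyed by session ID so that individual fingers can be tracked across frames.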


SOMEWHAT RELATED
Soundythingie for the iPad
iPad Apps for Making Music: What's Coming Later
SurfaceDJ iPhone App (Vectorform)

Sep 11, 2010

Off Topic: Ad with 100 cute cats and kittens roaming free around IKEA (plus 2 video clips of cute kittens and 1 of "dancing" dogs)

IKEA CATS AD:


THE MAKING OF THE IKEA AD, "HERDING CATS"


THREE CUTE LITTLE KITTENS (Just over 2 weeks old)


THREE CUTE KITTENS  AT 5 WEEKS


DANCING DOGGIES

For Techies and the Tech Curious: A Few Links - IEEE Multimedia Innovations, Surface Computing, Usability, and Computer Education Research

IEEE Multimedia magazine online  
July-September 2010 Issue:  Multimedia Innovations
"This special issue addresses the notion that multimedia is a compelling field that provides a driving force behind most of today's technology innovations."
Games Only Scratch the Surface:  A Challenge for Developers - Lou Silverstein, IdentityMine
In this article, Lou Silverstein discusses the latest news about Microsoft Surface and encourages developers to think about joining a dialogue on this topic.
This article is a gentle reminder, written for developers of traditional productivity software,  about the importance of usability and understanding non-technical users.
Why does the UK and Sweden take Computing Education Research more seriously than US? Mark Guzdial, Blog@cacm (Communications of the ACM)
In this article, Mark Guzdial discusses some of the trends in computing education research outside of the US and shares his discovery of the Centre for Excellence in Teaching and Learning in Computer Science.


Thanks to Nat Torkington, of O'Reilly Radar,  for some of the links!


SOMEWHAT RELATED
Two years ago I wrote a post about touch-screen & surface computing - we're still not there yet, although interactive tables are showing up in more places:
Interactive Touch-Screen Technology, Participatory Design, and "Getting It"

Sep 8, 2010

iPad Apps: Supporting Communication for Young People with Autism (& Links to Moms with Apps)

Have you ever watched a kid pick up an iPad for the first time and just go with it? 


Some parents have found that the same is true for their children who have autism.  The iPad is a great way to support learning and communication in that it is user-friendly and easy for children - and parents - to understand. 


Here are two videos that were shared with me by Liz Ditz,  author of the "I Speak of Dreams" blog.  The first video is of a boy with autism. The second video is of his little sister, interacting with the iPad to watch a "home-made" social story about a family trip to a coffee shop, created in the "Stories to Learn" app.



YOUTUBE INFO:

"Another too-cool app for his iPad, this one all about writing as well as spelling (http://itunes.apple.com/app/iwritewords-handwriting-game/id307025309?mt=8). For each letter in the word, the app says the letter, has him trace it in super-easy guided steps, and his written letter replaces the original letter in the word (at top of screen). When he finishes, the word appears in his handwriting. The program then spells it and pronounces it, displays a kid's illustration of the word -- and Leo gets to shake the original letters into a hole in the corner of the screen. LOVE IT."

STORIES TO LEARN



YOUTUBE INFO:
"Our family (and a friend) worked together to make this social story about visiting our local coffee shop for my son with autism. We used Stories2Learn (http://itunes.apple.com/us/app/stories2learn/id348576875?mt=8), on our iPad. It was so easy! And he loves it. In fact he loves it so much I had to have his sister demo it, as he loves to play the voiceovers over and over again. :)"


RESOURCES
Moms with Apps
Moms with Apps for Special Needs
Moms with Apps:  Characteristics of great apps for kids with autism


    Sep 5, 2010

    Planet in Action's 3D Sound in Google Earth: Part of the Interactive, Touchable 3D Picture (and thoughts about how this might play out on interactive whiteboards in education)

    I came across a link to Planet in Action's video demonstration of 3D sound in Google Earth, thanks to a post by Nat Torkington in O'Reilly Radar.  WOW!   


    The company shows that a 3D sound engine works nicely within the Google Earth environment.  Planet in Action tweaked the SoundManager 2 library from schillmania.com, resulting in a sound3D JavaScript object that can be used to program geo-located sounds in Google Earth.
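    The sound3D object itself is JavaScript, but the underlying idea - attenuating and panning each geo-located sound based on the listener's position - is simple enough to sketch. Here is a toy Python illustration; the function name, the linear rolloff, and the max_dist value are my own assumptions, not Planet in Action's code:

```python
import math

def sound3d_params(listener, source, max_dist=500.0):
    """Toy 3D-sound parameters for a geo-located source.

    listener, source: (x, y) positions in meters on a local plane.
    Returns (volume, pan): volume fades linearly from 1.0 at the
    source to 0.0 at max_dist; pan runs from -1.0 (source to the
    listener's left) to 1.0 (to the right), assuming the listener
    faces the +y direction.
    """
    dx = source[0] - listener[0]
    dy = source[1] - listener[1]
    dist = math.hypot(dx, dy)
    volume = max(0.0, 1.0 - dist / max_dist)
    pan = dx / dist if dist > 0 else 0.0
    return volume, pan
```

    Recomputing these parameters for every sound source as the camera moves through the scene is what produces the "sound landscape" effect heard in the demo.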


    The 3D sound concept, in my opinion, will be useful in creating immersive interactive educational experiences, especially if students have access to newer interactive whiteboards with decent sound systems.    


    Below is the video from Planet in Action, along with a bit of promotional information:

    The video is of a port in San Sebastián, Spain.  The "sound landscape" is realistic, and a range of sounds, from various spots in the scene, can be heard during the exploration.  


    "Google Earth is a highly detailed 3D representation of our entire planet.  Rather than just looking at it, why not play on it!  PlanetInAction.com brings you top quality ideas, applications and concepts that will let you experience your planet in a whole new way." -Planet in Action


    Thoughts:



    My bet is that this concept will play out very well in the future in family rooms outfitted with large displays and decent surround sound systems. 

    Why?  No need to explain.  Just take a look at what is going on with 3D TV and film, 3D HD gaming, interactive TV, internet-enabled large HD TVs, Google's "Lean Back" concept, and the "Slow Media" movement.



    By the way, Planet in Action's work strikes a chord in me.  One of my HCI team projects was a travel-planning application that used Google Earth in the demonstration prototype.  We focused on cruise travel planning, mostly because I had lots of pictures and video clips from my cruise vacations and other travels.  I even worked on part of the project while I was ON a cruise ship.  (That was before the economic downturn.)   Below is a video of a cruise ship model that Planet in Action created for a cruise line, for use in Google Earth:

    Celebrity Constellation Virtual Fly-by


    SOMEWHAT RELATED





    As I mentioned in my previous post, "Video Experiments: Sea Life Medley - Extended version with music",  I'm experimenting with some ideas to create interactive educational video clips for future use on higher-resolution interactive whiteboards and other large touch-screen surfaces.   


    Part of my motivation stems from an immediate need for "touchable" content for the students I work with who have more significant disabilities, including autism - and, of course, for my colleagues. All of the classrooms in the program have IWBs - mostly brand-new SMARTBoards - and we also have a SMARTTable that is begging for more content.


    As I worked with students using my video clips and a new SMARTBoard, I noticed that students have a strong desire to touch what they see on the screen.  It would be great if they could stop the video and further explore an item of interest - for example, the jellyfish in the above picture.  


    Imagine putting your hand on the jellyfish, and having the power to explore it as a 3D model! If you are a grad student or researcher who is working on this concept, or something related, please let me know.


    Getting back to the 3D landscapes/soundscapes in Google Earth - this is something that I'd like to try at school.  I wonder if Planet in Action allows for touch-screen interaction so that students who are at the SMARTBoard can easily explore the 3D-sound enabled scenes in Google Earth.

    Sep 4, 2010

    Video Experiments: Sea Life Medley: Extended version with music

    I'm playing around with my sea life video clips to get ideas for creating interactive educational videos in the future for use on interactive whiteboards and other large touch-screen surfaces.

    Here's my YouTube description:
    This is a medley of several video clips taken of sea life, mostly living in aquariums. Awesome jellyfish can be viewed from 1:34- 3:45. The music consists of clips from iMovie, and then William Orbit's "Sea Green" and "Surfin". I created this video for students I work with who have more complex disabilities, such as severe autism.

    Sep 1, 2010

    Interactive: Mirror Fun and Diamonds Alchemy Game by Playbat

    MIRROR FUN

    The Mirror Fun application, along with the source code, can be downloaded from the TECHSPARKED website.  (You might need to join Techsparked.)

    Info about Mirror Fun from Playbat:
    "This application shows you in a "rubber mirror" - when you move, the mirror deforms accordingly. Also, you can run application in "mesh mode", then you will see virtual mesh, which deforms when you move."


    DIAMONDS ALCHEMY GAME


    DIAMONDS ALCHEMY Tech Behind the Scene

    Aug 31, 2010

    Osmosis: Multi-touch systems for... everywhere!

    Not long ago I had the opportunity to chat with Stuart McLean, the founder of Osmosis, a company that delivers customized multi-touch hardware and software systems that support human-centered natural user interaction.   Stuart has many years of experience working in more traditional IT/business roles, and knows from this experience that there is a better way to support human-computer interaction, including interaction between people.

    Like many of us in the "NUI" community, Stuart was impressed by the video of Jeff Han's 2006 TED Talk, which demonstrated a variety of awesome multi-touch, multi-user applications on a high-resolution drafting table.  Stuart saw the importance of natural user interfaces and interaction and became involved with the NUI Group, a "global research community focused on the open discovery of natural user interfaces". 

    Unlike traditional tech companies, Osmosis is a collaboration between a global network of engineers, designers, and developers who share the "NUI" vision. This collaboration enables the company to provide solutions for clients across a range of countries, cultures, and domains.


    Below is a photo-gallery of some of the applications and systems developed by Osmosis:


    Multi-touch by Osmosis
    GALLERY
    As you can see from the gallery photos, Osmosis provides a range of possibilities for their clients and potential clients.  All of the displays are high-definition.  Some are projection systems, and others are displays with multi-touch sensing technology.  Since the construction is modular, a variety of form factors are available.  High-quality surround and domed sound systems are available.  Applications include information kiosks, point of sale/digital signage, hospitality, presentation and training, education, and audio-visual performance and production.  Osmosis also provides applications that support interaction with tangible objects.

    Below are two videos that give a taste of what Osmosis is all about:

    OSMOSIS DEMO REEL

    Demo Reel from Osmosis on Vimeo.

    MULTI-TOUCH EVERYWHERE

    MT Everywhere from Osmosis on Vimeo.

    I can see where some of these applications would be great in K-12 educational settings.  Just look at the joy on the faces of the kids in the Multi-Touch Everywhere video!

    (Short video clips of the Osmosis applications in action can be found in the showcase page of the company's website.)

    Aug 29, 2010

    Propellerheads' Gigantic Outdoor Drum Machine: Another "Playful Interface" and Media Facade - using Phidgets!

    I admit I have a passion for interactive media facades and playful outdoor interfaces. I'm geeky like that. Imagine my excitement when I saw the link to Propellerhead's outdoor drum machine that was taken out "in the wild" at night in Stockholm, Sweden, for people to play with!   Thanks, Ben Ullman, for the link!

    "No Swede would ever think of doing something like this without a permit."



    I especially like that the team used Phidgets to create this drum machine.  Phidgets enable you to hook up a mix of digital or analog sensors to a USB interface.  According to a post on Create Digital Music, the interface below was supported by a Phidget 8/8/8 interface hooked to a computer, and a force sensor under the pad. The team used a Phidget Voltage divider to make the scaling and calibration magic happen.   (I had a chance to get my hands on some Phidgets, briefly, when I took Heather Richter Lipford's Ubicomp class at UNC-C a few years ago.  I would love to play with them in-depth in the future.)
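    I don't have the team's code, but the scaling-and-calibration step for a force sensor on an analog input is easy to sketch. In this hedged Python illustration, the baseline and peak readings are made-up placeholders you would measure from the actual sensor (a Phidget 8/8/8 analog input reports sensor values from 0 to 1000):

```python
def calibrate(raw, rest=30, peak=900):
    """Map a raw analog reading (0-1000 on a Phidget 8/8/8 input) to a
    normalized 0.0-1.0 hit strength.  `rest` is the sensor's no-touch
    baseline and `peak` the hardest expected hit; both are placeholder
    values you would measure during calibration."""
    span = peak - rest
    return max(0.0, min(1.0, (raw - rest) / span))

def to_velocity(raw, threshold=0.05):
    """Convert a reading to a MIDI-style velocity (0-127); readings
    below the threshold are treated as sensor noise and ignored."""
    strength = calibrate(raw)
    return 0 if strength < threshold else round(1 + strength * 126)
```

    In the real installation, something like `to_velocity` would run on each new reading from the Phidget interface and trigger a drum sample with the resulting velocity.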

    RELATED
    Behind the Scenes of Propellerheads' Oversized Drum Machine
    Peter Kirn , Create Digital Music, 8/26/10
    Propellerhead Reason (Virtual studio rack)


    What are Phidgets? Here is the info from the Phidgets website:


    "Phidgets are a set of "plug and play" building blocks for low cost USB sensing and control from your PC. All the USB complexity is taken care of by our robust API. Applications can be developed quickly by programmers using their favorite language: C/C++, C#, Cocoa, Delphi, Flash AS3, Flex AS3, Java, LabVIEW, MATLAB, Max/MSP, MRS, Python, REALBasic, Visual Basic.NET, Visual Basic 6.0, Visual Basic for Applications, Visual Basic Script, and Visual C/C++/Borland.NET. Click on Programming if you want to look at Getting Started Guides or Code Samples."


    Here is a similar giant sampler in action at a Family Force 5 performance:

    Aug 28, 2010

    Update & Links about the Interface Culture Exhibition at the 2010 Ars Electronica Festival, via Martin Kaltenbrunner

    Here is an update about the work of students in the Interface Cultures program, which will be presented at the Playful Interface Cultures exhibition at the Ars Electronica Festival in Linz, Austria:

    "This year‘s presentation by students in the Interface Cultures program showcases newly emerging artistic skill profiles at the nexus of interactive media technology and interface technology. These artists adroitly combine complex disciplines such as communications technology, biosciences, physical computing, interaction design, fashionable technology and information visualization in their test environments and experimental concepts."

    via Martin Kaltenbrunner (Interface Culture at the University of Art and Industrial Design):
    statement: http://www.interface.ufg.ac.at/arsPDF
    poster: http://www.interface.ufg.ac.at/files/playful_interface_cultures.pdf
    festival page: http://new.aec.at/repair/en/2010/08/16/playful-interface-cultures/
    vimeo channel: http://vimeo.com/channels/interfaceculture



    SOMEWHAT RELATED
    Interface Culture Lab Blog
    Did you know that you can get a Masters degree in Ludic Interfaces?
    Ars Electronica 2010 Flickr Site
    WIRED: Here comes Ars Electronica 2010 (Bruce Sterling)
    Ars Electronica/Futurelab

    Aug 27, 2010

    iRiS Interaction at the Playful Interface Cultures Exhibition & Ars Electronica Festival (video, short post, and links)

    Earlier this week, I shared my wish to attend the Playful Interface Cultures exhibition at the Ars Electronica Festival.   Since I can't get to Linz, Austria to see the festival first hand, I'll share a video of the iRiS project that is part of the exhibition and festival - the link to the video was sent to me by Johannes Schöning, one of the iRiS project team members.  iRiS stands for Immediate Remote Interaction System, and is the outcome of a joint research project of the University of Saarbrücken and the University of Munich, in Germany.

    If you happen to be at the festival, you can see the project in action between 10 and 11 p.m. on September 5th and 6th.


    iRiS - Immediate Remote Interaction System from awiethoff on Vimeo.


    RELATED
    Post: Updated iRiS:  iPhone Remote Interaction System: Paint a Media Facade on your Favorite Building
    More Urban Screens and 3D Media Facades
    Ars Electronica is located in Linz, Austria.
    iRiS Project Website

    Aug 24, 2010

    I wish I could go to the Playful Interface Cultures exhibition at the Ars Electronica Festival!

    Here is part of the description of Playful Interface Cultures:

    "This year's exhibition of the Interface Cultures Department is showing emerging artistic ability profiles transformed by heterogeneous sets, tools and methods of creation departed from interactive media- and interface technology. To take advantage of disciplines related to communication technology, bioscience, physical computing, interaction design, fashionable technology or information visualisation, artists are focusing on new ways to combine these complex frameworks within test environments and experimental concepts. This origin is supporting group- and project based developments and inspires a flexible and multi-perspective sharing of knowledge and abilities within the process of artistic creation."

    Video about the upcoming Ars Electronica Festival:



    RELATED:
    Interface Culture Lab: playful interactive art
    Ars Electronica Festival  Linz, Thursday, September 2-Saturday, September 11
    Ars Electronica Linz GmbH
    Ars-Electronica-Straße 1
    4040 Linz
    Austria

    Tel. 0043.732.7272.55

    Fax. 0043.732.7272.2
    E-Mail: festival@aec.at


    AGB Ars Electronica Linz GmbH

    Aug 21, 2010

    Reflections About Interactivity in my Present World

    Background:  For those of you who are new to this blog, I work as a school psychologist in a public school district in North Carolina. During the mid-2000s, I took a variety of computer courses, leading up to a decision to work part time while taking graduate courses in human-computer interaction, games, information visualization, ubiquitous computing, VR, and related topics. I returned to work full time in 2008, due to the economic downturn.

    I continue to maintain my passion for technology that supports natural user interaction, and share my interests on this blog and, to a lesser extent, two other blogs.  I initially started blogging because it was a requirement for one of my courses, and found that my blogs served me well as on-line filing cabinets.

    Update:  I don't usually post much about my work on this blog, but I have some exciting things to share.

    New School Year, New SMARTBoard: I am excited about the upcoming school year because my "home" school, Wolfe, a program that serves mostly teens and young adults up to age 22 who have more severe disabilities (including autism), now has an interactive whiteboard in every classroom.

    School started for Wolfe students on July 26th, and three more whiteboards were installed.  One of the new SMARTboards is in the speech and language room, along with a multi-touch SMARTTable the school received last April, one is in the Art/Work Adjustment room, and one is in the room I use with small groups of students, which is used for music one day a week.


    Although I have used interactive whiteboards in my work with students for several years, this is the first year that I have had one available to me most of the time in a school. Since most of the teachers received their IWBs over the past year, there is a pressing need to figure out what works, why it works, and how to create useful interactive content that is appropriate for the needs of students who have severe or multiple disabilities. We really can't wait.  

    Why is this important to me?

    Observations of positive changes:
    Students who previously had limited interest in applications on the computer, or a limited attention span for such activities, demonstrate a high degree of interest and engagement with the IWB activities we've tried so far.  Many students who rarely attend to learning materials, or attend only briefly, actively look at the IWB screen, interact with it, and smile.   Some students who rarely - or never - speak sometimes vocalize more often during IWB activities!   Some are making connections between learning materials and what they interact with on the screen, something that simply did not happen previously.  Some students who rarely established eye contact with others or demonstrated joint attention do so during IWB activities, something I discussed in a post on the TechPsych blog.

    Technology and my role as a school psychologist:
    Designing effective interventions is part of my job as a school psychologist, and mixing interactive whiteboards and interventions is fairly uncharted territory.  I know that there are probably a few other school psychologists, counselors, social workers, and speech/language therapists out there who are wondering what to make of this technology for the students they serve directly, or indirectly through consultation.  There isn't much literature about this topic, so it is difficult to know what is truly "evidence based".

    We are in the process of discovery. Within a school that has an IWB in every classroom, serving students with the most complex needs in the school district, how can I best follow professional school psychology practices (pdf) regarding the provision of direct and indirect services? Some of these expectations are listed below:

    "School psychologists promote the development and maintenance of learning environments that support resilience and academic growth, promote high rates of academic engaged time, and reduce negative influences on learning and behavior."

    "School psychologists use information and assistive technology resources to enhance students’ cognitive and academic skills."

    "School psychologists work with other school personnel to develop, implement, and evaluate effective interventions for increasing the amount of time students are engaged in learning."

    "School psychologists facilitate design and delivery of curricula to help students develop effective behaviors, such as self-regulation and self-monitoring, planning/ organization, empathy, and healthy decision-making."

    Working towards solutions:
    I work closely with my colleague Kelly Cross, a speech and language therapist who serves Wolfe school and two classrooms of students with severe autism at a "regular" elementary school. Kelly also serves as the assistive technology and augmentative communication consultant for our school district.  She's used interactive applications and web resources for many years along with more traditional "hands-on" materials, but like me, has had limited access to IWBs until recently.  She now has a SMARTboard in her room, along with the portable multi-touch SMARTTable Wolfe received last April.  We work with many of the same students, so with the influx of IWBs in our school, we've stepped up our collaboration.

    One of Kelly's challenges is to figure out ways for teachers to integrate assistive technology and/or augmentative communication systems into their work with students during activities that involve interactive whiteboards.   Most of the research in the area of assistive technology/aug com focuses on the use of technology to access applications displayed on laptops or computer monitors, or on factors related to the use of individual communication devices. The vSked project, led by Dr. Gillian Hayes at the University of California at Irvine, is on the right track: it incorporated a large interactive display used with students with autism, along with smaller hand-held communication systems, which I've discussed in a previous post.


    vSked in action

    Workshop Presentation
    This past week, Kelly and I presented at a workshop held in the Arlington VA school district that focused on the use of interactive whiteboards and related applications and web resources with students with special needs.  We shared some of the exciting things we've noticed with our students, along with "before and after" examples.  Included in the presentation were some of the video clips I quickly put together during the first days of school, which had an impressively positive impact on students when used during interactive whiteboard activities. (Some of these video clips can be found in a previous post: Video clips that help students with autism learn and feel calm at the same time!)

    During the workshop, we discussed a few guiding principles, such as the "least dangerous assumption" and the concept of Universal Design for Learning.   We also provided a sampling of resources that, previously used on the "small screen", appear to be powerful when students interact with them on the larger whiteboard screens.  Below are a few of the resources we shared:

    Clicker5 is an application to support reading and writing that provides natural voice output and a child-friendly word processor. It works well on interactive whiteboards and can incorporate several communication symbol systems. It is a good tool for informal assessment of students with communication and other related needs.




    Sensory World provides an intuitive navigation system for students to use as they explore various areas of a house and engage in learning and related activities. The site map for all of the areas of the house, as well as the activities, is graphical.




    We shared resources from the National Gallery of Art's Kids Zone, a great website that offers plenty of interactive content:


    For teachers who want a limited browsing environment for their students, the Zac Browser is great for use on IWBs and the new large all-in-one touch-screen displays:
    Zac Browser

    Another website that is good for visual-based browsing is Symbol World:

    "The intention is that symbol readers, teachers and carers will contribute, and we are delighted at the number of contributors that have already sent us their work."

    We shared a recent find, Taptu, which can be used on an IWB.  Although Taptu focuses on the mobile touchable web, accessed through smartphones with internet access, it provides links to a wide range of websites designed with intuitive touch interaction in mind, perfect for students with special needs.  In many cases, Taptu web navigation resources work fine on the larger screen of an IWB.  The advantage is that an icon/visual-based touch navigation system supports students who can't type, as well as those who have very limited reading skills.


    Sports Navigation for Taptu

    Below is a slideshow about the touchable web:
    Taptu: Virtual Roundtable


    Update: SMARTTable
    At Wolfe, we'd like to create a conversation wheel with related activities for the SMARTTable that coordinate with SMARTBoard activities.  We'll probably experiment with the conversation wheel concept on the SMARTboard first.


    Considerations:
    One thing we've learned over the past few months is that it is a bit more complicated to create content and activities for the SMARTTable than for a SMARTBoard notebook or Promethean flipchart.   We hope to customize it to support scaffolded classroom activities for students with special needs, but it might take more time than we had anticipated.  The school's immediate need is touchable interactive content, suitable for our students, for all of the classroom IWBs.

    Complications:
    Programming for the SMARTTable, for those who have 64-bit computers, requires Windows 7, Visual Studio 2008, and Expression Blend 3.   My HP Touchsmart, my home computer, is 64 bit, but runs Vista.  My school-issued laptop runs Windows XP and is also 64-bit.  To upgrade it, I will have to send it to the tech department for several weeks.   It will be slow-going, since I have to plan for my day-to-day activities and evaluations with students first.

    Possible SMARTTable Solutions:
    I'm hoping that some students from UNC-Charlotte might want to try their hand at multi-touch programming and help us out!  We'd welcome volunteers from other universities as well.   Wolfe's principal, Mary Jo Breckenridge, is very supportive of the use of innovative technologies with students with special needs, and would figure out a way to make a collaboration happen.

    Upcoming Interactive Multimedia Technology Posts
    I have about 6 posts in draft form about interesting interactive technologies, research, people, and companies involved with multi-touch, interactive multimedia, and natural user interface/interaction applications. I'll get them up as soon as possible.

    One post is about 3M Touch Systems. Another is about an interesting multi-touch start-up company, Osmosis.  I revisited Ballantyne Village to update the use of interactive touch screen technology, something I discussed in detail, along with photos and video clips, in a 2008 post, Technology Supported Shopping and Entertainment User Experience at Ballantyne Village: "A" for concept, "D" for touch-screen usability.   I got some inside scoop about the rationale behind the changes at the upscale center from a seasoned owner of a high-end audio-visual store whom I interviewed about 3D TV.   I have some interesting information about current research in "glasses-less" 3D displays. Sharp will be coming out with a 3D cell-phone camera AND a glasses-less cell-phone display in the not-too-distant future.

    I'm preparing for posts that highlight a few of my favorite blogs. Until then, take a look at the recent posts on Tracy Boyer's Innovative Interactivity blog, and also InteractiveTV Today.

    RELATED & SOMEWHAT RELATED
    "The AAC-RERC conducts a comprehensive program of research, development, training, and dissemination activities that address the NIDRR priorities and seek to improve technologies for individuals who rely on augmentative and alternative communication (AAC) technologies. The mission of the AAC-RERC is to assist people who rely on augmentative and alternative communication to achieve their goals by advancing and promoting AAC technologies and supporting the individuals who use, manufacture, and recommend them."



    Janet Light, Kathy Drager, Penn State University




    Jeff Higginbotham's Bibliography:  Viewing AAC Through Authentic Social Interactions (pdf)

    ProLoquo2Go (iPhone/iPod Touch/iPad communication system)


    Also posted on the TechPsych blog