Showing posts with label assistive technology. Show all posts

May 2, 2012

Looking forward to CHI 2012: Child Computer Interaction, Educational Interfaces, Health, Special Needs, and Much More!

I am looking forward to attending the ACM-CHI 2012 conference!


On Saturday and Sunday I'll be at the Educational Interfaces, Software, and Technology (EIST) workshop, which is part of the Child Computer Interaction/HCI and Kids community. Below, I've selected a few papers, presentations, and panels that I hope to attend during the remainder of the conference.


If you are curious about other topics that will be presented at CHI 2012, take a look at the short video previews on the CHIMadness2012 YouTube channel.  

VocSyl


VocSyl: Designing Visualizations to Facilitate Multisyllabic Speech with Children with Autism and Speech Delay (pdf)
Joshua Hailpern, Andrew Harris, Reed LaBotz, Brianna Birman, Karrie Karahalios, Laura DeThorne, Jim Halle (Social Spaces Research Group, University of Illinois at Urbana Champaign)

Social Scripts
 
Fatima Boujarwah: Facilitating the Authoring of Multimedia Social Skills Instructional Modules for Adolescents with High-Functioning Autism

Autism and Maths Tutor
Research Showcase - Autism & Maths tutor - CHI 2012 from George Chan on Vimeo.

ShoeSense


Gilles Bailly, Joerg Mueller, Michael Rohs, Daniel Wigdor, Sven Kratz
ShoeSense: A New Perspective on Hand Gestures and Wearable Applications


Escobedo, L., Nguyen, D.H., Boyd, L., Hirano, S.H., Rangel, A., Garcia, D., Tentori, M., & Hayes, G.R. (2012). MOSOCO: A Mobile Assistive Tool to Support Children with Autism Practicing Social Skills in Real-Life Situations. Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI 2012). Austin, Texas (May 5-10, 2012). New York: ACM Press. To appear

Design of an Exergaming Station for Children with Cerebral Palsy

(Research team from Queen's University, Canada, Holland Bloorview Kids Rehabilitation Hospital)


Illumishare


Paper: IllumiShare: Sharing Any Surface
Sasa Junuzovic, Kori Inkpen, Tom Blank, Anoop Gupta, Microsoft Research

Panel: Tangible Interfaces for Children: Cognitive, Social & Physical Benefits and Challenges
Shuli Gilutz, Sandra Calvert, Kathleen Kremer, Barbara Chamberlin, Geri Gay

Paper: Improving Literacy in Developing Countries Using Speech-Recognition-Supported Games on Mobile Devices 
Anuj Kumar, Pooja Reddy, Anuj Tewari, Rajat Agrawal, Matthew Kam


Tap & Play: An End-User Toolkit for Authoring Interactive Pen and Paper Language Activities (pdf)
Ann Marie Piper, Nadir Weibel, James D. Hollan


RELATED
About the CHI Conference (from the CHI 2012 Conference Preview)
"Originally a small conference for psychologists interested in user interface design, the annual CHI conference has grown to include a very diverse participant group (such as interaction designers, computer scientists, engineering psychologists, developers, performing artists and more). It has also grown to deal with larger problems such as the organizational integration of technology and the use of technology in the home rather than only office settings. This year's conference marks 30 years of research, innovation and development in the field of Human-Computer Interaction and is expected to draw more than 2500 professionals from over 40 countries. The experience at CHI 2012 offers innovative opportunities for interacting with future technologies. The following areas represent a small portion of the total conference. For complete information about this year's conference, consult the Advance Program."


"Featuring over 900 works, the CHI conference is the premier worldwide forum for the exchange of information on all aspects of human-computer interaction. Typically the works presented address the concerns of design, engineering, management and user experience professionals. This year's conference also features works that focus on: Digital Arts, Games and Entertainment, Human-Computer Interaction for Kids, Health and Sustainability. Works are presented in several different venues. After two days of intimate pre-conference workshops, the main conference includes panel discussions, courses that provide leading-edge HCI knowledge, paper sessions, case studies, works-in-progress, student competitions (involving design, research and games), interactive demonstrations, special interest groups, and a Tuesday evening video night (popcorn included)." 



SIGCHI, the Special Interest Group on Computer-Human Interaction, is part of ACM, the Association for Computing Machinery.


Child Computer Interaction/HCI for Kids Chairs:
Janet C Read, University of Central Lancashire, UK
Panos Markopoulos, Eindhoven University of Technology
Allison Druin, University of Maryland


Educational Interfaces, Software, and Technology workshop organizers
CHI 2012 Facebook Page
EIST 2012 Facebook Page


SOMEWHAT RELATED 
The Autism Experience in Ubiquitous Computing (pdf)
Monica Tentori, Ph.D., UABC, Ensenada, BC


Gillian Hayes:  Assistant Professor in Informatics, School of Information and Computer Sciences; Department of Education, University of California, Irvine.
Social and Technological Action Research (STAR), 

Interactive Technologies for Children with Special Needs IDC 2012 
(11th International Conference on Interaction Design and Children)
As special themes, IDC 2012 would like to discuss children's needs under the perspectives of "pedagogical aspects in theory and practise" as well as "children from diverse cultural backgrounds".


Grawemeyer, B., Johnson, H., Brosnan, M., Ashwin, E., Benton, L. (2012) Developing an Embodied Pedagogical Agent With and For Young People with Autism Spectrum Disorder. Accepted for presentation at the 11th International Conference on Intelligent Tutoring Systems, ITS 2012.

Jul 27, 2011

Apple's iOS 5 facial recognition feature opens up interactive possibilities

I've been thinking about creating my first iPad app, and as I was searching for information, I came across a few articles related to Apple's new iOS 5 that I found interesting.  


Because my target user group includes young people with autism spectrum disorders (ASD), I was intrigued by the possibility that the facial recognition APIs might provide a means of assessing mood or emotional states. Most of us understand that our faces function as mirrors of our feelings, and we use our facial expressions to communicate those feelings to others. Unfortunately, this is a concept that is difficult for young people with ASD to understand. My hunch is that there is an "app for that".


It would be great to create a little iPad app for young people with ASD for education, intervention, and/or communication activities that incorporates the facial recognition feature!


Apple's iOS facial recognition could lead to Kinect-like interaction
Darrell Etherington, GigaOm/Reuters, 7/27/11
Here is a quote from the above article:
"You could create apps that track a user’s eye movement and dynamically change content accordingly, for instance. App developers might even be able to use data gathered from facial recognition APIs to identify so-called “hotspots,” providing insight about where a user is looking most within an app and arranging content accordingly. In time, an iPhone app might even be able to assess the emotional state of the user, based on whether they’re frowning or smiling, and address the user in a manner appropriate to their mood. It might also be able to tell how engaged users are with mobile ads and content, which might be useful for iAd customers, among others."
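The quote above is speculative, and so is this: below is a minimal, hypothetical Python sketch (not Apple's API, and not any real app) of the general idea of adapting an app's response to a detected facial expression. The `face` dictionary and the `choose_prompt` function are made up for illustration; they stand in for whatever a real face-detection API would return.

```python
# Illustrative sketch only: this is NOT Apple's facial recognition API.
# It assumes a hypothetical detector that reports whether a face is
# smiling, and shows how an app might adapt its response to mood.

def choose_prompt(face):
    """Pick an app response based on a detected facial expression.

    `face` is a dict such as {"has_smile": True} -- a stand-in for
    whatever a real face-detection API would return.
    """
    if face is None:
        return "Hi there! Can you look at the screen?"
    if face.get("has_smile"):
        return "You look happy! Ready for the next activity?"
    return "Let's try something fun together."

# Example: adapt the prompt to three different detection results.
for detected in [{"has_smile": True}, {"has_smile": False}, None]:
    print(choose_prompt(detected))
```

For an intervention app, the interesting part would be the reverse direction as well: showing the child how their own expression is being "read" by the device.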


Stan Schroeder, Mashable, 7/26/11
Below is a video from the above post that demonstrates an app developed by Polar Rose, a company that was purchased by Apple.
RELATED
Apple plans native panorama functionality in iOS5
Seth Weintraub, 9TO5Mac, 7/8/11
iOS 5's final release may include "Assistant" speech-to-text feature
Chris Rawson, TUAW, 7/23/11

Jul 9, 2011

"Rise and Shine": TED video featuring Simon Lewis and his recovery from a serious brain injury, and how cutting-edge technology helped. (Video includes interesting visuals)

Simon Lewis was in an automobile accident that resulted in severe injuries, including a head injury that left him in a coma. As a result of his experience, he wrote a book, "RISE AND SHINE," covering his 15-year journey of recovery and regeneration. In the book, Lewis shares what he learned along the way, and how cutting-edge technology and some non-traditional thinking helped him move forward.


Simon Lewis had the opportunity to discuss his journey at a TED partner event in India in December 2010. The following video of his talk is about 22 minutes long, but worth taking the time to watch. A variety of visuals illustrate his journey and the research he did along the way. Near the end of the video, Simon Lewis demonstrates some of the wearable technologies that support his functioning.


Here is the blurb about the book from the Borders website:

"An impassioned tale of survival and recovery, this inspirational story recounts the author’s horrific car accident, his subsequent coma, and the more than 15 years of cutting-edge treatments and therapies endured during convalescence. With specific details of the rigorous rehabilitation process that ensued, including numerous breakthrough and experimental surgeries, the book also provides practical insight into navigating the treacherous world of insurance and how to differentiate between the often conflicting medical opinions offered. In addition to describing the numerous procedures undergone, the author tells not only of his pain, frustration, and despair, but also of his childlike wonder at the beauty and miracle of creation. A first-person account of sudden, unexpected tragedy and life-affirming courage, this remarkable tale of regeneration imparts lessons both medical and spiritual."



The Rise and Shine website includes many of the graphics used in Simon's video, and is worth taking some time to explore. Some of the graphics are interactive. (Since the website relies on Flash, it won't work if you try to access it using an iPad.)


Comment: As a school psychologist with additional training in neuropsychology, specifically assessment and intervention for children and teens who have experienced traumatic brain injury (TBI), I find this topic important. I'm watching the video a second time, and I plan to read his book.

(Cross-posted on the TechPsych blog.)

Jan 12, 2011

Multi-modal Interactive Maps for People with Visual Impairments: Featuring a Stantum multitouch screen with a tactile layer.

To learn more about this project, take a look at the video and related publications below. This is a great example of a team that is harnessing emerging technologies to improve the lives of people with disabilities.


Video: "Multimodal Maps for Blind People"


Website


Publications
Anke Brock, Philippe Truillet, Bernard Oriola, Christophe Jouffrais (IRIT CNRS and Université de Toulouse): Usage of Multimodal Maps for Blind People: Why and How
ITS’10, November 7–10, 2010, Saarbrücken, Germany
Paper: http://www.irit.fr/~Philippe.Truillet/projects/doc/MultimodalMapsForTheBlind-ITS10.pdf
Poster: http://www.irit.fr/~Philippe.Truillet/projects/doc/Poster-ITS10.pdf

Stantum (Multi-touch screen used for the application.)
Ivy Middleware (Used in this application.)

Cross-posted on the TechPsych blog.

Nov 2, 2010

EyeTube for YouTube! Eye-gaze interaction software, free and downloadable from GazeGroup

Gaze interaction systems provide access to computers, and to the rich content now available on the web, for many people with disabilities. Unfortunately, commercial gaze tracking systems are very expensive and, at times, difficult to calibrate. There is hope!


Following up on my recent post, "Open-source Eye-tracking: The ITU Gaze Tracker 2.0 Beta", I thought I'd share the GazeGroup's EyeTube for YouTube interface.  


What is great about EyeTube for YouTube is that it provides two different interfaces. The simplified, icon-based version looks good for younger children or people with cognitive disabilities. The second version is appropriate for people who can navigate more complex visual representations of content.


EyeTube requires a Windows-based system and .NET 3.5 at this time. It can be downloaded from the GazeGroup website. If you plan to download the application, make sure you also have a YouTube account. To get the application up and running, you'll need to change the settings file (EyeTubeSettings.xml) to match your account. (If you don't know much about changing settings or XML, ask someone you know who works in IT.)
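To give a sense of what "changing the settings" involves, here is a purely hypothetical sketch of such a file. The element names below are made up for illustration; check the actual EyeTubeSettings.xml that ships with the download for the real field names.

```xml
<!-- Hypothetical sketch only: the real EyeTubeSettings.xml schema is not
     documented here, and these element names are invented for illustration.
     The point is simply that your YouTube account details go into a small
     XML file that the application reads at startup. -->
<EyeTubeSettings>
  <YouTubeAccount>
    <Username>your-youtube-username</Username>
  </YouTubeAccount>
</EyeTubeSettings>
```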


Below is the icon-based version of the eye-gaze interface for YouTube:
EyeTube - Gaze Interaction for YouTube (simplified version)


Feature-rich version of the EyeTube interface for YouTube:
EyeTube - Gaze Interaction for YouTube

From the GazeGroup site:

"The EyeTube prototype offers a feature-rich eye-controlled interface for the popular YouTube service. Instead of emulating a mouse pointer and interacting with a web browser, the EyeTube interface is especially designed to be driven by gaze input. It offers a wide range of features such as keyword searching, popular video feeds, favorites, and social aspects such as subscriptions, friends, and commenting on videos. The highly optimized interface allows for a streamlined interaction which alleviates the Midas Touch problem. In most previous gaze interfaces, selection is made by a dwell-time activator, e.g., fixate on a button for a specific amount of time and it will execute the function. In the EyeTube interface, a fixation on a UI element will highlight it, and a second fixation on the activation button is required to execute the function. This removes the stress of having to constantly move the eyes to avoid unintentional activation."
"The EyeTube also exists in another simplified incarnation developed for users who are distracted by a larger number of options. It supports basic features such as browsing categories, optional keyword searching, and favorites."
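The selection scheme the quote describes can be sketched as a tiny state machine. The Python below is my own illustrative sketch under my own naming (it is not GazeGroup's code): a first fixation only highlights an element, and a second fixation on a dedicated activation button executes it, which is what sidesteps the Midas Touch problem of dwell-only interfaces.

```python
# A minimal sketch (not GazeGroup's code) of two-step gaze selection:
# a first fixation on a UI element only highlights it; a second fixation
# on a separate "activate" button executes it. Unlike dwell-time-only
# selection, a long glance at an element never triggers it by itself.

class TwoStepGazeSelector:
    def __init__(self):
        self.highlighted = None  # element selected by the first fixation

    def fixate(self, target):
        """Process one completed fixation; return an action or None."""
        if target == "ACTIVATE":
            if self.highlighted is not None:
                chosen, self.highlighted = self.highlighted, None
                return f"execute:{chosen}"
            return None  # nothing highlighted yet, so nothing happens
        self.highlighted = target  # first step: just highlight
        return None

selector = TwoStepGazeSelector()
print(selector.fixate("play_video"))  # highlights only -> None
print(selector.fixate("ACTIVATE"))    # second fixation -> execute:play_video
```

The design trade-off is an extra fixation per action in exchange for far fewer accidental activations, which the quote describes as removing "the stress of having to constantly move the eyes."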

RELATED
The GazeGroup
(The individuals mentioned below may currently be working elsewhere, but are still involved in gaze research in some way.)

GazeGroup Research Areas

COGAIN (Communication by Gaze Interaction)

ACM CHI Conference Articles
San Agustin, J., Skovsgaard, H., Hansen, J. P., and Hansen, D. W. 2009. Low-cost gaze interaction: ready to deliver the promises. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 4453-4458. DOI= http://doi.acm.org/10.1145/1520340.1520682
San Agustin, J., Hansen, J. P., Hansen, D. W., and Skovsgaard, H. 2009. Low-cost gaze pointing and EMG clicking. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 3247-3252. DOI= http://doi.acm.org/10.1145/1520340.1520466 
Tall, M., Alapetite, A., San Agustin, J., Skovsgaard, H. H., Hansen, J. P., Hansen, D. W., and Møllenbach, E. 2009. Gaze-controlled driving. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 4387-4392. DOI= http://doi.acm.org/10.1145/1520340.1520671

UPDATE

Eye-controlled games and leisure applications from the COGAIN wiki: http://www.cogain.org/wiki/Leisure_Applications
  • EyeArt - EyeArt eye-drawing program, developed by Andre Meyer and Markus Dittmar, Technical University of Dresden, Applied Cognitive Research Unit, Germany.
  • GazeTrain - Gaze-controlled action oriented puzzle game, developed by Lasse Farnung Laursen, Technical University of Denmark
  • Puzzle - Simple puzzle game that can be played with eye movements, developed by Vytautas Vysniauskas, Siauliai University, Lithuania
  • Road to Santiago - Gaze-controlled adventure game (full game), developed by Javier Hernandez Sanchiz, Universidad Publica de Navarra, Spain
  • Snap Clutch - An application that uses eye gaze data to generate key and mouse events for playing games such as World of Warcraft and Second Life.
  • ASE: Accessible Surfing Extension for Firefox - Follow this link to access ASE, an Accessible Surfing Extension for Firefox, developed by Emiliano Castellina and Fulvio Corno at Politecnico di Torino. (Note that this is a beta version.)
  • Eye Gaze Music (SAW Selection Sets) - Point and Play – eye gaze (direct pointing) musical activities, developed by DART. Please note that the SAW (Special Access to Windows) framework application is needed to play these 15 music selection sets. SAW is available for free at http://www.oatsoft.org/Software/SpecialAccessToWindows
  • EyeTube - Gaze interaction for YouTube - Follow this link to get more information and download EyeTube at ITU GazeGroup's web pages
  • Eye3D and other head eye mouse software - Eye3D for education, and a collection of links to free software that works with head or eye mouse. Includes links to downloads and original sites.
  • Gaze-controlled Breakout - Follow this link to access a modified version of the LBreakout2 game which can be operated by an SMI eye tracker, developed by Michael Dorr et al. at University of Luebeck
  • Oleg Spakov's Freeware games for MyTobii - Follow this link to access MyTobii compatible games developed by Oleg Spakov, University of Tampere, Finland
  • Free ITU Gaze Tracker and applications - Download a webcam based open-source gaze tracker and several applications that work with it, developed at IT University of Copenhagen
  • GameBase - Check out the Eye-Gaze Games category at the SpecialEffect GameBase!
  • More information about Gaze-Controlled Games - Follow this link to see a list of online information resources on using gaze for the control of games and other leisure applications

Sep 8, 2010

iPad Apps: Supporting Communication for Young People with Autism (& Links to Moms with Apps)

Have you ever watched a kid pick up an iPad for the first time and just go with it? 


Some parents have found that the same is true for their children who have autism. The iPad is a great way to support learning and communication in that it is user-friendly and easy for children (and parents) to understand.


Here are two videos that were shared with me by Liz Ditz, author of the "I Speak of Dreams" blog. The first video is of a boy with autism. The second video is of his little sister, interacting with the iPad to watch a "home-made" social story about a family trip to a coffee shop, created in the "Stories to Learn" app.



YOUTUBE INFO:

"Another too-cool app for his iPad, this one all about writing as well as spelling (http://itunes.apple.com/app/iwritewords-handwriting-game/id307025309?mt=8). For each letter in the word, the app says the letter, has him trace it in super-easy guided steps, and his written letter replaces the original letter in the word (at top of screen). When he finishes, the word appears in his handwriting. The program then spells it and pronounces it, displays a kid's illustration of the word -- and Leo gets to shake the original letters into a hole in the corner of the screen. LOVE IT."

STORIES TO LEARN



YOUTUBE INFO:
"Our family (and a friend) worked together to make this social story about visiting our local coffee shop for my son with autism. We used Stories2Learn (http://itunes.apple.com/us/app/stories2learn/id348576875?mt=8), on our iPad. It was so easy! And he loves it. In fact he loves it so much I had to have his sister demo it, as he loves to play the voiceovers over and over again. :)"


RESOURCES
Moms with Apps
Moms with Apps for Special Needs
Moms with Apps:  Characteristics of great apps for kids with autism


    Aug 21, 2010

    Reflections About Interactivity in my Present World

    Background: For those of you who are new to this blog, I work as a school psychologist in a public school district in North Carolina. During the mid-2000s, I took a variety of computer courses, leading up to a decision to work part-time while taking graduate courses in human-computer interaction, games, information visualization, ubiquitous computing, VR, and more. I returned to work full time in 2008, due to the economic downturn.

    I continue to maintain my passion for technology that supports natural user interaction, and share my interests on this blog and to a lesser extent, two other blogs.  I initially started blogging because it was a requirement for one of my courses, and found that my blogs served me well as on-line filing cabinets.

    Update:  I don't usually post much about my work on this blog, but I have some exciting things to share.

    New School Year, New SMARTBoard: I am excited about the upcoming school year because my "home" school, Wolfe, a program that serves mostly teens and young adults up to age 22 who have more severe disabilities (including autism), now has an interactive whiteboard in every classroom.

    School started for Wolfe students on July 26th, and three more whiteboards were installed.  One of the new SMARTboards is in the speech and language room, along with a multi-touch SMARTTable the school received last April, one is in the Art/Work Adjustment room, and one is in the room I use with small groups of students, which is used for music one day a week.


    Although I have used interactive whiteboards in my work with students for several years, this is the first year that I have had one available to me most of the time in a school. Since most of the teachers received their IWBs over the past year, there is a pressing need to figure out what works, why it works, and how to create useful interactive content that is appropriate for the needs of students who have severe or multiple disabilities. We really can't wait.  

    Why is this important to me?

    Observations of positive changes:
    Students who previously had limited interest in applications on the computer, or a limited attention span for such activities, demonstrate a high degree of interest and engagement with the IWB activities we've tried so far. Many students who do not attend to learning materials, or attend only for brief periods of time, actively look at the IWB screen, interact with it, and smile. Some students who rarely (or never) speak sometimes vocalize more often during IWB activities! Some are making connections between learning materials and what they interact with on the screen, something that simply did not happen previously. Some students who rarely established eye contact with others or demonstrated joint attention do so during IWB activities, something I discussed in a post on the TechPsych blog.

    Technology and my role as a school psychologist:
    Designing effective interventions is part of my job as a school psychologist, and mixing interactive whiteboards and interventions is fairly uncharted territory.  I know that there are probably a few other school psychologists, counselors, social workers, and speech/language therapists out there who are wondering what to make of this technology and the students they serve directly, or indirectly through consultation.  There isn't much literature about this topic, so it is difficult to know what is truly "evidence based".

    We are in the process of discovery. Within a school that has an IWB in every classroom, serving students with the most complex needs in the school district, how can I best follow professional school psychology practices (pdf) regarding the provision of direct and indirect services? Some of these expectations are listed below:

    "School psychologists promote the development and maintenance of learning environments that support resilience and academic growth, promote high rates of academic engaged time, and reduce negative influences on learning and behavior."

    "School psychologists use information and assistive technology resources to enhance students’ cognitive and academic skills."

    "School psychologists work with other school personnel to develop, implement, and evaluate effective interventions for increasing the amount of time students are engaged in learning."

    "School psychologists facilitate design and delivery of curricula to help students develop effective behaviors, such as self-regulation and self-monitoring, planning/ organization, empathy, and healthy decision-making."

    Working towards solutions:
    I work closely with my colleague Kelly Cross, a speech and language therapist who serves Wolfe school and two classrooms of students with severe autism at a "regular" elementary school. Kelly also serves as the assistive technology and augmentative communication consultant for our school district. She has used interactive applications and web resources for many years, along with more traditional "hands-on" materials, but like me, had limited access to IWBs until recently. She now has a SMARTboard in her room, along with the portable multi-touch SMARTTable Wolfe received last April. We work with many of the same students, so with the influx of IWBs in our school, we've stepped up our collaboration.

    One of Kelly's challenges is to figure out ways for teachers to integrate assistive technology and/or augmentative communication systems into their work with students during activities that involve interactive whiteboards. Most of the research in the area of assistive technology/aug com focuses on the use of technology to access applications as they are displayed on laptops or computer monitors, or on factors related to the use of individual communication devices. The vSked project, led by Dr. Gillian Hayes at the University of California, Irvine, is on the right track in that it incorporates a large interactive display used with students with autism, along with smaller hand-held communication systems, which I've discussed in a previous post.


    vSked in action

    Workshop Presentation
    This past week, Kelly and I presented at a workshop held in the Arlington VA school district that focused on the use of interactive whiteboards and related applications and web resources with students with special needs.  We shared some of the exciting things we've noticed with our students and shared "before and after" examples.  Included in the presentation were some of the video clips I quickly put together during the first days of school that had an impressive positive impact with students when they were used during interactive whiteboard activities. (Some of these video clips can be found in a previous post: Video clips that help students with autism learn and feel calm at the same time!)

    During the workshop, we discussed a few guiding principles, such as the "least dangerous assumption" and the concept of Universal Design for Learning. We also provided a sampling of resources previously used on the "small screen" that appear to be powerful when used with students interacting on the larger whiteboard screens. Below are a few of the resources we shared:

    Clicker5 is an application to support reading and writing that provides natural voice output and a child-friendly word processor. It works well on interactive whiteboards and can incorporate several communication symbol systems. It is a good tool for informal assessment with students with communication and other related needs.




    Sensory World provides an intuitive navigation system for students to use as they explore various areas of a house and engage in learning and related activities. The site map for all of the areas of the house, as well as the activities, is graphical.




    We shared resources from the National Gallery of Art's Kids Zone, a great website that offers plenty of interactive content:


    For teachers who want a limited browsing environment for their students, the Zac Browser is great for use on IWBs and the new large all-in-one touch screen displays:
    Zac Browser

    Another website that is good for visual-based browsing is Symbol World:

    "The intention is that symbol readers, teachers and carers will contribute, and we are delighted at the number of contributors that have already sent us their work"

    We shared a recent find, Taptu, which can be used on an IWB. Although Taptu focuses on the mobile touchable web, accessed through smartphones with internet access, it provides links to a wide range of websites designed with intuitive touch interaction in mind, perfect for students with special needs. In many cases, Taptu web navigation resources work fine on the larger screen of an IWB. The advantage is that an icon/visual-based touch navigation system supports students who can't type and also those who have very limited reading skills.


    Sports Navigation for Taptu

    Below is a slideshow about the touchable web:
    Taptu: Virtual Roundtable


    Update: SMARTTable
    At Wolfe, we'd like to create a conversation wheel with related activities for the SMARTTable that co-ordinate with SMARTBoard activities.  We'll probably experiment with the conversation wheel concept on the SMARTboard first.


    Considerations:
    One thing we've learned over the past few months is that it is a bit more complicated to create content and activities for the SMARTTable than for a SMARTBoard notebook or Promethean flipchart. We hope to customize it to support scaffolded classroom activities for students with special needs, but it might take more time than we had anticipated. The school's immediate need is for touchable interactive content for all of the classroom IWBs, suitable for our students.

    Complications:
    Programming for the SMARTTable, for those who have 64-bit computers, requires Windows 7, Visual Studio 2008, and Expression Blend 3.   My HP Touchsmart, my home computer, is 64 bit, but runs Vista.  My school-issued laptop runs Windows XP and is also 64-bit.  To upgrade it, I will have to send it to the tech department for several weeks.   It will be slow-going, since I have to plan for my day-to-day activities and evaluations with students first.

    Possible SMARTTable Solutions:
    I'm hoping that some students from UNC-Charlotte might want to try their hand at multi-touch programming and help us out!  We'd welcome volunteers from other universities as well.   Wolfe's principal, Mary Jo Breckenridge, is very supportive of the use of innovative technologies with students with special needs, and would figure out a way to make a collaboration happen.

    Upcoming Interactive Multimedia Technology Posts
    I have about 6 posts in draft form about interesting interactive technologies, research, people, and companies involved with multi-touch, interactive multimedia, and natural user interface/interaction applications. I'll get them up as soon as possible.

    One post is about 3M Touch Systems. Another is about an interesting multi-touch start-up company, Osmosis. I revisited Ballantyne Village to update the use of interactive touch screen technology, something I discussed in detail, along with photos and video clips, in a 2008 post, "Technology Supported Shopping and Entertainment User Experience at Ballantyne Village: 'A' for concept, 'D' for touch-screen usability." I got some inside scoop about the rationale behind the changes at the upscale center from a seasoned owner of a high-end audio-visual store whom I interviewed about 3D TV. I also have some interesting information about current research in "glasses-less" 3D displays. Sharp will be coming out with a 3D cell-phone camera AND a glasses-less cell-phone display in the not-too-distant future.

    I'm preparing for posts that highlight a few of my favorite blogs. Until then, take a look at the recent posts on Tracy Boyer's Innovative Interactivity blog, and also InteractiveTV Today.

    RELATED & SOMEWHAT RELATED
    "The AAC-RERC conducts a comprehensive program of research, development, training, and dissemination activities that address the NIDRR priorities and seek to improve technologies for individuals who rely on augmentative and alternative communication (AAC) technologies. The mission of the AAC-RERC is to assist people who rely on augmentative and alternative communication to achieve their goals by advancing and promoting AAC technologies and supporting the individuals who use, manufacture, and recommend them."



    Janet Light, Kathy Drager, Penn State University




    Jeff Higginbotham's Bibliography:  Viewing AAC Through Authentic Social Interactions (pdf)

    Proloquo2Go (iPhone/iPod touch/iPad communication system)


    Also posted on the TechPsych blog

    Mar 27, 2010

    iPod Touch: Mobile Solutions for Special Needs Students (Eric Sailers' Slideshare Presentation)


    Eric Sailers is a speech and language pathologist and assistive technology consultant who is on the cutting edge of innovative technologies and how they can be harnessed to support young people with special needs.  His blog, Speech-Language Pathology Sharing, is a great resource, and has information that would be useful to regular education teachers who are working in inclusive co-teaching environments.


    Mar 11, 2010

    Social Thinking blog - Great post from Michelle Winner's Blog

    We're planning an application for a multi-touch SMART Table at one of my schools, and one of the goals is to ensure that it can facilitate social interaction among students with and without disabilities, including autism.  With this in mind, one of my colleagues just sent me an email with a link to a great post on Michelle Winner's blog about social thinking and "inclusion" vs. "integration" in education settings.  It is worth a read!

    http://www.socialthinking.com/michelles-blog/social-skills-for-an-integrated-setting.html

    Feb 6, 2010

    Interactive Visual Supports for Children with Autism: Gillian Hayes' work at the Social and Technology Action Research Group:

    Here is an update about the work of Gillian Hayes, an assistant professor in the School of Informatics at the University of California, Irvine. Much of her work focuses on technology that supports young people with autism spectrum disorders. Hayes received her Ph.D. from Georgia Tech; her dissertation was Documenting and Understanding Everyday Activities through the Selective Archiving of Live Experiences (pdf, 2007).


    In the YouTube video below, Gillian Hayes discusses her work with technology with young people with autism spectrum disorders.


    Photos from the STAR website:




    The STAR group is researching the SenseCam, too!


    About the SenseCam, from the Microsoft Research SenseCam Website:

    "SenseCam is a wearable digital camera that is designed to take photographs passively, without user intervention, while it is being worn. Unlike a regular digital camera or a cameraphone, SenseCam does not have a viewfinder or a display that can be used to frame photos. Instead, it is fitted with a wide-angle (fish-eye) lens that maximizes its field-of-view. This ensures that nearly everything in the wearer’s view is captured by the camera, which is important because a regular wearable camera would likely produce many uninteresting images."


    New study proves that Microsoft's sensory innovation aids memory recall
    Technology Review Article 

    Images from Microsoft Research:

    http://www.microsoft.com/emea/presscentre/images/0903_Sensescam_261x200.jpg

    The SenseCam application has the potential to be a resource for people who have developmental delays, traumatic brain injury, severe attention deficits, and autism spectrum disorders. It would be a great tool for special educators, occupational and speech/language therapists, and rehabilitation specialists.
    SenseCam Videos      Research and Publications


    RELATED

    STAR: Social and Technology Action Research Group
    Interactive Visual Supports for Children with Autism

    Technology helps teach kids with autism: Informatics assistant professor designs computer devices to aid instruction, record-keeping.

    Monibi, M., & Hayes, G.R. (2008). Mocotos: Mobile Communication Tools for Children with Special Needs. Proceedings of Interaction Design and Children (IDC 2008), pp. 121-124. ACM.


    Sen H. Hirano, Michael T. Yeganyan, Gabriela Marcu, David H. Nguyen, Lou Anne Boyd, & Gillian R. Hayes. vSked: Evaluation of a System to Support Classroom Activities for Children with Autism. To appear in CHI 2010 (Atlanta, GA, 2010). (pdf)

    "Particularly interesting for students with autism who often struggle with group work and social interaction, we also uncovered the potential for these types of interactive classroom technologies to support these challenging focus areas."

    vSked
    Note:  I believe the following website contains the vSked prototype application. I poked around the site, and it looks teacher-friendly.