
Sep 8, 2011

Update, plus iGaze app by Dunedin Multimedia for use during social skills group activities

So what am I up to now?  


I'd like to share with my readers that I've decided to continue in my present position as a school psychologist, while still devoting a portion of my free time to technology. From time to time I think deep thoughts about usability, accessibility, and UX/interaction related to off-the-desktop interactive multimedia applications running on screens of all sizes.  I'm hoping to create a few multimedia experiments using HTML5 and JavaScript, and to explore jQuery if and when I can find the time!


For the present school year, my main school is a program for students with more significant disabilities, including autism spectrum disorders.  My second school is a magnet high school for technology and the arts,  located on the same campus.  I also consult throughout the district on cases involving students who have suffered traumatic brain injuries, as well as students who have multiple disabilities.  I am thankful that I have a job in a school district that values 21st Century technology.  


I'm looking forward to another technology-rich school year.  I've spent some of the time I usually devote to blogging exploring iPad apps instead.  Since I'm new to the world of iPads, I'm still in discovery mode.  What an adventure!


There are plenty of educational apps out there, and many of them are suitable for students with special needs.  On the other hand, there is much room for improvement across all iPad app categories.  Since there is very little research about what makes a killer app, or suite of apps, for students with special needs, experimenting with iPad apps is uncharted territory.



I made the decision to bring my personal iPad2 to work after I discovered a number of apps that I thought would be useful in my work as a school psychologist with students who have special needs, including autism spectrum disorders.  

One of my intervention themes this year focuses on social skills.  This is especially important for students who participate in our school's community-based job training program.  I'm using some content from Unique Learning's transition materials, as well as online activities from Do2Learn's JobTips website, because my aim is to facilitate social skills that will be useful in a variety of job and community settings.


Although my main technology tool for working with groups is the SMART Board, I've found a combination of interactive whiteboard and iPad activities to be especially effective.  I'm paving the way for more role-play activities in the future, and attempting to use technology to my advantage.


This past week, I used the iGaze app, created by Dunedin Multimedia, to help a group of high-school level students practice establishing and maintaining eye gaze, something that is difficult for most of them to demonstrate "in-person".  I was amazed.  Each student was excited to take his or her turn.  Even more amazing?  When each student took a turn, the other students looked at their eyes and faces.  No one rocked or "stimmed".  No one made noises.  I observed several instances of joint attention, much to my delight.   


Below is a video from Dunedin Multimedia's YouTube channel that is similar to what the students viewed during their group activity:
Here is some information from Dunedin Multimedia about the iGaze app:
"Eye contact is important to communication and social development, and yet the impaired ability to make and maintain eye contact is one of the most striking aspects of autism. iGaze is an eye contact simulator that can help to build confidence in using this important means of nonverbal social communication.  The app also contains information on eye contact and eye gaze, with links to relevant research."

During the social skills activity involving the iGaze app, I used the SMART Board to display a large picture of a boss and a worker standing face to face, making eye contact, engaged in conversation.  The picture served as an anchor to remind the students of pictures and videos they'd previously viewed that illustrated the concept of face-to-face interaction and the importance of establishing eye contact with others from time to time.


I'm hoping I will be able to access the YouTube videos from Dunedin so I can use them on the SMART Board. It will be interesting to see how this plays out!   I'm also planning to take a closer look at Dunedin Multimedia's emotion x app for the iPad.


RELATED 
Screenshot of iGaze for the iPad, from Dunedin Multimedia:
SOMEWHAT RELATED
The SMART Table at my school was updated today - I'm looking forward to using it for some group activities, now that it is back in working order and has new applications loaded up and ready to go!


If you are interested in learning more about technology related to students with special needs, be sure to check out Kate Ahern's blog, Teaching Learners with Multiple Special Needs.

Kate's post about the features of Unique Learning Systems.


Upcoming:  more about tablets, an interview with folks from Stantum, a social-skills game in progress, and an update on large displays in public spaces...

Oct 28, 2009

Interactive Multimedia Across Platforms and Screens: Adobe's Open Screen Project; MEX Mobile User Experience Manifesto.... (Please don't annoy the user!)

I'm not sure what I think about ubiquitous Flash 10.1 and Adobe's Open Screen Project. I like the idea of anything that is seamlessly cross-platform, but I shudder to think that this might let out a wave of unwanted or annoying "push" advertising on on-the-go screens of all sizes. I'm assuming web developers, along with TV ad producers, will be jumping on this train without fully thinking about how their applications and designs will play out in the off-the-desktop, digital-out-of-home world.

I decided to take a look, drill down through the hype, and share a few links related to this topic.

Adobe Pushes for a Flash-ier Mobile Web
Rob Pegoraro, Faster Forward, Washington Post (10/5/09)
"Are you anxious to bring Flash to the mobile Web, even if it means being subjected to some over-eager Web coder's song-and-dance routine? Or would you rather do without it on the go, even if that means having to switch to a "real" computer to use some Web sites' features?"

Hopefully the "over-eager web coders" will heed the MEX Manifesto:

MEX:  Mobile User Experience 2009 Manifesto (pdf)
"The Manifesto sets out our beliefs as to how user-centred design principles can enhance the experience of multi-platform digital services."

A framework for user journeys in a multi-platform world:  Marek Pawlowski, founder of MEX

MEX: User experience journeys in a multi-platform environment from Marek Pawlowski on Vimeo.

"User experiences are evolving into increasingly complex sets of interactions between multiple devices.  In this video presentation, Marek Pawlowski of the MEX Mobile User Experience strategy forum, shows how a framework can be used to map user journeys through the multi-platform environment."

"Unencumbered by wires, information is flowing into every corner of our world at an ever increasing rate and through an ever increasing range of digital platforms. The single greatest challenge facing digital industries is understanding how this explosion of data will be woven into the fabric of consumers' lives." -- Marek Pawlowski, founder of MEX.

MEX Blog 

OPEN SCREEN VIDEO

Open Screen Project from Vyshak V on Vimeo.
"The Open Screen Project is an industry-wide initiative, led by Adobe and backed by other industry leaders who all share one clear vision: Enable consumers to engage with rich Internet experiences seamlessly across any device, anywhere. Partners in the Open Screen Project are working together to provide a consistent runtime environment for open web browsing and standalone applications — taking advantage of Adobe® Flash® Player and, in the future, Adobe® AIR®. This consistent runtime environment will remove barriers to publishing content and applications across desktops, mobile phones, televisions, and other consumer electronics." Learn more

Reinventing Storytelling in the Digital Age Across Platforms, Across Screens

NAB 2009 presentation by Shantanu Narayen of Adobe and A.D. Albers of Disney Interactive Media Group
Adobe and NVIDIA Deliver Rich Web Experiences on Netbooks and Mobile Devices
Reuters (10/5/09)
"At Adobe MAX, Adobe's worldwide developer conference, Adobe Systems Incorporated and NVIDIA Corporation...announced that both companies are bringing uncompromised browsing of rich Web content to netbooks, smartphones and smartbooks built with NVIDIA GPUs. The companies have been working closely together as part of the Open Screen Project to optimize and dramatically improve performance of Flash Player 10.1 by taking advantage of GPU video and graphics acceleration on a wide range of mobile Internet devices. NVIDIA customers embracing Flash Player 10.1 for their new devices include HP, Lenovo, Samsung, Acer, Asus and more..."


RIM Joins Open Screen Project  Reuters (10/4/09)


Honey I Shrunk the Flash Player Simon Bisson and Mary Branscombe, ZDNet, 10/12/09
Teaming up with Adobe and the Open Screen Project -Google Blog   (10/5/09)


Paramount Digital Entertainment Launches Interactive Thriller on MySpace  Tracy Swedlow, InteractiveTV Today (10/28/09)
"The company says that it can deliver the show's multiple interactive elements to viewers across devices using Adobe Flash Player and Adobe AIR, 'because of efforts by the Open Screen Project, an industry-wide initiative led by Adobe and supported by PDE and close to 50 other industry leaders, to enable people to engage with rich Internet experiences across any device, anywhere.'"

Jul 23, 2013

Monkeying Around with Autism Assessments: Kinect-based game by Vectorform and Kaiser Permanente therapists offers a barrel of possibilities!

"The goal was to build a game that is extremely accessible, non-complex and includes simple mechanics that children with autism can quickly understand to retain their attention, prevent over-stimulation, encourage play, and prevent frustration." -Vectorform "Monkey Business" Team


I recently had the chance to visit the team at Vectorform's Royal Oak (MI) headquarters and experience a demo of one of their newest applications, a Kinect-based game developed to assist in the assessment of children with autism. The game, known as Monkey Business, was designed in collaboration with physical, occupational, and speech therapists from Kaiser Permanente, a large health care service provider in California. The project was supported by Kaiser Permanente's Innovation Fund for Technology group.

I was impressed!

As I watched the demo, it was clear that much time and attention was given to the input of the therapists at Kaiser Permanente. The avatar in the game is a friendly monkey who embodies the mannerisms of a curious child.  The artwork is visually appealing and not too busy. The sound effects relate to the interaction at hand and provide feedback to the child as they engage in various game-like tasks designed to assess visual-motor, gross motor, and basic receptive language skills.  Another good feature of this application is the simplicity of the screens designed for inputting the child's information and viewing progress data.

In my opinion, the Monkey Business assessment game has the potential for use in school settings, and would be enhanced if additional assessment activities were provided in the application.  Expanded activities within the Monkey Business environment could support a range of treatment, education, and related intervention efforts. In addition to supporting assessment and intervention of children with autism, the Monkey Business concept would be especially useful in working with children who have experienced a traumatic brain injury.


Below are screen shots and descriptions of a few activities from the Monkey Business game:

Lily Pad Jumping Test
In the jumping test, the child is asked to jump across a pond, from one lily pad to another.  The child is provided with a set of toy lily pads on the floor.  As the child jumps, the Kinect sensor picks up movement and displays the jumps on the screen in the form of the monkey.
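For readers curious about the mechanics, here's a rough sketch of how a depth sensor's tracked hip heights could be turned into a jump count. This is my own illustration in Python, with made-up function names and thresholds; it is not Vectorform's actual code.

```python
# Hypothetical sketch: turning a stream of tracked hip heights (metres
# above the floor) into discrete jump events. The threshold value is an
# illustrative assumption, not part of the real Monkey Business game.

def count_jumps(hip_heights, rest_height, threshold=0.15):
    """Count jumps: a jump begins when the hip rises more than
    `threshold` metres above the resting height, and ends when it
    drops back below that level."""
    jumps = 0
    airborne = False
    for h in hip_heights:
        if not airborne and h > rest_height + threshold:
            airborne = True   # take-off detected
            jumps += 1
        elif airborne and h <= rest_height + threshold:
            airborne = False  # landing detected
    return jumps

# Example: two jumps in a short sample of tracked heights.
samples = [0.90, 0.91, 1.10, 1.15, 0.92, 0.90, 1.12, 0.91]
print(count_jumps(samples, rest_height=0.90))  # 2
```

In practice a sensor like the Kinect reports full skeletal joint positions many times per second, so a real implementation would also smooth the signal before applying a threshold like this.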

Balance Beam Test
The following screen shot depicts the monkey avatar crossing a stream on a balance beam. The monkey's movements reflect the child's movements across a real balance beam in front of the Kinect sensor and a large screen display.
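A crossing like this could also be scored quantitatively. Below is a small illustrative sketch (my own, not Vectorform's) of how tracked torso positions might be summarized as a stability score: the farther the child drifts sideways from the beam line, the higher the score.

```python
# Hypothetical sketch: scoring balance-beam stability from tracked
# torso x-positions. `beam_x` is the assumed x-coordinate of the beam's
# centreline; the scoring formula is illustrative only.

def sway_score(torso_x_positions, beam_x):
    """Average absolute lateral deviation (metres) from the beam line.
    Lower scores indicate a steadier crossing."""
    deviations = [abs(x - beam_x) for x in torso_x_positions]
    return sum(deviations) / len(deviations)

positions = [0.02, -0.01, 0.05, 0.00, -0.04]  # sampled during a crossing
print(round(sway_score(positions, beam_x=0.0), 3))  # 0.024
```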

Block Building Test
In the block building test, the monkey avatar encourages the child to build a tower of blocks to match the one displayed on the large screen.  The child uses real blocks, placed on a table, to build a tower while the monkey provides encouragement.  Objects other than blocks can be used, as the Kinect sensor is capable of identifying a variety of objects that can be used for building and stacking.
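To give a flavor of what the matching step might involve, here's a simple sketch of my own (again, not the actual implementation): if the sensor can report the height of each detected stack on the table, checking the child's tower against the on-screen target reduces to a tolerance comparison.

```python
# Hypothetical sketch: checking a child's block tower against the target
# shown on screen. Assumes the sensor reports each detected stack's
# height off the table; names, block size, and tolerance are illustrative.

def tower_matches(detected_heights, target_count, block_height=0.05,
                  tolerance=0.02):
    """True if the tallest detected stack is within `tolerance` metres
    of the expected height for `target_count` blocks."""
    if not detected_heights:
        return False
    expected = target_count * block_height
    return abs(max(detected_heights) - expected) <= tolerance

print(tower_matches([0.05, 0.14], target_count=3))  # True (0.14 is close to 0.15)
print(tower_matches([0.05], target_count=3))        # False
```

Using a tolerance rather than an exact match fits the activity: real blocks stack imperfectly, and the point is to assess the child's skill, not to penalize a slightly crooked tower.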

Clinician Screens
The screens for inputting and reviewing data regarding the child are designed with simplicity and ease of use in mind, as shown in the pictures below:

In the present version of Monkey Business, a TV remote is used to control the application. It is possible that future versions of Monkey Business could be integrated with a smartwatch to input student information and control the activities in the application.


Comment: As a school psychologist who works with children, teens, and young adults with autism and other significant disabilities, I know how difficult it can be to conduct assessments using traditional test materials.  If a student has difficulty interacting with unfamiliar adults, has communication challenges, or has motor difficulties, it may not be possible to administer an assessment that generates meaningful or valid results.  Even if the student is capable of completing some of the traditional test tasks, much time and effort is required on the part of the evaluator to sustain their interest, effort, and attention.   

To get a better picture of a student's emerging skills, I use interactive multimedia applications during my assessments of children with autism.  Newer technologies such as interactive whiteboards, larger touch-screen monitors, and tablets have proven to be useful tools in assessment.   A significant drawback to my approach is that it is cobbled together and might be difficult to replicate by other evaluation teams.  I see a growing need for a range of technologically-enhanced tools for assessment, including applications that offer opportunities for playful engagement.  Monkey Business holds potential to fill the bill.

What next?

I'd like to share some stories about the people who make things come to life at Vectorform. During my visit, I learned that they are involved in a number of projects related to health care and are always brainstorming new ways to harness technology for their various clients.  

Over the past several years, I've followed Vectorform's journey as they've jumped off of the desktop and transformed from web developers to post-WIMP explorers of natural user interfaces and interaction.  The most recent area of exploration is Google Glass.

Intrigued by Google Glass?  So am I.  

During my visit to Vectorform, I had a chance to see the world - or rather Vectorform's basement conference room - through Google Glass.  As soon as I put them on I was flooded with ideas, and I'll expand on them, and those of others, in future posts.

In the meantime, take the time to read an excellent post by Kevin Foreman, Vectorform's Director of Product Vision.  In "The Glass Experience", Kevin provides an in-depth reflection on his experience wearing Google Glass throughout his daily routine.  He also explains the inner workings of the hardware, the strengths and limitations of the Glass system, the details of the user interface, and the "on-board" accessories, and offers a few comments about what it is like to be a new Glass-wearing celebrity.


RELATED
Here are just three of the innovators I met during my recent visit to Vectorform, left to right: Jennifer Tonio, Marketing Manager, Kevin Foreman, Director of Product Vision, and Patric Samona, Director of Health Solutions.

Below are a few links related to the use of games for the assessment and intervention of autism, along with links to information about the use of games and emerging technologies for rehabilitation, health care and health promotion:


Microsoft Kinect and Autism (SlideShare) Susan McCarthy, Little Angel's School 2/09/13


Italian Team Uses the Kinect to Treat Autistic Children Andrea Lorini, Epoch Times, 12/13/12

Microsoft Surface Multi-touch Application for Pediatric Neuropsychology Assessment (Featuring Vectorform) Lynn Marentette, TechPsych Blog, 8/26/09   



Xbox One, Kinect 2.0 and the future of health technology Marcelo Calbucci, Mobihealth News, 5/26/13



Accessible Games for Health and K-12 Education: Lessons from the Classroom (SlideShare) Lynn Marentette, 5/9/08, Games for Health Conference Presentation

Researchers: Microsoft Kinect is a Money-Saving Telemedicine Device Gabriel Perna, Healthcare Informatics, 2/15/13

Lowes LP, Alfano LN, Yetter BA, Worthen-Chaudhari L, Hinchman W, Samona P, Flanigan KM, Mendell JR. Proof of Concept of the Ability of the Kinect to Quantify Upper Extremity Function in Dystrophinopathy. PLoS Curr. 2013 Mar 14;5. doi: 10.1371/currents.md.9ab5d872bbb944c6035c9f9bfd314ee2

Dec 12, 2010

LM3LAB's Useful Map of Interactive Gesture-Based Technologies: Tracking fingers, bodies, faces, images, movement, motion, gestures - and more

Nicolas Loeillot, of LM3LABS, has been ahead of the natural user interaction/ interface game for many years as his company has expanded. He's done quite a bit of deep thinking about the work of his company, and has used this wisdom to create a nice concept map that describes how LM3LABS' solutions fit into the world of gesture-based control and interaction:

In my opinion, this chart would make a great template for mapping out other natural interaction applications and products!


Here is the description of the concepts outlined in the chart:


"If all of them belong to the “gesture control” world, the best segmentation is made from 4 categories:
  • Finger tracking: precise finger tracking, it can be single touch or multi-touch (this latest not always being a plus). Finger tracking also encompasses hand tracking which comes, for LM3LABS products, as a gestures.
  • Body tracking: using one’s body as a pointing device. Body tracking can be associated to “passive” interactivity (users are engaged without their decision to be) or “active” interactivity like 3D Feel where “players” use their body to interact with content.
  • Face tracking: using user face as a pointing device. It can be mono user or multiple users. Face tracking is a “passive” interactivity tool for engaging user in an interactive relationship with digital content.
  • Image Tracking: Augmented Reality (AR) lets users use images (flyers, real products, t-shirts, faces,…) to interact with digital content. AR can be markerless or marker-based. Markerless technology has advantages but marker-based AR is easier for users to understand. (Please note here that Markerless AR is made in close collaboration with AR leader Total Immersion)."  -LM3LABS

If you are interested in this subject and want to view some good examples of off-the-desktop interfaces and interactions, take a look at the LM3LABS blog, as well as Nicolas Loeillot's Vimeo channel.  Also take a look at the sample of posts I've written about LM3LABS over the last few years - the links are at the end of this post.

I love LM3LABS' Interactive Balloon:

Interactive balloons from Nicolas Loeillot on Vimeo.


Interactive Balloons v lm3 labs v2 (SlideShare)



Background
I first discovered LM3LABS when I was taking a VR class and researching interactive, immersive large displays in 2005 or 2006.  Back then, there wasn't much information about this sort of technology.  A lot has changed since then!


I've learned quite a bit from watching LM3LABS (and others) grow, given my passion for postWIMP interactive technology and my commitment to blogging about this subject.   Nicolas has really worked hard in this arena.  As early as 2005, LM3LABS was working with Scala to provide "smart" interactive displays, and his company's applications have been supported by computer vision technologies for many years, allowing for gesture-based, or "touch-less" interaction, as demonstrated by the Catchyoo Interactive Table.  This application caught my eye back in early 2007, when I was working on projects for large interactive displays for my HCI and Ubicomp classes, and was thinking about creating a table-top application.


My hunch is that LM3LABS has set the foundation for further growth in the future, given the lessons they've learned by taking risks with postWIMP technologies over the past few years!


Previous Blog Posts Related to LM3LABS:
Interactive Retail Book Celebrating the History of Christian Dior from 1948-2010 (video)
Ubiq Motion Sensor Display at Future Ready Singapore (video)
Interactive Virtual DJ on a Transparent Pane, by LM3LABS and Brief Ad
LM3LABS' Catchyoo Interactive Koi Pond: Release of ubiq'window 2.6 Development Kit and Reader
A Few Things from LM3LABS
LM3LABS, Nicolas Loeillot, and Multi-touch
More from LM3LABS: Ubiq'window and Reactor.cmc's touch screen shopping catalog, Audi's touch-less showroom screen, and the DNP Museum Lab.


About LM3LABS
"Founded in 2003 by a team of passionate researchers, engineers, designers, and marketers from various international backgrounds, focused on fast transformation of innovation into unique products, LM3LABS is a recognized pioneer in computer vision-based interactivity solutions. Keeping a strong customer focus, LM3LABS' team of unique people pioneers new directions, explores new concepts, new technologies and new interactions.  Engaging, playful and magic, LM3LABS' products and solutions are always scalable and reliable."

info@lm3labs.com

Note to readers:
Over the past couple of years there has been an explosion of postWIMP technologies and applications, and at this pace it has been difficult for me to keep abreast of it all. There is quite a bit I miss, given my full-time job and daily life!

I welcome information about postWIMP interactive technologies and applications from my readers.  Due to time constraints, not interest, I am not always able to post about a topic as soon as I'd like.  That is OK, as my intention is not to be the first blogger to spread the latest tech news.  I like to dig in deep when I can and make connections between innovative, interesting technologies and the people and ideas behind them.