
Apr 25, 2010

Exploring the Design Space in Technology-Augmented Dance at CHI 2010: Celine Latulipe's team from UNC-Charlotte

I was at the CHI conference in Atlanta for a few days earlier this month, where I participated in a workshop on the next generation of human-computer interaction and emerging technologies for education.  While I was at the conference, I managed to attend several interesting presentations, which I'll be covering in future posts.  I was not able to stay for the entire conference.  The good thing is that many of the conference participants provided links to video, photos, slides, and related papers!

I had a chance to see a performance of dancers from the Dance.Draw team, headed by Dr. Celine Latulipe from UNC-Charlotte.  Dr. Latulipe is involved in several related projects that explore creativity, interaction, dance, music, and art:

Description of the performances held at the April CHI 2010 Conference, Atlanta, Georgia:

Layered Surveillance - A Collaborative Interactive Art Installation 
Celine Latulipe, University of North Carolina at Charlotte; Annabel Manning, New Media Artist
View Video Demonstration

Artist Annabel Manning explores the world of immigration and identity through imagery related to border crossings and surveillance. Computer scientist Celine Latulipe explores embodied, collaborative interaction. The intersection of these two worlds leads to research in embodied collaborative interaction and an interactive art exhibit in which participants can explore both static images through interactive layers and moving video through interactive surveillance lenses. Participants can explore alone or with others, using gyroscopic mice to control different aspects of the artwork. Through interaction, participants are led to contemplate the (in)visibility of the immigrant and the agency of surveillance.


Photos from the Layered Surveillance Flickr Photostream (Annabel Manning)




Exploring the Design Space in Technology-Augmented Dance (Dance.Draw)
Celine Latulipe, Sybil Huskey, David Wilson, University of North Carolina at Charlotte, USA
Mike Wirth, Queens University of Charlotte, USA
Berto Gonzalez, Arthur Carroll, Melissa Word, Erin Carroll, Vikash Singh, University of North Carolina at Charlotte, USA

Danielle Lottridge, University of Toronto, Canada


Video demonstration of the Dance.Draw project (mp4 file)
"This performance is part of an ongoing Dance.Draw project at the University of North Carolina at Charlotte, which investigates lightweight methods for integrating dance motion with interactive visualizations and enhancing audience interaction with dance."



Photo from the Dance.Draw website


RELATED

References for Layered Surveillance:


C. Latulipe, I. Bell, C. L. Clarke, and C. S. Kaplan. sym-Tone: Two-handed manipulation of tone reproduction curves. In GI 2006 Proceedings, pages 9--16. Canadian Information Processing Society, 2006.

C. Latulipe and A. Manning. Interactive surveillance: Audience interaction with moving digital paintings. Interactive Installation at the (re)Actor3 Digital Live Art Conference, September 2008. http://www.digitalliveart.com/.

L. Loke, A. T. Larssen, T. Robertson, and J. Edwards. Understanding movement for interaction design: frameworks and approaches. Personal Ubiquitous Comput., 11(8):691--701, 2007.

D. Maynes-Aminzade, R. Pausch, and S. Seitz. Techniques for interactive audience participation. In ICMI '02: Proceedings of the 4th IEEE International Conference on Multimodal Interfaces, page 15, Washington, DC, USA, 2002. IEEE Computer Society.

S. S. Snibbe and H. S. Raffle. Social immersive media: pursuing best practices for multi-user interactive camera/projector exhibits. In CHI '09: Proceedings of the 27th international conference on Human factors in computing systems, pages 1447--1456, New York, NY, USA, 2009. ACM.


Below is a post I wrote about Dance.Draw in 2008:


DANCE.DRAW: EXQUISITE INTERACTION
(Updated)

"The movement of the visualizations are artifacts in real-time of the movements of the dancers. They draw while they dance, they dance together and they draw together. Every performance generates a new visual imprint." -DanceDraw website


Interactive multimedia technology, blended with the arts!

Dr. Celine Latulipe, from UNC-Charlotte's Software and Information Systems Department, Professor Sybil Huskey, from the dance department, dance students, and others collaborated to create an amazing performance that I had the opportunity to see during the Visualization in the World Symposium in April (2008).

If you look closely, you will see that each dancer holds two wireless mice, one in each hand. The mice drive the visualization that is projected in the background. Dr. Latulipe has focused some of her research on two-handed computer interaction. It is interesting to see how her work has been applied to this beautiful "off-the-desktop" application.
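The setup described above, where each performer's two mice continuously feed position samples into a projected drawing, can be sketched in a few lines. This is a hypothetical illustration of the idea, not the Dance.Draw project's actual code; all names and the event format are invented.

```python
# Illustrative sketch: each dancer carries two pointing devices, and every
# position sample extends a drawn trail in the projected visualization.
from dataclasses import dataclass, field

@dataclass
class Trail:
    points: list = field(default_factory=list)

    def add_sample(self, x: float, y: float) -> None:
        self.points.append((x, y))

    def segments(self):
        # Consecutive point pairs become the line segments to render.
        return list(zip(self.points, self.points[1:]))

@dataclass
class Dancer:
    name: str
    left: Trail = field(default_factory=Trail)
    right: Trail = field(default_factory=Trail)

# Two dancers, four mice: each sample event names a dancer and a hand.
dancers = {"a": Dancer("a"), "b": Dancer("b")}
events = [
    ("a", "left", 0.10, 0.20),
    ("a", "left", 0.15, 0.30),
    ("b", "right", 0.80, 0.50),
    ("a", "left", 0.20, 0.45),
]
for who, hand, x, y in events:
    getattr(dancers[who], hand).add_sample(x, y)

print(len(dancers["a"].left.segments()))  # three samples yield two segments
```

Every performance produces a different stream of events, so, as the project's website puts it, every performance generates a new visual imprint.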

Dance.Draw is a work in progress; visit the following links for more information:

Website (Updated)
Movie
Technical Info
Dr. Kosara's Eager Eyes post about Dance.Draw

Note:
Dr. Latulipe was my HCI professor; Dr. Kosara was my Visualization/Visual Communication professor.





Nov 6, 2012

Jeff Han Discusses Windows 8 for Large Displays, Microsoft's New Multi-touch, Ink & Pen Tech; Experience Design for Large Displays (2012 Microsoft Build)

Earlier this year, Microsoft bought Perceptive Pixel, the company founded by Jeff Han.  Han now works for the Microsoft Office division of the company, where there is a push to bring new and improved large multi-touch displays to the corporate world. With this off-the-desktop move, developers are challenged to develop applications for Windows 8 that are geared towards supporting collaboration in corporate settings.

Jeff Han and Nathan Fish were at Microsoft's "Build" conference recently  to give a presentation about the nuts and bolts of the new technology - how it works, and suggestions for design.  Although the video presentation is over an hour long, it is worth setting aside some time to watch! 

New multi-touch, ink, and pen technologies that can revolutionize your apps on any size of screens (10/30/12, Microsoft Build)



Here is the description of the presentation:
"Develop solutions that flow with your customers' business - starting with individuals and information workers on their phones, desks or virtual offices, moving into conference rooms, brainstorm areas, boardrooms and common areas.  Learn how Microsoft's new pen and touch capabilities, software and hardware will enable you to develop products that will help businesses be more efficient.  Have a sneak peek at what our large multi-touch displays (82" and 55") can bring to your solutions." -Microsoft Build

Jeff Han emphasized that touch interaction is everywhere and is a fundamental requirement in applications, and that Windows 8 apps should be designed for all screen sizes: "It's all about collaboration of people and a collaboration of devices that gets your work done....this is the real thing, and it is about to happen.  Devices (referring to large displays) are coming... we do have a mechanism for early access hardware."

During the second half of the video (38:00), Nathan Fish discusses guidelines for designing experiences on large displays, including optimization, handling various interactions via pen, touch, and gestures, the use of application bars and context menus, and more.
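One practical thread running through those guidelines is that a layout tuned for a tablet cannot simply be stretched to an 82" wall display: touch targets need to stay roughly finger-sized in physical units. The sketch below is my own illustration of that idea, not Microsoft's guidance or API; the display classes and millimeter values are invented for the example.

```python
# Illustrative sketch: derive layout parameters from physical screen size so
# touch targets remain usable from a tablet up to a wall display.
def layout_params(diagonal_inches: float, dpi: float = 96.0) -> dict:
    # Larger displays are used from farther away, so the physical target
    # size (in mm) grows with the display class.
    if diagonal_inches >= 55:
        cls, target_mm = "wall", 20.0     # standing, arm's length or more
    elif diagonal_inches >= 20:
        cls, target_mm = "desktop", 12.0
    else:
        cls, target_mm = "tablet", 9.0
    px = round(target_mm / 25.4 * dpi)    # mm -> inches -> pixels
    return {"class": cls, "touch_target_px": px}

print(layout_params(82))    # {'class': 'wall', 'touch_target_px': 76}
print(layout_params(10.6))  # {'class': 'tablet', 'touch_target_px': 34}
```

The point is only that the app bar, context menus, and hit targets should be computed from physical size and viewing distance, not from pixel counts alone.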

At the beginning of his presentation, Nathan Fish briefly discusses the uncharted territory of designing for collaborative large displays, and tells the audience that the guidelines are just suggestions: "nothing is written, we are figuring this out as we go."

My fear is that Windows 8 developers might not have the inclination to think past the smaller screens of smartphones and tablets, resulting in touchy-feely variations of PowerPoint and other Office applications that have been around for too long.

RELATED
Microsoft Welcomes Perceptive Pixel to the Office Division
The Official Microsoft Blog, 7/30/12


What happened to the Surface Table?
I was surprised that Jeff Han's company was not folded into PixelSense, formerly known as Surface, the division of Microsoft that focuses on large multi-touch tables. In my opinion, the tables had some potential to support collaborative work in the corporate world.  It will be interesting to see how things unfold!

PixelSense
Take a look at the video clip below for more information:
 

Discover the Samsung SUR40 (with PixelSense)
Samsung SUR40 with Microsoft PixelSense
PixelSense Development Partners
Insights on Interactive Tabletops: A Survey of Researchers and Developers (pdf)
Hrvoje Benko, Meredith Ringel Morris, A.J. Bernheim Brush, and Andrew D. Wilson 2009, Microsoft Research

SOMEWHAT RELATED
I've been interested in large screen displays and how they interact with smaller devices for a very long time.  In 2007, I was taking graduate courses at UNC-Charlotte and planned to pursue further research and work related to this area.  At the time, I prepared a paper to present at a conference, but was unable to present due to a serious health emergency in my family.  The abstract is below:


"Large screen displays support the collaboration of two or more people, especially when the nature of the content is visual.  These displays, in the form of walls, windows, boards, and touch-tables, will become ubiquitous in the future, and can be found in workplaces, educational institutions, museums, and other public spaces. New display systems have been developed that allow for co-located communication between groups of people, and also allow for collaboration with individuals in remote locations, using mobile computing or communication devices.  This paper will provide an overview of the technologies that support this new way of communication, discuss the challenges involved in establishing "interoperability" within these new systems, and examine the possibilities for the future."

If this topic interests you, take a look at one of my Flickr collections:

Ubiquitous Sightings of Urban Screens: Interactive Displays in Public Spaces, Digital Signage, and Screens of All Sizes

Description:

"All kinds of displays are cropping up in shopping centers, malls, banks, stores, airports, schools, businesses, and meeting places. What is the purpose of all of these displays?

Displays and mobile devices, everywhere we go.

It doesn't look like we've really figured out how to harness the potential of the technology that surrounds us."





May 6, 2010

Link to Innovative Interactivity (II) & post: SMART Technologies' acquisition of NextWindow: A "smart window" to the world

I am happy to announce that I will be contributing a post bi-weekly, on the 1st and 3rd Thursday of each month, for the Innovative Interactivity (II) blog.

In my first post, I discuss interactive technologies in education and the explosion in the number of interactive whiteboards making their way into classrooms in the US and around the globe.  The motivation for this post came from last week's announcement of SMART Technologies' acquisition of NextWindow.  Not long after the announcement, I had a chance to speak with Al Monro, CEO of NextWindow, and Nancy Knowlton, CEO of SMART Technologies. I share some of their insights in my post:

SMART Technologies' acquisition of NextWindow: A "smart window" to the world

About Innovative Interactivity (II):


"Innovative Interactivity serves as an open forum for multimedia producers, web developers and all other visualization junkies. Content focuses on the dynamics and theory of how people receive and react to different forms of information on the Web, both through visual, multimedia storytelling and interactive data visualization."

"The goal is that this blog will provide an outlet for those in the online realm, whether you are interested in learning about multimedia, interactivity, programming languages, data visualization, or all of the above. Hopefully you will be inspired from what you read here to surpass your current web standards in order to develop highly effective multimedia interactives for the digital community."

Tracy Boyer | Founder & Managing Editor

Tracy Boyer
Tracy Boyer is an award-winning multimedia producer, specializing in interactive Web development and multimedia storytelling. Currently, she is a dual master’s candidate (MBA/MSIS) at UNC-Chapel Hill where she is studying Human-Computer Interaction in the School’s Information Science program and Entrepreneurship at Kenan-Flagler Business School.

Previously, she was a multimedia producer at Roanoke.com, served as the UNC correspondent for CNN.com and interned with The Atlanta Journal-Constitution. In 2007, she was selected to participate in the Poynter Summer Fellowship. Boyer graduated with a multimedia degree from UNC’s School of Journalism and Mass Communication. Her passions lie in travel and multimedia production with a focus on video, audio and interactive graphics. See more of her work at www.tracynboyer.com.

Boyer is available for speaking engagements and seminars. Please contact her for more information.

Andrea Ballocchi | Spanish Editor & Social Media Manager

Andrea Ballocchi
Andrea Ballocchi is a Chilean journalist. She studied video production at the Art Institute in Fort Lauderdale, Fla., and worked as an assistant producer for Sábado Gigantes on channel 13 in Santiago, Chile, and as a producer at Anticipa, an internet company also in Chile.
In 2005, she participated in the multimedia project “The Ancient Way”, in Spain. Since then she has participated in several other projects, including “Chasing Crusoe,” “Atacama Stories,” “Special Olympics in Shanghai and Idaho,” and “South of Here.” She has also taught and coordinated projects at Universidad de los Andes, Chile.
Andrea Ballocchi is currently a multimedia journalism graduate student at the University of Miami, Fla. and works at The Knight Center for International Media.

Ellen Peck | Contributor, Philanthropic Multimedia

Posts publish bi-weekly on the 2nd and 4th Thursdays of each month

Ellen Peck
Ellen Peck works as a consultant with a focus on the non-profit sector. Ellen worked for over 15 years with Save the Children as a Director of Development, and also created and managed their Emergencies and Crisis Leadership Council. She has worked with start-up non-profits, and first-time capital campaigns for more established organizations. She has been involved in fundraising and strategic partnerships with individuals, corporations, foundations and organizations, and in strategic planning with artists, agency project teams and senior management at organizations for new initiative development.

Ellen also serves in the role of producer and creative/content advisor to documentary film, music and other projects in the arts associated with social and environmental issues. Her projects include the film and soundtrack for “Born into Brothels,” (USA) which garnered the Oscar for Best Documentary in 2005, and for “Stolen,” (Australia) currently showing at international film festivals.

Ellen is a Liberal Arts grad with a degree from Amherst College.

Lynn Marentette | Contributor, Academic Multimedia

Posts publish bi-weekly on the 1st and 3rd Thursdays of each month

Lynn Marentette
Lynn Marentette has over 20 years of professional experience as a school psychologist, and has used interactive multimedia applications and games with students who have disabilities since the early 1990’s. She is passionate about emerging collaborative technologies, natural user interactions and interfaces, and how the power of interactive multimedia can be harnessed in education. In addition to her work as a psychologist, she blogs about accessible off-the-desktop natural-user interfaces at “Interactive Multimedia Technology.”

Lynn has presented on topics such as multimedia thinking and learning, universal usability of interactive multimedia, and universal design and accessibility for games. She returned to school a few years ago to learn how to make interactive multimedia applications and games for the web, handheld devices, and large displays, and has taken graduate courses such as game design, ubiquitous computing, and virtual reality in preparation for a potential PhD in Information Technology at UNC-Charlotte.

Jun 26, 2010

A few links: GizmoWatch's 10 Interactive User Interfaces for the Future, CNN's Eatocracy, EVA 2010 and More!

Here's a quick link to a recent Gizmowatch post, Ten Interactive User Interfaces for the Future.  Bharat, the author, reviews a variety of interfaces, input methods, and interaction techniques, such as Skinput, a water-based touch screen, a muscle-computer interface, air gestures, brain-computer systems, and even a mud-tub interface.


I was fortunate to see some of these interaction techniques and interfaces when I attended CHI 2010 this past April, and plan to share some of my photos and video clips from the conference on this blog soon.


Totally Unrelated


Online connection for foodies
Eatocracy is a new website within the CNN pages that provides news, and more, about all things related to food. The categories on the site include "main", "news", "bite", "sip", "make", "think", and "buzz".  The best part, in my opinion, is the heirloom recipe collection index, where people can upload and share family recipes and the stories behind them.


Here is the description of Eatocracy from the website:

Eatocracy  "is your online home for smart, passionate conversation and information about food news, politics, culture. We'll highlight regional and family recipes, dive into restaurants and food shopping, chat with celebrity and local chefs, and show you what's for dinner around the world tonight. Grab a place at the table and read with your mouth full."

Enjoy!

(The above is a repost from The World is My Interactive Interface)

Coming Soon
--More about 3D TV and Interactive TV
--Highlights from CHI 2010 (better late than never!)
--My experiments- SMARTTable, a game, interactive timeline prototype pictures...
--A post about Lieven van Velthoven's interesting Post-WIMP explorations - here are some links that he recently sent me:
As I took a peek at Lieven's video links, I noticed an interesting video mash-up Lieven created from the open-source code for Radiohead's House of Cards music video and his One Million Particles app. I'll post them soon.

I'll try to get video, pictures, and commentary about EVA 2010.  EVA stands for Electronic Visualization and the Arts: "Electronic Information, the Visual Arts, and Beyond."

FYI
I'm in the process of sorting through and re-organizing my blogs, which have been around for over four years!  During this time, my blogs have attracted a growing number of readers. Because of this, I'd like to make things a bit more user-centered.  So expect to see little changes here and there.  I promise I'll give my readers advance warning if I make any serious changes!

If you are new to this blog, you should know that my blogs started out as on-line filing cabinets, open to the world.  Although there is a bit of overlap of material and some cross-posting between the blogs, they are arranged to serve as a paper-less way of keeping track of things that I've learned through my coursework, conference attendance, readings, and research. Since emerging technologies are high on my list of interests, I also use my blogs to share interesting things that cross my path.    
  
I changed the name of my World Is My Interface blog to The World Is My Interactive Interface.   "Off-the-desktop natural user interfaces, interaction, and user experience" are the main topics of the blog.  It sometimes includes information about ubiquitous computing and DOOH, otherwise known as Digital Out Of Home.

I plan to tinker with my TechPsych blog later on. It focuses on topics that are useful to psychologists, educators, special education teachers, speech and language therapists, health and wellness professionals, and parents.

Feel free to leave comments, as I welcome your input.

Jun 6, 2013

Interactive Displays and "Billboards" in Public Spaces; Pervasive Displays 2013

The 2013 International Symposium on Pervasive Displays (PerDis 2013) recently convened in Mountain View, California.  Since I couldn't attend this conference, I was happy to learn from Albrecht Schmidt that the conference proceedings were recently uploaded to the ACM Digital Library.  There are many exciting things going on in this interdisciplinary field!

Researchers involved with the Instant Places project, described in the video below, presented their work at PerDis 2013. The Instant Places project was part of PD-Net, a series of research efforts exploring the future of pervasive display networks in Europe. (See the "Related" section for additional references and links.)


Instant Places: Tools and Practices for Situated Publication in Display Networks

Below is information from the Instant Places video and website:
"The video describes a novel screen media system that explores new practices for individual publication and identity projection in public digital displays." 

"Instant Places has been developed by the Ubicomp group of the Information Systems Department, at the University of Minho, and has been funded within the scope of pd-net: Towards Future Pervasive Display Networks, by the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 244011."

Saul Greenberg was the keynote speaker at PerDis 2013.  His keynote, "Proxemic Interactions: Displays and Devices that Respond to Social Distance", highlights how far off-the-desktop our digital/physical lives have become, and how this has influenced recent research in human-computer interaction. Saul is a professor at the University of Calgary and leads research in Human Computer Interaction, Computer Supported Cooperative Work, and Ubiquitous Computing.

Although the video of Saul Greenberg's presentation below is not from PerDis 2013, it touches on the same topics and is worth taking an hour to watch.  In this video, Greenberg presents an overview of the history of human-computer interaction. He also offers a discussion of how an understanding of social theory, the perception of spatial relationships, and embodied interaction can be applied to the design of natural user interfaces and interactive systems.  Useful examples of interaction design explorations, within an ecological context, are provided later in the video.
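The core proxemic idea, a display that changes what it shows based on how far away a person is standing, can be reduced to a tiny sketch. The zone names and distance thresholds below are my own illustration of the concept, not taken from Greenberg's work.

```python
# Illustrative sketch of a proxemic display: pick a level of detail and
# interactivity from the viewer's distance to the screen.
def content_for_distance(meters: float) -> str:
    if meters > 4.0:
        return "ambient"      # far away: attractor loop / ambient visuals
    elif meters > 1.5:
        return "overview"     # approaching: an overview that invites use
    else:
        return "interactive"  # up close: full touch/pen interaction

for d in (6.0, 3.0, 0.5):
    print(d, content_for_distance(d))
```

A real system would smooth the sensed distance and add hysteresis so the display does not flicker between zones as someone lingers near a threshold.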

Proxemic Interactions: the New Ubicomp?




RELATED


My Backstory
Regular readers of this blog know that the subject of interactive displays in public spaces holds my interest. When I was taking computer courses during the mid 2000s, I focused some of my energy on projects designed for large interactive displays, inspired by reading articles like "Physically Large Displays Improve Performance on Spatial Tasks" (Desney S. Tan, Darren Gergle, Peter Scupelli, and Randy Pausch) and "Dynamo: public interactive surface supporting the cooperative sharing and exchange of media" (Shahram Izadi, Harry Brignull, Tom Rodden, Yvonne Rogers, Mia Underwood).

Jeff Han's 2006 TED talk was another inspiration. I remember my excitement as I watched his demonstration of an interactive multi-touch screen the size of a drafting board, before the iPhone/iPad was born.  Another inspiration was Hans Rosling's TED Talk about health statistics, with his animated interactive data visualizations presented on a huge screen.

The following year, I stumbled upon the NUI-Group while searching for information about multi-touch displays, and was inspired by many of the early members of the group.  I also became acquainted with a world-wide network of people who share similar interests, such as Albrecht Schmidt and his team of researchers at the University of Stuttgart. This busy group recently presented at PerDis 2013 and at CHI 2013 and is involved in a wide range of ongoing projects.

INTERACTIVE DISPLAYS
Alt, F., Sahami, A., Kubitza, T., Schmidt, A.  Interaction Techniques for Creating and Exchanging Content with Public Displays. In: Proceedings of the 2013 ACM Annual Conference on Human Factors in Computing Systems
Hinrichs, U., Carpendale, S., Valkanova, N., Kuikkaniemi, K., Jacucci, G., Vande Moere, A.  Interactive Public Displays. IEEE Computer Graphics and Applications, Vol. 33(2), IEEE Computer Society (25-27)
PerDis 2013 Program
Sample Papers:
Otero, N., Muller, M., Alissandrakis, A., and Milrad, M. Exploring video-based interactions around digital public displays to foster curiosity about science in the schools. PerDis 2013 (pdf)
Alt, F., Schneegass, S., Girgis, M., Schmidt, A. Cognitive Effects of Interactive Public Display Applications. Proceedings of the 2nd ACM International Symposium on Pervasive Displays. 2013
Langheinrich, M., Schmidt, A., Davies, N., and Jose, R.  A practical framework for ethics: the PD-net approach to supporting ethics compliance in public display studies. Proceedings of the 2nd ACM International Symposium on Pervasive Displays, 2013, 139-143

Note:  Members of ACM have access to all of the proceedings of PerDis 2013 in the ACM Digital Library. Non-members have access to the abstracts.

PD-NET
PD-Net 
PD-NET Publications - a great reference list, with links to many papers
Reading List on Pervasive Public Displays
About Instant Places
About the Living Lab for Screens Set

DOOH-DIGITAL OUT-OF-HOME
Daily Digital Out of Home post "Billboards That Look Back" : Could miniature cameras embedded in ads lead to Big Brother at the mall? The World Is My Interactive Interface, 5/28/08
J. Müller et al., "Looking Glass: A Field Study on Noticing Interactivity on a Shop Window," Proc. 2012 SIGCHI Conf. Human Factors in Computing Systems (CHI 12), ACM, 2012, pp. 297–306
Michelis, D., Meckel, M. Why Do We Want to Interact With Electronic Billboards in Public Space?  First Workshop on Pervasive Advertising, Pervasive 2009, 5/11/09
The Rage of Interactive Billboards
The Print Innovator, 11/28/12
10 Brilliant Interactive Billboards (Videos)
Amy-Mae Elliot, Mashable, 8/21/11


SOME INTERESTING EARLIER WORK
Jeff Han's 2006 TED Talk (This is worth revisiting, as it came out before the iPhone, iPad, etc.)


Tan, D.S., Gergle, D., Scupelli, P., Pausch, R. Physically large displays improve performance on spatial tasks. ACM Transactions on Computer-Human Interaction, V13(1) 2006 (71-99)

Revisiting promising projects: Dynamo an application for sharing information on large interactive displays in public spaces (blog post)
Lynn Marentette, Interactive Multimedia Technology, 09/16/07

Brignull, H., Izadi, S., Fitzpatrick, G., Rogers, Y., Rodden,  T. The introduction of a shared interactive surface into a communal space. Proceedings of the 2004 ACM conference on Computer supported cooperative work (CSCW'04), Chicago, ACM Press, 2004 (pdf)


Izadi, S., Brignull, H., Rodden, T., Rogers, Y. and Underwood, M. Dynamo: public interactive surface supporting the cooperative sharing and exchange of media. In Proc. User Interface Software and Technology (UIST '03), Vancouver, ACM Press, 2003, 159-168. (pdf)

Proxemics (Wikipedia)




Sep 8, 2011

Update, plus iGaze app by Dunedin Multimedia for use during social skills group activities

So what am I up to now?  


I'd like to share with my readers that I've decided to continue in my present position as a school psychologist, while still devoting a portion of my free time to technology. From time to time, I think deep thoughts about usability, accessibility, and UX/interaction related to off-the-desktop interactive multimedia applications running on screens of all sizes.  I'm hoping to create a few multimedia experiments using HTML5 and JavaScript, and explore jQuery if and when I can find the time!


For the present school year, my main school is a program for students with more significant disabilities, including autism spectrum disorders.  My second school is a magnet high school for technology and the arts,  located on the same campus.  I also consult throughout the district on cases involving students who have suffered traumatic brain injuries, as well as students who have multiple disabilities.  I am thankful that I have a job in a school district that values 21st Century technology.  


I'm looking forward to another technology-rich school year.  I've spent some of the time I usually devote to blogging exploring iPad apps instead.  Since I'm new to the world of iPads, I'm still in discovery mode.  What an adventure!


There are plenty of educational apps out there, and many of them are suitable for students with special needs. On the other hand, there is much room for improvement across all iPad app categories.  Since there is very little research about what makes up a killer app, or suite of apps, for students with special needs, experimenting with iPad apps is uncharted territory.



I made the decision to bring my personal iPad2 to work after I discovered a number of apps that I thought would be useful in my work as a school psychologist with students who have special needs, including autism spectrum disorders.  

One of my intervention themes this year focuses on social skills.  This is especially important for students who participate in our school's community-based job training program.  I'm using some content from Unique Learning's transition materials, as well as on-line activities from Do2Learn's JobTips website, because my aim is to facilitate social skills that will be useful in a variety of job and community settings.


Although my main technology tool for working with groups is the SMART Board, I've found a combination of interactive whiteboard and iPad activities to be especially effective.  I'm paving the way for more role-play activities in the future, and attempting to use technology to my advantage.


This past week, I used the iGaze app, created by Dunedin Multimedia, to help a group of high-school level students practice establishing and maintaining eye gaze, something that is difficult for most of them to demonstrate "in-person".  I was amazed.  Each student was excited to take his or her turn.  Even more amazing?  When each student took a turn, the other students looked at their eyes and faces.  No one rocked or "stimmed".  No one made noises.  I observed several instances of joint attention, much to my delight.   


Below is a video from Dunedin Multimedia's YouTube channel that is similar to what the students viewed during their group activity:
Here is some information from Dunedin Multimedia about the iGaze app:
"Eye contact is important to communication and social development, and yet the impaired ability to make and maintain eye contact is one of the most striking aspects of autism. iGaze is an eye contact simulator that can help to build confidence in using this important means of nonverbal social communication.  The app also contains information on eye contact and eye gaze, with links to relevant research."

During the social skills activity involving the iGaze app,  I used the SMART Board to display a large picture of a boss and a worker standing face to face, making eye contact, engaged in conversation.  The picture served as an anchor to remind the students of pictures and videos they'd previously viewed that illustrated the concept of face-to-face interaction and the importance of establishing eye-contact with others from time-to-time.


I'm hoping I will be able to access the YouTube videos from Dunedin so I can use them on the SMART Board. It will be interesting to see how this plays out!   I'm also planning to take a closer look at Dunedin Multimedia's emotion x app for the iPad.


RELATED 
Screenshot of iGaze for the iPad (Dunedin Multimedia)
SOMEWHAT RELATED
The SMART Table at my school was updated today - I'm looking forward to using it for some group activities, now that it is back in working order and has new applications loaded up and ready to go!


If you are interested in learning more about technology related to students with special needs, be sure to check out Kate Ahern's blog, Teaching Learners with Multiple Special Needs.

Kate's post about the features of Unique Learning Systems.


Upcoming:  more about tablets, interview with folks from Stantum, social-skills game-in-progress.....large displays in public spaces update....

Oct 28, 2009

Interactive Multimedia Across Platforms and Screens: Adobe's Open Screen Project; MEX Mobile User Experience Manifesto.... (Please don't annoy the user!)

I'm not sure what I think about ubiquitous Flash 10.1 and Adobe's Open Screen Project. I like the idea of anything that is seamlessly cross-platform, but I shudder to think that this might unleash a wave of unwanted or annoying "push" advertising on on-the-go screens of all sizes. I'm assuming web developers, along with TV ad producers, will be jumping on this train without fully thinking about how their applications and designs will play out in the off-the-desktop, digital-out-of-home world.

I decided to take a look, drill down through the hype, and share a few links related to this topic.

Adobe Pushes for a Flash-ier Mobile Web
Rob Pegoraro, Faster Forward, Washington Post (10/5/09)
"Are you anxious to bring Flash to the mobile Web, even if it means being subjected to some over-eager Web coder's song-and-dance routine? Or would you rather do without it on the go, even if that means having to switch to a "real" computer to use some Web sites' features?"

Hopefully the "over-eager web coders" will heed the MEX Manifesto:

MEX:  Mobile User Experience 2009 Manifesto (pdf)
"The Manifesto sets out our beliefs as to how user-centred design principles can enhance the experience of multi-platform digital services."

A framework for user journeys in a multi-platform world:  Marek Pawlowski, founder of MEX

MEX: User experience journeys in a multi-platform environment from Marek Pawlowski on Vimeo.

"User experiences are evolving into increasingly complex sets of interactions between multiple devices.  In this video presentation, Marek Pawlowski of the MEX Mobile User Experience strategy forum, shows how a framework can be used to map user journeys through the multi-platform environment."

"Unencumbered by wires, information is flowing into every corner of our world at an ever increasing rate and through an ever increasing range of digital platforms. The single greatest challenge facing digital industries is understanding how this explosion of data will be woven into the fabric of consumers' lives." -- Marek Pawlowski, founder of MEX.

MEX Blog 

OPEN SCREEN VIDEO

Open Screen Project from Vyshak V on Vimeo.
"The Open Screen Project is an industry-wide initiative, led by Adobe and backed by other industry leaders who all share one clear vision: Enable consumers to engage with rich Internet experiences seamlessly across any device, anywhere. Partners in the Open Screen Project are working together to provide a consistent runtime environment for open web browsing and standalone applications — taking advantage of Adobe® Flash® Player and, in the future, Adobe® AIR®. This consistent runtime environment will remove barriers to publishing content and applications across desktops, mobile phones, televisions, and other consumer electronics." Learn more

Reinventing Storytelling in the Digital Age Across Platforms, Across Screens

Presentation by Shantanu Narayen of Adobe and A.D. Albers of Disney Interactive Media Group, from NAB 2009
Adobe and NVIDIA Deliver Rich Web Experiences on Netbooks and Mobile Devices
Reuters (10/5/09)
"At Adobe MAX, Adobe's worldwide developer conference, Adobe Systems Incorporated and NVIDIA Corporation..announced that both companies are bringing uncompromised browsing of rich Web content to netbooks, smarphones and smartbooks built with NVIDIA GPUs. The companies have been working closely together as part of the Open Screen Project to optimize and dramatically improve performance of Flash Player 10.1 by taking advantage of GPU video and graphics acceleration on a wide range of mobile Internet devices. NVIDIA customers embracing Flash Player 10.1 for their new devices include HP, Lenovo, Samsung, Acer, Asus and more..."


RIM Joins Open Screen Project  Reuters (10/4/09)


Honey I Shrunk the Flash Player Simon Bisson and Mary Branscombe, ZDNet (10/12/09)
Teaming up with Adobe and the Open Screen Project -Google Blog   (10/5/09)


Paramount Digital Entertainment Launches Interactive Thriller on MySpace  Tracy Swedlow, InteractiveTV Today (10/28/09)
"The company says that it can deliver the show's multiple interactive elements to viewers across devices using Adobe Flash Player and Adobe AIR, "because of efforts by the Open Screen Project, an industry-wide initiative led by Adobe and supported by PDE and close to 50 other industry leaders, to enable people to engage with rich Internet experiences across any device, anywhere."

Jul 23, 2013

Monkeying Around with Autism Assessments: Kinect-based game by Vectorform and Kaiser Permanente therapists offers a barrel of possibilities!

"The goal was to build a game that is extremely accessible, non-complex and includes simple mechanics that children with autism can quickly understand to retain their attention, prevent over-stimulation, encourage play, and prevent frustration." -Vectorform "Monkey Business" Team


I recently had the chance to visit the team at Vectorform's Royal Oak (MI) headquarters and experience a demo of one of their newest applications, a Kinect-based game developed to assist in the assessment of children with autism. The game, known as Monkey Business, was designed in collaboration with physical, occupational, and speech therapists from Kaiser Permanente, a large health care service provider in California. The project was supported by Kaiser Permanente's Innovation Fund for Technology group.

I was impressed!

As I watched the demo, it was clear that much time and attention was given to the input of the therapists at Kaiser Permanente. The avatar in the game is a friendly monkey that embodies the mannerisms of a curious child.  The artwork is visually appealing and not too busy. The sound effects relate to the interaction at hand and provide feedback to the child as they engage in various game-like tasks designed to assess visual-motor, gross motor, and basic receptive language skills.  Another good feature of this application is the simplicity of the screens designed for inputting the child's information and viewing data regarding progress.

In my opinion, the Monkey Business assessment game has potential for use in school settings, and would be enhanced if additional assessment activities were provided in the application.  Expanded activities within the Monkey Business environment could support a range of treatment, education, and related intervention efforts. In addition to supporting assessment and intervention of children with autism, the Monkey Business concept would be especially useful in working with children who have experienced a traumatic brain injury.


Below are screen shots and descriptions of a few activities from the Monkey Business game:

Lily Pad Jumping Test
In the jumping test, the child is asked to jump across a pond, from one lily pad to another.  The child is provided with a set of toy lily pads on the floor.  As the child jumps, the Kinect sensor picks up movement and displays the jumps on the screen in the form of the monkey.
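Vectorform hasn't published the game's internals, but the jump-detection step described above can be sketched. The snippet below is a hypothetical illustration, assuming the Kinect skeleton stream is reduced to a per-frame hip-joint height in metres; a jump is counted whenever the hip rises above the standing baseline by a threshold and then settles back down.

```python
def count_jumps(hip_heights, threshold=0.15):
    """Count jumps in a stream of hip-joint heights (metres).

    The first sample is taken as the standing baseline. A jump is
    registered when the hip rises more than `threshold` metres above
    the baseline; the jumper must come back down before the next
    jump can be counted.
    """
    baseline = hip_heights[0]
    airborne = False
    jumps = 0
    for y in hip_heights:
        if not airborne and y > baseline + threshold:
            airborne = True
            jumps += 1
        elif airborne and y < baseline + threshold / 2:
            airborne = False
    return jumps
```

The hysteresis (landing is detected at half the take-off threshold) keeps sensor jitter near the threshold from being counted as extra jumps.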


Balance Beam Test
The following screen shot depicts the monkey avatar crossing a stream on a balance beam. The monkey's movements reflect the child's movements across a real balance beam in front of the Kinect sensor and a large screen display.
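A balance measure for the beam crossing could be derived from how far the tracked spine joint drifts sideways from the beam's centre line. This is a guess at one plausible metric, not Vectorform's actual implementation:

```python
def max_lateral_drift(spine_x_positions, beam_center_x=0.0):
    """Maximum sideways drift (metres) of the spine joint from the
    beam's centre line over the course of a crossing. Smaller values
    suggest steadier balance."""
    return max(abs(x - beam_center_x) for x in spine_x_positions)
```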


Block Building Test
In the block building test, the monkey avatar encourages the child to build a tower of blocks to match the one displayed on the large screen.  The child uses real blocks, placed on a table, to build a tower while the monkey provides encouragement.  Objects other than blocks can be used, as the Kinect sensor is capable of identifying a variety of objects that can be used for building and stacking.
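Once the sensor has identified the stacked objects, scoring the child's tower against the target can be as simple as comparing two sequences bottom-up. Again, a hypothetical sketch (block colours standing in for whatever features the sensor actually recognizes):

```python
def tower_score(built, target):
    """Number of blocks matching the target tower, counted from the
    bottom up and stopping at the first mismatch.

    `built` and `target` are bottom-to-top lists of block labels,
    e.g. ["red", "blue", "green"].
    """
    score = 0
    for b, t in zip(built, target):
        if b != t:
            break
        score += 1
    return score
```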


Clinician Screens
The screens for inputting and reviewing data about the child emphasize simplicity and ease of use, as shown in the pictures below:


In the present version of Monkey Business, a TV remote is used to control the application. It is possible that future versions of Monkey Business could be integrated with a smartwatch to input student information and control the activities in the application.


Comment: As a school psychologist who works with children, teens, and young adults with autism and other significant disabilities, I know how difficult it can be to conduct assessments using traditional test materials.  If a student has difficulty interacting with unfamiliar adults, has communication challenges, or has motor difficulties, it may not be possible to administer an assessment that generates meaningful or valid results.  Even if the student is capable of completing some of the traditional test tasks, much time and effort is required on the part of the evaluator to sustain their interest, effort, and attention.   

To get a better picture of a student's emerging skills, I use interactive multimedia applications during my assessments of children with autism.  Newer technologies such as interactive whiteboards, larger touch-screen monitors, and tablets have proven to be useful tools in assessment.   A significant drawback to my approach is that it is cobbled together and might be difficult for other evaluation teams to replicate.  I see a growing need for a range of technologically enhanced tools for assessment, including applications that offer opportunities for playful engagement.  Monkey Business holds potential to fill the bill.

What next?

I'd like to share some stories about the people who make things come to life at Vectorform. During my visit, I learned that they are involved in a number of projects related to health care and are always brainstorming new ways to harness technology for their various clients.  

Over the past several years, I've followed Vectorform's journey as they've jumped off of the desktop and transformed from web developers into post-WIMP explorers of natural user interfaces and interaction.  Their most recent area of exploration is Google Glass.

Intrigued by Google Glass?  So am I.  

During my visit to Vectorform, I had a chance to see the world - or rather Vectorform's basement conference room - through Google Glass.  As soon as I put them on, I was flooded with ideas; I'll expand on them, and those of others, in future posts.

In the meantime, take the time to read an excellent post by Kevin Foreman, Vectorform's Director of Product Vision.  In "The Glass Experience", Kevin provides an in-depth reflection on his experience wearing Google Glass throughout his daily routine.  He also explains the inner workings of the hardware, the strengths and limitations of the Glass system, details of the user interface and "on-board" accessories, and offers a few comments about what it is like to be a new Glass-wearing celebrity.


RELATED
Here are just three of the innovators I met during my recent visit to Vectorform, left to right: Jennifer Tonio, Marketing Manager, Kevin Foreman, Director of Product Vision, and Patric Samona, Director of Health Solutions.


Below are a few links related to the use of games for the assessment and intervention of autism, along with links to information about the use of games and emerging technologies for rehabilitation, health care and health promotion:


Microsoft Kinect and Autism (SlideShare) Susan McCarthy, Little Angel's School, 2/09/13


Italian Team Uses the Kinect to Treat Autistic Children Andrea Lorini, Epoch Times, 12/13/12

Microsoft Surface Multi-touch Application for Pediatric Neuropsychology Assessment (Featuring Vectorform) Lynn Marentette, TechPsych Blog, 8/26/09   



Xbox One, Kinect 2.0 and the future of health technology Marcelo Calbucci, Mobihealth News, 5/26/13



Accessible Games for Health and K-12 Education: Lessons from the Classroom (SlideShare) Lynn Marentette, 5/9/08, Games for Health Conference Presentation

Researchers: Microsoft Kinect is a Money-Saving Telemedicine Device Gabriel Perna, Healthcare Informatics, 2/15/13

Lowes LP, Alfano LN, Yetter BA, Worthen-Chaudhari L, Hinchman W, Samona P, Flanigan KM, Mendell JR. Proof of Concept of the Ability of the Kinect to Quantify Upper Extremity Function in Dystrophinopathy.
PLoS Curr. 2013 Mar 14;5. doi: 10.1371/currents.md.9ab5d872bbb944c6035c9f9bfd314ee2