
Jun 26, 2010

A few links: GizmoWatch's 10 Interactive User Interfaces for the Future, CNN's Eatocracy, EVA 2010 and More!

Here's a quick link to a recent GizmoWatch post, Ten Interactive User Interfaces for the Future.  Bharat, the author, reviews a variety of interfaces, input methods, and interaction techniques, such as Skinput, a water-based touch screen, a muscle-computer interface, air gestures, brain-computer systems, and even a mud-tub interface.


I was fortunate to see some of these interaction techniques and interfaces when I attended CHI 2010 this past April, and plan to share some of my photos and video clips from the conference on this blog soon.


Totally Unrelated


Online connection for foodies
Eatocracy is a new website within the CNN pages that provides news (and more) about all things related to food. The categories on the site include "main", "news", "bite", "sip", "make", "think", and "buzz".  The best part, in my opinion, is the heirloom recipe collection index, where people can upload and share family recipes and the stories behind them.


Here is the description of Eatocracy from the website:

Eatocracy  "is your online home for smart, passionate conversation and information about food news, politics, culture. We'll highlight regional and family recipes, dive into restaurants and food shopping, chat with celebrity and local chefs, and show you what's for dinner around the world tonight. Grab a place at the table and read with your mouth full."

Enjoy!

(The above is a repost from The World is My Interactive Interface)

Coming Soon
--More about 3D TV and Interactive TV
--Highlights from CHI 2010 (better late than never!)
--My experiments: SMART Table, a game, interactive timeline prototype pictures...
--A post about Lieven van Velthoven's interesting Post-WIMP explorations, based on some links he recently sent me
As I took a peek at Lieven's video links, I noticed an interesting video mash-up Lieven created from the open-source code for Radiohead's House of Cards music video and his One Million Particles app. I'll post them soon.

I'll try to get video, pictures, and commentary about EVA 2010.  EVA stands for "Electronic Visualization and the Arts: Electronic Information, the Visual Arts, and Beyond."

FYI
I'm in the process of sorting through and re-organizing my blogs, which have been around for over four years!  During this time, my blogs have attracted a growing number of readers. Because of this, I'd like to make things a bit more user-centered.  So expect to see little changes here and there.  I promise I'll give my readers warnings in advance if I make any serious changes!

If you are new to this blog, you should know that my blogs started out as on-line filing cabinets, open to the world.  Although there is a bit of overlap of material and some cross-posting between the blogs, they are arranged to serve as a paper-less way of keeping track of things that I've learned through my coursework, conference attendance, readings, and research. Since emerging technologies are high on my list of interests, I also use my blogs to share interesting things that cross my path.    
  
I changed the name of my World Is My Interface blog to The World Is My Interactive Interface.   "Off-the-desktop natural user interfaces, interaction, and user experience" are the main topics of the blog.  It sometimes includes information about ubiquitous computing and DOOH, otherwise known as Digital Out Of Home.

I plan to tinker with my TechPsych blog later on. It focuses on topics that are useful to psychologists, educators, special education teachers, speech and language therapists, health and wellness professionals, and parents.

Feel free to leave comments, as I welcome your input.

Jun 6, 2013

Interactive Displays and "Billboards" in Public Spaces; Pervasive Displays 2013

The 2013 International Symposium on Pervasive Displays (PerDis 2013) recently convened in Mountain View, California.  Since I couldn't attend this conference, I was happy to learn from Albrecht Schmidt that the conference proceedings were recently uploaded to the ACM Digital Library.  There are many exciting things going on in this interdisciplinary field!

Researchers involved with the Instant Places project, described in the video below, presented their work at PerDis 2013. The Instant Places project was part of PD-Net, a series of research efforts exploring the future of pervasive display networks in Europe. (See the "Related" section for additional references and links.)


Instant Places: Tools and Practices for Situated Publication in Display Networks

Below is information from the Instant Places video and website:
"The video describes a novel screen media system that explores new practices for individual publication and identity projection in public digital displays." 

"Instant Places has been developed by the Ubicomp group of the Information Systems Department, at the University of Minho, and has been funded within the scope of pd-net: Towards Future Pervasive Display Networks, by the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 244011."

Saul Greenberg was the keynote speaker at PerDis 2013.  His keynote, "Proxemic Interactions: Displays and Devices that Respond to Social Distance", highlights how far off-the-desktop our digital/physical lives have become, and how this has influenced recent research in human-computer interaction. Saul is a professor at the University of Calgary and leads research in Human Computer Interaction, Computer Supported Cooperative Work, and Ubiquitous Computing.

Although the video of Saul Greenberg's presentation below is not from PerDis 2013, it touches on the same topics and is worth taking an hour to watch.  In this video, Greenberg presents an overview of the history of human-computer interaction. He also discusses how an understanding of social theory, the perception of spatial relationships, and embodied interaction can be applied to the design of natural user interfaces and interactive systems.  Useful examples of interaction design explorations, within an ecological context, are provided later in the video.

Proxemic Interactions: the New Ubicomp?




RELATED


My Backstory
Regular readers of this blog know that the subject of interactive displays in public spaces holds my interest. When I was taking computer courses during the mid-2000s, I focused some of my energy on projects designed for large interactive displays, inspired by reading articles like "Physically Large Displays Improve Performance on Spatial Tasks" (Desney S. Tan, Darren Gergle, Peter Scupelli, and Randy Pausch) and "Dynamo: public interactive surface supporting the cooperative sharing and exchange of media" (Shahram Izadi, Harry Brignull, Tom Rodden, Yvonne Rogers, Mia Underwood).

Jeff Han's 2006 TED talk was another inspiration. I remember my excitement as I watched his demonstration of an interactive multi-touch screen the size of a drafting board, before the iPhone/iPad was born.  Another inspiration was Hans Rosling's TED Talk about health statistics, with his animated interactive data visualizations presented on a huge screen.

The following year, I stumbled upon the NUI Group while searching for information about multi-touch displays, and was inspired by many of the early members of the group.  I also became acquainted with a world-wide network of people who share similar interests, such as Albrecht Schmidt and his team of researchers at the University of Stuttgart. This busy group recently presented at PerDis 2013 and at CHI 2013 and is involved in a wide range of ongoing projects.

INTERACTIVE DISPLAYS
Alt, F., Sahami Shirazi, A., Kubitza, T., Schmidt, A. Interaction Techniques for Creating and Exchanging Content with Public Displays. In: Proceedings of the 2013 ACM Annual Conference on Human Factors in Computing Systems
Hinrichs, U., Carpendale, S., Valkanova, N., Kuikkaniemi, K., Jacucci, G., Vande Moere, A. Interactive Public Displays. IEEE Computer Graphics and Applications, 33(2), 25-27
PerDis 2013 Program
Sample Papers:
Otero, N., Muller, M., Alissandrakis, A., and Milrad, M. Exploring video-based interactions around digital public displays to foster curiosity about science in the schools. PerDis 2013 (pdf)
Alt, F., Schneegass, S., Girgis, M., Schmidt, A. Cognitive Effects of Interactive Public Display Applications. Proceedings of the 2nd ACM International Symposium on Pervasive Displays. 2013
Langheinrich, M., Schmidt, A., Davies, N., and Jose, R. A practical framework for ethics: the PD-net approach to supporting ethics compliance in public display studies. Proceedings of the 2nd ACM International Symposium on Pervasive Displays. 139-143

Note:  Members of ACM have access to all of the proceedings of PerDis 2013 in the ACM Digital Library. Non-members have access to the abstracts.

PD-NET
PD-Net 
PD-NET Publications - a great reference list, with links to many papers
Reading List on Pervasive Public Displays
About Instant Places
About the Living Lab for Screens Set

DOOH-DIGITAL OUT-OF-HOME
Daily Digital Out of Home post "Billboards That Look Back" : Could miniature cameras embedded in ads lead to Big Brother at the mall? The World Is My Interactive Interface, 5/28/08
J. Müller et al., "Looking Glass: A Field Study on Noticing Interactivity on a Shop Window," Proc. 2012 SIGCHI Conf. Human Factors in Computing Systems (CHI 12), ACM, 2012, pp. 297–306
Michelis, D., Meckel, M. Why Do We Want to Interact With Electronic Billboards in Public Space?  First Workshop on Pervasive Advertising, Pervasive 2009, 5/11/09
The Rage of Interactive Billboards
The Print Innovator, 11/28/12
10 Brilliant Interactive Billboards (Videos)
Amy-Mae Elliot, Mashable, 8/21/11


SOME INTERESTING EARLIER WORK
Jeff Han's 2006 TED Talk (This is worth revisiting, as it came out before the iPhone, iPad, etc.)


Tan, D.S., Gergle, D., Scupelli, P., Pausch, R. Physically large displays improve performance on spatial tasks. ACM Transactions on Computer-Human Interaction, 13(1), 2006, 71-99

Revisiting promising projects: Dynamo, an application for sharing information on large interactive displays in public spaces (blog post)
Lynn Marentette, Interactive Multimedia Technology, 09/16/07

Brignull, H., Izadi, S., Fitzpatrick, G., Rogers, Y., Rodden,  T. The introduction of a shared interactive surface into a communal space. Proceedings of the 2004 ACM conference on Computer supported cooperative work (CSCW'04), Chicago, ACM Press, 2004 (pdf)


Izadi, S., Brignull, H., Rodden, T., Rogers, Y. and Underwood, M. Dynamo: public interactive surface supporting the cooperative sharing and exchange of media. In Proceedings of the ACM Symposium on User Interface Software and Technology (UIST '03), Vancouver, ACM Press, 2003, 159-168. (pdf)

Proxemics (Wikipedia)




Sep 8, 2011

Update, plus iGaze app by Dunedin Multimedia for use during social skills group activities

So what am I up to now?  


I'd like to share with my readers that I've decided to continue in my present position as a school psychologist, while still devoting a portion of my free time to technology. From time to time I think deep thoughts about usability, accessibility, and UX/interaction related to off-the-desktop interactive multimedia applications running on screens of all sizes.  I'm hoping to create a few multimedia experiments using HTML5 and JavaScript, and to explore jQuery if and when I can find the time!


For the present school year, my main school is a program for students with more significant disabilities, including autism spectrum disorders.  My second school is a magnet high school for technology and the arts,  located on the same campus.  I also consult throughout the district on cases involving students who have suffered traumatic brain injuries, as well as students who have multiple disabilities.  I am thankful that I have a job in a school district that values 21st Century technology.  


I'm looking forward to another technology-rich school year.  I've spent some of the time I usually devote to blogging exploring iPad apps instead.  Since I'm new to the world of iPads, I'm still in discovery mode.  What an adventure!


There are plenty of educational apps out there, and many of them are suitable for students with special needs.  On the other hand, there is much room for improvement across all iPad app categories.  Since there is very little research about what makes up a killer app (or suite of apps) for students with special needs, experimenting with iPad apps is uncharted territory.



I made the decision to bring my personal iPad2 to work after I discovered a number of apps that I thought would be useful in my work as a school psychologist with students who have special needs, including autism spectrum disorders.  

One of my intervention themes this year focuses on social skills.  This is especially important for students who participate in our school's community-based job training program.  I'm using some content from Unique Learning's transition materials, as well as on-line activities from Do2Learn's JobTips website, because my aim is to facilitate social skills that will be useful in a variety of job and community settings.


Although my main technology tool for working with groups is the SMART Board, I've found a combination of interactive whiteboard and iPad activities to be especially effective.  I'm paving the way for more role-play activities in the future, and attempting to use technology to my advantage.


This past week, I used the iGaze app, created by Dunedin Multimedia, to help a group of high-school level students practice establishing and maintaining eye gaze, something that is difficult for most of them to demonstrate "in-person".  I was amazed.  Each student was excited to take his or her turn.  Even more amazing?  When each student took a turn, the other students looked at their eyes and faces.  No one rocked or "stimmed".  No one made noises.  I observed several instances of joint attention, much to my delight.   


Below is a video from Dunedin Multimedia's YouTube channel that is similar to what the students viewed during their group activity.

Here is some information from Dunedin Multimedia about the iGaze app:
"Eye contact is important to communication and social development, and yet the impaired ability to make and maintain eye contact is one of the most striking aspects of autism. iGaze is an eye contact simulator that can help to build confidence in using this important means of nonverbal social communication.  The app also contains information on eye contact and eye gaze, with links to relevant research."

During the social skills activity involving the iGaze app,  I used the SMART Board to display a large picture of a boss and a worker standing face to face, making eye contact, engaged in conversation.  The picture served as an anchor to remind the students of pictures and videos they'd previously viewed that illustrated the concept of face-to-face interaction and the importance of establishing eye-contact with others from time-to-time.


I'm hoping I will be able to access the YouTube videos from Dunedin so I can use them on the SMART Board. It will be interesting to see how this plays out!   I'm also planning to take a closer look at Dunedin Multimedia's emotion x app for the iPad.


RELATED 
Screenshot of iGaze for the iPad, from Dunedin Multimedia
SOMEWHAT RELATED
The SMART Table at my school was updated today - I'm looking forward to using it for some group activities, now that it is back in working order and has new applications loaded up and ready to go!


If you are interested in learning more about technology related to students with special needs, be sure to check out Kate Ahern's blog, Teaching Learners with Multiple Special Needs.

Kate's post about the features of Unique Learning Systems.


Upcoming:  more about tablets, interview with folks from Stantum, social-skills game-in-progress.....large displays in public spaces update....

Oct 28, 2009

Interactive Multimedia Across Platforms and Screens: Adobe's Open Screen Project; MEX Mobile User Experience Manifesto.... (Please don't annoy the user!)

I'm not sure what I think about ubiquitous Flash 10.1 and Adobe's Open Screen Project. I like the idea of anything that is seamlessly cross-platform, but I shudder to think that this might let loose a wave of unwanted or annoying "push" advertising on on-the-go screens of all sizes. I'm assuming web developers, along with TV ad producers, will be jumping on this train without fully thinking about how their applications and designs will play out in the off-the-desktop, digital-out-of-home world.

I decided to take a look, drill down through the hype, and share a few links related to this topic.

Adobe Pushes for a Flash-ier Mobile Web
Rob Pegoraro, Faster Forward, Washington Post (10/5/09)
"Are you anxious to bring Flash to the mobile Web, even if it means being subjected to some over-eager Web coder's song-and-dance routine? Or would you rather do without it on the go, even if that means having to switch to a "real" computer to use some Web sites' features?"

Hopefully the "over-eager web coders" will heed the MEX Manifesto:

MEX:  Mobile User Experience 2009 Manifesto (pdf)
"The Manifesto sets out our beliefs as to how user-centred design principles can enhance the experience of multi-platform digital services."

A framework for user journeys in a multi-platform world:  Marek Pawlowski, founder of MEX

MEX: User experience journeys in a multi-platform environment from Marek Pawlowski on Vimeo.

"User experiences are evolving into increasingly complex sets of interactions between multiple devices.  In this video presentation, Marek Pawlowski of the MEX Mobile User Experience strategy forum, shows how a framework can be used to map user journeys through the multi-platform environment."

"Unencumbered by wires, information is flowing into every corner of our world at an ever increasing rate and through an ever increasing range of digital platforms. The single greatest challenge facing digital industries is understanding how this explosion of data will be woven into the fabric of consumers' lives." -- Marek Pawlowski, founder of MEX.

MEX Blog 

OPEN SCREEN VIDEO

Open Screen Project from Vyshak V on Vimeo.
"The Open Screen Project is an industry-wide initiative, led by Adobe and backed by other industry leaders who all share one clear vision: Enable consumers to engage with rich Internet experiences seamlessly across any device, anywhere. Partners in the Open Screen Project are working together to provide a consistent runtime environment for open web browsing and standalone applications — taking advantage of Adobe® Flash® Player and, in the future, Adobe® AIR®. This consistent runtime environment will remove barriers to publishing content and applications across desktops, mobile phones, televisions, and other consumer electronics." Learn more

Reinventing Storytelling in the Digital Age Across Platforms, Across Screens

Presentation by Shantanu Narayen of Adobe and A.D. Albers of Disney Interactive Media Group, from NAB 2009
Adobe and NVIDIA Deliver Rich Web Experiences on Netbooks and Mobile Devices
Reuters (10/5/09)
"At Adobe MAX, Adobe's worldwide developer conference, Adobe Systems Incorporated and NVIDIA Corporation... announced that both companies are bringing uncompromised browsing of rich Web content to netbooks, smartphones and smartbooks built with NVIDIA GPUs. The companies have been working closely together as part of the Open Screen Project to optimize and dramatically improve performance of Flash Player 10.1 by taking advantage of GPU video and graphics acceleration on a wide range of mobile Internet devices. NVIDIA customers embracing Flash Player 10.1 for their new devices include HP, Lenovo, Samsung, Acer, Asus and more..."


RIM Joins Open Screen Project  Reuters (10/4/09)


Honey I Shrunk the Flash Player  Simon Bisson and Mary Branscombe, ZDNet, 10/12/09
Teaming up with Adobe and the Open Screen Project -Google Blog   (10/5/09)


Paramount Digital Entertainment Launches Interactive Thriller on MySpace  Tracy Swedlow, InteractiveTV Today (10/28/09)
"The company says that it can deliver the show's multiple interactive elements to viewers across devices using Adobe Flash Player and Adobe AIR, "because of efforts by the Open Screen Project, an industry-wide initiative led by Adobe and supported by PDE and close to 50 other industry leaders, to enable people to engage with rich Internet experiences across any device, anywhere.""

Jul 23, 2013

Monkeying Around with Autism Assessments: Kinect-based game by Vectorform and Kaiser Permanente therapists offers a barrel of possibilities!

"The goal was to build a game that is extremely accessible, non-complex and includes simple mechanics that children with autism can quickly understand to retain their attention, prevent over-stimulation, encourage play, and prevent frustration." -Vectorform "Monkey Business" Team


I recently had the chance to visit the team at Vectorform's Royal Oak (MI) headquarters and experience a demo of one of their newest applications, a Kinect-based game developed to assist in the assessment of children with autism. The game, known as Monkey Business, was designed in collaboration with physical, occupational, and speech therapists from Kaiser Permanente, a large health care service provider in California. The project was supported by Kaiser Permanente's Innovation Fund for Technology group.

I was impressed!

As I watched the demo, it was clear that much time and attention had been given to the input of the therapists at Kaiser Permanente. The avatar in the game is a monkey who embodies the mannerisms of a friendly, curious child.  The artwork is visually appealing and not too busy. The sound effects relate to the interaction at hand and provide feedback to the child as they engage in various game-like tasks designed to assess visual-motor, gross motor, and basic receptive language skills.  Another good feature of this application is the simplicity of the screens designed for entering the child's information and viewing progress data.

In my opinion, the Monkey Business assessment game has potential for use in school settings, and would be enhanced if additional assessment activities were provided in the application.  Expanded activities within the Monkey Business environment could support a range of treatment, education, and related intervention efforts. In addition to supporting assessment and intervention for children with autism, the Monkey Business concept would be especially useful in working with children who have experienced a traumatic brain injury.


Below are screen shots and descriptions of a few activities from the Monkey Business game:

Lily Pad Jumping Test
In the jumping test, the child is asked to jump across a pond, from one lily pad to another, using a set of toy lily pads placed on the floor.  As the child jumps, the Kinect sensor picks up the movement and mirrors each jump on screen through the monkey avatar.
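The core of a jump-detection loop like this can be sketched in a few lines. The sketch below is my own illustration, not Vectorform's code: it assumes the sensor supplies a stream of hip-joint heights in meters (as skeletal-tracking APIs typically do) and flags a jump when the joint rises a threshold above its resting baseline and then comes back down.

```python
def detect_jumps(hip_heights, threshold=0.15):
    """Count jumps in a stream of hip-joint heights (meters).

    A jump is counted when the hip rises `threshold` above the
    resting baseline and then returns near it. The baseline is
    taken from the first sample; a real tracker would keep
    updating it as the child shifts posture.
    """
    baseline = hip_heights[0]
    jumps = 0
    airborne = False
    for h in hip_heights:
        if not airborne and h > baseline + threshold:
            airborne = True          # lift-off detected
        elif airborne and h <= baseline + threshold / 2:
            airborne = False         # landing detected
            jumps += 1
    return jumps

# Simulated sensor stream: two jumps from a 0.9 m resting hip height
stream = [0.9, 0.92, 1.1, 1.15, 0.95, 0.9, 0.9, 1.12, 1.0, 0.9]
print(detect_jumps(stream))  # prints 2
```

In a real application each detected landing would trigger the monkey's jump animation onto the next lily pad; the hysteresis (landing threshold lower than lift-off threshold) keeps sensor jitter from double-counting a single jump.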

Balance Beam Test
The following screen shot depicts the monkey avatar crossing a stream on a balance beam. The monkey's movements reflect the child's movements across a real balance beam in front of the Kinect sensor and a large screen display.

Block Building Test
In the block building test, the monkey avatar encourages the child to build a tower of blocks to match the one displayed on the large screen.  The child uses real blocks, placed on a table, to build a tower while the monkey provides encouragement.  Objects other than blocks can be used, as the Kinect sensor is capable of identifying a variety of objects that can be used for building and stacking.
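The matching logic behind a test like this can be approximated with simple depth arithmetic. The sketch below is my own illustration (the function names and the per-block height are assumptions, not Vectorform's code): given the height of the tower's top surface above the table, as a depth camera might report it, it estimates the block count and checks it against the on-screen target.

```python
def count_stacked_blocks(tower_height_m, block_height_m=0.05):
    """Estimate the number of stacked blocks from the measured height
    of the tower's top surface above the table (in meters)."""
    if tower_height_m <= 0:
        return 0
    return round(tower_height_m / block_height_m)

def tower_matches_target(tower_height_m, target_count, block_height_m=0.05):
    """True when the child's tower has the same number of blocks as
    the model tower displayed on the large screen."""
    return count_stacked_blocks(tower_height_m, block_height_m) == target_count

print(count_stacked_blocks(0.15))     # prints 3
print(tower_matches_target(0.21, 4))  # prints True
```

Rounding to the nearest block count absorbs small depth-measurement noise, which is one reason height-based counting tolerates objects other than uniform blocks reasonably well.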

Clinician Screens
The screens designed for inputting and reviewing data about the child are designed with simplicity and ease of use in mind, as shown in the pictures below:

In the present version of Monkey Business, a TV remote is used to control the application. It is possible that future versions of Monkey Business could be integrated with a smartwatch to input student information and control the activities in the application.


Comment: As a school psychologist who works with children, teens, and young adults with autism and other significant disabilities, I know how difficult it can be to conduct assessments using traditional test materials.  If a student has difficulty interacting with unfamiliar adults, has communication challenges, or has motor difficulties, it may not be possible to administer an assessment that generates meaningful or valid results.  Even if the student is capable of completing some of the traditional test tasks, much time and effort is required on the part of the evaluator to sustain their interest, effort, and attention.   

To get a better picture of a student's emerging skills, I use interactive multimedia applications during my assessments of children with autism.  Newer technologies such as interactive whiteboards, larger touch-screen monitors, and tablets have proven to be useful tools in assessment.   A significant drawback to my approach is that it is cobbled together and might be difficult to replicate by other evaluation teams.  I see a growing need for a range of technologically-enhanced tools for assessment, including applications that offer opportunities for playful engagement.  Monkey Business holds potential to fill the bill.

What next?

I'd like to share some stories about the people who make things come to life at Vectorform. During my visit, I learned that they are involved in a number of projects related to health care and are always brainstorming new ways to harness technology for their various clients.  

Over the past several years, I've followed Vectorform's journey as they've jumped off of the desktop and transformed from web-developers to Post-WIMP explorers of natural user interfaces and interaction.  The most recent area of exploration is Google Glass. 

Intrigued by Google Glass?  So am I.  

During my visit to Vectorform, I had a chance to see the world - or rather Vectorform's basement conference room - through Google Glass.  As soon as I put them on I was flooded with ideas, and will expand on my ideas, and those of others, in future posts.

In the meantime, take the time to read an excellent post by Kevin Foreman, Vectorform's Director of Product Vision.  In "The Glass Experience", Kevin provides an in-depth reflection on his experience wearing Google Glass throughout his daily routine.  He also explains the inner workings of the hardware, the strengths and limitations of the Glass system, and details of the user interface and "on-board" accessories, and offers a few comments about what it is like to be a new Glass-wearing celebrity.


RELATED
Here are just three of the innovators I met during my recent visit to Vectorform, left to right: Jennifer Tonio, Marketing Manager, Kevin Foreman, Director of Product Vision, and Patric Samona, Director of Health Solutions.

Below are a few links related to the use of games for the assessment and intervention of autism, along with links to information about the use of games and emerging technologies for rehabilitation, health care and health promotion:

Microsoft Kinect and Autism (SlideShare) Susan McCarthy, Little Angel's School 2/09/13


Italian Team Uses the Kinect to Treat Autistic Children Andrea Lorini, Epoch Times, 12/13/12

Microsoft Surface Multi-touch Application for Pediatric Neuropsychology Assessment (Featuring Vectorform) Lynn Marentette, TechPsych Blog, 8/26/09   



Xbox One, Kinect 2.0 and the future of health technology Marcelo Calbucci, Mobihealth News, 5/26/13



Accessible Games for Health and K-12 Education: Lessons from the Classroom (SlideShare) Lynn Marentette, 5/9/08, Games for Health Conference Presentation

Researchers: Microsoft Kinect is a Money-Saving Telemedicine Device Gabriel Perna, Healthcare Informatics, 2/15/13

Lowes, L.P., Alfano, L.N., Yetter, B.A., Worthen-Chaudhari, L., Hinchman, W., Samona, P., Flanigan, K.M., Mendell, J.R. Proof of Concept of the Ability of the Kinect to Quantify Upper Extremity Function in Dystrophinopathy. PLoS Curr. 2013 Mar 14; 5. doi: 10.1371/currents.md.9ab5d872bbb944c6035c9f9bfd314ee2