Showing posts with label research. Show all posts

Mar 26, 2011

Bill Gerba's Words of Wisdom about Digital Media Content (Via Digital Signage Insider)

In this article, Bill Gerba, of WireSpring, discusses his approach to testing digital signage content. The article is one in an ongoing series:


Testing Digital Signage Content: The Need for a New Approach
Bill Gerba, Digital Signage Insider, 3/24/11


From what I've seen of digital signage and displays when I'm out and about, there is a need for folks in this industry to heed Bill Gerba's advice! FYI: I've embedded a slide show at the end of the post that includes over 200 shots of digital signage and related "DOOH" (digital out-of-home media).


RELATED
Digital Signage Screen Placement:  Targeting the Attention Zone 
Bill Gerba, Digital Signage Insider, 12/17/09

Making great digital signage content: Composing shots and scenes
Bill Gerba, Digital Signage Insider, 5/15/08


Lots of "DOOH"

Mar 16, 2011

Cute TV Commercial! Intel Gesture Recognition Technology

This was the first commercial I saw on TV this morning. I had to stop my morning routine to watch it : )



RELATED
From ChannelIntel: Interview of Francis MacDougall, CTO of GestureTek

Jan 18, 2011

"Hi, Google. My name is Johnny" Johnny Chung Lee leaves Microsoft. (I still wish I could be Johnny Chung Lee for a day.)

"Hi, Google.  My name is Johnny"  Johnny Chung Lee announced on his Procrastineering blog that he's accepted a position at Google as a "Rapid Evaluator".   I'm not sure what he will be doing in this position, but his title is intriguing!

Here are some of my previous posts devoted to the work of Johnny Chung Lee:

I wish I could be Johnny Chung Lee for a Day!  Tracking fingers with the Wii Remote
Video Clips of Projects Inspired by Johnny Chung Lee
More about Project Natal: Ricochet - Great Gaming for Fitness, Johnny Chung Lee's Contribution


I STILL wish I could be Johnny Chung Lee for a day!

RELATED
Microsoft Kinect Developer Johnny Chung Lee Jumps Ship and Lands at Google
Leena Rao, TechCrunch, 1/18/11
What Microsoft Kinect Defection to Google Means
Rich Tehrani, TMCnet Blog 1/18/11
Microsoft Loses a Top Kinect Researcher to Google
Tricia Duryee, Yahoo! Finance, 1/18/11

Dec 11, 2010

SMALLab Update: Embodied and Engaged Learning - ASU researchers partner with GameDesk

SMALLab is an interdisciplinary collaborative project at the Arts, Media and Engineering program at Arizona State University, and includes people from fields such as education, art, theatre, computer science, engineering, and psychology.  The SMALLab provides students with a multi-sensory, multi-modal way of learning concepts in an immersive environment, and uses a motion capture system that tracks the position of the students as they move and interact within the environment.

SMALLab's project lead is David Birchfield,  a media artist, researcher, and educator who focuses on K-12 learning, media art installations, and live computer music performances.  SMALLab researchers have recently partnered with GameDesk to develop a 6th grade curriculum for a GameDesk charter school in 2012. (Information and links related to GameDesk are located in the RELATED section of this post.)

Below is a detailed excerpt from an overview of SMALLab:
"In today’s world, digital technology must play a central role in students’ learning. A convergence of trends in the learning science and human-computer interaction (HCI) research offers new theoretical and technological frameworks for learning. in particular, mixed-reality, experiential media systems can support learning in a way that is social, collaborative, multimodal, and embodied. These systems comprise a new breed of student-centered learning environments [SCLE’s]. Importantly, they must address the practicalities of today’s classrooms and informal learning environments (eg.: space, infrastructure, financial resources) while embracing the innovative forms of interactivity that are emerging from our media research communities (eg: multimodal sensing, real time interactive media, context aware computing)...
...SMALLab is an extensible platform for semi-immersive, mixed-reality learning. By semi-immersive, we mean that the mediated space of SMALLab is physically open on all sides to the larger environment. Participants can freely enter and exit the space without the need for wearing specialized display or sensing devices such as head-mounted displays (HMD) or motion capture markers. Participants seated or standing around SMALLab can see and hear the dynamic media, and they can directly communicate with their peers that are interacting in the space. As such, the semi-immersive framework establishes a porous relationship between SMALLab and the larger physical learning environment. By mixed-reality, we mean that there is an integration of physical manipulation objects, 3D physical gestures, and digitally mediated components. By extensible, we mean that researchers, teachers, and students can create new learning scenarios in SMALLab using a set of custom designed authoring tools and programming interfaces."

Below are a few videos about SMALLab, and information about GameDesk, an organization that is collaborating with SMALLab in California.


Below is a demonstration of a SMALLab learning activity:

SMALLab from SMALLab on Vimeo.


RELATED
Sara Corbett, NYTimes Magazine, 9/15/10

Info about GameDesk, from the GameDesk website:
"GameDesk is a 501(c)3 nonprofit research and outreach organization that seeks to reshape models for learning through game-play and game development. The organization looks to help close the achievement gap and engage students to learn core STEM curriculum. It develops project-based learning with a strong focus on purpose, ownership, and personal value. The organization (originally developed out of research and support at the University of Southern California's IMSC) has now been in development, practice, and/or evaluation for over two years in various schools in the Los Angeles area." -Gamedesk

Gamedesk Concept Chart

Cross-posted on the Tech Psych blog.

Oct 21, 2010

Emerging Interactive Ed. Tech: Classmate Assist and Wayang Outpost - Sensors, AI, and Context Awareness for Learning - and Teaching

Brief background: I've been following developments in intelligent tutoring systems for a while, and find it interesting to see how researchers are combining artificial intelligence, learning theory, affective computing, and sensor networks to create applications that might prove to be useful and effective.

One advantage of intelligent tutoring applications is that they can provide students with additional support and feedback the moment it is needed, something that is difficult for teachers to do in large classrooms. With the increasing use of smartphones and other mobile devices such as the iPad, there is a good chance that this sort of technology will be used to support learning anywhere, anytime.

Although most intelligent tutoring systems are geared for 1-1 computing, I think there are some components that could be tweaked and then transferred to create intelligent "tutoring" systems for collaborative learning. Students like game-based learning, and what could be more fun than playing AND learning with a partner or group of peers? (I plan to revisit the research in this area in an upcoming post.)

Some thoughts: I envision a system that could support learning as well as skills useful to students in life beyond the school walls, such as positive social interaction, teamwork, and problem-solving. The path of least resistance? Most likely applications that support pairs or small groups of students working at one display. However, in this era of the "21st Century Learner", there is a growing need for applications that can support small groups of students in collaborative and project-based learning activities.

There are a few applications developed for collaborative learning activities around a multi-touch table, such as the SMART Table or the Surface, and more are needed. Also needed are intelligent systems that can support video conferencing and collaborative learning between students who are not physically co-located.

There are some problems that have yet to be solved. For example, the use of multiple sensors in an application designed for young people might be too intrusive. There are serious issues related to privacy and security. Who would have access to data regarding a student's emotional or physiological state? How would this data be utilized? How would this information be protected? Many school districts have security vulnerabilities, so it is possible that this information could be misused if it ended up in the wrong hands.

Below I've highlighted two "intelligent" tutoring systems that incorporate sensors in one form or another to generate information about student learning in a way that simulates what good teachers do every day. The ClassmateAssist application was developed by researchers at Intel, in collaboration with several universities. The Wayang Outpost application was developed by researchers at UMass Amherst, and is aligned with the principles of Universal Design for Learning.

CLASSMATE ASSIST
ClassmateAssist is an application developed by Intel's Everyday Sensing and Perception team. Here is the description of the application from Intel Research: "The advent of 1:1 computing in the classroom opens the door for teachers to set up individualized learning for their students who have a wide spectrum of interests and skills. ClassmateAssist technology uses computer vision and image projection to assist and guide students in a 1:1 learning environment, helping them to independently accomplish tasks at their own pace, while at the same time allowing teachers to be apprised of student progress."

In the following video, Richard Beckwith, a developmental psychologist at Intel, demonstrates a prototype that uses video sensing to track a student's hand movements during a coin-sorting lesson. The application provides feedback to the student, and also tracks data about the student's progress that can be transformed into a report for the teacher. The system can also monitor students' facial expressions, note attention levels, and provide feedback.
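The teacher-report side of such a system boils down to simple bookkeeping: log each observed step, then aggregate per student. Here is a toy sketch of that idea; the student names, step names, and `progress_report` function are all hypothetical, not part of Intel's actual system.

```python
from collections import defaultdict

def progress_report(events):
    """Aggregate (student, step, correct) observations into a per-student summary."""
    stats = defaultdict(lambda: {"attempted": 0, "correct": 0})
    for student, step, correct in events:
        stats[student]["attempted"] += 1
        stats[student]["correct"] += int(correct)
    return dict(stats)

# Hypothetical observations from a coin-sorting lesson:
report = progress_report([
    ("ana", "sort_pennies", True),
    ("ana", "sort_nickels", False),
    ("ben", "sort_pennies", True),
])
```

In a real system the events would come from the computer-vision tracker rather than a hand-written list, but the aggregation step would look much the same.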


SPAIS Publications:
Theocharous, G., Beckwith, R., Butko, N., Philipose, M. Tractable POMDP Planning Algorithms for Optimal Teaching in "SPAIS". International Joint Conference on Artificial Intelligence (IJCAI) workshop on Plan, Activity, and Intent Recognition (PAIR), Pasadena, California, July 2009.
Theocharous, G., Butko, N., Philipose, M. Designing a Mathematical Manipulatives Tutoring System using POMDPs (pdf). POMDP Practitioners Workshop: Solving Real-world POMDP Problems, International Conference on Automated Planning and Scheduling (ICAPS), Toronto, May 2010.



Wayang Outpost: Web-based Interactive Math/Intelligent Tutoring System, with Sensors
I've followed the work of Beverly P. Woolf and her colleagues for some time.  Much of their research has centered around a web-based application, Wayang Outpost, an intelligent electronic tutoring system that incorporates multimedia and animated adventures while providing activities designed to prepare teens for standardized math tests, such as the SAT and state-mandated end-of-course exams.

In recent years, the team has been using non-invasive sensors in their research, including a camera that views facial expressions, a posture-sensing device located in the seat of the student's chair, a pressure-sensitive mouse, and a wireless skin conductance wristband. Data collected through all of these sensors can provide useful information about student learning. The system can also note when students try to "game" the system.
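As a rough illustration of how readings from several such sensors might be combined, here is a minimal sketch. The sensor names, the weights, and the weighted-average model are all my own assumptions for illustration; they are not the actual Wayang Outpost affect model.

```python
# Illustrative weights for normalized (0.0-1.0) sensor readings.
SENSOR_WEIGHTS = {
    "camera_affect": 0.4,     # facial-expression classifier output
    "chair_posture": 0.2,     # posture sensor in the seat of the chair
    "mouse_pressure": 0.2,    # pressure-sensitive mouse
    "skin_conductance": 0.2,  # wireless wristband
}

def engagement_score(readings):
    """Weighted average over whichever sensors reported during this interval."""
    available = {k: v for k, v in readings.items() if k in SENSOR_WEIGHTS}
    if not available:
        return None  # no sensor data this interval
    total_weight = sum(SENSOR_WEIGHTS[k] for k in available)
    return sum(SENSOR_WEIGHTS[k] * v for k, v in available.items()) / total_weight

# Only two of the four sensors reported; the average re-normalizes over them.
score = engagement_score({"camera_affect": 0.8, "chair_posture": 0.6})
```

The re-normalization matters in practice because sensors drop out (a student leans out of camera view, a wristband loses its connection), and the estimate should degrade gracefully rather than fail.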
Related Publications
Woolf, B.P., Arroyo, I., Muldner, K., Burleson, W., Cooper, D., Dolan, R., Christopherson, R.M. (2010) The Effect of Motivational Learning Companions on Low Achieving Students and Students with Disabilities (pdf). International Conference on Intelligent Tutoring Systems, Pittsburgh.
Abstract: "We report the results of a randomized controlled evaluation of the effectiveness of pedagogical agents as providers of affective feedback. These digital learning companions were embedded in an intelligent tutoring system for mathematics, and were used by approximately one hundred students in two public high schools. Students in the control group did not receive the learning companions. Results indicate that low-achieving students—one third of whom have learning disabilities—had higher affective needs than their higher achieving peers; they initially considered math problem-solving more frustrating, less exciting, and felt more anxious when solving math problems. However, after they interacted with affective pedagogical agents, low-achieving students improved their affective outcomes, e.g., reported reduced frustration and anxiety."


Arroyo, I., Cooper, D.G., Burleson, W., Woolf, B.P., Muldner, K., Christopherson, R. (2009)
Emotion Sensors Go To School. AIED 2009. Pp. 17-24. IOS Press.
Low-tech description of Wayang Outpost, the math application used in the above publication: Paul Franz, Recoder.Com 5/16/09
Cooper, D.G., Arroyo, I., Woolf, B.P., Muldner, K., Burleson, W., Christopherson, R. Sensors Model Student Self-Concept in the Classroom (pdf). UMass Amherst, June 22, 2009/UMAP 2009


Cross-posted on the TechPsych blog.

Oct 18, 2010

Words of Wisdom (and more) from Harry Brignull: UX Roots in Psychology, Design, Info Architecture...and so much more!

Harry Brignull is a User Experience Consultant at Madgex in Brighton, England. According to his "about" page, his work involves "building experiences by blending User Research, Interaction Design, and process consultancy." Harry's 90 Percent of Everything blog is a wellspring of information and inspiration.


Back Story
I came across Harry's work in 2004 or 2005, when I was taking a VR class (Virtual Reality for Education and Training) and working on an assignment about large-screen displays. At the time, Harry was a Ph.D. researcher working in the Interact Lab at the University of Sussex on the Dynamo project, in collaboration with researchers from the Mixed Reality Lab at the University of Nottingham.

I revisited this work again in early 2007 when I was studying HCI and Ubiquitous Computing, and researching information about collaborative interaction on large displays in public spaces.  The following research article inspired me at the time, and looking back, I consider the work of this team to be seminal, and worth revisiting once again.

Izadi, S., Brignull, H., Rodden, T., Rogers, Y., Underwood, M. (UIST'03)
Dynamo: A public interactive surface supporting the cooperative sharing and exchange of media (pdf)  



(The picture was taken from the Dynamo project's website, and shared on my 2007 blog post, Revisiting promising projects, Dynamo, an application for sharing information on large interactive displays in public spaces.)


Links to a few of Harry's useful blog posts:

UX as Applied Psychology:
Clear Reporting & Critical Thinking:  Why User Experience Needs to Remember its Roots in Psychology (10/4/10)

"There was a time, back in the early 1990s, when almost everyone involved with UX research had a background in Psychology.  Back in those days, the term "User Experience" didn't really exist, and the nearest discipline was Human-Computer Interaction (HCI)..."

Comment:
As a school psychologist, I'm well-steeped in the process of research, observation, and data collection. I also know that the fundamentals of applied psychology are a very important ingredient in UX work. Don Norman, one of the "grandfathers" of UX/HCI, was a cognitive psychologist. He was the co-author of one of my psychology textbooks when I was a university student the first time around. Coincidentally, Norman's book, The Design of Everyday Things, was required reading for my HCI graduate class.


Mobile Usability Testing for Low Budgets
Mobile Usability Testing Tip: Recording from Two Webcams
In this post, Harry discusses quick and cheap methods of using two webcams for mobile usability testing. This method could be used in other situations, such as developing presentations. (It might also be applicable for use in therapeutic and special education settings.)

Image: Nick Bowmast


UX Brighton Presentation on Dark Patterns: User Interfaces Designed to Trick People
My Presentation on Out of Box Experience Design (Harry Brignull)
David Ogilvy: We Sell or Else


RELATED
Links to Harry's Blog Posts, By Topic


Oct 10, 2010

Michael Ogawa, Data Visualization Researcher, VIDI (UC Davis)

I followed a link today to Michael Ogawa's website and blog. He's a Ph.D. student in the Computer Science department at the University of California at Davis, where he participates in research with the VIDI (Visualization and Interface Design Innovation) group.

Here is a description of Michael's research, quoted from his site:

"My research focus is in software visualization. I am interested in bringing to light the efforts of software developers: How they work together on projects effectively to create some of the largest and most complex systems in the world. Complementary to that goal, I am also interested in designing visually appealing applications that capture an emotional aspect not seen in charts and graphs." -Michael Ogawa
I haven't fully explored Ogawa's website, but it looks interesting. Here is a video I found on his site of T.S. Eliot reading "The Love Song of J. Alfred Prufrock". The visualization of the poem is interpreted by Open Wordle, an open source library for Processing, available at http://code.google.com/p/openwordle/
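The core idea behind a wordle is simple: size each word in proportion to how often it occurs in the text. Here is a minimal sketch of that counting-and-scaling step (in Python rather than Processing, and with illustrative point sizes; the layout and packing that Open Wordle actually handles is omitted).

```python
from collections import Counter
import re

def word_sizes(text, min_pt=10, max_pt=72):
    """Map each word to a font size scaled linearly between its min and max frequency."""
    words = re.findall(r"[a-z']+", text.lower())
    top = Counter(words).most_common()      # sorted most frequent first
    lo, hi = top[-1][1], top[0][1]          # least and most frequent counts
    span = (hi - lo) or 1                   # avoid dividing by zero when all counts match
    return {w: min_pt + (c - lo) * (max_pt - min_pt) / span for w, c in top}

sizes = word_sizes("let us go then you and I, let us go")
```

Words appearing twice ("let", "us", "go") get the maximum size; words appearing once get the minimum.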



RELATED
Ogawa is known for his work on the Code_Swarm project, "an experiment in organic software visualization". Below is a visualization of the commit history of the Eclipse IDE project:

code_swarm - Eclipse (short ver.) from Michael Ogawa on Vimeo.
(Music: "Orange" by Etherfysh)
Open source code for Codeswarm: http://code.google.com/p/codeswarm
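Under the hood, code_swarm animates a time-ordered stream of (timestamp, author, file) commit events. Here is a minimal, hypothetical sketch of producing such a stream from `git log --name-only --pretty=format:"%at|%an"` style output; the parsing assumes that exact format, and the real tool ships its own log converters.

```python
def commit_events(log_text):
    """Parse 'timestamp|author' header lines followed by touched-file lines
    into a flat list of (timestamp, author, file) events."""
    events, author, ts = [], None, None
    for line in log_text.splitlines():
        if "|" in line and line.split("|")[0].isdigit():
            ts, author = line.split("|", 1)        # commit header line
        elif line.strip():
            events.append((int(ts), author, line.strip()))  # one event per file
    return events

# Sample of the assumed log format: two commits touching three files total.
sample = "1288000000|alice\nsrc/main.c\nsrc/util.c\n\n1288100000|bob\nREADME\n"
events = commit_events(sample)
```

Each event becomes one particle in the animation, drifting toward its author's cluster, which is why busy files and prolific committers dominate the swarm visually.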
Kwan-Liu Ma, Michael Ogawa's advisor

News and Publication Links from the VIDI website:


Sep 18, 2010

Interactive Tabletops and Surfaces: 2010 ACM Conference, Nov. 7-10, Saarbrücken, Germany. Wish I could go!

If you are new to this blog, you should know that I'm passionate about interactive tables and surfaces of all sizes!   Although this technology has been around for a while, it is a new concept to most people.  The researchers and practitioners involved in the upcoming 2010 Interactive Tabletops and Surfaces Conference have been an important influence in the way people think about interacting with technology, and have made significant contributions to this emerging field over the past several years.   It hasn't been an easy road, given that most of us have minds brainwashed through years of forced keyboard-and-mouse interaction and traditional WIMP (Windows, Icons, Menus, Pointers) interfaces.

I first learned about the first Interactive Tabletops conference, held in 2006, in early 2007.  At the time, I was working on projects for my HCI and Ubiquitous Computing classes, trying to learn everything I could about natural user interaction, large touch-screen displays, tabletop computing, and multi-touch.   I was inspired by the interesting work going on in this field.  This was before the first iPhone was introduced, before Microsoft's multi-touch Surface was unveiled, and three years before Apple broke out with the iPad.

Many of the people involved with the 2010 Interactive Tabletops and Surfaces Conference are (or have been) affiliated with the NUI Group. NUI stands for Natural User Interface, or Natural User Interaction; the NUI Group is "a global research community focused on the open discovery of natural user interfaces." I joined the NUI Group in 2007 when I was looking for more information about the nuts and bolts of multi-touch programming and systems, and have been encouraged to see how things have evolved since then.

Members of another group, Sparkon, are also participating in the Interactive Tabletops and Surfaces conference. Sparkon is an online community that includes people involved with interactive technologies, including tabletop and surface computing. "On sparkon, you'll find projects demonstrating the latest interactive techniques, applications, software frameworks, case studies, and blog articles relating to creative and emergent technology." (I'm also a member of Sparkon.)


Here's the information from the conference website:

ACM Interactive Tabletops and Surfaces, Saarbrücken, Germany:  7-10 November, 2010
"ITS 2010 is a premier venue for presenting research in the design and use of new and emerging tabletop and interactive surface technologies. As a young community, we embrace the growth of the discipline in a wide variety of areas, including innovations in ITS hardware, software, interaction design, and studies expanding our understanding of design considerations of ITS technologies and of their applications in modern society. ITS 2010 will bring together top researchers and practitioners who are interested in both the technical and human aspects of interactive tabletop and surface technologies. It is our hope that we will be able to achieve increased synergy of approaches between the disciplines engaged in the research in the area of interactive tabletops and surfaces, Design, HCI, UbiComp, Psychology, MobileHCI and other related fields. More directly, we intend to encourage immediate interdisciplinary collaboration on future research topics. Young scholars and Ph.D. students are especially encouraged to submit papers and participate in the doctoral colloquium."


Johannes Schöning, DFKI GmbH
Antonio Krüger, DFKI GmbH
Conference General Chairs



KEYNOTE SPEAKER:  W. Bradford Paley

"Bio: W. Bradford Paley uses computers to create visual displays with the goal of making readable, clear, and engaging expressions of complex data. He did his first computer graphics in 1973, founded Digital Image Design Incorporated in 1982, and started doing financial & statistical data visualization in 1986. He has exhibited at the Museum of Modern Art; he created TextArc.org; he is in the ARTPORT collection of the Whitney Museum of American Art; has received multiple grants and awards for both art and design, and his designs are at work every day in the hands of brokers on the floor of the New York Stock Exchange. He is an adjunct associate professor at Columbia University, and is director of Information Esthetics: a fledgling interdisciplinary group exploring the creation and interpretation of data representations that are both readable and esthetically satisfying."


SAMPLE TOPICS



  • Applications
  • Gesture-based interfaces
  • Multi-modal interfaces
  • Tangible interfaces
  • Novel interaction techniques
  • Data handling/exchange on large interactive surfaces
  • Data presentation on large interactive surfaces
  • User-interface technology
  • Computer supported collaborative systems
  • Middleware and network support
  • Augmented reality
  • Social protocols
  • Information visualizations
  • Interactive surface hardware, including sensing and input technologies with novel capabilities
  • Human-centered design & methodologies





RELATED
Previous Conferences


Aug 13, 2010

Bill Buxton's Presentation Video: "A Little Tale about Touch" (Microsoft Worldwide Partner Conference 2010)

I am always inspired by Bill Buxton's words of wisdom.  If you haven't heard of him, know that he's been around for a very long time, coming to the world of computer science and IT through his passion for music. According to his on-line biography, "Bill Buxton is a relentless advocate for innovation, design, and - especially - the appropriate consideration of human values, capacity, and culture in the conception, implementation, and use of new products and technologies. This is reflected in his research, teaching, talks, and writing - including his column on design and innovation for BusinessWeek.com, and his 2007 book, Sketching User Experiences."

A Little Tale about Touch "It's about the physical and the social context where it is happening" (Bill Buxton)

RELATED
“The only true voyage of discovery is not to go to new places, but to see the world through different eyes”.  - Bill Buxton, quoting Proust (1913, Remembrance of Things Past)

Bill Buxton: After the Show (Interviewed by Caroline Goles, Worldwide Partner Group Business Manager)

Bill Buxton's Website
Buxton, William. (1994) The three mirrors of interaction: a holistic approach to user interfaces. In L.W. MacDonald & J. Vince (Eds.) Interacting with virtual environments. New York: Wiley.

Here is a quote from the above reference:

"The thesis of this chapter is that we should consider technology in terms of the fidelity with which it reflects human capabilities on three levels:


* physical: how we are built and what motor/sensory skills we possess;

* cognitive: how we think, learn, solve problems and what cognitive skills we possess;

* social: how we relate to our social milieu, including group structure and dynamics, power, politics, and what social skills we possess.

Our metaphor is one of three separate mirrors, each reflecting one of these levels. In order to be judged acceptable, designs must provide an acceptable degree of fidelity in how they reflect each of these three aspects of human makeup and activity. The benefit is in how the model can provide a simple but valuable test that can be used during the design process. We now look at each of these mirrors in turn."

May 29, 2010

Preview: Update on Touch & Multitouch Technologies, Websites, and Touch-Interactive Multimedia Apps

It is about time for an update on touch/gesture-interactive technologies.

I've been researching the latest in "touch" screens and new developments in interactive multi-media content.  In just one year, a multitude of websites have been transformed from static to interactive. 

Although the initial objective for some of these websites was to optimize the interface and navigation for people accessing websites via touch-screen cell phones,  some are ideal for use on touch-enabled slates, the iPad, and even larger touch screen displays and surfaces.   

Convergence seems to be the buzzword of the day. Interactive TV. Game consoles with Internet access. Movies on your cell phone. Touch-screen Coke machines displaying movie trailers. What's happening now, and what is next?

I welcome input from my readers in the form of links to websites, university labs with grad students and professors who are obsessed with emerging interactive technologies, proof-of-concept video clips, video clips of related technologies that are new-to-market, etc.   

I will add video clips to the following playlist:


FYI: I'm also in the middle of writing a series of posts about 3D television technologies for the Innovative Interactivity blog, and welcome input from my readers about this topic.


RELATED (Previous posts)
(the above post includes links to various multi-touch developer kits and resources)




Apr 26, 2010

Interactive Multi-Touch for Children with Autism Spectrum Disorders: Research and Apps by Juan Pablo Hourcade, Thomas Hanson, and Natasha Bullock-Rest, University of Iowa

Dr. Juan Pablo Hourcade heads a team of researchers at the University of Iowa who are creating multi-touch applications and other technologies to support communication, collaboration, creativity, and self-expression for young people with Autism Spectrum Disorders. The picture below is a screen shot of the team's web page, which includes a few videos of the team's important work. (Videos can be found on the Technologies for Children with Autism Spectrum Disorders webpage.)


RELATED VIDEOS

pymt demo reel from Thomas Hansen on Vimeo.


Multi Touch @ University of Iowa from Thomas Hansen on Vimeo.


RELATED
Hourcade, J.P., & Hansen, T.E. (2010). Multitouch Displays to Support Preschool Children's Learning in Mathematics, Reading, Writing, Social Skills and the Arts (pdf)


Dr. Hourcade is organizing the Digital Technologies and Marginalized Youth workshop on Saturday, June 12, 2010 at the 9th International Conference on Interaction Design and Children (IDC), which will be held this year in Barcelona, Spain.


Other workshops that will be held at IDC 2010:
Interactive Storytelling for Children
Extended Abstract: Interactive Storytelling for Children (pdf)
Designing Tangibles for Children
Play and Technology:  How does technology initiate and facilitate physical play?


Dr. Hourcade's publications