Sep 3, 2008

Lazybrains 3D game: Another Brain-Computer Interface!

I came across an article about the BCI (Brain-Computer Interface) 3D game, Lazybrains, on the Wired website today. "Brain Scanners, Fingercams Take Computer Interfaces Beyond Multitouch"

LazyBrains was a Digital Media Senior Project of Aaron Bohenick, James Borden, Sachary Brooks, Kenneth Oum, and Jordan Santell, students at Drexel University.


Here is a video:

Game Teaser


Description of the BCI, an fNIR device:
  • "The Functional Near-Infrared Imaging Device (fNIR) is a technology that was developed at the University of Pennsylvania, but is currently being used by the Drexel University biomedical department. The device shines infrared light into the user's forehead, and records the amount of light that gets transmitted back. The change in the amount of light can be used to deduce information about the amount of oxygen in the blood. When the user concentrates, their frontal lobe needs more oxygen and this change can be detected by the device."
http://www.voxel6.com/images/fNIR_CUTOUT_thumb.png
For more information, see the Voxel6 website.
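The detection principle in the quoted description (less reflected near-infrared light means more oxygen demand in the frontal lobe, which signals concentration) can be sketched as a toy threshold detector. The function name, threshold, and intensity values below are illustrative assumptions; real fNIR processing relies on the modified Beer-Lambert law and per-user calibration.

```python
def estimate_activation(baseline_intensity, samples, threshold=0.05):
    """Toy fNIR-style detector (illustrative only).

    Flags "concentrating" whenever the relative drop in reflected
    near-infrared light -- a crude proxy for increased blood-oxygen
    demand in the frontal lobe -- exceeds a threshold.
    """
    flags = []
    for sample in samples:
        # Fractional decrease in detected light vs. the resting baseline.
        relative_change = (baseline_intensity - sample) / baseline_intensity
        flags.append(relative_change > threshold)
    return flags


# Hypothetical readings: resting, concentrating, resting again.
print(estimate_activation(100.0, [99.0, 93.0, 100.5]))
```

A real system would also filter noise (heartbeat, motion artifacts) and calibrate the baseline per user, but the core idea is this simple comparison of light intensity against a resting baseline.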


Here is a link to a post about a similar BCI system:

Emotiv Systems' Neural Game Controller Headset: Human-Computer Interface of the Future?

It will be interesting to see how this technology unfolds. In my opinion, it will be quite useful for cognitive rehabilitation, as well as providing access to games for people who have significant physical limitations.

Sep 1, 2008

Interactive Touch-Screen Technology, Participatory Design, and "Getting It"....

PLEASE SEE THE UPDATED VERSION OF THIS POST:
Interactive Touch Screen Technology, Participatory Design, and "Getting It", Revisited

http://www.ehomeupgrade.com/wordpress/wp-content/uploads/hp_touchsmart_pc.jpg
http://www.wired.com/images/article/full/2008/08/han_interview_630px.jpg

There's been some discussion over the reasons why so many people don't understand touch screen, or "surface" computing, even though research in this area has been going on for years.

As the new owner of the HP TouchSmart, I know that I get it.

The research I've conducted in this area suggests that people will "get it" only if there is a strong commitment to developing touch-screen "surface" applications through a user-centered, participatory design process. In my view, this process should incorporate principles of ethnography and ensure that usability studies are conducted outside of the lab.


This approach was taken with Intel's Classmate PC. Intel has about 40 ethnographic researchers, and sent many of them to work with students and teachers in classrooms around the world. (A video regarding ethnographic research and the Intel Classmate project can be found near the end of this post.)

http://download.intel.com/pressroom/kits/events/idffall_2008/images/Picture007.jpg
http://www.classmatepc.com/images/advocateImage.jpg

Where to start?
K-12 classrooms and media centers. Public libraries. Malls. Hospital lobbies and doctors' offices. Any waiting room. Staff lounges in medical centers, schools, and universities. Community festivities and events. Movie theater lobbies. Museums and other points of interest.


I believe we need to take a "touching is believing" approach.

Here are some thoughts:

When I try to explain my fascination with developing touch-screen interactive multimedia applications (interactive whiteboards, multi-touch displays and tables, and the like), many of my friends' and family members' eyes glaze over. This is particularly true for people I know who are forty-ish or older.

Even if you are younger, if you never saw the cool technology demonstrated in the movie Minority Report, if you have limited experience with video games, or if you haven't come within touching distance of an interactive whiteboard, the concept might be difficult to understand.


The reality?

Even people who have the opportunity to use surface computing technology on large screens do not take full advantage of it. Multi-touch screens are often used as single-touch screens, and interactive whiteboards in classrooms often serve as expensive projector screens for teacher-controlled PowerPoint presentations.


Most importantly, there are few software developers who understand the surface computing approach, even with the popularity of the iPhone and iPod Touch. Most focus on traditional business-oriented or marketing applications, and have difficulty envisioning scenarios in which surface computing would be a welcome breath of fresh air.

Another factor is that not all people entrusted to market surface or touch screen computing fully understand it.

http://blogs.msdn.com/blogfiles/healthblog/WindowsLiveWriter/MicrosoftHUGWishyouwerehereDay2_82D3/IMG_0550_thumb.jpg
Despite a cool website showing off the goods, Microsoft's Surface multi-touch table has been slow to take off, limiting hope of bringing the price down to a level most families or schools could afford. (The picture above depicts an application for the Surface designed for health care professionals, not K-12 science education.)

Although you can't buy a Surface table for your family room, it is possible to buy a TouchSmart.

HP's TouchSmart website is engaging and highlights some examples of touch-screen interaction, but most people don't seem to know about it.


Unfortunately, you wouldn't have a clue that the HP TouchSmart exists browsing the aisles at Circuit City or Best Buy!

When I was shopping for my new TouchSmart, I noticed that from a distance, it looked just like the other large flat-screen monitors filling the aisles. The salespeople at both stores were not well informed about the system. The only reason I knew about the new TouchSmart was my obsession with interactive multimedia touch-screen applications: designing them, developing them, studying them, reading about them, blogging about them.... ; }

More thoughts:

After studying HCI (Human-Computer Interaction), and relating this knowledge to what I know as a psychologist, my hunch is that the "Windows, Icons, Menus, Pointing device" (WIMP) and keyboard input mind-set is embedded in our brains, to a certain extent. Like driving a car, it is something automatic and expected. This is true for users AND developers.

Think about it.

Suppose that one day you were told you were no longer allowed to control your car by turning the ignition, turning the steering wheel, or using your feet to accelerate, slow down, or stop the car! Instead, you needed to learn a new navigation, integration, and control system that involved waving your hands about and perhaps speaking a few commands.

For new drivers who'd never seen a car before, this new system would be user-friendly and intuitive. Perhaps it would be quite easy for 16-year-old kids to wrap their heads around this concept. For most of us, no. Imagine the disasters we would see on our streets and highways!

When we think about how newer technologies are introduced to people, we should keep this in mind.

In my mind, spreading the word about surface computing is not an "if you build it, they will come" phenomenon like the iPhone. We can't ignore the broader picture.

From my middle-aged woman's vantage point, I believe it is important that those involved with studying, developing, or marketing surface computing applications realize that many of us simply have no point of reference other than our experiences with ATMs, airline kiosks, supermarket self-serve lanes, and the like.

(The video clip at the very end of this post provides a good example of touch-screen technology gone wrong.)


Be aware that there are substantial numbers of people who might benefit from surface computing who prefer to avoid the ATMs, airline kiosks, and self-serve grocery shopping.

Realize that the collective experience with technology, in many cases, has not been too pretty. Many people have had such user-unfriendly experiences with productivity applications, forced upon them by their employers, that any interest or desire to explore emerging technologies has been zapped.

My own exposure to interactive "surface" related technology was somewhat accidental.

A few years ago, a huge box was deposited in the room where I worked a couple of days a week as a school psychologist at a middle school. After a week or so, I became curious and found out that it was a SmartBoard. Until then (2002!), I did not know that interactive whiteboards existed.

The box remained unopened in the room for the entire school year, but no matter. I played with the only other SmartBoard in the school, and found a couple at the high school where I also worked. I hunted for all of the applications and interactive websites I could find, and tried them out. That is when I was hooked. I could see all kinds of possibilities for interactive, engaging subject-area learning activities. I could see the SmartBoard's potential for music and art classes. With my own eyes, I saw how the SmartBoard engaged students with special needs in counseling activities.

(By the way, if you are working with middle school students, PBS Kids' It's My Life website activities work great on an interactive whiteboard.)

A few years have passed, and reflecting on all of my fun experiences with interactive whiteboards, with and without students, I now understand that many teachers still have limited exposure to this technology.

This school year, many teachers find themselves teaching in classrooms recently outfitted with interactive whiteboards, scrambling along with educational technology staff development specialists to figure out how the technology works best with various groups of students, and what sort of changes need to be made to instructional practice.


For the very first time, interactive whiteboards were installed in two classrooms at one of the schools I work at. One of the teachers I know thanked me for telling her about interactive whiteboards and sharing my resources and links.

If I hadn't let her know about this technology, she wouldn't have volunteered to have one installed in her classroom. It has transformed the way she teaches special needs students.

In the few months that she's used the whiteboard, I can see how much it has transformed the way the students learn. They are attentive, more communicative, and engaged. The students don't spend the whole day with the whiteboard - the interactive learning activities are woven into lessons at various times of the day, representing true technology integration.

Now let's see what happens when all-in-one touch-screen PCs are unleashed in our schools!

Some resources:
HP TouchSmart PC website, with demo
HP's TouchSmart YouTube videos
lm3labs (catchyoo, ubiq'window)
NUI Group (See member's links)
NextWindow
Fingertapps
thirteen23
SmartTechnologies
Perceptive Pixel - Jeff Han
Microsoft Surface
iPhone
(More can be found by doing a search on this blog or The World Is My Interactive Interface.)

Value of ethnographic research:
Ethnographic Research Informed Intel's Classmate PC
"Intel looked closely at how students collaborate and move around in classroom environments. The new tablet feature was implemented so that the device would be more conducive to what Intel calls “micromobility”. Intel wants students to be able to carry around Classmate PCs in much the same way that they currently carry around paper and pencil." -via Putting People First and Ars Technica

The video below is from Intel's YouTube Channel. Information about Intel's approach to ethnographic research in classrooms during the development of the Classmate PC is highlighted. This approach uses participatory design and allows the set of applications developed for the Classmate PC to reflect the needs of local students and teachers. Schools from many different countries were included in this study.




FYI:

Need for Improvement: User-Unfriendly Information Kiosk Interactive Map


Here are some interesting pictures from lm3labs, which are in my interactive usability hall of fame:

http://catchyoo.typepad.com/photos/uncategorized/2008/06/30/4654.jpghttp://farm3.static.flickr.com/2172/2233673451_6a48db8bff.jpg?v=0



Samsung's new Omnia SGH-i900 was re-created at a much larger size, using lm3labs' Ubiq'window touchless technology.


For more about lm3labs, including several videoclips, take a look at one of my previous posts:
Lm3Labs, Nicolas Loeillot, and Multimedia Interaction

Aug 28, 2008

Surface Computing, Health, and Hands-on Science Education

"....from the first time I saw surface in Andy Wilson's lab at Microsoft Research, I knew it had healthcare written all over it. It has taken some time to bring together the right developers and partners to apply Surface technology in health, but we are finally there." -Bill Crounse, MD, Microsoft HealthBlog

As I've previously suggested, surface computing would be useful in K-12 education settings. The graphics posted below come from a demonstration of Microsoft Surface in health care.

We know that one of the challenges in our public schools is to encourage more students to take STEM-related courses. (If you are not familiar with the acronym, STEM stands for Science, Technology, Engineering, and Mathematics.) One look at the hands-on graphics below might convince reluctant students to sign up for class!

[Graphics from the Microsoft Surface in Health demonstration]
The video of the 3D interactive heart simulation can be found at the bottom of the following post:

Microsoft Health Blog

Aug 27, 2008

Digital Lightbox for Hospitals - The Multi-touch Future of Electronic Medical Records?

Is this what the future holds for electronic medical records?

Digital Lightbox For Hospitals

I came across this on Richard Banks' blog, rb.trends. This multi-touch display is from BrainLAB AG, a company located in Germany. Here is a quote from Ubergizmo:

"Digital Lightbox replaces the conventional light box used to observe analog x-ray images. Connected to the hospital PACS, the new digital platform can be installed both in meeting rooms and in operating rooms, where clinicians can then access, manipulate, and utilize data for surgery planning. By displaying the human body in 3D, Digital Lightbox helps clinicians to more clearly demonstrate to patients what effects a disease can have and which procedures may be necessary. Digital Lightbox enables clinicians to select the most valuable images from large amounts of existing medical data. Ergonomic touchscreen technology with zoom functionality makes working with data easy and effective. Clinicians can intuitively navigate within pictures and between settings. Image scrolling can be performed with one finger; zooming in and out of images with two. Images from different sources can also be fused easily. A measure functionality enables clinicians to set size and other dimensions."


Something like this would be good for high school science classrooms.
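The gesture scheme described in the quote above (one finger scrolls through images, two fingers zoom) can be sketched as a simple dispatcher over touch points. This is an illustrative toy, not BrainLAB's actual implementation; the function name and the pinch-distance heuristic are my own assumptions.

```python
import math


def classify_gesture(prev_points, curr_points):
    """Toy classifier for a one-finger-scroll / two-finger-zoom scheme.

    Each argument is a list of (x, y) touch coordinates from consecutive
    frames. Illustrative sketch only -- a real multi-touch system tracks
    touch IDs over time and applies velocity and hysteresis filters.
    """
    if len(curr_points) == 1:
        return "scroll"
    if len(curr_points) == 2 and len(prev_points) == 2:
        # Pinch detection: compare the distance between the two fingers
        # now against the previous frame. Spreading apart zooms in.
        def spread(points):
            (x1, y1), (x2, y2) = points
            return math.hypot(x2 - x1, y2 - y1)

        return "zoom_in" if spread(curr_points) > spread(prev_points) else "zoom_out"
    return "ignore"


print(classify_gesture([(0, 0)], [(5, 5)]))            # one finger
print(classify_gesture([(0, 0), (1, 0)], [(0, 0), (3, 0)]))  # fingers spreading
```

The appeal for a classroom or clinical setting is that the mapping is discoverable: there are no modes or menus to learn before scrolling and zooming start to work.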

Update:
For more photos of the Digital Lightbox and the iPlan Net software that supports remote collaboration, visit the Future-Making Serious Games blog.

Aug 23, 2008

Digital Students@Analog School videoclip from 2004: Do the sentiments of the students still ring true?

It is the beginning of the school year, the best time of year for educators to seriously reflect on the many ways they can play an important role in engaging and inspiring their students.

I learned about the above video today from "Back-to-School Tech Ideas for K-5", written by C.C. Long on her Tech Integration in Schools blog, and I thought it was worth sharing. The video was created by several college students in 2004 and can be found on TeacherTube. It is similar in spirit to the videos I included in a post about engaged learning earlier this year.

Here are a few quotes from the students in video:

"We are more visual learners, we use different technologies to express ourselves, we don't use just pen and paper".

"99 % of the teachers do it the old fashioned way of ...sit down, you listen to someone lecture for 40-50 minutes...

"Just lectures, it limits my learning."

"There are several options in expressing yourself and expressing your viewpoints. And I think the university limits that."

"I was given one way, and that was how I had to do it."

"The professor still wants to teach it the same way they learned.."

"I figured I'd have to to write papers, I figured I'd have to do problem sets, but I thought there would be more options."

"It is frustrating, because it doesn't seems that college is accommodating the visual learner"

"Anywhere you go outside of the classroom, the technology is being used. I don't understand why we aren't applying it to class."

"Listen. Sit down and talk with me. Give me a choice to express myself in their class."

"I think it would make it more exciting for them."

"When I become a teacher, I am going to have to learn and assess that students are going to have even more that is accessible to them. And if I don't adapt to that, I'm going to start to lose my students."


My hunch is that many educators still do not feel comfortable keeping up with the world of the tech-savvy. To do so takes quite a bit of effort, time, and determination. And frustration. If you've worked in public schools for a while, you know what I mean. Much of the technology that educators have been handed over the years has been teacher-unfriendly.
Things are changing.

It has been six years since CAST and the Association for Supervision and Curriculum Development created an on-line "how-to" book about technology and the concept of Universal Design for Learning (UDL). This online multimedia book, Teaching Every Student in the Digital Age, still provides a good foundation for teachers who plan to integrate technology into teaching and learning activities to support all learners.

It is exciting to know that many school districts have initiated study groups and provide on-line resources for their teachers to support the implementation of UDL. At the university level, the concept of using technology to support universal design for instruction is not as alien as it might have been 10 years ago.


Mindful, reflective use of technology, including interactive multimedia technology, can support multiple means of learning, communication, collaboration, and knowledge sharing among all learners, no matter what age. In turn, an engaging and meaningful environment for learning can be sustained.


So what now?

If you are an educator, it wouldn't hurt to see what new educational applications have arrived at your school. Volunteer to be the teacher who teaches with the new interactive whiteboard. Sign up for the Wi-Fi laptop cart once a week for a semester. Hunt down the digital video cameras and do a search for where the video editing software might be hiding. Don't let the tech-savvy teacher down the hall hog it all, even if you consider yourself to be a technophobe!

Most importantly, establish a relationship with the technology "go-to" person at your school or university department, and see if it is not too late to order a few new technology tools. Find a few other people who have decided to do more with technology this year. Then sign up for a few workshops before your calendar is filled. There are no guarantees, but you just might have the best school year ever.

If you are new to this blog, do a search for what might interest you. I am sure you will find links to information that will help. Be sure to visit C.C. Long's Tech Integration in Schools blog for specific technology-related activities you can implement right away.

For reflection, take the time to watch the video clips on C.C. Long's blog that were used during Thinkfinity training.

For more inspiration, you might enjoy following a few of the links below:

Engaged Learning Revisited: Four videoclips for reflection....

Response to Intervention, Universal Design for Learning: Resources for Implementation

Visual Literacy and Multimedia Literacy Quotes - Odds and Ends PART TWO

Engaged Learning and Social Physics: Phun, an Interactive 2D Physics Sandbox

Updated MegaPost - Resources For All: Interactive Multimedia and Universal Design for Learning