Showing posts with label educational technology. Show all posts

May 15, 2009

iPod Touch Apps, Wiimote Whiteboards, 3D multi-user environments in education, and a teacher's video of the SMART Table in action.

I thought I'd share the last two posts from my TechPsych blog here, since they focus on newer technologies that involve multi-touch or multi-user interaction.

A teacher explores the multi-touch, multi-user SMART Table in his classroom

From what I can see, multi-touch, multi-user applications are ideal for students to learn collaborative, cooperative social skills at the same time they learn academic skills. Smart Technologies, well known in the education world for interactive whiteboards, has released a few tables, known as SMART Tables, into classrooms. One teacher, Tom Barrett, is sharing his journey with the technology, including the SMART Table, online via his blog.

The following is a video of young children doing math on a multi-touch SMART Table. In order to solve the finger-arithmetic problems, the students must work cooperatively.


Addition App - Set to multi-touch finger counts from Tom Barrett on Vimeo.

(In the video, you will see some shapes that Tom mistakenly added, so disregard them as you view the video.)


Here is a quote from Tom's blog about his experience with the addition application:

"I was most pleased with the level of engagement from the children and although on the surface this seems to be a simple application, it definitely requires a level of teamwork that you often do not get.

It is intriguing watching the children’s first attempts and how they realise they need to work together. As the challenge is small scale, once they have been successful they begin to refine their approach, communicate better and so get to later answers quicker."
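For the technically curious, here is a minimal Python sketch of the cooperative mechanic Tom describes: an addition problem counts as solved when the fingers the whole group holds on the table add up to the target sum. The touch-point representation and function names here are invented for illustration; this is not the SMART Table's actual software.

```python
# Hypothetical sketch of a cooperative finger-count addition activity.
# A "touch" is just an (x, y) contact point reported by a touch surface.

def check_answer(active_touches, target_sum):
    """Return True when the group's total finger count equals the target."""
    return len(active_touches) == target_sum

# Three children jointly answering "3 + 4" must place seven fingers:
touches = [(120, 80), (130, 85), (140, 90),   # child one: 3 fingers
           (400, 200), (410, 205),            # child two: 2 fingers
           (600, 300), (610, 305)]            # child three: 2 fingers
```

Because no single child can comfortably supply seven touches alone, the activity forces the kind of coordination Tom observes.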


Educational iPod Touch Apps for Students and Teachers: Eric Sailers' blog
Eric Sailers is a speech-language pathologist and assistive technology specialist who explores new technologies that he's found useful in schools. Below is Eric's demonstration of applications such as "I Write Words", Wikipanion, Preschool Adventure, Twitterrific, Google Mobile, and the calendar.

To demonstrate the iPod Touch, Eric uses an Elmo document camera that projects onto a screen. Note that as Eric demonstrates the Twitterrific application, he navigates to a link to a blog of one of his colleagues, which highlights the way one school is using the Wii as an augmentative communication tool and also as an assessment tool for occupational therapy.



Take some time to explore Eric's Speech-Language Pathology Sharing blog. It is full of great information!

Update: Here are two video clips Eric created to prepare for an interview as a finalist for the Cox Communication Innovation in Special Education award. In one of the videos, Eric discusses the EduSim application, a 3D multi-user virtual world platform and authoring toolkit intended for classroom interactive whiteboards.

Interactive Applications for Special Education: Wiimote Whiteboards and iPod Touch in Special Education, Part I


Wiimote Whiteboards and iPod Touch in Special Education, Part II

May 10, 2009

Michael Haller Discusses Multi-touch, Interactive Surfaces, and Emerging Technologies for Learning

I came across an excellent overview of interactive display technologies that hold promise for education. The link below is a research article written by Michael Haller for Becta, the British Educational Communications and Technology Agency.

Emerging Technologies for Learning: Interactive Displays and Next Generation Interfaces (pdf)
Michael Haller, Becta Research Report, Volume 3 (2008)


"Multi-touch and interactive surfaces are becoming more interesting, because they allow a natural and intuitive interaction with the computer system.

These more intuitive and natural interfaces could help students to be more
actively involved in working together with content and could also help improve whole-class teaching activities. As these technologies develop, the barrier of having to learn and work with traditional computer interfaces may diminish.

It is still unclear how fast these interfaces will become part of our daily life and
how long it will take for them to be used in every classroom. However, we strongly believe that the more intuitive the interface is, the faster it will be accepted and used. There is a huge potential in these devices, because they allow us to use digital technologies in a more human way." -Michael Haller

Michael Haller works at the department of Digital Media of the Upper Austria University of Applied Sciences (Hagenberg, Austria), where he is the head of the Media Interaction Lab.

Michael co-organized the Interaction Tomorrow course at SIGGRAPH 2007, along with Chia Shen of the Mitsubishi Electric Research Laboratories (MERL). Lecturers included Gerald Morrison of Smart Technologies, Bruce H. Thomas of the University of South Australia, and Andy Wilson of Microsoft Research. The course materials from Interaction Tomorrow are available online, and include videos, slides, and course notes.

Below is an excerpt from the description of the Interaction Tomorrow SIGGRAPH 2007 course:

"Conventional metaphors and underlying interface infrastructure for single-user desktop systems have been traditionally geared towards single mouse and keyboard-based WIMP interface design, while people usually meet around a table, facing each other. A table/wall setting provides a large interactive visual surface for groups to interact together. It encourages collaboration, coordination, as well as simultaneous and parallel problem solving among multiple people.

In this course, we will describe particular challenges and solutions for the design of direct-touch tabletop and interactive wall environments. The participants will learn how to design a non-traditional user interface for large horizontal and vertical displays. Topics include physical setups (e.g. output displays), tracking, sensing, input devices, output displays, pen-based interfaces, direct multi-touch interactions, tangible UI, interaction techniques, application domains, current commercial systems, and future research."

It is worth taking the time to look over Haller's other publications. Here are a few that would be good to read:

M. Haller, C. Forlines, C. Koeffel, J. Leitner, and C. Shen, 2009. "Tabletop Games: Platforms, Experimental Games and Design Recommendations." Springer, 2009. in press [bibtex]

A. D. Cheok, M. Haller, O. N. N. Fernando, and J. P. Wijesena, 2009. "Mixed Reality Entertainment and Art," International Journal of Virtual Reality, vol. X, p. X, 2009. in press [bibtex]

J. Leitner, C. Köffel, and M. Haller, 2009. "Bridging the gap between real and virtual objects for tabletop games," International Journal of Virtual Reality, vol. X, p. X, 2009. in press [bibtex]


M. Haller and M. Billinghurst, 2008. "Interactive Tables: Requirements, Design Recommendations, and Implementation." IGI Publishing, 2008. [bibtex]

D. Leithinger and M. Haller, 2007. "Improving Menu Interaction for Cluttered Tabletop Setups with User-Drawn Path Menus," Horizontal Interactive Human-Computer Systems, 2007. TABLETOP 07. Second Annual IEEE International Workshop on, pp. 121-128, 2007. [bibtex]


J. Leitner, J. Powell, P. Brandl, T. Seifried, M. Haller, B. Doray, and P. To, 2009. "Flux: a tilting multi-touch and pen based surface," in CHI EA 09: Proceedings of the 27th international conference extended abstracts on Human factors in computing systems, New York, NY, USA, 2009, pp. 3211-3216. [bibtex]

P. Brandl, J. Leitner, T. Seifried, M. Haller, B. Doray, and P. To, 2009. "Occlusion-aware menu design for digital tabletops," in CHI EA 09: Proceedings of the 27th international conference extended abstracts on Human factors in computing systems, New York, NY, USA, 2009, pp. 3223-3228. [bibtex]



(This was also posted on the TechPsych blog.)

Apr 24, 2009

SMART Table in the Classroom: Tom Barrett's Journey


Tom Barrett is a teacher who is using a SMART Table in his classroom. His recent post, "SMART Table in my classroom - Days 2-5: Teething Problems", provides some insight into potential problems teachers might face when introducing this sort of technology to students.

(Tom blogs about educational technology, including topics such as "Using the Nintendo Wii to Support My Numeracy Lesson".)

Here are Tom's first-glance comments about the SMART Table:

"A couple of things that I have learned already:

There is a long way to go in terms of the toolkit and software development"

"The table is very robust."

"There is a place in the primary classroom for this type of technology, it feels natural to have this style of technology in my classroom. "

"My instincts tell me there is a future in this style of work for kids."

"Multi-touch and the behind the scenes technology that is needed to operate it, can be very temperamental."

"Children take to the medium very easily and naturally."

"They can be networked"

"3rd party software can run on them but you would lose the 40 touch capability"


"For 9 and 10 year olds (upper junior), the optimum number for using the Table is 4. Any more and it gets a little congested, limiting the screen real estate that you can use. This is crucial, you might be able to get 6 Year 5s around it but they will not get significant enough access to the surface and so the learning activity. "

Feb 27, 2009

Tangible User Interfaces Part II: More Examples, Resources, and Uses for TUIs in Education

In Part I of my "mini-series" about Tangible User Interfaces, I discussed the origins of TUIs and provided some examples of Siftables. In this section, I've provided some links to information about Tangible User Interfaces and the abstracts of two articles pertaining to TUIs in educational settings.

Zen Waves: A Digital (musical) Zen Garden



reactable from Nick M. on Vimeo.

Reactable
More about the Reactable
"The reactable hardware is based on a translucent, round multi-touch surface. A camera situated beneath the table, continuously analyzes the surface, tracking the player's finger tips and the nature, position and orientation of physical objects that are distributed on its surface. These objects represent the components of a classic modular synthesizer, the players interact by moving these objects, changing their distance, orientation and the relation to each other. These actions directly control the topological structure and parameters of the sound synthesizer. A projector, also from underneath the table, draws dynamic animations on its surface, providing a visual feedback of the state, the activity and the main characteristics of the sounds produced by the audio synthesizer."
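As a rough illustration of the mapping the description above suggests, here is a hedged Python sketch in which a tracked object's orientation sweeps an oscillator's frequency and its distance to a neighboring object scales the amplitude. All names, ranges, and formulas are hypothetical; this is not the actual reactable software.

```python
import math

# Illustrative only: map one tracked tabletop object's pose to two
# synthesizer parameters, echoing the reactable description above.

def object_to_synth_params(x, y, angle_deg, neighbor_xy, max_distance=500.0):
    """Return (frequency_hz, amplitude) for a single oscillator object."""
    # Rotating the object (0-360 degrees) sweeps the oscillator frequency
    # across three octaves, from 110 Hz up toward 880 Hz.
    frequency_hz = 110.0 * 2 ** (3 * (angle_deg % 360) / 360)

    # Amplitude falls off with distance to the connected neighbor object,
    # so pulling two objects apart makes the sound quieter.
    nx, ny = neighbor_xy
    distance = math.hypot(x - nx, y - ny)
    amplitude = max(0.0, 1.0 - distance / max_distance)
    return frequency_hz, amplitude
```

The appeal of the design is that the "patch" is the physical layout itself: moving or rotating a block is simultaneously the input gesture and the visible state of the instrument.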


The Bubblegum Sequencer: Making Music with Candy



Jabberstamp: Embedding Sound and Voice in Children's Drawings
(pdf)
(A TUI application to support literacy development in children)

Affective TouchCasting
(pdf)

TapTap: A Haptic Wearable for Asynchronous Distributed Touch Therapy
(pdf)

BodyBeats: Whole-Body, Musical Interfaces for Children
(pdf)

Telestory is a Siftables application that looks like it would be quite useful for supporting children who have communication disorders or autism spectrum disorders.

Telestory Siftables application from Jeevan Kalanithi on Vimeo.

"Telestory is an educational, language learning application created by Seth Hunter. In this video, the child is looking at a television screen. He can control onscreen characters, events and objects with the siftables. For example, he has the dog and cat interact by placing the dog and cat siftables next to each other."
TeleStory Project Website

Here is a video of how Siftables can be used as equation editors:


Siftables Equation Editor from Jeevan Kalanithi on Vimeo.

RESOURCES ABOUT TUIs:


5 lessons about tangible interfaces, GDC Lyon, December 2007 (pdf), Nicolas Nova


Special Issue on Tangible and Embedded Interaction (Guest Editors: Eva Hornecker, Albrecht Schmidt, Brygg Ullmer), International Journal of Arts and Technology (IJART), Volume 1, Issue 3/4, 2008


Reality-Based Interaction: A Framework for Post-WIMP Interfaces (pdf)


Here are a couple of abstracts of articles related to the use of TUIs in education:

Evaluation of the Efficacy of Computer-Based Training Using Tangible User Interface for Low-Functioning Children with Autism Proceedings of the 2008 IEEE International Conference on Digital Games and Intelligent Toys

"Recently, the number of children having autism disorder increases rapidly all over the world. Computer-based training (CBT) has been applied to autism spectrum disorder treatment. Most CBT applications are based on the standard WIMP interface. However, recent study suggests that a Tangible User Interface (TUI) is easier to use for children with autism than the WIMP interface. In this paper, the efficiency of the TUI training system is considered, in comparison with a conventional method of training basic geometric shape classification. A CBT system with TUI was developed using standard computer equipment and a consumer video camera. The experiment was conducted to measure learning efficacy of the new system and the conventional training method. The results show that, under the same time constraint, children with autism who practiced with the new system were able to learn more shapes than those participating in the conventional method."

Towards a framework for investigating tangible environments for learning Sara Price, Jennifer G. Sheridan, Taciana Pontual Falcao, George Roussos, London Knowledge Lab, 2008

"External representations have been shown to play a key role in mediating cognition. Tangible environments offer the opportunity for novel representational formats and combinations, potentially increasing representational power for supporting learning. However, we currently know little about the specific learning benefits of tangible environments, and have no established framework within which to analyse the ways that external representations work in tangible environments to support learning. Taking external representation as the central focus, this paper proposes a framework for investigating the effect of tangible technologies on interaction and cognition. Key artefact-action-representation relationships are identified, and classified to form a structure for investigating the differential cognitive effects of these features. An example scenario from our current research is presented to illustrate how the framework can be used as a method for investigating the effectiveness of differential designs for supporting science learning"

Feb 18, 2009

Ready for the SMART Table?

The SMART Table is now available for purchase!



Here is the plug:

"The world's first multitouch, multiuser table for primary education - the SMART Table - is now available for purchase." Order the SMART Table

"As a collaborative learning center, the SMART Table enables engaging and motivating small-group learning experiences. Up to eight students can use their fingers intuitively to sweep, slide and spin objects on the interactive screen. The SMART Table's ready-made activities help primary students gain and further their skills in areas like counting and reading."

"The SMART Table also makes an ideal complement to whole-class activities on the SMART Board interactive whiteboard. It helps reinforce concepts in a small-group setting and ensures students can participate in interactive and creative learning experiences."

(Cross-posted on the TechPsych and Technology-Supported Human-World Interaction blogs.)

Feb 3, 2009

New SMARTBoard Touch Recognition from SMART Technologies: The YouTube Video



Here's the plug:
"SMART's new Touch Recognition feature allows the SMART Board to recognize your touch and switch modes automatically. You can write with a pen, erase with the palm and move objects around with your finger without having to access other tools, buttons or on-screen menus."

Related

Learning Through Touch: The story behind the SMART Table (pdf) (Heather Ellwood, EdCompass, January 2009)

SMART Table Website

Jan 20, 2009

Children at the Surface Table: BETT 2009

The picture below was posted on the Shakeout blog, and shows children gathered around Microsoft's Surface at the recent BETT 2009 educational technology conference in the UK. Read "Interesting tidbits from BETT 2009" for more information.



Also posted on the TechPsych blog

Jan 18, 2009

BETT 09: UK's Annual ICT (Ed Tech) conference - Tabletop Computing and More

BETT is the annual ICT & educational technology conference held in the UK. The UK has the highest number of classrooms in the world with interactive whiteboards, which has been an interesting transformation to follow over the past few years.

I've posted several video clips from BETT '09, which was held this month (January), along with some other resources. Tabletop computing applications for education were demonstrated by Microsoft Surface and Smart Technologies. Take a look!


BETT 2009 Video Overview


Microsoft Surface at BETT 2009

More Surface for Education: User Interface and Paint


Physics and Social Studies


Orientation and Images


Science: Medical and Health Care


SMARTTable at BETT 09
(Note: The two clips below look as if they were taken with a cell phone video camera. I'll post higher-quality videos if I find them.)



SMART Technologies PR video

I think there is a need for more application development in this area!

RELATED
The following two clips are from the visitor's point of view, overwhelmed by it all...


SMARTTechnologies SMARTTable

Microsoft Surface
BETT 2008 Video Magazine
BETT 2008 Teachers TV Report

Nov 19, 2008

Video of touch interaction on a HP TouchSmart, with NextWindow's Gesture Server Technology

Here is a short video clip of some TouchSmart interaction:



The video shows the new NextWindow Gesture Server Application.

Info from the NextWindow website:

"NextWindow Gesture Server Application in conjunction with a NextWindow touch screen enables two-touch gestures to be used on the Microsoft Windows Vista desktop and certain applications.

You perform a gesture by double-tapping or dragging two fingers on the touch surface. The Gesture Server interprets these actions as commands to the operating system. For example a two-touch vertical drag on the Vista desktop can adjust the computer's audio volume control up or down as required."


Also from the website:

Vertical scroll: drag two fingers up or down the touch screen.

Horizontal scroll: drag two fingers left or right on the touch screen.

Zoom: move two fingers apart or together.

Double tap: double-tap two fingers on the screen.

"You can enable or disable the two-touch functionality and adjust the sensitivity of each of the four two-touch gestures. You can also select the command that is executed with the double-tap gesture."
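A two-touch gesture recognizer of this kind can be approximated in a few lines. The sketch below is purely illustrative, not NextWindow's Gesture Server API: it classifies a gesture from the start and end positions of two touch points, using the same distinctions listed above (a change in finger spread means zoom, shared movement means scroll). The function name and thresholds are invented.

```python
# Hypothetical two-touch gesture classifier (illustrative thresholds in px).

def classify_two_touch(start, end, spread_threshold=30, drag_threshold=30):
    """start/end: lists of two (x, y) touch positions at gesture begin/end."""
    (ax0, ay0), (bx0, by0) = start
    (ax1, ay1), (bx1, by1) = end

    # Change in distance between the two fingers -> pinch/zoom.
    dist0 = ((ax0 - bx0) ** 2 + (ay0 - by0) ** 2) ** 0.5
    dist1 = ((ax1 - bx1) ** 2 + (ay1 - by1) ** 2) ** 0.5
    if abs(dist1 - dist0) > spread_threshold:
        return "zoom-in" if dist1 > dist0 else "zoom-out"

    # Both fingers moving together -> scroll; the dominant axis wins.
    dx = ((ax1 - ax0) + (bx1 - bx0)) / 2
    dy = ((ay1 - ay0) + (by1 - by0)) / 2
    if abs(dy) > drag_threshold and abs(dy) >= abs(dx):
        return "vertical-scroll"
    if abs(dx) > drag_threshold:
        return "horizontal-scroll"
    return "tap"  # little net movement; treat as a (double-)tap candidate
```

In a real gesture server, the classified gesture would then be translated into an OS command, such as adjusting the volume for a vertical drag on the desktop, as the quote above describes.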

Oct 30, 2008

IDC 2009: The 8th International Conference on Interaction Design and Children

If you are interested in children, technology, and new methods of interaction design, take a look at the website for IDC 2009: The 8th International Conference on Interaction Design and Children. It will be held in Como, Italy, June 3-5, 2009.

"For young people today, technology is pervasive in many aspects of life. From childhood onwards, they learn and play using computers and other technological devices; as they grow, they build and maintain friendships using computers and mobile phones; they interact with one another virtually; and even find critical interpersonal support and therapy using computers, the web, and other technology-enhanced artifacts. The IDC 2009 conference will continue IDC's tradition of better understanding children's and youngsters' needs in relationship to technology, exploring how to create interactive products for and with them, and investigating how technology-mediated experiences affect their life. IDC 2009 will present and discuss the most innovative contributions to research, development, and practice in these areas, gathering the leading minds in the field."


The deadline for workshop proposals is January 12, 2009; for full papers, January 19, 2009; and for short papers and demos, March 6, 2009.

Sep 24, 2008

Link to post: Cool Interactive Whiteboard Activities for Teaching Math

Take a look at my recent post on the TechPsych blog to learn more about the math activities created by Spencer Riley for use on interactive whiteboards, then visit his website. The activities are free!

Sep 1, 2008

Interactive Touch-Screen Technology, Participatory Design, and "Getting It"....

PLEASE SEE THE UPDATED VERSION OF THIS POST:
Interactive Touch Screen Technology, Participatory Design, and "Getting It", Revisited


There's been some discussion over the reasons why so many people don't understand touch screen, or "surface" computing, even though research in this area has been going on for years.

As the new owner of the HP TouchSmart, I know that I get it.

The research I've conducted in this area suggests that people will "get it" only if there is a strong commitment to developing touch-screen "surface" applications through a user-centered, participatory design process. In my view, this process should incorporate principles of ethnography and ensure that usability studies are conducted outside of the lab.


This approach was taken with Intel's Classmate PC. Intel has about 40 ethnographic researchers, and sent many of them to work with students and teachers in classrooms around the world. (A video regarding ethnographic research and the Intel Classmate project can be found near the end of this post.)


Where to start?
K-12 classrooms and media centers. Public libraries. Malls. Hospital lobbies and doctors' offices. Any waiting room. Staff lounges in medical centers, schools, and universities. Community festivities and events. Movie theater lobbies. Museums and other points of interest.


I believe we need to take a "touching is believing" approach.

Here are some thoughts:

When I try to explain my fascination with developing touch-screen interactive multimedia applications (interactive whiteboards, multi-touch displays and tables, and the like), many of my friends' and family members' eyes glaze over. This is particularly true for people I know who are forty-ish or over.

Even if you are younger, if you never saw the cool technology demonstrated in the movie Minority Report, or if you have limited experience with video games, or if you haven't come within touching distance of an interactive whiteboard, the concept might be difficult to understand.


The reality?

Even people who have the opportunity to use surface computing technology on large screens do not take full advantage of it. Multi-touch screens are often used as single-touch screens, and interactive whiteboards in classrooms often serve as expensive projector screens for teacher-controlled PowerPoint presentations.


Most importantly, there are few software developers who understand the surface computing approach, even with the popularity of the iPhone and iPod Touch. Most focus on traditional business-oriented or marketing applications, and have difficulty envisioning scenarios in which surface computing would be a welcome breath of fresh air.

Another factor is that not all people entrusted to market surface or touch screen computing fully understand it.

Despite a cool website showing off the goods, Microsoft's Surface multi-touch table has been slow to take off, dimming hopes of bringing the price down to a level most families or schools could afford. (The pictured Surface application was designed for health care professionals, not K-12 science education.)

Although you can't buy a Surface table for your family room, it is possible to buy a TouchSmart.

HP's TouchSmart website is engaging and highlights some examples of touch-screen interaction, but most people don't seem to know about it.


Unfortunately, you wouldn't have a clue that the HP TouchSmart exists while browsing the aisles at Circuit City or Best Buy!

When I was shopping for my new TouchSmart, I noticed that from a distance, the TouchSmart looked just like the other large flat-screen monitors filling the aisles. The salespeople at both stores were not well informed about the system. The only reason I knew about the new TouchSmart was my obsession with interactive multimedia touch-screen applications: designing them, developing them, studying them, reading about them, blogging about them.... ; }

More thoughts:

After studying HCI (human-computer interaction) and relating this knowledge to what I know as a psychologist, my hunch is that the "window, icon, menu, pointing device" (WIMP) and keyboard input mindset is embedded in our brains, to a certain extent. Like driving a car, it is something automatic and expected. This is true for users AND developers.

Think about it.

Suppose one day you were told that you were no longer allowed to control your car by turning the ignition, turning the steering wheel, or using your feet to accelerate, slow down, or stop the car! Instead, you needed to learn a new navigation, integration, and control system that involved waving your hands about and perhaps speaking a few commands.

For new drivers who'd never seen a car before, this new system would be user-friendly and intuitive. Perhaps it would be quite easy for 16-year-old kids to wrap their heads around this concept. For most of us, no. Imagine the disasters we would see on our streets and highways!

When we think about how newer technologies are introduced to people, we should keep this in mind.

In my mind, spreading the word about surface computing is not an "if you build it, they will come" phenomenon, like the iPhone. We can't ignore the broader picture.

From my middle-aged woman's vantage point, I believe it is important that those involved with studying, developing, or marketing surface computing applications realize that many of us simply have no point of reference other than our experiences with ATMs, airline kiosks, supermarket self-serve lanes, and the like.

(The video clip at the very end of this post provides a good example of touch-screen technology gone wrong.)


Be aware that there are substantial numbers of people who might benefit from surface computing who prefer to avoid the ATMs, airline kiosks, and self-serve grocery shopping.

Realize that the collective experience with technology, in many cases, has not been too pretty. Many people have had such user-unfriendly experiences with productivity applications, forced upon them by their employers, that any interest or desire to explore emerging technologies has been zapped.

My own exposure to interactive "surface" related technology was somewhat accidental.

A few years ago, a huge box was deposited into the room where I worked a couple of days a week as a school psychologist at a middle school. After a week or so, I became curious, and found out that it was a SmartBoard. Until then (2002!), I did not know that interactive whiteboards existed.

The box remained unopened in the room for the entire school year, but no worry. I played with the only other SmartBoard in the school, and found a couple at the high school where I also worked. I hunted for all of the applications and interactive websites that I could find, and tried them out. That is when I was hooked. I could see all kinds of possibilities for interactive, engaging subject-area learning activities. I could see the SmartBoard's potential for music and art classes. With my own eyes, I saw how the SmartBoard engaged students with special needs in counseling activities.

(By the way, if you are working with middle school students, PBS Kids' It's My Life website activities work great on an interactive whiteboard.)

A few years have passed, and reflecting on all of my fun experiences with interactive whiteboards, with and without students, I now understand that many teachers still have had limited exposure to this technology.

This school year, many teachers are finding themselves teaching in classrooms recently outfitted with interactive whiteboards, scrambling along with educational technology staff development specialists to figure out how the technology works best with various groups of students, and what changes need to be made to instructional practice.


For the very first time, interactive whiteboards were installed in two classrooms at one of the schools I work at. One of the teachers I know thanked me for telling her about interactive whiteboards and sharing my resources and links.

If I hadn't let her know about this technology, she wouldn't have volunteered to have one installed in her classroom. It has transformed the way she teaches special needs students.

In the few months that she's used the whiteboard, I can see how much it has transformed the way the students learn. They are attentive, more communicative, and engaged. The students don't spend the whole day with the whiteboard - the interactive learning activities are woven into lessons at various times of the day, representing true technology integration.

Now let's see what happens when all-in-one touch-screen PC's are unleashed in our schools!

Some resources:
HP TouchSmart PC website, with demo
HP's TouchSmart YouTube videos
lm3labs (catchyoo, ubiq'window)
NUI Group (See member's links)
NextWindow
Fingertapps
thirteen23
SmartTechnologies
Perceptive Pixel - Jeff Han
Microsoft Surface
iPhone
(More can be found by doing a search on this blog or The World Is My Interactive Interface.)

Value of ethnographic research:
Ethnographic Research Informed Intel's Classmate PC
"Intel looked closely at how students collaborate and move around in classroom environments. The new tablet feature was implemented so that the device would be more conducive to what Intel calls “micromobility”. Intel wants students to be able to carry around Classmate PCs in much the same way that they currently carry around paper and pencil." -via Putting People First and Ars Technica

The video below is from Intel's YouTube channel. It highlights Intel's approach to ethnographic research in classrooms during the development of the Classmate PC. This approach uses participatory design and allows the set of applications developed for the Classmate PC to reflect the needs of local students and teachers. Schools from many different countries were included in the study.




FYI:

Need for Improvement: User-Unfriendly Information Kiosk Interactive Map


Here are some interesting pictures from lm3labs, which are in my interactive usability hall of fame:

http://catchyoo.typepad.com/photos/uncategorized/2008/06/30/4654.jpg
http://farm3.static.flickr.com/2172/2233673451_6a48db8bff.jpg?v=0



Samsung's new Omnia SGH-i900 was re-created in a much larger size, using lm3labs' Ubiq'window touchless technology.


For more about lm3labs, including several videoclips, take a look at one of my previous posts:
Lm3Labs, Nicolas Loeillot, and Multimedia Interaction

Aug 18, 2008

Digital Storytelling, Multimodal Writing, Multiliteracies...

Digital storytelling, multimodal writing, and multiliteracies are overlapping concepts that weren't around during my first round as a university student. As more people of all ages create and share digital content on the web in new and imaginative ways, teachers and university scholars have taken notice. Is there a consensus that the printed word, as we've known it, is in the middle of a digital transformation?

Let's start out with digital storytelling.

By now, everyone knows about YouTube and vlogs as new means of communication. There is more to digital storytelling than uploading a few hastily put-together video clips from the family camcorder, or slapping together a PowerPoint presentation with a few bells and whistles. There are now some standards. Digital storytelling is an art.


The following definition is from the EDUCAUSE article 7 Things You Should Know About Digital Storytelling:

  • "Digital storytelling is the practice of combining narrative with digital content, including images, sound, and video, to create a short movie, typically with a strong emotional component. Sophisticated digital stories can be interactive movies that include highly produced audio and visual effects, but a set of slides with corresponding narration or music constitutes a basic digital story. Digital stories can be instructional, persuasive, historical, or reflective. The resources available to incorporate into a digital story are virtually limitless, giving the storyteller enormous creative latitude. Some learning theorists believe that as a pedagogical technique, storytelling can be effectively applied to nearly any subject. Constructing a narrative and communicating it effectively require the storyteller to think carefully about the topic and consider the audience’s perspective."

Peter Kittle, from the Northern California Writing Project's Summer Institute 2008, touches on the topic of multimodal writing in Multimodal Texts: Composing Digital Documents. Related to this is the concept of digital writing.

"
Multiliteracies is an approach to literacy which focuses on variations in language use according to different social and cultural situations, and the intrinsic multimodality of communications, particularly in the context of today's new media."

  • "...it is no longer enough for literacy teaching to focus solely on the rules of standard forms of the national language. Rather, the business of communication and representation of meaning today increasingly requires that learners are able to figure out differences in patterns of meaning from one context to another. These differences are the consequence of any number of factors, including culture, gender, life experience, subject matter, social or subject domain and the like. Every meaning exchange is cross-cultural to a certain degree." -from Kalantzis and Cope's Multiliteracies website
Here is a short list of resources:
The Center for Digital Storytelling
Multimedia Storytelling
What are multimodality, multisemiotics, and multiliteracies?
(Ben Williamson, Futurelab)
Reading Images: Multimodality, Representation, and New Media
(Gunther Kress)
New Learning: Elements of a Science of Education
(Mary Kalantzis & Bill Cope)
Multiliteracies
The Multiliteracy Project
Multimodal Writing
http://multimodalwriting.com/
(new website, under development)
Multimedia Blogging
(a post from 2004, worth reading for historical context)
Thinking about multimodal assessment
(Digital Writing, Digital Teaching)
Standards related to digital writing
(from Teaching Writing Using Blogs, Wikis...)

I conclude this text-based post with a promise to incorporate more multimedia experiences in my upcoming posts....stay tuned.


Aug 9, 2008

Creative Programming: openFrameworks - AWESOME for interactive multimedia applications!

openFrameworks: Better Tools, Enhanced Creativity, Better Projects: YES. Artists can make tools at the same time they make artwork.

To learn all about this, delve into the video below. It features interviews with creative people who are using openFrameworks and highlights their innovative work.


made with openFrameworks from openFrameworks on Vimeo.

If you are working with openFrameworks, or thinking about it, let me know.
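openFrameworks structures every project around a small set of callbacks (setup(), update(), and draw()) that the framework's main loop calls for you. To give a rough feel for that pattern, here is a sketch in plain Python rather than the C++ openFrameworks actually uses; the App class and run() loop are illustrative stand-ins, not the real API:

```python
# Illustrative stand-in for the setup/update/draw callback pattern used
# by openFrameworks (and Processing). Plain Python, not the real C++ API.

class App:
    def setup(self):
        # Runs once, before the first frame.
        self.x = 0

    def update(self):
        # Runs every frame, before draw(): advance the app's state.
        self.x += 1

    def draw(self):
        # Runs every frame, after update(): render the current state.
        return f"frame with x={self.x}"

def run(app, frames):
    """Minimal stand-in for the framework's main loop."""
    app.setup()
    rendered = []
    for _ in range(frames):
        app.update()
        rendered.append(app.draw())
    return rendered

print(run(App(), 3))  # ['frame with x=1', 'frame with x=2', 'frame with x=3']
```

Part of the appeal for artists is that state, animation logic, and rendering each have an obvious home in this structure, so a quick sketch can grow into a reusable tool without being rewritten.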


This looks like a great tool to use for projects I'm creating for my new HP TouchSmart....

.....and my multi-touch thought experiments ; }


I learned about openFrameworks from Seth Sandler, aka "cerupcat", a member of NUI Group who was chosen to participate in Google's Summer of Code. He's posted about his progress on his AudioTouch blog.

Here is a screenshot of Seth's tracking application, still under development, which is the result of porting touchlib, the main tracker used by NUI Group members, to openFrameworks:
http://www.nuicat.com/tracker.jpg
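Under the hood, an optical multi-touch tracker like touchlib does something conceptually simple: it thresholds each camera frame so finger touches show up as bright spots, then groups adjacent bright pixels into connected "blobs" and reports a centroid for each one. Here is a toy version of that labeling step in plain Python, run on a made-up brightness grid; real trackers add calibration, background subtraction, and frame-to-frame blob IDs:

```python
# Toy version of the core step in a multi-touch tracker such as touchlib:
# threshold a camera frame, then group bright pixels into connected blobs.
# Each blob corresponds to one finger touch. The input here is a fake
# 2D grid of brightness values, not a real camera image.

def find_blobs(frame, threshold):
    """Return one (centroid_row, centroid_col) per connected bright region."""
    rows, cols = len(frame), len(frame[0])
    seen = set()
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and (r, c) not in seen:
                # Flood-fill the connected bright region starting here.
                stack, pixels = [(r, c)], []
                seen.add((r, c))
                while stack:
                    pr, pc = stack.pop()
                    pixels.append((pr, pc))
                    for nr, nc in ((pr+1, pc), (pr-1, pc), (pr, pc+1), (pr, pc-1)):
                        if (0 <= nr < rows and 0 <= nc < cols
                                and frame[nr][nc] >= threshold
                                and (nr, nc) not in seen):
                            seen.add((nr, nc))
                            stack.append((nr, nc))
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                blobs.append((cy, cx))
    return blobs

# Two "fingers" pressing on a 5x6 sensor image (brightness 0-255):
frame = [
    [0,   0,   0, 0, 0,   0],
    [0, 200, 210, 0, 0,   0],
    [0, 190, 220, 0, 0, 180],
    [0,   0,   0, 0, 0, 175],
    [0,   0,   0, 0, 0,   0],
]
print(find_blobs(frame, threshold=100))  # [(1.5, 1.5), (2.5, 5.0)]
```

Each centroid is what gets handed to applications as a touch point, so the quality of the thresholding and blob grouping largely determines how responsive a multi-touch surface feels.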

May 29, 2008

Umajin Creative - Digital Storytelling for Interactive Whiteboards or Touch Screens - Free Demo Available


Umajin Creative is an application designed for digital storytelling. I haven't had a chance to preview it yet, but I was impressed with the pictures on the website. It looks kid- and teacher-friendly, and I can see that it has potential! If you use this application, please leave a comment.

According to the website, you can "compose multi-page digital documents with rich text, digital photography (including blue screen support), illustrations, sound, video, 3D models, particle fx, interactive functionality, and so much more." It runs on both Mac and PC, and it supports interactive whiteboards and touch-screen PCs, so you can interact directly with the content. With the HP TouchSmart PC you can use real brushes on the screen to experience digital painting with variable-width brush strokes!

Interactive Digital Storybook:

Below: Cool-looking digital brushes.
http://www.umajin.com/slides/album1/images/clip.jpg
http://www.umajin.com/slides/album1/images/3d.jpg
Above- Sample 3D models
Below- Sample of blue-screened cutout images
http://www.umajin.com/slides/album1/images/cutouts.jpg
Below: Samples of animated particle effects
http://www.umajin.com/slides/album1/images/pfx.jpg
Seven Custom Functions:

http://www.umajin.com/slides/album1/images/animations.jpg

I think the people from Fingertapps are responsible for Umajin. Below is a video highlighting multi-touch applications:

May 24, 2008

Game Based Learning: Second European Conference

The 2008 Second European Conference on Game-Based Learning will be held in Barcelona, Spain, October 16-17, hosted by the Universitat Oberta de Catalunya.

Speaker Bios

Conference Program

If you are working in a K-12 setting and interested in sharing your ideas about game-based learning, please leave me a message. I'm especially interested in how interactive games support engaged, meaningful learning.


May 4, 2008

NASA's promotion of MMO games to support STEM learning; Vision-play's SpaceStationSim game; EASe games for children with autism spectrum disorders


NASA is looking for a partner to develop a massively multiplayer online learning game to support education in science, technology, engineering, and mathematics, known as STEM. The following is a quote from NASA's website regarding the advantages of providing learners with MMO games:

"Persistent immersive synthetic environments in the form of massive multiplayer online gaming and social virtual world, initially popularized as gaming and social settings, are now finding growing interest as education and training venues. There is increasing recognition that these synthetic environments can serve as powerful “hands-on” tools for teaching a range of complex subjects. Virtual worlds with scientifically accurate simulations could permit learners to tinker with chemical reactions in living cells, practice operating and repairing expensive equipment, and experience microgravity, making it easier to grasp complex concepts and transfer this understanding quickly to practical problems. MMOs help players develop and exercise a skill set closely matching the thinking, planning, learning, and technical skills increasingly in demand by employers. These skills include strategic thinking, interpretative analysis, problem solving, plan formulation and execution, team-building and cooperation, and adaptation to rapid change."

NASA's Request for Proposals document outlines the specifics for game developers who'd like to partner with NASA on this project. The MMO's target audience is teens in high school and above, with adoption expected at the middle school level. Partners should know how to make the game accessible to people with disabilities.


SPACESTATIONSIM


If you like games about space, or know young people who do, you might be interested in Vision-play's SpaceStationSim, which was developed in collaboration with NASA. For more information, you can visit the Vision-play website, where you can find an on-line manual for the game, screenshots, and a free demo.

EASe GAMES


Vision-Play also created four games for use with children who have autism spectrum disorders, building on the EASe CD series used by some occupational therapists to help with auditory hypersensitivity, hyperacusis, central auditory processing disorders, or sensory integration disorders. The EASe games are "not only fun to play, but stimulate a child’s auditory/vestibular and visual/balance sensory inputs, and help teach them to manage noise and regulate balance." EASe games allow for three speed settings. It is not clear if they are switch-adaptable.

It will be interesting to see how these games play out in school settings!

Mar 5, 2008

Microsoft's Photosynth: Immersive Learning Possibilities

I just came across an interesting article in the on-line MIT Technology Review, written by Jeffrey MacIntyre, about Photosynth, developed by Microsoft's Live Labs:

Microsoft's Shiny New Toy: Photosynth is an application that's still a work in progress. It is dazzling, but what is it for?

I think that the Photosynth application would be very useful in education settings, both K-12 and at the university level.

More classrooms now have interactive whiteboards that connect to the Web, giving educators a broader range of possibilities for creating engaging, immersive learning experiences for their students.

Young people would have the opportunity to experience virtual field trips and collaborate with students around the world.

An application such as Photosynth, coupled with an information/data visualization component, would be quite useful in high school and university classrooms.


From the TechReview Article:


"You are here: Photosynth, an application in development at Microsoft’s Live Labs, offers an immersive way to view photos of a given thing or place. The software has not yet been released, but Microsoft is demonstrating it online with photo collections such as this one of Venice’s St. Mark’s Square." Credit: Courtesy of Microsoft Live Labs

Below are links from the TechReview article:

"Watch Photosynth stitch photos together."

"View images from Photosynth and see how it works."


From Microsoft's Website:
"Photosynth takes a large collection of photos of a place or object, analyzes them for similarities, and displays them in a reconstructed 3-Dimensional space."

"With Photosynth you can:
  • walk or fly through a scene to see photos from any angle;
  • seamlessly zoom in or out of a photograph even if it's gigapixels in size;
  • see where pictures were taken in relation to one another;
  • find similar photos to the one you’re currently viewing;
  • explore a custom tour or see where you’ve been; or
  • send a collection to a friend."
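Conceptually, Photosynth's first step is deciding which photos overlap by finding visual features they share; the 3D reconstruction is then solved from those matches. The sketch below illustrates just that matching idea, with each photo reduced to a hypothetical set of named features; the real system matches thousands of local image descriptors per photo and then computes camera positions, which this toy skips entirely:

```python
# Conceptual sketch of Photosynth's first stage: link photos that appear
# to overlap, judged by how many visual features they share. Feature IDs
# here are made-up labels standing in for local image descriptors.

def overlap_graph(photos, min_shared=2):
    """Return pairs of photo names sharing at least min_shared features."""
    names = sorted(photos)
    links = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if len(photos[a] & photos[b]) >= min_shared:
                links.append((a, b))
    return links

# Hypothetical feature sets for four snapshots of a plaza:
photos = {
    "north.jpg": {"arch", "clock", "dome"},
    "east.jpg":  {"clock", "dome", "fountain"},
    "south.jpg": {"fountain", "steps", "statue"},
    "far.jpg":   {"mountain"},  # shares nothing; stays unlinked
}
print(overlap_graph(photos))  # [('east.jpg', 'north.jpg')]
```

The resulting links are what make the "walk or fly through a scene" experience possible: photos that share features can be positioned next to each other in the reconstructed space.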

If any readers have thoughts about the use of Photosynth in educational settings or situations, please leave a comment!

Microsoft's Photosynth website

Jan 31, 2008

Gigapan: Good for interactive whiteboards and large touch-screen displays

Gigapan, a web-based panorama-sharing website, offers a range of visual resources that are ideal for interactive whiteboards and touch-screen displays.

I'll update this post with video, pics, and a review soon!