
Dec 3, 2010

More gesture and multi-touch interaction! Windows 7 Navigation with Kinect; Product browser by Immersive Labs

Here are a couple of new natural user interface videos.  The first video, by Evoluce, demonstrates gesture interaction/navigation in Windows 7 applications supported by Kinect. The second video, by Immersive Labs, shows multi-touch product browsing interaction on a large display.

Kinect Treatment of Windows 7, by Evoluce

Evoluce: Leading Surface Technologies


Immersive Labs - Multi-touch Product Browser

Immersive Labs

Feb 3, 2010

Touchco, Ken Perlin & Ilya Rosenberg's Multitouch Company, Bought By Amazon

Here is the scoop:
Amazon buys touchscreen startup Touchco, merging with Kindle division
Ross Miller, Engadget, 2/3/2010

Amazon Said to Buy Touch Start-Up
Nick Bilton and Brad Stone, New York Times, 2/3/2010


Is Amazon Building a SuperKindle?
Nick Bilton, NYT, 2/3/2010
"Unlike traditional capacitive sensors, our patent-pending system can detect any object — not just a finger — and can determine how much pressure is being applied to every point on a sensor simultaneously. IFSR sensors are natively multitouch, use less power than capacitive sensors, and are much less expensive to produce, making them a highly disruptive technology with widespread market applications."

RELATED
Here is my Aug. 29 post:
Natural User Interface Surfaces:  TouchCo - IFSR Technology company, founded by NYU Media Research Lab's Ken Perlin and Ilya Rosenberg


TOUCHCO GUI:

Touchco GUI example from Nick Bilton on Vimeo.

A Touch of Ingenuity:  Inexpensive pressure-sensitive pad could make surfaces smarter
Kate Greene, MIT Technology Review, Sept-Oct


Multitouch Screens Could Enliven New Devices
Nick Bilton, NYT, 12/20/09


TouchCo was an outgrowth of the UnMousePad research at NYU.


Photo from the UnMousePad website:



Photo from NYT:
clear touch technology
Photo from NYT:
eink screens
Photo from NYT:
flexible touchco display


Ken Perlin

Nov 21, 2009

Want to make some multi-touch? Try PyMT- Python Multitouch. Featured in Make. (via Sharath Patali)

Sharath Patali, a member of the NUI-Group, has been working with Python Multitouch, otherwise known as PyMT, to create multi-touch applications.  He shared a link to a recent post in Make, featuring PyMT.  Sharath is the author of the UI Addict blog, and is currently doing his internship at NUITEQ (Natural User Interface Technologies).

I've been told that the beauty of PyMT is that it makes it "easy" to create multi-touch prototype applications using very few lines of code, which is great for trying out different ideas in a short period of time.  It helps if you already know Python!


PyMT - A post-WIMP Multi-Touch UI Toolkit from Thomas Hansen on Vimeo.

"PyMT is a python module for developing multi-touch enabled media rich applications. Currently the aim is to allow for quick and easy interaction design and rapid prototype development. PyMT is written in Python, based on pyglet toolkit."


PyMT Programming Guide


PyMT Website

Note: 
Christopher, author of The Space Station blog, is a member of the NUI-Group, and is building his own multi-touch table running his PyMT-based applications. Christopher is a student in Koblenz, Germany, studying computational visualistics, known as information visualization in the US.

Oct 10, 2009

News from NUITEQ and 3M

Various multi-touch and natural-user interface/interaction companies have been collaborating, and results are promising. 

Here is the result of a collaboration between 3M and NUITEQ, the company based in Sweden:

3M Multi-Touch & NUITEQ Suite Software Video


In this video, you'll see SnowFlake software used in a variety of contexts, such as education, architecture, health, financial consultation, product browsing, games, playing with special effects, and more.


What a difference since 2007, when I shared what I was doing in my graduate classes on this blog!

Aug 28, 2009

Ron George's Interaction Design Toolbox

Ron George, an interaction designer, recently posted a comment on my blog. So I visited his blog and found that he has a great set of resources that many of my readers would appreciate!

For starters, take a look at Ron's Interaction Design Toolbox page, and then spend some time browsing around the rest of his site/blog.

Ron works at Microsoft on a team that is exploring Natural User Interface/Interaction (NUI). He previously was on the Surface team.

Jun 6, 2009

Interactive Touch Tables are Multiplying! Ideum's new 100" multi-touch museum display; Ubisoft's Ruse on a Surface....

http://www.ideum.com/blog/wp-content/uploads/100-multitouch-table-cropped-1024x724.jpg
Photo from the Museum Exhibit and Design News/Ideum Blog

The table above, which measures 100" and was created for the Sudekum Planetarium at Nashville's Adventure Science Center, is based on Ideum's mt2 table: "Built for museum environments, the high-resolution mt2 Table supports intuitive gestures, allowing visitors to interact and explore custom exhibits. With multiuser support, mt2 Table serves as digital gathering place, allowing visitors to interact with museum content and each other."

Two computers power this table. Natural User Interface's Snowflake handles the optical support and tracking, and the second computer runs the EM spectrum application, which is developed in Adobe Flash. It is a work in progress. For more information, visit the Ideum blog.

Here is a video trailer of Ubisoft's R.U.S.E. played on a multi-touch table. Much of the trailer appears to be a simulation. I liked the part where the battleships are traveling over the water. According to a blog post, the table depicted in the trailer is the Intuiface, by Intuilab.

Photos of Ubisoft's R.U.S.E. game, running on a multi-touch table:

http://www.blogcdn.com/www.joystiq.com/media/2009/06/rusesurfacea.jpg

http://static2.cdn.ubi.com/emea/gamesites/ruse/community/uk/RUSEBLOG.005.jpg
Photo from the Ubisoft Blog

Intuiface Information PDF
Intuilabs Website

If you visit the Intuilabs website, be sure you interact with the photos of the team! I couldn't get the pictures to change when I touched the screen of my HP TouchSmart PC, so I think they are activated by a mouse-over. It would be nice if they could touch-enable their website.

Apr 22, 2009

Dell Studio One 19 Touch Zone App by Fingertapps: The Video

Here is the video demonstration of Dell Studio One 19 Touch Zone, developed by Fingertapps, a New Zealand software company:



The Dell Studio One with Fingertapps' multi-touch natural user interface software is due for launch soon, according to Ben Wilde and Dave Brebner, of Fingertapps. Here is a link to a recent Engadget article by Paul Miller: Dell demos multitouch on the Studio One 19 (with additional videos)

http://www.fingertapps.com/fingertapps-brand_linear.png

Apr 21, 2009

NUI-Group's Christian Moore Interview-Podcast on the Interactive Display 2009 Conference website

As I write this post, the 2009 Interactive Displays conference is underway in San Jose, California. The home page of the conference website provides a series of interviews of many of the conference participants and presenters, including Chris Moore, a member of the NUI group. Chris discusses the importance of the open-source NUI (natural user interface) community in moving interactive display and NUI-based technology forward.

Interactive Displays 2009 Conference Website

Interview Participants:
Mats Johansson - EON Reality
Guillaume Largillier - Stantum
Christian Moore - NUI Group
Brent Bushnell - Tapcode
Steven Bathiche - Microsoft
Jerry Bertrand - Microscent
Henry Kaufman - Tactable


If you are at the conference and you have some video clips to share of a presentation, demonstration, or anything else that might be of interest, please give me the link so I can share it on my blog.

Below are some pictures from the companies represented by the people who were interviewed for the Interactive Displays 2009 conference:

http://notes2self.net/mob_img/EonTouchlight.jpg
Eon Reality

http://www.tactable.com/images/homeimage1.jpg
Tactable

http://www.maximumpc.com/files/u46168/microsoft-surface.jpg
Microsoft's Surface

http://www.nuicat.com/tracker.jpg
NUI-Group

http://media.tapcode.net/videos/website/products/thumbs/hardware180.png
Tapcode


I don't usually put company logos on my blog, but if you are visiting my blog searching for links to companies that are involved with interactive display technologies, this will save you some time. (I received no payment for this gesture.)

https://www.regonline.co.uk/custImages/249955/3M-logoID.gifhttps://www.regonline.co.uk/custImages/249955/logo_NUIEU.gifhttps://www.regonline.co.uk/custImages/249955/Tyco-ElectronicsElo-TouchSy.png
https://www.regonline.co.uk/custImages/249955/OsramLogo.gifhttps://www.regonline.co.uk/custImages/249955/Touch-International.png
https://www.regonline.co.uk/custImages/249955/Kyo2.jpghttps://www.regonline.co.uk/custImages/249955/NWID.jpghttps://www.regonline.co.uk/custImages/249955/RPO2.png


Apr 17, 2009

Pervasive Checkers on Microsoft's Surface: The Gamepack Video

In early 2007, before we knew of the existence of Microsoft's multi-touch surface, I worked on a "Pervasive Checkers" project with Johnny Hopkins, a classmate in my Ubiquitous Computing class. I'd previously worked with XNA Game Studio Express in an AI for Games class, and thought that it would be cool to make a casual checkers game on a multi-touch table that could be played in gathering spots such as coffee houses and neighborhood cafes.

Below is a screen shot of what we created using Inspiration software (in the application, you can click on an item and it expands to reveal additional information):



Two years later, and the Pervasive Checkers idea is reality - but I wasn't involved in the process. Checkers is one of the games that is included in a game-pack created specifically for the Surface.

Take a look at the video:



From Surface Computer News:

"The Microsoft Surface Games Pack is a clear illustration of where the Natural User Interface of Windows 7 has the potential to take games. Windows is traditionally the number 1 gaming platform around the world. With the introduction of the NUI, allowing players to literally have titles at their fingertips via touch, Windows 7 can kick the gaming experience up a notch. Provided that developers rise up to the challenge."

Oct 13, 2008

New multi-touch "Gravitoy" application from NUI; Multi-touch Space Invaders from Barcelona; "Open source" touch-less multi-touch SDK on CodePlex

NUI (Natural User Interface) recently unveiled Gravitoy, a multi-touch physics application that offers two modes of interaction. Below is a picture of children engaged in collaborative play on an interactive table running Gravitoy in the Sandbox mode. The children can draw objects that interact using principles of physics. (Photos are from NUI co-founder Harry van der Veen's blog)

photo

Below are table-shots of the Sandbox mode:
photo

photo
See the triangle?

The second mode of Gravitoy is "Stacks". Two players play a game of stacking objects of various sizes. The difficulty level can be adjusted by changing gravity options.

photo

This would make science lab engaging for students!

In addition, NUI developed a 3d model viewer application, which looks like it would be fun for budding architects and designers.

Visit the NUI website to learn more about the company and NUI's partners.

Take a look at Multi-touch Space Invaders from Multi-touch Barcelona. It looks like a fun game to play during a rainy recess:

Multitouch Space Invaders! from Multitouch Barcelona (NUI-Group members) on Vimeo.

If you are interested in learning more about developing multi-touch applications and systems, visit the NUI Group website for opensource resources, code, and "how-to" support forums.

Video clips and links related to the work of NUI-Group members, such as Christian Moore, Harry van der Veen, and Seth Sandler, can be found on a previous post:
More Multi-touch from Members of NUI Group!




Touchless
Multi-touch...

I recently learned of another "open-source" touch-less "multi-touch" resource. The Touchless SDK was developed for a college project by a Columbia University student, Mike Wasserman.
You will need Visual Studio 2008, the code, and a webcam to get started. From what I can tell, the code will work within the XNA development environment.

The Touchless SDK website has links to a video demonstration, demo code, FAQs, and more. The discussion list has a lot of good information.
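The SDK itself is C# and webcam-based, but the core trick - tracking a colored marker from frame to frame and treating it as a "touch" point - is easy to sketch. Below is my own minimal, library-free Python illustration of the idea, not the Touchless API: find the centroid of the pixels that are close to the marker's color, and use that point as the cursor position.

```python
def track_marker(frame, target, tolerance=30):
    """Find the centroid of the pixels whose color is within `tolerance`
    of `target` on each RGB channel; return None if nothing matches.
    `frame` is a list of rows, each row a list of (r, g, b) tuples."""
    tr, tg, tb = target
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if (abs(r - tr) <= tolerance and
                    abs(g - tg) <= tolerance and
                    abs(b - tb) <= tolerance):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# A webcam frame mocked up as a 3x3 image with one bright-green "marker" pixel.
frame = [[(0, 0, 0)] * 3 for _ in range(3)]
frame[1][2] = (0, 255, 0)
cursor = track_marker(frame, target=(0, 255, 0))  # (2.0, 1.0)
```

Run this over every webcam frame and you get a stream of marker positions; track several differently colored markers at once and you have "multi-touch" without a touch screen, which is essentially what the demos show.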

Mike Wasserman is now working at Microsoft Labs.

Feb 13, 2011

Wii Just Dance2 and Kinect Dance Central: UI and Usability Approaches; Challenges for Developing Accessible Games (revised)

I love to dance- I studied dance through college, and off and on as an adult.   I have a DDR (Dance Dance Revolution) game-floor pad somewhere in my attic gathering dust.  I'm ready for new challenges.


I'm planning on buying a couple of new dance games for the Wii and the Kinect. There is more to this story: given my interest in off-the-desktop, post-WIMP HCI (human-computer interaction), interactive multimedia and games, and my career as a school psychologist dedicated to young people with disabilities, I'm excited to see where new technologies, interfaces, and interactions will take us.


So what do the wise men of usability have to say about new ways of interacting with games and other applications?


"Kinect has many great design elements that clearly show that the team (a) knows usability, (b) did user testing, and (c) had management support to prioritize usability improvements, even when they required extra development work." -Jakob Nielsen


Jakob Nielsen, one of the godfathers of usability,  shared a few words of wisdom about the Kinect in his 12/27/10 Alertbox post: Kinect Gestural UI: First Impressions.  Although he did not review Dance Central, he concludes that the game he reviewed, Kinect Adventures, was fun to play, despite usability problems.


If this is a topic that interests you, I recommend you read Nielsen's post and take a look at the usability issues outlined in it.  Also take a look at a recent essay Nielsen co-authored with Don Norman, another godfather of usability: Gestural Interfaces: A Step Backwards In Usability


Why is this topic important to me?
I have been involved in the Games for Health and Game Accessibility movement for many years.  Lately I've been exploring the OpenKinect project with an aim to create ways of making movement-oriented games accessible for young people with more complex disabilities.  For example, there is a need to have dance and movement games modified for students (and adults!) who need wheelchairs or walkers.  There are students who have milder mobility challenges who love to dance, and the current games don't address their needs.  Some of my students have vision or hearing impairments, too.  They deserve a chance to play things designed for the Kinect.

"OpenKinect is an open community of people interested in making use of the amazing Xbox Kinect hardware with our PCs and other devices. We are working on free, open source libraries that will enable the Kinect to be used with Windows, Linux, and Mac."

Note:  I currently work as a school psychologist with students up to age 22. My main office is adjacent to a large OT and PT room at Wolfe, a program for students who have special needs.   We just had a large interactive whiteboard installed in the room that is begging for us to connect it with the school's Wii,  and soon (we hope), a Kinect.   If we are going to use dance games to help promote healthy activities among our special students, the games need to be accessible for students with cognitive, motor, and other limitations.

FIRST STEPS
Although I can dance, I understand what the world is like through the eyes of many of the young people I work with, who have motor coordination and sensory integration problems that interfere with their ability to move and dance, let alone access fast-paced dance games on the Wii or Kinect.


My initial plan is to look at what the new dance games might be like from the view of someone who doesn't know how to dance and admits to having "two left feet" - and perhaps no sense of rhythm.  Where would I start?


Wii's Just Dance2 seems to offer some support for learning how to dance through the use of simple movement icons, in the form of outlined figures, that provide information about how to move with the dancer on the screen. As you can see from the video below, the gamer is provided with information about upcoming moves throughout the game.


I decided to take a look at Just Dance2's  MIKA "Big Girl" (You Are Beautiful) because some of the adolescent females I work with have weight concerns that interfere with their health. During the teen years, this can become a vicious cycle, resulting  in less movement, and less participation with peers in physical activities, such as playing dance games.  If a teen has depression as part of this mix, we know that exercise can help,  and a fun dance game might be a life-saver, in more ways than one.




The screen shots below show how the movement icons are used in the game:








I thought it would be useful to learn more about the story behind the making of JustDance2.
At 2:22, Alexia, the project's usability expert, makes her presence known. From what I can tell, she focused on aspects of the game that would make it more usable for non-dancers, including those with "two left feet".  (I don't know if anyone was consulted about accessibility concerns for the game.)


Kinect Dance Central
Dance Central uses a different approach when it comes to "teaching" people how to dance along through the game. It would be interesting to test out Dance Central and JustDance 2 with the same set of people to get a better feel for what works and what doesn't.  Below is a video that previews, in split-screen, the interaction that takes place in Dance Central:

Dance Central Full Motion Preview


In Dance Central, gamers are provided with information about the moves through icons that cycle up the right hand side of the screen.  The level of dance-coordination to keep up with the moves is challenging at times, even for people who are OK at dancing.  Players can select dances according to level of difficulty. 


Kinect Usability with Regular People

Steve Cable (CX Partners) shared his team's look at usability issues related to the Kinect by testing several games, including Dance Central, with groups of people in his article, "Designing for XBox Kinect - a usability study".  The quote below is from Steve's article:

"We’ve loved playing with the Kinect. There’s no doubt that the game play is lots of fun. In-game menus are a barrier to that fun. Kinect should allow players to move through menus quickly and compensate for inaccuracy.

We felt the Kinect would benefit from some standardised global controls – much like a controller uses the A button to select and the B button to move backwards. We also think it needs a more responsive pause gesture – one that doesn’t interfere with the user’s game play.

Most of our participants found the Dance Central menu to be more effective, more efficient and more satisfying to use. Here are our recommendations for designing a Kinect menu interface:
  1. Allow users to make selections through positive gestures, rather than timed positions
  2. Place options on a single axis to make them easier and quicker to select
  3. Allow users to control menus with the game pad if they prefer
  4. Use large easy to read text
  5. Don’t make users scroll through options unnecessarily – it takes too long
  6. Users will be distracted if used in a social setting – test your menus in a social context to see if they are prone to errors
  7. Avoid the cursor metaphor, it’s not what gamers are used to seeing in game menus, and makes it harder to implement alternative joypad controls"

Below are screen shots that provide examples of how the movement icons are displayed in Dance Central:







RELATED
Just Dance 2 Review: Get your body moving. No, really. Give it a shot.
Keza MacDonald, 10/19/10







Nov 11, 2010

Interactive Touch-Screen Technology, Participatory Design, and "Getting It", Revisited

I've been planning on updating one of my popular posts, "Interactive Touch-Screen Technology, Participatory Design, and Getting It" for a while. 


Here is a compromise - since much still rings true two years since I wrote it, the bulk of this post remains the same.   I've updated a few sections with additional video examples of interactive touch-screen applications, good and bad, along with a few links and resources, located at the bottom of the post. 


(The missing piece of information?  An update about apps for the iPad and similar touch-screen tablets.)

Sit back and enjoy!


http://www.ehomeupgrade.com/wordpress/wp-content/uploads/hp_touchsmart_pc.jpghttp://www.wired.com/images/article/full/2008/08/han_interview_630px.jpg
-Images: HP; Wired

There's been some discussion over the reasons why so many people don't understand touch screen, or "surface" computing, even though research in this area has been going on for years.

As the new owner of the HP TouchSmart, I know that I get it.

The research I've conducted in this area suggests that people will "get-it" only if there is a strong commitment to develop touch-screen "surface" applications through a user-centered, participatory design process. In my view, this should incorporate principles of ethnography, and ensure that usability studies are conducted outside of the lab.


This approach was taken with Intel's Classmate PC. Intel has about 40 ethnographic researchers, and sent many of them to work with students and teachers in classrooms around the world. (A video regarding ethnographic research and the Intel Classmate project can be found near the end of this post.)

http://download.intel.com/pressroom/kits/events/idffall_2008/images/Picture007.jpghttp://www.classmatepc.com/images/advocateImage.jpg
-Images: ClassmatePC


Where to start?
K-12 classrooms and media centers. Public libraries. Malls. Hospital lobbies and doctor's offices. Any waiting room. Staff lounges in medical centers, schools, and universities. Community festivities and events. Movie theater lobbies. Museums and other points of interests.


I believe we need to take a "touching is believing" approach.

Here are some thoughts:
When I try to explain my fascination with developing touch-screen interactive multimedia applications (interactive whiteboards, multi-touch displays and tables, and the like), many of my friends' and family members' eyes glaze over. This is particularly true for people I know who are forty-ish or over.

Even if you are younger, if you never saw the cool technology demonstrated in the movie Minority Report, or if you have limited experience with video games, or if you haven't come within touching distance of an interactive whiteboard, the concept might be difficult to understand.


The reality?
Even people who have the opportunity to use surface computing technology on large screens do not take full advantage of it. Multi-touch screens are often used as single-touch screens, and interactive whiteboards in classrooms often serve as expensive projector screens for teacher-controlled PowerPoint presentations.


Most importantly, there are few software developers who understand the surface computing approach, even with the popularity of the iPhone and iPod Touch. Most focus on traditional business-oriented or marketing applications, and have difficulty envisioning scenarios for surface computing.  There is a need for a breath of fresh air!

Another factor is that not all people entrusted to market surface or touch screen computing fully understand it.
http://blogs.msdn.com/blogfiles/healthblog/WindowsLiveWriter/MicrosoftHUGWishyouwerehereDay2_82D3/IMG_0550_thumb.jpg
Despite a cool website showing off the goods, Microsoft's Surface multi-touch table has been slow to take off, dimming hopes of bringing the price down to a level most families or schools could afford. (The picture above depicts an application for the Surface designed for health care professionals, not K-12 science education.)

Although you can't buy a Surface table for your family room, it is possible to buy a TouchSmart.  
HP's TouchSmart website is engaging and highlights some examples of touch-screen interaction, but most people don't seem to know about it. (Since this post was last written, there are many more touch and multi-touch options available to the public, such as the  Dell, the iPad, etc.)

Unfortunately, you wouldn't have a clue that the HP TouchSmart exists while browsing the aisles at Circuit City or Best Buy!

When I was shopping for my new TouchSmart, I noticed that from a distance, the TouchSmart looked just like the other larger flat-screen monitors filling up the aisles. The salespeople at both stores were not well-informed about the system. The only reason I knew about the new TouchSmart was my obsession with interactive multimedia touch-screen applications - designing them, developing them, studying them, reading about them, blogging about them.... ; }

More thoughts:

After studying HCI (Human-Computer Interaction), and relating this knowledge to what I know as a psychologist, my hunch is that the "Window Icon Mouse Pointing-device" (WIMP) and keyboard input mind-set is embedded in our brains, to a certain extent. Like driving a car, it is something automatic and expected. This is true for users AND developers. (Update - See The Post WIMP Explorers' Club: Update of the Updates for a review of a discussion among passionate post WIMP folks)

Think about it.

Suppose one day, you were told that you were no longer allowed to control your car by turning on the ignition, steering the wheel, or using your feet to accelerate, slow down, or stop the car! Instead, you needed to learn a new navigation, integration, and control system that involved waving your hands about and perhaps speaking a few commands.

For new drivers who'd never seen a car before, this new system would be user-friendly and intuitive. Perhaps it would be quite easy for 16-year-old kids to wrap their heads around this concept. For most of us, no. Imagine the disasters we would see on our streets and highways!

When we think about how newer technologies are introduced to people, we should keep this in mind.

In my mind, spreading the word about surface computing is not an "if you build it, they will come" phenomenon, like the iPhone. We can't ignore the broader picture.

From my middle-age woman's vantage point, I believe that it is important that those involved with studying, developing, or marketing surface computing applications realize that many of us simply have no point of reference other than our experiences with ATMs, airline kiosks, supermarket self-serve lanes, and the like. (The video clip at the very end of this post provides a good example of touch-screen technology gone wrong.) -UPDATE: additional videos were added to this post.

Be aware that there are substantial numbers of people who might benefit from surface computing who prefer to avoid the ATMs, airline kiosks, and self-serve grocery shopping.

Realize that the collective experience with technology, in many cases, has not been too pretty. Many people have had such user-unfriendly experiences with productivity applications, forced upon them by their employers, that any interest or desire to explore emerging technologies has been zapped.

My own exposure to interactive "surface" related technology was somewhat accidental.
A few years ago, a huge box was deposited into the room I worked in a couple of days a week as a school psychologist at a middle school. After a week or so, I became curious, and found out that it was a SmartBoard. Until then (2002!), I did not know that interactive whiteboards existed.

The box remained unopened in the room for the entire school year, but no worry. I played with the only other SmartBoard in the school, and found a couple at the high school where I also worked. I hunted for all of the applications and interactive websites that I could find, and tried them out. That is when I was hooked. I could see all kinds of possibilities for interactive, engaging subject area learning activities. I could see the SmartBoard's potential for music and art classes. With my own eyes, I saw how the SmartBoard engaged students with special needs in counseling activities. (By the way, if you are working with middle school students, PBS Kids' It's My Life website activities work great on an interactive whiteboard.)

A few years have passed, and reflecting on all of my fun experiences with interactive whiteboards, with and without students, I now understand that many teachers still have had limited exposure to this technology.

This school year, many teachers are finding themselves teaching in classrooms recently outfitted with interactive whiteboards, scrambling along with educational technology staff development specialists, to figure out how it works best with various groups of students, and what sort of changes need to be made regarding instructional practice.


For the very first time, interactive whiteboards were installed in two classrooms at one of the schools I work at. One of the teachers I know thanked me for telling her about interactive whiteboards and sharing my resources and links.

If I hadn't let her know about this technology, she wouldn't have volunteered to have one installed in her classroom. It has transformed the way she teaches special needs students.

In the few months that she's used the whiteboard, I can see how much it has transformed the way the students learn. They are attentive, more communicative, and engaged. The students don't spend the whole day with the whiteboard - the interactive learning activities are woven into lessons at various times of the day, representing true technology integration.

Now let's see what happens when all-in-one touch-screen PC's are unleashed in our schools!

UPDATE:  Take a look at a post I wrote for Innovative Interactivity just after SMART Technologies acquired NextWindow - the post describes in detail how interactive whiteboards are transforming learning and teaching in a program for students with special needs.
SMARTTechnologies Acquires NextWindow: A "smart window" to the world


There are some interesting changes going on at the intersection of HCI and educational technology research.  I participated in a workshop at CHI 2010 last April and was impressed by what is going on in this area, around the globe:   Next Generation of HCI and Education

Value of ethnographic research:
Ethnographic Research Informed Intel's Classmate PC
"Intel looked closely at how students collaborate and move around in classroom environments. The new tablet feature was implemented so that the device would be more conducive to what Intel calls “micromobility”. Intel wants students to be able to carry around Classmate PCs in much the same way that they currently carry around paper and pencil." -via Putting People First and Ars Technica

The video below is from Intel's YouTube Channel. Information about Intel's approach to ethnographic research in classrooms during the development of the Classroom PC is highlighted. This approach uses participatory design and allows the set of applications developed for the Classmate PC to reflect the needs of local students and teachers. Schools from many different countries were included in this study.




FYI: TOUCH SCREEN DISPLAYS:  NEED FOR IMPROVEMENT!

Touch Screen Coke Machine at the Mall: 90 Seconds to get a Coke


User-Unfriendly Interactive Display in the Rain (Ballantyne Village Shopping Center)

User-Unfriendly Information Kiosk Interactive Map
I encountered this puzzling and frustrating interactive directory/map at the Cleveland Clinic.  When I went to visit a relative at the hospital a year or so later, the map was no longer there.


BETTER EXAMPLES OF INTERACTIVE SCREENS:
Here are some interesting pictures from lm3labs, which are in my interactive usability hall of fame:

http://catchyoo.typepad.com/photos/uncategorized/2008/06/30/4654.jpghttp://farm3.static.flickr.com/2172/2233673451_6a48db8bff.jpg?v=0


Samsung's new Omnia SDG i900 was re-created in a much larger size, using lm3labs' Ubiq'window touchless technology. For more about lm3labs, including several video clips, take a look at one of my previous posts: Lm3Labs, Nicolas Leoillot, and Multimedia Interaction

Midwife Toad App on a Microsoft Surface, Discovery Place Science Center


TellTable:  Digital Storytelling on the Surface:  Microsoft Research, UK


DECEMBER 31, 2009 -Interactive Soda Machine for Fun

The interactive screen on the Coke machine attracted the attention of this young child. He loved spinning the image of the bottle. So did the dad! He said, "I'd like something like this for my home!". I told him about the HP TouchSmart - neither the dad nor the mom knew that there were affordable all-in-one touch screens available, but they knew about SMARTboards, because their children's classrooms had them.  Note:  No one from this family actually purchased a soft drink.  I was hoping to time how long it would take them to do so!



Some resources:
lm3labs (catchyoo, ubiq'window)
NUITEQ
NUI Group (See members' links)
Sparkon (See members' links and multi-touch projects)

(More information and resources can be found by doing a "multitouch" or related search on this blog or The World Is My Interactive Interface.)

If you have plenty of time, take a look at my Post WIMP Explorers' Club YouTube playlist.
"Natural user interfaces, gesture interaction, multi-touch, natural interaction, post WIMP examples and more..."

FYI: I visited the Ballantyne Village shopping center a couple of months ago to follow up on the interactive displays, including the one I tried to use while it was raining.   The shopping center changed ownership, and the displays were replaced by the old-fashioned kind, pictured below: