
Oct 12, 2011

RENCI Update: Combining Gaming and Visualization Technologies to Support Efficient and Effective Decision-Making

        RENCI stands for the Renaissance Computing Institute.  It is a multidisciplinary collaboration between UNC-Chapel Hill, Duke University, and North Carolina State University, with engagement sites at UNC Asheville, Duke University, East Carolina University, North Carolina State University, UNC Chapel Hill, UNC Charlotte, and the UNC Coastal Studies Institute.  
        According to the mission statement, "RENCI develops and deploys advanced technologies to enable research discoveries and practical innovations."  Much of the work of RENCI focuses on large-scale information and data visualization.
        Why is this important?  It provides an effective visual-cognitive means of understanding complex data from a variety of disciplines, and also supports the collaboration of researchers across a variety of disciplines.   It has the potential to support larger-scale decision-making and problem-solving in our technology-dependent, interconnected world.  

Take a few minutes and explore what I've posted below:
       
Press release from RENCI about the interactive dome, pictured below (photo credit: RENCI-UNC Asheville):  "To understand human-induced global changes, there's no place like dome," Nancy Foltz, 10/12/11

RENCI: Gaming the Future
        The video below provides an overview of how innovative interactive visualization tools support decision-making across many disciplines.

RENCI: Unity 3D game engine to support immersive information visualization applications:

RENCI Situation Room Multi-touch Table, UNC-Charlotte:





RELATED
RENCI pioneering the visualization industry with innovative interfaces
Tracy Boyer Clark, Innovative Interactivity (II), 2/8/10
RENCI Visualization Center Update
Lynn Marentette, Interactive Multimedia Technology, 2/9/10
RENCI Tutorial: "Beautiful Code, Compelling Evidence: Functional Programming for Information Visualization and Visual Analytics" (pdf)  J.R. Heard
RENCI: Data to Decisions
Recent Publications from RENCI:
Y. Xin, I. Baldine, A. Mandal, C. Heermann, J. Chase, and A. Yumerefendi. “Embedding Virtual Topologies in Networked Clouds.” The 6th International Conference on Future Internet Technologies (CFI), Seoul, Korea, June 2011 
Y. Xin, I. Baldine, J. Chase, T. Beyene, B. Parkhurst, and A. Chakrabortty. “Virtual Smart Grid Architecture and Control Framework.” 2nd IEEE International Conference on Smart Grid Communications (IEEE SmartGridComm), Brussels, Belgium, Oct. 2011 
X. Ju, H. Zhang, W. Zeng, M. Sridharan, J. Li, A. Arora, R. Ramnath, Y. Xin. “LENS: Resource Specification for Wireless Sensor Network Experimentation Infrastructures.” The 6th International Workshop on Wireless Network Testbeds, Experimental Evaluation and Characterization (WinTECH), Las Vegas, Nevada, Sep. 2011
RENCI's Facebook Page
Twitter: @RENCI

Jul 6, 2011

Revisiting CHI 2011: Videos of Interactive Touch, Gesture, Large Surface, and Mobile Apps with Potential for Use in Education (CHI = Computer Human Interaction)

One of my interests is how the power and potential of post-WIMP interactive technologies can be harnessed for formal and informal education, including life-long collaborative learning.  


In May, I had a chance to meet with a number of like-minded people during the CHI 2011 conference at the 2nd Workshop on UI Technologies and Impact on Educational Pedagogy.  I was impressed with the depth and breadth of the presentations at the workshop.   Since then, I've been looking through other papers and videos from CHI 2011 to find interesting applications that hold potential for use in educational settings.  


I've come across a good number of interesting applications and prototypes, so be sure to check back for future posts on this topic.  For now, here are a few applications that I'd like to share.  


Below are a few videos from Phillip Chi-Wing Fu.  (He doesn't know it yet, but I've admitted his videos into the Post-WIMP Explorers' Club.)


Interactive Multi-touch Sketching Interface for Diffusion Curves

"A novel multi-touch sketching interface enabling interactive and practical design with 2D diffusion curves is proposed; featured interaction techniques include simultaneous sketching of multiple diffusion curves and at-the-spot colors tuning."


Distinguishing Multiple Smart-Phone Interactions on a Multi-touch Wall Display using Tilt Correlation

"This paper proposes a novel matching technique, called tilt correlation, which employs the built-in tilt sensor on smart-phones to identify their concurrent contacts on a common multi-touch wall display."


WYSIWYF: Exploring and Annotating Volume Data with a Tangible Handheld Device (CHI 2011)


"Integration of a multi-touch wall display with a tangible handheld device with multi-touch and tilt sensing capabilities to provide intuitive what-you-see-is-what-you-feel visual exploration and annotation of volume data."


The following videos were uploaded by alucero:


Pass-Them-Around: Collaborative Use of Mobile Phones for Photo Sharing (CHI 2011)

"Pass-Them-Around is a phone-based application that allows a small group of collocated people to share photos using the metaphor of passing paper photos around. The prototype encourages people to share their devices and use them interchangeably while discussing photos face-to-face. The prototype supports ad-hoc photo sharing in different contexts by taking into account the spatial arrangement of users around a table, measured with sensors embedded in their mobile phones."


The next video was part of MobileHCI '10:
MindMap: Collaborative Use of Mobile Phones for Brainstorming


Dec 12, 2010

LM3LAB's Useful Map of Interactive Gesture-Based Technologies: Tracking fingers, bodies, faces, images, movement, motion, gestures - and more

Nicolas Loeillot, of LM3LABS, has been ahead of the natural user interaction/interface game for many years as his company has expanded. He's done quite a bit of deep thinking about the work of his company, and has used this wisdom to create a nice concept map that describes how LM3LABS' solutions fit into the world of gesture-based control and interaction:




In my opinion, this chart would make a great template for mapping out other natural interaction applications and products!


Here is the description of the concepts outlined in the chart:


"If all of them belong to the “gesture control” world, the best segmentation is made from 4 categories:
  • Finger tracking: precise finger tracking, it can be single touch or multi-touch (this latest not always being a plus). Finger tracking also encompasses hand tracking which comes, for LM3LABS products, as a gestures.
  • Body tracking: using one’s body as a pointing device. Body tracking can be associated to “passive” interactivity (users are engaged without their decision to be) or “active” interactivity like 3D Feel where “players” use their body to interact with content.
  • Face tracking: using user face as a pointing device. It can be mono user or multiple users. Face tracking is a “passive” interactivity tool for engaging user in an interactive relationship with digital content.
  • Image Tracking: Augmented Reality (AR) lets users use images (flyers, real products, t-shirts, faces,…) to interact with digital content. AR can be markerless or marker-based. Markerless technology has advantages but marker-based AR is easier for users to understand. (Please note here that Markerless AR is made in close collaboration with AR leader Total Immersion)."  -LM3LABS
   If you are interested in this subject and want to view some good examples of off-the-desktop interfaces and interactions, take a look at the LM3LABS blog, as well as Nicolas Loeillot's Vimeo channel.  Also take a look at the sample of posts I've written about LM3LABS over the last few years - the links are at the end of this post.

I love LM3LABS' Interactive Balloon:

Interactive balloons from Nicolas Loeillot on Vimeo.


Interactive Balloons v lm3 labs v2 (SlideShare)



Background
I first discovered LM3LABS when I was taking a VR class and researching interactive, immersive large displays in 2005 or 2006.  Back then, there wasn't much information about this sort of technology.  A lot has changed since then!


I've learned quite a bit from watching LM3LABS (and others) grow, given my passion for postWIMP interactive technology and my commitment to blogging about this subject.   Nicolas has really worked hard in this arena.  As early as 2005, LM3LABS was working with Scala to provide "smart" interactive displays, and his company's applications have been supported by computer vision technologies for many years, allowing for gesture-based, or "touch-less" interaction, as demonstrated by the Catchyoo Interactive Table.  This application caught my eye back in early 2007, when I was working on projects for large interactive displays for my HCI and Ubicomp classes, and was thinking about creating a table-top application.


My hunch is that LM3LABS has set the foundation for further growth in the future, given the lessons they've learned by taking risks with postWIMP technologies over the past few years!


Previous Blog Posts Related to LM3LABS:
Interactive Retail Book (celebrating the history of Christian Dior, 1948-2010) (video)
Ubiq Motion Sensor Display at Future Ready Singapore (video)
Interactive Virtual DJ on a Transparent Pane, by LM3LABS and Brief Ad
LM3LABS' Catchyoo Interactive Koi Pond: Release of ubiq'window 2.6 Development Kit and Reader
A Few Things from LM3LABS
LM3LABS, Nicolas Loeillot, and Multi-touch
More from LM3LABS: Ubiq'window and Reactor.cmc's touch screen shopping catalog, Audi's touch-less showroom screen, and the DNP Museum Lab.


About LM3LABS
"Founded in 2003 by a team of passionate researchers, engineers, designers, and marketers from various international backgrounds, focused on fast transformation of innovation into unique products, LM3LABS is a recognized pioneer in computer vision-based interactivity solutions. Keeping a strong customer focus, LM3LABS' team of unique people pioneers new directions, explores new concepts, new technologies and new interactions.  Engaging, playful and magic, LM3LABS' products and solutions are always scalable and reliable"

info@lm3labs.com

Note to readers:
Over the past couple of years there has been an explosion of postWIMP technologies and applications, and with this pace, it has been difficult for me to keep abreast of it all. There is quite a bit I miss, given my full time job and daily life!

I welcome information about postWIMP interactive technologies and applications from my readers.  Due to time constraints, not interest, I am not always able to post about a topic as soon as I'd like.  That is OK, as my intention is not to be the first blogger to spread the latest tech news.  I like to dig in deep when I can and make connections between innovative, interesting technologies and the people and ideas behind them. 




Aug 18, 2009

CRISTAL: One Giant Remote Control Multi-Touch Coffee Table; ACM Interactive Tabletops and Surfaces 2009 in Banff, Canada


Via Wired Gadget Lab Priya Ganapati 8/14/09

What is CRISTAL? It stands for Control of Remotely Interfaced Systems using Touch-based Actions in Living Spaces, the acronym for a project at the Media Interaction Lab at the Upper Austria University of Applied Sciences, Digital Media.

Watch the videos:




"CRISTAL simplifies the control of our digital devices in and around the living room. The system provides a novel experience for controlling devices in a home environment by enabling users to directly interact with those devices on a live video image of their living room using multi-touch gestures on a digital tabletop." -mediainteractionlab, YouTube

The CRISTAL project is a collaboration between several people, spanning across a few universities, according to the Media Interaction Lab website:
Christian Rendl
Media Interaction Lab
Florian Perteneder
Media Interaction Lab
Thomas Seifried
Media Interaction Lab
Michael Haller
Media Interaction Lab
Daisuke Sakamoto
University of Tokyo
Jun Kato
University of Tokyo
Masahiko Inami
Keio University
Stacey D. Scott
University of Waterloo
CRISTAL received the Best Emerging Technology Award at the 36th International Conference and Exhibition on Computer Graphics and Interactive Techniques (SIGGRAPH 2009)

Below is a sample of the Media Interaction Lab's publications:

M. Haller, P. Brandl, C. Richter, T. Seifried, J. Leitner, and A. Gokcezade, 2009.
"Interactive Displays and Next-Generation Interfaces." Springer, 2009. [bibtex]

C. Köffel, W. Hochleitner, J. Leitner, M. Haller, A. Geven, and M. Tscheligi, 2009.
"Using Heuristics to Evaluate the Overall User Experience of Video Games and Advanced Interaction Games." Springer, 2009. [in press] [bibtex]

M. Haller, C. Forlines, C. Koeffel, J. Leitner, and C. Shen, 2009.
"Tabletop Games: Platforms, Experimental Games and Design Recommendations." Springer, 2009. in press [bibtex]

J. Leitner, C. Köffel, and M. Haller, 2009.
"Bridging the gap between real and virtual objects for tabletop games," International Journal of Virtual Reality, vol. 7, pp. 33-40, 2009. in press [bibtex]

J. Leitner, M. Haller, K. Yun, W. Woo, M. Sugimoto, M. Inami, A. D. Cheok, and H. D. Been-Lirn, 2009.
"Physical Interfaces For Tabletop Games," Computer Entertainment, vol. XX, p. XX, 2009. [bibtex]

M. Haller and M. Billinghurst, 2008.
"Interactive Tables: Requirements, Design Recommendations, and Implementation."

D. Leithinger and M. Haller, 2007.
"Improving Menu Interaction for Cluttered Tabletop Setups with User-Drawn Path Menus," Horizontal Interactive Human-Computer Systems, 2007. TABLETOP 07. Second Annual IEEE International Workshop on, pp. 121-128, 2007. [bibtex]

D. Regenbrecht, M. Haller, J. Hauber, and M. Billinghurst, 2006.
"Carpeno: interfacing remote collaborative virtual environments with table-top interaction," Virtual Reality, vol. 10, iss. 2, pp. 95-107, 2006. [bibtex]

One of the people involved in the CRISTAL project is Stacey D. Scott, Ph.D., an assistant professor of systems design engineering at the University of Waterloo. She is also the director of the Collaborative Systems Laboratory, which focuses on fundamental interfaces and interaction techniques for shared large-screen displays, such as multi-display environments and social-supporting digital tabletop interfaces, as well as collaborative and decision support interfaces for complex, time-critical team environments.

Dr. Scott is also one of the program co-chairs of the upcoming ACM Interactive Tabletops and Surfaces 2009 Conference, which will be held November 23-25 in Banff, Canada.

Mark your calendars!

The following topics, as they relate to interactive tabletops and surfaces, will be presented:

  • Applications
  • Gesture-based interfaces
  • Multi-modal interfaces
  • Tangible interfaces
  • Novel interaction techniques
  • Data handling/exchange on large interactive surfaces
  • Data presentation on large interactive surfaces
  • User-interface technology
  • Computer supported collaborative systems
  • Middleware and network support
  • Augmented reality
  • Social protocols
  • Information visualizations
  • Sensing and input technologies
  • Human-centered design & methodologies
Here is the "who's who" of interactive tabletops and surfaces- the Interactive Tabletops and Surfaces program committee:

Patrick Baudisch Hasso Plattner Institute Potsdam, Germany
Francois Berard University of Grenoble, France
Peter Brandl Media Interaction Lab, Upper Austria University of Applied Sciences, Austria
Andreas Butz University of Munich, Germany
Francois Coldefy Orange Labs, France
Morten Fjeld Chalmers University of Technology, Sweden
Kentaro Fukuchi University of Electro-Communications, Japan
Tovi Grossman Autodesk Research, Canada
Mark Hancock University of Calgary, Canada
Petra Isenberg University of Calgary, Canada
Yuichi Itoh Osaka University, Japan
Karrie Karahalios University of Illinois, USA
Hiro Kato Osaka University, Japan
Hideki Koike University of Electro-Communications, Japan
Frank Maurer University of Calgary, Canada
Max Mühlhäuser TU Darmstadt, Germany
Christian Muller-Tomfelde CSIRO-ICT Centre, Australia
Miguel Nacenta University of Saskatchewan, Canada
Patrick Olivier Newcastle University, UK
Jun Rekimoto Sony / University of Tokyo, Japan
Meredith Ringel Morris Microsoft Research, USA
Daisuke Sakamoto Tokyo University, Japan
Yoichi Sato University of Tokyo, Japan
Chia Shen Harvard University, USA
Masahiro Takatsuka University of Sydney, Australia
Lucia Terrenghi Vodafone Group R&D, Germany
Bruce Thomas University of South Australia, Australia
Melanie Tory University of Victoria, Canada
Edward Tse SMART Technologies, Canada
Fred Vernier South-Paris University, France
Andy Wilson Microsoft Research, USA
Massimo Zancanaro Bruno Kessler Foundation (formerly ITC), Italy



If you are a university student researching interactive tabletops, multi-touch surfaces, and/or gesture interaction, I hope this post helps!

Dec 12, 2010

Interactive Surveillance: Live digital art installation by Annabel Manning and Celine Latulipe

Interactive Surveillance, a live installation by artist Annabel Manning and technologist Celine Latulipe, was held at the Dialect Gallery in the NoDa arts district of Charlotte, N.C. on Friday, December 10th, 2010. I attended this event with the intention of capturing some of the interaction between the participants and the artistic content during the experience, but I came away with so much more. The themes embedded in the installation struck a chord with me on several different levels.


Friday's version of Interactive Surveillance provided participants the opportunity to use wireless gyroscopic mice to manipulate simulated lenses on a large video display. The video displayed on the screen was a live feed from a camera located in the stairway leading to the second-floor gallery.  When both lenses converged on the screen, a picture was taken of the stairway scene, and then automatically sent to Flickr. Although it was possible for one person to take a picture of the scene holding a mouse in each hand, the experience was enhanced by collaborating with a partner.
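For the technically curious, the convergence trigger can be imagined as a simple check: when the two cursor-controlled lens circles overlap on the display, capture the current video frame and queue it for upload. This is purely my own hypothetical sketch (the Lens class and function names are invented, not the installation's code):

```python
import math

class Lens:
    """A circular on-screen lens driven by one gyroscopic mouse."""
    def __init__(self, x, y, radius):
        self.x, self.y, self.radius = x, y, radius

def lenses_converged(a, b):
    """True when the two lens circles overlap on the display."""
    return math.hypot(a.x - b.x, a.y - b.y) <= a.radius + b.radius

captured = []  # stand-in for the queue of frames sent to Flickr

def on_mouse_update(lens_a, lens_b, current_frame):
    # Called whenever either mouse moves its lens.
    if lenses_converged(lens_a, lens_b):
        captured.append(current_frame)

on_mouse_update(Lens(100, 100, 40), Lens(300, 300, 40), "frame-1")  # far apart
on_mouse_update(Lens(100, 100, 40), Lens(150, 120, 40), "frame-2")  # overlap
print(captured)  # ['frame-2']
```

The nice consequence of this design is social: one person can trigger a capture alone with a mouse in each hand, but the check naturally rewards two people steering their lenses together.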

In another area of the gallery, guests had the opportunity to use wireless mice to interact with previously recorded surveillance video on another large display.  The video depicted people crossing desert terrain at night from Mexico to the U.S. In this case, the digital lenses on the screen functioned as search lights, illuminating - and targeting- people who would prefer not to be seen or noticed in any way.  On a nearby wall was another smaller screen with the same video content displayed on the larger screen.  This interaction is demonstrated in the video below:



A smaller screen was set out on the refreshment table so participants could view the Flickr photostream of the "surveillance" pictures taken of the stairway.   On a nearby wall was a smaller digital picture frame that provided a looping video montage of Manning's photo/art of people crossing the border.

The themes explored in the original Interactive Surveillance include border surveillance, shadow, and identity, delivered in a way that creates an impact beyond the usual chatter of pundits, politicians, and opinionators. The live installation added another layer to the event by allowing participants to be the targets of the "stairway surveillance," as well as to play the role of someone who conducts surveillance.

Reflections:
In a way, the live component of the present installation speaks to the concerns of our present era, where the balance between freedom and security is shaky at best. It is understandable that video surveillance is used in our nation's efforts to protect our borders. But in our digital age, surveillance is pervasive. In most public spaces it is no longer possible to avoid the security camera's eye.  Our images are captured and stored without our explicit knowledge. We do not know the identities or the intentions of those who view us, or our information, remotely. 

We are numb to the ambient surveillance that surrounds us. We go about our daily activities without notice.  We are silently tracked as we move across websites,  dart in and out of supermarkets and shopping malls, and pay for our purchases with plastic.  Our SMART phones know where we are located and will give out our personal information if we are not vigilant, as our default settings are often "public".

It is easy to forget that the silent type of surveillance exists.  It is not so easy to ignore more invasive types of "surveillance".  We must agree to submit to a high degree of inspection in the form of metal detectors, baggage searches, and in recent weeks, uncomfortable physical pat-downs, for the privilege of traveling across state borders by plane, within our own country.  In some airports, we are subject to whole-body scans that provide strangers with views of our most private spaces. We go along with this effort and prove our innocence on-the-spot, for the greater good.   Conversely, we have multiple means of conducting our own forms of surveillance, through Internet searches, viewing pictures and videos posted to the web, and playing around with Google Streetview. 

As I wandered around the Dialect Gallery with my video camera, I realized that I was conducting my own form of surveillance, adding another layer to the mix.  Unfortunately, some of the time my camera was on "pause" when I thought I was filming, and vice versa, so I did not capture people using the wireless mice to interact with the content on the displays. I went ahead with my mission and created a short video reflection of my impressions of Interactive Surveillance.  If you look closely at the video between :40 and :47, you'll see some people across the street from the gallery whom I unintentionally captured - and now they are part of my surveillance.

Although the video below was hastily edited, it includes music and sounds from the iMovie library that approximated the "soundtrack" that formed in my mind as I experienced the exhibit.

To get a better understanding of Interactive Surveillance,  I recommend the following links:


Barbara Schrieber, Charlotte Viewpoint



Video Reflection of Interactive Surveillance (Lynn Marentette, 12/10/10)

Live Installation: Interactive Surveillance, by Annabel Manning and Celine Latulipe from Lynn Marentette on Vimeo.



Interactive Surveillance Website



Interactive Surveillance Flickr Photostream

Nov 11, 2010

Interactive Touch-Screen Technology, Participatory Design, and "Getting It", Revisited

I've been planning on updating one of my popular posts, "Interactive Touch-Screen Technology, Participatory Design, and Getting It" for a while. 


Here is a compromise - since much still rings true two years since I wrote it, the bulk of this post remains the same.   I've updated a few sections with additional video examples of interactive touch-screen applications, good and bad, along with a few links and resources, located at the bottom of the post. 


(The missing piece of information?  An update about apps for the iPad and similar touch-screen tablets.)

Sit back and enjoy!


-Images: HP; Wired

There's been some discussion over the reasons why so many people don't understand touch screen, or "surface" computing, even though research in this area has been going on for years.

As the new owner of the HP TouchSmart, I know that I get it.

The research I've conducted in this area suggests that people will "get it" only if there is a strong commitment to develop touch-screen "surface" applications through a user-centered, participatory design process. In my view, this should incorporate principles of ethnography, and ensure that usability studies are conducted outside of the lab.


This approach was taken with Intel's Classmate PC. Intel has about 40 ethnographic researchers, and sent many of them to work with students and teachers in classrooms around the world. (A video regarding ethnographic research and the Intel Classmate project can be found near the end of this post.)

-Images: ClassmatePC


Where to start?
K-12 classrooms and media centers. Public libraries. Malls. Hospital lobbies and doctor's offices. Any waiting room. Staff lounges in medical centers, schools, and universities. Community festivities and events. Movie theater lobbies. Museums and other points of interests.


I believe we need to take a "touching is believing" approach.

Here are some thoughts:
When I try to explain my fascination with developing touch-screen interactive multimedia applications (interactive whiteboards, multi-touch displays and tables, and the like), many of my friends' and family members' eyes glaze over. This is particularly true for people I know who are forty-ish or over.

Even if you are younger, if you never saw the cool technology demonstrated in the movie Minority Report, or if you have limited experience with video games, or if you haven't come within touching distance of an interactive whiteboard, the concept might be difficult to understand.


The reality?
Even people who have the opportunity to use surface computing technology on large screens do not take full advantage of it. Multi-touch screens are often used as single-touch screens, and interactive whiteboards in classrooms often serve as expensive projector screens for teacher-controlled PowerPoint presentations.


Most importantly, there are few software developers who understand the surface computing approach, even with the popularity of the iPhone and iPod Touch. Most focus on traditional business-oriented or marketing applications, and have difficulty envisioning scenarios for surface computing.  There is a need for a breath of fresh air!

Another factor is that not all people entrusted to market surface or touch screen computing fully understand it.
Despite a cool website showing off the goods, Microsoft's Surface multi-touch table has been slow to take off, limiting hope of bringing the price tag down to a level most families or schools could afford. (The picture above depicts an application for the Surface designed for health care professionals, not K-12 science education.)

Although you can't buy a Surface table for your family room, it is possible to buy a TouchSmart.  
HP's TouchSmart website is engaging and highlights some examples of touch-screen interaction, but most people don't seem to know about it. (Since this post was last written, there are many more touch and multi-touch options available to the public, such as the  Dell, the iPad, etc.)

Unfortunately, you wouldn't have a clue that the HP TouchSmart exists browsing the aisles at Circuit City or Best Buy!

When I was shopping for my new TouchSmart, I noticed that from a distance, the TouchSmart looked just like the other larger flat-screen monitors filling up the aisles. The salespeople at both stores were not well-informed about the system. The only reason I knew about the new TouchSmart was my obsession with interactive multimedia touch-screen applications - designing them, developing them, studying them, reading about them, blogging about them.... ; }

More thoughts:

After studying HCI (Human-Computer Interaction), and relating this knowledge to what I know as a psychologist, my hunch is that the "Window Icon Mouse Pointing-device" (WIMP) and keyboard input mind-set is embedded in our brains, to a certain extent. Like driving a car, it is something automatic and expected. This is true for users AND developers. (Update - See The Post WIMP Explorers' Club: Update of the Updates for a review of a discussion among passionate post WIMP folks)

Think about it.

Suppose one day, you were told that you no longer were allowed to control your car by turning on the ignition, steering the wheel, or using your feet to accelerate, slow down, or stop the car! Instead, you needed to learn a new navigation, integration, and control system that involved waving your hands about and perhaps speaking a few commands.

For new drivers who'd never seen a car before, this new system would be user-friendly and intuitive. Perhaps it would be quite easy for 16-year-old kids to wrap their heads around this concept. For most of us, no. Imagine the disasters we would see on our streets and highways!

When we think about how newer technologies are introduced to people, we should keep this in mind.

In my mind, spreading the word about surface computing is not an "if you build it, they will come" phenomenon, like the iPhone. We can't ignore the broader picture.

From my middle-age woman's vantage point, I believe it is important that those involved with studying, developing, or marketing surface computing applications realize that many of us simply have no point of reference other than our experiences with ATMs, airline kiosks, supermarket self-serve lanes, and the like. (The video clip at the very end of this post provides a good example of touch-screen technology gone wrong.) -UPDATE: additional videos were added to this post.

Be aware that there are substantial numbers of people who might benefit from surface computing who prefer to avoid the ATMs, airline kiosks, and self-serve grocery shopping.

Realize that the collective experience with technology, in many cases, has not been too pretty. Many people have had such user-unfriendly experiences with productivity applications, forced upon them by their employers, that any interest or desire to explore emerging technologies has been zapped.

My own exposure to interactive "surface" related technology was somewhat accidental.
A few years ago, a huge box was deposited into the room I worked in a couple of days a week as a school psychologist at a middle school. After a week or so, I became curious, and found out that it was a SmartBoard. Until then (2002!), I did not know that interactive whiteboards existed.

The box remained unopened in the room for the entire school year, but no worry. I played with the only other SmartBoard in the school, and found a couple at the high school where I also worked. I hunted for all of the applications and interactive websites that I could find, and tried them out. That is when I was hooked. I could see all kinds of possibilities for interactive, engaging subject area learning activities. I could see the SmartBoard's potential for music and art classes. With my own eyes, I saw how the SmartBoard engaged students with special needs in counseling activities. (By the way, if you are working with middle school students, PBS Kids' ItsMyLife website activities work great on an interactive whiteboard.)

A few years have passed, and reflecting on all of my fun experiences with interactive whiteboards, with and without students, I now understand that many teachers still have limited exposure to this technology.

This school year, many teachers are finding themselves teaching in classrooms recently outfitted with interactive whiteboards, scrambling along with educational technology staff development specialists, to figure out how it works best with various groups of students, and what sort of changes need to be made regarding instructional practice.


For the very first time, interactive whiteboards were installed in two classrooms at one of the schools I work at. One of the teachers I know thanked me for telling her about interactive whiteboards and sharing my resources and links.

If I hadn't let her know about this technology, she wouldn't have volunteered to have one installed in her classroom. It has transformed the way she teaches special needs students.

In the few months that she's used the whiteboard, I can see how much it has transformed the way the students learn. They are attentive, more communicative, and engaged. The students don't spend the whole day with the whiteboard - the interactive learning activities are woven into lessons at various times of the day, representing true technology integration.

Now let's see what happens when all-in-one touch-screen PCs are unleashed in our schools!

UPDATE:  Take a look at a post I wrote for Innovative Interactivity just after SMART Technologies acquired NextWindow - the post describes in detail how interactive whiteboards are transforming learning and teaching in a program for students with special needs.
SMART Technologies Acquires NextWindow: A "smart window" to the world


There are some interesting changes going on at the intersection of HCI and educational technology research.  I participated in a workshop at CHI 2010 last April and was impressed by what is going on in this area, around the globe:   Next Generation of HCI and Education

Value of ethnographic research:
Ethnographic Research Informed Intel's Classmate PC
"Intel looked closely at how students collaborate and move around in classroom environments. The new tablet feature was implemented so that the device would be more conducive to what Intel calls “micromobility”. Intel wants students to be able to carry around Classmate PCs in much the same way that they currently carry around paper and pencil." -via Putting People First and Ars Technica

The video below is from Intel's YouTube Channel. It highlights Intel's approach to ethnographic research in classrooms during the development of the Classmate PC. This approach uses participatory design and allows the set of applications developed for the Classmate PC to reflect the needs of local students and teachers. Schools from many different countries were included in this study.




FYI: TOUCH SCREEN DISPLAYS:  NEED FOR IMPROVEMENT!

Touch Screen Coke Machine at the Mall: 90 Seconds to get a Coke


User-Unfriendly Interactive Display in the Rain (Ballantyne Village Shopping Center)

User-Unfriendly Information Kiosk Interactive Map
I encountered this puzzling and frustrating interactive directory/map at the Cleveland Clinic.  When I went to visit a relative at the hospital a year or so later, the map was no longer there.


BETTER EXAMPLES OF INTERACTIVE SCREENS:
Here are some interesting pictures from lm3labs, which are in my interactive usability hall of fame:

http://catchyoo.typepad.com/photos/uncategorized/2008/06/30/4654.jpg
http://farm3.static.flickr.com/2172/2233673451_6a48db8bff.jpg?v=0


Samsung's new Omnia SDG i900 was re-created at a much larger size, using lm3labs' Ubiq'window touchless technology. For more about lm3labs, including several video clips, take a look at one of my previous posts: Lm3Labs, Nicolas Leoillot, and Multimedia Interaction

Midwife Toad App on a Microsoft Surface, Discovery Place Science Center


TellTable:  Digital Storytelling on the Surface:  Microsoft Research, UK


DECEMBER 31, 2009 - Interactive Soda Machine for Fun

The interactive screen on the Coke machine attracted the attention of this young child. He loved spinning the image of the bottle. So did the dad! He said, "I'd like something like this for my home!" I told him about the HP TouchSmart - neither the dad nor the mom knew that affordable all-in-one touch screens were available, but they knew about SMART Boards, because their children's classrooms had them.  Note:  No one from this family actually purchased a soft drink.  I was hoping to time how long it would take them to do so!



Some resources:
lm3labs (catchyoo, ubiq'window)
NUITeq
NUI Group (See members' links)
Sparkon (See members' links and multi-touch projects)

(More information and resources can be found by doing a "multitouch" or related search on this blog or The World Is My Interactive Interface.)

If you have plenty of time, take a look at my Post WIMP Explorers' Club YouTube playlist.
"Natural user interfaces, gesture interaction, multi-touch, natural interaction, post WIMP examples and more..."

FYI: I visited the Ballantyne Village shopping center a couple of months ago to follow up on the interactive displays, including the one I tried to use while it was raining.   The shopping center changed ownership, and the displays were replaced by the old-fashioned kind, pictured below:



Aug 10, 2010

Wendy Keay-Bright's ReacTickles revised for use on the multi-touch SMART Table!

Wendy Keay-Bright works at the Cardiff School of Art and Design in the UK, and is part of the Sensory Design Research Group.   She has focused much of her research on interactive technologies and participatory design, working with children with autism, educators, and others to create applications such as ReacTickles and ReactColors, originally designed for use on interactive whiteboards.  These applications have been found to be especially effective with young people who have autism spectrum disorders. ReacTickles has been updated for use on the SMART Table, a multi-touch, multi-user interactive table that supports collaboration, depicted in the video clips below:




(Posted on agent4changenet's YouTube channel.)

RELATED
Tabletop ReacTickles looks like a SMART move
MerlinJohnOnline 4/25/10


Note: Wendy Keay-Bright is involved with the ESRC Technology Enhanced Learning ECHOES project, whose full title is "Improving Children's Social Interaction through Exploratory Learning in a Multimodal Environment."

Jun 26, 2010

A few links: GizmoWatch's 10 Interactive User Interfaces for the Future, CNN's Eatocracy, EVA 2010 and More!

Here's a quick link to a recent Gizmowatch post, Ten Interactive User Interfaces for the Future.  Bharat, the author, reviews a variety of interfaces, input methods, and interaction techniques, such as Skinput, a water-based touch screen, a muscle-computer interface, air gestures, brain-computer systems, and even a mud-tub interface.


I was fortunate to see some of these interaction techniques and interfaces when I attended CHI 2010 this past April, and plan to share some of my photos and video clips from the conference on this blog soon.


Totally Unrelated


Online connection for foodies
Eatocracy is a new website within the CNN pages that provides news (and more) about all things related to food. The categories on the site include "main", "news", "bite", "sip", "make", "think", and "buzz".  The best part, in my opinion, is the heirloom recipe collection index, where people can upload and share family recipes and the stories behind them.


Here is the description of Eatocracy from the website:

Eatocracy  "is your online home for smart, passionate conversation and information about food news, politics, culture. We'll highlight regional and family recipes, dive into restaurants and food shopping, chat with celebrity and local chefs, and show you what's for dinner around the world tonight. Grab a place at the table and read with your mouth full."

Enjoy!

(The above is a repost from The World is My Interactive Interface)

Coming Soon
--More about 3D TV and Interactive TV
--Highlights from CHI 2010 (better late than never!)
--My experiments- SMART Table, a game, interactive timeline prototype pictures...
--A post about Lieven van Velthoven's interesting Post-WIMP explorations - here are some links that he recently sent me:
As I took a peek at Lieven's video links, I noticed an interesting video mash-up Lieven created from the open-source code for Radiohead's House of Cards music video and his One Million Particles app. I'll post them soon.

I'll try to get video, pictures, and commentary about EVA 2010.  EVA stands for Electronic Visualization and the Arts: "Electronic Information, the Visual Arts, and Beyond."

FYI
I'm in the process of sorting through and re-organizing my blogs, which have been around for over four years!  During this time, my blogs have attracted a growing number of readers. Because of this, I'd like to make things a bit more user-centered.  So expect to see little changes here and there.  I promise I'll give my readers advance warning if I make any serious changes!

If you are new to this blog, you should know that my blogs started out as online filing cabinets, open to the world.  Although there is a bit of overlap of material and some cross-posting between the blogs, they are arranged to serve as a paperless way of keeping track of things that I've learned through my coursework, conference attendance, readings, and research. Since emerging technologies are high on my list of interests, I also use my blogs to share interesting things that cross my path.
  
I changed the name of my World Is My Interface blog to The World Is My Interactive Interface.   "Off-the-desktop natural user interfaces, interaction, and user experience" are the main topics of the blog.  It sometimes includes information about ubiquitous computing and DOOH, otherwise known as Digital Out Of Home.

I plan to tinker with my TechPsych blog later on. It focuses on topics that are useful to psychologists, educators, special education teachers, speech and language therapists, health and wellness professionals, and parents.

Feel free to leave comments, as I welcome your input.