
Dec 6, 2010

ICE PAD: Interactive Multitouch Ice Sculpture by Art Below Zero (video)

ICE PAD: Interactive Multitouch Ice Sculpture by Art Below Zero


Here is the information about the interactive sculpture from the Art Below Zero YouTube Channel:

"Created by David Sauer & Max Zuleta for the Lake Forest Tree Lighting Festival. This Ice Crystal Display was the first to be created in the USA, transforming 300 pounds of ice into the equivalent of a giant iPad touch screen. 'People always want to touch our ice sculptures. This interactive display gave them the perfect reason to get their hands cold,' said Max Zuleta, owner of Art Below Zero. The public response was amazement and interest in the workings of the touch screen in ice. Our favorite guess was 'It must work by sensing body heat!'..."

"...The system is known as Rear Diffused Illumination, or Rear DI. It works because infrared light is shone from the opposite side of the ice wall, through the ice. When an object such as a finger, hand, or mitten blocks the infrared light, it reflects the light back to a custom camera built by Peau Productions. The illuminated objects are then converted to points of interaction using an open-source program, Community Core Vision, which outputs TUIO data streams to a Flash program for animation. We like the look and feel of the Fluid Solver Flash application. The output from the computer is then projected into the ice, and the ice diffracts the light into something beautiful. By this method the user can manipulate a visible light screen via an invisible light that only the camera can see..."
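Since the hand-off from Community Core Vision to the Flash front end happens over TUIO, here is a rough sketch of what the receiving side of that pipeline does: a tiny tracker that consumes already-decoded TUIO 1.1 /tuio/2Dcur messages and maintains the set of active touch points. The class and method names are mine (hypothetical), and I'm assuming the OSC decoding has already happened; this is not part of CCV or any TUIO library.

```python
# Minimal sketch of a TUIO 1.1 /tuio/2Dcur consumer, assuming the OSC
# messages have already been decoded into Python lists. Hypothetical names.

class TuioCursorTracker:
    """Tracks active touch points from TUIO /tuio/2Dcur messages."""

    def __init__(self):
        self.cursors = {}  # session_id -> (x, y), normalized to [0, 1]

    def handle(self, args):
        """Process one decoded /tuio/2Dcur message."""
        command = args[0]
        if command == "set":
            # 'set' carries: session id, x, y, x-velocity, y-velocity, accel
            session_id, x, y = args[1], args[2], args[3]
            self.cursors[session_id] = (x, y)
        elif command == "alive":
            # 'alive' lists the session ids still on the surface;
            # any cursor missing from it has been lifted.
            alive = set(args[1:])
            self.cursors = {sid: pos for sid, pos in self.cursors.items()
                            if sid in alive}
        # 'fseq' (frame sequence) messages are ignored in this sketch.

tracker = TuioCursorTracker()
tracker.handle(["set", 1, 0.25, 0.75, 0.0, 0.0, 0.0])
tracker.handle(["set", 2, 0.60, 0.40, 0.0, 0.0, 0.0])
tracker.handle(["alive", 2])   # finger 1 was lifted
print(tracker.cursors)         # {2: (0.6, 0.4)}
```

In a real setup the same logic would sit behind an OSC/UDP listener on port 3333, TUIO's default.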



Thanks to Nolan Ramseyer, of PeauProductions, for the link!
PeauProductions Blog: Multitouch and Technology


RELATED
Ubice = Multi-touch On Ice at the Nokia Research Center in Finland (Video + Pic via Albrecht Schmidt)
Art Below Zero

Nov 30, 2010

Call for Participation - Large Displays in Urban Life: From Exhibition Halls to Media Facades (CHI 2011 Workshop)

Large Displays in Urban Life: From Exhibition Halls to Media Facades 
CHI 2011 Workshop, May 7 or 8, 2011 (final date to be announced)


Call for Participation
Large interactive displays are now common in public urban life. Museums, libraries, public plazas, and architectural facades already take advantage of interactive technologies for visual and interactive information presentation. Researchers and practitioners from such varied disciplines as art, architecture, design, HCI, and media theory have started to explore the potential and impact of large display installations in public urban settings.


This workshop aims to provide a platform for researchers and practitioners from different disciplines such as art, architecture, design, HCI, social sciences, and media theory to exchange insights on current research questions in the area. The workshop will focus on the following topics: how to design large interactive display installations that promote engaging experiences and go beyond playful interaction, how different interaction models shape people's experience in urban spaces, and how to evaluate their impact.


Workshop Goals & Topics
The goal of this one-day CHI 2011 workshop is to cross-fertilize insights from different disciplines, to establish a more general understanding of large interactive displays in public urban contexts, and to develop an agenda for future research directions in this area. Rather than focusing on paper presentations, this workshop aims to trigger active and dynamic group discussions around the following topics:


Beyond Playful Interaction
A number of studies have found that large display installations invite playful interaction but often fail to convey meaningful experiences related to content. This raises the following questions:
  • How can we design installations that sustain people's attention past the initial novelty effect and direct their interest toward the content?
  • What design strategies can be applied to promote an active individual and social exploration and discussion of the presented information?
Character of Interaction
A number of interaction techniques have been explored for large displays in public spaces, ranging from interaction via cell phones to direct-touch or full-body interaction. We would like to discuss:
  • How do different interaction methods shape people’s experience of large display installations in urban spaces?
  • How do interaction methods differ from each other in terms of triggering interaction and engagement with the presented content?
Evaluation
Different quantitative and qualitative methods have been applied to evaluate people’s experience and use of large display installations in public spaces. During the workshop we would like to discuss:
  • How can we evaluate the "success" of large display installations in urban spaces?
  • How can particular aspects of public large display installations such as engagement be evaluated?
  • What kinds of evaluation methods are most effective at different stages (design phase / installation phase)?
We see this workshop as an opportunity to start thinking about a general framework that can inform the design and evaluation of large interactive displays in different urban contexts. With a diverse research community present at the workshop we hope to come up with an agenda for future research directions in this area.

For more details on the workshop please refer to our extended abstract and workshop proposal.

Submission Details
Submit a position paper (maximum 4 pages) to largedisplaysinurbanlife@gmail.com by January 14, 2011 using the CHI extended abstract format. The paper should describe experiences, works in progress, or theories around designing and/or evaluating large interactive displays in public urban settings. We plan to explore approaches and insights from different disciplines to this topic so submissions from art, architecture, design, HCI, media theory, and social science are highly encouraged. We welcome all methodological approaches and techniques centered around the topic of large interactive displays in urban life.


At least one author of each accepted position paper needs to register for the workshop and for one or more days of the CHI conference itself.


Important Dates
Submission Deadline: January 14, 2011
Notification of acceptance: February 11, 2011
Workshop: May 7 or 8, 2011 (final date to be announced)

WORKSHOP ORGANIZERS
Uta Hinrichs is a PhD candidate in computational media design at the Innovations in Visualization (InnoVis) research group of the University of Calgary, Canada, under the supervision of Sheelagh Carpendale. Her research focuses on the design and study of large display interfaces to support lightweight information exploration in walk-up-and-use scenarios.
Nina Valkanova is doing her PhD at the interaction group of the Universitat Pompeu Fabra (UPF) in Barcelona, Spain under the supervision of Ernesto Arroyo. Her research interest focuses on the design of urban media facades exploring the intersections between scientific and artistic design knowledge.
Kai Kuikkaniemi is a project manager in Helsinki Institute for Information Technology. He is currently leading a national research project focusing on public displays. His earlier research has focused on exploring novel multiplayer game designs ranging from pervasive gaming to biosignal adaptive gaming.
Giulio Jacucci is a professor in the Dept. of Computer Science at the University of Helsinki and director of the Network Society Programme at the Helsinki Institute for Information Technology. He leads several projects on interaction design and ubiquitous computing, and is co-founder of MultiTouch Ltd., a company commercializing products for multi-touch screens.
Sheelagh Carpendale is a Professor at the University of Calgary where she holds a Canada Research Chair: Information Visualization and an NSERC/iCORE/SMART Industrial Research Chair: Interactive Technologies. She directs the Innovations in Visualization (InnoVis) research group and her research focuses on information visualization, collaborative visualization, and large interactive displays.
Ernesto Arroyo holds an associate teaching position at the Dept. of Information and Communication Technologies of the Universitat Pompeu Fabra (UPF) in Barcelona, Spain. He earned his PhD at the MIT Media Lab in 2007. His research at the Interactive Technologies Group focuses on interaction design, visualization, and user-centered interfaces, enabling and preserving the fluency of user engagement.

Thanks to Uta Hinrichs for sending this my way!

Nov 13, 2010

HACKED KINECT MULTITOUCH using libFreenect and libTISCH (via Florian Echtler)

MULTI-TOUCH WITH HACKED KINECT
Here is NUI Group member Florian Echtler's proof-of-concept HD video of using a hacked Kinect camera for multitouch-like interaction. The application was built on Ubuntu Linux using libfreenect (by marcan42) and Florian's own creation, libTISCH.



Florian decided to use picture-browsing interaction to demonstrate the proof of concept, so "everybody can focus on more interesting things :-)"
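For the tech-curious, the underlying trick in depth-camera multitouch is surprisingly simple: with a calibrated depth map of the bare surface, any pixel that sits within a thin band just above that surface is treated as a touch. The sketch below illustrates the idea only; it is not Florian's actual code, and the function name and band values are assumptions.

```python
# Rough sketch of depth-camera "multitouch": compare a live depth frame
# against a calibrated background depth map of the surface (both in mm,
# measured from the camera), and flag pixels hovering just above it.
import numpy as np

def touch_mask(depth_frame, surface_depth, band_mm=(5, 25)):
    """Return a boolean mask of pixels close enough to the surface to count
    as touches. A touching finger is *nearer* to the camera than the surface,
    so its height above the surface is positive."""
    height_above = surface_depth - depth_frame
    near, far = band_mm
    return (height_above >= near) & (height_above <= far)

# Toy 1x4 example: the surface is 1000 mm away. A fingertip at 985 mm sits
# 15 mm above it (a touch); a hand at 960 mm is 40 mm up (just hovering).
surface = np.full((1, 4), 1000.0)
frame = np.array([[1000.0, 985.0, 960.0, 1000.0]])
print(touch_mask(frame, surface).tolist())  # [[False, True, False, False]]
```

The surviving pixels would then be grouped into blobs and tracked across frames, exactly as an FTIR or DI camera pipeline does with bright spots.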


(I have SO many ideas for this!  I'll throw a few out there in an upcoming post....maybe someone can run with them!)


RELATED
Hacked Kinect taught to work as multitouch interface
Paul Miller, engadget, 11/11/10


FOR THE TECH-CURIOUS:
TISCH stands for Tangible Interactive Surfaces for Collaboration between Humans, and is a cross-platform, cross-device multi-touch development framework. You can download the source package for Windows, Mac OS X, and Linux from the TISCH Sourceforge website. The Ubuntu Lucid/Karmic version has "superquick installation via PPA"; the instructions can be found on the TISCH Sourceforge website.


LibFreenect- Open Source PC Drivers for Kinect
Xan Tium, XBLOG 360 11/10/10

Marcan is Hector Martin Cantero, the author of the Abort, Retry, Hack? blog.

For your convenience, I've reposted something I wrote about libTISCH back in 2009:

For techies (and the tech-curious) who like technologies that support collaboration and multi-touch interaction,  this is great news!

Florian Echtler announced the first stable release of libTISCH, a multi-touch development framework, which can be found on Sourceforge. TISCH stands for Tangible Interactive Surfaces for Collaboration between Humans. libTISCH, a C++ software framework, is included in this project. It provides a means for creating GUIs based on multi-touch and/or tangible input devices.

Here is how it works:

Architecture Layers
Here is information from the libTISCH announcement:

Highlights of this release are, among others, the following features:

- ready-to-use multitouch widgets based on OpenGL
- reconfigurable, hardware-independent gesture recognition engine
- support for widely used, pre-defined gestures (move, scale, rotate, ...) as well as custom-defined gestures
- hardware drivers for FTIR, DI, Wiimote, DiamondTouch, ...
- TUIO converters: source and sink
- cross-platform: Linux, Mac OS X, Windows (32- and 64-bit)
- cross-language: C++ with bindings for C#, Java, Python

libTISCH has a lot to offer the multitouch developer. For example, the textured widgets enable rapid development of applications for many kinds of multi-touch or tangible interfaces. The separate gesture recognition engine allows the translation of a wide range of highly configurable gestures into pre-defined or custom events, which are then acted on by the widgets. While the lower layers of libTISCH provide functionality similar to tbeta, touche, etc. (you can interface existing TUIO-based software with libTISCH in both directions), it goes far beyond them.

More information about the library and underlying architecture can be found on http://tisch.sf.net/ and in the Sourceforge wiki at
http://sourceforge.net/apps/mediawiki/tisch/
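To give a feel for what a gesture recognition engine computes under the hood, here is a minimal sketch of the geometry behind two-finger move/scale/rotate gestures. This is an illustration only; libTISCH's actual engine is reconfigurable and far more general, and the function name below is hypothetical.

```python
# Minimal sketch of two-finger scale/rotate detection: compare the vector
# between the two fingers before and after a movement. The ratio of vector
# lengths gives the scale; the change in vector angle gives the rotation.
import math

def two_finger_transform(prev, curr):
    """Given two touch points before (`prev`) and after (`curr`) a movement,
    return (scale_factor, rotation_in_radians) of the gesture."""
    (ax0, ay0), (bx0, by0) = prev
    (ax1, ay1), (bx1, by1) = curr
    v0 = (bx0 - ax0, by0 - ay0)   # finger-to-finger vector, before
    v1 = (bx1 - ax1, by1 - ay1)   # finger-to-finger vector, after
    scale = math.hypot(*v1) / math.hypot(*v0)
    rotation = math.atan2(v1[1], v1[0]) - math.atan2(v0[1], v0[0])
    return scale, rotation

# Fingers move from 100 px apart to 200 px apart: a 2x pinch-zoom, no rotation.
scale, rot = two_finger_transform([(0, 0), (100, 0)], [(0, 0), (200, 0)])
print(round(scale, 2), round(rot, 2))  # 2.0 0.0
```

A widget receiving this event would then apply the scale and rotation to its own transform, which is essentially what the pre-defined "scale" and "rotate" gestures in any multitouch framework boil down to.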


Note:
Dr. Florian Echtler is on the scientific staff at the Technische Universität München in Germany. Be sure to check out his webpage.

I especially like the concept of the MeTaTop: "A Multi-Sensory Table Top System for Medical Procedures" that is linked from Florian's website.


MeTaTop: A Multi-Sensory Table Top System for Medical Procedures

Nov 11, 2010

Interactive Touch-Screen Technology, Participatory Design, and "Getting It", Revisited

I've been planning on updating one of my popular posts, "Interactive Touch-Screen Technology, Participatory Design, and Getting It" for a while. 


Here is a compromise - since much still rings true two years since I wrote it, the bulk of this post remains the same.   I've updated a few sections with additional video examples of interactive touch-screen applications, good and bad, along with a few links and resources, located at the bottom of the post. 


(The missing piece of information?  An update about apps for the iPad and similar touch-screen tablets.)

Sit back and enjoy!


-Images: HP; Wired

There's been some discussion over the reasons why so many people don't understand touch screen, or "surface" computing, even though research in this area has been going on for years.

As the new owner of the HP TouchSmart, I know that I get it.

The research I've conducted in this area suggests that people will "get-it" only if there is a strong commitment to develop touch-screen "surface" applications through a user-centered, participatory design process. In my view, this should incorporate principles of ethnography, and ensure that usability studies are conducted outside of the lab.


This approach was taken with
Intel's Classmate PC. Intel has about 40 ethnographic researchers, and sent many of them to work with students and teachers in classrooms around the world. (A video regarding ethnographic research and the Intel Classmate project can be found near the end of this post.)

-Images: ClassmatePC


Where to start?
K-12 classrooms and media centers. Public libraries. Malls. Hospital lobbies and doctor's offices. Any waiting room. Staff lounges in medical centers, schools, and universities. Community festivities and events. Movie theater lobbies. Museums and other points of interests.


I believe we need to take a "touching is believing" approach.

Here are some thoughts:
When I try to explain my fascination with developing touch-screen interactive multimedia applications (interactive whiteboards, multi-touch displays and tables, and the like), many of my friends' and family members' eyes glaze over. This is particularly true for people I know who are forty-ish or over.

Even if you are younger, if you never saw the cool technology demonstrated in the movie Minority Report, or if you have limited experience with video games, or if you haven't come within touching distance of an interactive whiteboard, the concept might be difficult to understand.


The reality?
Even people who have the opportunity to use surface computing technology on large screens do not take full advantage of it. Multi-touch screens are often used as single-touch screens, and interactive whiteboards in classrooms often serve as expensive projector screens for teacher-controlled PowerPoint presentations.


Most importantly, there are few software developers who understand the surface computing approach, even with the popularity of the iPhone and iPod Touch. Most focus on traditional business-oriented or marketing applications, and have difficulty envisioning scenarios for surface computing.  There is a need for a breath of fresh air!

Another factor is that not all people entrusted to market surface or touch screen computing fully understand it.
http://blogs.msdn.com/blogfiles/healthblog/WindowsLiveWriter/MicrosoftHUGWishyouwerehereDay2_82D3/IMG_0550_thumb.jpg
Despite a cool website showing off the goods, Microsoft's Surface multi-touch table has been slow to take off, dimming hopes of bringing the price down to something most families or schools could afford. (The picture above depicts an application for the Surface designed for health care professionals, not K-12 science education.)

Although you can't buy a Surface table for your family room, it is possible to buy a TouchSmart.  
HP's TouchSmart website is engaging and highlights some examples of touch-screen interaction, but most people don't seem to know about it. (Since this post was last written, there are many more touch and multi-touch options available to the public, such as the  Dell, the iPad, etc.)

Unfortunately, you wouldn't have a clue that the HP TouchSmart exists while browsing the aisles at Circuit City or Best Buy!

When I was shopping for my new TouchSmart, I noticed that from a distance, the TouchSmart looked just like the other large flat-screen monitors filling up the aisles. The salespeople at both stores were not well-informed about the system. The only reason I knew about the new TouchSmart was my obsession with interactive multimedia touch-screen applications: designing them, developing them, studying them, reading about them, blogging about them.... ; }

More thoughts:

After studying HCI (Human-Computer Interaction), and relating this knowledge to what I know as a psychologist, my hunch is that the "Windows, Icons, Menus, Pointing device" (WIMP) and keyboard input mind-set is embedded in our brains, to a certain extent. Like driving a car, it is something automatic and expected. This is true for users AND developers. (Update - See The Post WIMP Explorers' Club: Update of the Updates for a review of a discussion among passionate post-WIMP folks)

Think about it.

Suppose one day, you were told that you were no longer allowed to control your car by turning on the ignition, steering the wheel, or using your feet to accelerate, slow down, or stop the car! Instead, you needed to learn a new navigation, integration, and control system that involved waving your hands about and perhaps speaking a few commands.

For new drivers who'd never seen a car before, this new system would be user-friendly and intuitive. Perhaps it would be quite easy for 16-year-old kids to wrap their heads around this concept. For most of us, no. Imagine the disasters we would see on our streets and highways!

When we think about how newer technologies are introduced to people, we should keep this in mind.

In my mind, spreading the word about surface computing is not an "if you build it, they will come" phenomenon, like the iPhone. We can't ignore the broader picture.

From my middle-aged woman's vantage point, I believe it is important that those involved with studying, developing, or marketing surface computing applications realize that many of us simply have no point of reference other than our experiences with ATMs, airline kiosks, supermarket self-serve lanes, and the like. (The video clip at the very end of this post provides a good example of touch-screen technology gone wrong.) -UPDATE: additional videos were added to this post.

Be aware that there are substantial numbers of people who might benefit from surface computing who prefer to avoid the ATMs, airline kiosks, and self-serve grocery shopping.

Realize that the collective experience with technology, in many cases, has not been too pretty. Many people have had such user-unfriendly experiences with productivity applications, forced upon them by their employers, that any interest or desire to explore emerging technologies has been zapped.

My own exposure to interactive "surface" related technology was somewhat accidental.
A few years ago, a huge box was deposited in the room where I worked a couple of days a week as a school psychologist at a middle school. After a week or so, I became curious, and found out that it was a SmartBoard. Until then (2002!), I did not know that interactive whiteboards existed.

The box remained unopened in the room for the entire school year, but no worry. I played with the only other SmartBoard in the school, and found a couple at the high school where I also worked. I hunted for all of the applications and interactive websites that I could find, and tried them out. That is when I was hooked. I could see all kinds of possibilities for interactive, engaging subject-area learning activities. I could see the SmartBoard's potential for music and art classes. With my own eyes, I saw how the SmartBoard engaged students with special needs in counseling activities. (By the way, if you are working with middle school students, PBS Kids' ItsMyLife website activities work great on an interactive whiteboard.)

A few years have passed, and reflecting on all of my fun experiences with interactive whiteboards, with and without students, I now understand that many teachers still have had limited exposure to this technology.

This school year, many teachers are finding themselves teaching in classrooms recently outfitted with interactive whiteboards, scrambling along with educational technology staff development specialists, to figure out how it works best with various groups of students, and what sort of changes need to be made regarding instructional practice.


For the very first time, interactive whiteboards were installed in two classrooms at one of the schools I work at. One of the teachers I know thanked me for telling her about interactive whiteboards and sharing my resources and links.

If I hadn't let her know about this technology, she wouldn't have volunteered to have one installed in her classroom. It has transformed the way she teaches special needs students.

In the few months that she's used the whiteboard, I can see how much it has transformed the way the students learn. They are attentive, more communicative, and engaged. The students don't spend the whole day with the whiteboard - the interactive learning activities are woven into lessons at various times of the day, representing true technology integration.

Now let's see what happens when all-in-one touch-screen PC's are unleashed in our schools!

UPDATE: Take a look at a post I wrote for Innovative Interactivity just after SMART Technologies acquired NextWindow - the post describes in detail how interactive whiteboards are transforming learning and teaching in a program for students with special needs.
SMART Technologies Acquires NextWindow: A "smart window" to the world


There are some interesting changes going on at the intersection of HCI and educational technology research.  I participated in a workshop at CHI 2010 last April and was impressed by what is going on in this area, around the globe:   Next Generation of HCI and Education

Value of ethnographic research:
Ethnographic Research Informed Intel's Classmate PC
"Intel looked closely at how students collaborate and move around in classroom environments. The new tablet feature was implemented so that the device would be more conducive to what Intel calls “micromobility”. Intel wants students to be able to carry around Classmate PCs in much the same way that they currently carry around paper and pencil." -via Putting People First and Ars Technica

The video below is from Intel's YouTube Channel. It highlights Intel's approach to ethnographic research in classrooms during the development of the Classmate PC. This approach uses participatory design and allows the set of applications developed for the Classmate PC to reflect the needs of local students and teachers. Schools from many different countries were included in this study.




FYI: TOUCH SCREEN DISPLAYS:  NEED FOR IMPROVEMENT!

Touch Screen Coke Machine at the Mall: 90 Seconds to get a Coke


User-Unfriendly Interactive Display in the Rain (Ballantyne Village Shopping Center)

User-Unfriendly Information Kiosk Interactive Map
I encountered this puzzling and frustrating interactive directory/map at the Cleveland Clinic.  When I went to visit a relative at the hospital a year or so later, the map was no longer there.


BETTER EXAMPLES OF INTERACTIVE SCREENS:
Here are some interesting pictures from lm3labs, which are in my interactive usability hall of fame:

http://catchyoo.typepad.com/photos/uncategorized/2008/06/30/4654.jpghttp://farm3.static.flickr.com/2172/2233673451_6a48db8bff.jpg?v=0


Samsung's new Omnia SGH-i900 was re-created in a much larger size, using lm3labs' Ubiq'window touchless technology. For more about lm3labs, including several video clips, take a look at one of my previous posts: Lm3Labs, Nicolas Leoillot, and Multimedia Interaction

Midwife Toad App on a Microsoft Surface, Discovery Place Science Center


TellTable:  Digital Storytelling on the Surface:  Microsoft Research, UK


DECEMBER 31, 2009 -Interactive Soda Machine for Fun

The interactive screen on the Coke machine attracted the attention of this young child. He loved spinning the image of the bottle. So did the dad! He said, "I'd like something like this for my home!" I told him about the HP TouchSmart - neither the dad nor the mom knew that there were affordable all-in-one touch screens available, but they knew about SMARTboards, because their children's classrooms had them. Note: No one from this family actually purchased a soft drink. I was hoping to time how long it would take them to do so!



Some resources:
lm3labs (catchyoo, ubiq'window)
NUITeq
NUI Group (see members' links)
Sparkon (see members' links and multi-touch projects)

(More information and resources can be found by doing a "multitouch" or related search on this blog or The World Is My Interactive Interface.)

If you have plenty of time, take a look at my Post WIMP Explorers' Club YouTube playlist.
"Natural user interfaces, gesture interaction, multi-touch, natural interaction, post WIMP examples and more..."

FYI: I visited the Ballantyne Village shopping center a couple of months ago to follow up on the interactive displays, including the one I tried to use while it was raining.   The shopping center changed ownership, and the displays were replaced by the old-fashioned kind, pictured below:



Nov 1, 2010

Unlocking the Future of Cities through Multi-Touch Interactive Visualization at RENCI (UNC-Charlotte)

Here is a link to an article that was in the SciTech section of my morning paper today!


Unlocking the Future of Cities:  UNCC scientists work across disciplines to predict how urban areas will use open land. Tyler Dukes, Charlotte Observer, 10/31/10


"As part of a three-year, $286,000 grant from the National Science Foundation, the group of scientists from UNC Charlotte is researching the complex relationship between the Queen City and its surrounding forest and pastoral lands. Using a combination of social, natural and computer science, they're working to build an interactive map-based simulation capable of showing the impact of future development and policy on land use....It's a project requiring Meentemeyer's team to peel back multiple layers of cultural and economic values surrounding land in the South. The research will have implications beyond the Charlotte area...By allowing the public to explore those possibilities visually on anything from a laptop to a touch-screen table, the research team is hoping its work will mean more informed decisions about how people use the land around them."  -Charlotte Observer




Image Source: Charlotte Observer


Wouldn't this be a great tool to use to support collaborative learning projects in the schools?


RELATED
RENCI at UNC-Charlotte has a Multi-touch Table in the Visualization Center
RENCI Visualization Center Update
Visualization Resources at RENCI UNC-Charlotte
RENCI at UNC Charlotte
Multi-Touch at RENCI
Research by Touch:  RENCI Multitouch Table Gives Computer Science Research an Intuitive Interface



Oct 22, 2010

Quick Link: 3M Invests in Perceptive Pixel, Jeff Han's Multitouch Tech Company

3M Invests in Perceptive Pixel


"3M, through its 3M New Ventures business, has invested in Perceptive Pixel Inc., a developer of advanced multi-touch solutions based in New York City. Terms of the transaction were not disclosed. Founded by multi-touch pioneer Jeff Han in 2006, Perceptive Pixel is dedicated to the research, development and production of multi-touch interfaces for the knowledge worker. The company's hardware and software products enable users to manipulate complex datasets through a new class of intuitive, powerful and visually rich interface techniques. The combination of its technologies with those of 3M will create incredible new opportunities for both companies."

"To see Perceptive Pixel multi-touch solutions in action on 3M Projected Capacitive Technology, see the video at http://www.3m.com/touchPPI. For more information about 3M MicroTouch products, visit www.3M.com/touch. For an overview of popular touch technologies and terminology, visit www.touchtopics.com."

Catching up with multitouch pioneer Jeff Han, Ina Fried, CNET, 10/22/10

Jeff Han's 2006 TED Talk



Jeff Han, 2007



Thanks to Seth Sandler for the link!

Oct 12, 2010

Update on Josh Blake, newly designated Microsoft Surface MVP

Josh Blake is the Tech Lead of the InfoStrat Advanced Technology Group in DC. He has been creating multi-touch applications for Microsoft's Surface multi-user tabletops for a while. Recently, his team built a suite of applications designed for use by young children at a museum. Below is a video demonstration of some of this work. It really looks exciting!


Microsoft Surface and Magical Object Interaction

Josh Blake's blog is called Deconstructing the NUI- for those of you new to this blog, NUI stands for Natural User Interface (also known as Natural User Interaction).  See his post, Microsoft Surface and Magical Object Interaction, for more information!

RELATED
Here is a plug for Josh Blake's book, "Multitouch on Windows"

Book Ordering Information

FYI: InfoStrat is hiring WPF experts as well as Microsoft CRM and Microsoft SharePoint experts.


Microsoft Surface MVPs
Dr. Neil Roodyn
Dennis Vroegop
Rick Barraza
Joshua Blake





Oct 11, 2010

Designing for Multitouch Tables and Surfaces, by Erin Rose, Open Exhibits Blog

If you are interested in exploring collaborative tabletop applications, take a look at the Open Exhibits blog. Erin Rose's recent post, "Designing for Multi-touch Tables and Surfaces", is a good overview of lessons learned over the past couple of years in design, development, and implementation of multi-user interactive tabletop applications.

Although the focus of Open Exhibits is on applications and systems designed for museum exhibits, many of the design challenges hold true for similar applications in other settings, such as classrooms, libraries, and other public spaces.

Erin's post explores each of the following topics in more detail:

  • Don't forget that the table is omni-directional.
  • Individual control of objects encourages multi-user interaction.
  • Promote collaboration, founded in healthy competition.


(Erin Rose is a developer and community liaison for Open Exhibits.)

RELATED
Exhibit Files
Jim Spadaccini
Visitors Explore L.A. in Google Maps and Flickr Mashup.

Sep 21, 2010

Grant from the National Science Foundation for Multi-touch Interactive Museum Exhibits!

This is interesting!

"Open Exhibits is a National Science Foundation-funded initiative to develop a library of free and open multitouch-enabled software modules for exhibit development. Built using the popular Adobe Flash and Flex authoring tools, the modules will let museum professionals create innovative floor and web-based exhibits easily and inexpensively." -- Open Exhibits

VIDEO: Introducing Open Exhibits: Open Source Exhibit Software


Open Exhibits Core is based on the commercial GestureWorks software package.


RELATED
Open Exhibits Funded by the National Science Foundation
Jim Spadaccini, Open Exhibits Blog 9/21/10

About Open Exhibits
Jim Spadaccini, of Ideum, is the Principal Investigator of the Open Exhibits project. Kate Haley Goldman is the co-PI and main researcher. The three museum partners are the Don Harrington Discovery Center, the Maxwell Museum of Anthropology, and the New Mexico Museum of Natural History and Science.


Ideum website