In this post, I'd like to share some news from NUITEQ, a privately held multi-touch software company based in Sweden. NUITEQ is known for Snowflake Suite, a multi-touch application suite used in a variety of settings.
"NUITEQ is as happy as a kid in a candy shop to announce that it has released version upgrade 1.9.4 of its award-winning, industry leading multi-touch software product Snowflake Suite" - Harry van der Veen, NUITEQ
The screenshot below gives a nice view of the range of applications included in Snowflake Suite:
Blind Squirrel Digital is a digital media company located in Newton, N.C. One of the company's services is multi-touch surface design and implementation. Below is a picture of their recently constructed multi-touch wall:
About Blind Squirrel Digital: Blind Squirrel Digital's services include iPhone and iPad application development, 3D animation/Visualization/Previz, multi-touch surface design/implementation, augmented reality design/implementation, multi-platform game development, digital signage/interactive and static content management systems, kiosk development, software and hardware design, exhibit design, simulation training application development, and real-time forensic reconstruction application development.
People behind Blind Squirrel Digital: C. Michael Neely, the founder of the company, serves as President and Creative Director of Blind Squirrel Digital. He has experience working with 3D applications and teaches digital effects, animation, and game development technology. He also has a few film credits under his belt, including "Pirates of the Caribbean III", "Spider Man III", "Fantastic Four", and "The Incredible Hulk 2". Jeremy Cooper is the Vice President and Producer at Blind Squirrel. He has experience working on multiple creative projects (Digital Juice, Ballistic Pixel, and the Digital Design Center), and has taught a number of courses in the Simulation and Game Development program at Western Piedmont Community College.
(The above information was gathered from the Blind Squirrel Digital website.)
In Chapter 1 of Natural User Interfaces in .NET, Josh Blake asks and answers a question posed by many people who have been under the spell of keyboard input and GUI/WIMP interaction:
Why bother switching from GUI to NUI? The answer? Read Chapter 1 (pdf) of the book - the chapter is free.
Here are a few of my personal reasons:
1. I want to buy the next version of the iPad or something like it.
2. I want to buy a new large-screen Internet HD TV.
3. I want to buy a Kinect.
4. I do NOT want to interact with my new TV with a Sony remote. Too many tiny buttons!
5. I do NOT want to interact with my new TV with a keyboard, because it reminds me of...work.
6. Most importantly: I want to design apps for the people I care about, and others with similar needs: my mom, my grandson, moms and dads with kids in tow, people with special needs and/or health concerns and the people who care for and guide them, and knowledge sharers and (life-long) learners....
"It’s easy to forget that the computer mouse is over 45 years old."
"What’s not as easy to forget is that we’re now collectively getting used to interacting with computers via means and interfaces that have moved way beyond the keyboard and the mouse — the iPhone and Wii being the most prominent examples."
"The truth is that we stand on the verge of a major revolution in the models of Human Computer Interaction (HCI). A revolution that will fly right past academic and into a world of retail, medical, gaming, military, public event, sporting, personal and marketing applications."
"From multi-touch to motion capture to spatial operating environments, over the next 10 years, everything we know about HCI will change."
"Blur is the only conference that is exploring the line of interaction between computers and humans in a substantive, real-world and hands-on way."
"At Blur, vendors, strategists, buyers and visionaries assemble to not only discuss the larger issues of HCI, but also to lay their hands on the latest in HCI technology. Blur is the only forum for a focused, hands-on exploration of the varied technologies evolving in the HCI."
"Come play, investigate, learn and apply at Blur — where we’re changing how you interact with computers forever." -Blur
BLUR Conference Agenda (Note: I added the links to conference participants and/or their organizations. Feel free to leave a comment if you know of any corrections or better links!) Keynotes:
Neuroergonomics: How an Understanding of the Brain is Changing the Practice of Human Factors Engineering - Dr. Kay Stanney, Design Interactive
To learn more about this project, take a look at the video and related publications below. This is a great example of a team that is harnessing emerging technologies to improve the lives of people with disabilities.
I came across this demonstration of i/o Data Centers' 12-by-7-foot interactive video wall that makes playing around with views of data center modules...interesting! The display is a gesture-based "multi-touch" system. (I'll update this post when I get more information.)
Here is the description from the Datacenter YouTube channel:
"Instead of hauling a 40-foot long modular data center to a trade show, i/o Data Centers is taking a high-tech approach to customer tours of their i/o Anywhere modular data center. The i/o team has created a 12-foot by 7-foot touchscreen video wall to provide interactive tours of the company's facilities. Selecting a "hot spot" pops up a virtual data center, complete with cross sections and product info, following the concept of the touch screens in the sci-fi movie "Minority Report.""
FYI: I/O Data Centers has an application that runs on the Surface.
ICE PAD: Interactive Multitouch Ice Sculpture by Art Below Zero
Here is the information about the interactive sculpture from the Art Below Zero YouTube Channel:
"Created by David Sauer & Max Zuleta for the Lake Forest Tree Lighting Festival.This Ice Crystal Display was the 1st to be created in the USA, Transforming 300 pounds of ice into the equivalent of a giant Ipad touch screen. "People always want to touch our Ice Sculptures, This Interactive Display gave them the perfect reason to get their hands cold." said Max Zuleta owner of Art Below Zero. The public response was amazement and interest in the workings of the touch screen in ice. Our favorite guess was "It must work by sensing body heat!"..."
"...The system is known as Rear Diffused Illumination or Rear DI. It works because an Infrared light is shone from the opposite side of the ice wall through the ice. When an object such as a finger, hand, or mitten stops the infrared light it reflects the light back to a custom camera built by Peau Productions. The illuminated objects are then converted to points of interaction using an open source program Community Core Vision which outputs TUIO data streams to a Flash program for animation. We like the look and feel of the Fluid Solver flash application. The output from the computer is then projected into the ice and ice diffracts the light into something beautiful. By this method the user can manipulate a visible light screen via an invisible light that only the camera can see..."
Call for Participation Large interactive displays are now common in public urban life. Museums, libraries, public plazas, and architectural facades already take advantage of interactive technologies for visual and interactive information presentation. Researchers and practitioners from such varied disciplines as art, architecture, design, HCI, and media theory have started to explore the potential and impact of large display installations in public urban settings.
This workshop aims to provide a platform for researchers and practitioners from different disciplines such as art, architecture, design, HCI, social sciences, and media theory to exchange insights on current research questions in the area. The workshop will focus on the following topics: how to design large interactive display installations that promote engaging experiences and go beyond playful interaction, how different interaction models shape people’s experience in urban spaces, and how to evaluate their impact.
Workshop Goals & Topics The goal of this one-day CHI 2011 workshop is to cross-fertilize insights from different disciplines, to establish a more general understanding of large interactive displays in public urban contexts, and to develop an agenda for future research directions in this area. Rather than focusing on paper presentations, this workshop aims to trigger active and dynamic group discussions around the following topics:
Beyond Playful Interaction A number of studies have found that large display installations invite playful interaction but often fail to convey meaningful experiences related to content. This raises the following questions:
How can we design installations that hold people’s attention past the initial novelty effect and direct interest toward the content?
What design strategies can be applied to promote an active individual and social exploration and discussion of the presented information?
Character of Interaction A number of interaction techniques have been explored for large displays in public spaces, ranging from interaction via cell phone to direct-touch and full-body interaction. We would like to discuss:
How do different interaction methods shape people’s experience of large display installations in urban spaces?
How do interaction methods differ from each other in terms of triggering interaction and engagement with the presented content?
Evaluation Different quantitative and qualitative methods have been applied to evaluate people’s experience and use of large display installations in public spaces. During the workshop we would like to discuss:
How can we evaluate the "success" of large display installations in urban spaces?
How can particular aspects of public large display installations such as engagement be evaluated?
What kinds of evaluation methods are most effective at different stages (design phase/installation phase)?
We see this workshop as an opportunity to start thinking about a general framework that can inform the design and evaluation of large interactive displays in different urban contexts. With a diverse research community present at the workshop we hope to come up with an agenda for future research directions in this area.
Submission Details Submit a position paper (maximum 4 pages) to largedisplaysinurbanlife@gmail.com by January 14, 2011 using the CHI extended abstract format. The paper should describe experiences, works in progress, or theories around designing and/or evaluating large interactive displays in public urban settings. We plan to explore approaches and insights from different disciplines to this topic so submissions from art, architecture, design, HCI, media theory, and social science are highly encouraged. We welcome all methodological approaches and techniques centered around the topic of large interactive displays in urban life.
At least one author of each accepted position paper needs to register for the workshop and for one or more days of the CHI conference itself.
Important Dates Submission Deadline: January 14, 2011 Notification of acceptance: February 11, 2011 Workshop: May 7 or 8, 2011 (final date to be announced)
WORKSHOP ORGANIZERS
Uta Hinrichs is a PhD candidate in computational media design at the Innovations in Visualization (InnoVis) research group of the University of Calgary, Canada, under the supervision of Sheelagh Carpendale. Her research focuses on the design and study of large display interfaces to support lightweight information exploration in walk-up-and-use scenarios.
Nina Valkanova is doing her PhD at the interaction group of the Universitat Pompeu Fabra (UPF) in Barcelona, Spain, under the supervision of Ernesto Arroyo. Her research focuses on the design of urban media facades, exploring the intersections between scientific and artistic design knowledge.
Kai Kuikkaniemi is a project manager at the Helsinki Institute for Information Technology. He is currently leading a national research project focusing on public displays. His earlier research explored novel multiplayer game designs ranging from pervasive gaming to biosignal-adaptive gaming.
Giulio Jacucci is a professor in the Dept. of Computer Science at the University of Helsinki and director of the Network Society Programme at the Helsinki Institute for Information Technology. He leads several projects on interaction design and ubiquitous computing, and is co-founder of MultiTouch Ltd., a company commercializing products for multi-touch screens.
Sheelagh Carpendale is a Professor at the University of Calgary, where she holds a Canada Research Chair in Information Visualization and an NSERC/iCORE/SMART Industrial Research Chair in Interactive Technologies. She directs the Innovations in Visualization (InnoVis) research group, and her research focuses on information visualization, collaborative visualization, and large interactive displays.
Ernesto Arroyo holds an associate teaching position at the Dept. of Information and Communication Technologies of the Universitat Pompeu Fabra (UPF) in Barcelona, Spain. He earned his PhD at the MIT Media Lab in 2007.
His research at the Interactive Technologies Group focuses on interaction design, visualization, and user-centered interfaces, enabling and preserving the fluency of user engagement.
MULTI-TOUCH WITH HACKED KINECT Here is NUI Group member Florian Echtler's proof-of-concept HD video of multitouch-like interaction using a hacked Kinect camera. The application runs on Ubuntu Linux and was written using marcan42's libfreenect and Florian's own creation, libTISCH.
Florian chose picture-browsing interaction to demonstrate the proof of concept, so "everybody can focus on more interesting things :-)"
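For a rough sense of how a depth camera can fake a touch surface, here is a minimal sketch (my own illustration, not Florian's actual libfreenect/libTISCH code): calibrate the background depth of a flat surface, then treat any pixel that is a few millimetres closer to the camera than that surface as a "touch." The band values and depth maps below are hypothetical; a real version would pull 11-bit depth frames via libfreenect.

```python
# Hypothetical sketch of Kinect-as-touch-sensor, in the spirit of the
# demo above. Depth maps here are plain lists of millimetre values
# rather than real libfreenect frames.

TOUCH_BAND = (5, 30)  # mm above the surface that counts as "touching"

def touch_mask(depth, background):
    """Mark pixels where a fingertip hovers just above the calibrated surface."""
    mask = []
    for d_row, b_row in zip(depth, background):
        mask.append([
            TOUCH_BAND[0] <= (b - d) <= TOUCH_BAND[1]  # closer than the surface
            for d, b in zip(d_row, b_row)
        ])
    return mask

# Calibrated background: a flat tabletop 1000 mm from the camera
background = [[1000] * 4 for _ in range(3)]
# Current frame: a fingertip 15 mm above the table at (row 1, col 2)
depth = [row[:] for row in background]
depth[1][2] = 985

mask = touch_mask(depth, background)
print([(y, x) for y in range(3) for x in range(4) if mask[y][x]])  # -> [(1, 2)]
```

The resulting mask can then be fed into the same blob-tracking and TUIO machinery used for camera-based multi-touch tables.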
(I have SO many ideas for this! I'll throw a few out there in an upcoming post....maybe someone can run with them!)
FOR THE TECH-CURIOUS: TISCH stands for Tangible Interactive Surfaces for Collaboration between Humans, and is a cross-platform, cross-device multi-touch development framework. You can download the source package for Windows, MacOS X, and Linux from the TISCH Sourceforge website. The Ubuntu Lucid/Karmic version has "superquick installation via PPA" - the instructions can be found on the TISCH Sourceforge website.
For techies (and the tech-curious) who like technologies that support collaboration and multi-touch interaction, this is great news!
Florian Echtler announced the first stable release of libTISCH, a multi-touch development framework, which can be found on Sourceforge. TISCH stands for Tangible Interactive Surfaces for Collaboration between Humans. libTISCH, the project's C++ software framework, provides a means for creating GUIs based on multi-touch and/or tangible input devices.
Here is how it works:
Here is information from the libTISCH announcement:
Highlights of this release are, among others, the following features:
- ready-to-use multitouch widgets based on OpenGL
- reconfigurable, hardware-independent gesture recognition engine
- support for widely used, pre-defined gestures (move, scale, rotate..) as well as custom-defined gestures
- hardware drivers for FTIR, DI, Wiimote, DiamondTouch..
- TUIO converters: source and sink
- cross-platform: Linux, MacOS X, Windows (32 and 64 bit)
- cross-language: C++ with bindings for C#, Java, Python
libTISCH has a lot to offer the multitouch developer. For example, the textured widgets enable rapid development of applications for many kinds of multi-touch or tangible interfaces. The separate gesture recognition engine translates a wide range of highly configurable gestures into pre-defined or custom events, which are then acted on by the widgets. While the lower layers of libTISCH provide functionality similar to tbeta, Touché, etc. (you can interface existing TUIO-based software with libTISCH in both directions), it goes far beyond them.
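The pre-defined move/scale/rotate gestures in the feature list come down to simple geometry on pairs of touch points. Here is a minimal sketch of that math (my own illustration, not libTISCH's actual API): track two fingers across frames, and derive translation from the centroid shift, scale from the change in finger distance, and rotation from the change in the angle between the fingers.

```python
import math

# Minimal sketch of how a gesture engine derives move/scale/rotate
# from two touch points across frames (illustrative only).

def two_finger_gesture(p0, p1, q0, q1):
    """Given two fingers moving from (p0, p1) to (q0, q1), return
    (translation, scale factor, rotation in degrees)."""
    def centroid(a, b):
        return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    c_old, c_new = centroid(p0, p1), centroid(q0, q1)
    translation = (c_new[0] - c_old[0], c_new[1] - c_old[1])
    scale = dist(q0, q1) / dist(p0, p1)          # pinch / spread
    rotation = math.degrees(angle(q0, q1) - angle(p0, p1))
    return translation, scale, rotation

# Fingers spread apart horizontally: pure scale, no move or rotation
print(two_finger_gesture((100, 100), (200, 100), (50, 100), (250, 100)))
# -> ((0.0, 0.0), 2.0, 0.0)
```

A widget layer like libTISCH's would consume events such as these and apply them as transforms on the touched widget, rather than exposing raw coordinates to the application.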