Showing posts sorted by relevance for query NUI.

Jul 21, 2012

Musings about NUI, Perceptive Pixel and Microsoft, Rapid Creative Prototyping (Lots of video and links) Revised

It just might be the right time for everyone to brush up on 21st century tech skills. iPads and touch-phones are ubiquitous. Touch-enabled interactive whiteboards and displays are in schools and boardrooms. With Microsoft's Windows 8 and the news that the company recently acquired Jeff Han's company, Perceptive Pixel, I think that there will be good support - and more opportunities - for designers and developers interested in moving from GUI to NUI.


In the video below, from CES 2012, Jeff Han provides a good overview of where things are headed. We are in a post-WIMP world and there is a lot of catching up to do!

CES 2012  Perceptive Pixel and the Future of Multitouch (IEEE Spectrum YouTube Channel)



During the video clip, Jeff explains how far things have come during the past few years:
 "Five and 1/2 years ago I had to explain to everybody what multi-touch was and meant. And then, frankly, we've seen some great products from folks like Apple, and really have executed so brilliantly, that everyone really sees what a good implementation can be, and have come to expect it.  I also think though, that the explosion of NUI is less about just multi-touch, but an awareness that finally people have that you don't have to use a keyboard and mouse, you can demand something else beside that.  People are now willing to say, "Oh, this is something I can try, you know, touch is something I can try as my friendlier interface"."

Who wouldn't want to interact with a friendlier interface? Steve Ballmer doesn't curb his enthusiasm about Windows 8 and Perceptive Pixel. Jeff Han is happy with how designs created in Windows 8 scale for use on screens large and small. He explains how Windows 8 can support collaboration. The Story Board application (7:58) on the large touchscreen display looks interesting.

I continue to be frustrated by the poor usability of many web-based and desktop applications. I like my iPad, but only because so many dedicated souls have given some thought to the user experience when creating their apps. I often meet with disappointment when I encounter interactive displays when I'm out and about during the day. It is 2012, and it seems that there are a lot of application designers and developers who have never read Don Norman's The Design of Everyday Things!



I enjoy making working prototypes and demo apps, but my skill set is stuck in 2008, the last year I took a graduate-level computer course.  I was thinking about taking a class next semester, something hands-on, creative, and also practical, to move me forward. I can only do so much when I'm in the DIY mode alone in my "lab" at home.  I need to explore new tools, alongside like-minded others.  


There ARE many more tools available to designers and developers than there were just four years ago. Some of them are available online, free, or for a modest fee. I was inspired by a link posted by my former HCI professor, Celine Latulipe, to her updated webpage devoted to Rapid Prototyping tools. The resources on her website look like a good place to start for people who are interested in creating applications for the "NUI" era. (Celine has worked on many interesting projects that explore how technology can support new and creative interaction, such as Dance.Draw.) Below is her description of her updated HCI resources:

"New HCI resource to share: I have created a few pages on my web site devoted to Rapid Prototyping tools, books, and methods. These pages contain reviews of various digital tools, including 7 different desktop prototyping apps, and including 8 different iPad apps for wireframing/prototyping. I hope it's useful to others. Feel free to share... and please send me comments and suggestions if you find anything inaccurate, or if you think there is stuff that I should be adding. I will be continuing to update this resource." -http://www.celinelatulipe.com (click on the rapid prototyping link at the top)



IDEAS
Below are just a few of my ideas that I'd like to implement in some way. I can't claim ownership of these ideas; they are mash-ups of what comes to me in my dreams, usually after reading scholarly publications from the ACM or IEEE, or attending tech conferences.
  • An interactive timeline (multi-dimensional, multi-modal, multimedia) for off-the-desktop interaction, collaboration, and data/information analysis and exploration. It might be useful for medical researchers, historians, genealogists, or people who are into the "history of ideas". Big Data folks would love it, too. It would handle data from a variety of sources, including sensor networks. It would be beautiful to use.
  • A web-based system of delivering seamless interactive, multi-modal, immersive experiences, across devices, displays, and surfaces. The system would support multi-user, collaborative interaction.  The system would provide an option for tangible interaction.
  • A visual/auditory display interface that presents network activity, including potential intrusions, malfunctions, or anything that needs immediate attention that would be likely to be missed under present monitoring methods. 
  • Interactive video tools for creation, collaboration, storytelling.  (No bad remote controllers needed.)
  • A "wearable" that provides new ways for people to express and communicate creatively, through art, music, dance, with wireless capability. (It can interact with wireless sensor networks.)*
  • A public health application designed to provide information useful in understanding sepsis and supporting prevention efforts. This application would utilize the timeline concept described at the top of this list. The concept could also be useful in analyzing other medical puzzles, such as autism.
Most of these ideas could translate nicely to educational settings, and the focus on natural user interaction and multi-modal i/o aligns with the principles of Universal Design for Learning, something that is important to consider, given the number of "at-risk" learners and young people who have disabilities.

I welcome comments from readers who are working on similar projects, or who know of similar projects. I also encourage graduate students and researchers who are interested in natural user interfaces to move forward with an off-the-desktop NUI project. I hope that my efforts can play a part in helping people make the move from GUI to NUI!



Below are a few videos of some interesting projects, along with a list of a few references and links.


SMALLab (Multi-modal embodied immersive learning)


PUPPET PARADE: Interactive Kinect Puppets (CineKid 2011)



MEDIA FACADES: When Buildings Start to Twitter

HUMANAQUARIUM (CHI 2012)

 

NANOSCIENCE NRC Cambridge (Nokia's Morph project)






 
Examples: YouTube Playlists
POST WIMP EXPLORERS' CLUB
POST-WIMP EXPLORER'S CLUB II

Web Resources
Celine Latulipe's Rapid Prototyping Resources 
Creative Applications
NUI Group: Natural User Interface Group
OpenFrameworks and Interactive Multimedia: Funky Forest Installation for CineKid
SMALLab Learning
OpenExhibits: Free multi-touch + multiuser software initiative for museums, education, nonprofits, and students.
OpenSense Wiki 
CINEKID 2012 Website 
Multitouch Systems I Have Known and Loved (Bill Buxton)
Windows 8
Perceptive Pixel
Books
Natural User Interfaces in .NET  WPF 4, Surface2, and Kinect (Josh Blake, Manning Publications)
Chapter 1 pdf (Free)
Brave NUI World: Designing Natural User Interfaces for Touch and Gesture (Daniel Wigdor and Dennis Wixon)
Designing Gestural Interfaces (Dan Saffer)
Posts
Bill Snyder, ReadWrite Web, 7/20/12

I noticed some interesting tools on the Chrome web store - I plan to devote a few more posts to NUI tools in the future.

Aug 9, 2009

Surface Flight Tracker Video from fboweb labs / flightwise.com, with background music by Art of Noise for your NUI pleasure.



This flight-tracker application for the Surface looks fun to use. As I watched the video, I realized that it wasn't the application itself that I liked. It was the music that accompanied the video: a track from the 80's synth-pop band Art of Noise.

Since I'm a music lover, the music got me thinking.

Wouldn't it be great if productivity/work-related applications like Flight Tracker could be developed to provide a means for incorporating a sound-track?


Several thoughts and ideas flashed into my mind:

  • Surface and related natural user interface/interaction (NUI) applications have the potential to transform routine, ho-hum work tasks into activities that are a bit more pleasant. Since people often listen to music while they work, it stands to reason that NUI productivity applications should incorporate a music component, at least as an option.
  • To support a user-centered music platform for NUI applications, the application could incorporate a "smart" music library within the system, with the capability of effortlessly integrating music libraries and playlists from users' mobile devices, as well as the web. (Of course, there are privacy/security and firewall issues to address, but that is another story.)
  • Users could have a choice of listening to their own music playlists (including a shuffle option), selecting from a variety of presets, or going for something like the iTunes Genius effect: listening to music generated from an algorithm that takes into account music preferences and user interaction with the productivity application over time.
  • Since many Surface/NUI applications are designed to support collaborative work and interaction between two or more people, the music situation could get a bit complicated, since people have differing tastes. If co-workers disagreed about the music selection, the program would automatically default to generic elevator music, or silence.
  • NUI applications might even pave the way for a new genre of music. This concept isn't too far-fetched. Think of all the music we've come to love over the years that was composed for movies and even video games!
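To make the "algorithmic soundtrack" idea above a little more concrete, here is a minimal, entirely hypothetical sketch of how play-count history could bias track selection. The track names and weighting scheme are made up for illustration; a real system would fold in far richer signals (user interaction with the application over time, mood, collaborators' tastes).

```python
# Hypothetical sketch: pick the next track by weighting listening history,
# so frequently played tracks surface more often but new ones still appear.
import random

def next_track(play_counts, rng=random):
    """Weighted random choice over a {track: play_count} history."""
    tracks = list(play_counts)
    weights = [play_counts[t] + 1 for t in tracks]  # +1 so unplayed tracks have a chance
    return rng.choices(tracks, weights=weights, k=1)[0]

history = {"Close to the Edit": 9, "Moments in Love": 4, "Beat Box": 1}
print(next_track(history))
```

A production version might decay old play counts so the mix drifts with the user's changing tastes.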

If you know of anyone that is working on this concept, or would like to collaborate with me sometime in the future on this concept, please let me know. I'm slowly working on an interactive timeline prototype, and I have some ideas about adding a music/sound track component.

RELATED
Art of Noise - Close To The Edit (Version 1):

Apr 21, 2009

NUI-Group's Christian Moore Interview-Podcast on the Interactive Display 2009 Conference website

As I write this post, the 2009 Interactive Displays conference is underway in San Jose, California. The home page of the conference website provides a series of interviews of many of the conference participants and presenters, including Chris Moore, a member of the NUI group. Chris discusses the importance of the open-source NUI (natural user interface) community in moving interactive display and NUI-based technology forward.

Interactive Displays 2009 Conference Website

Interview Participants:
Mats Johansson - EON Reality
Guillaume Largillier - Stantum
Christian Moore- NUI Group
Brent Bushnell- Tapcode
Steven Bathiche - Microsoft
Jerry Bertrand - Microscent
Henry Kaufman- Tactable


If you are at the conference and you have some video clips to share of a presentation, demonstration, or anything else that might be of interest, please give me the link so I can share it on my blog.

Below are some pictures from the companies represented by the people who were interviewed for the Interactive Displays 2009 conference:

http://notes2self.net/mob_img/EonTouchlight.jpg
Eon Reality

http://www.tactable.com/images/homeimage1.jpg
Tactable

http://www.maximumpc.com/files/u46168/microsoft-surface.jpg
Microsoft's Surface

http://www.nuicat.com/tracker.jpg
NUI-Group

http://media.tapcode.net/videos/website/products/thumbs/hardware180.png
Tapcode


I don't usually put company logos on my blog, but if you are visiting my blog searching for links to companies that are involved with interactive display technologies, this will save you some time. (I received no payment for this gesture.)

https://www.regonline.co.uk/custImages/249955/3M-logoID.gif
https://www.regonline.co.uk/custImages/249955/logo_NUIEU.gif
https://www.regonline.co.uk/custImages/249955/Tyco-ElectronicsElo-TouchSy.png
https://www.regonline.co.uk/custImages/249955/OsramLogo.gif
https://www.regonline.co.uk/custImages/249955/Touch-International.png
https://www.regonline.co.uk/custImages/249955/Kyo2.jpg
https://www.regonline.co.uk/custImages/249955/NWID.jpg
https://www.regonline.co.uk/custImages/249955/RPO2.png








Mar 18, 2009

More for Multi-touch: NextWindow Plug-in for Natural User Interface's Snowflake Multi-touch Software -and more.



Those of you who have an HP TouchSmart, Dell Studio One PC, or NextWindow display might be interested in the new NUI plug-in that supports the NUI Suite Snowflake software. Here are the features of the plug-in, according to information from the Natural User Interface website:
  • Detailed user manual included with FAQ
  • Developed on fast and reliable C++ platform
  • Intuitive
  • Customizable
  • Gesture recognition library
  • TUIO/OSC (Open Sound Control) support (sending and receiving events)
  • Low level API
  • Hardware accelerated rendering
  • Support for wide variety of media types
  • Advanced window handler that supports scaling and rotation
  • Suitable for Windows® XP and Windows® Vista (Mac OSX and Linux can be developed on request)
  • Audio support
  • Single, dual support
  • Multi-threaded resource handler (For fast data visualization)
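The TUIO/OSC support in the list above is what lets the plug-in talk to so many languages. To illustrate the idea, here is a hypothetical sketch of how a client application might track touch points from TUIO's /tuio/2Dcur profile. Real TUIO arrives as OSC bundles over UDP (conventionally port 3333); the network/decoding layer is omitted here, and the class works on already-parsed "alive"/"set"/"fseq" messages.

```python
# Hypothetical sketch of a TUIO 2Dcur client: "alive" lists the session ids
# still on the surface, "set" updates a cursor's position, "fseq" marks the
# end of a frame. OSC transport and decoding are assumed to happen elsewhere.

class TuioCursorTracker:
    """Tracks active touch points from decoded TUIO 2Dcur messages."""

    def __init__(self):
        self.cursors = {}   # session_id -> (x, y), coordinates normalized 0..1
        self.frame = -1     # last fseq frame number seen

    def handle(self, command, *args):
        if command == "alive":
            alive = set(args)                       # ids still touching
            self.cursors = {sid: pos for sid, pos in self.cursors.items()
                            if sid in alive}        # drop lifted fingers
        elif command == "set":
            sid, x, y = args[0], args[1], args[2]   # extra params ignored
            self.cursors[sid] = (x, y)
        elif command == "fseq":
            self.frame = args[0]


tracker = TuioCursorTracker()
tracker.handle("alive", 1, 2)
tracker.handle("set", 1, 0.25, 0.75)
tracker.handle("set", 2, 0.50, 0.50)
tracker.handle("fseq", 101)
tracker.handle("alive", 2)   # finger 1 lifted; only cursor 2 remains
```

Any language with an OSC library can implement this same loop, which is why the plug-in's TUIO support opens it up to C/C++/C#, Java, Flash, Python, and friends.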

"NUI has partnered up with NextWindow™, an international leader in the development of optical multi-touch technology and the manufacturer of optical multi-touch screens, overlays and OEM touch components."

"NextWindow™'s integrated technology allows for natural and intuitive interaction of digital content on flat TFT, LCD and Plasma solutions."

"The NUI NextWindow™ plug-in can be used with any programming language that supports TUIO, i.e. C/C++/C#, Java, Flash, Python, VVVV etc, meaning that software developers can run their own applications on NextWindow™, utilizing the NUI NextWindow™ plug-in."

Comment:
I became a fan of NextWindow touch-screen displays in early 2007 when I worked on a couple of touch-screen projects in my HCI and Ubicomp classes at UNC-Charlotte.


I've been using my HP TouchSmart PC at work with students with disabilities. I've been experimenting with the NUI Suite Snowflake on my TouchSmart, and found that interacting with the Particles application delighted students with severe autism. The activities provided opportunities to establish joint attention. I also noticed an increase in the number of vocalizations and/or verbalizations among the students. Of course, this was NOT a scientific study.

RELATED
Definition of Joint Attention from UConn:

"Joint Attention is the process of sharing one’s experience of observing an object or event, by following gaze or pointing gestures. It is critical for social development, language acquisition, cognitive development…"

http://eigsti.psy.uconn.edu/jt_attn.JPG


Establishing joint attention is an important step in the development of social interaction skills among young people who have autism spectrum disorders.

More about joint attention:

Joint Attention Study Has Implication for Understanding Autism
Science Daily, 9/29/07

Asperger-Advice: Joint Attention

Autism Games: Joint Attention and Reciprocity

Why is joint attention a pivotal skill in autism?
Tony Charman
Philos Trans R Soc Lond B Biol Sci. 2003 February 28; 358(1430): 315–324.
doi: 10.1098/rstb.2002.1199.

Jul 18, 2008

Natural User Interface: Overview of multi-touch technology and application development by Harry van der Veen (Business to Buttons)


Harry van der Veen, from Natural User Interface Europe AB, was one of the keynote speakers at the Business to Buttons: Designing for Effect conference, held in June 2008.
In this presentation video, Harry discusses the past, present, and future of multi-touch technology, and reviews the importance of multi-touch over single touch displays. He also provides a good overview of gesture interaction, something that he researched when he was a student. This presentation includes several video examples of multi-touch applications in action.

The presentation is well worth the 30-minute view!


"Harry van der Veen is a Bachelor of Multimedia, derived from the Dutch education Communication, Multimedia and Design, focused on Interaction Design and Project Management. He is CEO, co-founder and co-owner of the Sweden based commercial company Natural User Interface Europe AB, which focuses on delivering standardized and customized multi-touch hardware / software solutions and services to the global market. In addition to that, he co-founded the NUIGroup community, which is the worlds largest online platform where a global network of people share their ideas and information in an open source community, focused on multi-touch hardware and software solutions."

NUIGroup Community

Harry van der Veen's blog

Natural User Interface Europe AB (Harry van der Veen's company)

NUIGroup Wiki: This wiki includes tutorials for developing multi-touch applications, building your own low-cost multi-touch table, and information about current projects that are in progress.

Related Information:


The Business to Buttons: Designing for Effect conference was held in June 2008 in Malmö, Sweden, organized by Malmö University and inUse, a user experience consultancy. Partners in this conference included Adaptive Path, a product experience strategy and design company; Patrick W. Jordan, a design, marketing, and brand strategist; the cocktail, a user experience and interaction design studio; Cooper, a product design company; and OresundIT, a non-profit network.


Don Norman, the author of books such as "The Design of Everyday Things" and "The Design of Future Things", presented at this conference. He is one of the founding fathers of Human-Computer Interaction and related fields, and is the co-founder of the Nielsen Norman Group, a consulting firm that helps companies create human-centered products.

Videos of Don Norman's Presentations:
Emotional Design: Total User Experience
Cautious Cars and Cantankerous Kitchens

Other:
Business to Buttons 2008 Recorded Sessions

Business to Buttons 2008 Downloads

My posts about the work of NUI Group members:

Multi-Touch Plug-in for NASA World Wind?!

More Multitouch: NUI Group's Christopher Jette's multi-touch work featured in Engadget; Croquet?

More Multi-Touch from members of the NUI group!

Multi-touch Crayon Physics from multitouch-barcelona, inspired by Crayon Physics by Kloonigames

Cross Post: Seth Sandler's YouTube Video, "How to Make a Cheap Multi-touch Pad" goes viral

NUI-Group Member Bridger Maxwell Receives High School Science Fair Award for Multi-Touch Screen Project

Look, touch, listen, and play: Seth Sandler's interactive Audio Touch Table video; NUI Group and Google's Summer of Code


Mar 11, 2013

Leap Motion: My Dev Kit Arrived - Now What?! Thoughts About "NUI" Child-Computer-Tech-Interaction - and More



My Leap Motion developer kit arrived last week. I carefully unboxed the small device and tried out the demo apps that came with the SDK.  I'm doing more looking than leaping at this point.

I'd like to create a simple cause-and-effect music, art and movement application for my 2-year-old grandson, knowing that he'll be turning three near the end of this year.  It would be nice if my app could provide young children with enough scaffolding to support gameplay and learning over a few years of development.

Now that I'm a grandmother, I've spent some time thinking about what the evolution of NUI will mean for young children like my grandson. Family and friends captured his first moments after birth with iPhones, and shared them across the Internet. Born into the iWorld, he knows how to use an iPad or smartphone to view his earlier digital self on YouTube, without ever touching a mouse or a physical keyboard.

The little guy is pretty creative in his method of interacting with technology, as I've informally documented on video.   He was seven months old when he first encountered my first iPad.  It was fingers-and-toes interaction from the start.  

In the first picture below,  he's playing with NodeBeat.  In the second picture, he's 27 months old, experimenting with hand and foot interaction, on a variety of apps.




















My grandson is new to motion control applications, so I'm just beginning to learn what he likes, and what he is capable of doing. A couple of weeks ago, we played River Rush, from the Kinect Adventures game. He loved jumping up and down as he tried to hit the adventure pins. Most of the time, he kept jumping right out of the raft! (I think next time we'll try Kinect Sesame Street TV or revisit Kinectimals.)


One of the steps I'm taking to prepare for my Leap Motion adventure is to take a look at what people have done with it so far. At least 12,000 developer kits have been released, so hopefully there will be some interesting apps to go along with the retail version of Leap Motion when it arrives at Best Buy on May 19th of this year.

One app I really like is Adam Somers' AirHarp, featured in the video clip below:


I also like the idea behind the following app, developed by undergraduate students:

Social Sign: Multi-User sign language gesture translator using the Leap Motion Controller (git.to/socialSign)
 
"Built at the PennApps Spring 2013 hackathon, Social Sign is a friendly tool for learning sign language! By using the Leap Motion device, the BadApples team implemented a rudimentary machine learning algorithm to track and identify American Sign Language from a user's hand gestures."

"Social Sign visualizes these hand gestures and broadcasts them in textual and visual representations to other signers in a signing room. In a standard chat room fashion, the interface permits written communication but with the benefit of enhanced learning in mind. It's all about learning a new way to communicate."-BadApples Team
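The BadApples team calls their recognizer a "rudimentary machine learning algorithm," though they don't spell out which one. A classic rudimentary approach to gesture identification is a nearest-neighbor classifier over hand-feature vectors, sketched below. Everything here is hypothetical: the feature values and sign labels are toy data, and a real ASL recognizer would use features derived from the Leap Motion's hand-tracking frames.

```python
# Hypothetical sketch of 1-nearest-neighbor gesture classification:
# label an unknown hand pose with the label of its closest training example.
import math

def distance(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(sample, training_set):
    """Return the label of the nearest (feature_vector, label) example."""
    return min(training_set, key=lambda ex: distance(sample, ex[0]))[1]

# Toy training data: (made-up hand-feature vector, sign label)
training = [
    ((0.0, 0.0, 0.0), "A"),
    ((1.0, 1.0, 1.0), "B"),
    ((0.0, 1.0, 0.5), "L"),
]
print(classify((0.1, 0.9, 0.6), training))   # closest to "L"
```

With more training examples per sign and smoothing across successive frames, even this simple scheme can get surprisingly far for a hackathon project.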



There are a few NUI-focused tech companies that have experimented with Leap Motion. Today, I received a link to the following video clip from Joanna Taccone of IntuiLab, featuring their most recent work:
Gesture recognition with Leap Motion using IntuiFace Presentation

"Preview of our work with the Leap Motion controller. In the same spirit as our support for Microsoft Kinect, we have encoded true gesture support, not just mouse emulation, for the creation of interactive applications by non-programmers. The goal is to hide complexity from designers using our product, IntuiFace Presentation (IP). Through the use of IP's trigger/action syntax, designers simply select a gesture as a trigger - Swipe Left, Swipe Right, Point, etc. - and associate that gesture with an action like "turn the page" or "rotate the carousel". As you can see in this video, it works quite well. :-) We will offer Leap support as soon as it ships." -IntuiLab
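The trigger/action syntax IntuiLab describes boils down to binding named gestures to named actions. IntuiFace's real authoring environment is graphical, so the dictionary-of-callbacks sketch below is only an illustration of the idea; the gesture and action names are borrowed from their description.

```python
# Hypothetical sketch of a trigger/action binding table: each recognized
# gesture name triggers one associated action, as in "Swipe Left -> turn
# the page". The gesture recognition itself is assumed to happen upstream.

actions_log = []

bindings = {
    "swipe_left":  lambda: actions_log.append("turn the page forward"),
    "swipe_right": lambda: actions_log.append("turn the page back"),
    "point":       lambda: actions_log.append("rotate the carousel"),
}

def on_gesture(name):
    """Dispatch a recognized gesture to its bound action, if any."""
    handler = bindings.get(name)
    if handler:
        handler()

on_gesture("swipe_left")
on_gesture("point")
on_gesture("wave")   # unbound gestures are simply ignored
```

The appeal for non-programmers is that the bindings table is all they ever touch; the messy recognition code stays hidden behind the gesture names.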



Below is a demonstration of a couple of guys playing Drop Chord, a collaboration between Leap Motion and Double Fine. From the video, you can tell that they had a blast!

Here is an excerpt from the chatter:  "The thing is that everyone just looks cool..Yeah, I know, it doesn't matter what you are doing...it's got the right amount of speed-up-slow-down stutter-y stuff...it is like a blend of art and science.."

According to the website, Drop Chord is "A music-driven score challenge game for the Leap Motion controller, coming soon for PC, Mac, & iOS from the creators of Kinect Party."

The following video is a demonstration of the use of Leap Motion to control an avatar and other interaction in Second Life:



Below are a few more videos featuring Leap Motion:


Control Your Computer With a Chopstick: Leap Motion Hands On (Mashable)


The Leap Motion Experience at SXSW 2013


LEAP Motion demo: Visualizer, Windows 8, Fruit Ninja, and More...



RELATED
Air Harp for Leap Motion, Responsive Interaction
Leap Motion and Double Fine team on Dropchord, give air guitar skills an outlet
John Fingas, Engadget, 3/7/13
Leap Motion Controller Set To Ship May 13 for Global Pre-Orders, In Best Buy Stores May 19.
Hands on With Leap Motion's Controller
Lance Ulanoff, Mashable, 3/10/13
Leap Motion website
Social Sign
IntuiLab
Leap Motion: Low Cost Gesture Control for Your Computer Display

SOMEWHAT RELATED
Kinect for Windows Academic: Kaplan Early Learning
"3 years & up. Hands-on play with a purpose -- the next generation way. This unique learning tool uses your body as the game controller making it a great opportunity to combine active play and learning all in one. Use any surface to actively engage kinesthetic, visual, and audio learners. Bundle includes the following software: Word Pop, Directions, Patterns, and Shapes."

Comment:
I've been an enthusiastic supporter of natural user interfaces and interaction for years - back in 2007 I worked on touch-screen applications for large displays as a graduate student, and became an early member of the NUI Group. I'm also a school psychologist, and from my experience, I understand how NUI-based applications and technologies, such as interactive whiteboards and touch-tablets like the iPad, can support the learning, communication, and leisure needs of students who have significant special needs. It looks like Leap Motion and similar technologies have the potential to support a wide range of applications that target special populations of all ages.

Sep 18, 2010

Interactive Tabletops and Surfaces: 2010 ACM Conference, Nov. 7-10, Saarbrücken, Germany. Wish I could go!

If you are new to this blog, you should know that I'm passionate about interactive tables and surfaces of all sizes!   Although this technology has been around for a while, it is a new concept to most people.  The researchers and practitioners involved in the upcoming 2010 Interactive Tabletops and Surfaces Conference have been an important influence in the way people think about interacting with technology, and have made significant contributions to this emerging field over the past several years.   It hasn't been an easy road, given that most of us have minds brainwashed through years of forced keyboard-and-mouse interaction and traditional WIMP (Windows, Icons, Menus, Pointers) interfaces.

I first learned about the first Interactive Tabletops conference, held in 2006, in early 2007.  At the time, I was working on projects for my HCI and Ubiquitous Computing classes, trying to learn everything I could about natural user interaction, large touch-screen displays, tabletop computing, and multi-touch.   I was inspired by the interesting work going on in this field.  This was before the first iPhone was introduced, before Microsoft's multi-touch Surface was unveiled, and three years before Apple broke out with the iPad.

Many of the people involved with the 2010 Interactive Tabletops and Surfaces Conference are (or have been) affiliated with the NUI Group. NUI stands for Natural User Interface, or Natural User Interaction; the NUI Group is "a global research community focused on the open discovery of natural user interfaces." I joined the NUI Group in 2007 when I was looking for more information about the nuts and bolts of multi-touch programming and systems, and have been encouraged to see how things have evolved since then.

Members of another group, Sparkon, are also participating in the Interactive Tabletops and Surfaces conference. Sparkon is an online community that includes people involved with interactive technologies, including tabletop and surface computing. "On sparkon, you'll find projects demonstrating the latest interactive techniques, applications, software frameworks, case studies, and blog articles relating to creative and emergent technology." (I'm also a member of Sparkon.)


Here's the information from the conference website:

ACM Interactive Tabletops and Surfaces, Saarbrücken, Germany:  7-10 November, 2010
"ITS 2010 is a premier venue for presenting research in the design and use of new and emerging tabletop and interactive surface technologies. As a young community, we embrace the growth of the discipline in a wide variety of areas, including innovations in ITS hardware, software, interaction design, and studies expanding our understanding of design considerations of ITS technologies and of their applications in modern society. ITS 2010 will bring together top researchers and practitioners who are interested in both the technical and human aspects of interactive tabletop and surface technologies. It is our hope that we will be able to achieve increased synergy of approaches between the disciplines engaged in the research in the area of interactive tabletops and surfaces, Design, HCI, UbiComp, Psychology, MobileHCI and other related fields. More directly, we intend to encourage immediate interdisciplinary collaboration on future research topics. Young scholars and Ph.D. students are especially encouraged to submit papers and participate in the doctoral colloquium."


Johannes Schöning, DFKI GmbH
Antonio Krüger, DFKI GmbH
Conference General Chairs



KEYNOTE SPEAKER:  W. Bradford Paley

"Bio: W. Bradford Paley uses computers to create visual displays with the goal of making readable, clear, and engaging expressions of complex data. He did his first computer graphics in 1973, founded Digital Image Design Incorporated in 1982, and started doing financial & statistical data visualization in 1986. He has exhibited at the Museum of Modern Art; he created TextArc.org; he is in the ARTPORT collection of the Whitney Museum of American Art; has received multiple grants and awards for both art and design, and his designs are at work every day in the hands of brokers on the floor of the New York Stock Exchange. He is an adjunct associate professor at Columbia University, and is director of Information Esthetics: a fledgling interdisciplinary group exploring the creation and interpretation of data representations that are both readable and esthetically satisfying."


SAMPLE TOPICS



  • Applications
  • Gesture-based interfaces
  • Multi-modal interfaces
  • Tangible interfaces
  • Novel interaction techniques
  • Data handling/exchange on large interactive surfaces
  • Data presentation on large interactive surfaces
  • User-interface technology
  • Computer supported collaborative systems
  • Middleware and network support
  • Augmented reality
  • Social protocols
  • Information visualizations
  • Interactive surface hardware, including sensing and input technologies with novel capabilities
  • Human-centered design & methodologies





RELATED
Previous Conferences








PLUGS
From the conference website -Links to the conference sponsors:




We appreciate the generous support of the following sponsors, without whom this conference would not be possible. Click on the logos to learn more about our generous supporters, and let us know if you are interested in becoming a sponsor.

Champions:

 

Benefactors:

  

Donors:

Contributors:

Academic Sponsors: