Showing posts with label collaboration. Show all posts

Aug 28, 2012

Mobile Design Best Practices: Joshua Mauldin's UX Charlotte Talk - photos, sketchnote, and slides


Joshua Mauldin, founder of Invisible Interface & Awesome Gifs, and app creator/manager for The Business Journals, gave a talk about best practices in touchable mobile design at a recent Charlotte User Experience meet-up. I especially liked his use of clips from Arrested Development to remind us of the wide range of emotional responses people get when they use technology, good or bad.  (Take a moment and look over the slides for the meat of Joshua's talk.)



(Photo by Brad E.)

Slide-share of Joshua Mauldin's Mobile Design Best Practices talk:

Mobile design best practices from Joshua Mauldin

Below is a sketchnote of the talk, drawn by Antoine RJ Wright using Adobe Ideas on his iPad:







(Photo by Bermon Painter)

The event was hosted by MyJive, located at the NC Music Factory. I was fortunate to grab a comfy spot to sit; there were about 85 people in attendance. At the end of the meetup, I won a ticket to Web Afternoon!

Oct 20, 2011

T(ether): Next-Gen Data Gloves + iPad - just what I need for the colder weather! (quick post)



"T(ether) is a novel spatial aware display that supports intuitive interaction with volumetric data. The display acts as a window affording users a perspective view of three-dimensional data through tracking of head position and orientation. T(ether) creates a 1:1 mapping between real and virtual coordinate space allowing immersive exploration of the joint domain. Our system creates a shared workspace in which co-located or remote users can collaborate in both the real and virtual worlds. The system allows input through capacitive touch on the display and a motion-tracked glove. When placed behind the display, the user’s hand extends into the virtual world, enabling the user to interact with objects directly." -Vimeo
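The 1:1 mapping the description mentions can be sketched in a few lines. This is only an illustrative sketch of the idea (the function names, calibration values, and coordinates are mine, not T(ether)'s actual code): with no scaling between real and virtual space, calibration reduces to a translation between the tracker's origin and the scene's origin.

```python
# Hypothetical sketch (not T(ether)'s actual code) of a 1:1 mapping
# between tracker coordinates and virtual-scene coordinates, so a hand
# tracked behind the display lands at the same place in the scene.

def make_mapping(tracker_origin, scene_origin):
    """Return a function mapping tracker-space points into scene space.

    With a 1:1 (unscaled) mapping, only a translation is needed:
    scene = point - tracker_origin + scene_origin.
    """
    def to_scene(point):
        return tuple(p - t + s for p, t, s in
                     zip(point, tracker_origin, scene_origin))
    return to_scene

# Calibrate once: this tracker position corresponds to the scene's origin.
to_scene = make_mapping(tracker_origin=(1.0, 2.0, 0.0),
                        scene_origin=(0.0, 0.0, 0.0))

hand_in_tracker = (1.5, 2.5, -0.2)   # meters, from the motion-tracked glove
hand_in_scene = to_scene(hand_in_tracker)
print(hand_in_scene)  # (0.5, 0.5, -0.2)
```

Head tracking then renders the scene from the viewer's eye position in the same shared coordinate space, which is what makes the display feel like a window.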

For more pictures and information, see the following post on the Creative Applications Network website: 
T(ether) [Cinder]
Filip Visnjic



Jul 14, 2011

News from the HCI lab at UNC-Charlotte - Creative Interactions (Videos)

In the video below, Nathan Nifong discusses and demonstrates his independent study work through the HCI (Human-Computer Interaction) Lab at UNC-Charlotte. He focused his work on creating a motion-responsive musical instrument that integrates movement, sound, and graphics.


"We're trying to research anything and everything that will help to make computers easier to use." -Nathan Nifong


Thanks to Celine Latulipe for the link!


RELATED
To get a better idea about the HCI lab, a small, yet sparkling gem on the UNC-Charlotte campus, take a look at the following video:




"I get to create things, and to me, you just can't beat that!"- Celine Latulipe


Dr. Latulipe is known for her work with the Dance.Draw project, Interactive Surveillance, and tools that support things like bimanual interaction, collaboration, and creativity.

Dec 14, 2010

Short documentary of the story behind the Reactable, a tangible user interface for creating music. (Includes an interview of Joel Bonasera, of Charlotte's Discovery Place museum.)

The following video provides a look into the history of the Reactable, from the initial paper prototypes to the present, including the Reactable Mobile application designed for the iPad, iPhone, and iPod touch. The video includes interviews with Sergi Jorda and Gunter Geiger, members of the original team at Pompeu Fabra University (Barcelona) that created the Reactable. The other team members are Martin Kaltenbrunner and Marcos Alonso.




FYI:  At about 2:34 in the video, Joel Bonasera briefly discusses the Reactable installation at Charlotte's Discovery Place museum. Joel is a project manager at Discovery Place.


RELATED
How the Reactable Works
John Fuller, howstuffworks
Music Technology Group, Pompeu Fabra University
Reactable Website
Reactable Concepts
Reactable History
Discovery Place
Interactive Technology in the Carolinas: Discovery Place Science Center
(Includes a short video clip I took of the Reactable at Discovery Place)

Mar 11, 2010

Social Thinking blog - Great post from Michelle Winner's Blog

We're planning an application for a multi-touch SMART Table at one of my schools, and one of the goals is to ensure that it can facilitate social interaction among students with and without disabilities, including autism. With this in mind, one of my colleagues just sent me a link to a great post on Michelle Winner's blog about social thinking and "inclusion" vs. "integration" in education settings. It is worth a read!

http://www.socialthinking.com/michelles-blog/social-skills-for-an-integrated-setting.html

Feb 26, 2010

Web-based Technology-Supported Interaction: 6Rounds mixes webcam, games, and social media....

I've been so busy writing reports* that this almost passed me by!

I found out about 6rounds because they use Twitter as a promotional platform.  I happened to notice that this company was following me and clicked on the link.

6rounds started out as an outgrowth of a speed dating website, and the application was initially designed for people to use while waiting for speed dating sessions. According to the 6rounds website FAQ, "6rounds is a live meeting point, offering users a variety of experiences that they enjoy together using a combination of webcams, real-time games, social activities and media engagements."

Since I'm a happily married middle-aged woman, I'm not sure 6rounds is up my alley.  I think  social singles, college students, and others who don't mind flashing their faces through a webcam would like it.

If I had time, I might like to play around with GixOO, the open-source platform that underpins 6rounds. GixOO lets developers build their own games and activities. The application allows users to track each other as they move their mice, and also enables people to see the same things as their friends as they interact online.
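The shared-cursor idea could look something like this in miniature. This is a hypothetical sketch; the class and method names are illustrative, not GixOO's real API:

```python
# Hypothetical sketch of a shared-cursor room (illustrative names, not
# GixOO's actual API): each user's mouse position is broadcast to the
# room, so every member renders the same set of cursors.

class SharedRoom:
    def __init__(self):
        self.cursors = {}   # user -> (x, y): the view every member shares

    def move(self, user, x, y):
        """Record a user's cursor move and return the shared snapshot."""
        self.cursors[user] = (x, y)
        return dict(self.cursors)   # what each client would draw next frame

room = SharedRoom()
room.move("ana", 10, 20)
print(room.move("ben", 5, 5))  # {'ana': (10, 20), 'ben': (5, 5)}
```

A real implementation would push these snapshots over the network, but the core "everyone sees the same state" idea is just shared, synchronized cursor data.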

6rounds looks like it might provide possibilities for collaborative projects in education, but I won't be sure until I give it a try.

So what is 6rounds?



FOR THE TECH-CURIOUS

The following information is quoted from the Openomics blog from Sun Microsystems' ISV Engineering:


"6rounds is the first product built on the GixOO live social platform, initially developed on the LAMP stack. As a member of the Sun Startup Essentials program, GixOO connected with Sun's ISV Engineering team to test the scalability of their platform on SAMP --the Solaris-based AMP stack, available in an integrated and optimized package from Sun, the Sun GlassFish Web Stack f.k.a. CoolStack. At the time, we ran the benchmark on a Sun SPARC Enterprise T5120 server --featuring the 64-way CoolThreads processor UltraSPARC T2-- running Solaris 10 and CoolStack 1.3. GixOO loved the DTrace kernel instrumentation of Solaris 10 --DTrace gives unique insights into how the application performs, live on a production system-- and the Containers technology a.k.a. Zones --this lightweight virtualization layer of Solaris allows multiple applications to run in isolation from each other on the same physical hardware-- and quickly adopted them for their internal use.

"At GixOO, we use a Sun SPARC-based server, powered by Solaris 10, for our R&D environment. The system gives us the required flexibility and component isolation that we need. Thanks to SPARC's great SMP abilities, we achieve high performance for many development environments running on one single 1U server. Solaris Zones are very comfortable and simple to configure, and allow full utilization of the great power hidden in this small machine, which makes Solaris 10 an excellent choice for system administrators. We are using Sun MySQL Server, which gives our application a high-speed data storage solution, and in the future we might migrate to the MySQL Cluster solution to get even faster results."
-Dmitry Shestak, CTO, GixOO
Somewhat Related

2/26/10: Oracle bought Sun in 2009. Here were the latest results when I did a search to get more information:

Not Really Related 


*For those new to this blog, I'm a school psychologist who returned to her day job full time a year and a half ago, when the economy was taking a nosedive. Before that, I was working part-time and taking computer and technology classes, initially to learn how to create interactive multimedia applications and games.


Since some of the kids and teens I work with have a range of abilities and disabilities, including autism, I developed an interest in accessibility.  How can universal design principles  be applied to games and emerging interactive technologies?   I'm also fascinated by interactive displays and surfaces of all sizes, especially ubiquitous systems that support cognition, collaboration and communication.  

One of my pet projects:

My vision? A collaborative multimedia, multi-modal interactive time-line might help us to understand  complex, interrelated factors and events more effectively. It would provide an opportunity for the inquisitive to view things from a broad perspective, and also explore things in rich detail. Ideally, the time-line would support multi-touch, multi-user interaction on larger displays and interactive whiteboards, and allow for people who are remotely located to participate in the process. 


Now that one of my schools will be getting a multi-touch SMART Table, I'd like to experiment with time-line concepts and interactions on a table surface. I'd also like to figure out how this can work seamlessly with the existing SMART Board in the classroom. Of course, this would have to take place after work hours!

Jan 22, 2010

Classroom of the Future: Orchestrating Collaborative Spaces (Interesting book!)


I'd like to share a link to a new book that looks worth reading:  "Classroom of the Future:  Orchestrating Collaborative Spaces".


I follow the blogs of two of the book's authors. Stefano Baraldi is a researcher and consultant in the field of HCI, and Albrecht Schmidt is a professor and chair of Pervasive Computing and User Interface Engineering at the University of Duisburg-Essen in Germany.



"The future of the classroom is an issue that essentially concerns many of us as students, parents, taxpayers, policymakers, teachers, design professionals, or researchers. A glance at the history of pedagogical practice reveals, however, that despite rapid developments in the outside world, classrooms have evolved very little over the years. While our understanding of learning and favourable learning environments has substantially improved and technological innovations are offering a variety of new possibilities, it still seems that most of today’s schools and universities remain more or less unaffected by these developments.

This book brings together the perspectives of researchers, architects, technical designers, and teachers on emerging theoretical and technological developments pertaining to the classroom of the future.

Innovative ideas are offered on how new technologies and learning approaches can be integrated into schools. It challenges us to think of learning spaces in a new way. Classroom of the Future is of interest to researchers and students, designers and educators across various disciplines including education, cognitive, social and educational psychology, didactics, computer science and design as well as to parents and policymakers." (Publisher's description)
Free preview of the book (1st chapter)






For your convenience, the book's table of contents:




1. The Classroom of the future - an introduction
Kati Mäkitalo-Siegl, Jan Zottmann, Frederic Kaplan and Frank Fischer

Part I: Learning Spaces Shaped by Instructional, Classroom, and School Building Design
2. The classroom of the past
Andreas Schratzenstaller
3. Spaces for learning - schools for the future?
Joanna Sutherland and Rosamund Sutherland

Part II: Facilitating Learning Using Technology-Enhanced Objects and Furniture
4. Taking teaching beyond the classroom: Pervasive computing technologies for the classroom of the future
Paul Holleis, Albrecht Schmidt, Heiko Drewes, Richard Atterer and Petra Dollinger
5. Making the classroom a play-ground for knowledge
Stefano Baraldi
6. Computer support for children's collaborative story-making in the classroom
Giulia Gelmini Hornsby
7. Scriptable classrooms
Frederic Kaplan and Pierre Dillenbourg

Part III: Knowledge Building in Virtual and Physical Learning Spaces
8. Knowledge communities in the classroom of the future
Nicolae Nistor
9. Digital cultural heritage to support novel activities in the classroom of the future
Lily Díaz and Lotta Partanen

Part IV: Joint Efforts for Designing and Implementing Future Spaces of Learning
10. Designing tomorrow's classroom with today's teachers
Andreas Lingnau
11. Evolving the classrooms of the future: The interplay of pedagogy, technology and community
Jim Slotta

Part V: Discussion
12. Turning the classroom of the future into the classroom of the present
Ingo Kollar

List of contributors
Author index
Subject index

Stefano Baraldi's blog:  On the Tabletop
Albrecht Schmidt's blog:  User Interface Engineering


Jan 14, 2010

Shared computing with Windows MultiPoint in classrooms: Why not use Mouse Mischief (beta version)?

I came across this post on Long Zheng's I Started Something blog:

Windows MultiPoint Server -- a multiseat computing solution worthy for the home?
Long Zheng points out that Windows MultiPoint Server is an outgrowth of the Multi-Mouse project, in which multiple students can work together to interact with content on a PC screen or a projected display.

The picture below shows how a Windows MultiPoint server can work in a classroom.

-Microsoft

I'm not so sure I like the setup of the MultiPoint 2010 system shown in the picture above. The students all have huge monitors in front of them, so the opportunities for shared or collaborative interaction are limited. I like the multi-mice concept better, since the children can really be together.

Mouse Mischief
Neema Moraveji, of the Stanford University HCI Group, has videos and information about the multiple-mice work on his project page:

Teachers provide content using an add-on for PowerPoint that allows for simultaneous input from multiple mice.  The teacher can set up limits regarding how the mice are used by  the students.


I tried this with a few students during the 2008-09 school year, and they liked it.  Since I serve more schools this current year, I haven't had the opportunity to explore this further. I plan to download a newer version and try it out soon.

Good news!
The free beta version of Microsoft Mouse Mischief was recently released on the Microsoft website: Microsoft Mouse Mischief: Make your PowerPoint presentations interactive

Below is information about Mouse Mischief from the Microsoft website:

"Mouse Mischief is a tool that Microsoft makes available free of charge, and that allows teachers to work with Microsoft Office PowerPoint to make interactive presentations. With Mouse Mischief, teachers can add multiple choice questions to their presentations, and large groups of students can answer the questions using mice connected to the teacher’s PC."
"Mouse Mischief not only gives students the ability to engage, have fun, and learn in new, interactive ways, but it also provides teachers with a more affordable alternative to purchasing expensive student response systems, commonly known as clickers, by letting students use affordable wired or wireless USB mice that their school already own."
"It’s simple. After Mouse Mischief is installed, the Mouse Mischief toolbar will appear as part of the PowerPoint ribbon when a new or old PowerPoint presentation is opened. This intuitive Mouse Mischief toolbar lets teachers add interactive elements such as multiple-choice question slides with a single click. When the teacher opens a Mouse Mischief enabled presentation, students in the classroom can answer each question by clicking it with their uniquely designed mouse cursor. Once the students have selected their answers, the teacher can display the correct answer...The best part? Mouse Mischief gives teachers the option to have their students answer questions individually or as part of a team, in order to encourage both competition and collaboration in the classroom...Special teacher controls allow the teacher to disable student’s mouse cursors, navigate between slides, set timers, and more. With Mouse Mischief the teacher is always in control, whether there are two or 25 cursors on the screen."

If you are interested in developing applications for Mouse Mischief, you can download the Windows MultiPoint Software Development Kit 1.5. This kit allows developers to enable up to 25 mouse devices to work at the same time on one computer. It was released on 1/12/2010 and can be downloaded from the Microsoft website.

Here's a plug from Microsoft about the benefits of the MultiPoint Mouse SDK:
"Applications built on the MultiPoint Mouse SDK can provide teachers with tools to gain real-time assessment information to help them provide a personalized learning experience for each of their students...Applications built on the MultiPoint Mouse SDK can increase student learning comprehension through interactive methods. MultiPoint Mouse applications can further a student's engagement, collaboration, interaction and overall cognitive and social skills within a classroom or lab environment."

Here is the information about the MultiPoint SDK:
"The Windows MultiPoint Mouse SDK version 1.5 is a development framework that allows developers to build applications that enable up to 25 individual mouse devices to work simultaneously on one computer. As a developer, you can use the MultiPoint Mouse SDK to create educational applications that take advantage of collaborative learning methodologies. In schools with minimum infrastructure, MultiPoint Mouse greatly enhances the shared computing experience. Initial pilot programs conducted in India by Microsoft Research show that for certain subjects, MultiPoint Mouse can enhance learning when compared to a 1:1 computing scenario."

"MultiPoint Mouse should not be confused with applications that allow multiple people to control multiple mouse devices to perform standard operations. In those cases, the system traditionally cannot identify which mouse has made which changes, and there is normally no option for controlling the permissions of the various devices. MultiPoint Mouse is a development framework that enables developers to build applications to take advantage of multiple mouse devices, including the ability to handle mouse clicks from different users independently and to assign different permissions to each mouse. For example, the mouse belonging to a teacher in a learning application might need additional permissions to control the activity."
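The per-device dispatch described in the quote, where each mouse's clicks are handled independently and different mice can carry different permissions, might be sketched like this (the names are illustrative only, not the actual MultiPoint Mouse SDK API, which is a .NET framework):

```python
# Hypothetical sketch of the per-device idea behind MultiPoint Mouse
# (illustrative names, not the SDK's real API): each click event carries
# a device id, and a permissions table decides what that device may do.

class MultiMouseDispatcher:
    def __init__(self):
        self.permissions = {}   # device_id -> set of allowed actions
        self.log = []           # (device_id, action) pairs that succeeded

    def register(self, device_id, allowed):
        self.permissions[device_id] = set(allowed)

    def handle_click(self, device_id, action):
        """Handle a click independently per device, honoring permissions."""
        if action in self.permissions.get(device_id, set()):
            self.log.append((device_id, action))
            return True
        return False   # e.g. a student mouse trying a teacher-only action

dispatcher = MultiMouseDispatcher()
dispatcher.register("teacher", {"answer", "next_slide"})
dispatcher.register("student_1", {"answer"})

print(dispatcher.handle_click("student_1", "answer"))      # True
print(dispatcher.handle_click("student_1", "next_slide"))  # False
```

The key point, matching the quote, is that the system knows which mouse made which change, which is exactly what plain multi-mouse input lacks.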



The MultiPoint SDK requires Windows 7, Windows Vista Service Pack 2, or Windows XP Service Pack 3; the .NET Framework version 3.5 SP1 or higher; Microsoft Expression Blend (the trial version works); Visual Studio 2008 or 2010 (the free Express version works); 2-4 mice for testing; and available USB ports on the computer.




Other thoughts:
Schools with money for advanced technology tools have purchased SMART Tables, and a few have Microsoft Surface tables. They are expensive and don't offer a range of form factors to choose from.

I sort of like the concept behind the multi-user poker table that was in the casino on my cruise ship:

Near the poker table is a display that shows the action from the poker game. In classroom settings, this display could be an interactive whiteboard, a projected display, or even a flat-panel screen.


There is a need for tables of different shapes in the schools. Speech pathologists, school psychologists, counselors, and others who provide guided group activities in the schools could use a multi-user table that follows this tried and true configuration:
http://www.qualitychurchfurniture.com/images/activitytablecolors/lrg_48KID72LOset.jpg
I'd love to hear from anyone who is using MultiPoint or Mouse Mischief, and also from anyone who is experimenting with various multi-touch table form factors.

Related:
Multiple Mice for Computers in Education in Developing Countries (pdf)

Sep 11, 2009

Voyagers and Voyeurs: Supporting Collaborative Information Visualization. A little dry, but worth watching.

I'd like to see how these concepts play out on a multi-touch, multi-user table system!

In this video, Jeffrey Heer, from the Computer Science Department at Stanford University, presents his work in the field of interactive information visualization, focusing on visualization techniques, visualization tools, and collaborative visualization.

The video is nearly one hour long, but worth it if you are a techie who is interested in information visualization.



Another option - scan through the following presentation:
Voyagers and Voyeurs: Supporting Social Data Analysis

Infoblog

Aug 1, 2009

Telepresence @ InfoComm 2009: Howard Lichtman's review

Howard Lichtman, of the Human Productivity Lab, wrote an interesting post reviewing various Telepresence companies featured at InfoComm 2009:

Telepresence at InfoComm 2009: A Review


According to Lichtman, "telepresence is the science and art of creating visual collaboration environments, networks, and strategies that duplicate in-person meeting experiences as completely as possible in both internal and external business communications. Effectively leveraging telepresence as an organizational and collaborative strategy can improve productivity and effectiveness by enhancing business communication, collaboration, and reducing physical travel."

(I'm a bit short on time today, so for more information regarding telepresence, take a look at the various links I've posted on this page.)

Here are a few items from Lichtman's post:

Musion Announces First Live Transatlantic Interactive 3D Hologram Public Broadcast


DVE Huddle Room with REALroom 3D (TM) option (augmented Reality/holographic appearance) -from http://www.devetelepresence.com/


Howard Lichtman's YouTube Channel

May 31, 2009

RENCI at UNC-Chapel Hill: Serious Gaming and Simulation, Unity 3D Platform

VIDEO
Running the Unity 3D Game Engine in a Global Immersion 4-Projector Dome System at the RENCI@UNC Engagement Center



For more information:
Serious Gaming and Simulation at RENCI@UNC

"The application of game technology to teaching, training, and research has been a topic of interest at RENCI@UNC. More specifically, the adaptation of game engines and 3D environments to the specialized visualization environments at the engagement center is a promising area for experimentation and collaboration...We very recently obtained copies of Unity, and have been working through some of the basics. One of the first activities was to create specialized code to show Unity-built 3D environments in our Global Immersion dome system. This was successfully accomplished with a good bit of code-hacking to render the proper camera views and viewports for the four channel dome system. There are some known issues with the Windows stand-alone viewer that we were able to work around. We did some imaginative manipulation of the Unity Plug-in using a browser embedded within a WPF application that allows arbitrary screen resolutions, as well as spanning across multiple displays. Look for a post later with some technical details."
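The "camera views and viewports for the four channel dome system" step can be illustrated with a toy calculation. The 2x2 split below is my assumption for illustration; RENCI's actual dome mapping is certainly more involved, since dome projection also requires warping each channel for the curved surface:

```python
# Hypothetical sketch (not RENCI's code) of splitting one render target
# into four viewports, one per projection channel, in a 2x2 layout.

def four_channel_viewports(width, height):
    """Return (x, y, w, h) viewport rectangles for a 2x2 channel layout."""
    w, h = width // 2, height // 2
    return [(x, y, w, h) for y in (0, h) for x in (0, w)]

print(four_channel_viewports(1600, 1200))
# [(0, 0, 800, 600), (800, 0, 800, 600), (0, 600, 800, 600), (800, 600, 800, 600)]
```

In an engine like Unity, each rectangle would be assigned to its own camera, with the four cameras oriented to cover the dome's field of view.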
FYI

Here are some pictures from RENCI:

http://unc.renci.org/wordpress/wp-content/themes/renci2.0/images/sections/resources/visualization-resources-b.jpg

http://unc.renci.org/wordpress/wp-content/uploads/2009/03/dome-001-300x225.jpg

http://farm4.static.flickr.com/3085/3177205942_79d56b3481.jpg

Apr 25, 2009

How soon will we see interactive information visualization for multi-touch & gesture systems?

The field of information visualization is growing. Until recently, most visualizations were created for use on a single PC or larger screen, allowing for interaction by only one user at a time. I have a feeling that this will be changing in the very near future.

Why? Interactive duo- and multi-touch interfaces are becoming more common and now come in all screen sizes, from the iPhone to the Surface to CNN's multi-touch "Magic Wall". People of all ages play interactive games on the Wii, often on large flat-panel displays. In my opinion, the time is right for those developing applications for the InfoViz world to think about harnessing the power of multi-touch.


Below is a picture of the front page of the Visual Complexity website. If you go to the site, you can select a visualization, and then explore it more in detail, as each picture links to a web-page that provides background information about the visualization project, the artist or team behind the project, and links to the project's website.

I took a look at a variety of the examples posted on the Visual Complexity website, and think many would be enhanced by a transformation to a multi-touch, gesture, bi-manual, or duo/multi-user system. I'm interested in learning what others think about this. If you are working on a collaborative information visualization project, feel free to add a comment and post a link.

Here is a nice quote from the website:

"Functional visualizations are more than innovative statistical analyses and computational algorithms. They must make sense to the user and require a visual language system that uses colour, shape, line, hierarchy and composition to communicate clearly and appropriately, much like the alphabetic and character-based languages used worldwide between humans."

Matt Woolman
Digital Information Graphics




Update: I did a search for "multi-touch" on the Visual Complexity website and found a couple of interactive applications:

Reactable (I've posted about this system a few times!)
(Reactable website)

Prototouch

(Wirmachenbunt Website)

Mar 6, 2009

Interaction with the Web of Things; LIFT '09

I recently came across the Web of Things blog and found it to be a gem. Vlad Trifa and Dominique Guinard are the brains behind the blog. They have lots of interesting ideas that are at the intersection of WSN (Wireless Sensor Networks) and HCI (Human Computer Interaction).

Vlad and Dominique recently presented at LIFT '09. If you haven't heard of Lift, take some time and visit the website:

"LIFT is a series of events to inspire and connect the community of doers and thinkers exploring the social impact of new technologies. Each LIFT conference is a three days experience made of talks, workshops, interactive art and discussions to understand and anticipate the most important social changes, and meet the people behind them."


Everything on the Web of Things blog inspired me to write two posts, back to back, on my Technology Supported Human World Interaction blog:

More cool things from the Web of Things blog


The Web of Things, Wireless Sensor Networks, Embedded Systems, and (Everyware) Health Care
(The above post was inspired by my experience at the Cleveland Clinic, spending time every day with my father, who has been in the cardio ICU since his surgery a few days ago.)

Nov 13, 2008

RENCI at Duke University: Multi-Touch Collaborative Wall and Table utilizing TouchLib; More about UNC-C's Viz lab...

RENCI is a multi-disciplinary collaboration between several universities in North Carolina, with centers located at the Europa Center, Duke University, N.C. State, UNC Chapel Hill, East Carolina University, UNC-Asheville, UNC-Charlotte, and the Health Sciences Library at UNC-Chapel Hill. Many of the centers focus on visualization and collaborative technologies, and have been involved in multi-touch "surface" computing.

The pictures below are from the RENCI center at Duke University:

http://vis.renci.org/multitouch/wp-content/themes/daleri-dark-10/img/front.jpg

Duke Multi-Touch Collaborative Wall

The multi-touch wall is 13 x 5 feet and utilizes six high-definition projectors, resulting in a combined resolution of 5760 x 2160, and supports multiple users. According to information on the RENCI website, the design is scalable and applicable to non-flat surfaces. The wall system runs on Windows and Linux.
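As a quick sanity check, the quoted 5760 x 2160 combined resolution is consistent with a 3-wide by 2-high grid of 1920 x 1080 projectors (the grid layout is my assumption; RENCI only states that six projectors are used):

```python
# My assumed layout: six 1080p projectors tiled 3 across and 2 down.
cols, rows = 3, 2
proj_w, proj_h = 1920, 1080

print(cols * proj_w, rows * proj_h)  # 5760 2160
```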

Duke Multitouch Wall. (Photo credit: Josh Coyle)


The Wall is positioned at the end of the primary collaboration space. (Photo credit: Josh Coyle)


DI (Direct Illumination) is used for touch detection in both the wall and the table. A separate instance of Touchlib runs for each of the eight cameras used to detect touch, and each camera is handled separately for image processing and blob tracking. A gesture engine then interprets the information about touches on the screen as gesture events.
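A toy version of the multi-camera merge described above might look like this. This is hypothetical code, not RENCI's (Touchlib itself is a C++ library): each camera tracks blobs in its own image coordinates, and a merging step offsets them into a single wall-wide coordinate space before the gesture engine sees them.

```python
# Hypothetical sketch of combining per-camera blob tracking into one
# wall-space touch list (illustrative, not RENCI's actual pipeline).

def merge_blobs(per_camera_blobs, camera_offsets):
    """Combine per-camera blob lists into wall-space touch points.

    per_camera_blobs: {camera_id: [(x, y), ...]} in camera pixels
    camera_offsets:   {camera_id: (ox, oy)} origin of each camera's region
    """
    touches = []
    for cam, blobs in per_camera_blobs.items():
        ox, oy = camera_offsets[cam]
        touches.extend((x + ox, y + oy) for x, y in blobs)
    return touches

# Two of the eight cameras, each covering a 720-pixel-wide strip of the wall.
blobs = {0: [(100, 50)], 1: [(10, 60)]}
offsets = {0: (0, 0), 1: (720, 0)}
print(merge_blobs(blobs, offsets))  # [(100, 50), (730, 60)]
```

A real system also has to calibrate each camera (hence the calibration device pictured below) and de-duplicate blobs that straddle two cameras' regions, but the core idea is stitching local detections into one shared coordinate space.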

Direct Illumination (DI)

Graphics from the RENCI Vis Group Multi-Touch Blog

The Duke Multi-Touch Wall System

Here is a cool picture of the "Multi-touch Calibration Device", which uses a built-in TouchLib utility.

Calibrating using the utility built into TouchLib.

Additional information can be found on the RENCI Vis Group Multi-Touch Blog.

FYI

Touchlib is a multi-touch development kit that can be found on the NUI-Group website.

"Touchlib is a library for creating multi-touch interaction surfaces. It handles tracking blobs of infrared light, and sends your programs these multi-touch events, such as 'finger down', 'finger moved', and 'finger released'. It includes a configuration app and a few demos to get you started, and will interface with most types of webcams and video capture devices. It currently works only under Windows but efforts are being made to port it to other platforms."

If you are interested in creating your own multi-touch table, the NUI-Group website and forums are a great place to start.

Related:

If you follow my blog, you probably know that I've taken several graduate courses at UNC-Charlotte. Some of my professors and a classmate or two have been involved in some exciting visualization research over the past year. (If you are serious about multi-touch and other visually-based applications, it is worth taking some time to familiarize yourself with visualization and interaction research.)

News from the UNC-Charlotte Vis Center:

At the University of North Carolina at Charlotte, RENCI is a collaboration between the UNC Charlotte Urban Institute, the Center for Applied Geographic Information Science, and the Charlotte Visualization Center.

11/06/2008
Robert Kosara's group wins two awards at IEEE VisWeek

Caroline Ziemkiewicz and Robert Kosara won Honorable Mention (the second-highest award) at the IEEE InfoVis Conference for their paper, "The Shaping of Information by Visual Metaphors". Also, Alex Godwin, Kosara's student, won Best Poster for his submission, "Visual Data Mining of Unevenly-Spaced Event Sequences".

The Vis Center is pretty fascinating, as you can see by the group of visitors at an open house.

If you are just as fascinated by this stuff as the guys in the picture, here are links to some recent papers by UNC-Charlotte faculty affiliated with the Vis Center:

The Shaping of Information by Visual Metaphors (Caroline Ziemkiewicz and Robert Kosara)

Evaluating the Relationship Between User Interaction and Financial Visual Analysis (Dong Hyun Jeong, Wenwen Dou, Felesia Stukes, William Ribarsky, Heather Richter Lipford, Remco Chang)

Visual Analytics for Complex Concepts Using a Human Cognition Model (Tera Marie Green, William Ribarsky, and Brian Fisher)

Aug 16, 2008

Microsoft Research project: MouseMischief - Multi-user, Multi-Mice Interaction on Large Displays

This is an interesting demonstration of the use of multiple mice, controlled by children on an interactive whiteboard. The collaborative application uses Microsoft's Multi-Point technology. For more information and free downloads, go to MouseMischief.org.
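The core of the multi-mouse idea is simple: one shared display, with one independent cursor per connected mouse. The sketch below illustrates that idea in Python; the names, screen size, and device IDs are my own illustrations, not Microsoft's actual MultiPoint API:

```python
# Sketch of the multi-mouse idea behind MultiPoint: one shared display,
# one independent cursor per connected mouse. Names and numbers are
# illustrative only, not Microsoft's actual API.

SCREEN_W, SCREEN_H = 1024, 768

class MultiMouseState:
    def __init__(self):
        self.cursors = {}  # device id -> [x, y]

    def connect(self, device_id):
        # Start each newly connected mouse at the center of the display.
        self.cursors[device_id] = [SCREEN_W // 2, SCREEN_H // 2]

    def move(self, device_id, dx, dy):
        x, y = self.cursors[device_id]
        # Clamp so every child's cursor stays on the shared screen.
        self.cursors[device_id] = [
            max(0, min(SCREEN_W - 1, x + dx)),
            max(0, min(SCREEN_H - 1, y + dy)),
        ]


state = MultiMouseState()
state.connect("mouse-A")
state.connect("mouse-B")
state.move("mouse-A", -600, 0)   # runs off the left edge, gets clamped
state.move("mouse-B", 100, 50)
print(state.cursors["mouse-A"])  # [0, 384]
print(state.cursors["mouse-B"])  # [612, 434]
```

Everything else in an application like MouseMischief (hit-testing buttons, per-child scoring) builds on this per-device cursor state.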


Aug 5, 2008

Mozilla's Concept Series: A call for collaborative participation. Demos look like they would work well on a touch screen...

Mozilla Labs has issued a call for participation in the development of next-generation web interaction design. "Be bold. Be radical. The crazier, the better. Let’s explore the future together."
-Mozilla Labs

The following videos give a good overview of the innovations initially created for this new endeavor:

Aurora, created by Adaptive Path, is an adaptive browser interaction concept, incorporating radial and wheel menus, data visualization objects, a 3D navigation system, and more.

Aurora (Part 1) from Adaptive Path on Vimeo.



Bookmarking and History Concept Video from Aza Raskin on Vimeo.



Firefox Mobile Concept Video from Aza Raskin on Vimeo.



Excerpt about this collaborative project, from the Mozilla Labs website:

"Today we’re calling on industry, higher education and people from around the world to get involved and share their ideas and expertise as we collectively explore and design future directions for the Web.

You don’t have to be a software engineer to get involved, and you don’t have to program. Everyone is welcome to participate. We’re particularly interested in engaging with designers who have not typically been involved with open source projects. And we’re biasing towards broad participation, not finished implementations.

We’re hoping to lower the barrier to participation by providing a forum for surfacing, sharing, and collaborating on new ideas and concepts. Our goal is to bring even more people to the table and provoke thought, facilitate discussion, and inspire future design directions for Firefox, the Mozilla project, and the Web as a whole."


-via Putting People First

This will be an interesting trend to follow. I'd like to work on web navigation systems optimized for multi-touch and large-screen displays. Is it now possible?

May 24, 2008

Dance.Draw Project: Exquisite Interaction - A Collaboration between the Software and Information Systems (HCI) and Dance Departments at UNC-Charlotte


DANCE.DRAW: EXQUISITE INTERACTION
(Updated)

"The movement of the visualizations are artifacts in real-time of the movements of the dancers. They draw while they dance, they dance together and they draw together. Every performance generates a new visual imprint." -DanceDraw website


Interactive multimedia technology, blended with the arts!

Dr. Celine Latulipe, from UNC-Charlotte's Software and Information Systems Department, Professor Sybil Huskey, from the Dance Department, dance students, and others collaborated to create an amazing performance that I had the opportunity to see at the Visualization in the World Symposium in April 2008.

If you look closely, you will see that each dancer holds two wireless mice, one in each hand. The mice drive the visualization projected in the background. Dr. Latulipe has focused some of her research on two-handed computer interaction, and it is interesting to see her work applied to this beautiful "off-the-desktop" application.
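The mapping is easy to sketch: each dancer contributes two tracked points (one mouse per hand), and every position sample extends that dancer's drawn trail. The Python below is my own illustration of that idea, not the project's actual code:

```python
# Illustrative sketch (not the Dance.Draw project's actual code):
# each dancer holds two mice, and every sampled pair of positions
# extends the two strokes that dancer draws on the projection.

class DancerTrail:
    def __init__(self, name):
        self.name = name
        self.left = []   # path traced by the left-hand mouse
        self.right = []  # path traced by the right-hand mouse

    def sample(self, left_xy, right_xy):
        self.left.append(left_xy)
        self.right.append(right_xy)

    def strokes(self):
        # Each hand's path becomes one stroke in the visualization.
        return [self.left, self.right]


dancer = DancerTrail("dancer-1")
dancer.sample((0.2, 0.8), (0.8, 0.8))
dancer.sample((0.3, 0.6), (0.7, 0.5))
print(len(dancer.strokes()))  # 2 strokes, one per hand
print(dancer.strokes()[0])    # [(0.2, 0.8), (0.3, 0.6)]
```

With several dancers sampled many times per second, the accumulated strokes become the real-time "visual imprint" the project describes.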

Dance.Draw is a work in progress; visit the following links for more information:

Website (Updated)
Movie
Technical Info
Dr. Kosara's Eager Eyes post about Dance.Draw

Note:
Dr. Latulipe was my HCI professor; Dr. Kosara was my Visualization/Visual Communication professor.



May 21, 2008

Cross Post: One Laptop Per Child's Redesign - Dual Touchscreens, Flexible Use



Photo from One Laptop per Child via MIT Technology Review


"Hundred-dollar laptop, revisited: The next-generation version of the One Laptop per Child machine will dispense with keypads. It can be folded flat to make one larger screen (left); here, two children could play a game, each using the touch-screen capability. Or it can be held on its side and used as an electronic book (right)."


I'm impressed with the new OLPC design: the dual touch screens, the support for collaboration and sharing between children, and the flexibility it will give educators and students. It can even be used as an e-book! This laptop would be welcome in UDL classrooms.

For detailed information about the new OLPC laptop, which has not yet been released, read David Talbot's article in the MIT Technology Review.

I want one.

Feb 13, 2008

Cooperation between special education and regular education technology is vital: Link to TechPsych post and information from Education Week

In a recent TechPsych post, I've shared some thoughts on collaboration between special education and regular education around technology, along with questions for educators about technology and educational data management systems, a link to a related article from Education Week's Digital Directions, and links to resources. Take a look!


Dec 9, 2007

Link to a great post about serious games and virtual reality in business

Eliane Alhadeff's Future-Making Serious Games blog has a wealth of interesting information for people interested in how interactive multimedia technology can impact the present- and the future. Take the time to visit her blog and take a look!

Here is a slice from one of her latest posts:

Via: daytaOhio - Barco and daytaOhio Equipped Collaborative Visualization Complex at Wright State’s Joshi Research Center

"Barco, a leading global provider of visual display systems, has partnered with daytaOhio to implement the industry’s most advanced virtual collaboration and visualization complex at Wright State University’s Joshi Research Center."




It is exciting to learn about the growth in collaborative visualization! The businessmen below seem to be enjoying the experience:




For more interesting ideas, look at Eliane's archived posts about serious games.