Showing posts with label NUI. Show all posts

Apr 26, 2010

Interactive Multi-Touch for Children with Autism Spectrum Disorders: Research and Apps by Juan Pablo Hourcade, Thomas Hansen, and Natasha Bullock-Rest, University of Iowa

Dr. Juan Pablo Hourcade heads a team of researchers at the University of Iowa who are creating multi-touch applications and other technologies to support communication, collaboration, creativity, and self-expression for young people with Autism Spectrum Disorders. The screen shot below is from the team's web page, which includes a few videos of the team's important work. (The videos can be found on the Technologies for Children with Autism Spectrum Disorders webpage.)


RELATED VIDEOS

pymt demo reel from Thomas Hansen on Vimeo.


Multi Touch @ University of Iowa from Thomas Hansen on Vimeo.


RELATED
Hourcade, J.P., & Hansen, T.E. (2010). Multitouch Displays to Support Preschool Children's Learning in Mathematics, Reading, Writing, Social Skills and the Arts (pdf)


Dr. Hourcade is organizing the Digital Technologies and Marginalized Youth workshop on Saturday, June 12, 2010 at the 9th International Conference on Interaction Design and Children (IDC), which will be held this year in Barcelona, Spain.


Other workshops that will be held at IDC 2010:
Interactive Storytelling for Children
Extended Abstract: Interactive Storytelling for Children (pdf)
Designing Tangibles for Children
Play and Technology:  How does technology initiate and facilitate physical play?


Dr. Hourcade's publications

Apr 9, 2010

Interesting work: Mobile Projection & More from Johannes Schöning's YouTube channel

Here are a few of Johannes Schöning's research videos related to mobile projection:








SOMEWHAT RELATED: 

GeoLens: Allowing Multi-User Interaction with Geographic Information Systems on Interactive Surfaces

Apr 5, 2010

Update on Ideum: Multitouch, Multiuser Table at the California Science Center

Ideum is a company that has been around since 1999. It focuses on the design and creation of "computer-based exhibits, multitouch installations, rich Internet applications, and social networking sites for museum and Web visitors." GestureWorks, the software developed by Ideum, is an authoring solution for Adobe Flash and Adobe Flex.

Below is some information, including a video clip, about one of Ideum's most recent installations: a multiuser, multitouch table at the California Science Center.

Case Study: Ideum's L.A. Zone Multi-touch, Multiuser Table (Jim Spadaccini)

Here is an excerpt from Jim Spadaccini's post:
"This multiuser, multitouch table exhibit in the L.A. Zone allows visitors to explore a satellite image and map of the LA Basin and view overlays in thematic areas that explore fire, air, water, and earth. In addition, points of interest are placed on the map in the form of images and videos. All of the content and navigational controls are available in English and Spanish. The exhibit software runs on a custom-built 50" multitouch table that supports 60+ simultaneous points of input."

The Ideum team has paid a great deal of attention to the way groups of museum visitors interact with their multitouch applications. In the past, they noticed that on a 50" surface, visitors would experience interaction conflicts. In the present application, more than one visitor can manipulate content without disrupting the interactions of others. Spadaccini points out that multi-user control of touch screen exhibits is new, so interface design concepts developed for single-user interaction are no longer sufficient. Social interactions around a multiuser display are now something to consider.

In my opinion, what the Ideum team learns through its design process can be quite useful to those of us planning to develop multiuser educational applications for interactive surfaces and displays. What I liked about this exhibit is that it has a monitor that mirrors the display, so people can view the interaction from afar. This feature would be useful in classroom settings that have both an interactive table and an interactive whiteboard: the whiteboard could serve as the secondary display so the teacher and other students could see what the group at the table was exploring.



Multitouch Google Maps and Flickr Mashup Built with Flash



Update: A video showing how Ideum's multi-touch table can withstand all sorts of heavy-duty treatment:




Note: I have been pretty busy over the past couple of months, and I have much that I haven't yet blogged about. More to come!

Apr 4, 2010

ASU's SMALLab, a Mixed Reality Learning Environment: Multi-modal and Collaborative; Links to Publications about Digital Media and Learning for the 21st Century

"Nothing's impossible, we just get smarter and smarter by the day." - Student, commenting about his experiences in the SMALLab environment. 


The research team at Arizona State University, led by David Birchfield, has created an embodied, multimodal, and collaborative mediated learning environment using mixed reality that has been in use at Coronado High School with much success. SMALLab takes a learner-centered approach that provides multi-modal, multi-sensory activities that engage learners and lead to deeper understanding of complex concepts.


Here is some information from the SMALLab website:

SMALLab @ Arizona State University - 2009 from aisling kelliher on Vimeo.


Video of high school students describing their work in SMALLab (Coronado High School)
"Central to our work is the development of a new interactive mixed reality learning environment, the Situated Multimedia Art Learning Lab [SMALLab]. SMALLab is an environment developed by a collaborative team of media researchers from education, psychology, interactive media, computer science, and the arts. SMALLab is an extensible platform for semi-immersive mixed-reality learning. By semi-immersive, we mean that the mediated space of SMALLab is physically open on all sides to the larger environment. Participants can freely enter and exit the space without the need for wearing specialized display or sensing devices such as head-mounted displays (HMD) or motion capture markers. Participants seated or standing around SMALLab can see and hear the dynamic media, and they can directly communicate with their peers that are interacting in the space. As such, the semi-immersive framework establishes a porous relationship between SMALLab and the larger physical learning environment. By mixed-reality, we mean that there is an integration of physical manipulation objects, 3D physical gestures, and digitally mediated components. By extensible, we mean that researchers, teachers, and students can create new learning scenarios in SMALLab using a set of custom designed authoring tools and programming interfaces.

SMALLab supports situated and embodied learning by empowering the physical body to function as an expressive interface. Within SMALLab, students use a set of “glowballs” and peripherals to interact in real time with each other and with dynamic visual, textual, physical and sonic media through full body 3D movements and gestures. For example, working in the Spring Sling scenario, students are immersed in a complex physics simulation that involves multiple sensory inputs to engage student attention. They can hear the sound of a spring picking up speed, see projected bodies moving across the floor, feel a physical ball in their own hands and integrate how the projected ball moves in accordance with their own body movements to construct a robust conceptual model of the entire system."


SMALLab installations can be found in museums, schools, and the Institute of Play in NYC, where SMALLab is used with students who attend Quest to Learn, a digital games-based, technology-rich charter school:
Gaming SMALLab [IOP, ASU]: Lab install at Parsons DT
-Institute of Play's SMALLab

-Arizona State University: A SMALL way to keep up with technology (Sheilah Britton, 5/11/09)

How SMALLab Works:


SMALLab modules


More about the way it works: SMALLab-Situated Multimedia Arts Learning


About David Birchfield:
David Birchfield is "a media artist, researcher, and educator. He has created work that spans from interactive music performance to generative software to robotic installations to K-12 learning environments. In recent years, this work cuts across three areas of exploration: K-12 learning, media art installations, and live computer music performance."


Some publications:
  • Birchfield, D., Megowan-Romanowicz, C., Johnson-Glenberg, M., Next Gen Interfaces: Embodied Learning Using Motion, Sound, and Visuals – SMALLab. To appear in Proceedings of the American Educational Research Association Annual Conference; SIG Applied Research in Virtual Environments for Learning [ARVEL], San Diego, CA, April 2009.

  • Megowan-Romanowicz, M., Uysal, S., Birchfield, D., Growth in Teacher Self-Efficacy Through Participation in a High-Tech Instructional Design Community, to appear in Proceedings of the National Association for Research in Science Teaching Annual Conference, Garden Grove, CA, April 2009.





  • Birchfield, D., Thornburg, H., Megowan-Romanowicz, C., Hatton, S., Mechtley, B., Dolgov, I., Burleson, W., Embodiment, Multimodality, and Composition: Convergent Themes Across HCI and Education for Mixed-Reality Learning Environments, Journal of Advances in Human-Computer Interaction, Volume 2008, Article ID 874563.

  • Dolgov, I., Birchfield, D., McBeath, M., Thornburg, H., Todd, C., Amelioration of Axis-Aligned Motion Bias for Active versus Stationary Judgments of Bilaterally Symmetric Moving Shapes’ Final Destinations, Perception and Psychophysics, in press 2008.

  • D. Birchfield, B. Mechtley, S. Hatton, H. Thornburg, Mixed-Reality Learning in the Art Museum Context, Proceedings of ACM SIG Multimedia, Vancouver, BC, October 27, 2008.

  • S. Hatton, D. Birchfield, M.C. Megowan, Learning Metaphor through Mixed-Reality Game Design and Game Play, Proceedings of ACM Sandbox Conference, Los Angeles, CA, August 10, 2008. [pdf]

Institute of Play's SMALLab contact:
Katie Salen, Executive Director, Institute of Play; Associate Professor, Parsons The New School for Design


The Institute of Play, along with the Joan Ganz Cooney Center and others, has a number of publications related to technology and learning:

"The mission of The Joan Ganz Cooney Center is to catalyze and support research, innovation and investment in digital media technologies to advance children's learning. Nurturing foundational and "21st century" literacies:

The inaugural focus of the Center—given the national need—will be on determining how technology can help elementary-aged children develop the fundamental building blocks of literacy. These include the vital reading, writing, speaking and listening capabilities that all children must develop during the primary grades. A special emphasis of the Center will be on struggling readers who risk educational failure if they do not catch up to their peers by grade four...Another important focus of the Center is to leverage the potential of interactive media to promote "21st century" literacies that students will need to compete and cooperate in our connected world—competencies such as critical thinking and problem solving, second language competency, inter-cultural understanding and media literacy."

Here are a few background resources from the Institute of Play's resources page:
The following reports will bring you up to date on technology and "21st Century" learning.
Generation M2: Media in the Lives of 8- to 18-Year-Olds
The Kaiser Family Foundation - Report - 2010
Wallis, C. The Impacts of Media Multitasking on Children’s Learning & Development
The Joan Ganz Cooney Center - Meeting Report - 2010 (pdf)
Shuler, C. iLearn: A Content Analysis of the iTunes App Store’s Education Section
The Joan Ganz Cooney Center - Report - 2009 (pdf)
Wellings, J., Levine, M.H. The Digital Promise: Transforming Learning with Innovative Uses of Technology
The Joan Ganz Cooney Center - White Paper - 2009 (pdf)
Thai, A. M., Lowenstein, D., Ching, D., and Rejeski, D. Game Changer: Investing in Digital Play to Advance Children's Learning and Health
The Joan Ganz Cooney Center - Policy Brief - 2009 (pdf)
There are a number of similar articles published by CISCO and/or the Metiri Group:
Lemke, C., Coughlin, E., Reifsneider, D. (2009). Technology in the Schools: What Does the Research Say? (pdf)
Lemke, C. (2009) Multimodal Learning through Media: What the Research Says (pdf)
Williams, S.M. The Impact of Collaborative, Scaffolded Learning in K-12 Schools: A Meta-Analysis (pdf)



Feb 26, 2010

More Multi-touch: New video from NUITEQ: Snowflake Suite Multi-touch Software on a 46" HD LCD



Here is the promotional information from NUITEQ:


"NUITEQ's Snowflake Suite off the shelf multi-touch software product showcased on a 46" flat full HD multi-touch LCD.  Available for purchase now.


Snowflake Suite is honored with a Stevie Awards finalist recognition for Best Product or Service of the Year 2009 in the category Media and Entertainment for the International Business Awards. Snowflake Suite is available to OEMs, SIs, VARs, software engineers and end clients. Snowflake Suite comes with hands full of multi-touch applications, an API and an SDK.


Compatible with different multi-touch hardware technology platforms, including: 3M Touch Systems, N-trig, NextWindow, Lumio, Nexio, IR Touch, rear camera based systems, dreaMTouch and others."


"Natural User Interface Technologies AB (NUITEQ) is a Swedish technology company that offers off-the-shelf and customized software for interactive single and multi-touch devices. Additionally, NUITEQ executes large-scale customization projects concerning multi-touch technology hardware, software and services. In parallel, NUITEQ is working on other innovative emerging technologies within the field of Human Computer Interaction (HCI)."


Today's News:  NUITEQ wins MerlT Award for development through collaboration, growth potential and innovative thinking.


RELATED



NUITEQ Flat 46'' Multi-touch LCD






Key Features
• Size: 46"
• Snowflake Suite lifetime license included
• Full HD
• Simultaneous detection and tracking of up to 32 touch points
• Definite and reliable detection of touch points
• Real multi-touch function: no specific constraints, like temporal order or position of touch points
• Particularly suitable for embedded systems: processing of complex evaluation steps via touch controller, e.g. keyboard emulation
• Sample rate: 50 frames/s
• Glass: 4 mm toughened safety glass
• Slim design: only 3 cm construction depth and 5 cm frame width
• USB interface and separate power supply
• No drift, no calibration required
• Applicable to all display technologies
• Protocol: TUIO
• Detection of size and position of objects in the active area
• 2 years warranty

Electrical
• Power Supply: 19 V DC ± 20 %
• Power Consumption: 11 W
• Interface: USB 1.1, full speed

Functional
• Simultaneous Touch Points: 32
• Touch Point Size: > 10 mm
• Spatial Resolution: < 2 mm
• Communication Protocol: Proprietary or TUIO 1.0 (2Dcur and 2.5Dcur profiles)
• Scan Speed: 55 ms

Mechanical Data
• Frame Dimensions: 1136 x 680 x 28 mm
• Frame Finish: Black powder coating (RAL 9011)
• Active Area: 1018.1 x 572.7 mm
• Window: 4 mm double-sided anti-reflective laminated safety filter glass
• Weight: 11 kg (including filter glass)
• Monitor: Direct fit to SHARP PN-465E (35.5 kg)

Environmental
• Operating Temperature: 0 °C to 40 °C
• Operating Humidity: 20 % to 80 % (no condensation)
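The spec sheet lists TUIO as the table's touch protocol. TUIO rides on OSC (Open Sound Control), typically over UDP port 3333; each finger on a 2D surface is reported through /tuio/2Dcur messages, where a "set" message carries a session id plus normalized position, velocity, and acceleration values. As a rough, hypothetical sketch (not NUITEQ's code; a real client would also handle "alive" and "fseq" messages and OSC bundle framing), here is a minimal pure-Python encoder/decoder for a single /tuio/2Dcur "set" message:

```python
import struct

def _osc_string(s: str) -> bytes:
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def encode_2dcur_set(session_id, x, y, vx=0.0, vy=0.0, accel=0.0):
    """Build a raw OSC message for a TUIO /tuio/2Dcur 'set' event.

    x and y are normalized to [0, 1]; vx/vy are velocities and
    accel is the motion acceleration, per the TUIO 1.x profile.
    """
    return (_osc_string("/tuio/2Dcur")
            + _osc_string(",sifffff")      # type tags: string, int32, 5 floats
            + _osc_string("set")
            + struct.pack(">i5f", session_id, x, y, vx, vy, accel))

def decode_2dcur(data: bytes):
    """Decode one /tuio/2Dcur OSC message into (command, args)."""
    def read_string(buf, off):
        end = buf.index(b"\x00", off)
        s = buf[off:end].decode("ascii")
        off = end + 1
        off += -off % 4  # skip null padding to the 4-byte boundary
        return s, off

    addr, off = read_string(data, 0)
    assert addr == "/tuio/2Dcur", "not a 2Dcur message"
    _tags, off = read_string(data, off)   # e.g. ',sifffff' (unchecked here)
    command, off = read_string(data, off)
    if command == "set":
        sid, x, y, vx, vy, accel = struct.unpack_from(">i5f", data, off)
        return command, {"session": sid, "x": x, "y": y,
                         "vx": vx, "vy": vy, "accel": accel}
    return command, {}
```

A production client would listen on a UDP socket and unwrap the #bundle framing that TUIO trackers actually send; libraries such as python-osc handle that plumbing, so this sketch is only meant to show what is inside the wire format.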







Also available as a horizontal solution.









NUITEQ Wiki
Harry van der Veen's NUITEQ Blog
(I've been following Harry van der Veen's journey since he was a university student. At the time, he was a leading member of the NUI Group, creating a DIY multi-touch table as part of his studies. This was before Microsoft Surface was born.)



Jan 27, 2010

How a multi-touch table is made: NUI Group members Aaron Bitler and Brady Simpson on Hak5

If you want to know more about the inner workings of DIY multi-touch, you'll be interested in the video below.

Aaron Bitler and Brady Simpson learned how to build a multi-touch table through their participation in the NUI Group, and recently formed a company, 3M8, to distribute multi-touch solutions. The half-hour video provides an in-depth tour of multi-touch technology, in an online broadcast on the Hak5 website.


"Brady Simpson explains the different multi-touch methods. Aaron Bitler gives us a tour of the x19 multitouch table's construction. Brady explains the software used to power the x19 LCD multitouch table, including the NUI Group CCV application. Aaron wraps up by explaining to us how he got involved in multitouch. He shares with us some sites where others can get started in their own homebrew projects, as well as his own multitouch company 3M8s at LCDMultiTouch.com."

UPDATE:  New website for Brady Simpson's company: http://www.lcdmt.com/

Jan 23, 2010

Ubice = Multi-touch On Ice at the Nokia Research Center in Finland (Video + Pic via Albrecht Schmidt)

Albrecht Schmidt posted a link to the following video on his User Interface Engineering blog -
watch how the multi-touch surface is built from ice blocks:



The following pictures are from Albrecht Schmidt's User Interface Engineering blog:
[multitouc-on-ice.jpg]

What a great idea for a "Winterfest"! They should take it to the next level: multi-touch ice sculpture.


(Cross-posted on The World Is My Interface blog.)

Jan 20, 2010

MediaTeam's Interactive Wall and Table

Interactive Touchscreens

Interactive Table

Mediateam Interactive Multitouch Table from mediateam on Vimeo.

Wall Screen

Mediateam Interactive Multitouch Screen from mediateam on Vimeo.
Video is from Mediateam

-via NUITEQ

I don't have much information about Mediateam. I think it might be MediaTeam Oulu, but I'm not sure. MediaTeam Oulu has quite a bit of research that focuses on ubiquitous computing.

Jan 14, 2010

Shared computing with Windows MultiPoint in classrooms: Why not use Mouse Mischief (beta version)?

I came across this post on Long Zheng's I Started Something blog:

Windows MultiPoint Server -- a multiseat computing solution worthy for the home?
Long Zheng points out that Windows MultiPoint Server is an outgrowth of the Multi-Mouse project, in which multiple students can work together to interact with content on a PC screen or a projected PC screen.

The picture below shows how a Windows MultiPoint server can work in a classroom.

-Microsoft

I'm not so sure I like the setup of the MultiPoint 2010 system shown in the above picture. The students all have huge monitors in front of them, so the opportunities for shared or collaborative interaction are limited. I like the multi-mice concept better, since the children can really be together.

Mouse Mischief
Neema Moraveji, of the Stanford University HCI group, has videos and information about the multiple mice-related work on his project page:

Teachers provide content using an add-on for PowerPoint that allows for simultaneous input from multiple mice.  The teacher can set up limits regarding how the mice are used by  the students.


I tried this with a few students during the 2008-09 school year, and they liked it.  Since I serve more schools this current year, I haven't had the opportunity to explore this further. I plan to download a newer version and try it out soon.

Good news!
The free beta version of Microsoft Mouse Mischief was recently released on the Microsoft website: Microsoft Mouse Mischief: Make your PowerPoint presentations interactive

Below is information about Mouse Mischief from the Microsoft website:

"Mouse Mischief is a tool that Microsoft makes available free of charge, and that allows teachers to work with Microsoft Office PowerPoint to make interactive presentations. With Mouse Mischief, teachers can add multiple choice questions to their presentations, and large groups of students can answer the questions using mice connected to the teacher’s PC."
"Mouse Mischief not only gives students the ability to engage, have fun, and learn in new, interactive ways, but it also provides teachers with a more affordable alternative to purchasing expensive student response systems, commonly known as clickers, by letting students use affordable wired or wireless USB mice that their school already own."
"It’s simple. After Mouse Mischief is installed, the Mouse Mischief toolbar will appear as part of the PowerPoint ribbon when a new or old PowerPoint presentation is opened. This intuitive Mouse Mischief toolbar lets teachers add interactive elements such as multiple-choice question slides with a single click. When the teacher opens a Mouse Mischief enabled presentation, students in the classroom can answer each question by clicking it with their uniquely designed mouse cursor. Once the students have selected their answers, the teacher can display the correct answer...The best part? Mouse Mischief gives teachers the option to have their students answer questions individually or as part of a team, in order to encourage both competition and collaboration in the classroom...Special teacher controls allow the teacher to disable student’s mouse cursors, navigate between slides, set timers, and more. With Mouse Mischief the teacher is always in control, whether there are two or 25 cursors on the screen."
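The individual-versus-team answering Microsoft describes reduces to grouping per-mouse answers and tallying them. Here is a toy sketch of that logic (the names are hypothetical illustrations, not Microsoft's implementation), where a team's answer is taken to be its members' majority choice:

```python
from collections import Counter

def tally_answers(answers, teams=None):
    """Tally multiple-choice answers from many mice.

    answers: {mouse_id: choice}. If `teams` maps mouse_id -> team name,
    each team's answer is its members' majority choice; otherwise every
    mouse counts individually. Returns a Counter of choices.
    """
    if teams is None:
        return Counter(answers.values())
    by_team = {}
    for mouse, choice in answers.items():
        by_team.setdefault(teams[mouse], []).append(choice)
    # Each team contributes one vote: its members' most common choice.
    team_votes = {t: Counter(c).most_common(1)[0][0]
                  for t, c in by_team.items()}
    return Counter(team_votes.values())
```

With four mice answering {"A", "B", "A", "A"}, individual mode reports three votes for "A", while team mode reports one vote per team; this mirrors the competition-versus-collaboration choice the teacher makes in the real tool.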

If you are interested in developing applications for Mouse Mischief, you can download the Windows MultiPoint Software Development Kit 1.5. This kit allows developers to enable up to 25 mouse devices to work at the same time on one computer. It was released on 1/12/2010 and can be downloaded from the Microsoft website.

Here is a plug from Microsoft about the benefits of the MultiPoint Mouse SDK:
"Applications built on the MultiPoint Mouse SDK can provide teachers with tools to gain real-time assessment information to help them provide a personalized learning experience for each of their students...Applications built on the MultiPoint Mouse SDK can increase student learning comprehension through interactive methods. MultiPoint Mouse applications can further a student’s engagement, collaboration, interaction and overall cognitive and social skills within a classroom or lab environment."

Here is the information about the MultiPoint SDK:
"The Windows MultiPoint Mouse SDK version 1.5 is a development framework that allows developers to build applications that enable up to 25 individual mouse devices to work simultaneously on one computer. As a developer, you can use the MultiPoint Mouse SDK to create educational applications that take advantage of collaborative learning methodologies. In schools with minimum infrastructure, MultiPoint Mouse greatly enhances the shared computing experience. Initial pilot programs conducted in India by Microsoft Research show that for certain subjects, MultiPoint Mouse can enhance learning when compared to a 1:1 computing scenario."

"MultiPoint Mouse should not be confused with applications that allow multiple people to control multiple mouse devices to perform standard operations. In those cases, the system traditionally cannot identify which mouse has made which changes, and there is normally no option for controlling the permissions of the various devices. MultiPoint Mouse is a development framework that enables developers to build applications to take advantage of multiple mouse devices, including the ability to handle mouse clicks from different users independently and to assign different permissions to each mouse. For example, the mouse belonging to a teacher in a learning application might need additional permissions to control the activity."
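The actual SDK is a .NET framework, but the core idea in the passage above (every event is tagged with the device that produced it, and a per-device permission table decides what that device may do) can be sketched independently of it. The class and method names below are hypothetical illustrations, not the SDK's API:

```python
class MultiDeviceDispatcher:
    """Route clicks by device id and enforce per-device permissions.

    A conceptual sketch of per-mouse handling: because each event
    carries the id of the mouse that produced it, the application can
    tell users apart and give, say, the teacher's mouse extra rights.
    """

    def __init__(self):
        self.permissions = {}   # device_id -> set of allowed actions
        self.log = []           # (device_id, action) pairs actually run

    def register(self, device_id, allowed):
        """Register a mouse and the actions it is allowed to trigger."""
        self.permissions[device_id] = set(allowed)

    def click(self, device_id, action):
        """Handle a click; run and log it only if permitted."""
        if action in self.permissions.get(device_id, set()):
            self.log.append((device_id, action))
            return True
        return False
```

In a classroom scenario, the teacher's mouse might be registered with {"answer", "navigate", "set_timer"} while each student mouse gets only {"answer"}, which is the permission asymmetry the Microsoft description calls out.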



The MultiPoint SDK requires Windows 7, Windows Vista Service Pack 2, or Windows XP Service Pack 3; the .NET Framework version 3.5 SP1 or higher; Microsoft Expression Blend (you can use the trial version); Visual Studio 2008 or 2010 (you can use the free Express version); 2-4 mouse devices for testing; and USB ports on the computer.




Other thoughts:
Schools with money for advanced technology tools have purchased SMART Tables, and a few have Microsoft Surface tables. Both are expensive, and they don't offer a range of form factors to choose from.

I sort of like the concept behind the multi-user poker table that was in the casino on my cruise ship:

Near the poker table is a display that shows the action from the poker game. In classroom settings, this display could be an interactive whiteboard, a projected display, or even a flat-panel screen.


There is a need for tables of different shapes in the schools. Speech pathologists, school psychologists, counselors, and others who provide guided group activities in the schools could use a multi-user table that follows this tried and true configuration:
http://www.qualitychurchfurniture.com/images/activitytablecolors/lrg_48KID72LOset.jpg
I'd love to hear from anyone who is using MultiPoint or Mouse Mischief, and also from anyone who is experimenting with various multi-touch table form factors.

Related:
Multiple Mice for Computers in Education in Developing Countries (pdf)