May 31, 2009

RENCI at UNC-Chapel Hill: Serious Gaming and Simulation, Unity 3D Platform

VIDEO
Running the Unity 3D Game Engine in a Global Immersion 4-Projector Dome System at the RENCI@UNC Engagement Center



For more information:
Serious Gaming and Simulation at RENCI@UNC

"The application of game technology to teaching, training, and research has been a topic of interest at RENCI@UNC. More specifically, the adaptation of game engines and 3D environments to the specialized visualization environments at the engagement center is a promising area for experimentation and collaboration...We very recently obtained copies of Unity, and have been working through some of the basics. One of the first activities was to create specialized code to show Unity-built 3D environments in our Global Immersion dome system. This was successfully accomplished with a good bit of code-hacking to render the proper camera views and viewports for the four channel dome system. There are some known issues with the Windows stand-alone viewer that we were able to work around. We did some imaginative manipulation of the Unity Plug-in using a browser embedded within a WPF application that allows arbitrary screen resolutions, as well as spanning across multiple displays. Look for a post later with some technical details."
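
The four-channel rendering described in the quote comes down to splitting one panoramic view across several cameras and viewports. Here is a hypothetical sketch of that geometry in plain Python (RENCI's actual code runs inside Unity; the even 360-degree split and side-by-side layout are my assumptions):

```python
# Illustrative sketch (not RENCI's code): divide a panorama into four
# channels, each with a normalized viewport rectangle and a camera yaw,
# the way a multi-projector dome rig splits up the view.

def four_channel_rig(horizontal_fov=360.0):
    """Return (viewport_rect, camera_yaw) pairs for four side-by-side
    channels. Rects are normalized (x, y, width, height); yaw is the
    camera's rotation in degrees about the vertical axis."""
    channels = []
    per_channel_fov = horizontal_fov / 4          # e.g. 90 degrees each
    for i in range(4):
        rect = (i * 0.25, 0.0, 0.25, 1.0)         # quarter-width strip
        yaw = -horizontal_fov / 2 + per_channel_fov * (i + 0.5)
        channels.append((rect, yaw))
    return channels

for rect, yaw in four_channel_rig():
    print(rect, yaw)
```

In Unity terms, each pair would correspond to one camera with its normalized viewport rectangle and a rotation offset, so the four rendered strips tile the dome without gaps.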
FYI

Here are some pictures from RENCI:

http://unc.renci.org/wordpress/wp-content/themes/renci2.0/images/sections/resources/visualization-resources-b.jpg

http://unc.renci.org/wordpress/wp-content/uploads/2009/03/dome-001-300x225.jpg
http://farm4.static.flickr.com/3085/3177205942_79d56b3481.jpg

May 30, 2009

Ball's Plight, an iPhone and iPod Touch Game App by PriPri Games (a plug for a former classmate) - and more games-related info

Priyesh Dixit, a game developer, has a free game app for the iPhone and iPod Touch. "Ball's Plight" can be downloaded from the iTunes Store. Here is the description:

"You are Ball, an alien with telekinetic powers and no limbs. An evil scientist has captured you and is forcing you to play his little game so he can unlock the secrets of your powers. You must survive in order to escape!"

"At its core, Ball's Plight is a classic game of ball and paddle. Touch the ball to the paddle to increase your score, reach the goal score to advance to the next level while earning points."

http://priyesh.org/pics/BallsPlight2.jpg
http://priyesh.org/pics/BallsPlight1.jpg
Priyesh was one of my classmates in Dr. Michael Youngblood's Artificial Intelligence for Game Design Class at UNC-Charlotte during the fall of 2006. He went on to work with Dr. Youngblood and Hunter Hale, a Ph.D. student, on the Common Games Understanding and Learning Toolkit (CGUL).

The mission of the CGUL project:

"To identify, develop, collect, and integrate a suite of tools and techniques for improving the understanding of interaction and behavior in all FPS/3PS games, incorporating knowledge as game artifacts, and learning how to advance game artificial intelligence through these mechanisms." (CGUL downloads)

About Priyesh:

Priyesh is a programmer at TimeGate Studios in Sugar Land, TX. He has an MS in Computer Science with a certificate in Game Design and Development.

Priyesh twitters as "madgamer7", if you'd like to follow him.


RELATED:


http://playground.uncc.edu/images/header.jpg
The Playground: UNC-Charlotte's Game Design and Development Program

http://playground.uncc.edu/GameIntelligenceGroup/GIG-L1.png
UNC-Charlotte's Game Intelligence Group, led by Dr. Michael Youngblood


http://playground.uncc.edu/Game2Learn/images/buttonlogo.png

UNC-Charlotte's Game2Learn Group, led by Dr. Tiffany Barnes

Publication:
D. Hunter Hale, G. Michael Youngblood, & Priyesh N. Dixit, "Automatically-generated Convex Region Decomposition for Real-time Spatial Agent Navigation in Virtual Worlds" (pdf), Association for the Advancement of Artificial Intelligence
(Voted Most Influential Game AI Publication of 2008)

Game:
Incursion, an Xbox game created by UNC-Charlotte Game Lab students, is available for download from the Xbox LIVE Marketplace website.

May 29, 2009

May 28, 2009

Multi-player multi-touch: "NuMTyPysics", based on Tim Edmonds's Numpty Physics (similar to Crayon Physics)

I have Numpty Physics on my Nokia n800 internet tablet, and Crayon Physics on my HP TouchSmart PC. Both are designed for single touch, and are fun to play. Since my TouchSmart can handle duo-touch input, I wondered what Crayon Physics might be like if it supported two players at once.

http://www.tuxi.com.ar/wp-content/uploads/numpty-physics-tuxi.jpg

As you can see from the video, Thomas Perl and his colleagues have figured this out - at least with Numpty Physics!



The music is worth the watch. It's by Triplexity.

Numpty Physics and Crayon Physics both use the Box2D engine. Here is some information from the website:

"NuMTyPYsics are our enhancements to Tim Edmond's NumptyPhysics game. We added support for receiving TUIO messages from tbeta via PyTUIO by embedding an Python interpreter into the NumptyPhysics code. Currently, we simply emulate mouse input by pushing hand-crafted input events (SDL_Event) onto the SDL event queue (SDL_PushEvent). In the future, we plan to do bi-directional communication between the game engine (written in C++) and our multi-touch handling code, which will be written in Python."
-Thomas Perl
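
The mouse-emulation idea in the quote - turning normalized TUIO cursor positions into synthetic input events on the game's queue - can be sketched in standalone Python. This is only an illustration of the approach: the real NuMTyPysics code pushes SDL_Event structs via SDL_PushEvent from C++, and the dictionary event format, screen size, and function names here are made up.

```python
# Toy sketch of TUIO-to-mouse emulation: TUIO cursors report normalized
# coordinates in [0, 1], which are scaled to screen pixels and enqueued
# as if they were mouse events. (Hypothetical names throughout.)
from collections import deque

SCREEN_W, SCREEN_H = 800, 480

event_queue = deque()   # stands in for the SDL event queue

def push_tuio_cursor(session_id, norm_x, norm_y, pressed=True):
    """Translate one TUIO /tuio/2Dcur update into a synthetic
    mouse event and enqueue it."""
    event = {
        "type": "MOUSEBUTTONDOWN" if pressed else "MOUSEBUTTONUP",
        "pos": (int(norm_x * SCREEN_W), int(norm_y * SCREEN_H)),
        "button": 1,
        "touch_id": session_id,  # lets a multi-touch layer tell fingers apart
    }
    event_queue.append(event)

push_tuio_cursor(7, 0.5, 0.25)
print(event_queue[0])
```

Keeping the TUIO session id on each event is what makes the multi-player extension possible: downstream code can route each finger's strokes separately instead of collapsing everything into one mouse pointer.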

Note:
I've used Crayon Physics Deluxe with several of the students I work with who have severe autism. It is amazing how well they can figure out solutions for the levels. It would be even better if it could be enabled for duo-touch, because that supports joint attention, which is a very important social interaction skill for young people with autism to develop.

May 27, 2009

My Nabaztag:tag and Mir:ror Arrived!

http://regmedia.co.uk/2009/01/13/mirror_1.jpg

I'll update this post after I've played with my new Wi-Fi and RFID toys.

For more information:

violet: Let All Things Be Connected

"The Internet of Things starts here"

May 26, 2009

GestureTek's "Cube": A Compact Interactive Gesture-Based Display System

GestureTek's "Cube"




From the GestureTek website:

"Introducing The Cube - a compact, turnkey, 'plug and play' interactive display unit that brings the power of gesture control to a variety of display spaces. Project the interactive 80” diagonal display onto almost any floor, wall, table or counter for branding, advertising, entertainment and product promotion. The Cube will engage customers, turn heads and drive business results."

Brochure
(pdf)

May 24, 2009

Thorsten Blum and Johann Korndörfer's Multi-touch 360 Sphere for Art Installations - and Space Invaders!

I came across the Multi-touch 360 project today and I thought it would be the best surface for a prototype I worked on a couple of years ago. Multi-touch 360 would be great for travel planning and teaching geography. I could see one of these interactive spheres in the lobby of every cruise ship!

This


on this!

The Multi-touch 360 project is the work of Thorsten Blum and Johann Korndörfer, based on their previous work on the Space Invaders 360 game, featured later in this post. It is similar to Microsoft's Sphere project.

Multi-Touch 360 First Tests Video:




Picture Collage of Multi-Touch 360 In Process:

Multi-Touch Materials Research

SPACE INVADERS 360 VIDEO


SPACE INVADERS 360


Space Invaders 360: An Interactive Multiplayer Gaming Installation That Makes You Move (pdf)
MindTrek 2008, October 6-9, 2008, Tampere, Finland, ACM

ABSTRACT
"Space Invaders 360 is a tribute to the video game Space Invaders designed by Tomohiro Nishikado in 1978. While the original version is a two-dimensional arcade shooter game, we have created an interactive gaming installation combining the fun and thrill of the classic video game with experimental concepts of display technologies and new ways of user interaction.
Beside the technical aspects we also want to discuss the social aspects of our work. How does the gameplay change in the 21st century? What new ways of user interaction can be thought of? The success of Nintendo Wii and Nintendo DS [7] shows the interest of professional and casual gamers in new methods of user interaction. While the last decades have been dominated by video games on fixed screens, over-sized gamepads and realistic graphics, gamers are now moving towards video gaming with a focus on physical interaction."



Thorsten maintains another blog, which is worth a look. I like the links he posts.

More about Microsoft's Sphere prototype:

"Our prototype device builds on a commercially available globe projection system (Global Imagination’s Magic Planet). We have added touch-sensing capabilities with an infrared camera that shares the optical path with the projector. This novel configuration permits us to enclose both the projection and the sensing mechanism in the base of the device and allows for easy 360 degrees access to the device and high degree of interactivity without shadowing or occlusion problems. We have also developed a custom graphics projection pipeline that solves the distortion problems and enables projected data to be correctly visualized on the curved surface.

The applications developed for Sphere currently include a picture and video browser, interactive globe visualization, finger painting, globe visualization, and an omni-directional video-conferencing application (360 degree panoramic video from Microsoft RoundTable device). These applications are designed for highly interactive public space environments (e.g., museums, lobbies, information booths) and highlight the appeal of the device as well as its visualization and interaction capabilities." Hrvoje Benko, Andy Wilson, Ravin Balakrishnan, Billy Chen
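
The distortion problem the researchers mention boils down to mapping flat image coordinates onto a curved surface. Here is a much-simplified sketch of the underlying geometry (my illustration only, not the Sphere pipeline): the standard longitude/latitude-to-3D mapping that such a projection pipeline has to account for so that flat data lands undistorted on the globe.

```python
import math

def sphere_point(lon_deg, lat_deg, radius=1.0):
    """Map longitude/latitude on the globe surface to a 3D point.
    A curved-surface projection pipeline must invert a mapping like
    this so that flat image pixels land at the right spot."""
    lon, lat = math.radians(lon_deg), math.radians(lat_deg)
    x = radius * math.cos(lat) * math.cos(lon)
    y = radius * math.cos(lat) * math.sin(lon)
    z = radius * math.sin(lat)
    return (x, y, z)
```

Note how equal steps in latitude shrink the circle of constant latitude near the poles (the cos(lat) factor) - that is exactly the stretching a naive flat projection would get wrong.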

Note: Microsoft's RoundTable 360 video conferencing application is now known as the Polycom CX5000 Unified Conference Station, optimized for the Microsoft Office Communicator. More information can be found on the Polycom website.

http://research.microsoft.com/en-us/um/people/benko/projects/sphere/IMG_3834_small.jpg
Microsoft's Sphere

May 23, 2009

Razorfish's Touch Framework "Razorfashion" - A lot like my idea for an in-home FashionMirrorAdvisor...

Razorfish recently unveiled the Razorfashion application, designed to provide shoppers with an engaging retail experience within the "multi-channel shopping ecosystem". I'm not the "shop till you drop" type of gal, but I can see that this concept could be useful in other situations, after a few tweaks.




As soon as I saw this Razorfish Touch "Fashion" demo video, it touched a nerve. I've been playing around with a similar idea, but for my personal use, in the form of an RFID-enabled system. I'd call it something like "FashionMirrorAdvisor".

Instead of showing skinny fashion models like the Razorfashion application, I'd harness the power of a built-in webcam and mirror my own image on the screen. My mirror would dress me up in the morning, when I'm way too foggy to think about matching colors and accessories.

My FashionMirrorAdvisor would be my friend. My "smart" friend, since all of my clothes would be RFID-tagged, along with my shoes, jewelry, and other accessories. My make-up, too.

It would be a no-brainer. I really could use this application - just ask my husband!

More often than not, most mornings I find myself staring at the clothes in my closet, frozen in time, unable to formulate a fashion thought. I might set my eyes on a favorite blouse, but blank out when I try to think about the rest of the steps I need to pull my look together.


I know I can't wear my reddish-pink camisole with my dusty-orange/brown slacks, but at 5:15 A.M., who has the time to think about this little detail? My friend, the FashionMirrorAdvisor, would prevent me from making this fashion faux-pas.

No problem.

My FashionMirrorAdvisor would show me a few outfits, and dress my real-time moving image on the screen. Since she knows all things, she'd show me ONLY the articles of clothing that were clean - my RFID system would keep up with all of that. It would be much more functional than a "virtual wardrobe" application.

I could try out different earrings without having to get them out.

If I couldn't find something, the RFID system would take care of that detail. My FashionMirrorAdvisor would know where I misplaced my clothes, accessories, and even my keys, since they would all be tagged. The mirror application would provide me with a nice little map of my house and car, and highlight the location of the item.
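
Just for fun, the item-finding idea could be sketched as a tiny "last seen" registry (entirely hypothetical, of course - every name below is made up):

```python
# Toy sketch of the imagined RFID lookup: each reader that spots a tag
# records the sighting, and the mirror just asks for the last location.
last_seen = {}

def reader_saw(tag_id, location):
    """Called whenever an RFID reader detects a tagged item."""
    last_seen[tag_id] = location

def where_is(tag_id):
    return last_seen.get(tag_id, "unknown - time to search the old way!")

reader_saw("keys-001", "hall table")
reader_saw("camisole-07", "laundry basket")
print(where_is("keys-001"))   # the mirror's map would highlight this spot
```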

My FashionMirrorAdvisor would keep track of my laundry, too. This would be a great feature. So if my dirty laundry was piling up, and I wanted to wear outfit X, Y, or Z over the next few days, I'd receive a gentle reminder that I'd need to do some laundry first!

Another practical feature:

My FashionMirrorAdvisor would also serve as my health consultant, keeping track of my weight and BMI. This data, along with information gained from the webcam, would be combined so that my advisor would NEVER suggest an outfit that would be too...snug.

I could program the system to provide me with gentle reminders if my weight was an issue. My FashionMirrorAdvisor would show me images of myself "before" and "after", outfits included.

Information about the "after" outfits could be fed to the system from the web-catalogs of my favorite fashion retailers, and once I lost those 10 darned pounds, I'd find a nice parcel delivered to my door.

Thanks to my FashionMirrorAdvisor, I know that the outfit would be just right.


UPDATE 5/8/10: The FashionMirrorAdvisor would be integrated with a mobile app - since I now have a smartphone, this would be quite useful in planning shopping trips centered around the purchase of new clothes, shoes, accessories, and coordinating cosmetics! I created a little game that I think would be ideal for this sort of thing, too.

I still want to work on this....someday.

Too many ideas, too little time!


RELATED
From the Razorfish site:
"The Razorfish Emerging Experiences team is a dedicated group of highly experienced professionals focused solely on emerging experiences and technologies. "Effective innovation" is our multifaceted approach to concepting and delivering pioneering solutions for our clients"

"Founded in 2008, Razorfish Emerging Experiences is a cross-functional team composed of strategists, artists, experience designers, and technologists. We’re part of the Razorfish Strategy & Innovation practice led by Shannon Denton. Jonathan Hull is the managing director of the team, Steve Dawson is the technology lead and Luke Hamilton is the creative lead."


Razorfish Emerging Experiences Portfolio

May 22, 2009

Axiotron's Mac Modbook

Axiotron's Tablet Mac

http://www.axiotron.com/fileadmin/templates/axt/img/143_01_fp_slice.jpghttp://www.axiotron.com/fileadmin/templates/axt/img/am_mar_01_MODBOOK-SLICED.jpg

$$$$. If it were multi-touch, I'd buy it.

Multimedia Business Cards, Face Tracker from LM3LABS

Lm3Labs offers multimedia augmented business cards. When a corporate logo on the business card is viewed by a customer's webcam, multimedia events are triggered:
"Catchyoo Face Tracker tracks faces of passing people for creating eye-catching interactivity with digital content. Faces become pointers which trigger events and follow people".

This window looks like it would be fun to play with!
LM3LABS English
LM3LABS JP

May 20, 2009

xXtraLab's Multi-touch Projects

xXtraLab is an interaction design firm located in Taiwan. The xXtraLab team has been working on some interesting multi-touch projects. Take a look!
Multi-touch wall for briefing and real-time info sharing
Multisensory iTea-table



"xXtraLab Design Co. is one of the leading multimedia company in Taiwan, focusing on the design & engineering of HCI (Human-Computer Interaction) interfaces in museum, exposition, and showrooms (client lists here). Members of xXtraLab come from diversifying fields such as visual design, digital media, architecture, interior design, information engineering, design computing, industrial design, and fine art. we respect different cultural views and work as a multi-disciplinary team to offer inclusive design services."

Kodu: A visual programming language made for children to create games, running on the Xbox, using a game controller for input.

"The core of the Kodu project is the programming user interface. The language is simple and entirely icon-based. Programs are composed of pages, which are broken down into rules, which are further divided into conditions and actions. Conditions are evaluated simultaneously.

The Kodu language is designed specifically for game development and provides specialized primitives derived from gaming scenarios. Programs are expressed in physical terms, using concepts like vision, hearing, and time to control character behavior. While not as general-purpose as classical programming languages, Kodu can express advanced game design concepts in a simple, direct, and intuitive manner." -Microsoft
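
The page/rule/condition/action structure described above can be mimicked in a few lines (an illustration of the idea only, not Kodu's implementation): every rule on a page checks its condition against the same world snapshot, so conditions are effectively evaluated "simultaneously" before any action runs.

```python
# Minimal sketch of a Kodu-style page: a list of (condition, action)
# rules evaluated against one snapshot of the world per tick.

def run_page(rules, world):
    """rules: list of (condition, action) callables; world: dict snapshot.
    All conditions are tested first, then the matching actions run."""
    fired = [action for condition, action in rules if condition(world)]
    return [action(world) for action in fired]

# Example: a character that moves toward fruit it can see, and eats
# fruit it has bumped into (vision and bumping stand in for Kodu's
# physical-sensing primitives).
rules = [
    (lambda w: w["sees_fruit"],   lambda w: "move toward fruit"),
    (lambda w: w["bumped_fruit"], lambda w: "eat fruit"),
]
print(run_page(rules, {"sees_fruit": True, "bumped_fruit": False}))
```

Testing all conditions before running any action matters: an action that changes the world cannot affect which other rules fire on the same tick, which matches the "evaluated simultaneously" behavior the description calls out.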

Video from MSR Techfest (low quality, but provides an example of Kodu in action)

Interview about Kodu (link to video)

Kodu Blog

Article: New program allows everyone to program videogames (USA Today 1/7/09)

The Kodu Technology Club at Explorer Elementary School

Screenshots from the Kodu website:

http://research.microsoft.com/en-us/projects/kodu/programming_ui.jpghttp://research.microsoft.com/en-us/projects/kodu/load_ui.jpg
http://research.microsoft.com/en-us/projects/kodu/sensor_wheel.jpg
http://research.microsoft.com/en-us/projects/kodu/kodu_turtle.jpghttp://research.microsoft.com/en-us/projects/kodu/kodu_sutekki.jpg

Update from Microsoft Research UK's Innovation Day 2009 - SecondLight, Collaborative Slate, and More!

Demonstration of SecondLight, Microsoft's plans for the next-gen Surface:

More from the Microsoft Research UK website:

Family Archive (links to video)


Links to videos can be found on the Enabling Innovation Through Research 2009 website

For your convenience, you can link to the videos from here:

Keynote

Andrew Herbert

Digital Lifestyle

HomeBook
HomeWatcher
Kitchen Postcard
SecondLight
TimeCard
Wayve
Family Archive

Converging Sciences

Doing Real Computer Mathematics
Modelling & Simulating Biochemical Systems

Education

NodeXL

Environment

Somniloquy
Understanding and Simulating Forests


RELATED:
Microsoft Research Fast Facts

May 19, 2009

MeeTTable: Kevin O'Mahoney's Multi-Touch Meeting Table Project

Kevin O'Mahoney is a multimedia developer who is working on his Masters in Media Design at Cork University in Ireland. The MeeTTable, pictured above, supports collaboration among 2 or more people who are developing 3D design projects. (Kevin is a member of the NUI Group.)

More information about Kevin's project, including a video, can be found on his blog.

May 16, 2009

IEEE International Symposium on Multimedia Dec. 14-16, San Diego, CA

The IEEE International Symposium on Multimedia will be held in San Diego, CA, December 14-16, 2009. A variety of workshops will be held in conjunction with this symposium - links are provided below.

The information below is from the ISM 2009 website:

The IEEE International Symposium on Multimedia (ISM2009) is an international forum for researchers to exchange information regarding advances in the state of the art and practice of multimedia computing, as well as to identify the emerging research topics and define the future of multimedia computing. The technical program of ISM2009 will consist of invited talks, paper presentations, and panel discussions.

Submissions of high quality papers describing mature results or on-going work are invited. Topics for submission include but are not limited to:

Multimedia systems, architecture, and applications
Multimedia networking and QoS
Peer-to-peer multimedia systems and streaming
Pervasive and interactive multimedia systems including mobile systems, pervasive gaming, and digital TV
Multimedia meta-modeling techniques and operating systems
Architecture specification languages
Software development using multimedia techniques
Multimedia signal processing including audio, video, image processing, and coding
Visualization
Virtual Reality
Multimedia file systems, databases, and retrieval
Multimedia collaboration
Rich media enabled E-commerce
Computational intelligence including neural networks, fuzzy logic, and genetic algorithms
Intelligent agents for multimedia content creation, distribution, and analysis
Internet telephony and hypermedia technologies and systems
Multimedia security including digital watermark and encryption
Mobile Multimedia Systems and Services
Multimodal Interaction, including Human Factors
Multimodal User Interfaces: Design, Engineering, Modality-Abstractions, etc.
Multimedia tools including authoring, analyzing, editing, and browsing


The conference will offer a variety of workshops:

The ISM2009 Workshop Call for Proposals can be found here.

Papers submitted to each workshop will be reviewed by the program committee and external reviewers of the workshop.

The following workshop proposals have now been accepted:

DSMSA
Data Semantics for Multimedia Systems and Applications

MTEL
Multimedia Technologies for e-Learning
http://www.math.tu-berlin.de/~knipping/ieee/ism09-mtel/

MIPR
The Fifth IEEE International Workshop on Multimedia Information Processing and Retrieval
http://www.cis.fiu.edu/conferences/mipr09/

MASP
Multimedia Audio and Speech Processing: advancing the state-of-the-art
http://speechlab.ifsc.usp.br/ism2009/

CBTV
Content-based audio/video analysis for novel TV services
http://ism2009.eecs.uci.edu/cbtv09/

MS
Third International Workshop on the Many Faces of Multimedia Semantics

AdMIRe
International Workshop on Advances in Music Information Research
http://www.cp.jku.at/conferences/admire2009/

Any general questions regarding ISM2009 Workshops and workshop proposals should be directed to the ISM2009 Workshop Co-Chairs:

Shu-Ching Chen, Florida International University, USA
Chengcui Zhang, University of Alabama at Birmingham, USA
Atsuo Yoshitaka, Japan Advanced Institute of Science and Technology, Japan
Ilja Radusch, Technische Universitaet Berlin, Germany

at: ismwork@eecs.uci.edu

Check the website for updates.

May 15, 2009

iPod Touch Apps, WiiMote Whiteboards, 3D multi-user environments in education, and a teacher's video of the SMARTTable in action.

I thought I'd share the last two posts from my TechPsych blog here, since they focus on newer technologies that involve multi-touch or multi-user interaction.

A teacher explores the multi-touch, multi-user SMARTtable in his classroom

From what I can see, multi-touch, multi-user applications are ideal for students to learn collaborative, cooperative social skills at the same time they learn academic skills. Smart Technologies, well-known in the education world for interactive whiteboards, has unleashed a few tables, known as SMART Tables, in classrooms. One teacher, Tom Barrett, is sharing his journey with technology, including the SMART Table, on-line via his blog, SPACE FOR ME TO EXPLORE.

The following is a video of young children doing math on a multi-touch SMART Table. In order to solve the finger-arithmetic problems, the students must work cooperatively.


Addition App - Set to multi-touch finger counts from Tom Barrett on Vimeo.

(In the video, you will see some shapes that Tom mistakenly added, so disregard them as you view the video.)


Here is a quote from Tom's blog about his experience with the addition application:

"I was most pleased with the level of engagement from the children and although on the surface this seems to be a simple application, it definitely requires a level of teamwork that you often do not get.

It is intriguing watching the children’s first attempts and how they realise they need to work together. As the challenge is small scale, once they have been successful they begin to refine their approach, communicate better and so get to later answers quicker."


Educational iPod Touch Apps for Students and Teachers: Eric Sailers' blog
Eric Sailers is a speech and language pathologist and assistive technology specialist who explores new technologies that he's found useful in the schools. Below is Eric's demonstration of applications such as "I Write Words", Wikipanion, Preschool Adventure, Twitterrific, Google Mobile, and the calendar.

To demonstrate the iPod Touch, Eric uses an Elmo document camera that projects onto a screen. Note that as Eric demonstrates the Twitterrific application, he navigates to a link to a blog of one of his colleagues, which highlights the way one school is using the Wii as an augmentative communication tool and also an assessment tool for occupational therapy.



Take some time to explore Eric's Speech-Language Pathology Sharing blog. It is full of great information!

Update: Here are two video clips Eric created to prepare for an interview as a finalist for the Cox Communications Innovation in Special Education award. In one of the videos, Eric discusses the EduSim application, a 3D multi-user virtual world platform and authoring toolkit intended for classroom interactive whiteboards.

Interactive Applications for Special Education: Wiimote Whiteboards and iPod Touch in Special Education, Part I


Wiimote Whiteboards and iPod Touch in Special Education, Part II

May 14, 2009

For a smile - T-Mobile & Flash Mobs: Dancing and Singing in the UK

T-Mobile: Dancing at the Liverpool Street Station

This viral ad from T-Mobile gives a glimpse of how music in a public space, a small crowd of dancers, and YouTube can bring a smile to over 11,771,896 viewers. (My guess is that many of the video's views were repeat views, because I immediately pressed "replay" after the first go-round.)



Apparently a real flash mob congregated at the Liverpool Street Station a month after the T-Mobile ad was filmed:
Flash mob mimics T-Mobile advert - and closes train station

I guess this concept has been spreading in the UK. The following video was taken in Trafalgar Square, with 13,000 people singing karaoke:

 

The flash-mob dancing and singing in public places brought back a memory of a video clip I came across a few years ago of a "faux" dance happening at a mall during the late 1990's. The video was set to Fatboy Slim's "Praise You".
It turns out that Spike Jonze, the man behind the upcoming film of Maurice Sendak's story Where the Wild Things Are, was involved in the creation of the video.
 

Enjoy!




RELATED:
Spike Jonze Unmasked

Arizona State University's ArtsWork Arts Education Center Website: 

Interactive Gateway: Dance, an 8-unit curriculum for high school students that explores the art and culture of the 1960's.

"Students will improvise and choreograph dances structured by chance methods, sports, games, political issues, surroundings, everyday gestures and the manipulation of objects. Through these investigations students realize that in the 1960’s, dance was redefined to include everyday people, everyday movement, indeterminacy, and alternative spaces."

Lesson 7. Anytime, Anywhere, Anything
Unit of Study: Interactive Gateway
Theme: Redefining Dance As Happening Anytime, Anywhere, Including Any Movement
Sub theme: Happenings
Grades: 9-12th grade


May 12, 2009

The Children's Interactive Library: User Experience Design and the Library!

The Children's Interactive Library was a collaboration between Interactive Spaces, the Department of Computer Science, University of Aarhus, the Department for Design, Aarhus School of Architecture, and others.

(Please excuse the audio - there are two narrators, each speaking a different language.)

"The Children's Interactive Library project was an untraditional interdisciplinary research and innovation project exploring the children's library of the future.


The objective was to create spaces for children in the library that offer new experiences, learning, events, sense impressions and physical activity. And at the same time acknowledging the library's unique capacity of being the place where children come in order to acquire information and knowledge."

The video shows how pervasive computing, natural interaction, and a child-centered focus were combined to create a rich "user experience" at the library.

The Story Surfer:
http://farm4.static.flickr.com/3205/3005626860_bfd24c4811.jpg?v=0


http://biblioragazzi.files.wordpress.com/2006/11/cover.jpg

Inspiration: Strategies and Prototypes for the Future (pdf)
Abstract from Children's Interactive Library Project, 2004-06, with lots of pictures and diagrams depicting the design process and products.

The following video provides a good overview of how user-focused HCI strategies were used as a means to develop innovative plans for cool new libraries, otherwise known as "Media Spaces". From what I can see, these strategies were used by the Aarhus Public Libraries, in Denmark, during the development of the Children's Interactive Library project, highlighted in the previous video.


Thanks, Matt Gullet, for the link to the video.



FYI: The Public Library of Charlotte and Mecklenburg County, in North Carolina, is pretty cool.

Pictures from ImaginOn, the main library in Charlotte, N.C. for children and teens (Spangler Library & PLCMC):

http://www.luxurycollectiondestinations.com/images/uploaded/arts/img_1230089510299_3688_4073_2281_9133_5793.jpghttp://www.flagsacrossthenation.org/wp-content/uploads/2009/03/imaginon.jpg
ImaginOn, the LibraryLoft for Teens, and the Spangler Children's Library

http://farm1.static.flickr.com/81/206604860_feccdd874a.jpg?v=0http://farm1.static.flickr.com/84/206604920_5d610b4824.jpg?v=0
Video production and editing at ImaginOn


Making games with Game Maker at the library


LAN Party at the Library

May 10, 2009

Future of Interaction: Microsoft Lab's Vision 2019 Montage; Future Healthcare Montage

The following video is from Microsoft Office Vision 2019. If you visit the YouTube source, you'll find an interesting discussion about this five-minute+ clip.



Video Presentation from MIX09: Untold Stories of Touch, Gesture, & NUI
Joseph Fletcher, Design Manager, Microsoft Surface


"Learn about new patterns, behaviors, and design approaches for touch and gesture interfaces from a practitioners point of view. Learn early lessons from applied knowledge of touch applications, devices, and design methods."




LukeW's notes from Joseph Fletcher's presentation
LukeW's post: Design Principles at Microsoft

LukeW is the Senior Director of Product Ideation & Design at Yahoo! Inc.

Michael Haller Discusses Multi-touch, Interactive Surfaces, and Emerging Technologies for Learning

I came across an excellent overview of interactive display technologies that hold promise for education. The link below is a research article written by Michael Haller for BECTA, formally known as the British Educational Communications and Technology Agency.

Emerging Technologies for Learning: Interactive Displays and Next Generation Interfaces (pdf)
Becta Research Report, Volume 3 (2008), Michael Haller


"Multi-touch and interactive surfaces are becoming more interesting, because they allow a natural and intuitive interaction with the computer system.

These more intuitive and natural interfaces could help students to be more
actively involved in working together with content and could also help improve whole-class teaching activities. As these technologies develop, the barrier of having to learn and work with traditional computer interfaces may diminish.

It is still unclear how fast these interfaces will become part of our daily life and
how long it will take for them to be used in every classroom. However, we strongly believe that the more intuitive the interface is, the faster it will be accepted and used. There is a huge potential in these devices, because they allow us to use digital technologies in a more human way." -Michael Haller

Michael Haller works in the Department of Digital Media at the Upper Austria University of Applied Sciences (Hagenberg, Austria), where he heads the Media Interaction Lab.

Michael co-organized the Interaction Tomorrow course at SIGGRAPH 2007, along with Chia Shen of the Mitsubishi Electric Research Laboratories (MERL). Lecturers included Gerald Morrison of Smart Technologies, Bruce H. Thomas of the University of South Australia, and Andy Wilson of Microsoft Research. The course materials from Interaction Tomorrow are available online, and include videos, slides, and course notes.

Below is an excerpt from the description of the Interaction Tomorrow SIGGRAPH 2007 course:

"Conventional metaphors and underlying interface infrastructure for single-user desktop systems have been traditionally geared towards single mouse and keyboard-based WIMP interface design, while people usually meet around a table, facing each other. A table/wall setting provides a large interactive visual surface for groups to interact together. It encourages collaboration, coordination, as well as simultaneous and parallel problem solving among multiple people.

In this course, we will describe particular challenges and solutions for the design of direct-touch tabletop and interactive wall environments. The participants will learn how to design a non-traditional user interface for large horizontal and vertical displays. Topics include physical setups (e.g. output displays), tracking, sensing, input devices, output displays, pen-based interfaces, direct multi-touch interactions, tangible UI, interaction techniques, application domains, current commercial systems, and future research."

It is worth taking the time to look over Haller's other publications. Here are a few that would be good to read:

M. Haller, C. Forlines, C. Koeffel, J. Leitner, and C. Shen, 2009. "Tabletop Games: Platforms, Experimental Games and Design Recommendations." Springer, 2009. in press [bibtex]

A. D. Cheok, M. Haller, O. N. N. Fernando, and J. P. Wijesena, 2009.
"Mixed Reality Entertainment and Art," International Journal of Virtual Reality, vol. X, p. X, 2009. in press [bibtex]

J. Leitner, C. Köffel, and M. Haller, 2009. "Bridging the gap between real and virtual objects for tabletop games," International Journal of Virtual Reality, vol. X, p. X, 2009. in press [bibtex]


M. Haller and M. Billinghurst, 2008.
"Interactive Tables: Requirements, Design Recommendations, and Implementation." IGI Publishing, 2008. [bibtex]

D. Leithinger and M. Haller, 2007. "Improving Menu Interaction for Cluttered Tabletop Setups with User-Drawn Path Menus," in Second Annual IEEE International Workshop on Horizontal Interactive Human-Computer Systems (TABLETOP '07), pp. 121-128, 2007. [bibtex]


J. Leitner, J. Powell, P. Brandl, T. Seifried, M. Haller, B. Doray, and P. To, 2009. "Flux: a tilting multi-touch and pen based surface," in CHI EA '09: Proceedings of the 27th International Conference Extended Abstracts on Human Factors in Computing Systems, New York, NY, USA, 2009, pp. 3211-3216. [bibtex]

P. Brandl, J. Leitner, T. Seifried, M. Haller, B. Doray, and P. To, 2009. "Occlusion-aware menu design for digital tabletops," in CHI EA '09: Proceedings of the 27th International Conference Extended Abstracts on Human Factors in Computing Systems, New York, NY, USA, 2009, pp. 3223-3228. [bibtex]


References from the BECTA paper:

Elrod, S., Bruce, R., Gold, R., Goldberg, D., Halasz, F., Janssen, W., Lee, D., McCall, K., Pedersen, E., Pier, K., Tang, J., and Welch, B., Liveboard: a large interactive display supporting group meetings, presentations, and remote collaboration, CHI '92 (New York, NY, USA), ACM Press, 1992, pp. 599-607.

Morrison, G., 'A Camera-Based Input Device for Large Interactive Displays', IEEE Computer Graphics and Applications, vol. 25, no. 4, pp. 52-57, Jul/Aug, 2005.

Albert, A. E. The effect of graphic input devices on performance in a cursor positioning task. Proceedings of the Human Factors Society 26th Annual Meeting, Santa Monica, CA: Human Factors Society, 1982, pp. 54-58.

Dietz, P.H., Leigh, D.L., DiamondTouch: A Multi-User Touch Technology, ACM Symposium on User
Interface Software and Technology (UIST), ISBN: 1-58113-438-X, pp. 219-226, November 2001.

Rekimoto, J., SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces, CHI 2002, 2002.

Kakehi, Y., Iida, M., Naemura, T., Shirai, Y., Matsushita, M., Ohguro, T., 'Lumisight Table: Interactive View-Dependent Tabletop Display Surrounded by Multiple Users', IEEE Computer Graphics and Applications, vol. 25, no. 1, pp. 48-53, 2005.

Streitz, N., Prante, T., Röcker, C., van Alphen, D., Magerkurth, C., Stenzel, R., 'Ambient Displays and Mobile Devices for the Creation of Social Architectural Spaces: Supporting informal communication and social awareness in organizations', in Public and Situated Displays: Social and Interactional Aspects of Shared Display Technologies, Kluwer Publishers, 2003, pp. 387-409.

Ishii, H., Underkoffler, J., Chak, D., Piper, B., Ben-Joseph, E., Yeung, L. and Zahra, K., Augmented Urban Planning Workbench: Overlaying Drawings, Physical Models and Digital Simulation. IEEE and ACM International Symposium on Mixed and Augmented Reality, ACM Press, Darmstadt, Germany.

Han, J. Y., Low-cost multi-touch sensing through frustrated total internal reflection, UIST '05 (New York), ACM Press, 2005, pp. 115-118.

Hull, J., Erol, B., Graham, J., Ke, Q., Kishi, H., Moraleda, J., Olst, D., Paper-Based Augmented Reality. In Proceedings of the 17th International Conference on Artificial Reality and Telexistence (Esbjerg, Denmark, November 28-30, 2007). ICAT '07. IEEE, 205-209.

Haller, M., Leithinger, D., Leitner, J., Seifried, T., Brandl, P., Zauner, J., Billinghurst, M., The shared design space. In SIGGRAPH '06: ACM SIGGRAPH 2006 Emerging Technologies, page 29, New York, NY, USA, 2006. ACM Press.

Research email: emtech@becta.org.uk

Main email: becta@becta.org.uk
URL: www.becta.org.uk

(This was also posted on the TechPsych blog.)