
May 15, 2012

NUITEQ's Latest Multitouch Showreel: Snowflake Suite

I've been following a number of people who have been working in the area of natural user interfaces and interaction for many years.  An example of this work is NUITEQ, a company led by Harry van der Veen.  Below is NUITEQ's most recent showreel of Snowflake Suite, an off-the-shelf multitouch SDK.


Here is the description of the software from the naturaluserinterface YouTube channel:


"NUITEQ's award-winning multitouch software product Snowflake Suite comes off the shelf with 30+ apps, a free SDK to develop your own multitouch software apps and its content is easy to customize. The solution offers high performance, stability, quality and comes with dedicated support. Apps include presentation, productivity and creativity tools as well as games. The software can be used in different scenarios such as corporate presentations, exhibitions, entertainment, education, public spaces, consumer electronics, retail and digital signage."

FYI: Tutorials about the use of Snowflake Suite can be found on the naturaluserinterface YouTube channel. 


Harry van der Veen has been sharing his NUI journey since 2007 on his Multitouch blog.






Mar 1, 2012

Seamless Collaborative Computing Between Tables and Tablets: nSquared Presenter - Video

I came across the following video about nSquared's Presenter application. It looks like it has potential.  It supports "seamless" collaboration between people, multimedia content, interactive tables, interactive touch screens, and tablet devices.






RELATED
nSquared website
Video presentation about "Seamless Computing", by Neil Roodyn


More blog posts on the topic of natural user interaction, interfaces, and examples of seamless computing are planned for the future!


This was shared by one of my FB friends.

Feb 22, 2012

Keepin' It Real: Organic, Natural, or Reality-Based Interactions? Video of TEI'12 panel discussion: Bill Buxton, Rob Jacob, Mili John Tharakan, and Roel Vertegaal (and more from TEI '12)

TEI '12 is finishing up in Kingston, Ontario.  If you missed it, like I did, don't worry.  Many of the presentations were streamed live, and are available to watch at your convenience.  
(The panel gets started at about 1:48 on the timeline.)


Keepin' It Real: Organic, Natural or Reality-Based Interactions?
Moderator: Amanda Parkes, Columbia University 
Panelists
William Buxton, Microsoft Research
Rob Jacob, Tufts University 
Mili John Tharakan, The Swedish School of Textiles 
Roel Vertegaal, Queen's University


RELATED
Here are some TEI '12 presentations that interest me!
Process Pad: A Low-Cost Multi-Touch Platform to Facilitate Multimodal Documentation of Complex Learning
Shima Salehi, Stanford University
Jain Kim, Stanford University
Colin Meltzer, Stanford University
Paulo Blikstein, Stanford University
Process Pad project website


Exploring peripheral interaction design for primary school teachers
Saskia Bakker, Eindhoven University of Technology, the Netherlands 
Elise van den Hoven, Eindhoven University of Technology, the Netherlands 
Berry Eggen, Eindhoven University of Technology, the Netherlands 
Kees Overbeeke, Eindhoven University of Technology, the Netherlands 


Others:
Exploring Tabletops as an Effective Tool to Foster Creativity Traits
Mobile Lorm Glove - Introducing a communication device for deaf-blind people
Mobile Lorm Glove project website
The HapticTouch Toolkit: Enabling Exploration of Haptic Interactions

HapticTouch Toolkit from Haptic Touch on Vimeo.








Jan 19, 2012

iBooks Textbooks for iPads, iTunesU App -and links to info about 1:1 iPad initiatives in public schools

There is a lot of buzz about iPads in education. 


I'm convinced that tablet technology will be adopted by schools at a faster rate than anticipated a year or so ago, now that Apple has launched iBooks Textbooks for iPad and the iTunesU app.


Parents, teachers, students, administrators, and school IT specialists, hold on to your hats!

RELATED
Learning Together: The Evolution of a 1:1 iPad Program
Carolyn Foote, Internet@Schools, 1/1/12
The Swiss-Army Knife of Education
Mr. Hooker, The WIFI--Eanes ISD iPad Pilot Project, 10/3/11
iPads in Education
John Evans, Curator
iBooks Textbooks
iTunesU App

iPad for Art: Sketchbook Express, via the Montlieu Academy of Technology, a public K-5 magnet school that adopted a 1:1 iPad initiative in August, 2011.


COMMENT
I have been planning to devote more posts in the future to the iPad and related interactive technologies for learning - and for fun.  When I have a chance to write my mega-post, you won't be disappointed!

Jan 3, 2012

"Kinect-based Telepresence with Room Sized 3D Capture and Life Sized Display", Includes Behind-the-Scenes "How-to" (UNC Chapel Hill)

Jim Spadaccini, of Open Exhibits, recently told me about a project that involves the real-time, interactive 3-D capture of people in a room.  As the viewer moves around the screens, the depth-detecting feature of the Kinect is harnessed to set the stage for a realistic telepresence experience.  


Take the time to view the video, which contains some interesting views of how the system works:





Thanks, Jim, for the link!

RELATED/SOMEWHAT RELATED
Kinect Real-Time Room Telepresence
Kinecthacks, 1/3/12
From the project's website:
Maimone, A. and H. Fuchs. "A First Look at a Telepresence System with Room-Sized Real-Time 3D Capture and Large Tracked Display." The 21st International Conference on Artificial Reality and Telexistence (ICAT) (Osaka, Japan, November 28-30, 2011) [paper] [video]
Maimone, A. and H. Fuchs. "Encumbrance-free Telepresence System with Real-time 3D Capture and Display using Commodity Depth Cameras." The IEEE International Symposium on Mixed and Augmented Reality (ISMAR) 2011 (Basel, Switzerland, October 26-29, 2011) [paper] [video]
Open Exhibits

Jan 2, 2012

Near Field Communication in Education? Arts? Wayfinding?

A few weeks ago I posted about the use of Near Field Communication at the Museum of London, and started to slowly gather information about the potential of this technology for use in educational settings.  Below are a few links, more to follow:


Future Forecasting: Near Field Communication (NFC) in mLearning
Diane Rees, Instructional Design Fusions

Reference for Diane's article:
Miraz, G.M., Ruiz, I.L., & Gómez-Nieto, M.A. (2009). University of Things: Applications of Near Field Communication Technology in University Environments. The Journal of E-working, 3(1), 52-64. Retrieved from http://www.inderscience.com/www/info/ijwi/art/tjew3101.pdf


One of Diane's links:
A Platform for Pervasive Infrastructures (pdf)
Saros, J., Lindstrom, D., & Wollbrand, K., Ericsson Research


RELATED/SOMEWHAT RELATED
Near Field Communication adds a new Layer to Museums
Kirsten Winkler, Big Think, 8/18/11


Ervasti, M., Isomursu, M., & Kinnula, M. (2009). Experiences from NFC Supported School Attendance Supervision for Children. Mobile Ubiquitous Computing, Systems, Services and Technologies (UBICOMM '09).
Internet Librarian 2011:  Next Big Trends: Near Field Communication & Interactive Books
Gretchen Caserotti, Kristen Yarmey, and Sheli McHugh, Librarian in Black, 10/17/11
"Phygital" Wayfinding and Experience Mapping
Mark VanderKlipp, Corbin Design, 9/23/11
Near Field Communication: Changing the Digital Signage Value Proposition
NFC Bootcamp
How to Have Fun with Near Field Communication on Android
Ryan Whitwam, Tested,  4/27/11


NOTE:
More posts are in the works - updates from interactive multimedia, game, and NUI companies, links to great blogs, and more.

Dec 15, 2011

Christian Bannister's Interactive Multi-touch and Gesture-based Subcycle Project. Use your hands to shape sound, create, and manipulate music. Wow!!

I came across a link to Christian Bannister's Subcycle Labs website when I was taking a look at the Creative Applications Network website.


If you have an interest in music technology and innovative gesture/multitouch applications, you'll appreciate the details that are shared on the Subcycle website.  In the meantime, take the time to watch a few of Christian's videos.  Enjoy!


Blip Shaper Walkthrough

Blip Shaper Walkthrough from christian bannister on Vimeo.
"a) creating percussive patterns with monome b) shaping the individual sounds that make up the patterns with multitouch gestures c) recording touchscreen gestures as automation d) storing, duplicating and navigation patterns e) recording the resulting audio to a dynamic buffer f) manipulating the buffer with a multitouch cut-up approach g) visualizing everything with dual screens"
Subcycle Walkthrough

Subcycle Blip Shaper from christian bannister on Vimeo.


The following information describing the Blip Shaper is from Christian's Subcycle website/blog:
"For the drum sounds I have Drumaxx running for synthesized sounds and Battery running for sampled sounds.  These are running in parallel so for each voice there is a separate patch running in each VST.  The Parameters are modified with the touchscreen independently but in all cases a single touch gesture on the X-Axis will cross fade between the sampled version of the sound and the synthesized version of the sound.  I love this because I have never seen this before and I can never decide which technique I like better.  The synthesized drums are more malleable and have more interesting parameters to play with but the sampled sounds seem more substantial.  I will post a detailed list of parameters and gestures in the future."   


Christian currently uses Max for Live (Ableton) and codes in Processing/Java, using Eclipse for organization.

Autopilot - Subcycle
autopilot - subcycle labs from christian bannister on Vimeo.
"sound visualization, multi-touch interface, break beat performance engine, autopilot, spacialized sound, dsp, max/MSP, fm synthesis, sonic navigation, sound storm visualization, time machine, granular, interactive sound sculpture, joystick array, more at subcycle.org"


SOMEWHAT RELATED 
Community Core Vision
NUI Group
BTW, I'm wondering if Christian or other NUI Group members would like to do something with some of my ideas that incorporate interactive music. To learn more, see my post, "It must be Voronoi: Looking for ideas for my music+art+dance+tech dream...."

Dec 13, 2011

Kinect in Education! (kinectEDucation)

Although I'm currently exploring the world of interactive HTML5, interactive video, etc., I think I just might make "kinecteducation" the focus of my tech hobbies. I have some experience with game programming (one of my computer courses required a project using XNA), and I know quite a bit about gesture and multitouch, multi-user interaction, so it wouldn't be too much of a stretch.


My motivation?

As a school psychologist, my main assignment is a school/program for students with disabilities, including about 40 or so who have autism spectrum disorders. Yesterday, the principal of the school attended a demonstration of the Kinect and requested that our school be considered for piloting it. One of my other assignments is a magnet high school for technology and the arts, and rumor has it that it will be offering a game programming curriculum.  I'd love to co-sponsor an after-school game club and encourage the students to program educational apps for the Kinect sometime in the near future! 
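To give a flavor of what student-built Kinect apps might involve: a Kinect driver reports skeleton joint positions every frame, and an educational app reacts to gestures built from them - no buttons to manipulate. Here's a minimal Python sketch of a "raise both hands to answer" check. The joint names and frame format are my own assumptions for illustration, not the actual Kinect SDK.

```python
# Hypothetical sketch: detecting a "hands up" answer gesture from
# skeleton joint positions, the kind of data a Kinect driver exposes.
# The joint names and frame format here are assumptions, not the
# actual Kinect SDK API.

def hands_above_head(joints, margin=0.05):
    """Return True when both hands are raised above the head.

    joints maps joint names to (x, y, z) tuples; y increases upward.
    margin requires the hands to be clearly above, not borderline.
    """
    head_y = joints["head"][1]
    return (joints["hand_left"][1] > head_y + margin and
            joints["hand_right"][1] > head_y + margin)

def count_raised_hands(frames):
    """Count frames in which the student is signaling an answer."""
    return sum(1 for f in frames if hands_above_head(f))

# Example: two frames; only the second has both hands raised.
frames = [
    {"head": (0, 1.6, 2), "hand_left": (-0.3, 1.0, 2), "hand_right": (0.3, 1.1, 2)},
    {"head": (0, 1.6, 2), "hand_left": (-0.3, 1.8, 2), "hand_right": (0.3, 1.9, 2)},
]
print(count_raised_hands(frames))  # -> 1
```

A real app would smooth the joint data over several frames before triggering, but the core idea is this simple.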


I'm also working as a client, in collaboration with some of my educator colleagues, with a team of university students who are creating a communication/social skills game suite geared for students with autism and related disabilities....


I'm inspired by the possibilities!


We have large SMARTboards in each classroom and in other locations around the building, and we have a Wii set up in the large therapy room adjacent to my office. The Wii has proven to be very useful in helping the students develop social and leisure skills that they can use in and outside of the school settings, but some of the students have difficulty manipulating the buttons on the controllers.


You can get Kinect-based apps from the Kinect Education website!

You can also get additional information from the Microsoft in Education "Kinect in the Classroom" website.

Below are a few videos to give you an overview of how open-source applications designed for the Kinect can be used in education: 






Nov 26, 2011

TWINE video and the Supermechanical Blog: Moving Toward the Web of People and Things!

"A wireless square with sensors and a simple web app to set rules, Twine tells you what your things are doing by email, text or Twitter."   I want one!

This project was developed by David Carr and John Kestner, the designer-engineers behind Supermechanical. They are passionate about creating connectable objects. They honed their skills in the interdisciplinary MIT Media Lab.  


More information about Twine can be found on the Kickstarter website.  Here is a bit of info from the site for the tech-curious:

"Twine is a wireless module tightly integrated with a cloud-based service. The module has WiFi, on-board temperature and vibration sensors, and an expansion connector for other sensors. Power is supplied by the on-board mini USB or two AAA batteries (and Twine will email you when you need to change the batteries)."
"The Spool web app makes it simple to set up and monitor your Twines from a browser anywhere. You set rules to trigger messages — no programming needed. The rules are put together with a palette of available conditions and actions, and read like English: WHEN moisture sensor gets wet THEN tweet "The basement is flooding!" We'll get you started with a bunch of rule sets, and you can share rules you create with other Twine owners."
"Because the hardware and software are made for each other, setup is easy. There's nothing to install — just point Twine to your WiFi network. Sensors are immediately recognized by the web app when you plug them in, and it reflects what the sensors see in real time, which makes understanding and testing your rules easy."
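For the tech-curious, those WHEN/THEN rules boil down to condition/action pairs evaluated against sensor readings. Here's a rough Python sketch of the idea - my own illustration with made-up names, not Spool's actual implementation:

```python
# Rough sketch of Twine-style "WHEN ... THEN ..." rules, modeled as
# condition/action pairs. Names and structure are illustrative
# assumptions, not Spool's actual implementation.

class Rule:
    def __init__(self, name, condition, action):
        self.name = name            # human-readable rule name
        self.condition = condition  # callable: sensor readings -> bool
        self.action = action        # callable run when the condition holds

    def evaluate(self, readings):
        """Apply the rule to one set of sensor readings."""
        if self.condition(readings):
            self.action(readings)
            return True
        return False

alerts = []  # stands in for the email/text/Twitter channel

# WHEN moisture sensor gets wet THEN tweet "The basement is flooding!"
flood_rule = Rule(
    "basement flood",
    condition=lambda r: r.get("moisture", 0.0) > 0.8,
    action=lambda r: alerts.append("The basement is flooding!"),
)

flood_rule.evaluate({"moisture": 0.1})  # dry: nothing happens
flood_rule.evaluate({"moisture": 0.9})  # wet: the alert fires
print(alerts)  # -> ['The basement is flooding!']
```

The appeal of this model is exactly what the quote says: the rules read like English, and no programming is needed to compose them from a palette of conditions and actions.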


Nov 19, 2011

Camera-less Tabletop Computing with Samsung SUR40 for Microsoft® Surface® with PixelSense™

Here is the press release: 
Next Generation of Microsoft Surface Available for Pre-Order in 23 Countries
"Software developers interested in creating solutions for the Samsung SUR40 can get started immediately by visiting the Surface Developer Center. The site provides free and easy access to the Surface 2.0 software developer kit, featuring the Input Simulator, which enables developers to write Surface applications on any Windows 7 machine, as well as other helpful developer-related resources. There are already hundreds of highly skilled Surface software development partners that can be found at http://www.surface.com."


(I've listed Microsoft Surface partners at the end of this post and plan to share more about the latest applications for surface computing in the near future.)


The following product information was taken from the Samsung website:
40" Surface Experience 
"Samsung SUR40 is the new generation of Microsoft® Surface® experience featuring PixelSense™ technology, which gives LCD panels the power to see without the use of cameras. Building from the innovation of the first version of Microsoft® Surface® and Samsung’s leading display technology, it is now possible for people to share, collaborate and explore together using a large, thin display that recognizes fingers, hands and other objects placed on the screen." 


PixelSense™ 
"PixelSense™ allows an LCD display to recognize fingers, hands, and objects placed on the screen, including more than 50 simultaneous touch points. With PixelSense™, pixels in the display see what’s touching the screen and that information is immediately processed and interpreted."
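In other words, the display delivers per-pixel contact data, and software has to group adjacent "touched" pixels into discrete touch points. Here's a generic connected-components sketch in Python of how that grouping could work - an illustration of the general technique, not PixelSense's actual pipeline:

```python
# Sketch: turn a per-pixel contact frame into discrete touch points by
# grouping adjacent touched pixels (flood fill) and reporting one
# centroid per blob. Generic illustration, not PixelSense's pipeline.

def touch_points(frame):
    """frame: 2D list of 0/1 values. Returns a list of (row, col) centroids."""
    rows, cols = len(frame), len(frame[0])
    seen = set()
    points = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] and (r, c) not in seen:
                # Flood-fill one blob of touched pixels.
                stack, blob = [(r, c)], []
                seen.add((r, c))
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                # One touch point per blob: the centroid of its pixels.
                points.append((sum(y for y, _ in blob) / len(blob),
                               sum(x for _, x in blob) / len(blob)))
    return points

# Two separate fingertips on a tiny 4x6 frame -> two touch points.
frame = [
    [0, 1, 1, 0, 0, 0],
    [0, 1, 1, 0, 0, 0],
    [0, 0, 0, 0, 1, 1],
    [0, 0, 0, 0, 1, 1],
]
print(len(touch_points(frame)))  # -> 2
```

Supporting "more than 50 simultaneous touch points" is then just a matter of how many blobs the hardware and software can track per frame.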


Resolution:             1920 x 1080 
Viewing Angle:       (H/V) 178 / 178° (CR ≥ 10) 
CPU:                     Athlon X2 Dual-Core 245e (2.9GHz) 
Operating System: Windows 7 Professional  x64 


GPU:        AMD HD6750M 
N/B:         AMD RS780E 
S/B:         AMD SB710 


Storage:                  SATA2 320 GB 
Memory:                 DDR3 4 GB
USB:                      4 USB 2.0 
VGA:                     Out HDMI Ethernet: 100 / 1000 
Audio Codec:         Realtek ALC262 Azalia CODEC


Product Dimensions (With Stand):            1,095 x 728 x 707.4 mm 
Product Dimensions (Without Stand):       1,095 x 102.5 x 707.4 mm 
Shipment Dimensions:                            1,214 x 299 x 832 mm 


Product Weight:     35 Kg 
Shipment Weight:  45.4 Kg


RELATED
Samsung
Microsoft Surface
Microsoft Surface "What's New"
Microsoft Surface Partners:
Aftermous.com
AKT
AM Production
Black Marble
ETT
Headcandy
IdentityMine
Information Strategies
Infusion
Inhance Digital
Interknowlogy
Intuilab
nSquared
Object Consulting
Onwijs
Razorfish
Sevensteps
Stimulant
Touchtech
T-Systems MultiMedia
UID
Vectorform
XFace

Nov 13, 2011

Art + Tech: The work of Aparna Rao and Soren Pors - a Ted Talk video worth watching!

Aparna Rao: High-tech art (with a sense of humor)

Aparna Rao is an artist from Bangalore, India, who creates interactive installations supported by electro-mechanical systems.  She collaborates with artist Soren Pors.


RELATED
Pors & Rao website - a delightful interactive website highlighting the works of Aparna Rao and Soren Pors.  Below are pictures of some of the duo's work, along with links to the Vadehra Art Gallery, where the works can be viewed online:


Pygmies: http://www.vadehraart.com/exhibition/viewDetails/63/1285
Decoy: http://www.vadehraart.com/exhibition/viewDetails/63/1287

Drifter: http://www.vadehraart.com/exhibition/viewDetails/63/1288










Nov 11, 2011

Revisiting Good Blogs: IDEUM update

Jim Spadaccini is the director of Ideum and is responsible for the Ideum blog, which is well worth reading if you are interested in creative use of multi-touch, gesture, and interactive multimedia applications.  Jim also serves as the principal investigator of the Open Exhibits project, a non-profit open source, multi-touch, multiuser software initiative, which includes a free software development kit (SDK) for museums, students, nonprofits, and educational use.


Here are a few links to posts from the Ideum blog:
Building a High-Resolution Multi-touch Wall: Part 1, Part 2, Part 3,  Part 4 and Part 5.
Ideum Blog
Events at The Tech and the Maxwell Museum
Open Exhibits, Year Two

The Open Exhibits Research and Design blog is worth reading, as it often includes Jim Spadaccini's words of wisdom and some interesting video clips.


More about IDEUM
Ideum will be involved in an upcoming day-long conference at the Tech Museum in San Jose, California: Building Interfaces for the New Decade, Tuesday, November 15, 2011.

Ideum is working on interconnectivity between devices, as shown below, where people are connecting their iPhones to a multi-touch table:
Heist at the British Museum

Here is a video about Ideum's Tiny Drifters exhibit, a 7-foot multi-touch wall at the Monterey Bay Aquarium:




Here is a promotional video about Ideum's rugged MT-55 multi-touch table. It holds up to the interaction of thousands of museum visitors.  




Nov 5, 2011

Building Interfaces for the New Decade: Conference at the Tech Museum, San Jose, California

Here is information about a day-long conference that I'm sure will be well worth attending:


November 15, 2011 (Tuesday)
San Jose, California


If you can't make it to San Jose, the conference can be attended "virtually".  You can register for the conference via the Eventbrite website.


Here is the list of speakers and presentations for the conference:


Tamara Schwarz, Senior Manager of Experience Design, Chabot Space & Science Center
"RFID-enabled Interactive Exhibits, or... What's a warehouse inventory management technology doing in an interactive science center?"
Christopher Stapleton, Creative Venture Catalyst, Simiosys
"The Phydgital InterSpace -The Next Generation of Museum User Experience and Workshop: Experiential Makeover for Learning"
Jim Spadaccini, Director, Ideum; Principal Investigator, Open Exhibits
"Open Exhibits - Community Software for Multi-touch and Motion Recognition"
Rick Ernst, Lead Designer, Ogmento
"How to get People off the Couch and Interact with the World with Augmented Reality Games"
Bob Ketner, Curator, The Tech Test Zone
"The Tech Virtual and The Tech Test Zone Open Platforms"
John MacDuffie Woodburn, Co-Developer, Pixel, and Graduate Student, University of the Arts, PA, & Matt Miller, Co-Developer, Pixel
"Pixel - Using Eye Tracking to Engage and Educate Your Audience"
Ross Smith, Ph.D, Deputy Director, Wearable Computer Laboratory, University of South Australia
"Bringing Fingertip Sculpting Techniques to the Digital Artist"

Below is an excerpt from the conference website:
"Join us on Tuesday, November 15, 2011 for a gathering of the most innovative developers of new interactive technologies at The Tech Museum's Test Zone 2011...Connect with entrepreneurs, developers, and researchers working on cutting-edge interfaces such as:
Augmented Reality
Context Aware 
Deformable Surfaces
Eye Tracking
Feedback
Gestural Interfaces
Gigapixel Images
Motion Sensing
Multi-touch
Natural User Interfaces
RFID
Smart Museum Systems
Tablet Computers
Ubiquitous Media 


Find out how museums and other public-space designers can use these technologies to enhance visitor learning and experience. Experience the interactives firsthand at the opening of The Tech Test Zone Gallery, a new gallery dedicated to temporary exhibitions of prototypes from corporate, academic, and nonprofit partners."

RELATED

About the Tech Museum:
"The Tech Museum is a hands-on technology and science museum for people of all ages and backgrounds. The museum - located in the Capital of Silicon Valley - is a non-profit learning resource established to engage people in exploring and experiencing technologies affecting their lives. Through programs such as The Tech Challenge presented by Cisco, our annual team design competition for youth, and internationally renowned programs such as The Tech Awards presented by Applied Materials, Inc., The Tech Museum celebrates the present and encourages the development of innovative ideas for a more promising future." -Tech Museum YouTube Channel

Hack the Future Day at the Tech Museum






Nov 4, 2011

Le Chal: Smart shoes for the visually impaired, with haptic/vibrational feedback, sensors and GPS


Le Chal is a haptic/vibrational feedback shoe designed for the blind and visually impaired, one of the projects of Anirudh Sharma, a member of the NUI Group (Natural User Interface).  Le Chal was conceptualized at the 2011 MIT Media Lab Design and Innovation Workshop at COEP.
Take me there Shoe/ Le Chal

Shoe for visually impaired - Le Chal
Himanshu Khanna, 10/31/11

RELATED
touchaddict blog

Oct 20, 2011

T(ether): Next-Gen Data Gloves + iPad - just what I need for the colder weather! (quick post)



"T(ether) is a novel spatial aware display that supports intuitive interaction with volumetric data. The display acts as a window affording users a perspective view of three-dimensional data through tracking of head position and orientation. T(ether) creates a 1:1 mapping between real and virtual coordinate space allowing immersive exploration of the joint domain. Our system creates a shared workspace in which co-located or remote users can collaborate in both the real and virtual worlds. The system allows input through capacitive touch on the display and a motion-tracked glove. When placed behind the display, the user’s hand extends into the virtual world, enabling the user to interact with objects directly." -Vimeo
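That "1:1 mapping between real and virtual coordinate space" can be pictured with a little geometry: given the tracked head position, each virtual point appears on the display where the line from the head to the point crosses the display plane. A minimal Python sketch, assuming a shared coordinate frame with the display at z = 0 (my simplification, not T(ether)'s actual code):

```python
# Sketch of head-tracked "window" rendering: a virtual point shows up
# where the head-to-point line crosses the display plane. Assumes a
# shared real/virtual coordinate frame with the display at z = 0,
# the head in front (z > 0) and virtual points behind (z < 0).

def project_to_display(head, point):
    """Return the (x, y) display position of a virtual point for a
    viewer whose head is at the given (x, y, z) position."""
    hx, hy, hz = head
    px, py, pz = point
    t = hz / (hz - pz)  # fraction of the way from head to point where z = 0
    return (hx + t * (px - hx), hy + t * (py - hy))

# Looking straight on, a point behind the center appears at the center.
print(project_to_display((0.0, 0.0, 1.0), (0.0, 0.0, -1.0)))  # -> (0.0, 0.0)

# Move your head to the right and the same point shifts on screen:
# that parallax is what makes the tablet feel like a window.
print(project_to_display((0.5, 0.0, 1.0), (0.0, 0.0, -1.0)))  # -> (0.25, 0.0)
```

The real system adds head orientation and a tracked glove, but this projection is the heart of the "window" effect.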

For more pictures and information, see the following post on the Creative Applications Network website: 
T(ether) [Cinder]
Filip Visnjic



Oct 1, 2011

Stantum's Multi-Modal, Multi-Touch Technology: Looks Great for 21st Century Learners, Supports Creative, Collaborative Knowledge-Sharing


Not long ago I had the opportunity to participate in a phone conference with Guillaume Largillier, of Stantum, to learn more about the company's work in developing multi-touch technology, and how this technology can support a wide range of uses.  


What is exciting is that the tablet technology was designed from the beginning with content creation in mind, unlike the iPad and similar Android-based tablets.  Although Stantum's tablet technology supports passive "entertainment" and Internet content viewing, it offers so much more.  

I agree with Guillaume that this technology will be ideal for use in education; it is well suited to 21st-century teaching and learning.  In the very near future, I'm sure we'll see fewer students lugging heavy backpacks as this technology transforms textbooks into multi-modal, interactive experience platforms!

Stantum's IVSM multi-touch technology (IVSM = Interpolated Voltage Sensing Matrix) enables 10-finger touch with stylus input.  The goal is to enable content creation via tablets for educational and office settings. This technology provides reliable palm rejection and contact discrimination, a useful feature for students or anyone else who might need a versatile tablet.  A multi-touch tablet is demoed in the video below:
 



Intel-Quanta Redvale Tablet featuring Stantum's Digital Resistive Technology
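One way to picture palm rejection and contact discrimination: classify each contact by its reported area, so a tiny stylus tip and fingertips get through while a large resting palm is ignored. Here's a Python sketch; the thresholds are illustrative guesses of mine, not Stantum's IVSM parameters:

```python
# Hedged sketch of contact discrimination: classify touch contacts by
# area so palms can be rejected while stylus and finger input still
# registers. Thresholds are illustrative, not Stantum's IVSM values.

STYLUS_MAX_AREA = 8.0    # mm^2, a fine stylus tip (assumed)
FINGER_MAX_AREA = 120.0  # mm^2, a typical fingertip (assumed)

def classify_contact(area_mm2):
    """Label a contact as stylus, finger, or palm by its area."""
    if area_mm2 <= STYLUS_MAX_AREA:
        return "stylus"
    if area_mm2 <= FINGER_MAX_AREA:
        return "finger"
    return "palm"

def accepted_contacts(contacts):
    """Drop palm contacts; keep stylus and finger input."""
    return [c for c in contacts if classify_contact(c["area"]) != "palm"]

# A stylus tip, one finger, and a resting palm: the palm is rejected,
# so a student can write with their hand on the screen.
contacts = [{"area": 3.0}, {"area": 70.0}, {"area": 900.0}]
print([classify_contact(c["area"]) for c in contacts])  # -> ['stylus', 'finger', 'palm']
print(len(accepted_contacts(contacts)))  # -> 2
```

Real implementations also look at contact shape and timing, but area alone already shows why this matters for handwriting on a tablet.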
RELATED
Add Ambidexterity and Multi-modality to your Touch-enabled Devices
Guillaume Largillier, Electronic Design, 7/14/11
 "In South Korea, the government recently confirmed its plan to replace all printed textbooks nationwide by electronic counterparts by 2015, a move that will benefit from now available multi-touch technologies with precision and stylus input that enable handwriting recognition, making them especially suitable for products sold into Asian markets." 
About Guillaume Largillier:
"Guillaume Largillier’s background with the company includes seven years of entrepreneurship – co-founding JazzMutant and serving as CEO as it evolved into Stantum. Largillier has a broad knowledge of human-machine interface technologies, with strong expertise in multi-modal user interfaces. He has participated in major industry conferences (including SIGGRAPH, Microsoft Summit, SID Mobile Display Conference, Microsoft TechDays, and Interactive Displays Conference). Largillier earned an MA degree in audio-visual studies at the University of Bordeaux." -Stantum


Post: Multi-touch Update from Stantum
2010 Demo: MeeGo Tablet on Intel Moorestown 
Stantum's TouchPoints Newsletter (July 2011)
Stantum Whitepapers:
How to Evaluate Multi-Touch While Standing in a Store (pdf) - a great source of information.
Jim Meador, Pascal Auriel, Gauthier Chastan, Stantum
Specifying and Characterizing Tactile Performances for Multi-touch Panels: Toward a User-Centric Metrology (pdf) - outlines some important points!
Guillaume Largillier, Pascal Joguet, Cyril Recoquillon, Pascal Auriel, Axel Balley, Jim Meador, Julien Olivier, Gauthier Chastan
Stantum: info@stantum.com

Sep 4, 2011

Contre Jour: My current favorite iPad game - a good example of touchable, movable multimedia

Contre Jour is an iPad game from Chillingo.  It is my current favorite game, partly because it is a great example of how the power of a touch interface can be harnessed.

The beauty of this game is that it provides a natural means of scaffolding how the tools are used, so that as the player progresses, the learning curve feels almost effortless. In the game, the character is controlled through the player's ability to manipulate and move the environment. The underlying physics is intriguing.

Contre Jour was inspired by the children's book The Little Prince, but in this game, the "prince" is a cute little eyeball character.  Each chapter/world has a musical theme, and the music integrates with the game's visuals and "movements" to provide a relaxing ambience.

To learn more, take a look at the following video about Contre Jour.  I'm happy I spent a few dollars to buy the app!



I'll post more about this game as I play it!