Showing posts with label NUI. Show all posts

Nov 4, 2012

Stantum Update: Innovative Tablet and Mobile Tech; Tablets Push Leap Towards Interactive Multimedia "Textbooks"

Step into a school and you might notice that older desktop computers have been replaced by an assortment of portable devices.  In many cases, students still carry book bags stuffed with textbooks, binders, and worksheets, along with something digital.

So what is that "something digital"?   

In some cases, it might be a school-issued laptop or netbook.  In others, it could be an iPad, an e-reader, or another sort of tablet.  It could also be whatever the student brings from home, as part of a "BYOD" (Bring Your Own Device) experiment.

No one has figured it all out yet, but many people, from a number of disciplines, are trying!

Stantum is an example of a company that has an interest in the use of innovative technology for education.  Stantum develops multi-touch technology for mobile devices, including tablets, and follows trends in education closely. 

I've had the opportunity to speak with Guillaume Largillier, Stantum's co-founder and CSO, on several occasions, most recently in late October.  I'm happy to say that Stantum looks like it will be making some important inroads in education.

Why? 

Stantum is aware that as educational institutions, K-12 and above, leave textbooks behind, there will be an enormous need for devices that will seamlessly support teaching and learning, all around the world.  Guillaume spoke of efforts in countries such as South Korea, France, Turkey, Thailand, Colombia, Argentina, and China to adopt digital textbooks. Some of these countries have launched pilot programs using tablets or other mobile devices, or plan to do so in the very near future.


During our conversation, Guillaume pointed out that Stantum is involved in developing durable components and tablets made from materials that can withstand the knocking about that sometimes happens when handled by young hands.

Unlike the iPad, the tablets will not need a rugged, more expensive case, so the cost to schools is likely to be lower.  Since the tablets support most operating systems, they are likely to be easier for school IT administrators to deploy, update, and maintain.


A bit about the technology:    

Stantum's Interpolated Voltage Sensing Matrix (IVSM) technology supports 10 simultaneous touches and can handle touch and stylus input at the same time. It "knows" if it has been touched by a finger and can disregard a palm.  This is a good feature to have if a tablet is used by young people.
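Stantum hasn't published how IVSM actually separates fingers from palms, but a common heuristic in touch controllers is to reject any contact whose footprint is larger than a fingertip. The sketch below illustrates that general idea only; the Contact structure and every threshold in it are hypothetical, not Stantum's algorithm.

```python
# Toy palm-rejection heuristic: classify contacts by footprint size.
# The Contact fields and thresholds are illustrative assumptions, not
# Stantum's proprietary IVSM logic.
from dataclasses import dataclass

@dataclass
class Contact:
    x: float          # position on the sensor grid
    y: float
    area_mm2: float   # estimated contact footprint

FINGER_MAX_AREA = 120.0   # hypothetical cutoff: fingertips are small, palms are not

def accepted_touches(contacts, max_touches=10):
    """Keep up to `max_touches` finger-sized contacts, discarding palm-sized ones."""
    fingers = [c for c in contacts if c.area_mm2 <= FINGER_MAX_AREA]
    return fingers[:max_touches]

frame = [Contact(10, 12, 45.0),    # fingertip
         Contact(40, 80, 900.0),   # resting palm -> rejected
         Contact(22, 30, 60.0)]    # stylus tip
print(len(accepted_touches(frame)))  # -> 2
```

A real controller would also track contacts over time and use shape, not just area, but the size test alone shows why a resting palm can be ignored while ten fingers are still reported.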

Earlier this year, Stantum announced its partnership with NISSHA, a company based in Japan, and unveiled its newest development, Fine Touch Z Technology, powered by IVSM.  It can support Windows and Android operating systems.  It has a fast scanning engine, high-resolution handwriting input, and does not produce ghost or masking effects.  (See video below).

One appealing feature of this technology is its low power consumption. This is a plus when considering students' need for a device with extended battery life.


Fine Touch Z from Stantum on Vimeo.


RELATED

A leap towards interactive digital "textbooks":

Although students have been accessing educational content in digital form on computers and through educational television programming for a long time, textbooks and paper-based assignments have been major tools used in school to transmit the curriculum.  

The tools are changing as we move to a digital, multimedia world of communication, collaboration, knowledge sharing, teaching, learning, and creating.  There are many questions to consider.
  • What sort of digital content will replace traditional textbooks?
  • Who will create this content? 
  • Will traditional textbook publishers simply transform textbooks into slightly more interactive versions of the ho-hum content students love to hate?
  • How will digital interaction change the way everyone learns?  How will this be measured?
  • How will teachers and students be provided with opportunities to create new ways of sharing knowledge?
  • How will usability and accessibility concerns be addressed, for students, teachers - and family members, given that digital content will be accessed both in and outside of school?
  • What sort of software systems will need to be developed?
  • What sort of infrastructure will be needed to support this influx of online activity and transfer of large data files -  at schools, in homes, and in communities?
  • How will the technological ecology support learning, given multiple devices, interactive whiteboards and other interactive surfaces? 

With change, there is usually confusion as old ways are set aside and people explore new options.  

Apple is making some inroads with the iPad, enabling educators (and students) to create interactive books with the free iBooks Author program. Some schools have 1-1 laptop programs in place, and provide access to educational content through more traditional course management systems or web-based activities that accompany existing textbooks.  Schools are signing up for resources such as Google in Education and Microsoft Partners in Learning.

If you are looking for some resources about the rapid increase in interest and adoption of interactive digital textbooks, not just in the U.S., but around the world, take a look at my recent blog post,  "Got Interactive (Multimedia) Textbooks Inside Your iPad or Tablet?  Lots of Resources!".  In the post, I provide a wealth of links to information from the LEAD commission, the FCC's Digital Textbook Playbook, and more.  

You might also want to take a look at Audrey Watters' article, The Truth About Tablets: Educators are getting iPads and e-readers into students' hands--but it's not easy, which was published online in the School Library Journal earlier this year.

For a global perspective, read Alex Wukman's article, World Bank Begins Global Digital Textbook Initiative.

It wouldn't hurt to learn more about what the textbook publishers have in mind.  For that reason, I've put together a sample of resources about interactive digital content from major educational media and textbook publishing organizations:

DISCOVERY EDUCATION
Techbooks
In the following video, students and teachers from an elementary school in Winston-Salem give their reviews of the new Discovery Education Science Techbook.  Students can access the Discovery Techbook via the web.  In this video, students interact with the content using traditional desktop computers as well as interactive whiteboards.  Techbooks are designed to work on tablets or laptops, too.  At :39, a first grade teacher explains exactly why she likes the science techbook:


"...everything is right here at your hands, it is interesting to the kids, they love to watch the videos, and you can take it further - beyond than that, and everything is done for you, it is just really helpful. It even has the prep, the content review, and it is all aligned to the curriculum." -Laurie Moran



Discovery Education Science Techbook from Judy Uhrig on Vimeo.

Discovery Education Science Techbook Overview (Video)
Techbook for High School Science
Techbook for K-8 Science

PEARSON
Pearson's new interactive textbook for the iPad
"It is really going to revolutionize how we think about the classroom experience and what happens in the classroom in the way of learning, and what happens outside the classroom." -Joseph Levine, Author, Miller & Levine Biology




INKLING
Major Publisher Investment Advances Inkling as the Future of Digital Textbooks
Audrey Watters, ReadWrite, 3/22/11
"What makes Inkling’s apps unique is the fact that “content isn’t bound by pages or sections or chapters in the same linear fashion. Rather, it’s hierarchical, richly illustrated and augmented. It’s interactive. It’s social,” Watters writes, quoting Inkling's Matt MacInnis. The digital versions include quizzes, interactive infographics, and a scrolling and searchable interface.

McGRAW-HILL RYERSON
McGraw-Hill Ryerson Launches New High School iBooks Textbooks
PRNewswire, 10/25/12

HARCOURT
Harcourt Interactive Textbooks (Produced by Vertex)


SOMEWHAT RELATED
The Truth About Tablets: Educators are getting iPads and ereaders into students' hands--but it's not easy
Audrey Watters, School Library Journal, 2/1/12

Jul 30, 2012

Sound IS Important: Sonification, sound synthesis, sound design, interactive sound, the Olympics...

I happened to be listening to the radio and heard an interesting interview featuring Dennis Baxter, the sound designer/engineer for the 2012 Olympics.  The interview, along with examples of sounds, is provided on the NPR website:

Making The Olympics Sound Right, From a 'Swoosh' to a 'Splash'
Becky Sullivan, 7/28/12, NPR.  Listen to this story (All Things Considered, 5 min, 54 sec)

Not all of the sounds you'll hear during the Olympics are "real". There are a few sports that are impossible to capture accurately, and for this reason, Dennis Baxter explains, special techniques are used.  

If you'd like to dig deeper into the world of sound, take a look at "The Sonification Handbook", edited by Thomas Hermann, Andy Hunt, and John G. Neuhoff.  The editors collectively have backgrounds in computer science, physics, interactive sonification, multi-touch, music, psychology, auditory perception, and neuroscience.


Below is a description from the book's website:
"This book is a comprehensive introductory presentation of the key research areas in the interdisciplinary fields of sonification and auditory display. Chapters are written by leading experts, providing a wide-range coverage of the central issues, and can be read from start to finish, or dipped into as required (like a smorgasbord menu)"


"Sonification conveys information by using non-speech sounds. To listen to data as sound and noise can be a surprising new experience with diverse applications ranging from novel interfaces for visually impaired people to data analysis problems in many scientific fields." 


"This book gives a solid introduction to the field of auditory display, the techniques for sonification, suitable technologies for developing sonification algorithms, and the most promising application areas. The book is accompanied by the online repository of sound examples. The book is a comprehensive introduction to this interdisciplinary field."


The PDF version of the Sonification Handbook can be downloaded for free:
http://sonification.de/handbook/download/TheSonificationHandbook-HermannHuntNeuhoff-2011.pdf 
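To make the handbook's definition of sonification concrete, here is a minimal parameter-mapping example: each value in a data series is rendered as a short sine tone whose pitch rises with the value. The frequency range, tone length, and file name are arbitrary illustrative choices (not from the handbook), and only the Python standard library is used.

```python
# Minimal parameter-mapping sonification: each data value becomes a short
# sine tone; higher values map to higher pitches.
import math
import struct
import wave

def sonify(data, path="sonified.wav", rate=22050, tone_sec=0.2,
           f_lo=220.0, f_hi=880.0):
    lo, hi = min(data), max(data)
    span = (hi - lo) or 1.0           # avoid divide-by-zero for flat data
    samples = []
    for v in data:
        freq = f_lo + (v - lo) / span * (f_hi - f_lo)   # value -> pitch
        for n in range(int(rate * tone_sec)):
            samples.append(int(12000 * math.sin(2 * math.pi * freq * n / rate)))
    with wave.open(path, "wb") as w:
        w.setnchannels(1)             # mono
        w.setsampwidth(2)             # 16-bit PCM
        w.setframerate(rate)
        w.writeframes(struct.pack("<%dh" % len(samples), *samples))
    return len(samples)               # total samples written

print(sonify([3, 1, 4, 1, 5, 9, 2, 6]))  # -> 35280 (8 tones of 4410 samples)
```

Playing the resulting file lets you "hear" the shape of the series - the same principle, elaborated considerably, that underlies the auditory displays the handbook surveys.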

RELATED
This section includes a few interesting videos that are best viewed with headphones, along with a number of links to explore.

Andrew South - Sound Designer Show reel

Andrew South - Sound Designer Showreel from Andrew South on Vimeo.

Blip Shaper Walkthrough, Christian Bannister 

Blip Shaper Walkthrough from christian bannister on Vimeo.
FYI:  Christian Bannister's projects

Duet for Synthesizer and the Washing Machine, John Keston, Web Baker 
Produced using binaural sound; use headphones for the best effect.




Motion-driven Concatenative Synthesis of Cloth Sounds (SIGGRAPH 2012)



99% Invisible: Sound of Sport
Roman Mars, Turnstyle, 10/19/11
Social Sound Design
"SSD is a Q&A site encompassing all the wonderful disciplines of sound design: film, game, art and installations, sound effects, new media, software, programming (Max/MSP; Pd, etc.), Arduino and micro-controllers, gear, feedback, recording, techniques and tips... as long as it involves sound design it is welcome here!"
NIME 2012 Online Proceedings (New Interfaces for Musical Expression)
NIME 2013
Big Fat Audio 
Provides royalty-free downloads of sound effects and more.
Perry R. Cook, Professor Emeritus, Princeton
(Author of Real Sound Synthesis for Interactive Applications-2002)
AKPeters Sound Examples (listen if you are curious)
Adding a 3D Positional Effect to a Sound (XNA Game Studio 4)

Jul 29, 2012

Blast from the 2009 past: News, Videos, and Links about Multi-touch and Screen Technologies

One of the things I like to do is share updates about the world of multimedia, multi-touch, gesture, screen, surface, and interactive technologies, focusing on off-the-desktop applications and systems. When I started this blog, I had to put forth quite a bit of effort just to FIND interesting things to blog about.  


These days, there are so many sources that focus on emerging - and now commonplace- interactive technologies, my main challenge is to filter the noise.  Where do I begin?


My archives are vast.   I randomly picked the year 2009 and came across one of my previous posts, "News, Videos, and Links about Multitouch and Screen Technologies."   The post is long, and contains a number of videos and links that probably will be of value to a future curator of the history of technology.


I welcome comments from readers who might be able to help me update information about various applications and systems I've featured on this blog in the past. 

The pictures are screenshots from the results of an  image search for "interactivemultimediatechnology".  Over the past 6 years, I've posted quite a few!








Jul 21, 2012

Musings about NUI, Perceptive Pixel and Microsoft, Rapid Creative Prototyping (Lots of video and links) Revised

It just might be the right time for everyone to brush up on 21st century tech skills. iPads and touch-phones are ubiquitous. Touch-enabled interactive whiteboards and displays are in schools and boardrooms.  With Microsoft's Windows 8 and the news that the company recently acquired Jeff Han's company, Perceptive Pixel, I think that there will be good support - and more opportunities - for designers and developers interested in moving from GUI to NUI.


In the video below, from CES 2012, Jeff Han provides a good overview of where things are moving in the future.  We are in a post-WIMP world and there is a lot of catching up to do!

CES 2012  Perceptive Pixel and the Future of Multitouch (IEEE Spectrum YouTube Channel)



During the video clip, Jeff explains how far things have come during the past few years:
 "Five and 1/2 years ago I had to explain to everybody what multi-touch was and meant. And then, frankly, we've seen some great products from folks like Apple, and really have executed so brilliantly, that everyone really sees what a good implementation can be, and have come to expect it.  I also think though, that the explosion of NUI is less about just multi-touch, but an awareness that finally people have that you don't have to use a keyboard and mouse, you can demand something else beside that.  People are now willing to say, "Oh, this is something I can try, you know, touch is something I can try as my friendlier interface"."

Who wouldn't want to interact with a friendlier interface?  Steve Ballmer doesn't curb his enthusiasm about Windows 8 and Perceptive Pixel.  Jeff Han is happy with how designs created in Windows 8 scale for use on screens large and small. He explains how Windows 8 can support collaboration. The Story Board application (7:58) on the large touchscreen display looks interesting.

I continue to be frustrated by the poor usability of many web-based and desktop applications.  I like my iPad, but only because so many dedicated souls have given some thought to the user experience when creating their apps.  I often meet with disappointment when I encounter interactive displays when I'm out and about during the day.  It is 2012, and it seems that there are a lot of application designers and developers who have never read Don Norman's The Design of Everyday Things!



I enjoy making working prototypes and demo apps, but my skill set is stuck in 2008, the last year I took a graduate-level computer course.  I was thinking about taking a class next semester, something hands-on, creative, and also practical, to move me forward. I can only do so much when I'm in the DIY mode alone in my "lab" at home.  I need to explore new tools, alongside like-minded others.  


There ARE many more tools available to designers and developers than there were just four years ago.  Some of them are available online, free, or for a modest fee.  I was inspired by a link posted by my former HCI professor, Celine Latulipe, to her updated webpage devoted to Rapid Prototyping tools. The resources on her website look like a good place to start for people who are interested in creating applications for the "NUI" era.  (Celine has worked on many interesting projects that explore how technology can support new and creative interaction, such as Dance.Draw.) Below is her description of her updated HCI resources:

"New HCI resource to share: I have created a few pages on my web site devoted to Rapid Prototyping tools, books, and methods. These pages contain reviews of various digital tools, including 7 different desktop prototyping apps, and including 8 different iPad apps for wireframing/prototyping. I hope it's useful to others. Feel free to share... and please send me comments and suggestions if you find anything inaccurate, or if you think there is stuff that I should be adding. I will be continuing to update this resource." -http://www.celinelatulipe.com (click on the rapid prototyping link at the top)



IDEAS
Below are just a few of my ideas that I'd like to implement in some way. I can't claim ownership of these ideas - they are mash-ups of what comes to me in my dreams, usually after reading scholarly publications from ACM or IEEE, or attending tech conferences.
  • An interactive timeline, (multi-dimensional, multi-modal, multimedia) for off-the-desktop interaction, collaboration, data/info analysis exploration.  It might be useful for medical researchers, historians, genealogists, or people who are into the "history of ideas".  Big Data folks would love it, too. It would handle data from a variety of sources, including sensor networks. It would be beautiful to use.
  • A web-based system of delivering seamless interactive, multi-modal, immersive experiences, across devices, displays, and surfaces. The system would support multi-user, collaborative interaction.  The system would provide an option for tangible interaction.
  • A visual/auditory display interface that presents network activity, including potential intrusions, malfunctions, or anything that needs immediate attention that would be likely to be missed under present monitoring methods. 
  • Interactive video tools for creation, collaboration, storytelling.  (No bad remote controllers needed.)
  • A "wearable" that provides new ways for people to express and communicate creatively, through art, music, and dance, with wireless capability. (It can interact with wireless sensor networks.)
  • A public health application designed to provide information useful in understanding and preventing sepsis. This application would utilize the timeline concept described at the top of this list. The concept could also be useful in analyzing other medical puzzles, such as autism.
Most of these ideas could translate nicely to educational settings, and the focus on natural user interaction and multi-modal i/o aligns with the principles of Universal Design for Learning, something that is important to consider, given the number of "at-risk" learners and young people who have disabilities.

I welcome comments from readers who are working on similar projects, or who know of similar projects.  I also encourage graduate students and researchers who are interested in natural user interfaces to move forward with an off-the-desktop NUI project.  I hope that my efforts can play a part in helping people make the move from GUI to NUI!



Below are a few videos of some interesting projects, along with a list of a few references and links.


SMALLab (Multi-modal embodied immersive learning)


PUPPET PARADE: Interactive Kinect Puppets (CineKid 2011)



MEDIA FACADES: When Buildings Start to Twitter

HUMANAQUARIUM (CHI 2012)

 

NANOSCIENCE NRC Cambridge (Nokia's Morph project)






 
Examples: YouTube Playlists
POST WIMP EXPLORERS' CLUB
POST-WIMP EXPLORER'S CLUB II

Web Resources
Celine Latulipe's Rapid Prototyping Resources 
Creative Applications
NUI Group: Natural User Interface Group
OpenFrameworks and Interactive Multimedia: Funky Forest Installation for CineKid
SMALLab Learning
OpenExhibits: Free multi-touch + multiuser software initiative for museums, education, nonprofits, and students.
OpenSense Wiki 
CINEKID 2012 Website 
Multitouch Systems I Have Known and Loved (Bill Buxton)
Windows 8
Perceptive Pixel
Books
Natural User Interfaces in .NET: WPF 4, Surface 2, and Kinect (Josh Blake, Manning Publications)
Chapter 1 pdf (Free)
Brave NUI World: Designing Natural User Interfaces for Touch and Gesture (Daniel Wigdor and Dennis Wixon)
Designing Gestural Interfaces (Dan Saffer)
Posts
Bill Snyder, ReadWrite Web, 7/20/12

I noticed some interesting tools on the Chrome web store - I plan to devote a few more posts to NUI tools in the future.

Jul 19, 2012

Your Palm as Remote Controller (Video and Links)

Traditional remote controls for televisions and home media centers are particularly frustrating to use, in my opinion.  There are too many buttons, the buttons are too small, and it is easy to mess it all up.  (I've touched on this topic in a number of blog posts, such as UX of ITV: The User Experience and Interactive TV (or Let's Stamp Out Bad Remote Controls) and others, linked at the end of this post.)


There is hope for the future!  I'm happy to share a video about a method of remote control that might make interacting with my TV less annoying...and possibly pleasant! 


Researchers at the Technical University of Darmstadt have been working on a way for people to control their TV without a remote or hand-held device.  I think they are on the right path. The video shows how UI elements, mapped to the hand, can be touched, with accuracy, to interact eyes-free with a large-screen TV.  This work won the Best Paper award at EuroITV'12.


For more details, see Leveraging the Palm Surface as an Eyes-free TV Remote Control and the references below:
Niloofar Dezfuli, Mohammadreza Khalilbeigi, Jochen Huber, Florian Müller, and Max Mühlhäuser. 2012. PalmRC: Imaginary Palm-based Remote Control for Eyes-free Television Interaction. In Proceedings of the 10th International Conference on Interactive Television (EuroITV '12). ACM, New York, NY, USA, to appear. 


Niloofar Dezfuli, Mohammadreza Khalilbeigi, Jochen Huber, Florian Müller, and Max Mühlhäuser. 2012. Leveraging the palm surface as an eyes-free tv remote control. In Proceedings of the 2012 ACM annual conference extended abstracts on Human Factors in Computing Systems Extended Abstracts (CHI EA '12). ACM, New York, NY, USA, 2483-2488. http://doi.acm.org/10.1145/2223656.2223823
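The papers describe mapping UI elements onto the palm so a TV can be controlled eyes-free. As a toy illustration of that idea (not the layout or sensing method from the papers), the sketch below snaps a touch position, normalized to the palm's bounding box, onto a 3x3 grid of imaginary remote-control buttons; the button layout is entirely made up.

```python
# Toy version of the PalmRC idea: imaginary TV controls laid out on the palm.
# A touch, given as coordinates normalized to the palm's bounding box
# (0..1 on each axis), is snapped to a cell of a 3x3 button grid.
BUTTONS = [["power", "ch_up",   "vol_up"],
           ["left",  "ok",      "right"],
           ["mute",  "ch_down", "vol_down"]]

def palm_to_button(u, v):
    """Map a normalized palm touch (u, v) to the grid cell it falls in."""
    col = min(int(u * 3), 2)   # clamp u = 1.0 into the last column
    row = min(int(v * 3), 2)   # clamp v = 1.0 into the last row
    return BUTTONS[row][col]

print(palm_to_button(0.5, 0.5))   # center of the palm -> ok
print(palm_to_button(0.95, 0.1))  # upper-right corner -> vol_up
```

The interesting part of the real system is that users can hit these regions accurately without looking, relying on the proprioceptive sense of their own hand; the grid lookup itself is the trivial part.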

SOMEWHAT RELATED
Oh! No! Sony's "Mother of Remote Controls" for Google TV
Interactive Multimedia Across Platforms and Screens: Adobe's Open Screen Project; MEX Mobile User Experience Manifesto... (Please don't annoy the user!)
Usability of a Remote Control
Quick Link:  The Remote - Death by Voice Control?

Jul 13, 2012

Update: Video of My PlayHome App and 19-Month-Old Toddler


This little guy "plays" iPad about once or twice a week. The video shows him with the My PlayHome app. It is interesting to see how much he remembered from the previous week!


In the near future, I plan to write a few posts about the apps that I've used with students with special needs.  I will also touch on a few apps that are great for toddlers and "twos".

Jul 12, 2012

CFP for Special Issue of Personal and Ubiquitous Computing on Educational Interfaces, Software, and Technology (EIST)



Overview 
One of the primary goals of teaching is to prepare learners for life in the real world. In this ever-changing world of technologies such as mobile interaction, cloud computing, natural user interfaces, and gestural interfaces like the Nintendo Wii and Microsoft Kinect, people have a greater selection of tools for the task at hand. Given the potential of these new interfaces, software, and technologies as learning tools, as well as the ubiquitous application of interactive technology in formal and informal learning environments, there is a growing need to explore how next-generation technologies will impact education in the future. 


As a community of Human-Computer Interaction (HCI) and educational researchers, we need to theorize and discuss how new technologies should be integrated into the classrooms and homes of the future. In the last three years, three CHI workshops have provided a forum to discuss key issues of this sort, particularly in the context of next-generation education. The aim of this special issue of Personal and Ubiquitous Computing is to summarize the potential design challenges and perspectives on how the community should handle next-generation technologies in the education domain for both teachers and students. 

We invite authors to present position papers about potential design challenges and perspectives on how the community should handle the next generation of HCI in education. Topics of interest include but are not limited to: 

  • Gestural input, multitouch, large displays 
  • Mobile devices, response systems (clickers) 
  • Tangible, VR, AR & MR, multimodal interfaces 
  • Console gaming, 3D input devices 
  • Co-located interaction, presentations 
  • Educational pedagogy, learner-centric, child computer interaction 
  • Empirical methods, case studies 
  • Multi-display interaction 
  • Wearable educational media
Important Dates
  • Full papers due: November 9, 2012
  • Initial reviews to authors: January 18, 2013
  • Revised papers due: March 15, 2013
  • Final reviews to authors: April 26, 2013
  • Final papers due: June 14, 2013
Submission Guidelines
Submissions should be prepared according to the Word template located at the bottom of this page. All manuscripts are subject to peer review. Manuscripts must be submitted as a PDF via the EasyChair submission system. Submissions should be no more than 8000 words in length.

Guest Editors and Contact Information
  • Syed Ishtiaque Ahmed, Cornell University
  • Quincy Brown, Bowie State University
  • Jochen Huber, Technische Universität Darmstadt
  • Si Jung “Jun” Kim, University of Central Florida
  • Lynn Marentette, Union County Public Schools, Wolfe School
  • Max Mühlhäuser, Technische Universität Darmstadt
  • Alexander Thayer, University of Washington 
  • Edward Tse, SMART Technologies

Information about the Journal of Personal and Ubiquitous Computing

Seventh International Conference on Tangible, Embedded and Embodied Interaction, Feb 10-13, Barcelona, Spain: Call for Papers

CALL FOR PAPERS
TEI 13:  Seventh International Conference on 
Tangible, Embedded and Embodied Interaction
February 10-13, 2013.  Barcelona, Spain
Home

Important Dates:
August 6th, 2012, 11:59 pm (UTC+1): Paper submission deadline 
November 26th, 2012, 11:59 pm (UTC+1): Work in Progress submission deadline 
October 22nd, 2012: Paper acceptance notification 


February 10-13th, 2013: TEI 2013 Conference at Barcelona 


Description 
TEI 2013 is the seventh international conference on tangible, embedded, and embodied interaction. The work presented at TEI focuses on physical interaction with computing technology and addresses design challenges, theories, experiences, systems, and new developments. The intimate size of this single-track conference provides a unique forum for exchanging ideas and presenting innovative work through talks, interactive exhibits, demos, hands-on studios, posters, art installations and performances.


Research on tangible, embodied and embedded interaction has gained substantial visibility and activity over the past decade, under many different names and research topics and disciplines, including tangible interfaces, graspable interfaces, physical computing, tangible interaction, IT product design, appliance design and interactive spaces. It has also been associated with larger research areas, including mixed, virtual, and augmented reality and ubiquitous and pervasive computing. TEI brings together this emerging field, providing a meeting ground for the diverse communities of research and practice - from computing, hardware, and sensor technology, to HCI, interaction design and CSCW, to product and industrial design and interactive arts. We invite submissions from all of these perspectives: theoretical, conceptual, technical, applied, or artistic. The conference is designed to provide appropriate presentation forms for different types of contributions. Accepted submissions of all types will be included in the Proceedings as papers and will be integrated within the single-track conference. Interdisciplinary submissions are particularly welcome. 


This year TEI will be held in beautiful Barcelona, one of the world's leading cities for tourism, gastronomy, commerce, education, entertainment, media, fashion, science, and the arts. 


Topics for Submission 
Authors are invited to submit high-quality work detailing original research that contributes to advancing this rapidly developing field. Appropriate topics include but are not limited to:



  • Tools to rapidly develop tangible, embedded and embodied user experiences 
  • Case studies and evaluations of working deployments 
  • Analysis of key challenges and proposals of research agendas 
  • Relation of tangible and embedded interaction to other paradigms 
  • Programming paradigms and tools, toolkits, software architectures 
  • Novel interactive uses of sensors+actuators, electronics+mechatronics 
  • Design guidelines, methods, and processes 
  • Novel application areas, innovative systems, industrial applications 
  • Theoretical foundations, frameworks, and concepts 
  • Philosophical, ethical & social implications Interfaces specific in form and context to particular cultures 
  • Advantages, weakness, affordances of tangible, embedded and embodied interaction Learning from the role of physicality in everyday environments 
  • Embodied interaction, movement, and choreography of interaction 
  • Organic User Interfaces: flexible, non-flat or actuated display interfaces 
  • Role of physicality in human perception, cognition and experience 
  • Teaching experiences, lessons learned, and best practices 
  • Standardization, production, and business applications 

Papers 
Papers must present original material and will be reviewed rigorously by at least three reviewers in a double-blind process. Papers in all areas will be assessed on their contribution to the field. Papers are due on the 6th August 2012. Papers may be 4 or 8 pages long in ACM SIGCHI format. All papers will undergo the same review and publication process. The length must match the contribution. Authors may propose the presentation format that they feel best suits their contribution (short or long talk, demo, interactive exhibit, or both talk and demo/exhibit). TEI 2013 remains committed to the principle followed in previous TEI conferences - that TEI values all submissions equally. 


One author of each accepted submission must register for the conference before the early registration deadline in order for the final paper version to be published in the conference proceedings. Papers will be published in the ACM digital library. 


Please contact the program chairs Martin Kaltenbrunner, Dave Kirk and Paul Marshall at
tei2013_programchairs@googlegroups.com with questions about paper submissions.

Submission Details 
All work must be submitted electronically via the TEI 2013 Precision Conference site at https://precisionconference.com/~tei and must follow the guidelines regarding paper formats and submission categories at http://www.tei-conf.org/13/papersubmission


Work in Progress 
There will be a second submission opportunity for work in progress until November 26th 2012. Selected works will be presented during dedicated poster sessions at TEI 2013 but will not be published in the conference or online proceedings. 


Explorations TEI 2013 
Explorations comprise several submission formats intended to embrace and nurture contributions from a wide range of communities - such as artists, designers and graduate students - as well as support the publication of cutting edge research. TEI Explorations include the Graduate Student Consortium and Art tracks as well as the Tangible Interaction Design Challenge. Full details will be confirmed in subsequent calls for papers.

Jul 8, 2012

PO-MO, a creative group that combines digital art, interaction, movement, and play to create engaging surfaces and spaces.

I recently learned more about PO-MO, a relatively new tech start-up based in Winnipeg, Canada. According to the company's information, PO-MO "specializes in interactive digital display solutions, including gesture and motion based interactivity, interactive display content creation and management, and large interactive display and projection services for advertisers, educators, and events."  


Po-Motion was a finalist in an elevator-pitch video contest last fall. It has several advantages over potential competitors: the system is easy to use, and it is priced within reach of schools, museums, and other cost-conscious groups that would like to provide technology-supported immersive interactive experiences for people of all ages. The Po-Motion software designed for interactive floors and walls starts at $39.99 and works on any computer, using any USB web camera and a projector. Other applications make use of Kinect sensors.
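The company doesn't publish its algorithm, but webcam-plus-projector interactive surfaces of this kind commonly rely on frame differencing: compare consecutive camera frames and treat any region whose brightness changed sharply as motion. Here is a minimal sketch of that idea, with hypothetical function names and toy "frames" represented as grids of brightness values:

```python
# Sketch of frame differencing, the usual basis of camera-driven
# interactive floors. Function names and the grid representation are
# my own; a real system would process full-resolution camera images.

def motion_regions(prev_frame, curr_frame, threshold=30):
    """Return (row, col) cells where two grayscale frames differ.

    Frames are lists of lists of 0-255 brightness values.
    """
    changed = []
    for r, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for c, (p, q) in enumerate(zip(prev_row, curr_row)):
            if abs(p - q) > threshold:
                changed.append((r, c))
    return changed

# A bright "hand" appears in one cell of an otherwise static scene:
before = [[10, 10], [10, 10]]
after_ = [[10, 10], [10, 200]]
print(motion_regions(before, after_))  # -> [(1, 1)]
```

In a deployed installation, the flagged cells would then drive whatever visual effect is projected back onto the floor or wall at those positions.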


I especially like one of PO-MO's recent projects, the Impossible Animals exhibit, created with Unity 3D for the Manitoba Children's Museum.  How does it work?  Children draw a colored egg using crayons and paper, which is then scanned and digitally embedded into the system, which includes an interactive wall and floor.  When the egg is touched, it is activated to hatch, and it then becomes a motion-reactive animal.  The environment includes things like water, landscapes, and even a spaceship.  The system has a "reset world" button for museum staff to use when needed.  
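The egg-to-animal flow described above can be pictured as a small state machine. The state names and transitions below are my own guesses from the description; the actual Unity implementation is certainly richer:

```python
# Hypothetical sketch of the exhibit's flow as a tiny state machine.
# States and transitions are inferred from the written description,
# not taken from PO-MO's actual code.

class ImpossibleAnimal:
    def __init__(self, drawing):
        self.drawing = drawing   # the child's scanned crayon egg
        self.state = "egg"

    def touch(self):
        # A visitor touching the egg activates the hatching animation.
        if self.state == "egg":
            self.state = "hatching"

    def hatch_complete(self):
        # Once hatched, the creature reacts to motion on the wall/floor.
        if self.state == "hatching":
            self.state = "animal"

    def reset_world(self):
        # The staff-only "reset world" button returns everything to eggs.
        self.state = "egg"

creature = ImpossibleAnimal("green spotted egg")
creature.touch()
creature.hatch_complete()
print(creature.state)  # -> animal
```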

Impossible Animals Exhibit

Impossible Animals Interactive Museum Installation from PO-MO Inc. on Vimeo.



The following video explains how the PO-MO system works:


PO-MO is also involved in promotional projects, assisting retailers, ad agencies, and brand managers with creative ways to engage customers and clients:
Ragpickers Kinect-based Window Display

Ragpickers Kinect Window Display from PO-MO Inc. on Vimeo.


The following video provides a scrolling description about PO-MO's work, including promising data collected during implementation:

Other products and services provided by PO-MO include mobile app development. I especially like the augmented reality business card depicted in the following video clip:

Augmented Reality Business Card from PO-MO Inc. on Vimeo


Imagine if your local shopping centers, museums, libraries, or even schools offered this level of immersive interaction on a regular basis!

RELATED
The PO-MOtion system has a wide range of uses. It is currently used in an educational setting in a sensory room for students with special needs, something that I'd like to try out in the near future with students at Wolfe School. I plan to share more about this in another post.


PO-MO Case Studies


PO-MO Bios:
Meghan Athavale – Director/CEO, PO-MO Inc.
"Meghan has been a professional designer and animator since graduating from Red River College in 1997. After graduation, she moved to Calgary, where she spent almost two years directing projects at Aurenya Studios, a start-up animation company. In 2001, Meghan was engaged by Community Connections to support community-based IT development projects in rural Manitoba and in Winnipeg’s inner city.  In 2008, Meghan joined Manlab, developing educational interactive games and resources for Immigrate Manitoba. She also launched Meghan PO-MO Project, a sole proprietorship which provided sound reactive visuals for DJs and venues across Canada. In 2009, Meghan was contracted as the User Experience Designer at Tipping Canoe, a multinational internet marketing company.

In 2010, Meghan formed PO-MO Inc. in partnership with Curtis Wachs. She began working exclusively for the company in December, 2010. Today, Meghan is the driving force behind PO-MO Inc."


Curtis Wachs – Technical Director/COO, PO-MO Inc.
"Curtis graduated from Assiniboine Community College in 2003 where he studied object oriented programming. Directly upon graduating, Curtis was hired by Assiniboine Community College to help design and develop software for online classes. Curtis relocated to Winnipeg in 2006 to create interactive training material for sales staff at E.H. Price. During the course of his work, Curt was apprenticed in 3D modelling and animation by Liem Ngyuen, a former Frantic Films resident. In 2008, Curtis joined Manlab, where he created online educational games for Travel Manitoba, Immigrate Manitoba, and other clients. In 2010, Curtis formally joined PO-MO Project, and the company became a partnership. In June 2010, PO-MO Inc. was founded.

Curtis is currently the technical director at PO-MO Inc., overseeing the project management and workflow of contracted and R&D development projects."


May 28, 2012

Interactive Music Challenge: The Reactable & the Reactable for Children with Autism Spectrum Disorders

What I'd like to share today might be of interest to educators, techies, musicians, parents, and students. If you create a composition using the Reactable Mobile app and submit it, you might have a chance of winning your very own Reactable Live!  


Information about the challenge can be found on the Reactable website.


You can find information about creating music with Reactable Mobile for your phone or tablet (Android or Apple) on the Reactable Mobile site. I especially like that the company provides a load of support through the Reactable Community.



Acquisition of Joint Attention and Social Abilities of Children with Autism Spectrum Disorders

The Reactable was the centerpiece of Lilia Villafuerte Bazan's Master's thesis project, which focused on the acquisition of joint attention and social abilities by children with autism spectrum disorders through music. Lilia's work was conducted through the Music Technology Group at Universitat Pompeu Fabra, under the direction of Dr. Sergi Jorda.   

I was delighted to see Lilia's presentation of her work, along with the video, during the recent CHI 2012 conference. 


Project website: 
http://www.villafuerte.info/ascchildrenandtuis
Master Thesis:
http://www.villafuerte.info/wp-content/uploads/2012/05/LiliaVillafuerte_MasterThesis.pdf


-Lilia Villafuerte Bazan

RELATED
REACTABLE LIVE

Comment:
If you follow this blog, you probably know that I am a school psychologist who has a background in technology, music, and related arts.  I spend much of my work week with young people who have severe autism and other disabilities. I also work at a magnet high school for technology and the arts.  I'd like a Reactable for students at both schools!

May 21, 2012

Leap Motion: Low Cost Gesture Control for Your Computer Display

Jessica Vascellaro, of the Wall Street Journal, reports on gesture, motion, and even object control for computers, highlighting the work of Leap Motion and Flutter.




Apparently the Leap Motion sensor is less expensive than Microsoft's Kinect. It can track finger and hand movements down to 1/100 of a millimeter, and it handles interaction within 8 cubic feet of space.
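To make that 8-cubic-foot figure concrete, you can think of a 2 ft × 2 ft × 2 ft box above the sensor, with fingertip positions inside it mapped onto screen coordinates. The box shape, the mapping, and all names below are my own illustration; they are not taken from Leap Motion's SDK, which had not been publicly released at the time of this post:

```python
# Hypothetical sketch: mapping a fingertip position inside an assumed
# 2x2x2 ft interaction volume (8 cubic feet) onto a 1920x1080 display.
# Coordinates are measured in feet from one corner of the volume.

def finger_to_pixel(x_ft, y_ft, screen_w=1920, screen_h=1080, box_ft=2.0):
    """Clamp (x, y) into the box, then scale to integer pixel coords."""
    px = int(min(max(x_ft / box_ft, 0.0), 1.0) * (screen_w - 1))
    py = int(min(max(y_ft / box_ft, 0.0), 1.0) * (screen_h - 1))
    return px, py

print(finger_to_pixel(1.0, 1.0))  # center of the box -> (959, 539)
```

The interesting engineering, of course, is in the sensing itself: resolving motion to 1/100 of a millimeter means the sensor distinguishes positions far finer than any single display pixel.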


Below is a video from the Leap Motion website:






RELATED
Leap FAQs
Leap Motion Developer Kit Application
Leap Motion: 3D hands-free motion control, unbound
Daniel Terdiman, CNET, 5/20/12
FYI:  Do a search and you'll find many more articles and posts about Leap Motion!

May 19, 2012

CHI 2012 SlideShow (quick)

Here are some pictures from CHI 2012:


I'll be sharing more about innovative, interesting interactive technologies in some of my future posts.

URBAN MUSICAL GAME: Play with musical-sensor sports balls and friends (Real-Time Musical Interactions team at IRCAM)

One of my favorite things at CHI 2012 was the Interactivity sessions.  I especially liked the Urban Musical Game, played with musical sports balls.  I didn't want to stop playing!


This just might transform P.E. and recess.


Below is a version of the game from 2011 created for the Future en Seine festival by the Real-Time Musical Interaction team at the IRCAM, in collaboration with NoDesign, Phonotonic, and composer Andrea Cera:

The balls in the game contain wireless sensors that trigger sounds and interactive music.  The system is supported by MO (Modular Musical Object) technology and Max/MSP. Below is a video of the balls in action (in French).

Nicolas Rasamimanana - Phonotonic, Paris, France 
Frederic Bevilacqua - IRCAM, France 
Julien Bloit - IRCAM, France 
Norbert Schnell - IRCAM, France 
Emmanuel Flety - IRCAM, France 
Andrea Cera - IRCAM, France 
Uros Petrevski - NoDesign, France 
Jean-Louis Frechin - NoDesign, France
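The basic sensing idea behind the balls can be sketched simply: stream accelerometer samples and fire a sound event whenever the motion energy crosses a threshold. The function name and threshold below are my own; in the real system, IRCAM's MO hardware streams the sensor data wirelessly and Max/MSP performs the actual sound synthesis:

```python
# Hypothetical sketch of bounce/throw detection for a sensor ball.
# A real implementation would also debounce and classify gestures.
import math

def triggers(samples, threshold=1.5):
    """samples: list of (ax, ay, az) accelerometer readings in g.

    Returns the indices where the acceleration magnitude exceeds the
    threshold, i.e. the moments a bounce or throw should fire a sound.
    """
    hits = []
    for i, (ax, ay, az) in enumerate(samples):
        if math.sqrt(ax * ax + ay * ay + az * az) > threshold:
            hits.append(i)
    return hits

# A resting ball reads about 1 g (gravity); then a sharp bounce:
print(triggers([(0, 0, 1.0), (0.2, 0.1, 1.0), (2.5, 0.5, 1.2)]))  # -> [2]
```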


Comment:
This would be a great activity for the large therapy room at Wolfe, a program for students with significant disabilities (including autism), where I spend most of my work days as a school psychologist.  I'd love to have it demonstrated in-person for students at the CATA (Central Academy of Technology & Arts), the high school I also serve, to inspire students about the creative aspects of technology.


I would also like to create my own experiments with this sort of technology, based on a number of interesting ideas that have popped up in my mind over the past decade or so.  I'm especially curious to know how this could work in environments that also have multiple large interactive visual displays....


How can I make it so?
(Update:  It is a small world.  I looked up one of the people involved in the Urban Musical Game on Facebook and found that we share three mutual FB "friends".)