Showing posts with label HCI. Show all posts

May 6, 2010

Google buys BumpTop

BumpTop is an application designed for touch-screen interaction, introduced at TED by Anand Agarawala, who developed it for his Master's thesis at the University of Toronto. The company was recently bought by Google.

Below is the video of Agarawala's 2009 TED Talk:




Here is a link to a previous post about BumpTop:
Bump Top 3D Desktop on a Touch Screen: Toss Your Photos to your Facebook Icon!



RELATED

Announcement about Google's acquisition of BumpTop, from the BumpTop website:

Dear BumpTop fans,
More than three years ago, we set out to completely change the way people use their desktops. We're very grateful for all your support over that time — not just financially but also through all the encouraging messages from people who found BumpTop inspiring, useful, and just downright fun.
Today, we have a big announcement to make: we're excited to announce that we've been acquired by Google! This means that BumpTop (for both Windows and Mac) will no longer be available for sale. Additionally, no updates to the products are planned.

For the next week, we're keeping BumpTop Free available for download at bumptop.com/download to give BumpTop fans one last chance to grab a copy.
Existing BumpTop Pro users should visit bumptop.com/pro for more information.
Thanks again for all your support over the years. Despite our change in strategy, we remain as passionate as ever about helping shape the future of computing!
Sincerely,
The Bumps, April 30th, 2010

Apr 9, 2010

Interesting work: Mobile Projection & More from Johannes Schoening's YouTube channel

Here are a few of Johannes Schoening's research videos related to mobile projection:








SOMEWHAT RELATED: 

GeoLens: Allowing Multi-User Interaction with Geographic Information Systems on Interactive Surfaces

Apr 4, 2010

ASU's SMALLab, a Mixed Reality Learning Environment: Multi-modal and Collaborative; Links to Publications about Digital Media and Learning for the 21st Century

"Nothing's impossible, we just get smarter and smarter by the day." - Student, commenting about his experiences in the SMALLab environment. 


The research team at Arizona State University, led by David Birchfield, has created embodied, multimodal, and collaborative mediated learning environments using mixed reality; their system has been in use at Coronado High School with much success. SMALLab takes a learner-centered approach to learning, providing multi-modal, multi-sensory activities that engage learners and lead to deeper understanding of complex concepts.


Here is some information from the SMALLab website:

SMALLab @ Arizona State University - 2009 from aisling kelliher on Vimeo.


Video of high school students describing their work in SMALLab (Coronado High School)
"Central to our work is the development of a new interactive mixed reality learning environment, the Situated Multimedia Art Learning Lab [SMALLab]. SMALLab is an environment developed by a collaborative team of media researchers from education, psychology, interactive media, computer science, and the arts. SMALLab is an extensible platform for semi-immersive mixed-reality learning. By semi-immersive, we mean that the mediated space of SMALLab is physically open on all sides to the larger environment. Participants can freely enter and exit the space without the need for wearing specialized display or sensing devices such as head-mounted displays (HMD) or motion capture markers. Participants seated or standing around SMALLab can see and hear the dynamic media, and they can directly communicate with their peers that are interacting in the space. As such, the semi-immersive framework establishes a porous relationship between SMALLab and the larger physical learning environment. By mixed-reality, we mean that there is an integration of physical manipulation objects, 3D physical gestures, and digitally mediated components. By extensible, we mean that researchers, teachers, and students can create new learning scenarios in SMALLab using a set of custom designed authoring tools and programming interfaces.

SMALLab supports situated and embodied learning by empowering the physical body to function as an expressive interface. Within SMALLab, students use a set of “glowballs” and peripherals to interact in real time with each other and with dynamic visual, textual, physical and sonic media through full body 3D movements and gestures. For example, working in the Spring Sling scenario, students are immersed in a complex physics simulation that involves multiple sensory inputs to engage student attention. They can hear the sound of a spring picking up speed, see projected bodies moving across the floor, feel a physical ball in their own hands and integrate how the projected ball moves in accordance with their own body movements to construct a robust conceptual model of the entire system."


SLINK links the various installations of SMALLab, which are found in museums, schools, and the Institute of Play in NYC (used with students who attend Quest to Learn, the digital games-based, technology-rich charter school):
Gaming SMALLab [IOP, ASU]: Lab install at Parsons DT
-Institute of Play's SMALLab

-Arizona State University: A SMALL way to keep up with technology (Sheilah Britton, 5/11/09)

How SMALLab Works:


SMALLab modules


More about the way it works: SMALLab-Situated Multimedia Arts Learning


About David Birchfield:
David Birchfield is "a media artist, researcher, and educator. He has created work that spans from interactive music performance to generative software to robotic installations to K-12 learning environments. In recent years, this work cuts across three areas of exploration: K-12 learning, media art installations, and live computer music performance."


Some publications:
  • Birchfield, D., Megowan-Romanowicz, C., Johnson-Glenberg, M., Next Gen Interfaces: Embodied Learning Using Motion, Sound, and Visuals – SMALLab. To appear in Proceedings of the American Educational Research Association Annual Conference; SIG Applied Research in Virtual Environments for Learning [ARVEL], San Diego, CA, April 2009.

  • Megowan-Romanowicz, C., Uysal, S., Birchfield, D., Growth in Teacher Self-Efficacy Through Participation in a High-Tech Instructional Design Community, to appear in Proceedings of the National Association for Research in Science Teaching Annual Conference, Garden Grove, CA, April 2009.





  • Birchfield, D., Thornburg, H., Megowan-Romanowicz, C., Hatton, S., Mechtley, B., Dolgov, I., Burleson, W., Embodiment, Multimodality, and Composition: Convergent Themes Across HCI and Education for Mixed-Reality Learning Environments, Journal of Advances in Human-Computer Interaction, Volume 2008, Article ID 874563.

  • Dolgov, I., Birchfield, D., McBeath, M., Thornburg, H., Todd, C., Amelioration of Axis-Aligned Motion Bias for Active versus Stationary Judgments of Bilaterally Symmetric Moving Shapes’ Final Destinations, Perception and Psychophysics, in press 2008.

  • D. Birchfield, B. Mechtley, S. Hatton, H. Thornburg, Mixed-Reality Learning in the Art Museum Context, Proceedings of ACM SIG Multimedia, Vancouver, BC, October 27, 2008.

  • S. Hatton, D. Birchfield, M.C. Megowan, Learning Metaphor through Mixed-Reality Game Design and Game Play, Proceedings of ACM Sandbox Conference, Los Angeles, CA, August 10, 2008. [pdf]

Institute of Play's SMALLab contact:
          Katie Salen, Executive Director, Institute of Play; Associate Professor, Parsons The New School for Design


The Institute of Play, along with the Joan Ganz Cooney Center and others, has a number of publications related to technology and learning:

"The mission of The Joan Ganz Cooney Center is to catalyze and support research, innovation and investment in digital media technologies to advance children's learning. Nurturing foundational and "21st century" literacies:

The inaugural focus of the Center—given the national need—will be on determining how technology can help elementary-aged children develop the fundamental building blocks of literacy. These include the vital reading, writing, speaking and listening capabilities that all children must develop during the primary grades. A special emphasis of the Center will be on struggling readers who risk educational failure if they do not catch up to their peers by grade four...Another important focus of the Center is to leverage the potential of interactive media to promote "21st century" literacies that students will need to compete and cooperate in our connected world—competencies such as critical thinking and problem solving, second language competency, inter-cultural understanding and media literacy."

Here are a few background resources from the Institute of Play's resources page:
The following reports will bring you up-to-date about technology and "21st Century" learning.
Generation M2: Media in the Lives of 8- to 18-Year-Olds
The Kaiser Family Foundation - Report - 2010
Wallis, C. The Impacts of Media Multitasking on Children’s Learning & Development
The Joan Ganz Cooney Center - Meeting Report - 2010 (pdf)
Shuler, C. iLearn: A Content Analysis of the iTunes App Store’s Education Section
The Joan Ganz Cooney Center - Report - 2009 (pdf)
Wellings, J., Levine, M.H. The Digital Promise: Transforming Learning with Innovative Uses of Technology
The Joan Ganz Cooney Center - White Paper - 2009 (pdf)
Thai, A. M., Lowenstein, D., Ching, D., and Rejeski, D. Game Changer: Investing in Digital Play to Advance Children's Learning and Health
The Joan Ganz Cooney Center - Policy Brief - 2009 (pdf)
There are a number of similar articles published by CISCO and/or the Metiri Group:
Lemke, C., Coughlin, E., Reifsneider, D. (2009). Technology in the Schools: What Does the Research Say? (pdf)
Lemke, C. (2009) Multimodal Learning through Media: What the Research Says (pdf)
Williams, S.M. The Impact of Collaborative, Scaffolded Learning in K-12 Schools: A Meta-Analysis (pdf)



Mar 11, 2010

Social Thinking blog - Great post from Michelle Winner's Blog

We're planning an application for a multi-touch SMART Table at one of my schools, and one of the goals is to ensure that it can facilitate social interaction among students with and without disabilities, including autism. With this in mind, one of my colleagues just sent me a link to a great post on Michelle Winner's blog about social thinking and "inclusion" vs. "integration" in education settings. It is worth a read!

http://www.socialthinking.com/michelles-blog/social-skills-for-an-integrated-setting.html

Jan 28, 2010

TEI '10 Info and Links: Fourth Annual International Conference on Tangible, Embedded, and Embodied Interaction

In my dreams, I am a full-time tech student. Fortunately, I can follow my inner geek and share what I find on this blog. The information below was inspired by links from a Facebook status update by Laurence Muller, author of the Multi-Gesture blog.

The video below is a montage of the TEI'10 hands-on studios:

TEI Studios from jay silver on Vimeo.

"From TEI 2010. These are the hands-on studios (like workshops) where 200 people participated in building and making all day long elbow to elbow, getting into the details and taking perspectives."

About TEI:

TEI '10:  Fourth International Conference on Tangible, Embedded, and Embodied Interaction, January 25-27, Cambridge, MA.
"TEI, the conference on tangible, embedded, and embodied interaction, is about HCI, design, interactive art, user experience, tools and technologies, with a strong focus on how computing can bridge atoms and bits into cohesive interactive systems."



Here is a link to the keynote:
http://www.vikmuniz.net/

Here is a link to one of the papers presented at TEI:
Electronic Popables:  Exploring Paper-Based Computing through an Interactive Pop-Up Book (pdf)- Jie Qi and Leah Buechley, MIT Media Lab, High-Low Tech Group

More about Laurence Muller:
Laurence Muller (M.Sc.) is a Fellow at Harvard University's School of Engineering and Applied Science (SEAS) / The Initiative in Innovative Computing (IIC), in the Scientists' Discovery Room Lab (SDR Lab). Currently he is working on innovative scientific software for multi-touch devices and display wall systems. (I took Laurence's information from his blog.)

More to come!


Jan 22, 2010

Pushing the Envelope of our Gadgets: e-Readers, using the Wii to play iPhone games, etc.

Via Wired Gadget Lab:

"E-readers, as we know them now, are at a crossroads. Next week, Apple is expected to announce a new tabletlike device that could also be used to read digital books. It’s expected to have a color screen, not the monochrome E Ink display found on most e-readers. Apple is reportedly talking to publishers to bring e-books and magazines to the device." Read More



Wii Hacks:


Hack a Wii
Wii Hacks Hack-a-Day


Update: I'm working on a future post that will flesh out this topic.

Nov 28, 2009

Fantasy HCI! Dream Lab and Dream Team for the Future

Fantasy HCI!  

My wish is to have my own lab so I can create and test out various interactive applications that run on screens of all sizes, and play with new interactive gadgets and displays. I'd also like to provide mobile lab services so I can go out and see how emerging technologies play out in real-life situations and settings during the design & development process as well as after-market.

I'd like to focus on social-collaborative & cognitive aspects of emerging technologies. Because of my background in school psychology, I'd work towards ensuring that new applications, technologies, and systems follow the guidelines of Universal Design for Learning as well as Universal Usability. I have some ideas about the transdisciplinary characteristics I'd like to see for members of the lab's Dream Team, but I'm saving that for another post. Now I just need to win the lottery so I can hire my team and run with the ball. Team Charlotte, N.C., anyone?

FYI:
The HCI link is to a blog that corresponds to the Theory and Research in Human Computer Interaction class at Rensselaer Polytechnic Institute. 

For more information about HCI, visit the Human-Computer Interaction Resources website.

Quick Post: Video of Stantum's Multi-touch "Slate PC" Digital Resistive Touch Screen Netbook



The video is of the Stantum Slate PC, via Netbooked's YouTube channel. The system in the video is running on a modded Dell Mini 10 and doesn't require calibration. Notice how the system easily handles a variety of interactions: fingers, thumbs, pinch, rotation, multiple-finger swipes, brush strokes, fingernail action, stylus, and more.

At this time, the Stantum Slate is available for developers only.

RELATED POSTS

Stantum's Multi-touch Slate PC, Windows 7 Certified (11/17/09)
Interactive multi-touch for sound design, dj-ing, and music creation (10/25/09)
Stantum's Mobile Phone Multi-touch Interface:  Demonstration of precise interactions on a resistive touch screen (9/7/09)
Updates about NextWindow and Stantum; Upcoming Emerging Displays Technologies Conference (6/2/09)

FYI:   Netbooked's Netbook Blog

Oct 20, 2009

Thomas Hansen's Multi-Touch Wisdom: "Windows Touch vs PyMT - Why programming on Windows is too complicated!"



If you are interested in designing or programming multi-touch programs and like elegant, concise code, you'll appreciate Thomas Hansen's recent blog post:


Windows Touch vs PyMT:  Why multi-touch programming on Windows is too complicated!


Here is an excerpt from Thomas Hansen's blog post:


"If you read the rest of the blog post, I’ll show you what I mean about context, and why e.g.  Windows Touch makes life difficult if you want to program multi-touch.  I’ll show you how to rewrite a windows touch example project (5 C# source files and > 400 lines of code) in Python using PyMT (1 source file with 12 lines of code).  Yes 12 lines, you read correctly (and then there is the whole thing about it just running on Linux or OSX as well…but we’ll leave that for another blog post)."
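The brevity Hansen describes comes from frameworks that route raw touch events to per-widget callbacks, so application code never tracks touches by hand. Here is a rough, hypothetical sketch of that dispatch pattern in plain Python (this is not the actual PyMT or Windows Touch API; all names are invented for illustration):

```python
class Widget:
    """Minimal widget with a bounding box and a touch-down callback."""
    def __init__(self, x, y, w, h):
        self.x, self.y, self.w, self.h = x, y, w, h
        self.touches = []                     # ids of touches this widget claimed

    def collide(self, px, py):
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

    def on_touch_down(self, touch_id, px, py):
        if self.collide(px, py):
            self.touches.append(touch_id)
            return True                       # event consumed
        return False

class TouchDispatcher:
    """Routes each incoming touch event to the first widget that claims it."""
    def __init__(self):
        self.widgets = []

    def dispatch(self, touch_id, px, py):
        for widget in self.widgets:
            if widget.on_touch_down(touch_id, px, py):
                return widget
        return None

# Two widgets, two simultaneous touches, no per-touch bookkeeping in user code:
d = TouchDispatcher()
left, right = Widget(0, 0, 100, 100), Widget(200, 0, 100, 100)
d.widgets += [left, right]
d.dispatch(1, 50, 50)     # claimed by `left`
d.dispatch(2, 250, 50)    # claimed by `right`
```

When the framework owns this loop, each widget only implements its callbacks, which is roughly why a PyMT example can be an order of magnitude shorter than the equivalent low-level event handling.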


At the end of his post, Thomas makes a comment that I think is worth considering:


"The interaction paradigm is so revolutionary, I think we need to adopt our development tools more to it and explore the interaction space.  Instead I think people are jumping the gun on trying to standardize the interface while using the development paradigms we used for the GUI."


Well said!


Thomas Hansen is a member of the NUI Group. He is a graduate student at the University of Iowa, pursuing his PhD in Computer Science, specifically Human-Computer Interaction.

Sep 12, 2009

Exploring high-quality, low-cost, and open-source eye gaze and face tracking technologies:

Update:  More information about this topic is posted on the TechPsych blog.

I'm working with a few students with multiple special needs who require inexpensive assistive communication technologies. Although my first idea was to look at eye gaze technology, the possibility of face tracking is also appealing. Take a look at the following video from SeeingMachines, a commercial company:



I have a couple of webcams, and I don't mind a trip to RadioShack. I can program a bit, too, but I have very limited time. I welcome any advice you might have.
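For a DIY webcam setup, much of the work after face detection is turning the noisy per-frame face position into something steady enough to drive a pointer. A minimal sketch of that smoothing step, using an exponential moving average (the detector itself, e.g. an OpenCV Haar cascade, is assumed and not shown):

```python
class FacePointer:
    """Smooths noisy face-center estimates with an exponential moving average."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha          # 0 < alpha <= 1; lower = smoother but laggier
        self.x = self.y = None

    def update(self, face_x, face_y):
        """Feed one detected face center per frame; returns the smoothed point."""
        if self.x is None:
            self.x, self.y = float(face_x), float(face_y)
        else:
            self.x += self.alpha * (face_x - self.x)
            self.y += self.alpha * (face_y - self.y)
        return self.x, self.y

p = FacePointer(alpha=0.5)
p.update(100, 100)          # first sample initializes the pointer
x, y = p.update(110, 100)   # a jittery sample is only half-applied: x is 105.0
```

With alpha tuned down, small detection jitter mostly cancels out, while real head movement still comes through after a few frames.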

Sep 6, 2009

Interactive Memorabilia at the Hard Rock Cafe: Microsoft's Multi-touch Rock Wall, Companion Surface Installations, and Awesome Touch-Responsive Interactive Memorabilia Website.

UPDATE: Video of Hard Rock Cafe's Memorabilia Multi-touch Wall



I came across the updated Hard Rock Cafe website and found that it provides an awesome interactive experience!  Visitors can explore the extensive Memorabilia collection in detail. This site is almost ready for inclusion in my UX/Interactive Hall of Fame!

According to an article posted on the Hospitality Technology website, "Tech, Love, and Rock 'n' Roll," the website was built using Silverlight and is part of an initiative to expand the reach of the Hard Rock Cafe's extensive memorabilia collection.

"It all started last year with what we call 'Memo 2.0,'" [for memorabilia 2.0] and the rollout of Microsoft's new Silverlight (www.silverlight.net) technology, explains Joe Tenczar, senior director of technology and CIO for Hard Rock International. Silverlight is a web application framework, similar in scope to Adobe flash, that lets companies build custom apps. Hard Rock partnered with its brand agency Duncan/Channon (www.duncanchannon.com) and software developer Vertigo (www.vertigo.com) to create a custom Silverlight application for www.hardrock.com."


Below is a video of the website when it was still under development, unveiled during a MIX08 keynote:


The website is a companion project that is part of a bigger vision that includes an 18-foot interactive multi-touch wall that mirrors the Memorabilia website.  The article goes on to mention that the "Rock Wall can be used by one person to blow up a piece of memorabilia as large as the screen, or optimized for six different user experiences at one time, though the technology has been tested for responsiveness to hundreds of simultaneous touches. "Imagine seeing Bob Dylan's Harley at more than life size; big impact," says Tenczar....Rock Wall uses a single piece of Stewart projection glass, custom-made at the factory for Hard Rock, along with three Christie projectors, numerous lasers, and multiple IR cameras to create a unified projection and touch experience. "The graphics are driven by a Nvidia Quadro Plex and there are two other dedicated servers for localized content and physics. I have seen a lot of cool technology, but this still makes my jaw drop whenever I see it in person." And the technology is smart. "If I walk up to the screen, it will blow up where I am."
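Driving one seamless image from three projectors means splitting the canvas into overlapping viewports whose shared edges are cross-faded ("edge blending"). A back-of-the-envelope sketch of that split, assuming an equal division and a fixed overlap width (the actual Obscura Digital setup is surely far more sophisticated, with warping and per-pixel blend ramps):

```python
def projector_viewports(canvas_w, canvas_h, n, overlap):
    """Split a wide canvas into n horizontally overlapping projector viewports.

    Each interior edge shares `overlap` pixels with its neighbor so the two
    projectors can cross-fade there. Returns (x, y, width, height) tuples.
    """
    # Total painted width = canvas width plus one overlap per interior seam.
    per = (canvas_w + (n - 1) * overlap) // n
    views, x = [], 0
    for _ in range(n):
        views.append((x, 0, per, canvas_h))
        x += per - overlap          # next projector starts inside the blend zone
    return views

# The article's 6000x1000 canvas on three projectors with a 120-px blend zone:
projector_viewports(6000, 1000, 3, 120)
```

With these numbers each projector paints a 2080-pixel-wide slice, and the last slice ends exactly at pixel 6000, so nothing is clipped or doubled outside the blend zones.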
 
According to information from the Hard Rock Cafe's press release, the interactive wall was created by Obscura Digital, a company located in San Francisco, along with Technomedia Systems. Here's a quote from the press release:

"We were excited to work with Hard Rock on this revolutionary project," said Obscura Digital CEO Patrick Connolly. "This is the longest, largest and most technologically sophisticated multi-touch wall we’ve built. It utilizes our newest Fireframe technology, and the result is the highest resolution, seamless multi-touch display we have ever seen. We are delivering a 6000 x 1000 pixel display, so that multiple users at a time can enjoy the vast Hard Rock Memorabilia collection in stunningly high resolution. With this display, customers can literally zoom so deep into the images, that things like a scratch on Buddy Holly's glasses can be seen 3 feet wide. The graphic quality of this system is something the public has never seen, and the impact on the customers is something they will not soon forget." 
Hard Rock Cafe Multi-Touch
In addition to the website and Rock Wall, booths at the Hard Rock Cafe provide customers with access to music videos from a video distribution server, or VDS, created by Coleman Technologies.  Each booth's touch screen display terminal has its own computer, and this allows guest control over the music videos that are played at the cafe.  Content from the memorabilia system can be explored by customers right from the booth, from what I understand.


Microsoft's Surface tables were installed in the cafe, running a custom application created by Vertigo that allows customers to interact with a spinning globe to locate and explore other Hard Rock Cafes around the world.





Memorabilia Website UX:

Unfortunately, I am not in Las Vegas, so I haven't had a chance to interact with the Hard Rock Cafe's Rock Wall or any of the Microsoft Surface installations, which are the centerpiece of this project.

Not to worry. The companion Hard Rock Cafe Memorabilia website, experienced from the comfort of my home on an HP TouchSmart PC, provides a great user experience, even though it is not fully optimized for touch navigation.

One of the things that I like about the site is that it uses Microsoft's Deep Zoom feature. When you zoom in, you can explore each photo without sacrificing resolution, and explore everything in minute detail.
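Deep Zoom gets this effect from an image pyramid: each level halves the previous one down to a single tile, and the viewer only fetches tiles for the level and region currently on screen. A rough sketch of the pyramid arithmetic (the tile size and level math here describe the general technique as I understand it, not Microsoft's exact file format):

```python
import math

def pyramid_levels(width, height):
    """Levels in a halving image pyramid, from full resolution down to 1x1."""
    m = max(width, height)
    return 1 if m <= 1 else math.ceil(math.log2(m)) + 1

def tiles_at_level(width, height, level, tile=256):
    """Tile count at `level`, where the highest level is full resolution."""
    max_level = pyramid_levels(width, height) - 1
    scale = 2 ** (max_level - level)          # downscale factor at this level
    w = math.ceil(width / scale)
    h = math.ceil(height / scale)
    return math.ceil(w / tile) * math.ceil(h / tile)

# A 6000x1000 image needs 14 levels; full resolution is 96 tiles of 256px,
# while the coarsest level is a single tile.
```

Because a zoomed-in view touches only a handful of full-resolution tiles, the site can serve enormous photos without ever sending the whole image.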

If you want to try your hand at touch navigation, you will be pleasantly surprised, even though you will have to rely on your mouse at times. To zoom in, tap the initial presentation screen, and use your fingertips to pan around. What you touch and release won't go flying about the screen, which is a good thing.

If you lightly tap an item of interest, you will get a translucent information box on the right side of the screen. The information box might include a video clip of an interview with a musician, a story, and a means to share the experience by embedding a widget on your website or blog, posting a link, or emailing a link.

After poking around the site using Firefox, I switched to Internet Explorer, turned the zoom setting to 125%, and had no difficulty using my right index finger to navigate through the information box. (It was more difficult to do this using Firefox.)

The directions for traditional navigation are simple:

"Turn on, zoom in, zoom out. Use your mouse wheel to zoom in and out. You can also zoom in by clicking and zoom out by shift-clicking. To pan, click and drag."

Featured artists on the Memorabilia site include The Who, The Beatles, James Brown, John Lennon, Bob Dylan, Guns N'Roses, Buddy Holly, Madonna, Motley Crue, Elvis Presley, Queen, and The Rolling Stones. You can sort the main presentation page by artist or year, and explore the collection by artist, type of item, genre of music, decade, and location.

This website provides an interactive, user-friendly means of exploring the history of rock music. (It would be cool if the site offered some form of interactive sound track.)

I'm impressed.

I embedded the widgets below from the Memorabilia site. I wasn't sure exactly what might turn up. The interactive widgets, created with Silverlight, went beyond my expectations.  You can explore the content right from this post, if you have Silverlight installed:





I shot some HD video of my touch interaction of the website and will post the videos as soon as I can. For now, take a look at some of the screen shots:

Main portal:

Eric Clapton's Guitar and Backstory:

Screen shot of memorabilia sorted by type - instruments:

Zooming in:

Screenshot of Gibby Haynes' art:

Detail of one of Madonna's decorative costumes:


Aug 29, 2009

Natural User Interface Surfaces: TouchCo - IFSR technology company, founded by NYU Media Research Lab's Ken Perlin and Ilya Rosenberg

Earlier this year, I had a very vivid dream about haptic touch-screen overlays. That's how things go for me sometimes. At any rate, the dream inspired a blog post, "Last night I dreamt about haptic touch-screen overlays...". Since this is one of my interests, I try to keep up with developments in this field when I can.

It is amusing to see how seemingly unrelated things have similar roots. Today I came across information about a cool mobile phone concept for the visually impaired, designed by Samsung. The touch technology in the concept phone incorporates something called Electric Active Plastic, or EAP.



This sparked my curiosity, and while searching for more information, I came across an article in MIT's Technology Review that caught my eye:


A Touch of Ingenuity: Inexpensive pressure-sensitive pad could make surfaces smarter
(Kate Green, MIT Technology Review September/October 2009)



UPDATE: As of 2/2010, TouchCo no longer exists; it was bought by Amazon.



"...Now that more and more smart phones and MP3 players have touch-screen interfaces, people have grown accustomed to interacting with gadgets using only taps and swipes of their fingers. But on the 11th floor of a downtown Manhattan building, New York University researchers Ilya Rosenberg and Ken Perlin are developing an interface that goes even further. It’s a thin pad that responds precisely to pressure from not only a finger but a range of objects, such as a foot, a stylus, or a drumstick. And it can sense multiple inputs at once."

Ken Perlin and Ilya Rosenberg developed this system as part of their human-computer interaction work at NYU's Media Research Lab, and with the success of their work, went on to start a new company, Touchco, to commercialize and market this concept, known as IFSR (Interpolating Force-Sensitive Resistance) technology.

"The IFSR sensor revolutionizes the field of multi-touch human-computer interaction by out-performing existing solutions in both price and capabilities. Unlike traditional capacitive sensors, our patent pending system can detect any object - not just a finger - and can determine how much pressure is being applied to every point on a sensor simultaneously. IFSR sensors are natively multi-touch, use less power than capacitive sensors, and are much less expensive to produce, making them a highly disruptive technology with widespread market applications."
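The "interpolating" in IFSR refers to locating a touch more precisely than the spacing of the discrete force-sensing elements, by weighing neighboring readings against each other. A toy illustration of that idea, a pressure-weighted centroid over a coarse sensor grid (the real sensor design is of course far more involved):

```python
def touch_centroid(grid, spacing=1.0):
    """Estimate a touch position as the pressure-weighted centroid of readings.

    `grid` is a 2D list of force values sampled at sensels `spacing` apart.
    Returns (x, y, total_force), or None if nothing is pressing.
    """
    total = sx = sy = 0.0
    for r, row in enumerate(grid):
        for c, force in enumerate(row):
            total += force
            sx += c * spacing * force
            sy += r * spacing * force
    if total == 0:
        return None
    return (sx / total, sy / total, total)

# A press centered between columns 1 and 2 of a 3x3 patch of sensels:
grid = [[0, 0, 0],
        [0, 4, 4],
        [0, 0, 0]]
touch_centroid(grid)    # -> (1.5, 1.0, 8.0)
```

The sub-sensel x of 1.5 is the point: resolution between elements comes from the force distribution, not from packing the grid more densely, and the same sum also yields how hard the touch is pressing.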


The Touchco website has a wealth of information regarding this technology. (Update: The TouchCo website was taken down in February 2010)

Other members of Touchco include Nadim Awad, Fang Cheng, Julien Beguin, and Christopher Nam.

SOMEWHAT RELATED
I know of Ken Perlin through his Games for Learning work, and also through his on-line application, Responsive Face, which I use in my work with students who have autism.




Ken Perlin's Blog
Games for Learning Institute

TechPsych Post:
Samsung Braille TouchPhone Prototype is Cool: Uses Electric Active Plastic for functional tactile feedback.

Aug 24, 2009

Multi-touch on Firefox: Felipe Gomes' Work

Felipe Gomes is a 4th-year computer engineering student at the University of Sao Paulo, Brazil. He's been working on a project that enables the Firefox web browser to respond to multi-touch interaction and gestures. Below is a demonstration of this work:

Multitouch on Firefox from Felipe on Vimeo.




Code snippets for the demo can be found on the Mozilla Wiki.
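Gesture support of this kind usually comes down to deriving a scale or rotation from the positions of two touch points. A hypothetical sketch of the pinch-zoom arithmetic (illustrative only, not Gomes' actual code):

```python
import math

def pinch_scale(p1_start, p2_start, p1_now, p2_now):
    """Zoom factor implied by two touch points moving apart or together.

    Each argument is an (x, y) tuple; the result is the ratio of the
    current finger separation to the separation when the gesture began.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return dist(p1_now, p2_now) / dist(p1_start, p2_start)

# Fingers that start 100px apart and spread to 200px imply a 2x zoom:
pinch_scale((0, 0), (100, 0), (0, 0), (200, 0))    # -> 2.0
```

A browser gesture layer would sample this ratio every frame and apply it to the page zoom, with rotation computed analogously from the angle between the two points.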

Aug 14, 2009

What happens when engineers and musicians get together? They get Calvin Harris "Ready for the Weekend" with the Humanthesizer and Bare Ink

So what happens when engineers and musicians get a chance to mix it up?

The Humanthesizer project is the result of a collaboration between musician Calvin Harris, and masters students from the Royal College of Art Industrial Design Engineering program.
What makes this project innovative is that a special conductive ink, called Bare Ink, was used to support the creation of the music.

From the Bare Conductive website:

"Bare is a conductive ink that is applied directly onto the skin allowing the creation of custom electronic circuitry. This innovative material allows users to interact with electronics through gesture, movement, and touch. Bare can be applied with a brush, stamp or spray and is non-toxic and temporary. Application areas include dance, music, computer interfaces, communication and medical devices. Bare is an intuitive and non-invasive technology which will allow users to bridge the gap between electronics and the body."

(Please overlook the bikini-clad women in the videos below; the innovative technology supporting the Humanthesizer is worth the watch. Parents, be forewarned!)

The Making of the Humanthesizer


Close to the finished version:



Information from YouTube regarding the Humanthesizer project:


"Calvin performs a version of his new single Ready For The Weekend on a unique human synthesiser. The instrument employs 15 bikini clad models painted with Bare Conductive, a new skin safe ink which conducts electricity. When the performers touch the connection completes a circuit, triggering a sound....The instrument consists of 34 pads on the floor which have been painted with the conductive ink and connected to a computer via some clever custom electronics. The performers stand on the pads, and touch hands to complete a circuit and trigger a sound. Different combinations of pads trigger the different sounds needed to play the track....Bare Conductive was developed by RCA students Bibi Nelson, Becky Pilditch, Isabel Lizardi and Matt Johnson. The custom electronics and software for the project were created by Matt Johnson, employing two Arduinos and the graphical audio programming tool Max MSP. The performers, floor pads, Arduinos and Max MSP combine to create a giant MIDI controller which is used to create music which is sequenced and quantized with Ableton Live." (Special thanks to: RCA / Bare Conductive - Bibi Nelson, Becky Pilditch, Isabel Lizardi, Matt Johnson - http://www.bareconductive.com)
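The description above amounts to a giant MIDI controller: each scan of the pads detects which circuits have just closed or opened and emits note-on/note-off messages. A hypothetical sketch of that scan loop (pad numbers and note assignments are invented for illustration; the real project used Arduinos and Max/MSP):

```python
# Hypothetical pad-to-note mapping; the real Humanthesizer used 34 pads.
PAD_NOTES = {1: 60, 2: 62, 3: 64, 4: 65}    # pad id -> MIDI note number

def scan_pads(previous, current):
    """Compare two scans of pad states and emit MIDI-style note events.

    `previous` and `current` map pad id -> True when that pad's conductive-ink
    circuit is closed (a performer completes it by touch).
    """
    events = []
    for pad, note in PAD_NOTES.items():
        was, now = previous.get(pad, False), current.get(pad, False)
        if now and not was:
            events.append(('note_on', note))     # circuit just closed
        elif was and not now:
            events.append(('note_off', note))    # circuit just opened
    return events

# Two performers close pads 1 and 3 in the same scan:
scan_pads({1: False}, {1: True, 3: True})    # -> [('note_on', 60), ('note_on', 64)]
```

Downstream, a sequencer like Ableton Live can quantize these events to the beat, which is how imprecise human touches still land on the track's grid.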

(The Ready for the Weekend music video is available on YouTube. Parents: the music video has dancing girls throughout, in one-piece bathing suits.)



Thanks to Celine Latulipe for the link!

Aug 8, 2009

More about Project Natal: Ricochet - Great Gaming for Fitness, and Johnny Chung Lee's Contribution


(Credit: CNET News)
Ina Fried, in a recent CNET Beyond Binary post, reviewed her experience playing Ricochet, a 3D game developed by Microsoft for Natal, the company's new gesture-recognition, controller-less Xbox gaming system. Above is a screenshot from Fried's article, Exclusive: Getting up close and personal with Natal.

Here is the video:


Ina Fried had a chance to spend some time in Redmond, Washington to explore the games in development at Microsoft, and hang out with the people responsible for Project Natal.

In her Beyond Binary article, Fried notes that the Ricochet game provides quite a workout, and this has had a positive effect on the Natal team:


"Since I started working on this project, I've lost almost like 10 pounds," said Kudo Tsunoda, general manager of Microsoft Game Studios and the creative director for Project Natal. "We're going to have the most in-shape development team you've ever seen."


Fans of Johnny Chung Lee will be happy to know that his work at Microsoft contributed to this game in some way!

Who is Johnny Chung Lee? Read my post, "I wish I could be Johnny Chung Lee for a Day!".


RELATED

Speaking of Natal, it should be out next year (Ina Fried, CNET)

Gates: Natal to bring gesture recognition to Windows too



Jul 29, 2009

Update on Multi-Touch, Multi-User NuMTyPYsics

A few months ago, I wrote a post about multi-touch "NuMTyPYsics". Here is an update:

Below is a video that shows how to use two single-touch Nokia Internet tablets to emulate multi-touch input for NuMTyPYsics, a drawing game based on Numpty Physics:



Here's the description from Perlinet's YouTube channel:
"Using two single-touch tablets and a laptop running NuMTyPYsics, it's possible to emulate multi-touch input (max. 2 cursors - one from each tablet). This allows for precise input control and testing multi-touch applications. More info: http://thpinfo.com/2009/mt/"

http://www.tuxi.com.ar/wp-content/uploads/numpty-physics-tuxi.jpg
The game is also based on Crayon Physics, using the Box2D engine.

Jul 23, 2009

Dr. Jan Borchers' (Annotated) Top Ten List of Books on Human-Computer Interaction - Of interest to HCI students (and HCI students at heart...)

The academic year is coming up, and a new wave of students will be searching for good resources pertaining to human-computer interaction and related areas of study. A couple of months ago, I shared the following information in a blog post, but thought it was worthy of recycling.

The list is useful to HCI students, but also to people with little background in HCI who find themselves working on real-life projects that require a good amount of this knowledge.



Dr. Jan Borchers, head of the Media Computing Group at RWTH Aachen University, recommends the following list of books. I've read many of these books and I agree that this list is great. (The comments regarding each book are Dr. Borchers'.)

Dr. Jan Borchers' (Annotated) Top Ten List of Books on Human-Computer Interaction:

1. Alan Dix, Janet Finlay, Gregory D. Abowd, and Russell Beale: Human-Computer Interaction, 3rd ed., Prentice Hall, 2004. Currently the best, most well-rounded book I know to teach introductory HCI if you need to limit yourself to a single title. Technical enough, good breadth, not too fuzzy for a CS curriculum, very current, with a web site that includes resources such as sample programs, slides, etc.

2. Ben Shneiderman and Catherine Plaisant: Designing The User Interface, 4th ed., Pearson Addison-Wesley, 2004. Best overall reference book for all areas of HCI, providing an introduction and great up-to-date pointers to most sub-fields of HCI research and practice, especially different interaction techniques. His Golden Rules of User Interface Design and sample questionnaires for user testing are very useful in an introductory class. Unfortunately, the companion web site costs money after an initial trial period.

3. Donald A. Norman, The Design Of Everyday Things, Basic Books, 2002. A classic text from 1988 with an updated introduction that, while some of the technologies described or envisioned seem somewhat outdated now, still provides the best introduction to the spirit of good human-centered design. A not too technical read with hilarious stories of badly designed everyday technology, it provides some very useful basic models for human cognition, such as the Seven Stages of Action. This book also introduced the fundamental concept of affordances to HCI. Changed my view of the world of technology around me, and is probably the best initial brainwash for engineering students to "get" user-centered design.

4. Jenny Preece, Yvonne Rogers, and Helen Sharp: Interaction Design, 2nd ed., Wiley, 2007. This title focuses more on the process of designing good user interfaces, and is less technical, but excellent and up-to-date in the area it addresses. The companion web site has slides, case studies, and other materials.

5. Bill Moggridge, Designing Interactions, MIT Press, 2008. A truly beautiful "coffee-table style" book on interaction design, also covering product and industrial design of digital technology (Moggridge is a founder of IDEO). It has wonderful short essays about seminal digital product designs, from Engelbart's mouse, to the Mac and Palm, to Google and other internet services, as well as articles on digital product design theory. My own Sweet Sports and Baroque Technology article was based on one of the theory articles. Special treat: video interviews and chapters are available for free, on a weekly rotation, at http://www.designinginteractions.com/.

6. Bill Buxton, Sketching User Experiences, Elsevier, 2007. Similar to Moggridge's book in style, this book focuses on the early stages of product design. It also includes very interesting stories of key interactive products, such as Apple's iPod. And of course it's written by one of the long-time key players in HCI. More at http://www.billbuxton.com/.

7. Terry Winograd (ed.): Bringing Design to Software, Addison-Wesley, 1996. An excellent and very well edited collection of contributions from key players in HCI, from Kapor's Software Design Manifesto to Rheinfrank's Design Languages. Its particular value also comes from the profiles that link chapters and give an insider's view of how some of the most seminal UI designs came to be, from the Xerox Star to VisiCalc and HyperCard. Terry has some information about his book at http://hci.stanford.edu/bds/, and I used it with great success when I had the fortunate opportunity to teach an introductory HCI class in his program at Stanford in 2002.

8. Brenda Laurel (ed.): The Art of Human-Computer Interaction, Addison-Wesley, 1990. While ancient by today's standards, this book is another carefully compiled and very coherent collection of highly relevant articles on HCI by some of the most influential people in the field. I particularly like the article by Scott Kim on interdisciplinary design, and Tom Erickson's chapter.

9. Apple Computer: The Apple Software Design Guidelines, latest edition 2005. OK, I'm a Mac head, but then many HCI people are because Apple has such an excellent sense of doing the right thing when it comes to user interface design. These guidelines have been around since the 90's, with several new editions since then, and especially Part I ("Application Design Fundamentals") contains excellent, system-independent, hands-on advice for anybody developing interactive software, especially desktop applications. And it's free! Apple's developer website has the latest version both online and as downloadable PDF. I often recommend this as a quick read for engineering types that just want the bare essentials to help avoid major UI design catastrophes.

10. Jef Raskin, The Humane Interface, Addison-Wesley, 2000. Similar to Norman's book above, but more recent and more technical, this is another good first read to start thinking about user interface design, written by the father of the original Apple Macintosh. Some of the ideas presented here are quite unusual, and that's intended. Some related materials, such as demos of his Zoomable User Interface and The Humane Environment are at http://www.jefraskin.com/.

"So that's my top 10 list. I may add some more in the future. But I figure it's more important to restrict myself to those books I think are really outstanding than bother you with additional titles that don't really have that special something....For a good current PhD-level HCI reading list that is based more on papers and individual chapters than single books, see Terry Winograd's HCI reading list at Stanford University." -Dr. Jan Borchers


While you are at it, Dr. Borchers has a list of HCI hardware toolkits for physical user interface prototyping.

(I want to take more HCI classes and play with this stuff!)