This is my first creation using the Drawing with Sounds application. This cause-and-effect application was created with Processing 1.5.1 and produces musical sounds and patterns, along with random abstract shapes, triggered by mouse movements. The application has been adapted for switch users, making it accessible to those who have motor impairments.
Processing is an open-source programming language and integrated development environment built on Java. It is designed to be easy to learn for people with limited technical backgrounds, and it is used by people from a range of fields, such as art, music, journalism, and data visualization.
Because Processing is easy to use, people with just a little coding knowledge can adopt a variety of open-source applications and tweak a few of the variables to suit their needs. Using the Drawing with Sounds app as an example, I wanted to change the size of the interactive canvas. With a few keystrokes, I changed the original size of the screen from 800 x 800 to something larger, as shown below:
I then typed in another variable for the background color, and was pleased with the results:
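For readers who want to try these tweaks themselves, here is a minimal sketch of what the two changes look like in Processing 1.5.1. (The surrounding code in the actual Drawing with Sounds sketch differs; the size and color values here are just examples.)

```processing
// A larger canvas: change the arguments to size()
// (the original Drawing with Sounds sketch used size(800, 800)).
void setup() {
  size(1024, 768);
}

void draw() {
  // A custom background color, given as red, green, blue values (0-255).
  background(30, 30, 80);
  // ...the sketch's drawing and sound code goes here...
}
```

In Processing, setup() runs once when the sketch starts and draw() runs repeatedly, so a background() call placed in draw() repaints the canvas on every frame.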
The application was developed from Andrew R. Brown's tutorial on "SoundCipher", based on the SoundCipher library for Processing. According to the description of the application, "The sketch is intended for use....as a sensory stimulus using sound, shape and color to create engagement".
Below is a videoclip created by Keith Manville, demonstrating what happens when you interact with Drawing with Sounds:
I look forward to sharing more on this topic in future blog posts. I'll be sure to include basic "how-to" information for my "low-tech" readers and colleagues who desire to learn a bit more about using and creating basic interactive applications that appeal to young people with special needs.
RELATED
Note: SEN is the UK acronym for "Special Educational Needs"
(An accessible switch-enabled version of Drawing with Sounds can be downloaded from the above link.)
Free sensory applications built in Processing 1.5.1: "Download links for sensory applications built in Processing 1.5.1, designed to assist SEN students in engaging in learning through the use of ICT, interactive “Smart” boards, touchscreen or tablet technologies. Our experience is that using applications on these devices or with natural user interfaces such as the Microsoft Kinect can increase the opportunities for engagement and social communication with many students."
I will be posting more information about emerging interactive technologies for special needs in the future. In the meantime, take a look at the following resources:
How does MaKey MaKey work? It is powered by a board that can support six keyboard keys and mouse control. It runs on top of Arduino, an open-source electronics prototyping platform that supports multi-modal interactive input and output. I see endless possibilities and fun maker-crafting with my little grandson!
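Since the MaKey MaKey registers with the computer as an ordinary keyboard and mouse, any program that listens for key events will respond to it, including a Processing sketch. As a rough illustration (my own sketch, not from the MaKey MaKey documentation), each touch on a connected object could stamp a colored circle on the canvas:

```processing
void setup() {
  size(400, 400);
  background(255);
  noStroke();
}

void draw() {
  // Nothing animates continuously; the sketch only reacts to key events.
}

// Each "key press" coming from the MaKey MaKey (space, arrow keys, etc.)
// draws a randomly colored, randomly placed circle.
void keyPressed() {
  fill(random(255), random(255), random(255), 180);
  ellipse(random(width), random(height), 50, 50);
}
```

Pair this with a few alligator clips and some fruit, and you have a simple cause-and-effect drawing toy.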
In the following video, musician/visual artist j.viewz uses his MaKey MaKey kit to hook up fruits and veggies to his music system. Watch j.viewz play a bunch of grapes! The strawberries sound nice.
I recently learned about OpenPilot, an open-source project that promotes the development of economical unmanned aerial vehicles, or UAVs. According to information on the website, "OpenPilot is an ideal platform for researchers and hobbyists working on aerial robotics or other mobile vehicular platforms where stabilization and guidance are required. OpenPilot brings the cost down to reasonable prices so people can focus on developing and refining applications rather than paying the extremely high prices of most commercial offerings, or having to do ‘from the ground up’ hardware development." A number of OpenPilot community members have used their UAVs to explore interesting landscapes and at the same time, create engaging video clips. Wouldn't it be fantastic to figure out how to get a 3D or 360 camera in a UAV? Below is an assortment of videos I came across while visiting the OpenPilot website. (I've also included some videos that were created using YellowBird 360 technology, which to my knowledge, has not been attempted with a UAV.)
I think that the UAV concept would be great for an after-school technology club. It is similar to robotics, but it also would get the kids outdoors. It would provide a great experience for students who are also interested in photography and videography.

RELATED
OpenPilot Website
OpenPilot Wiki
YellowBird
"Right here, in your web browser, you may contribute to anyone's creation and steer it in any direction you choose, or start a story of your own, by creating a seed animation for a new tree. Our hope is through the collective creativity of the web, an ever-expanding forest of narrative paths will grow." -Chris Milk/Aaron Koblin

This Exquisite Forest is a collaborative art project by Chris Milk and Aaron Koblin, creators of The Johnny Cash Project, a crowd-sourced music video for Johnny Cash's final album (see below). What I like about this project is that it provides viewers with the opportunity to create animations that link with those of other people online. All of the tools to create animations and music are on the website, along with clear instructions and FAQs. You can even curate your own tree. For those who take a less participatory approach to art, the project's website offers something called "The Endless Theater", where viewers can watch the most popular branches of This Exquisite Forest.
The project makes use of Google's App Engine. Google's Cloud Storage handles the accounts and data storage. To access the project, you need Google's Chrome browser, as the project takes advantage of JavaScript and HTML5. RELATED
It just might be the right time for everyone to brush up on 21st century tech skills. iPads and touchscreen phones are ubiquitous. Touch-enabled interactive whiteboards and displays are in schools and boardrooms. With Microsoft's Windows 8 and the news that the company recently acquired Jeff Han's company, Perceptive Pixel, I think that there will be good support, and more opportunities, for designers and developers interested in moving from GUI to NUI.
In the video below, from CES 2012, Jeff Han provides a good overview of where things are moving in the future. We are in a post-WIMP world and there is a lot of catching up to do!
CES 2012 Perceptive Pixel and the Future of Multitouch (IEEE Spectrum YouTube Channel)
During the video clip, Jeff explains how far things have come during the past few years:
"Five and a half years ago I had to explain to everybody what multi-touch was and meant. And then, frankly, we've seen some great products from folks like Apple, who really have executed so brilliantly that everyone really sees what a good implementation can be, and have come to expect it. I also think, though, that the explosion of NUI is less about just multi-touch, but an awareness that finally people have that you don't have to use a keyboard and mouse, you can demand something else besides that. People are now willing to say, 'Oh, this is something I can try, you know, touch is something I can try as my friendlier interface.'"
Who wouldn't want to interact with a friendlier interface? Steve Ballmer doesn't curb his enthusiasm about Windows 8 and Perceptive Pixel. Jeff Han is happy with how designs created in Windows 8 scale for use on screens large and small. He explains how Windows 8 can support collaboration. The Story Board application (7:58) on the large touchscreen display looks interesting.
I continue to be frustrated by the poor usability of many web-based and desktop applications. I like my iPad, but only because so many dedicated souls have given some thought to the user experience when creating their apps. I often meet with disappointment when I encounter interactive displays when I'm out and about during the day. It is 2012, and it seems that there are a lot of application designers and developers who have never read Don Norman's The Design of Everyday Things!
I enjoy making working prototypes and demo apps, but my skill set is stuck in 2008, the last year I took a graduate-level computer course. I was thinking about taking a class next semester, something hands-on, creative, and also practical, to move me forward. I can only do so much when I'm in the DIY mode alone in my "lab" at home. I need to explore new tools, alongside like-minded others.
There ARE many more tools available to designers and developers than there were just four years ago. Some of them are available online, free, or for a modest fee. I was inspired by a link posted by my former HCI professor, Celine Latulipe, to her updated webpage devoted to Rapid Prototyping tools. The resources on her website look like a good place to start for people who are interested in creating applications for the "NUI" era. (Celine has worked on many interesting projects that explore how technology can support new and creative interaction, such as Dance.Draw.) Below is her description of her updated HCI resources:
"New HCI resource to share: I have created a few pages on my web site devoted to Rapid Prototyping tools, books, and methods. These pages contain reviews of various digital tools, including 7 different desktop prototyping apps, and including 8 different iPad apps for wireframing/prototyping. I hope it's useful to others. Feel free to share... and please send me comments and suggestions if you find anything inaccurate, or if you think there is stuff that I should be adding. I will be continuing to update this resource." -http://www.celinelatulipe.com (click on the rapid prototyping link at the top)
IDEAS

Below are just a few of my ideas that I'd like to implement in some way. I can't claim ownership of these ideas; they are mash-ups of what comes to me in my dreams, usually after reading scholarly publications from ACM or IEEE, or attending tech conferences.
An interactive timeline (multi-dimensional, multi-modal, multimedia) for off-the-desktop interaction, collaboration, and data/info analysis and exploration. It might be useful for medical researchers, historians, genealogists, or people who are into the "history of ideas". Big Data folks would love it, too. It would handle data from a variety of sources, including sensor networks. It would be beautiful to use.
A web-based system of delivering seamless interactive, multi-modal, immersive experiences, across devices, displays, and surfaces. The system would support multi-user, collaborative interaction. The system would provide an option for tangible interaction.
A visual/auditory display interface that presents network activity, including potential intrusions, malfunctions, or anything that needs immediate attention that would be likely to be missed under present monitoring methods.
Interactive video tools for creation, collaboration, storytelling. (No bad remote controllers needed.)
A "wearable" that provides new ways for people to express and communicate creatively, through art, music, and dance, with wireless capability. (It can interact with wireless sensor networks.)
A public health application designed to provide information useful in understanding and preventing sepsis. This application would utilize the timeline concept described at the top of this list. This concept could also be useful in analyzing other medical puzzles, such as autism.
Most of these ideas could translate nicely to educational settings, and the focus on natural user interaction and multi-modal i/o aligns with the principles of Universal Design for Learning, something that is important to consider, given the number of "at-risk" learners and young people who have disabilities.
I welcome comments from readers who are working on similar projects, or who know of similar projects. I also encourage graduate students and researchers who are interested in natural user interfaces to move forward with an off-the-desktop NUI project. I hope that my efforts can play a part in helping people make the move from GUI to NUI!
Below are a few videos of some interesting projects, along with a list of a few references and links.
If you have an interest in music technology and innovative gesture/multitouch applications, you'll appreciate the details that are shared on the Subcycle website. In the meantime, take the time to watch a few of Christian's videos. Enjoy!
Blip Shaper Walkthrough
Blip Shaper Walkthrough from christian bannister on Vimeo.

"a) creating percussive patterns with monome b) shaping the individual sounds that make up the patterns with multitouch gestures c) recording touchscreen gestures as automation d) storing, duplicating and navigation patterns e) recording the resulting audio to a dynamic buffer f) manipulating the buffer with a multitouch cut-up approach g) visualizing everything with dual screens"

Subcycle Walkthrough
The following information describing the Blip Shaper is from Christian's Subcycle website/blog: "For the drum sounds I have Drumaxx running for synthesized sounds and Battery running for sampled sounds. These are running in parallel so for each voice there is a separate patch running in each VST. The Parameters are modified with the touchscreen independently but in all cases a single touch gesture on the X-Axis will cross fade between the sampled version of the sound and the synthesized version of the sound. I love this because I have never seen this before and I can never decide which technique I like better. The synthesized drums are more malleable and have more interesting parameters to play with but the sampled sounds seem more substantial. I will post a detailed list of parameters and gestures in the future."
Contre Jour is a game developed for the iPad by Chillingo. It is my current favorite game, partly because it is a great example of how the power of a touch interface can be harnessed. The beauty of this game is that it provides a natural means of scaffolding how the tools are used, so that as the player progresses, the learning curve feels almost effortless. In the game, the character is controlled through the player's ability to manipulate and move the environment. The underlying physics is intriguing. Contre Jour was inspired by the children's book The Little Prince, but in this game, the "prince" is a cute little eyeball character. In the game, each chapter/world has a musical theme, and the music integrates with the game's visuals and "movements" to provide a relaxing ambience. To learn more, take a look at the following video about Contre Jour. I'm happy I spent a few dollars to buy the app!
Below is information about NodeBeat 1.5 from the Vimeo website:
"This is a brief video showcasing some of the new features in NodeBeat HD 1.5. The same features will be coming to the iPhone/iPod Touch version of NodeBeat in the coming weeks."
"New Features Include:
- Entirely new user interface design
- Drag and drop new nodes
- Drum Generator Node
- Sine, Triangle, Sawtooth, and Square waveform selection
- Tempo and BPM controls
- Compress recording for faster e-mailing of recordings
- Create ringtones from recordings
- Sleep Timer. Now you can fall asleep to NodeBeat
- Shake to Clear Screen
- 5 Finger multitouch reset all"

----------------------------------------------

"NodeBeat is an experimental node-based audio sequencer and generative music application for the iPhone, iPod Touch and iPad."
The Future... thoughts shared by Allison Druin at CHI 2011:
Rethinking the meaning of interaction, well beyond the mouse, beyond the icon
Rethinking the relationships we have with our technologies
Rethinking how to transform technology
Rethinking how transforming technology can change learning, to really change the future...
Paraphrasing: "It starts with the experience of the child. Have you ever asked them what they think about the future? Backpacks with ice cream, storytelling machines that fly... layered stories as tall as buildings?
How do you get from low-tech prototyping to what gets on tech devices? It's not about data analysis, analyzing things that are easy to analyze."
Allison works with kids and adults together, in a participatory manner, at her lab at the University of Maryland...
Video demonstrating design techniques, including low-tech prototyping, involving children and ideas:
The researchers look at what children are doing today, in homes and in schools. More recently, they have looked at children's use of search at home, away from the eyes of their teachers.
How do we figure out how designs and ideas become new technologies?
Here is one example: StoryKit is a freely available app for the iPhone that can be used as a prototyping tool. Kids are asked to design music and create what it might sound like. It is considered a "mid-tech" tool, and it supports creativity in the design process.
Three things that will happen in the future, according to Druin:
Technology Ecology - apps work across platforms and technologies, wherever kids are and need them (tech agnostic)
Physical/Virtual Switching --- interaction "bursts". Designing for an activity that can be interrupted, in a good way.
Creation of new neighborhoods for learning. A blur between the local and the global, "Local 2.0", beyond the walls. Technologies need to accept and embrace this.
Allison's inspiration comes from her many years of working with children in innovative and creative ways.
RELATED

From the Q and A: Kids now know that technology rapidly changes. Kids continue to be creative. Kids seem to be more confident in their creativity, and that it matters. Kids aren't given enough opportunities to be creative these days, given all of the testing that they endure at school.
"There is a feeling that anything is possible....it does happen!"
I've been pleasantly surprised by the increase of interesting multi-touch and gesture-based applications developed for musical interaction on large displays. This topic is dear to my heart - I took a computer music technology class back in 2003, and think it would have been great if this sort of thing was available back then. Who wants to point and click around a music app for hours on end?!
The video below shows what is new from Osmosis, a company based in NY that focuses on the development of multi-touch and gesture-based software for a range of uses, including music applications.
Transparent Stage System Specs

Design
• Floating, transparent HD displays from 32"
• Haptic surface with tempered glass backing
• High gain image with wide viewing angle
• Rugged aircraft-grade aluminum build
• Enclosed projector and computer
• Minimalist style, compact footprint
• Disassembles for easy transportation

Interactivity
• Projected capacitive foil or IR bezel options
• Up to 32 simultaneous touch points
• Precise, responsive touch tracking (3mm)
• Immune to external light conditions
• Use of fingers, gloves or stylus

Technical
Display:
• DLP portable projector
• 1280×800 HD resolution
• 2500 ANSI-lumen, 1800:1 contrast
Computer:
• Mini-ITX, Core 2 Quad, 4GB RAM
• ATI Radeon X1250 graphics card
• Wireless keyboard and mouse
• Windows 7 Pro
Stuart McClean, the founder of Osmosis, shared the following information about his company:
"Osmosis is a consulting firm based in the NYC area with deep experience in interactive technology. Although we cater to a range of markets, we’re especially passionate about music production and performance. Working closely with artists, we build customized interactive systems for stage and studio. HCI technology is integrated into a range of designs including stands, desktop rigs, tables, carts and vertical screens. Interfaces are tailored to specific artist needs and combine controls, generative audio and visuals, instruments, and gestural input. Our unique and flexible systems take full advantage of multi-touch interaction and offer seamless control of Ableton Live, Traktor, or other DAWs via midi and OSC..."
For more information about the applications developed by the Osmosis team, take a look at their showcase page.
Scott Sona Snibbe, an interactive artist, created a large-scale video installation on twenty-nine connected HD displays that curve around the Tom Bradley International Terminal at LAX airport in 2010. The video tells the story of groups of people traveling through an airport who eventually begin dancing in interesting ways.
Francesca Penzani was the choreographer for this project. Noah Cunningham was responsible for the cinematography, editing, and post-production.
RELATED SNIBBE INTERACTIVE The Snibbe Interactive website has lots of great photos and videos of the company's work.
The video below was an entry in the Google Demo Slam, an effort started by Google to spread the word about its innovative technologies. By the time I learned of Epic Docs Animation, the video had over 800,000 views. I plan to view a few more Google Demo Slam videos over the holiday break!
RELATED
Epic animation in Google Docs - Nathan Yau, Flowing Data, 12/22/10

Google's rationale for creating Demo Slam: "We spend our time making a whole bunch of technologies that are free for the world, but a lot of people dont even know about them. And that kind of sucks. So, we thought organizing the world's most creative tech demo battle would be a great way to help spread the word and teach people about tech. Not to mention, it is a lot of fun."

About Demo Slam
Hall of Demo Champs
The TellTable is a collaborative digital storytelling application designed for Microsoft's Surface interactive multi-user multi-touch table. It was used by children in a school library in the UK, and was quite successful. The video below provides an overview about the way the TellTable worked. I think every school library/media center should have something like this!
"TellTable, a system developed in the Socio-Digital Systems group, uses Microsoft Surface technology to provide an interactive storytelling experience, similar to how children would tell stories using physical toys. Children can create various digital characters and sceneries on Surface, made out of photos taken of real world objects and environments as well as finger paintings. By manipulating these characters using multi-touch gestures, children can narrate, act and record imaginative stories together, before sharing them with friends later on.
TellTable was deployed in a primary school, and received overwhelmingly positive feedback. It has seen crowds of kids create stories featuring characters as diverse as Ninja chickens and rampaging teddy bears. Teachers were also enthusiastic about the potential of TellTable as an education tool, to help children develop communication skills and to create interactive tutorials when teaching foreign languages." -- Microsoft Research UK
During the opening night community session at the recent AIGA Make/Think conference, twenty AIGA chapter leaders from around the country took the stage. Jeremy Perez-Cruz, of AIGA Orlando, was one of them. This was his 60 seconds:
Make Think from Tea&Letter on Vimeo. Quote from Jeremy's Vimeo channel: "My one minute video presentation for the AIGA design conference's 20/20. 20 designers in 20 minutes. October 8, 2009 Memphis, TN. Shot and edited with the patient assistance of deebstudios.com Music: "Joy" by Four Tet (Make Think remix by me)."
Jeremy's design:related page
Jeremy's AIGA Profile
Doug Fox, of the Kinetic Interface blog (part of Great Dance), shared IKEA's new promotional site that allows for tons of multimedia interaction. The website features a series of rooms to highlight various products.
Visit Come into the Closet! IKEA Interactive Multimedia Game

Unlike most static, or even Flash-based, websites, the user has control over many dimensions of the experience:
Simply watch the dancers and listen to the music, navigating from room to room
Control the dancers by tapping on your computer's keyboard
Control the dancers by clapping, singing, or making other noises
Combine tapping your keys while watching and listening
Here is a video preview of some of the dance interaction in the site:
Duncan, of the Inspiration Room Daily, posted the following credits for the IKEA website:
"The site was developed at Forsman Bodenfors, Sweden by web director Mathias Appelblad, copywriter Fredrik Jansson, art director Anders Eklind, designers Anton Robsarve and Christoffer Persson, agency producers Peter Gaudiano, Charlotte Most, Asa Jansson, account director Philip Mascher."
"An earlier version of the site was developed in 2006, winning a Yellow Pencil at the D&AD Awards, and can be viewed online at Forsman Bodenfors."
"Web development was done at Kokokaka, Gothenburg."
"Filming was shot by director Amir Chamdin via Chamdin & Stöhr, Stockholm, with director of photography Gosta Reiland."