Dec 28, 2011

iPads for every student in an elementary school in NC, Smartboards in the classrooms, and one awesome principal

 

The technology initiative at this Title I school resulted in increased attendance, fewer discipline problems, and college aspirations for the students. I would like to visit this school soon and see how the teachers integrate iPads and Smartboards.


The story about the principal of this school is inspiring.


RELATED
Buckley Report: Immigrant Principal
Montlieu Academy of Technology

Dec 15, 2011

Christian Bannister's Interactive Multi-touch and Gesture-based Subcycle Project. Use your hands to shape sound, create, and manipulate music. Wow!!

I came across a link to Christian Bannister's Subcycle Labs website when I was taking a look at the Creative Applications Network website.


If you have an interest in music technology and innovative gesture/multitouch applications, you'll appreciate the details that are shared on the Subcycle website.  In the meantime, take the time to watch a few of Christian's videos.  Enjoy!


Blip Shaper Walkthrough

Blip Shaper Walkthrough from christian bannister on Vimeo.
"a) creating percussive patterns with monome b) shaping the individual sounds that make up the patterns with multitouch gestures c) recording touchscreen gestures as automation d) storing, duplicating and navigation patterns e) recording the resulting audio to a dynamic buffer f) manipulating the buffer with a multitouch cut-up approach g) visualizing everything with dual screens"
Subcycle Walkthrough

Subcycle Blip Shaper from christian bannister on Vimeo.
"a) creating percussive patterns with monome b) shaping the individual sounds that make up the patterns with multitouch gestures c) recording touchscreen gestures as automation d) storing, duplicating and navigation patterns e) recording the resulting audio to a dynamic buffer f) manipulating the buffer with a multitouch cut-up approach g) visualizing everything with dual screens"


The following information describing the Blip Shaper is from Christian's Subcycle website/blog:
"For the drum sounds I have Drumaxx running for synthesized sounds and Battery running for sampled sounds.  These are running in parallel so for each voice there is a separate patch running in each VST.  The Parameters are modified with the touchscreen independently but in all cases a single touch gesture on the X-Axis will cross fade between the sampled version of the sound and the synthesized version of the sound.  I love this because I have never seen this before and I can never decide which technique I like better.  The synthesized drums are more malleable and have more interesting parameters to play with but the sampled sounds seem more substantial.  I will post a detailed list of parameters and gestures in the future."   


Christian currently uses Max for Live (Ableton) and codes with Processing/Java, using Eclipse for organization.

Autopilot - Subcycle
autopilot - subcycle labs from christian bannister on Vimeo.
"sound visualization, multi-touch interface, break beat performance engine, autopilot, spacialized sound, dsp, max/MSP, fm synthesis, sonic navigation, sound storm visualization, time machine, granular, interactive sound sculpture, joystick array, more at subcycle.org"


SOMEWHAT RELATED 
Community Core Vision
NUI Group
BTW, I'm wondering if Christian or other NUI Group members would like to do something with some of my ideas that incorporate interactive music. To learn more, see my post, "It must be Voronoi: Looking for ideas for my music+art+dance+tech dream...."

Dec 13, 2011

Kinect in Education! (kinectEDucation)

Although I'm currently exploring the world of interactive HTML5, interactive video, etc., I think I just might make "kinecteducation" the focus of my tech hobbies. I have some experience with game programming (one of my computer courses required a project using XNA), and I know quite a bit about gesture, multitouch, and multi-user interaction, so it wouldn't be too much of a stretch.


My motivation?

As a school psychologist, my main assignment is a school/program for students with disabilities, including about 40 or so who have autism spectrum disorders. Yesterday, the principal of the school attended a demonstration of the Kinect and requested that our school be considered for piloting it. One of my other assignments is a magnet high school for technology and the arts, and rumor has it that it will be offering a game programming curriculum.  I'd love to co-sponsor an after-school game club and encourage the students to program educational apps for the Kinect sometime in the near future! 


I'm also working as a client, in collaboration with some of my educator colleagues, with a team of university students who are creating a communication/social skills game suite geared for students with autism and related disabilities....


I'm inspired by the possibilities!


We have large SMARTboards in each classroom and in other locations around the building, and we have a Wii set up in the large therapy room adjacent to my office. The Wii has proven to be very useful in helping the students develop social and leisure skills that they can use in and outside of the school settings, but some of the students have difficulty manipulating the buttons on the controllers.


You can get Kinect-based apps from the Kinect Education website! Below are selected links from the website:

You can also get additional information from the Microsoft in Education "Kinect in the Classroom" website.

Below are a few videos to give you an overview of how open-source applications designed for the Kinect can be used in education: 


Dec 12, 2011

UPDATE POST: Educational Interfaces, Software, and Technology: 2012 ACM-CHI Workshop Call for Papers/Presentations

There is still time left to submit your paper!


CALL FOR PAPERS
EDUCATIONAL INTERFACES, SOFTWARE, AND TECHNOLOGY 2012
3rd Workshop on UI Technologies and Educational Pedagogy
May 5-6 2012
in conjunction with ACM-CHI 2012, Austin, Texas

This will be our third annual workshop in conjunction with CHI 2012.



One of the primary goals of teaching is to prepare learners for life in the real world. In this ever changing world of technologies such as mobile interaction, cloud computing, natural user interfaces, and gestural interfaces like the Nintendo Wii and Microsoft Kinect, people have a greater selection of tools for the task at hand. Teachers and students can leverage these tools to improve learning outcomes. Educational interfaces and software are needed to ensure that new technologies serve a clear purpose in the classrooms and homes of the future.



Since teachers are always looking for creative ways to engage 21st century learners, there needs to be an academic venue for researchers to discuss novel educational tools and their role in improving learning outcomes. This workshop aims at filling this void: combining the pedagogical expertise of the cooperative learning and learning sciences communities with the technical creativity of the CHI, UIST, and interactive surface communities. The objective of this workshop is to become a conference within two years.


We invite authors to present position papers about potential design challenges and perspectives on how the community should handle the next generation of HCI in education. 




Topics of interest include:

  • Gestural input, multitouch, large displays
  • Mobile Devices, response systems (clickers)
  • Tangible, VR, AR & MR, Multimodal interfaces
  • Console gaming, 3D input devices
  • Co-located interaction, presentations
  • Educational Pedagogy, learner-centric, Child Computer Interaction
  • Empirical methods, case studies
  • Multi-display interaction
  • Wearable educational media
Submission:  The deadline for workshop paper submissions is Dec 20, 2011. Interested researchers should submit a 4-page position paper in the ACM CHI adjunct proceedings style to the workshop management system. Acceptance notifications will be sent out February 20, 2012. The workshop will be held May 5-6, 2012 in Austin, Texas. Please note that at least one author of an accepted position paper must register for the workshop and for one or more days of the CHI 2012 conference.

Website: http://smarttech.com/eist2012
Contact: Edward Tse, SMART Technologies, edwardtse@smarttech.com


RELATED
Educational Interfaces, Software, and Technology Workshop Organizers
Edward Tse, SMART Technologies 
Lynn V. Marentette, Union County Public Schools
Syed Ishtiaque Ahmed, Cornell University
Alex Thayer, University of Washington
Jochen Huber, Technische Universität Darmstadt
Max Mühlhäuser, Technische Universität Darmstadt
Si Jung “Jun” Kim, University of Central Florida
Quincy Brown, Bowie State University

Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn - an awesome book by Cathy "HASTAC" Davidson. (Her blog is awesome, too!)

Who is Cathy Davidson?
According to her most recent bio, Cathy "served from 1998 until 2006 as the first Vice Provost for Interdisciplinary Studies at Duke University, where she worked with faculty to help create many programs, including the Center for Cognitive Neuroscience and the program in Information Science + Information Studies (ISIS). She is the co-founder of the Humanities, Arts, Science, and Technology Advanced Collaboratory, HASTAC ("haystack"), a network of innovators dedicated to new forms of learning for the digital age. She is also co-director of the $2 million annual HASTAC/John D. and Catherine T. MacArthur Foundation Digital Media and Learning Competition."


Here is Cathy's book: Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn

Note: I am almost finished with the book. Sadly, it went missing for several weeks, so I have about 50 more pages to read, now that it is found! I didn't want to wait too long before writing my review, so I thought I'd highlight it here, since it will make a great read, or gift, for the holiday season! It is well worth reading, as Cathy is a deep thinker who transcends disciplinary confines.


Here are a few of Cathy Davidson's blog posts:
So What Again is HASTAC? Post #HASTAC2011 Reflections on a Network Founded on a Theory That's Practice 12/4/11
Why is the Information Age Without the Humanities Like the Industrial Revolution Without the Steam Engine? 1/24/10
Five Ways the Open Web Can Transform Higher Education 12/4/11


RELATED
HASTAC website, where you can discover a number of excellent blogs and interesting links.


Near Field Communication (NFC) for SmartPhones at the Museum of London, plus related video and NFC links

I recently came across the following video demonstrating the use of NFC (Near Field Communication) at the Museum of London.  I wouldn't be surprised if this technology takes root in a variety of applications in the very near future!


Museum of London's use of Near Field Technology with Nokia


London History Museum Adopts Technology of Future
Claire Swedberg, RFID Journal, 8/16/11
"Museum of London has installed NFC RFID tags in its two location, to provide the growing number of NFC-enabled phone users with additional exhibit information, access to social Web sites and vouchers."
Byte at the museum
Trevor, Conversations by Nokia, 9/26/11


RELATED 
Apple, Microsoft Reported to Include NFC in 2012 Smartphones
Christina Bonnington, Wired Gadget Lab, 11/22/11

Gamification and NFC top 2012 Digital Trends
Wenlei Ma, AdNews, 12/12/11
Microsoft: NFC already supported in Windows Phone
Gareth Beavis, TechRadar.com, 12/9/11

Near Field Communication
Android Developers (a small code sketch follows this list)
Intel joins NFC Forum board
Telcompaper, 12/8/11

Join the NFC (Near Field Communication) Revolution Today
NFC Data Inc, Sacramento Bee, 11/30/11
"Pledge support. Set a new mobile wallet standard for security, convenience, and affordability! Think outside the phone!"

NFC Near Field Communication Tutorial
Radio-Electronics.com
"A summary overview or tutorial of Near Field Communications, detailing what is NFC and its technology"
Vending Machines Now Take Google Wallet Payments

-Wall Street Journal, MarketWatch


Dec 9, 2011

Quick Link: The Remote - Death by Voice Control?

Voice Control, the End of the TV Remote? Samsung, LG, and others are racing to bring voice control to the TV Set 
Peter Burrows and Cliff Edwards, Bloomberg Businessweek, 12/7/11


"Most consumers’ first opportunity to talk to their TVs—and have them listen—will be through voice-enabled apps for their smartphone or tablet."

I look forward to seeing if/how Apple TV uses Siri as a remote control!


RELATED
Jakob Nielsen, Alertbox, 6/7/04  
(A great read, still holds true in 2011!)

Lynn Marentette, Interactive Multimedia Technology, 10/12/10

Harry Brignull, 90 percent of everything, 10/6/10

Bad hospital TV remote control:

Instructions for my last remote control:

Nov 30, 2011

MozoCam Video Demos: Great for virtual field trips, awesome HD + fluid movement

I was looking for some high-quality video clips to use with students at work for activities that incorporate "virtual field trips". I was delighted to find the following clips using the MozoCam system. A customized Segway is used for the MozoCam system, which is why the motion is so smooth.


Nov 28, 2011

FlatFrog Multitouch Videos: Point Separation, Multi-input, Multi-user input

FlatFrog Multitouch is a company based in Sweden. It was founded by Ola Wassvik and Christer Fåhraeus. The technologies support 20+ simultaneous touches and recognize object size, a useful feature. FlatFrog screens can be optimized for a wide range of light conditions. FlatFrog's multi-touch and gesture interaction is featured in the short video clips below.


FlatFrog is gearing up for commercial release. According to the FAQs on the website, "all sizes are possible, from 5" to 100" and upward." Promethean is one of the company's investors. There is a volume manufacturing agreement with Kortek Corporation, known for industrial and gaming displays.




Thanks to Touch User Interface for sharing this information! (Touch User Interface is the blog for Sensible UI, known for the ArduMT, aka the Arduino Multi-touch Development Kit.)

Nov 27, 2011

Starbucks Cup Magic: Augmented Reality App for iPhone and Android!



The app was developed by Blast Radius.  FYI, Blast Radius is hiring.
"Blast Radius is a global agency that takes a strategic approach to tackling the complex issues of growing brand and revenue in a digitally connected world."


For more information, take a look at the Starbucks Cup Magic iTunes preview.


Thanks to Ajit Jaokar for the link!

Nov 26, 2011

Revisiting Good Blogs: User Interface Engineering (Albrecht Schmidt, University of Stuttgart)

Albrecht Schmidt is a professor at the University of Stuttgart. His main interests include novel user interfaces and innovative applications for ubiquitous computing. His blog, Albrecht Schmidt - User Interface Engineering, serves as his note-pad, full of ideas, deep musings, and great links, including links to good scholarly papers. Below are a few of his posts:

Guests in my multimodal interaction class   
I'd love to take Dr. Schmidt's Multimodal Interaction class! In this post, Albrecht highlights the presentations of Nigel Davies, Marc Langheinrich, and Rui Jose, all members of the pd-net project.

Call for Papers:  Symposium on Pervasive Display Networks

Closing Keynote at AMI2011:  Beyond Ubicomp - Computing is Changing the Way we Live
(Pdf version - worth taking a look!)


PerCom 2011 in Seattle, keynote


RELATED
PD-NET
"The PD-NET project aims to lay the scientific foundations for a new form of communications medium with the same potential impact on society as radio, television and the Internet. The goal is to explore the scientific challenges and to assess the new technologies required to enable the emergence of large scale networks of pervasive public displays and associated sensors. This display network will be designed and implemented to be open to applications and content from many sources and thus provide the foundation for work on a new global communications medium for information access and interaction." 


Note: One of my interests is public displays in public spaces. If you are interested, take a look at my Flickr collection, Ubiquitous Sightings of Urban Screens: Interactive Displays, Kiosks, and Digital Signage.

Revisiting Good Blogs: Eager Eyes (Robert Kosara, UNC-C)

Robert Kosara is a professor at UNC-Charlotte, responsible for opening my eyes to the world of information visualization and visual communication when I was a student in his graduate course a few years ago.  He is a deep thinker and his blog/website, Eager Eyes, is well worth taking the time to explore!


Here are some links to his posts:


You Only See Colors You Can Name "While color is a purely visual phenomenon, the way we see color is not only a matter of our visual systems.  It is well known that we are faster in telling colors apart that have different names, but do the names determine the colors or the colors the names? Recent work shows that language has a stronger influence than previously thought."

What is Visualization? A Definition

Understanding Pie Charts

Protovis Primer:  Part 1, Part 2, Part 3

Chart Junk Considered Useful After All

Six Niche Visualization Blogs

Linear vs. Quadratic Change

TWINE video and the Supermechanical Blog: Moving Toward the Web of People and Things!

"A wireless square with sensors and a simple web app to set rules, Twine tells you what your things are doing by email, text or Twitter."   I want one!

This project was developed by David Carr and John Kestner, the designer-engineers behind Supermechanical. They are passionate about creating connectable objects. They honed their skills at the interdisciplinary MIT Media Lab.


More information about Twine can be found on the KICKSTARTER website. Here is a bit of info from the site for the tech-curious:

"Twine is a wireless module tightly integrated with a cloud-based service. The module has WiFi, on-board temperature and vibration sensors, and an expansion connector for other sensors. Power is supplied by the on-board mini USB or two AAA batteries (and Twine will email you when you need to change the batteries)."
"The Spool web app makes it simple to set up and monitor your Twines from a browser anywhere. You set rules to trigger messages — no programming needed. The rules are put together with a palette of available conditions and actions, and read like English: WHEN moisture sensor gets wet THEN tweet "The basement is flooding!" We'll get you started with a bunch of rule sets, and you can share rules you create with other Twine owners."
"Because the hardware and software are made for each other, setup is easy. There's nothing to install — just point Twine to your WiFi network. Sensors are immediately recognized by the web app when you plug them in, and it reflects what the sensors see in real time, which makes understanding and testing your rules easy."


Nov 25, 2011

Revisiting Good Blogs: Nathan Yau's Flowing Data

One of my favorite blogs is FlowingData, Nathan Yau's labor of love for the past several years. Nathan is a UCLA PhD candidate in statistics with a focus on data visualization. He shares interesting tidbits of information on his blog, including those that relate to his main interests: social data visualization, self-surveillance, and data for non-professionals. He advocates for accessible and useful data visualization.


Nathan is the author of the book Visualize This: The FlowingData Guide to Design, Visualization, and Statistics, which is nicely explained in the video below:


ADDRESS IS APPROXIMATE: Stop-motion animation short by Tom Jenkins about the journey of a lonely desk toy through Google Street View


Address Is Approximate from The Theory on Vimeo.


Below is information about the video posted on Vimeo:
"Google Street View stop motion animation short made as a personal project by director Tom Jenkins. Story: A lonely desk toy longs for escape from the dark confines of the office, so he takes a cross country road trip to the Pacific Coast in the only way he can – using a toy car and Google Maps Street View. All screen imagery was animated - there are no screen replacements. 


Produced, animated, filmed, lit, edited & graded by Tom Jenkins (www.theoryfilms.co.uk / https://www.facebook.com/theoryfilms - !NEW MAKING OF PICS ON FB PAGE! / @thetheoryUK / http://twitter.com/#!/thetheoryUK). Shot using Canon 5d MkII, Dragonframe Stop Motion software and customised slider. Music by the wonderfull Cinematic Orchestra (www.cinematicorchestra.com) and the track is Arrival of the Birds."




Thanks to Nathan Yau (Flowing Data) for sharing this video!

Nov 24, 2011

Revisiting Good Blogs: Innovative Interactivity (II) - A Digital Watering Hole for Multimedia Enthusiasts

One of the blogs I enjoy following is Innovative Interactivity (II).  Here is the description from the "About" section of the blog:


"Innovative interactivity serves as an open forum for multimedia producers, interactive web developers, and new media professionals. Content focuses on the dynamics and theory of how people receive and react to different forms of information on the web, both through visual, multimedia storytelling and interactive information design."


"The goal is that this blog will provide an outlet for those in the online realm, whether you are interested in learning about multimedia storytelling, interactive web development, programming languages, software tools, data visualization, or all of the above.  Hopefully you will be inspired from what you read here to surpass your current standards in order to develop highly effective multimedia interactives for the digital community."

Tracy Boyer Clark is the founder & managing editor of Innovative Interactivity.  She is finishing up her MBA/MSIS dual master's degree at UNC-Chapel Hill. 


Here are a few links to recent II blog posts:
Behind the scenes of NFB's One Millionth Tower
Lessons learned from Internet Summit 2011
Facebook COO Sheryl Sandberg delivers monumental speech for women in tech

Mind/Shift Blog Post: Read, Hear, or Create a Story: Apps for Traveling with Kids - and great related links!

Today is Thanksgiving, and one of the things I'm thankful for is the opportunity to share interesting ideas and links with readers of this blog! 


The following post was written by Tina Barseghian, who is an editor/blogger at KQED/NPR for the Mind/Shift blog. Mind/Shift  focuses on ideas and technology that will impact the future of how we learn.  
Read, Hear, or Create a Story: Apps for Traveling with Kids
Tina Barseghian, Mind/Shift, 11/23/11

Link to other Mind/Shift post collections:
Children and Social Media
Tech Tools
Mobile Learning
Digital Divide
School Day of the Future


RELATED 
Boredom Busters: 50 Fantastic Play-and-Learn Apps, Sites, and Toys
Tina Barseghian, Mind/Shift, 6/17/11
Libraries and Museums Become Hands-On Learning Labs 
Audrey Watters, Mind/Shift, 11/23/11
"A new competition sponsored by the Institute for Museum and Library Services (IMLS) and the John D. and Catherine T. MacArthur Foundation has just announced 12 winning libraries and museums that will receive $1.2 million in grant money to help push the boundaries of what these institutions look like, specifically helping to create facilities that are  better "learning labs" for teens"
Lauren Britton Smedley (Transliteracy Development Director at the Fayetteville Free Library)
IDEO's Design for Learning (Sandy Speicher, Duane Bray, Rachel Switzky)
K12 Laboratory at D. School (Stanford)
Books Should Be Free
Storynory and the Storynory iTunes Library: podcasted stories for children; offers a free audio story every week and has some online multimedia activities.



Nov 19, 2011

Camera-less Tabletop Computing with Samsung SUR40 for Microsoft® Surface® with PixelSense™

Here is the press release: 
Next Generation of Microsoft Surface Available for Pre-Order in 23 Countries
"Software developers interested in creating solutions for the Samsung SUR40 can get started immediately by visiting the Surface Developer Center. The site provides free and easy access to the Surface 2.0 software developer kit, featuring the Input Simulator, which enables developers to write Surface applications on any Windows 7 machine, as well as other helpful developer-related resources. There are already hundreds of highly skilled Surface software development partners that can be found at http://www.surface.com."


(I've listed Microsoft Surface partners at the end of this post and plan to share more about the latest applications for surface computing in the near future.)


The following product information was taken from the Samsung website:
40" Surface Experience 
"Samsung SUR40 is the new generation of Microsoft® Surface® experience featuring PixelSense™ technology, which gives LCD panels the power to see without the use of cameras. Building from the innovation of the first version of Microsoft® Surface® and Samsung’s leading display technology, it is now possible for people to share, collaborate and explore together using a large, thin display that recognizes fingers, hands and other objects placed on the screen." 


PixelSense™ 
"PixelSense™ allows an LCD display to recognize fingers, hands, and objects placed on the screen, including more than 50 simultaneous touch points. With PixelSense™, pixels in the display see what’s touching the screen and that information is immediately processed and interpreted."


Resolution: 1920 x 1080
Viewing Angle (H/V): 178° / 178° (CR ≥ 10)
CPU: Athlon X2 Dual-Core 245e (2.9 GHz)
Operating System: Windows 7 Professional x64


GPU: AMD HD6750M
Northbridge (N/B): AMD RS780E
Southbridge (S/B): AMD SB710


Storage: SATA2 320 GB
Memory: DDR3 4 GB
USB: 4 x USB 2.0
Video Out: HDMI
Ethernet: 100 / 1000
Audio Codec: Realtek ALC262 Azalia codec


Product Dimensions (With Stand): 1,095 x 728 x 707.4 mm
Product Dimensions (Without Stand): 1,095 x 102.5 x 707.4 mm
Shipment Dimensions: 1,214 x 299 x 832 mm


Product Weight: 35 kg
Shipment Weight: 45.4 kg


RELATED
Samsung
Microsoft Surface
Microsoft Surface "What's New"
Microsoft Surface Partners:
Aftermous.com
AKT
AM Production
Black Marble
ETT
Headcandy
IdentityMine
Information Strategies
Infusion
Inhance Digital
Interknowlogy
Intuilab
nSquared
Object Consulting
Onwijs
Razorfish
Sevensteps
Stimulant
Touchtech
T-Systems
MultiMedia
UID
Vectorform
XFace

Nov 13, 2011

Is the answer Voronoi? Looking for possible solutions to an art+dance+music+tech idea from a recurring dream....

If you are a long-time reader of this blog, you probably know that I sometimes have some unusual dreams about technology.  I don't blog about my dreams very often, but last night, I had another technology dream, a continuation of a dream I had one night last week.  
Voronoi Diagram (Wikipedia)

I'm pretty sure that the last two dreams were sparked by playing with an online interactive demonstration of a Voronoi application before going to sleep one night, and also by reading an article about "extracting ordered patterns from a triangular mesh surface" in the November/December IEEE Potentials magazine before turning in last night.

The dance was probably influenced by my recent viewing of the North Carolina Dance Theater's performance of Innovative Works with my mom, someone who encouraged my study of music, art, and dance at an early age.

Some of my tech dreams are sort of...practical. For example, in one recurring dream, I find myself coding for a flexible mesh/grid application. Sometimes the mesh/grid has something to do with wireless sensor networks on curved terrain, perhaps related to something like the Smart Grid, and sometimes I find myself working on an application that analyzes streaming data from a variety of sources, for security prediction purposes.  At other times, I'm coding for something more artistic, my preference.

Last night, my dream focused on creating a flexible mesh fabric to be used in a multimedia dance/graphic arts/music performance. I was coding for this performance using a Voronoi-like algorithm.


This is the best I can do to explain this: The fabric is carried by the dancers, and is both reactive and generative. In essence, the fabric is intertwined/embedded in the dance, the music, and the graphics. In my dream, everything looked/sounded/felt awesome and otherworldly, and the music that merged and morphed during the dance was so beautiful, not only the melodies, but the sounds.  (In a previous dream, the mesh contained a "nanotechnology" component, but I'll save that quest for the future.)


I thought I'd look at some of my web bookmarks and search a bit more for information related to this topic. For now, here is the "brain dump". I have more to add to this post, and I plan to port it to a reference page for this blog in the future. I hope that this post will be useful to some of my art/music/dance/tech readers!

RELATED AND SOMEWHAT RELATED


Update: Right after I uploaded this post, I came across a link to a WebGL demo for a 3D music video of pop singer Ellie Goulding's song, 'Lights', by HelloJoy. Visitors to the webpage can click to interact with the environment. If you keep the button pressed, you fly faster. If you tweet the link, you'll see your name crop up as you fly around in the soundspace. For more information about the making of 'Lights', take a look at Behind the scenes of 'Lights': the latest WebGL sensation! (Carlos Ulloa, 11/9/11)


After I watched the 'Lights' video, I recalled Radiohead's 'House of Cards' video, which I wrote about back in 2008: 
 "We were rolling computers all day"...The Making of Radioheads House of Cards using imaging and info visualization software.   The process behind the making of the House of Cards video was described in detail in Chapter 10 of the book, Beautiful Data.

Bradley, E., Capps, D., Luftig, J, & Stuart, J.M. Toward Stylistic Consonance in Human Movement Synthesis.(pdf)  The Open Artificial Intelligence Journal, 2010, 4, 1-19
Bradley, E., Stuart, J.  Using Chaos to Generate Variations on Movement Sequences (pdf) Chaos, 8:800-807 (1998)
Bradley, E., Stuart, J. Learning the Grammar of Dance (pdf). Proceedings of the Fifteenth International Conference on Machine Learning, Madison, WI, 1998
E. Bradley, D. Capps, and A. Rubin, "Can computers learn to dance?," Proceedings International Dance & Technology (IDAT), Tempe AZ, Feb 1999.
Chaotic Dance: Using mathematics to generate choreographic variations
Schedl, M., Hoglinger, C., Knees, P. Large-Scale Music Exploration in Hierarchically Organized Landscapes Using Prototypicality Information (pdf)
Fournel, N. Procedural Audio for Video Games: Are we there yet? (pdf) GDC 2010


Voronoi Cells, created by Nathan Nifong. A version of this interactive work was used in a DanceDraw performance.


Patterns in the Noise (Nathan Nifong's site; FYI, Nathan worked with Celine Latulipe on the Dance.Draw project while completing his bachelor's degree in computer science at UNC-Charlotte)
Voronoi Dance (Christian Gross, using OpenFramework)
Voronoi art: Slow Trip (Oktalist/Mat)


The above video, by Mat/Oktal, was inspired by his viewing of Thomas Ruff's Substrat images. 
Scott Snibbe Studio (Interactive art, music, and animation for iPhone, iPad, iPod, and Mac)

Interactive Voronoi Diagram Generator with WebGL (Alex Beutel)
 
The above video was found in Alex Beutel's blog post, "Interactive Voronoi Diagrams with WebGL"

Posts about DanceDraw and related work at UNC-Charlotte:
News from the HCI lab at UNC-Charlotte- Creative Interactions (Videos)
Exploring the Design Space in Technology-Augmented Dance at CHI 2010:  Celine Latulipe's team from UNC-Charlotte
Interactive Surveillance:  Live digital art installation by Annabel Manning and Celine Latulipe

SIGCHI  Digital Arts and Interaction Community:  Building Bridges
The Interdisciplinary World of Dance and Interactive Technology

HTML5Voronoi  (HTML5Code website)

HTML5 Voronoi, Live Version
Update to code to compute Voronoi diagrams (Raymond Hill, 5/22/11)
William Forsythe's "Synchronous Objects-One Flat Thing, Reproduced" - Multidisciplinary online interactive project: Translating choreography into new forms.
BCS HCI 2011 Workshop: When Words Fail:  What can Music Interaction tell us about HCI?
Woven Sound (Alex McLean)
Real DJs Code Live (Robert Andrews, Wired, 7/3/06)
Visualization of Live Code (Alex McLean)
Voronoi diagrams of music (pdf)  (Alex McLean, 2006)
WebGL
Sylvester: Vector and Matrix Math for JavaScript
Generative Art Links (Mikael Hvidtfeldt Christensen)
Schacher, J.C. Motion to Gesture to Sound:  Mapping For Interactive Dance (pdf) Proceedings of the 2010 Conference on New Interfaces for Musical Expression (NIME 2010), Sydney, Australia
Code & Form:  Computational Aesthetics (Marius Watz)
Werghi, N. Extracting ordered patterns from a triangular mesh surface.  IEEE Potentials, Nov/Dec 2011
Last night I dreamt about haptic touch-screen overlays
Hyun-Seok Kim's 'Voronoi' dragonfly-wing-inspired superyacht

Hyun-Seok Kim's Voronoi Yacht