
Mar 9, 2011

Could your child have a Geminoid for a teacher someday? Imagine the possibilities, good or bad!

In the US, teachers are being laid off or eliminated from their jobs in many communities. The improvements seen on Wall Street haven't been felt on Main Street. Many parents face the fact that their children, teens, and college-age kids will be entering classrooms and lecture halls filled with many more students than in the past, come August or early September.


What are the solutions? The popular buzz says that interactive whiteboards, netbooks, SmartPhones, and tablets such as the iPad 2 will transform K-16 education. States and local school districts have created a range of virtual schools, and these are increasing in number. My own state offers a "virtual advantage" in the form of the North Carolina Virtual Public School. There are opportunities for students with special needs to receive instruction on-line, too.

"Mr. Geminoid"



What might we see in the future? Robots for teachers? While the topic might seem a bit amusing, it might not be a joke someday.

Five years from now, will my grandson go to kindergarten toting a next-gen iPad (augmented reality-enabled, 3D holographic, telepresent, virtual-school connected), greeted at the classroom door by the next-gen Mr. Geminoid, his life-like, fully programmable teacher?


At this point, the Geminoid is controlled by a real person.  In the school setting, one teacher could control his or her "Geminoid" in multiple classrooms, I suppose....


"A Geminoid or twin-robot is designed to resemble a living person. It is controlled by an operator who through the use of advanced software can transfer facial movements and speech to the robot next door (or where ever...). For many years, robot technology has primarily been associated with factories and warehouses, but the days of thinking about robots as just 'tools' are over, Henrik Scharfe says. The robots have become a media, perhaps one of the more significant media of the future. In that way, robot technology follows the trajectory of computers: from tools to media." -Geminoid DK


I can see that many people might fancy the idea of robots taking over the role of teachers.  


Unlike the Geminoid, these robots wouldn't need to be controlled by a living, breathing teacher. Anti-teacher-union folks would like the fact that there would be fewer wages to pay, no rising health insurance premiums to worry about, and no concerns about retirement. No money would need to be set aside for death benefits.


I imagine there would be other costs, and these might add up to more than what we now pay for live teachers. We'd need to pay for programmers who can handle all aspects of instructional technology. We'd need to pay for robotic maintenance technicians. We'd have to pay for the network that runs the system of robot teachers.


We'd have to figure out how the robots could realistically handle parent-teacher conferences, classroom conflicts, bus duty, coaching, choir practice,  club sponsorship, volunteer coordination,  science fairs, and so forth.  


There is always the chance that robots outfitted with future-generation AI capabilities might become sentient, at which point they might decide to unionize, unless they happen to "live" in states that don't allow teacher unions, such as N.C.   ; )


Just a thought.   

RELATED AND SOMEWHAT RELATED
Geminoid DK Website (lots of info!)
Geminoid DK: An ultra-realistic android announced (w/Video)
Katie Gatto, Physorg, 3/7/11
Latest Geminoid Is Incredibly Realistic
Evan Ackerman, IEEE Spectrum Blog, 3/5/2011
I Want to See What You See:  Babies Treat 'Social Robots' as Sentient Beings
Science Daily, 10/14/11
"A team of University of Washington researchers is studying how infants tell which entities are "psychological agents" that can think and feel."

geminoid dk
Photo Credit:  IEEE Spectrum/Geminoid|DK




Ms. Geminoid:
geminoid f
Photos: Osaka University (left); Osaka University and Kokoro Company (right); composite (middle)



Get the Virtual Advantage (pdf)
NC Virtual Public School

Mar 7, 2011

Post from TechPsych blog: How about some iPad 2 apps for school psychology and related fields?


The video provides a recap of the iPad and highlights how it can be used in education, for students with special needs, in aviation, in doctor's offices, in the board room, in an art studio, in the kitchen - just about anywhere.


I wonder how an iPad could transform the work of a school psychologist! I have plenty of ideas, now that the iPad 2 is out. How can it help with RTI? Progress monitoring? Digital social stories? Social skills activities? Counseling? Behavior intervention? Transition planning? Assessment? Communication? Productivity? IEP collaboration and development?


Wouldn't it be nice if there were a few iPad apps that could effectively support the work of school psychologists and those who work in similar fields? 



So much of what we use in our productivity work as school psychologists was built upon applications first developed in the mid-to-late 1990s, at best. There must be more efficient and effective solutions out there. What can be done to move us forward?


Psychological and educational test publishers, can you hear me now?   


I have some ideas.








Feb 21, 2011

Make It So: Interaction Design Lessons from Science Fiction - Video of MacWorld presentation by authors of upcoming book.

I'll admit it. I've been a SciFi fan and Trekkie since childhood. 


When I discovered that Nathan Shedroff and Chris Noessel were writing Make It So: Interaction Design Lessons from Science Fiction, I just had to take a break from writing reports to share the news.


The video below is from the authors' MacWorld presentation, which covers some of the content of the book's first chapter:


Below is information about the book, from the publisher's website:

"Interaction and interface designers can learn practical lessons from the interfaces in Science Fiction films and television. Though lacking rigorous engagement with users, production designers are nonetheless allowed to develop influential "blue-sky" examples that are inspiring, humorous, prophetic, useful, and can be incorporated into "real" work to make online, mobile, and ubiquitous interfaces more interesting and more successful. This book will share lessons and examples culled from imaginative interfaces free from traditional constraints. In addition, the authors will outline their process of investigation and describe a toolkit for others to make similar explorations into other domains." 
book in progress by Nathan Shedroff & Chris Noessel. Publisher: Rosenfeld Media. Anticipated publication date: 2012
Nathan Shedroff is the chair of the new MBA in Design Strategy program at California College of the Arts.  Chris Noessel is an interaction design director at Cooper.


via Putting People First 

Transit: Digital Video Dance Storytelling on Large Displays at the LAX Airport, by Scott Snibbe and Francesca Penzani

Scott Sona Snibbe, an interactive artist, created a large-scale video installation on twenty-nine connected HD displays that curve around the Tom Bradley International Terminal at LAX, in 2010. The video tells a story of groups of people traveling through an airport who eventually begin dancing in interesting ways.

Francesca Penzani was the choreographer for this project. Noah Cunningham was responsible for the cinematography, editing, and post-production.




RELATED
SNIBBE INTERACTIVE
The Snibbe Interactive website has lots of great photos and videos of the company's work.



Thanks to Daniel Chen for sharing this video!

Feb 20, 2011

Human-Computer Interaction (HCI) Is Changing the World: BLUR Conference, February 22-23, Omni Orlando Resort (Includes video)

Tuesday, February 22, 2011 at 8:00 AM - Wednesday, February 23, 2011 at 5:00 PM (ET) Omni Orlando Resort, 1500 Masters Blvd. 
 ChampionsGate, Florida 33896 Phone: (407) 390-6664
Blur Conference

ABOUT BLUR 
(from the conference website)

"It’s easy to forget that the computer mouse is over 45 years old."

"What’s not as easy to forget is that we’re now collectively getting used to interacting with computers via means and interfaces that have moved way beyond the keyboard and the mouse — the iPhone and Wii being the most prominent examples."

"The truth is that we stand on the verge of a major revolution in the models of Human Computer Interaction (HCI). A revolution that will fly right past academic and into a world of retail, medical, gaming, military, public event, sporting, personal and marketing applications."

"From multi-touch to motion capture to spatial operating environments, over the next 10 years, everything we know about HCI will change."

"Blur is the only conference that is exploring the line of interaction between computers and humans in a substantive, real-world and hands-on way."

"At Blur, vendors, strategists, buyers and visionaries assemble to not only discuss the larger issues of HCI, but also to lay their hands on the latest in HCI technology. Blur is the only forum for a focused, hands-on exploration of the varied technologies evolving in the HCI."

"Come play, investigate, learn and apply at Blur — where we’re changing how you interact with computers forever." -Blur




BLUR Conference Agenda
(Note:   I added the links to conference participants and/or their organizations. Feel free to leave a comment if you know of any corrections or better links!)
Keynotes:

Neuroergonomics: How an Understanding of the Brain is Changing the Practice of Human Factors Engineering - Dr. Kay Stanney, Design Interactive
When Computers Feel: Understanding Human Emotional Measurement  - Hans Lee, EmSense
A Quick Hit on Mobility and HCI - Juan Pons, Swype
Panel Discussion: Haptics- The Beginnings and Future of Touch  - Nimish Mehta
Why HCI will lead the biggest tech revolution yet - Andrew Tschesnok, Organic Motion
Location as a Primary Interface Input - Matt Galligan, SimpleGeo; Nick Brachet, Skyhook Wireless
Robotics, Gaming and The Future of Entertainment- Paul Berberian, Orbotix
Virtual Coaches in Healthcare: A Vision of the Future - Dan Siewiorek, Carnegie Mellon University
10 reasons to be happy about giving computers emotion sensing - Dr. Rosalind Picard, MIT
Commercializing HCI Technology - Dr. Paul Kedrosky, Ewing Marion Kauffman Foundation and Dr. Gerry Barnett


Breakout Sessions:
Human Instrumentation - James Park, FitBit; Ben Rubin, Zeo; Jason Jacobs, RunKeeper;
Steve Larsen, moderator

New Museum Experiences: Learning from Multitouch and Multiuser Installations - Jim Spadaccini, Ideum
Kinect Hacks - Jonathan C. Hall; Lonergan Harrington; Jim Spadaccini; Sean Kean, moderator
Interactive Ads and Consumer Experiences - Alessio Signorini, Immersive Labs; Jon Fox, Helios Interactive
Augmented Reality - Ready for Primetime? - Vikas Reddy, Occipital; Carlin Getliffe, Omniar; Edwin Rivera, Credelis; Dan Rua, moderator
Building an Interface for Endangered Language Learners - Finn Thye and Kelson Adams, Univ of Colorado - Boulder
Alternative Interface Inputs - Gary Clayton, Nuance; Nick Langdale-Smith, Seeing Machines; RJ Auburn, Voxeo; Steve Larsen, moderator
Ewing Marion Kauffman Foundation "Idea Hack" - Commercializing HCI Technology: A Discussion - led by Paul Kedrosky
Building Natural User Interfaces - Thomas Peterson, SoftKinetic; Ohad Shvueli, Prime Sense; David Minnen, Oblong
3D Interactive Design for the Human Body - Albert Hwang
Panel: Will the Kinect Change the HCI Industry Forever? A Group Discussion
HCI in the 21st Century:  Technologies for Extending and Amplifying the Human Experience (pdf) - Dr. Charlie Hughes, UCF; Dr. David Pratt; Dr. Joseph LaViola;  moderated by Steve Fiore, UCF


Some Videos of HCI/Tech featured at Blur 2011 
360 Panorama occipitalhq


"Illuminous" Eric Gradman



"Standard Gravity" Eric Gradman, OpenKinect (libfreenect/python)


Organic Motion Markerless Motion Capture


Advisory:
Steve Fiore, University of Central Florida
Bob Allen, Disney R&D
Kay Stanney, Design Interactive
Capt. Dylan Schmorrow, USN





Feb 13, 2011

Wii Just Dance2 and Kinect Dance Central: UI and Usability Approaches; Challenges for Developing Accessible Games (revised)

I love to dance. I studied dance through college, and off and on as an adult. I have a DDR (Dance Dance Revolution) game-floor pad somewhere in my attic gathering dust. I'm ready for new challenges.


I'm planning on buying a couple of new dance games for the Wii and the Kinect. There is more to this story: given my interest in off-the-desktop, post-WIMP HCI (human-computer interaction), interactive multimedia and games, and a career as a school psychologist dedicated to young people with disabilities, I'm excited to see where new technologies, interfaces, and interactions will take us.


So what do the wise men of usability have to say about new ways of interacting with games and other applications?


"Kinect has many great design elements that clearly show that the team (a) knows usability, (b) did user testing, and (c) had management support to prioritize usability improvements, even when they required extra development work." -Jakob Nielsen


Jakob Nielsen, one of the godfathers of usability, shared a few words of wisdom about the Kinect in his 12/27/10 Alertbox post, Kinect Gestural UI: First Impressions. Although he did not review Dance Central, he concluded that the game he reviewed, Kinect Adventures, was fun to play, despite usability problems.


If this is a topic that interests you, I recommend reading Nielsen's post and taking a look at the usability issues outlined there. Also take a look at a recent essay Nielsen co-authored with Don Norman, another godfather of usability: Gestural Interfaces: A Step Backwards In Usability


Why is this topic important to me?
I have been involved in the Games for Health and Game Accessibility movement for many years.  Lately I've been exploring the OpenKinect project with an aim to create ways of making movement-oriented games accessible for young people with more complex disabilities.  For example, there is a need to have dance and movement games modified for students (and adults!) who need wheelchairs or walkers.  There are students who have milder mobility challenges who love to dance, and the current games don't address their needs.  Some of my students have vision or hearing impairments, too.  They deserve a chance to play things designed for the Kinect.

"OpenKinect is an open community of people interested in making use of the amazing Xbox Kinect hardware with our PCs and other devices. We are working on free, open source libraries that will enable the Kinect to be used with Windows, Linux, and Mac."
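To give a flavor of what an accessible-gesture layer on top of OpenKinect might involve, here's a minimal sketch in plain Python (no Kinect required). It compares two depth frames and counts changed pixels, with an adjustable sensitivity that could be lowered for players with limited mobility. The flat-list frame format, thresholds, and function names are all my own invention for illustration, not part of the OpenKinect API.

```python
# Hypothetical sketch: decide whether "enough movement" happened between
# two depth frames, with an adjustable sensitivity for players who use
# wheelchairs or walkers. Frames here are flat lists of depth values in
# millimeters; a real version would pull frames from libfreenect instead.

def movement_fraction(prev_frame, curr_frame, depth_delta_mm=50):
    """Fraction of pixels whose depth changed by more than depth_delta_mm."""
    changed = sum(
        1 for p, c in zip(prev_frame, curr_frame) if abs(p - c) > depth_delta_mm
    )
    return changed / len(curr_frame)

def gesture_triggered(prev_frame, curr_frame, sensitivity=0.10):
    """Lower sensitivity -> smaller movements count as a 'dance move'."""
    return movement_fraction(prev_frame, curr_frame) >= sensitivity

# Toy frames: 100 pixels, 20 of which moved by 200 mm.
prev = [1000] * 100
curr = [1200] * 20 + [1000] * 80
print(gesture_triggered(prev, curr, sensitivity=0.10))  # True
print(gesture_triggered(prev, curr, sensitivity=0.50))  # False
```

The point of the sketch is the tunable threshold: the same game could register a full-body jump for one player and a small arm sweep for another.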

Note:  I currently work as a school psychologist with students up to age 22. My main office is adjacent to a large OT and PT room at Wolfe, a program for students who have special needs.   We just had a large interactive whiteboard installed in the room that is begging for us to connect it with the school's Wii,  and soon (we hope), a Kinect.   If we are going to use dance games to help promote healthy activities among our special students, the games need to be accessible for students with cognitive, motor, and other limitations.

FIRST STEPS
Although I can dance, I understand what the world is like through the eyes of many of the young people I work with who have motor coordination and sensory integration problems that interfere with their ability to  move and dance, let alone access fast-paced dance games on the Wii or Kinect. 


My initial plan is to look at what the new dance games might be like from the view of someone who doesn't know how to dance and admits to having "two left feet", and perhaps no sense of rhythm. Where would I start?


Wii's Just Dance 2 seems to offer some support for learning how to dance through the use of simple movement icons, in the form of outlined figures, that provide information about how to move along with the dancer on the screen. As you can see from the video below, the gamer is given information about upcoming moves throughout the game.


I decided to take a look at Just Dance 2's MIKA "Big Girl (You Are Beautiful)" because some of the adolescent females I work with have weight concerns that interfere with their health. During the teen years, this can become a vicious cycle, resulting in less movement and less participation with peers in physical activities, such as playing dance games. If a teen has depression as part of this mix, we know that exercise can help, and a fun dance game might be a life-saver, in more ways than one.




The screen shots below show how the movement icons are used in the game:








I thought it would be useful to learn more about the story behind the making of Just Dance 2.
At 2:22, Alexia, the project's usability expert, makes her presence known. From what I can tell, she focused on aspects of the game that would make it more usable for non-dancers, including those with "two left feet". (I don't know if anyone was consulted about accessibility concerns for the game.)


Kinect Dance Central
Dance Central uses a different approach when it comes to "teaching" people how to dance along with the game. It would be interesting to test out Dance Central and Just Dance 2 with the same set of people to get a better feel for what works and what doesn't. Below is a video that previews, in split-screen, the interaction that takes place in Dance Central:

Dance Central Full Motion Preview


In Dance Central, gamers are provided with information about the moves through icons that cycle up the right-hand side of the screen. The level of dance coordination needed to keep up with the moves is challenging at times, even for people who are OK at dancing. Players can select dances according to level of difficulty.


Kinect Usability with Regular People

Steve Cable (CX Partners) shared his team's look at usability issues related to the Kinect by testing several games, including Dance Central, with groups of people, in his article, "Designing for XBox Kinect - a usability study". The quote below is from Steve's article:

"We’ve loved playing with the Kinect. There’s no doubt that the game play is lots of fun. In-game menus are a barrier to that fun. Kinect should allow players to move through menus quickly and compensate for inaccuracy.

We felt the Kinect would benefit from some standardised global controls – much like a controller uses the A button to select and the B button to move backwards. We also think it needs a more responsive pause gesture – one that doesn’t interfere with the user’s game play.

Most of our participants found the Dance Central menu to be more effective, more efficient and more satisfying to use. Here are our recommendations for designing a Kinect menu interface:
  1. Allow users to make selections through positive gestures, rather than timed positions
  2. Place options on a single axis to make them easier and quicker to select
  3. Allow users to control menus with the game pad if they prefer
  4. Use large easy to read text
  5. Don’t make users scroll through options unnecessarily – it takes too long
  6. Users will be distracted if used in a social setting – test your menus in a social context to see if they are prone to errors
  7. Avoid the cursor metaphor, it’s not what gamers are used to seeing in game menus, and makes it harder to implement alternative joypad controls"
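Recommendation 1 is the one that interests me most. As a rough illustration (my own sketch, not CX Partners' code), here's the difference between a timed "hover" selection and a positive-gesture selection; the dwell time and event names are made up for the example.

```python
# Hypothetical sketch contrasting two Kinect menu-selection models:
# 1) timed position: holding a cursor over an item long enough selects it
#    (easy to trigger accidentally, slow to do on purpose);
# 2) positive gesture: an explicit "push" event selects immediately.

DWELL_SECONDS = 2.0  # invented dwell time for the timed model

def timed_hover_select(hover_log):
    """hover_log: list of (item, seconds_hovered). Returns the first item
    hovered long enough -- including purely accidental hovers."""
    for item, seconds in hover_log:
        if seconds >= DWELL_SECONDS:
            return item
    return None

def positive_gesture_select(event_log):
    """event_log: list of (event, item). Only an explicit 'push' selects."""
    for event, item in event_log:
        if event == "push":
            return item
    return None

# A player drifts over "Quit" while dancing, then deliberately pushes "Play".
hovers = [("Quit", 2.5), ("Play", 0.4)]
events = [("hover", "Quit"), ("push", "Play")]
print(timed_hover_select(hovers))       # selects 'Quit' by accident
print(positive_gesture_select(events))  # selects 'Play' as intended
```

For players with motor coordination difficulties, the timed model is doubly punishing: it fires on unintended movement and demands sustained stillness for intended choices.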

Below are screen shots that provide examples of how the movement icons are displayed in Dance Central:







RELATED
Just Dance 2 Review: Get your body moving. No, really. Give it a shot.
Kexa MacDonald, 10/19/10







Feb 8, 2011

Toyota Vision Multi-Touch Wall at the 2011 Detroit Auto Show: Good example of a collaboration between PUSH, Juxt Interactive, GPJ, and Hush (updated)

GOOD EXAMPLE!

I didn't have the chance to get up to the Detroit Auto Show and interact with the iWall, so I watched the videos.


Toyota iWall from Justin Micklish on Vimeo.


From Justin Micklish's Vimeo site:
"Named "Belle of the Ball" at the 2011 Detroit Auto Show, the Toyota Vision Wall is a large-format interactive installation leveraging a great deal of UX intelligence and bleeding edge technology."


This project was a collaboration between a team at PUSH Offices and Juxt Interactive (lead agency), GPJ (experience marketers), and Hush (motion graphics).


Toyota Vision Multi-Touch Wall Case Study:  A good explanation of the "how and why" of this project:

Toyota Vision Multi-Touch Wall Case Study from JUXT Interactive on Vimeo.

From the JUXT Interactive Vimeo site:
"In January, Toyota debuted the largest digital multi-touch installation ever seen at the 2011 North American International Auto Show in Detroit. GPJ and JUXT Interactive created the digital touch wall experience to deliver a feast for the senses, allowing consumers to explore videos, photos and custom content through a larger-than-life, interactive surface."


RELATED
PUSH
"PUSH is a marketing & design research firm based in Venice, CA. We shirk labels and defy classifications. We have been called strategists, writers, technologists, marketers, typographers, entrepreneurs, and designers. Above all, we are curious. We make a living creating compelling experiences with our clients. Recent collaborations include work with Yahoo, West, and Sony."
Juxt Interactive 
"JUXT is a company that creates from the ground up, top to bottom, or even sideways. It kinda makes sense since we were started by a couple of architects....Lots of people ask us if we only do digital work. Well we like to think and play where people are, and these days that just happens to be the Internet. That suits us just fine because the online space has far less restrictions and barriers than other media. JUXT does not take kindly to limits, or being put in a box."
GPJ: George P. Johnson
"GPJ is the world's premier experience marketing agency, working with leading companies who are looking for innovative marketing solutions that motivate their audiences and activate their brands through live experiences."
Hush
"Hush is a design and direction studio crafting stories for digital, broadcast, mobile and physical installations. Whether using the latest technology or the most classic of techniques, we unify stories for agencies and brands alike."


Comment: This post was revised on 2/11/11 to reflect a clarification from Justin Micklish. He's a partner and producer with PUSH and worked on this project with others in the role of technology consulting and software development.

Jan 15, 2011

Video Interview of 114-Year-Old Walter Breuning: Words of wisdom from the timeline of his life.

"Every day is a good day...and make it that way!" - Walter Breuning



Some quotes from the video:

If you could give one piece of advice to today's youth, what would it be?

"I'd tell them to stay in school, get educated, as much as you can, because if you're not educated today, fully, all about computers and everything like that, you are not going to get a job, because you got to be fully qualified to go to work on any kind of job right now."


Walter Breuning's words to live by:
"Be good to everybody, be kind to them."
"People should be helping other people."
"The more you do for other people, the better you're going to help yourself."

Interview of Walter Breuning at age 114 (includes a multimedia timeline):
(Video starts at the end of the CSI ad)


Idea:
Although younger generations will leave the world digital legacies in some form or another, the ideas and views of our elders are not often found online. Wouldn't it be great if we could capture the wisdom of our elders in a broad way and get it on the web?

It would be great if students in schools around the world could work on a shared on-line multimedia timeline project like this, highlighting their elderly relatives and friends of the family!

If something like this exists, please let me know.

RELATED
Centuries of Wisdom From the World's Oldest Man
Aaron Saenz, Singularity Hub, 1/8/11

Cross-posted on the TechPsych blog.

Gem of a video clip: Joshua Tree Under the Milky Way

I'm always on the lookout for videos that work nicely on interactive whiteboards for students with multiple special needs, including autism.  The video below is a time-lapse of the Perseid Meteor Shower and the galactic core of the Milky Way, from Joshua Tree National Park.  It was produced by Henry Jun Wah Lee, of Evosia Studios.

Joshua Tree Under the Milky Way from Henry Jun Wah Lee on Vimeo.


Teachers of students with special needs might want to incorporate this short video clip into lessons (in this case, science). It provides a soothing touch through the use of music, and looks fantastic on a large interactive whiteboard. The video doesn't get boring when looped, making it great for students who require repetition of content.

RELATED
Under the Milky Way in Joshua Tree National Park

Jan 12, 2011

Multi-modal Interactive Maps for People with Visual Impairments: Featuring a Stantum multitouch screen with a tactile layer.

To learn more about this project, take a look at the video and related publications below. This is a great example of a team that is harnessing emerging technologies to improve the lives of people with disabilities.


Video: "Multimodal Maps for Blind People"


Website


Publications
Anke Brock, Philippe Truillet, Bernard Oriola, Christophe Jouffrais (IRIT CNRS and Université de Toulouse), Usage of Multimodal Maps for Blind People: Why and How
ITS’10, November 7–10, 2010, Saarbrücken, Germany
Paper: http://www.irit.fr/~Philippe.Truillet/projects/doc/MultimodalMapsForTheBlind-ITS10.pdf
Poster: http://www.irit.fr/~Philippe.Truillet/projects/doc/Poster-ITS10.pdf

Stantum (Multi-touch screen used for the application.)
Ivy Middleware (Used in this application.)

Cross-posted on the TechPsych blog.

Jan 10, 2011

Technology Out and About: DOOH at the Porsche Design Shop in St. Martin (Video)

Video Window, Porsche Design, St. Martin from Lynn Marentette on Vimeo.


I came across this video display at a Porsche Design shop in St. Martin. The video was well-done, but the display was difficult to see from a distance. The saleswoman wasn't sure who had produced the content.

I thought that it would be more interesting if the content were interactive: something to do while my husband shopped!

ImageFlow for streaming image search; Content-Aware Dynamic Timeline for Video Browsing, and more from Microsoft Research

I thought I'd share a couple of interesting videos about the work of some researchers at Microsoft Research and their colleagues:



RELATED
Japani, V., Ramos, G., and Drucker, S.  ImageFlow: Streaming Image Search (pdf) (Microsoft Research Publication, 11/18/2010)
Link to abstract

Fisher, D., Drucker, S., Fernandez,R. and Ruble,S. Visualizations Everywhere: A Multiplatform Infrastructure for Linked Visualizations, in Transactions on Visualization and Computer Graphics, IEEE, Salt Lake City, UT, November 2010


The following video and article focus on some research that I think will be very useful for creating interactive multimedia timelines in the future. I'd like to learn more about ways this interaction can be implemented in gesture-based systems!





Content-Aware Dynamic Timeline for Video Browsing
From Gonzalo's YouTube channel: "When videos have more frames than pixels in the player's timeline slider, frames become inaccessible and scrolling actions cause sudden jumps in a video's continuity. We propose a content-aware dynamic timeline control that decouples video speed and playback speed and allow salient shots to be presented at an intelligible speed."
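The core idea, decoupling video speed from playback speed, can be sketched simply: give each shot a saliency score and play it back at a speed inversely related to that score. This is my own toy interpretation, not the researchers' algorithm; the scores and speed bounds are invented.

```python
# Toy interpretation of a content-aware timeline: salient shots play
# slowly enough to be intelligible, low-saliency stretches fast-forward.
# Saliency scores in [0, 1] and the speed bounds are invented for the sketch.

MIN_SPEED = 1.0   # highly salient shots play at normal speed
MAX_SPEED = 8.0   # boring stretches fast-forward

def playback_speed(saliency):
    """Map saliency in [0, 1] to a playback-speed multiplier:
    saliency 1.0 -> MIN_SPEED, saliency 0.0 -> MAX_SPEED."""
    saliency = max(0.0, min(1.0, saliency))  # clamp out-of-range scores
    return MAX_SPEED - (MAX_SPEED - MIN_SPEED) * saliency

# Per-shot saliency for a short clip: dull, important, in-between.
shots = [0.1, 0.9, 0.5]
speeds = [round(playback_speed(s), 2) for s in shots]
print(speeds)  # [7.3, 1.7, 4.5]
```

Scrubbing the timeline then moves through dull footage quickly while the salient shots stay watchable, which is what makes the sudden-jump problem go away.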


Article
Pongnumkul, S., Wang, J., Ramos, G., Cohen, M.  Content-Aware Timeline for Video Browsing (pdf). UIST '10, ACM


Note: The researchers are investigating the use of audio/sound in a similar manner, making this form of video browsing/search truly multimedia. An example of this is included near the end of the above video clip.


The following video is a demonstration of something I could use in my work as a school psychologist, since I use a lot of video for assessing students with disabilities, including those with autism.  I also use video to create digital social stories for many of the students.

Dec 22, 2010

Teach Parents Tech website by Google employees - gotta love it - it includes tech "how-to" care package videos!

Google employees know what it is like to play the role of the extended-family tech support person. For the holidays - and beyond - they've created a series of how-to videos that might prove useful to parents and other extended family members who are interested in joining World 2.0 but need some sort of useful roadmap.


Teach Parents Tech is a great website to visit to learn the basics and a bit more. Here is the introductory video:



There is a "how-to" video for nearly everything. Below is a screen shot of the home page, which lets you create a customized tech support "care package" that you can email to a parent:


Video School Online: Free from Vimeo

Prosumers, DIY enthusiasts, hobbyists, multimedia wannabes, and even a few film or video pros might want to take a look at Vimeo's Video School Online.


I'd like to use a dolly for a couple projects, and found the following video on the Vimeo Video School website that gives a great step-by-step demonstration of how to make your very own dolly for about $45.00:

My DIY Dolly from Knut Uppstad on Vimeo.





Interesting animation made with Google Docs presentation app. (Google Demo Slam), via Flowing Data

The video below was an entry in the Google Demo Slam, an effort started by Google to spread the word about its innovative technologies. By the time I learned of Epic Docs Animation, the video had over 800,000 views. I plan to view a few more Google Demo Slam videos over the holiday break!


-Tu+, Namroc, and Metcalf 

For more information and Demo Slam videos, visit Google's Demo Slam website: "Welcome to Demo Slam, Where Amazing Tech Demos Battle for Your Enjoyment"

RELATED
Epic animation in Google Docs
Nathan Yau, Flowing Data, 12/22/10
Google's rationale for creating Demo Slam:
"We spend our time making a whole bunch of technologies that are free for the world, but a lot of people don't even know about them. And that kind of sucks. So, we thought organizing the world's most creative tech demo battle would be a great way to help spread the word and teach people about tech. Not to mention, it is a lot of fun."
About Demo Slam
Hall of Demo Champs

Dec 14, 2010

Short documentary of the story behind the Reactable, a tangible user interface for creating music. (Includes an interview of Joel Bonasera, of Charlotte's Discovery Place museum.)

The following video provides a look into the history of the Reactable,  from the initial paper prototypes to the present, including the Reactable Mobile application designed for the iPad, iPhone, and iPod touch.   The video includes interviews of Sergi Jorda and Gunter Geiger, members of the original team at Pompeu Fabra University (Barcelona) that created the Reactable.  The other team members are Martin Kaltenbrunner and Marcos Alonso.




FYI:  At about 2:34 in the video, Joel Bonasera briefly discusses the Reactable installation at Charlotte's Discovery Place museum. Joel is a project manager at Discovery Place.


RELATED
How the Reactable Works
John Fuller, howstuffworks
Music Technology Group, Pompeu Fabra University
Reactable Website
Reactable Concepts
Reactable History
Discovery Place
Interactive Technology in the Carolinas: Discovery Place Science Center
(Includes a short video clip I took of the Reactable at Discovery Place)

Dec 12, 2010

Interactive Surveillance: Live digital art installation by Annabel Manning and Celine Latulipe

Interactive Surveillance, a live installation by artist Annabel Manning and technologist Celine Latulipe, was held at the Dialect Gallery in the NoDa arts district of Charlotte, N.C. on Friday, December 10th, 2010. I attended this event with the intention of capturing some of the interaction between the participants and the artistic content during the experience, but I came away with so much more. The themes embedded in the installation struck a chord with me on several different levels.


Friday's version of Interactive Surveillance provided participants the opportunity to use wireless gyroscopic mice to manipulate simulated lenses on a large video display. The video displayed on the screen was a live feed from a camera located in the stairway leading to the second-floor gallery.  When both lenses converged on the screen, a picture was taken of the stairway scene, and then automatically sent to Flickr. Although it was possible for one person to take a picture of the scene holding a mouse in each hand, the experience was enhanced by collaborating with a partner.
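For readers curious about how such an interaction might work under the hood, here is a minimal sketch of the convergence-and-capture logic described above. The names, the lens radius, and the upload step are my own assumptions for illustration; they are not details of the actual installation software.

```python
import math

LENS_RADIUS = 60  # assumed lens radius in pixels (hypothetical value)

def lenses_converged(lens_a, lens_b, radius=LENS_RADIUS):
    """Two circular lenses 'converge' when their circles overlap."""
    return math.dist(lens_a, lens_b) < 2 * radius

def on_frame(lens_a, lens_b, capture, upload):
    """Called once per video frame with the two mouse-driven lens positions.

    When the lenses converge, grab a still of the live stairway feed
    (capture) and send it off (upload), e.g. to a Flickr photostream.
    Both capture and upload are stand-in callables here.
    """
    if lenses_converged(lens_a, lens_b):
        photo = capture()   # snapshot of the current video frame
        upload(photo)       # hand off to the photo-sharing step
        return True
    return False
```

In a real installation the `on_frame` check would run inside the video loop, with each wireless mouse driving one lens position.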

In another area of the gallery, guests had the opportunity to use wireless mice to interact with previously recorded surveillance video on another large display. The video depicted people crossing desert terrain at night from Mexico to the U.S. In this case, the digital lenses on the screen functioned as searchlights, illuminating, and targeting, people who would prefer not to be seen or noticed in any way. On a nearby wall was another, smaller screen displaying the same video content as the larger screen. This interaction is demonstrated in the video below:



A smaller screen was set out on the refreshment table so participants could view the Flickr photostream of the "surveillance" pictures taken of the stairway.   On a nearby wall was a smaller digital picture frame that provided a looping video montage of Manning's photo/art of people crossing the border.

The themes explored in the original Interactive Surveillance include border surveillance, shadow, and identity, delivered in a way that creates an impact beyond the usual chatter of pundits, politicians, and opinionators. The live installation added another layer to the event by allowing participants to be the target of the "stairway surveillance" as well as to play the role of someone who conducts surveillance.

Reflections:
In a way, the live component of the present installation speaks to the concerns of our present era, where the balance between freedom and security is shaky at best. It is understandable that video surveillance is used in our nation's efforts to protect our borders. But in our digital age, surveillance is pervasive. In most public spaces it is no longer possible to avoid the security camera's eye.  Our images are captured and stored without our explicit knowledge. We do not know the identities or the intentions of those who view us, or our information, remotely. 

We are numb to the ambient surveillance that surrounds us. We go about our daily activities without noticing it. We are silently tracked as we move across websites, dart in and out of supermarkets and shopping malls, and pay for our purchases with plastic. Our smartphones know where we are located and will give out our personal information if we are not vigilant, as our default settings are often "public".

It is easy to forget that the silent type of surveillance exists.  It is not so easy to ignore more invasive types of "surveillance".  We must agree to submit to a high degree of inspection in the form of metal detectors, baggage searches, and in recent weeks, uncomfortable physical pat-downs, for the privilege of traveling across state borders by plane, within our own country.  In some airports, we are subject to whole-body scans that provide strangers with views of our most private spaces. We go along with this effort and prove our innocence on-the-spot, for the greater good.   Conversely, we have multiple means of conducting our own forms of surveillance, through Internet searches, viewing pictures and videos posted to the web, and playing around with Google Streetview. 

As I wandered around the Dialect Gallery with my video camera, I realized that I was conducting my own form of surveillance, adding another layer to the mix. Unfortunately, some of the time I had my camera set to "pause" when I thought I was filming, and vice versa, and as a consequence I did not capture people using the wireless mice to interact with the content on the displays. I went ahead with my mission and created a short video reflection of my impressions of Interactive Surveillance. If you look closely at the video between :40 and :47, you'll see some people across the street from the gallery whom I unintentionally captured; now they are part of my surveillance.

Although the video below was hastily edited, it includes music and sounds from the iMovie library that approximated the "soundtrack" that formed in my mind as I experienced the exhibit.

To get a better understanding of Interactive Surveillance,  I recommend the following links:


Barbara Schrieber, Charlotte Viewpoint



Video Reflection of Interactive Surveillance (Lynn Marentette, 12/10/10)

Live Installation: Interactive Surveillance, by Annabel Manning and Celine Latulipe from Lynn Marentette on Vimeo.



Interactive Surveillance Website



Interactive Surveillance Flickr Photostream