I thought I'd share a couple of interesting videos about the work of some researchers at Microsoft Research and their colleagues:
Japani, V., Ramos, G., and Drucker, S. ImageFlow: Streaming Image Search (pdf). Microsoft Research Publication, November 18, 2010.
Fisher, D., Drucker, S., Fernandez, R., and Ruble, S. Visualizations Everywhere: A Multiplatform Infrastructure for Linked Visualizations. IEEE Transactions on Visualization and Computer Graphics, Salt Lake City, UT, November 2010.
The following video and article focus on research that I think will be very useful for creating interactive multimedia timelines in the future. I'd like to learn more about how this interaction could be implemented in gesture-based systems!
Content-Aware Dynamic Timeline for Video Browsing
From Gonzalo's YouTube channel: "When videos have more frames than pixels in the player's timeline slider, frames become inaccessible and scrolling actions cause sudden jumps in a video's continuity. We propose a content-aware dynamic timeline control that decouples video speed and playback speed and allow salient shots to be presented at an intelligible speed."
Pongnumkul, S., Wang, J., Ramos, G., and Cohen, M. Content-Aware Timeline for Video Browsing (pdf). UIST '10, ACM.
Note: The researchers are also investigating the use of audio in a similar manner, which would make this form of video browsing and search truly multimedia. An example of this is included near the end of the video clip above.
The following video demonstrates something I could use in my work as a school psychologist: I rely on a lot of video when assessing students with disabilities, including those with autism, and I also use video to create digital social stories for many of my students.