I have the Leap Motion dev kit and can't wait until I can use it with Google Earth. Hopefully I'll find time tonight after I get home from work! For now, here is the promotional video:
The Affinity+ concept has the potential to be useful in educational settings such as schools, museums, and libraries. Although it was designed to support collaborative activities among software designers/developers, it could support a wide range of collaborative project-based learning activities. The clearly narrated video below was produced by a team from the Pacific Northwest National Laboratory.
"Affinity diagraming is a powerful method for encouraging and capturing lateral thinking in a group environment. The Affinity+ Concept was designed to improve the collaborative brainstorm process through the use of large display surfaces in conjunction with mobile devices like smart phones and tablets. The system works by capturing the ideas digitally and allowing users to sort and group them on a large touch screen manually. Additionally, Affinity+ incorporates theme detection, topic clustering, and other processing algorithms that help bring structured analytic techniques to the process without requiring explicit leadership roles and other overhead typically involved in these activities." -PNNL RELATED Affinity+ Semi-Structured Brainstorming on Large Displays Russ Burtner, Richard May, Randy Scarberry, Ryan LaMothe, Alex Endert Pacific Northwest National Laboratory
Photo via Engadget. About four years ago I almost rolled on the floor laughing at Fred Armisen's parody of CNN's Magic Map Wall during the Weekend Update segment of Saturday Night Live. When I came across the video clip, I couldn't resist sharing it on my blog, and thought I'd share the video clip once again in honor of Election Day:
My 10/25/08 post, Multi-touch Parody of CNN's Magic Map Wall by Fred Armisen on SNL's MegaPixel Giant Touch-map, was written when large touch-screen displays were beginning to gain steam in a variety of markets. At the time, not all of the kinks had been worked out, and there were few people around who knew much about developing programs for use on these displays. Since then, there have been many improvements in both hardware and software. Jeff Han, of Perceptive Pixel, provided the multi-touch screen system for CNN and the SNL parody. He recently sold Perceptive Pixel to Microsoft and now works as a general manager for Microsoft Office. He's spreading the word about Microsoft's venture into large multi-touch displays (55 and 82 inches), powered by Windows 8. For more information about Jeff Han and Microsoft Office, take a look at my blog post featuring a recent video of Jeff Han's presentation about Windows 8 for large displays, Microsoft's new multi-touch, pen, and ink technology. The video also includes a presentation about experience design considerations for large displays, by Nathan Fish.
I was at the auto dealership to get my car's oil changed a few weeks ago and noticed a large interactive display that featured an in-depth explanation of Hyundai's BlueLink technology. The display also provided touch-screen interaction to explore information and view videos about the features of new cars.
My toddler grandson loves cars, so when he came to visit, I brought him along. Since it is difficult to interact with a touch-screen display while discreetly videoing the experience with a toddler in tow, I wasn't able to spend much time exploring the display's features. The following video is what I managed to capture.
I am still hunting down information about the story behind the display.
RELATED
HYUNDAI's Interactive BlueLink website
Overview of BlueLink
The Tube (2008): This project was installed in 180 Hyundai dealerships in the U.S. It was created with a Papervision3D menu to display media types such as video, audio, animation, and zoomable bitmaps. The application was designed to run within a local network. I'm not sure if the display I saw at my local dealership was an updated version of this project or something new.
FlatFrog Multitouch is a company based in Sweden, founded by Ola Wassvik and Christer Fåhraeus. The technology supports 20+ simultaneous touches and recognizes object size, a useful feature. FlatFrog screens can be optimized for a wide range of lighting conditions. FlatFrog's multi-touch and gesture interaction is featured in the short video clips below.
FlatFrog is gearing up for commercial release. According to the FAQs on the website, "all sizes are possible, from 5" to 100" and upward." Promethean is one of the company's investors, and there is a volume manufacturing agreement with Kortek Corporation, known for industrial and gaming displays.
Thanks Touch User Interface for sharing this information!
(Touch User Interface is the blog for Sensible UI, known for the ArduMT, aka the Arduino Multi-touch Development Kit)
"T(ether) is a novel spatial aware display that supports intuitive interaction with volumetric data. The display acts as a window affording users a perspective view of three- dimensional data through tracking of head position and orientation. T(ether) creates a 1:1 mapping between real and virtual coordinate space allowing immersive exploration of the joint domain. Our system creates a shared workspace in which co-located or remote users can collaborate in both the real and virtual worlds. The system allows input through capacitive touch on the display and a motion-tracked glove. When placed behind the display, the user’s hand extends into the virtual world, enabling the user to interact with objects directly." -Vimeo
For more pictures and information, see the following post on the Creative Applications Network website: T(ether) [Cinder] Filip Visnjic
Multitouch Display for Business Science Park Aurorum, by NUITEQ
"NUITEQ developed a customized multi-touch software solution for Corporate Reception / Lounge Areas for Business Science Park Aurorum in Luleå, based on the award-winning Snowflake Suite framework. In addition to the software, NUITEQ delivered and installed a 32 touch points multitouch dreaMTouch LCD from Germany based Elektrosil."-NUITEQ
"Virtual anatomy surface computer in the shape of autopsy table that
show and enable the user to use hand gestures to do the anatomy process
virtually."-Innovation Now
The people at Stantum have been working hard to improve multi-touch technology, focusing on smaller tablet-sized systems. Stantum is a company I've been following for several years, from the time it was known as Jazz Mutant. I have been impressed by Stantum's focus on the needs of people as well as the company's careful attention to important details.
I'm pleased to see that the company has an idea of how its multi-modal technology can support multi-touch in education: "Ambidexterity and multi-modality are the two pillars of Stantum's core project – making the use of touch-enabled devices more creative and productive. Amongst others, there is one field of application where we truly see a soaring need for ambidexterity and multi-modality – augmented textbooks." -Guillaume Largillier
At the Society for Information Display's Display Week exhibition this past May, Stantum introduced a new palm rejection feature for its Interpolated Voltage Sensitivity technology. This technology provides users with a more natural way to interact with the interface and application content on tablets. The technology supports Android's multi-touch framework and is also Windows 7 certified. The palm rejection feature will be a welcome improvement for future multi-touch applications designed for education settings, where it is likely that more than one hand (or person) might be interacting with content on the screen at the same time.
Below are two videos that provide a glimpse of Stantum's innovations:
Stantum's technology can enable ten simultaneous touches, is highly responsive, and supports high-resolution content. According to a May press release, "Palm rejection is available as an API (application programming interface) to Windows and Android operating systems on x86 and ARM platforms. IVSM touch modules are offered to OEMs through the company’s Qualified Manufacturers Partners, comprising tier-one touch-screen manufacturers with high-volume production capabilities. More information is available at info@stantum.com"
I thought I'd share some examples of interesting interactive multimedia sites on the web. It seems that artists, musicians, and ad agency folks have been experimenting with tools such as HTML5, SVG, Canvas, and WebGL. Some of this work is featured on Google's Chrome Experiments website, and other examples can be found on websites promoting Wrangler Jeans or IKEA furniture. This sort of content is great on a larger display.
Take some time to watch the videos and explore the links below. Enjoy!
"Choreographed windows, interactive flocking, custom rendered maps, real-time compositing, procedural drawing, 3D canvas rendering... this Chrome Experiment has them all. "The Wilderness Downtown" is an interactive interpretation of Arcade Fire's song "We Used To Wait" and was built entirely with the latest open web technologies, including HTML5 video, audio, and canvas."
ROME: "3 Dreams of Black", an Interactive Film by Chris Black (The link leads to the interactive site.)
"IKEA is now launchig the Kokokaka produced A Better Sleep for Everyone campaign site, which features IKEA's bedding catalog. 6 different mattresses are shown by 6 Swedish artists, each interpreting a classic lullaby performed in a dreamy and surreal music video. By scrolling up and down the user can change between the artist's music videos and the different mattresses. Experience, for instance, a soulful Tingsek having problems falling asleep. ust like the princess from the famous fairly tale he gets annoyed by something hard under the pile of mattresses. But guess what? It's not a pea, it's Tingsek's band! Let yourself fall asleep to beautifully performed lullabies!"
Agency: Forsman and Bodenfors; Film Production: Social Club; Director: RBG6;
Music: Music Super Circus; Web Production: Kokokaka; Photographer: Carl Nilsson
Below are videos of two of the lullabies featured in the interactive ad:
"This is a campaign to promote IKEA's wardrobe solutions. IKEA wanted to show their huge range of styles and all the smart features on the inside. All the movements on the web site are controlled by sound and music. So change songs, upload your own music, play on your keyboard or sing into the microphone."
RELATED Cacophony: An interactive video player in HTML5 and Javascript "The basic elements of a Cacophony video are: An HTML5 Video on the base layer, a series of HTML5 Canvas layers above that, a timeline of effects to be triggered to the beat of the song, images and other elements to be used by the effects"
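The beat-synced "timeline of effects" idea Cacophony describes is simple to sketch: schedule effects on beat numbers, convert beats to seconds using the tempo, and fire each effect as playback time passes its cue. The class below is purely illustrative (the names are mine, not Cacophony's actual JavaScript API):

```python
class EffectTimeline:
    """Illustrative sketch of a beat-synced effect timeline:
    effects are cued on beat numbers and fired as playback advances."""

    def __init__(self, bpm):
        self.seconds_per_beat = 60.0 / bpm
        self.cues = []       # list of (beat, effect_name), kept sorted
        self.next_index = 0  # index of the next cue not yet fired

    def add(self, beat, effect_name):
        self.cues.append((beat, effect_name))
        self.cues.sort()

    def update(self, playback_time):
        """Return the effects whose cue time has been reached."""
        fired = []
        while self.next_index < len(self.cues):
            beat, effect = self.cues[self.next_index]
            if beat * self.seconds_per_beat > playback_time:
                break  # this cue (and all later ones) is still in the future
            fired.append(effect)
            self.next_index += 1
        return fired
```

At 120 BPM a beat lasts 0.5 seconds, so an effect cued on beat 4 fires once playback passes the 2-second mark.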
Below is a video of an interactive media wall at the Wisconsin Institutes for Discovery, the winner of the 2011 Award of Excellence from the Digital Screenmedia Association in the category of Best Government/Education/Non-Profit Agency Deployment.
Detailed information about the Discovery Wall, including an overview of the technology, objectives of the deployment, and the positive outcomes of this project, can be found on the Digital Screenmedia website.
"Float4 Interactive is a creative technology company that develops interactive systems for entertainment, advertising and design applications."
The video wall below streams 4,500 videos through Fusion-io's NAND flash card. It can handle 1 million transactions per second, the equivalent of 6 gigabytes of throughput per second, according to a recent Computerworld post by Lucas Mearian.
SOMEWHAT RELATED Driving Data Warehousing with ioMemory Fusion-io Whitepaper 1/11/11 Transcription: Fusion-io CEO David Flynn on Enabling a New Class of Cloud Computing Apps Bert Latamore, Wikibon, 4/8/11 "We're talking about a fundamental new building-block. So it impacts and will impact everything in the entire data center. In the database world it typically means that a database server can do about 10X the throughput, for the same server. And those queries are answered 30%-40% faster. So it means faster page loads, more throughput per server. So Answers.com retrofitted their MySQL scale-out database tier and saw 9X the throughput per server. What they chose to do was to shrink the database farm four-to-one. So they got a 75% consolidation, and with that remaining one-out-of-four servers they were still getting more than twice the throughput they had before." - David Flynn
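Flynn's consolidation figures are internally consistent, and the arithmetic is worth spelling out: keeping one server out of four is a 75% consolidation, and if each retrofitted server does 9X the throughput, the shrunken farm still delivers 9/4 = 2.25X the original total:

```python
# Sanity check of the figures in Flynn's Answers.com example.
per_server_gain = 9   # 9X throughput per retrofitted MySQL server
servers_before = 4    # farm shrunk four-to-one...
servers_after = 1     # ...to one remaining server out of four

consolidation = 1 - servers_after / servers_before               # fraction of servers removed
total_throughput_gain = per_server_gain * servers_after / servers_before

print(consolidation)          # 0.75  ("a 75% consolidation")
print(total_throughput_gain)  # 2.25  ("more than twice the throughput they had before")
```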
I came across this video display at a Porsche Design shop in St. Martin. The video was well-done, but the display was difficult to see from a distance. The saleswoman wasn't sure who produced the content.
I thought that it would be more interesting if the content was interactive: something to do while my husband shopped!
Catching up on reading the MIT Technology Review, I came across an article written by Nidhi Subbaraman about the use of graphene to make flexible displays:
The most recent graphene films were created by researchers in Korea at Sungkyunkwan University, in collaboration with Samsung. According to the article, graphene was discovered over thirty years ago, but only recently have researchers been able to produce it in large mono-layers. This flexibility opens up possibilities for future display applications, as noted in the video clip below the photos. Photo Credit: Byung Hee Hong, SKKU. Photo Credit: Impact Lab, "Future Applications of Graphene"
It is about time for an update about touch/gesture-interactive technologies.
I've been researching the latest in "touch" screens and new developments in interactive multi-media content. In just one year, a multitude of websites have been transformed from static to interactive.
Although the initial objective for some of these websites was to optimize the interface and navigation for people accessing websites via touch-screen cell phones, some are ideal for use on touch-enabled slates, the iPad, and even larger touch screen displays and surfaces.
Convergence seems to be the buzz word of the day. Interactive TV. Game sets with Internet access. Movies on your cell phone. Touch screen Coke machines displaying movie trailers. What's happening now, and what is next?
I welcome input from my readers in the form of links to websites, university labs with grad students and professors who are obsessed with emerging interactive technologies, proof-of-concept video clips, video clips of related technologies that are new-to-market, etc.
FYI: I'm also in the middle of writing a series of posts about 3D television technologies for the Innovative Interactivity blog, and welcome input from my readers about this topic.
When I return to graduate school (hopefully I'll have the means to attend full-time), I want to flesh out my ideas for an "interactive multi-dimensional multi-media multi-user timeline" for use on interactive multi-touch/gesture tables and displays. Although I've limited my work to a prototype of a template, I know that this concept won't work unless the application can incorporate an efficient means of handling large volumes of data, as well as data in various formats.
I want this template to be useful to people in a variety of contexts, such as students studying world history and humanities, education administrators looking at educational data over time, producers and viewers of interactive documentary programs (think interactive TV), the health industry, urban planners, the military, serious games, etc.
One of my stumbling blocks is how all of the data would be stored and analyzed. What I learned a few years ago in my computer classes simply won't work.
So now what?! I think that Roger Magoulas, the director of research at O'Reilly, has some good things to say about the critical problem of handling what he calls "Big Data". Here are a few videos that I think are worth watching.
The Future of Work
Part One
Next Device (SmartPhones, netbooks, creation & consumption factors - supporting usability in multiple contexts)
You Tube Series: O'Reilly Media
Big Data: Technologies & Techniques for Large-Scale Data (Emphasis on experimental approach) Part I
Part II (Discusses new forms of databases and the use of parallel processors to handle Big Data)
Part III Key Technology Dimensions
Part IV: Focus on hardware, including solid-state disks and a new data structure called the "triadic continuum," which handles real-time data and ongoing probability estimates of data.
I would be happy to hear from anyone who is working on a project similar to the one I'm working on as a "hobby".
RELATED
Triadic Continuum "Phaneron, KStore, Knowledge store, or simply K, is a dynamic data model that is based on the cognitive theory of C. S. Peirce. Phaneron efficiently organizes data into a unique, compact, interconnected, and fully-related data model. Phaneron is constructed using the Triadic Continuum."
For those of you who like visual representations of geeky-techy concepts, here a few visuals and related descriptions of KStore fundamentals from the Triadic Continuum website:
"The KStore data model is constructed using the basic triad. For example, the event sequence 'cat' would be recorded as shown in 'a sequence' below. A new level of nodes is created above a lower level of nodes as a result of the triadic process. In this case the lower level of nodes contains a node for each character of the alpha-numeric character set and the new nodes reference the lower level nodes to record the sequence 'cat'. Each sequence is initialize with a reference to a BOT (beginning of thought) and terminated with an EOT (end of thought) reference."
"The data set above was used to create the K structure below with the lowest level that contains the alpha-numeric character set, the second level is created to record sequences that represent the field variables. Then a third level is created using the field variables of the second level to record the record sequences. Records recorded in this K structure reuse the field variable nodes so that these field variable sequences never have to be recorded more than once. This is just one of the attributes of a K structure that makes it very efficient." -Triadic-conintuum.com
Personal Note: Due to the economic downturn and its impact on my family (two kids in college), I returned to work full time in mid 2008. I have a very busy day job as a school psychologist, working at two high schools as well as a program for students with multiple, severe disabilities, including autism. This has limited my ability to work on my project.
Multi-touch, multi-media, multi-modal... the Fujitsu LIFEBOOK T4310 looks like it provides multiple possibilities for people from all walks of life: -GestureWorks
The LIFEBOOK T4310 comes with an integrated web cam and fingerprint reader, and a variety of I/O options, including HDMI, USB, Firewire, BlueTooth, LAN, analog video output, SD card reader, line in/out, a wireless switch for the integrated 3G and UMTS, and an express card reader. There is an optional modular bay that can accommodate an additional drive or battery.
The fun part is that the Fujitsu LIFEBOOK T4310 comes with Microsoft's Touch Pack applications, which are demonstrated in the video below:
Microsoft Surface globe
Surface Collage
Microsoft Rebound touch game
Surface Lagoon screensaver, which provides a water-ripple effect and little fish that respond to touch interaction.
As demonstrated in the video, the LIFEBOOK supports gesture, multi-touch, stylus, and traditional keyboard interaction. The capacitive display has a bi-directional hinge, allowing it to be turned 360 degrees and positioned facing up horizontally over the keyboard. (This feature would be welcome in educational settings, as it could support paired and group collaborative learning activities.)
What I like about the LIFEBOOK is that it has an integrated ambient light sensor, which automatically adjusts the brightness of the display according to the level of light in the environment. This feature is important for people like me who are on the go and must use their computing devices under a range of lighting conditions.
I would love to get my hands on the Fujitsu LIFEBOOK T4310 for a month and test the system in-depth in my day-job as a school psychologist, and in my leisure pursuits as a UX/designer/developer/musician/gamer/etc, "hobbyist".
According to Lichtman, "telepresence is the science and art of creating visual collaboration environments, networks, and strategies that duplicate in-person meeting experiences as completely as possible in both internal and external business communications. Effectively leveraging telepresence as an organizational and collaborative strategy can improve productivity and effectiveness by enhancing business communication, collaboration, and reducing physical travel." (I'm a bit short on time today, so for more information regarding telepresence, take a look at the various links I've posted on this page.)
Here are a few items from Lichtman's post:
Musion Announces First Live Transatlantic Interactive 3D Hologram Public Broadcast