I'd like to see how these concepts play out on a multi-touch, multi-user table system!
In this video, Jeffrey Heer, from the Computer Science Department at Stanford University, presents his work in the field of interactive information visualization, focusing on visualization techniques, visualization tools, and collaborative visualization.
The video is nearly one hour long, but worth it if you are a techie who is interested in information visualization.
Another option - scan through the following presentation:
Voyagers and Voyeurs: Supporting Social Data Analysis
Infoblog
Focused on interactive multimedia and emerging technologies to enhance the lives of people as they collaborate, create, learn, work, and play.
Sep 11, 2009
Ru Zarin's Interactive Multi-touch Applications for the Cognitively Disabled: Social stories, talking symbols, a piano, and more!
Trollskogen Final Demo from Ru Zarin on Vimeo.
The above video is a demonstration of Ru Zarin's master's degree project at Umeå University.
This project examined ways to use multi-touch technologies to support communication and learning among children with Down syndrome, autism, and related diagnoses within the age group of 5-8. I especially liked the animated social story application.
"Trollskogen is a communicative framework designed to enhance communication among people with cognitive disabilities. The forest is split up into interactive modules that provide a fun and engaging learning environment while helping improve on certain aspects of speech, reading/writing and symbol based languages. This framework has been deployed on a custom multi-touch table prototype built at the Interactive Institute Umeå, enabling the children to interact with their fingers in a more natural, intuitive way rather than a traditional keyboard/mouse setup."
RELATED READING
RELATED VIDEOS OF THE PROJECT DEVELOPMENT
Ru collaborated with the illustrator Linda Arvidsson for the characters and objects found in the troll forest theme.
I especially liked the Troll Forest Cabin, which is shown in the social stories section of the video.
Cross-posted on the TechPsych blog.
Posted by Lynn Marentette
Sep 10, 2009
Concave Scream's "Soundtrack for a Book": interactive music and visuals, nice and relaxing on a touch screen!
I found a link to an interesting audio-based interactive website today. The Concave Scream "Soundtrack for a Book" site is a promotional website for the band's new album of the same name. I had never heard of this band before, but I'm impressed with the touch screen interactivity that the website provides!
It was awarded the FWA (Favorite Website Award) earlier this year.
Posted by Lynn Marentette
Sep 7, 2009
Stantum's Mobile Phone Multi-touch Interface: Demonstration of Precise Interactions on a Resistive Touch Screen
For more information, read Paul Miller's article (Engadget), Stantum's mind-blowing multitouch interface on video!
Paul Miller says:
"The PMatrix multitouch firmware allows for unlimited inputs, detection of any contacting object (a finger, a stylus or even a paintbrush) and pressure sensitivity. We played with the demo unit for a bit and were frankly blown away, it's far and away the best touch experience we've ever seen or felt, and the multitouch functionality is just gravy on top. Stantum is targeting resistive touchscreens because they're still considerably cheaper to build than capacitive ones, and from our perspective there seems to be zero tradeoff -- for sensitivity and accuracy this destroys everything else we've seen on the market, capacitive or not."
How it works: Tech Specs from Stantum:
PMatrix: The unparalleled Multi-Touch IP core and detection firmware (pdf)
"PMatrix™ is the most powerful software engine for Multi-Touch technology. It runs as a chip firmware or software on controllers or any embedded processing devices. PMatrix scans and delivers an exact representation of what’s happening on the touch-panel. The Multi-Touch driver dynamically updates the cursor list, enabling any application to control the Graphical User Interface objects."
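The "cursor list" idea in that description can be illustrated with a small sketch. Below is a hypothetical Python illustration of how a touch driver might keep stable ids across scans by matching each new contact point to the nearest existing cursor; this is not Stantum's actual firmware, and the matching radius is a made-up parameter.

```python
# Hypothetical sketch of a multi-touch "cursor list": each scan of the panel
# yields raw contact points, and the driver matches them to existing cursors
# by proximity so every touch keeps a stable id over its lifetime.
import itertools
import math

class CursorList:
    def __init__(self, match_radius=30.0):
        self.match_radius = match_radius   # max movement between scans (panel units)
        self.cursors = {}                  # cursor id -> (x, y)
        self._ids = itertools.count(1)

    def update(self, points):
        """Match this scan's contact points to known cursors, assign fresh ids
        to new touch-downs, and drop cursors that lifted off."""
        unmatched = dict(self.cursors)
        next_cursors = {}
        for x, y in points:
            # the nearest existing cursor within the radius keeps its id
            best = min(unmatched, default=None,
                       key=lambda cid: math.dist(unmatched[cid], (x, y)))
            if best is not None and math.dist(unmatched[best], (x, y)) <= self.match_radius:
                next_cursors[best] = (x, y)
                del unmatched[best]
            else:
                next_cursors[next(self._ids)] = (x, y)   # new touch-down
        self.cursors = next_cursors                      # lifted touches vanish
        return self.cursors
```

Feeding two scans through it shows why this matters: a finger that moves a few units between scans keeps the same id, so an application can track gestures per finger.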
Stantum's rationale for choosing resistive rather than capacitive touch technology:
Why resistive?
In comparison to capacitive sensing, resistive technology brings:
• Versatility: it detects fingers, gloves, styluses, and any contact object
• A lower solution cost
• Lower power consumption
• Faster scanning rate
• Lower overall latency
• Precision is linear (same in the center and at the borders)
• Performance is not affected by the size of the touch-panel or the number of touches
• Transparency and soft touch enabling user experience equivalent to the best capacitive touch-panels
I'll update this post when I have more information!
RELATED
For in-depth, detailed information about resistive multi-touch technology, read the following posts via Touch User Interface:
How the Stantum's Resistive Multi-Touch Screen Works
Pressure Sensitive Multi-touch Surfaces
Posted by Lynn Marentette
Labels: computing, engadget, firmware, interaction, interface, mobile, multi-touch, pmatrix, resistive, stantum
Interactive Light and Dance Installations using Flexible OLEDs: You Fade to Light
Dance and Interactive Light Installation: You Fade to Light/ edit 1 (rAndom International, Kristin Knappstein, Royal Philips Electronics' Lumiblade team)
You Fade To Light / edit 1 from rAndom International on Vimeo.
"Lumiblade OLEDs are large-area diffuse light sources made from extremely thin glass and feature a perfect mirror finish. OLEDs are very close to the quality of natural light."
Installation and Art Direction by rAndom International
Choreography: Laïla Diallo in collaboration with Khamlane Halsackda
Performance: Laïla Diallo and Khamlane Halsackda
Film Production and Director of Photography: Mark Purnell at Hogarth Worldwide
Related versions can be found on the rAndom International Vimeo site.
Photos of the making of You Fade to Light
Chris O'Shea created the software for You Fade to Light using C++, openFrameworks, and OpenCV. A hidden camera embedded in the installation supplied video for the software to process, and the software also generated video files as output. rAndom International created the hardware drivers, which the software used to control the brightness of each OLED.
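As I understand that pipeline (the real software was C++ with openFrameworks and OpenCV), each panel's brightness was driven from processed camera video. Here is a simplified, hypothetical sketch in Python of the core idea: averaging a grayscale camera frame over a grid of cells, one per OLED. The grid size and the 0-255 brightness range are my assumptions, not details from the installation.

```python
# Simplified sketch: reduce a camera frame to one brightness value per OLED
# in a rows x cols grid, by averaging the pixels that fall in each cell.
import numpy as np

def frame_to_oled_levels(frame, grid_rows, grid_cols):
    """frame: 2-D uint8 grayscale array. Returns a (grid_rows, grid_cols)
    array of 0-255 brightness levels, one per OLED panel."""
    h, w = frame.shape
    levels = np.zeros((grid_rows, grid_cols), dtype=np.uint8)
    for r in range(grid_rows):
        for c in range(grid_cols):
            cell = frame[r * h // grid_rows:(r + 1) * h // grid_rows,
                         c * w // grid_cols:(c + 1) * w // grid_cols]
            levels[r, c] = int(cell.mean())   # mean pixel value drives the panel
    return levels
```

With a frame whose right half is bright, the right column of OLEDs lights up and the left stays dark, which is the essence of how a dancer's silhouette could "fade to light" across the array.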
Chris is the author of the Pixelsumo blog, and also is behind This Happened, a "series of events focusing on the stories behind interaction design", with the purpose of encouraging interaction designers to become more open in their methods and ideas.
For more information about interactive OLED, read the New York Times article:
Panels of Light Fascinate Designers
Eric A. Taub, New York Times, 9/6/09
Thanks to Celine Latulipe for the link to the NY Times article!
Posted by Lynn Marentette
Multi-touch, multimedia, multi-modal: Fujitsu LIFEBOOK T4310 has a multitude of possibilities!
Multi-touch, multi-media, multi-modal... the Fujitsu LIFEBOOK T4310 looks like it provides multiple possibilities for people from all walks of life:

-GestureWorks
The LIFEBOOK T4310 comes with an integrated web cam and fingerprint reader, and a variety of I/O options, including HDMI, USB, Firewire, BlueTooth, LAN, analog video output, SD card reader, line in/out, a wireless switch for the integrated 3G and UMTS, and an express card reader. There is an optional modular bay that can accommodate an additional drive or battery.
The fun part is that the Fujitsu LIFEBOOK T4310 comes with Microsoft's Touch Pack applications, which are demonstrated in the video below:
- Microsoft Surface globe
- Surface Collage
- Microsoft Rebound touch game
- Surface Lagoon screensaver, which provides a water-ripple effect and little fish that respond to touch interaction.
As demonstrated in the video, the LIFEBOOK supports gesture interaction, multi-touch interaction, stylus interaction, and traditional keyboard interaction. The capacitive display has a bi-directional hinge, allowing it to be turned 360 degrees and positioned so that the display faces up horizontally over the keyboard. (This feature would be welcome in educational settings, as it could support paired and group collaborative learning activities.)
What I like about the LIFEBOOK is that it has an integrated ambient light sensor, which automatically adjusts the brightness of the display according to the level of light in the environment. This feature is important for people like me who are on the go and must use their computing devices under a range of lighting conditions.
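This kind of auto-brightness behavior can be sketched as a simple piecewise-linear mapping from the ambient light sensor's reading (in lux) to a backlight level. The breakpoints below are invented for illustration; real firmware curves and units differ per device.

```python
# Hypothetical auto-brightness curve: interpolate a backlight percentage
# from an ambient-light reading. The (lux, percent) breakpoints are made up.
def backlight_percent(lux, curve=((0, 10), (50, 40), (400, 80), (1000, 100))):
    """Piecewise-linear interpolation over the (lux, percent) breakpoints;
    readings beyond the last breakpoint clamp to the final level."""
    if lux <= curve[0][0]:
        return curve[0][1]
    for (x0, y0), (x1, y1) in zip(curve, curve[1:]):
        if lux <= x1:
            return y0 + (y1 - y0) * (lux - x0) / (x1 - x0)
    return curve[-1][1]
```

A dim room (low lux) yields a dim display, direct sunlight clamps to full brightness, and readings in between are interpolated smoothly, which is why the display never jumps abruptly as the lighting changes.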
I would love to get my hands on the Fujitsu LIFEBOOK T4310 for a month and test the system in-depth in my day-job as a school psychologist, and in my leisure pursuits as a UX/designer/developer/musician/gamer/etc, "hobbyist".
Can anyone make this dream come true?
RELATED:
Fujitsu's multitouch LIFEBOOK T4310 tablet makes quick work of Microsoft's Touch Pack
(Paul Miller, Engadget, 9/6/09)
Microsoft Touch Pack Gallery (Engadget)
Engadget's Reader Comments
I visited Fujitsu's website to gather additional information, and found a couple of press releases in German:
IFA 2009: Fujitsu setzt auf Fingerspitzengefühl und präsentiert Notebook mit Multiple-Touch-Funktion ("IFA 2009: Fujitsu banks on fingertip feel and presents a notebook with multi-touch functionality") (pdf)
Here is the press contact:
Fujitsu Technology Solutions
Melanie Wolf, Senior PR Manager
Tel.: + 49 (0) 89 62060 4458, Mobile: + 49 (0) 171 33 42 882,
Melanie.wolf@ts.fujitsu.com
Posted by Lynn Marentette
Sep 6, 2009
Oblong's g-speak Spatial Operating Environment: Gesture interaction, massive datasets, film production, and more.
g-speak overview 1828121108 from john underkoffler on Vimeo.
What is g-speak?
From the Oblong website: "Spatial semantics at the platform level"
"Every graphical and input object in a g-speak environment has real-world spatial identity and position. Anything on-screen can be manipulated directly. For a g-speak user, "pointing" is literal."
"The g-speak implementation of spatial semantics provides application programmers with a single, ready-made solution to the interlocking problems of supporting multiple screens and multiple users. It also makes control of real-world objects (vehicles, robotic devices) trivial and allows tangible interfaces and customized physical tools to be used for input."
"The g-speak platform is display agnostic. Wall-sized projection screens co-exist with desktop monitors, table-top screens and hand-held devices. Every display can be used simultaneously and data moves selectively to the displays that are most appropriate. Three-dimensional displays can be used, too, without modification to application code."
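The "pointing is literal" idea comes down to simple geometry: a tracked hand defines a ray in room coordinates, and the cursor is wherever that ray crosses the display plane. Below is a hypothetical sketch of that calculation; it is not Oblong's implementation, and it assumes the screen is the z = 0 plane with its lower-left corner at the room origin.

```python
# Sketch of literal pointing: intersect a hand's pointing ray with a flat
# screen assumed to lie in the z = 0 plane, corner at the origin.
def point_on_screen(origin, direction, screen_w, screen_h):
    """origin, direction: (x, y, z) of the hand and its pointing vector.
    Returns (x, y) on the screen, or None if the ray misses or points away."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return None                      # ray parallel to the screen plane
    t = -oz / dz
    if t <= 0:
        return None                      # screen is behind the hand
    x, y = ox + t * dx, oy + t * dy
    if 0 <= x <= screen_w and 0 <= y <= screen_h:
        return (x, y)
    return None                          # ray misses the screen
```

Because the math is just a ray-plane intersection per display, the same pointing gesture can target wall screens, tabletops, and handhelds, which matches the display-agnostic claim above.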
Origins of Oblong
g-speak was born at the MIT Media Lab, and Oblong was started in 2006. The work behind g-speak's gestural I/O began over 15 years ago. For more information, read g-speak in slices.
Oblong developed Tamper on top of the g-speak system as a prototype for film production. Below is the demo; at 0:08, sketches of the gestures used in g-speak are displayed in the video.
I hate wearing gloves, but I'd gladly put them on to play with the system for a few days!
Posted by Lynn Marentette
Interactive Memorabilia at the Hard Rock Cafe: Microsoft's Multi-touch Rock Wall, Companion Surface Installations, and Awesome Touch-Responsive Interactive Memorabilia Website.
UPDATE: Video of Hard Rock Cafe's Memorabilia Multi-touch Wall
I came across the updated Hard Rock Cafe website and found that it provides an awesome interactive experience! Visitors can explore the extensive Memorabilia collection in detail. This site is almost ready for inclusion in my UX/Interactive Hall of Fame!
According to an article posted on the Hospitality Technology website, "Tech, Love, and Rock 'n' Roll," the website was built using Silverlight and is part of an initiative to expand the reach of the Hard Rock Cafe's extensive memorabilia collection.
"It all started last year with what we call 'Memo 2.0,'" [for memorabilia 2.0] and the rollout of Microsoft's new Silverlight (www.silverlight.net) technology, explains Joe Tenczar, senior director of technology and CIO for Hard Rock International. Silverlight is a web application framework, similar in scope to Adobe flash, that lets companies build custom apps. Hard Rock partnered with its brand agency Duncan/Channon (www.duncanchannon.com) and software developer Vertigo (www.vertigo.com) to create a custom Silverlight application for www.hardrock.com."
Below is a video of the website when it was still under development, unveiled during a MIX08 keynote:
The website is a companion project that is part of a bigger vision that includes an 18-foot interactive multi-touch wall that mirrors the Memorabilia website. The article goes on to mention that the "Rock Wall can be used by one person to blow up a piece of memorabilia as large as the screen, or optimized for six different user experiences at one time, though the technology has been tested for responsiveness to hundreds of simultaneous touches. "Imagine seeing Bob Dylan's Harley at more than life size; big impact," says Tenczar....Rock Wall uses a single piece of Stewart projection glass, custom-made at the factory for Hard Rock, along with three Christie projectors, numerous lasers, and multiple IR cameras to create a unified projection and touch experience. "The graphics are driven by a Nvidia Quadro Plex and there are two other dedicated servers for localized content and physics. I have seen a lot of cool technology, but this still makes my jaw drop whenever I see it in person." And the technology is smart. "If I walk up to the screen, it will blow up where I am."
According to information from the Hard Rock Cafe's press release, the interactive wall was created by Obscura Digital, a San Francisco company; Technomedia Systems was also involved in the project. Here's a quote from the press release:
"We were excited to work with Hard Rock on this revolutionary project," said Obscura Digital CEO Patrick Connolly. "This is the longest, largest and most technologically sophisticated multi-touch wall we’ve built. It utilizes our newest Fireframe technology, and the result is the highest resolution, seamless multi-touch display we have ever seen. We are delivering a 6000 x 1000 pixel display, so that multiple users at a time can enjoy the vast Hard Rock Memorabilia collection in stunningly high resolution. With this display, customers can literally zoom so deep into the images, that things like a scratch on Buddy Holly's glasses can be seen 3 feet wide. The graphic quality of this system is something the public has never seen, and the impact on the customers is something they will not soon forget."
Hard Rock Cafe Multi-Touch
In addition to the website and Rock Wall, booths at the Hard Rock Cafe provide customers with access to music videos from a video distribution server (VDS) created by Coleman Technologies. Each booth's touch-screen display terminal has its own computer, giving guests control over the music videos played at the cafe. From what I understand, customers can also explore content from the memorabilia system right from the booth.

Microsoft Surface tables were installed in the cafe, running a custom application created by Vertigo that allows customers to interact with a spinning globe to locate and explore other Hard Rock Cafes around the world.
Memorabilia Website UX:
Unfortunately, I am not in Las Vegas, so I haven't had a chance to interact with the Hard Rock Cafe's Rock Wall or any of the Microsoft Surface installations, which are the centerpiece of this project.
Not to worry. The companion Hard Rock Cafe Memorabilia website, experienced from the comfort of my home on an HP TouchSmart PC, provides a great user experience, even though it is not fully optimized for touch navigation.
One of the things that I like about the site is that it uses Microsoft's Deep Zoom feature. When you zoom in, you can explore each photo without sacrificing resolution, and explore everything in minute detail.
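Deep Zoom can serve any zoom level without losing resolution because the image is published as a pyramid: the full image at the top level, each level below halving the dimensions (rounding up) down to a single pixel, with the viewer fetching only the tiles that cover the current view. Here is a sketch of how the level dimensions work out, based on the publicly documented pyramid scheme; tile size and file layout are omitted.

```python
# Sketch of a Deep Zoom-style image pyramid: level N holds the full image,
# and each lower level halves the dimensions (rounding up) down to 1x1.
def pyramid_levels(width, height):
    """Return [(level, w, h)] from the full image down to a 1x1 pixel."""
    max_level = max(width, height).bit_length() - 1
    if max(width, height) > 2 ** max_level:
        max_level += 1                        # round up to the next power of two
    levels = []
    w, h = width, height
    for level in range(max_level, -1, -1):
        levels.append((level, w, h))
        w, h = (w + 1) // 2, (h + 1) // 2     # halve, rounding up
    return levels
```

The payoff is that zooming out never resamples a huge image on the fly; the viewer just switches to a smaller precomputed level, and zooming in pulls sharper tiles from a higher level.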
If you want to try your hand at touch navigation, you will be pleasantly surprised, even though you will have to rely on your mouse at times. To zoom in, tap the initial presentation screen, and use your fingertips to pan around. What you touch and release won't go flying about the screen, which is a good thing.
If you lightly tap an item of interest, you will get a translucent information box on the right side of the screen. The information box might include a video clip of an interview with a musician, a story, and a way to share the experience by embedding a widget on your website or blog, posting a link, or emailing a link.
After poking around the site using Firefox, I switched to Internet Explorer, turned the zoom setting to 125%, and had no difficulty using my right index finger to navigate through the information box. (It was more difficult to do this using Firefox.)
The directions for traditional navigation are simple:
"Turn on, zoom in, zoom out. Use your mouse wheel to zoom in and out. You can also zoom in by clicking and zoom out by shift-clicking. To pan, click and drag."
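The zoom-about-a-point behavior in those directions comes down to one invariant: the world coordinate under the cursor stays fixed while the scale changes. Here is a hypothetical sketch of that viewport math; the names and structure are mine, not the site's code.

```python
# Sketch of click-to-zoom viewport math: zooming rescales around the cursor
# so the content under the pointer stays put; panning shifts the origin.
class Viewport:
    def __init__(self):
        self.scale = 1.0        # screen pixels per world unit
        self.ox = 0.0           # world x at the screen's left edge
        self.oy = 0.0           # world y at the screen's top edge

    def world_at(self, sx, sy):
        """World coordinate currently under screen point (sx, sy)."""
        return (self.ox + sx / self.scale, self.oy + sy / self.scale)

    def zoom(self, sx, sy, factor):
        """Zoom by `factor` (>1 in, <1 out) keeping (sx, sy) fixed on screen."""
        wx, wy = self.world_at(sx, sy)
        self.scale *= factor
        self.ox = wx - sx / self.scale   # re-anchor so (sx, sy) still shows (wx, wy)
        self.oy = wy - sy / self.scale

    def pan(self, dsx, dsy):
        """Drag the view by (dsx, dsy) screen pixels."""
        self.ox -= dsx / self.scale
        self.oy -= dsy / self.scale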
Featured artists on the Memorabilia site include The Who, The Beatles, James Brown, John Lennon, Bob Dylan, Guns N'Roses, Buddy Holly, Madonna, Motley Crue, Elvis Presley, Queen, and The Rolling Stones. You can sort the main presentation page by artist or year, and explore the collection by artist, type of item, genre of music, decade, and location.
This website provides an interactive, user-friendly means of exploring the history of rock music. (It would be cool if the site offered some form of interactive sound track.)
I'm impressed.
I embedded the widgets below from the Memorabilia site. I wasn't sure exactly what might turn up. The interactive widgets, created with Silverlight, went beyond my expectations. You can explore the content right from this post, if you have Silverlight installed:
I shot some HD video of my touch interaction of the website and will post the videos as soon as I can. For now, take a look at some of the screen shots:
Main portal:

Eric Clapton's Guitar and Backstory:
Screen shot of memorabilia sorted by type - instruments:
Zooming in:

Screenshot of Gibby Haynes' art:

Detail of one of Madonna's decorative costumes:

Posted by Lynn Marentette
Sep 5, 2009
Manuel Lima's Information Visualization Manifesto & Discussion on the Visual Complexity Blog: Interaction Supporting Analysis is Key
Manuel Lima, author of the Visual Complexity blog, set out to write about a set of considerations and requirements for people working in the field of Information Visualization, and outlined these goals in his Information Visualization Manifesto (8/30/09). Lima's post spurred quite a discussion among his readers and resulted in a subsequent post, Observations on the Manifesto.
My favorite section of the manifesto:
"Interactivity is Key"
"As defined by Ben Shneiderman, Stuart K. Card and Jock D. Mackinlay, “Information Visualization is the use of computer-supported, interactive, visual representations of abstract data to amplify cognition”. This well-known statement highlights how interactivity is an integral part of the field’s DNA. Any Information Visualization project should not only facilitate understanding but also the analysis of the data, according to specific use cases and defined goals. By employing interactive techniques, users are able to properly investigate and reshape the layout in order to find appropriate answers to their questions. This capability becomes imperative as the degree of complexity of the portrayed system increases. Visualization should be recognized as a discovery tool."
Manuel Lima also suggests that narrative is an important component, and quotes Patricia McDonald: "...the question of narrative seems to lie at the heart of this Manifesto; the need to pose a specific question of the data and to weave coherent themes and stories from it."
Also read:
Observations on the Manifesto 9/3/09
In this post, Manuel Lima reflects on the discussion surrounding his 8/30 post. Regarding his statement, "Interactivity is Key":
"This principle merits the reflection of us all. Jerome Cukier and David McCandless challenged the need for interactivity in Information Visualization. In a broader definition of Visualization I would certainly agree with this notion: Information can be successfully conveyed in either static or interactive mediums. However, we have to question what really sets us apart from other parallel fields such as Information Design or Information Graphics. I do believe one of the crucial benefits of Information Visualization is interactivity – which also explains why this area emerged from Computer Science and HCI. It’s this “computer-supported, interactive” visual representation of data that truly makes us different. And this unique offering “becomes imperative as the degree of complexity of the portrayed system increases”. The representation of complex networks is just an instance where interactivity should be mandatory."
Lima goes on to discuss the diversity of the field, as it is comprised of people from a variety of disciplines. However, he sees a difference between Information Visualization and Information Art.
"The fallacy of Information Visualization being a conveyor of “pretty pictures” is drastically threatening the field, by undermining its goals and expectations. “We have to fight that or risk the trivialization and marginalization of visualization as an analytic tool”, asserts Robert Kosara on a recent review of the manifesto."
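Lima's claim that interactivity "becomes imperative as the degree of complexity of the portrayed system increases" is easy to see with a toy network. Here is a minimal sketch (all node names and data are made up, not from any project discussed above) of the kind of degree-threshold filter an interactive network view might bind to a slider:

```python
# Sketch: a degree-threshold filter of the kind an interactive
# network visualization might wire to a slider control.
from collections import defaultdict

def degree_filter(edges, min_degree):
    """Keep only edges whose endpoints both have at least
    min_degree connections in the full graph."""
    degree = defaultdict(int)
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    return [(a, b) for a, b in edges
            if degree[a] >= min_degree and degree[b] >= min_degree]

edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "C"), ("E", "F")]
# Dragging the slider from 1 to 2 would re-render the view with only
# the well-connected core, letting the viewer "reshape the layout".
print(degree_filter(edges, 2))  # → [('A', 'B'), ('A', 'C'), ('B', 'C')]
```

A static image can only show one threshold; the interactive version lets the viewer ask and answer "what is the core of this network?" themselves, which is exactly the distinction Lima draws.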
RELATED:
Robert Kosara's Post: Shaking the "Pretty Picture" Stigma 8/31/09
Kosara points out that visual analysis is a better term to use than visualization, among other things.
"The purpose of visualization is insight, not pictures" Ben Shneiderman (1999)
More Multi-touch: Packard Bell's Viseo 200T Touch Edition; oneTwo M and L All-In-One; MSI's HD AE2400
More multi-touch!
Packard Bell Press Release: oneTwo M and oneTwo L press release (pdf)
"The use of multi-touch technology is expected to rapidly expand because it provides a fast and intuitive interface for the user. In fact, it is not meant for computer literate or techno-maniacs, it’s addressing users who want easy-to-use and complete digital solutions, who love unique and expressive style. The Packard Bell oneTwo Series gathers all these needs into an all-in-one PC with easy-to-use touch-screen technology, eye-catching style, and fantastic entertainment and media sharing."
Packard Bell Press Release: Viseo 200 Touch and enjoy! (pdf)
The MSI promotional video shows some interesting touch-screen interaction with the media. I'm not sure how much of this video is special effects!
MSI had a contest for people to create and share their videos to promote the benefits of touch screens over traditional PCs. Here's the promo for the contest:
The videos can be seen on the MSI contest website or on YouTube.
It is rumored that MSI will be releasing a multi-touch all-in-one sometime in the future, the MSI AE2400.
I haven't had a chance to test out any of these computers. If you have had the opportunity, please leave a comment.
I'm hoping that there will be an influx of great touch and multi-touch applications in the future. Multi-touch web browsers exist, but are only in the experimental phase, as far as I know, and most websites are not optimized for even single-touch interaction.
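The first gesture most of those future apps will need is pinch-to-zoom, and its core is just arithmetic. A minimal sketch (the function name and coordinate pairs are illustrative, not from any real touch API):

```python
# Sketch: the arithmetic behind pinch-to-zoom. A real touch framework
# would supply the touch points; here they are hard-coded examples.
import math

def pinch_scale(start_touches, current_touches):
    """Scale factor implied by two fingers moving apart or together."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return dist(*current_touches) / dist(*start_touches)

# Fingers start 100 px apart and spread to 150 px: zoom in by 1.5x.
scale = pinch_scale([(0, 0), (100, 0)], [(0, 0), (150, 0)])
print(scale)  # → 1.5
```

The app multiplies the content's current zoom level by this factor on every touch-move event; everything else is rendering.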
Posted by
Lynn Marentette
Labels:
MSI,
MSI AE2400,
multi-touch,
NUI,
oneTwo,
packard bell,
touch screen,
video contest,
viseo200T
No comments:
William Forsythe's "Synchronous Objects - One Flat Thing, Reproduced." Multidisciplinary online interactive project: Translating choreography into new forms.
If you are interested in keeping up with ways various disciplines are converging, take the time to learn about the process behind the Synchronous Objects, One Flat Thing, Reproduced project.
This project was a product of a collaboration between choreographer William Forsythe and a multi-disciplinary group of researchers from Ohio State University's Advanced Computing Center for the Arts and Design (ACCAD).
The online installation of this project is an interactive website. The site demonstrates how the project's team members explored "discipline-specific and cross-disciplinary ways of seeing" and focused on "visualizing choreographic structure from dance to data to objects".
Synchronous Objects Interactive Site
"...the project presents an original collection of screen-based visualizations (video, digital artwork, animation, and interactive graphics) that reveal interlocking systems of organization in the choreography. The project aims to appeal to a broad public from diverse fields including but not limited to dance. Forsythe explains, “The project starts from the recognition that choreography is an organizational practice that employs fundamental creative strategies relevant to many other domains.” (The link above leads to the website; be sure to watch the video introduction before entering the site.)
Overview/Teamwork
Each discipline involved in the project discussed what they saw in the dance. Their contributions to the project were amazing. For example, a geographer took data from the dance and translated it into a geographical plane, to describe the use of space:
A video dance scoring system was utilized: Full Video Score
Cue Visualizer tool
"The Cue Visualizer is an interactive tool in which users can view the cue system, in One Flat Thing, reproduced, over time."
Synchronous Objects Project Gallery
William Forsythe's Essay: Choreographic Objects
"When the blind mathematician Bernard Morin was asked where the imaging of the process of everting a sphere existed in his imagination, he famously replied: "Nowhere and everywhere at the same time." And so it is with the choreographic object: it is a model of potential transition from one state to another in any space imaginable."
Project Team Member's Quotes
Video Introduction
All Credits: Synchronous Objects Project, The Ohio State University and The Forsythe Company
RELATED
"The main focus of Synchronous Objects is to develop a set of data visualization tools for capturing, analyzing and presenting the underlying choreographic structures and components of Forsythe's "One Flat Thing, reproduced" (OFTr), which premiered in 2000. These visualizations in the form of information graphics, 2D and 3D animations and visual dance scores will provide audiences, students and researchers with new approaches to thinking about and studying Forsythe's intricate, counter-point work." - Doug Fox, Great Dance
Overview of Pacific Northwest Ballet's performance of One Flat Thing, Reproduced
From Daily Motion: One Flat Thing Reproduced, ARTE France production:
Transcript of the Teamwork video:
Chuck Helm: Bill received the Wexner Prize in 2002. One of the intents of the Wexner Prize was that this not just be a one-time occasion, that these might be people who would come back to OSU and have further involvement with both the Wexner Center and with campus, and in this case, it's really borne fruit.
We're doing this exhibition of Bill's installation and his video-related work that show his idea of how he's extending choreographic thinking into new forms such as video as well as a sculptural installation and also a performance installation as well as a whole section devoted to the new web project he developed at OSU as well as material that relates to the evolution of that web project.
Maria Palazzi: We have had this opportunity with the Synchronous Objects project to bring together a group of students who come from different disciplines like computer science, design, dance, art, working together to understand what their disciplines bring to a project like this but also to come to an understanding of what other disciplines might bring to their work.
Norah Zuniga Shaw: For dancers, we'd like to communicate better about what we do, so we feel it's important to make ourselves more readable. But I also think knowing about how motion works, knowing about complex event perception, analyzing the complexity in this dance is relevant for thinking critically about visual literacy in this new century that we're in.
Palazzi: When we started to talk to Bill about the project we said, "Who is audience?" and the answer was "Everybody." So we thought about what it was we needed to do to communicate with everybody and I think this idea of solving a lot of this communication through visuals, using algorithmic techniques, using computer graphics, using animation. These are very contemporary ways in which young people are used to seeing information and they're very effective ways to take lots of information and translate it so they're understandable.
William Forsythe: This is research into a kind of a proto literature for dance, where other people could look at a dance as it plays out on a video and with these annotations overlaid upon them, could we understand very quickly what this dance was or how it was organized and what they're thinking about it, so that dance wasn't so frightening. Which is where you hear people going, "Oh, I don't want to go to see the dance, I don't understand dance." And we're like, "You just don't know how to look at it. Of course you understand it." And we've had the wonderful experience now showing people these annotations and people say literally within seconds, "Oh, I get it," and you're going, like, "Voila."
Helm: This has been the first time that there's been a significant showing of Bill's installation works in this country, and particularly a body of his work that can be shown at a place like OSU, which is also sponsoring the development of this new web project. So it kind of shows the important resources that a major university can bring for an artist like Bill. It's not everyplace you go to that can say, "Yes, we can do this full range of activity."
Palazzi: I think he was attracted to ACCAD and the Department of Dance because of this unique collaboration that we have between our research center and dance. I think that is one of the advantages of a big university like Ohio State: We have lots of expertise on our campus. One of the things that ACCAD serves as, is an umbrella or a gathering place for multidisciplinary approaches to problems.
Forsythe: OSU is a no brainer because the Dance Department and ACCAD are connected. I don't know where else that exists. It turned out OSU had everything in place that I needed. So, what's very interesting is that it's this issue of research is that no one was trying to define the way things should be, everyone's saying, "Well, what else could this look like?" So there was this wonderful feeling of a palette of opportunities that's being offered. That was very, very relaxing.
Thanks to Celine Latulipe for the link!
Kindlelab Project: Opensource & 3D Immersive Educational Tools
For more information and links about KindleLab, see the "KindleLab Project" post on the TechPsych blog.
Posted by
Lynn Marentette
Off topic: Interesting motivational video clip: "If you never failed, you never lived" (Do you know who is sitting in your classroom?)
(cross-posted on the TechPsych blog)
Sep 3, 2009
The Future of Interactive Technology: Links to VentureBeat's series of articles
Put your finger on it: The future of interactive technology
Matt Marshall, 8/12/09
"Touch" technology for desktop computing finally taking off
Al Monro (CEO, NextWindow) 8/28/09

Speech, touchscreen--been there, done that. What's the user interface of tomorrow?
9/2/09 Mary Czerwinski (Microsoft)

How phones emerged as main computing devices, and why user interface will improve
Andrew Hsu (Synaptics) 8/24/09
SOMEWHAT RELATED
Playing Tetris could be good for your brain, study says

Mind Research Network
BRAIN IMAGING SHOWS PLAYING TETRIS LEADS TO BOTH BRAIN EFFICIENCY AND THICKER CORTEX
"Over a three-month period, adolescent girls practiced Tetris, a computer game requiring a combination of cognitive skills. The girls who practiced showed greater brain efficiency, consistent with earlier studies. Compared to controls, the girls that practiced also had a thicker cortex, but not in the same brain areas where efficiency occurred."
"We showed that practice on a challenging visuospatial task has an impact on the structure of the cortex, which is in keeping with a growing body of scientific evidence showing that the brain can change with stimulation and is in striking contrast with the pervasive and only-recently outmoded belief that our brain’s structure is fixed,” said Dr. Sherif Karama, a co-investigator at the Montreal Neurological Institute. This study, published in the open-access journal BMC Research Notes, is one of the first to investigate the effects of practice in the brain using two imaging techniques. The girls completed both structural and functional MRI scans before and after the three-month practice period, as did girls in the control group who did not play Tetris. A structural MRI was used to assess cortical thickness, and a functional MRI was used to assess efficient activity."
Posted by
Lynn Marentette
The World is My Web Browser: Interactive Technology in Public Spaces
Off-the-desktop interactive technologies are spreading quickly across many platforms. It seems that within the next 3-5 years we will have instant access to a social and dynamic web, with information and entertainment at our fingertips, anywhere we go.
The "world-as-web-browser" concept is one I've touched upon in the past. I'd like to focus on this topic in future posts.
I came across some of the information in this post on the Future of Media Blog, maintained by the folks at the Interpublic Emerging Media Lab, based in California. This blog is a must-read!
The Saatchi & Saatchi ad agency in the UK created interactive games that provide people waiting in bus shelters the opportunity to pass the time by bashing digital falling eggs.
A more utilitarian approach to interactive touch screens is the EyeStop bus stop, a concept created by researchers at MIT's SENSEable City Lab, along with the city of Florence and the city's transit service:

MIT

MIT researchers unveil the EyeStop: "Bus stop of the future" can sense its surroundings (pdf)

SOMEWHAT RELATED
IPG Future of Media Blog
Posted by
Lynn Marentette
Sep 1, 2009
VideoJet: Touch PDC Multimedia Mobile Games for Children; Video Jet KIU, a French "Wii"
I came across this portable touch-screen multimedia device today on the web, and had difficulty finding more information about it. It is from Videojet, a company that provides mobile games and devices for children, as well as game consoles for use in the home.

PDC Touch Multimedia
The device is the PDC Touch Multimedia, a handheld multi-purpose touch-screen video camera, camera, game console, and MP3 player that has 16 GB of on-board memory and offers 105 games and applications. 40 titles provide tactile and sensory feedback. This device also provides a mini SD card reader and video outputs.
This device looks like it has assistive technology potential.
When I visited the website, I came across KIU, Videojet's version of the "Wii":


Videojet kiu (PUB)
Uploaded by prodet.
I don't know much about this company, and I have never played any of their games. I'd like to know if this company has any educational titles!
Posted by
Lynn Marentette
Aug 30, 2009
Yelp has an Augmented Reality iPhone App for the iPhone 3Gs, "The Monocle"!
(Cross posted from tshwi)
Thanks to Dan Saffer for the link from the Read Write Web!
Here is my partial "reblog":
"Social review service Yelp has snuck the first Augmented Reality (AR) iPhone app specifically for the US into the iTunes App Store. The undisclosed new feature allows iPhone 3Gs owners to shake their phones three times to turn on a view called "the Monocle." This view uses the phone's GPS and compass to display markers for restaurants, bars and other nearby businesses on top of the camera's view...Blogger Robert Scoble discovered the hidden feature and posted about it on FriendFeed today. "
Video (in French, but easy to understand by the demonstration)
"Both GPS and a compass are used to determine location and direction being pointed at."
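The geometry behind an overlay like the Monocle is straightforward: the bearing from the user to a business, minus the phone's compass heading, gives the horizontal angle at which to draw the marker. A sketch of that math (the coordinates, field of view, and screen width below are made-up examples, not Yelp's actual implementation):

```python
# Sketch: placing an AR marker from GPS + compass readings.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def marker_x(bearing, heading, fov_deg=60, screen_w=320):
    """Horizontal pixel position of a marker on the camera view."""
    offset = (bearing - heading + 180) % 360 - 180  # fold into -180..180
    return screen_w / 2 + (offset / fov_deg) * screen_w

# A business due east of the user is at bearing ~90 degrees; with the
# phone pointed north (heading 0) the marker lands far off-screen right,
# so the app would hide it until the user turns toward it.
b = bearing_deg(37.0, -122.0, 37.0, -121.9)
print(round(b), marker_x(round(b), 0))
```

The modulo fold keeps the angular offset in the -180..180 range, so a marker just to the left of the heading gets a small negative offset rather than wrapping to +350 degrees.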
Screen Shots:


I want an iPhone. Verizon, can you hear me now?!
Posted by
Lynn Marentette
Labels:
3GS,
Apple,
apps,
armband,
augmented reality,
cell phone,
dan saffer,
iPhone,
mobile technology,
monocle,
paris,
pervasive,
readwrite web,
technology,
ubiquitous,
Yelp
1 comment:
Links to Accessibility Posts: 1, Hello Haptic Flash Cards; 2. Samsung Braille TouchPhone Prototype is Cool, uses Electric Active Plastic
Here are the links to the posts:
Samsung Braille TouchPhone Prototype is Cool: Uses Electric Active Plastic
Hello Haptic Flash Cards
More about the camera for the visually impaired/blind later!
For techies and the tech-curious - Panel podcast: Is Software Development Too Complex?
If you are interested in programming, software development, or UI design, or if you are a student or technology "hobbyist", a recent .NET Rocks! panel-discussion podcast is a must-listen, even if you are NOT a Microsofty.
Is Software Development Too Complex? 8/27/09
"Recorded live at devLink in Nashville, Tennessee. Billy Hollis, Kathleen Dollard, Jim Holmes, and Josh Holmes (no relation) discuss the issue of the complexity of software development. Several .NET celebrities in the audience also chimed in."
If you listen carefully, and take notes, you will find several gems of wisdom. It is well worth the time. If you don't have 78 minutes to listen to the podcast, listen to it while you do your laundry or something!
(Lots of people in the podcast's audience still write programs using .NET 2.0)
The real question:
How do we develop usable, elegant software to support a complex society, and at the same time, meet the demands of a world that holds high expectations about what technology can do in the future?
A couple of quotes from the panel discussion:
"They've all seen CSI Miami!"
"Ohmygod, what do I have to learn NOW?"
"Microsoft, stop the insanity!"
RELATED LINKS
Jesse Liberty's Post & Readers' Responses to .NET Rocks
A user's point of view about the complexity of technology:
Computers conspiring to overthrow us- Gail Collins, National Columnist 8/30/09
Posted by
Lynn Marentette
Aug 29, 2009
For the Gamers: Episode three of Half-Life 2 will incorporate sign language!
Below is a video of a discussion between Gabe Newell, a game developer at Valve, and a focus group that included deaf and hearing-impaired gamers. Valve is conducting research to gather the information needed to incorporate signing into Episode Three of Half-Life 2.
In the video, the importance of the use of realistic facial expressions was discussed, as well as other issues, such as deaf culture.
(All three video clips related to Gabe's focus groups can be found on the Escapist website.)


-Valve
(I'm a little behind the game accessibility news...)
Posted by
Lynn Marentette
Subscribe to:
Posts (Atom)
