Focused on interactive multimedia and emerging technologies to enhance the lives of people as they collaborate, create, learn, work, and play.
Nov 28, 2009
Quick Post: Video of Stantum's Multi-touch "Slate PC" Digital Resistive Touch Screen Netbook
The video is of the Stantum Slate PC, via Netbooked's YouTube channel. The system in the video is a modded Dell Mini 10, and it doesn't require calibration. Notice how easily the system handles a variety of interactions: fingers, thumbs, pinch, rotation, multiple-finger swipes, brush strokes, fingernail action, stylus, and more.
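To get a feel for what the tracking layer has to do behind gestures like pinch and rotation, here is a minimal sketch (my own illustration, not Stantum's code) of how scale and rotation can be derived from two tracked touch points:

```python
import math

def pinch_params(p1_start, p2_start, p1_end, p2_end):
    """Derive (scale, rotation in degrees) from two touch points
    tracked from a start frame to an end frame."""
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])
    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])
    # Scale is the ratio of the finger-to-finger distances;
    # rotation is the change in the angle of the line between them.
    scale = dist(p1_end, p2_end) / dist(p1_start, p2_start)
    rotation = math.degrees(angle(p1_end, p2_end) - angle(p1_start, p2_start))
    return scale, rotation

# Two fingers move apart while the pair rotates a quarter turn:
scale, rot = pinch_params((0, 0), (100, 0), (0, 0), (0, 200))
print(scale, round(rot, 1))  # 2.0 90.0
```

Real touch stacks also have to disambiguate which contact is which across frames; the arithmetic above is only the last step.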
At this time, the Stantum Slate is available for developers only.
RELATED POSTS
Stantum's Multi-touch Slate PC, Windows 7 Certified (11/17/09)
Interactive multi-touch for sound design, dj-ing, and music creation (10/25/09)
Stantum's Mobile Phone Multi-touch Interface: Demonstration of precise interactions on a resistive touch screen (9/7/09)
Updates about NextWindow and Stantum; Upcoming Emerging Displays Technologies Conference (6/2/09)
FYI: Netbooked's Netbook Blog
Posted by
Lynn Marentette
Labels:
demonstration,
HCI,
interaction design,
multi-touch,
notebook,
NUI,
stantum,
stantum slate,
video,
Windows 7
Nov 27, 2009
Johannes Schoening & Friends Research: Videos of Multi-User Interaction on Multi-touch Walls and Tables
In this post, I'm featuring videos of the interactive work of Johannes Schoening, a member of the NUI Group, and his collaborators. Johannes works at the Innovative Retail Laboratory of the German Research Centre for Artificial Intelligence (DFKI) in Saarbrücken, directed by Prof. Dr. Antonio Krüger. Johannes also works with Michael Rohs at the Deutsche Telekom Laboratories in Berlin.
Johannes received a Diploma in Geoinformatik from the Institute for Geoinformatics at the University of Münster in 2007. His research interests are new methods and interfaces for intuitively navigating spatial information and, more generally, new intelligent interfaces that help people solve daily tasks more effectively. His interests include mobile augmented reality applications, the use of Wikipedia as a knowledge database, and home-grown multi-touch surfaces. (More information can be found on Johannes' website.)
Note: The 2010 Interactive Tabletops and Surfaces conference will be held in Germany, and Johannes and others will be involved with running it. You can follow the news on Twitter: http://twitter.com/its_Germany2010. The link to the conference website will be up soon, at http://www.its2010.org/.
The descriptions below each video are from Johannes' YouTube Research Channel.
Multi-touch Risk
"A multi-touch and multi-user version of the classical Risk game. As a platform Nasa World Wind (WWJ) and the Java implementation of Risk "Domination" by yura.net were used. A authentication method (that was also integrated in the game) can be found in the last (next) video. Thanks to Klaus Drerup & Wadim Hamm." TEAM: Klaus Drerup, Wadim Hamm, Florian Daiber, & Johannes Schoening. Music by cycom: mathematics.
User Authentication on Large Multi-touch Wall with Mobile Device
"The exploitation of finger and hand tracking technology based on infrared light, such as FTIR, Diffused Illumination (DI) or Diffused Surface Illumination (DSI) has enabled the construction of large-scale, low-cost, interactive multi-touch surfaces. In this context, access and security problems arise if larger teams operate these surfaces with different access rights. The team members might have several levels of authority or specific roles, which determine what functions and objects they are allowed to access via the multi-touch surface. In this video we present first concepts and strategies to authenticate with a large-scale multi-touch wall using a mobile device."
GeoLens: Allowing Multi-User Interaction with Geographic Information Systems on Interactive Surfaces
"This video shows the GlobalData application in use on an Archimedes SessionDesk http://www.archimedes-products.com/se The application was used to illustrate our GeoLens concept. GeoLenses are GUI widgets that can be used like scalable as well as zoomable magnifying lenses to allow synchronous multi-user interaction in GIS systems."
Google Is My Friend: The Google Chrome OS video, if you haven't yet seen it...
I've been meaning to watch the Google Chrome OS video. I'm the 609,825th viewer of the YouTube version below:
The Chromium Projects
Posted by
Lynn Marentette
Varied Collection of Interface Interactions: Art and Sculpture Videos For Your Viewing Pleasure
Cross posted on The World Is My Interface
I've been exploring the contributions of artists to the world of interactive digital media. Here are videos of some of the interesting works I've come across recently. Some of the videos are of older works, but were new to me.
INTERACTIVE KINETIC SCULPTURE
Kinetic Pond
(I'm still searching for more information regarding the Kinetic Pond.)
Rose Finn-Kelcey: It Pays to Pray. Interactive Sculpture at the Cass Sculpture Foundation. Filmed by Robin Fitton.
"Insert 20p and select one of a range of prayers. An interactive sculpture which gives you back the money after providing an interesting message. Warning not to be used by the holy or holey. The prayers were about relationships with various chocolate bar brands." It Pays to Pray Description
Fiber Cloud, MIT Mobile Experience Lab
The Cloud - from MIT Mobile Experience Lab on Vimeo.
For more information, see the Fiber Cloud web page.
Marque Cornblatt: Interactive Kinetic Steampunk Sculptures (1993-1996)
Marque Cornblatt blogs at The MediaSapien: The Art and Culture of Hypermediated Identity
Marque Cornblatt's MFA Thesis: The Emergence of the MediaSapien
Daniel Rozin's Wooden Mirrors (Uses video system)
More Information: Daniel Rozin Interactive Art
GIANT: Interactive Sculpture at the Children's Museum of Pittsburgh (2008), by David Butts
Imagine what this could do if it was controlled by gestures and a system of sensors!
Nothing (without you) (Adam Chapman). Warning: what is inside the box is sort of yucky!
Hall of Faces that Follow
(Installation at Puzzling World in New Zealand; I don't think this installation is computerized.)

Interactive Sculpture: MirrorMap, by Ryan Schenk
Self Organizing Still Life: David Fried's Kinetic Sculpture at the Atlanta Botanical Garden (responds to sounds)
Another video of Self Organizing Still Life
Act/React: Interactive Art Installation Video, Milwaukee Art Museum
Brian Knep discusses computer technology and his art:
Scott Snibbe's Deep Walls Milwaukee Art Museum
Scott Snibbe's Artist's Statement (Focuses on interaction)
Posted by
Lynn Marentette
Nov 26, 2009
Cultural Analytics of Mark Rothko Paintings on the 287-Megapixel HIPerSpace Wall at Calit2
This is what I might like to use for my multi-dimensional interactive timeline project!
The interactive Cultural Analytics software system was developed by UC San Diego's Software Studies Initiative (featured in a previous post), and the Graphics, Visualization and Virtual Reality Laboratory.
Jeremy Douglass Presents Cultural Analytics on the Interactive HIPerSpace Wall at Calit2
Description of Cultural Analytics, from the Software Studies Initiative Website
"The explosive growth of cultural content on the web including social media since 2004 and the digitization efforts by museums, libraries, and companies since the 1990s make possible fundamentally new paradigm for the study of both contemporary and historical cultures. We can use computer-based techniques for quantitative analysis and interactive visualization already commonly employed in sciences to begin analyzing patterns in massive cultural data sets. To make an analogy with "visual analytics," "business analytics," and "web analytics," we call this new paradigm cultural analytics."
"We believe that a systematic use of large-scale computational analysis and interactive visualization of cultural data sets and data streams will become a major trend in cultural criticism and culture industries in the coming decades. What will happen when humanists start using interactive visualizations as a standard tool in their work, the way many scientists do already? If slides made possible art history, and if a movie projector and video recorder enabled film studies, what new cultural disciplines may emerge out of the use of interactive visualization and data analysis of large cultural data sets?"
"The idea of Cultural Analytics was first presented by Lev Manovich in 2005. Software Studies Initiative founded at Calit2 in 2007 made possible to turn this vision into a research program. By drawing on the cutting-edge cyberinfrastructure and visualization research at Calit2 as well as world reputation of UCSD in digital arts and theory, we are able to develop a unique research agenda which complements other projects in digital humanities and "cyberscholarship":
- while most projects in digital humanities deal with text, we focus on automatic analysis of visual and media cultures and artifacts: video games, visual art, media design, cinema, animation, AMV, machinema, photography, etc.;
- in developing techniques particularly suited for cultural visualization, we draw both from visualization fields (information visualization, scientific visualization, visual analytics) and from media and digital art;
- we are also developing techniques for analysis and visualization of born digital content such as video games, web sites and social media."
Links to white papers, scholarly papers, presentations, and photos related to this cultural visualization and related techniques/projects can be found on the UCSD Cultural Analytics web page.
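The kind of quantitative analysis described above can be sketched in a few lines. This toy example (my own illustration, not the Software Studies Initiative's code) computes one simple visual feature, mean brightness, for a tiny "collection" of grayscale images and orders the collection by it, which is the basic move behind visualizations like the Rothko plot:

```python
def mean_brightness(image):
    """Mean gray value of an image given as a 2D list of 0-255 pixels."""
    pixels = [p for row in image for p in row]
    return sum(pixels) / len(pixels)

# Tiny synthetic collection: dark, mid-tone, and light 2x2 "images".
collection = {
    "dark":  [[10, 20], [30, 40]],
    "light": [[200, 220], [240, 210]],
    "mid":   [[100, 120], [110, 130]],
}

# Order the collection along the brightness dimension, the way a
# cultural-analytics plot might arrange paintings along one axis.
ordered = sorted(collection, key=lambda name: mean_brightness(collection[name]))
print(ordered)  # ['dark', 'mid', 'light']
```

Real cultural analytics work extracts many such features (brightness, saturation, entropy, and more) over thousands of images, then maps them onto the axes of a large display like the HIPerSpace wall.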
The Emerging Field of Software Studies: Anne Helmond's Presentation: "Blogging and the blogosphere through the eyes of software and search engines"; UCSD's Software Studies Initiative
The slideshare presentation is by Anne Helmond, a New Media PhD candidate with the Digital Methods Initiative at the Mediastudies department at the University of Amsterdam where she studied New Media from 2004-2008. She is focusing her work "on the emerging field of Software Studies, which addresses the role that software plays in our society."
The presentation caught my eye because I've been using my blogs as on-line file cabinets, and discovered that my careful tagging, designed to help me search my own posts, is something highly favored by search engines. Anne has given this topic some deep thought, as you can see from the presentation.
Blogging and the blogosphere through the eyes of software and search engines
The Software Studies Initiative at UCSD. The description below was taken from the UCSD Software Studies Initiative website:
"Google searches and Amazon recommendations, airline flight paths and traffic lights, email and your phone: our culture runs on software. How does software shape the world?
"Software Studies is a new research field for intellectual inquiry that is now just beginning to emerge. The very first book that has this term in its title was published by The MIT Press in June 2008 (Matthew Fuller, ed., Software Studies: A Lexicon). In August 2008 The MIT Press approved Software Studies book series, with Matthew Fuller, Noah Wardrip-Fruin
"The Software Studies Initiative intends to play the key role in establishing this new field. The competed projects will become the models of how to effectively study “software society.” Through workshops, publications, and lectures conducted at UCSD and disseminated via the web and in hard copy publications, we will disseminate the broad vision of software studies. That is, we think of software as a layer that permeates all areas of contemporary societies. Therefore, if we want to understand contemporary techniques of control, communication, representation, simulation, analysis, decision-making, memory, vision, writing, and interaction, our analysis can't be complete until we consider this software layer. By being the very first center of its kind, The UCSD Software Studies Initiative has the unique opportunity to shape how this software layer will be understood and studied by other universities, programs, and centers in years to come."
"Social scientists, philosophers, cultural critics, and media and new media theorists now seem to cover all aspects of the IT revolution, creating a number of new disciplines such as cyber culture, Internet studies, new media theory, and digital culture. Yet the underlying engine that drives most of these subjects – software – has received little or no direct attention. Software is still invisible to most academics, artists, and cultural professionals interested in IT and its cultural and social effects. But if we continue to limit critical discussions to the notions of “cyber,” “digital,” “new media,” or “Internet,” we are in danger of always dealing only with effects rather than causes; the output that appears on a computer screen rather than the programs and social cultures that produce these outputs. This is why we are convinced that “software studies” is necessary and we welcome you to join us in our projects and activities....“software studies” translates into two complementary research paradigms. On the one hand, we want to study software and cyberinfrastructure using approaches from humanities, cultural criticism, and social sciences. On the other hand, we want to bring software-based research methods and cutting-edge cyberinfrastructrure tools and resources or the study of the new domain where they have not being applied so far – large sets of cultural data."
Pictures from the Software Studies Initiative website & culturevis' Flickr photostream:
Cultural Analytics Research Environment + HiPerWall
Interactive exploration of an image collection on a HIPerSpace tiled display
Legend of Zelda Map Visualization
Data Exploration on the HiPerWall
Posted by
Lynn Marentette
Anne Helmond's Presentation
This post was updated and moved:
http://interactivemultimediatechnology.blogspot.com/2009/11/emerging-field-of-software-studies-anne.html
Posted by
Lynn Marentette
Nov 25, 2009
"Throw Your Data into Different Environments": UC San Diego's NexCAVE: High Definition Virtual Reality for Interactive Visualization
What is the NexCAVE? It is an array of LCD panels that provides a projector-free visualization display, enabling the visualization of massive datasets in great detail and at high speed. It was created at Calit2's Virtulab, under the direction of Research Scientist Tom DeFanti. A bonus of this system is that it is much less costly than traditional projection-based CAVE VR systems.
"NexCAVE exploration of Jordan archaeological excavation site. Speaker: Tom Levy, Professor, UCSD and Associate Director, CISA3."-YouTube description
NexCAVE Demo 3 of Wind Patterns, with 3D sound and HD monitors.
NexCAVE Display of 3D Model of Calit2 at UC San Diego
RELATED
JVC Introduces the NexCAVE System
"JVC’s Professional Products division is proud to announce today that the California Institute for Telecommunications and Information Technology has developed a new immersive visualization system they call the NexCAVE. This device uses nine GD-463D10U 3D HD monitors to give the user the feeling that they are in the environment. All of these displays feature a 46” diagonal screen, full HD (1920 x 1080) resolution, and 2000:1 contrast ratio. The NexCAVE is created by the same developers who created the CAVE system, which uses 3D projectors to turn a room into a 3D environment. The use of monitors instead of projectors allows for a more compact system that can also be portable for traveling purposes. Unfortunately their isn’t any word when the NexCAVE will be released at this time." -HDTV Review 11/24/09 (Via ITVT)
University of California's Calit2 Develops Immersive 3D Visualization System Using JVC Monitors -Tracy Swedlow, InteractiveTV Today 11/24/09
"Calit2 research scientist, Tom DeFanti, and his partner, Dan Sandin, began designing visualization systems over 35 years ago when they co-founded the Electronic Visualization Laboratory at the University of Illinois at Chicago. According to DeFanti, back in 1991 the pair conceived of the original CAVE system using projectors to reconstruct a 3D surround environment. According to JVC, early projector-based virtual reality (VR) systems were generally limited by two major problems: resolution was "fair at best," due to limitations in computer processing power and projector technology; and the systems required a very large dedicated space--since users could block images projected by front projectors, rear projectors, which required sufficient throw distance, were necessary."

-Tiffany Fox, UC San Diego News Center 8/17/09
-Doug Ramsey, UC San Diego News Center 9/25/09
I like the pictures. In addition to use for visualization, this system might be a great platform for interactive multimedia art installations!
Posted by
Lynn Marentette
352 Media Group: Creating a Microsoft Deep Zoom Silverlight Wall: Great idea, could use some optimization for touch or IWB interaction
Ever since I explored the Hard Rock Cafe Memorabilia website on my HP TouchSmart PC, I've been on the lookout for other great touch-friendly applications created with Microsoft's Deep Zoom and Silverlight. Today, I came across an example that holds some promise, although it needs some tweaking before it is truly touch-ready.
352 Media Group is a web development firm that has been experimenting with Microsoft's Deep Zoom in Silverlight. The results can be seen on the 352 Media Group Deep Zoom Page. On this page, you can interact with the deep zoom wall. You might need to install a Silverlight plug-in on your browser. Scroll down and read the "How Did We Do It?" section for specifics.
Note: I tried this in three browsers on my HP TouchSmart PC: Google Chrome, Internet Explorer, and Firefox. At the top of the viewing box, it says, "Click inside to zoom in". Clicking the picture or touching my touch screen did not activate the zoom. However, I was able to zoom in on the wall by scrolling with my mouse.
If you touch the picture with your finger, you can move it around, and you can do this with your mouse as well. At the upper left-hand corner of the frame, there are tiny icons that will allow you to zoom in or out. If the icons were just a little bit larger, with just a little bit more space between them, it would be easy to activate the zoom feature with my finger.
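For reference, the Windows touch guidelines linked below recommend touch targets on the order of 9 mm square (check the guidelines for the exact figures; the numbers here are approximate). Converting a physical target size into pixels for a given screen density is simple arithmetic:

```python
def mm_to_px(mm, dpi=96):
    """Convert a physical size in millimeters to pixels at a given DPI.
    There are 25.4 millimeters per inch."""
    return round(mm / 25.4 * dpi)

# A roughly 9 mm touch target on a standard 96 DPI desktop screen:
print(mm_to_px(9))        # 34
# The same physical target on a denser 133 DPI screen needs more pixels:
print(mm_to_px(9, 133))   # 47
```

The zoom icons on the Deep Zoom wall are well under this size, which is likely why they are so hard to hit with a finger.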
RELATED
Windows User Experience Interaction Guidelines: Touch
Windows 7 Touch
Microsoft Silverlight Deep Zoom
Information from Microsoft Live Labs about Silverlight Deep Zoom
Autostitch
This was the program used to help stitch together the pictures on the wall into a format that could be used with Deep Zoom.
Cross posted on The World Is My Interface blog.
Posted by
Lynn Marentette
Nov 24, 2009
Morning Tech News: Razorfone: Windows 7 & WPF Retail Demo on a Multi-touch Screen, via I Started Something
Razorfone Interactive Retail Experience from Razorfish - Emerging Experiences on Vimeo.
This demo was created by the Emerging Experiences team at Razorfish. Here's the video description from Vimeo:
"Customers are being faced with increasingly complex buying decisions, especially when it comes to technology and services. As a result, increased pressure is being placed on store associates to provide knowledgeable service to customers. Our Emerging Experiences team used this opportunity to develop a solution to demonstrate how an immersive interactive experience can assist customers and store associates with complex buying decisions in a retail setting."
Comment: We've graduated from 2D multi-touch manipulation of photos via pan-zoom-rotate-resize-drag to 3D multi-touch manipulation of "objects".
So? I'm expecting much more.
There is much room for creative growth in this area!
RELATED
Emerging Experiences Blog
I Started Something Blog
Posted by
Lynn Marentette
Nov 23, 2009
GestureTek & Sprint's Interactive Wall: 3D depth-sensing allows wall interaction with a cell phone.
I missed this one! The video and photos below are of the Sprint Center Interactive Wall, powered by GestureTek's 3D depth-sensing system. The media art was created by Takashi Kawashima, a designer/media artist who lives in San Francisco. He has an MFA in Design | Media Arts from UCLA.
The interactive display can be controlled by a cell phone.
YouTube description/plug:
"GestureTek's 3D depth sensing technology powers an attention-grabbing interactive digital signage system for telecom leader Sprint. The 3D depth sensing interactive display screen, with mobile phone connectivity, tracks people's body movements, and responds by sending a Sprint promotional message that follows them the entire length of the interactive billboard. The interactive motion-detecting advertising message invites users to create their own personalized interactive wall art on Sprint's gesture control screen, by calling Sprint on their mobile phone. GestureTek's 3D tracker is the heart of the system. Installation lead: Mission Electronics. Creative: Goodby Silverstein."
The Instant DJ application looks fun! It allows you to mix the music tracks on the large display with your phone.

Phone Painter: Sprint Center Interactive Wall

Instant DJ

Now Widget

RELATED
Sprint Uses GestureTek 3D Tracking & Control System for New Interactive Digital Signage Campaign
GestureTek Announces 3D Gesture Tracking Initiatives for Sprint and Hitachi; Shares New 3D Patent Information
SOMEWHAT RELATED
GestureFX: Next Generation Pediatrics Business Case (interactive floor for a pediatric clinic's waiting room)
AirPoint Hand-Tracking Unit for Mouse Replacement and "Point to Control" Interactivity
Cross posted on The World Is My Interface blog
Posted by
Lynn Marentette
Morning Tech News: LED "Tattoos"; Sixth Sense Wearable Displays
Since I am usually crunched for time, I thought I'd try posting "morning tech news" on this blog in a brief format, and return to the topic later - hopefully later in the day or at the most, within the week.
If you are familiar with this blog, what I consider "news" is sometimes just new to me. It might be something that crossed my path a while ago that I never posted, or something that I missed. It doesn't even have to be "news" if it is unique, catches my fancy, or strikes me as an important innovation that should be followed and shared.
Today's news comes from Wired, which linked to an article in MIT's Technology Review, "Implantable Silicon-Silk Electronics: Biodegradable circuits could enable better neural interfaces and LED tattoos," written by Katherine Bourzac.
"By building thin, flexible silicon electronics on silk substrates, researchers have made electronics that almost completely dissolve inside the body. So far the research group has demonstrated arrays of transistors made on thin films of silk. While electronics must usually be encased to protect them from the body, these electronics don't need protection, and the silk means the electronics conform to biological tissue. The silk melts away over time and the thin silicon circuits left behind don't cause irritation because they are just nanometers thick."
RELATED
WIRED's Gadget Lab: The Illustrated Man: How LED Tattoos Could Make Your Skin a Screen, Charlie Sorrel 11/20/09
"The silk substrate onto which the chips are mounted eventually dissolves away inside the body, leaving just the electronics behind. The silicon chips are around the length of a small grain of rice — about 1 millimeter, and just 250 nanometers thick. The sheet of silk will keep them in place, molding to the shape of the skin when saline solution is added.
These displays could be hooked up to any kind of electronic device, also inside the body. Medical uses are being explored, from blood-sugar sensors that show their readouts on the skin itself to neurodevices that tie into the body's nervous system — hooking chips to particular nerves to control a prosthetic hand, for example."
Tattoo You: Silicon LEDs can act as photonic tattoos that can show blood sugar readings - Surfdaddy Orca, h+ Magazine 11/17/09
"Brian Litt, associate professor of neurology and bioengineering at the University of Pennsylvania, is working with researchers from Beckman Institute at the University of Illinois and Tufts University to develop medical applications for the new transistors. Their silk-silicon LEDs can act as photonic tattoos that can show blood-sugar readings, as well as arrays of conformable electrodes that might interface with the nervous system."
Litt Lab : Translational NeuroEngineering
(Brian Litt's lab.)
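As a thought experiment on how a "photonic tattoo" readout might be driven, here is a toy mapping from a blood-glucose reading to the number of LED segments lit. The thresholds and segment count are invented for illustration only; this is not medical logic from any of the research groups above:

```python
def leds_lit(glucose_mg_dl, num_leds=5, low=70, high=180):
    """Toy driver for an LED-bar readout: map a blood-glucose
    reading (mg/dL) onto 0..num_leds lit segments. Thresholds are
    invented for illustration, not medical guidance."""
    if glucose_mg_dl <= low:
        return 0
    if glucose_mg_dl >= high:
        return num_leds
    # Linearly interpolate between the low and high thresholds.
    fraction = (glucose_mg_dl - low) / (high - low)
    return round(fraction * num_leds)

reading = leds_lit(100)  # a reading in the normal range lights few segments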
SOMEWHAT RELATED
I've been thinking about flexible touch-screen applications, and it never occurred to me that the concept might be something that would transfer to human skin! Here are a few of my posts related to this topic:
Last night I dreamt about haptic touch-screen overlays...
Rhizome 2009: A Lovely Interactive Multi-touch App on a Flexible Lycra Screen
Impress: A cool flexible interface project by Silke Hilsing
More about this "somewhat related topic" to come:
Latest SixthSense demo features paper "laptop" camera gestures
Nilay Patel, Engadget 11/18/09
Adding a "SixthSense" to your Cellphone
Vikas Bajaj, Bits, New York Times 11/6/09
Pattie Maes TED Talk: Sixth Sense- Mobile Wearable Interface and Gesture Interaction (for the price of a cell phone!) - my post from 3/2009
Posted by
Lynn Marentette
Nov 21, 2009
Want to make some multi-touch? Try PyMT- Python Multitouch. Featured in Make. (via Sharath Patali)
Sharath Patali, a member of the NUI-Group, has been working with Python Multitouch, otherwise known as PyMT, to create multi-touch applications. He shared a link to a recent post in Make, featuring PyMT. Sharath is the author of the UI Addict blog, and is currently doing his internship at NUITEQ (Natural User Interface Technologies).
I've been told that the beauty of PyMT is that it makes it "easy" to create multi-touch prototype applications using very few lines of code, which is great for trying out different ideas in a short period of time. It helps if you already know Python!
PyMT - A post-WIMP Multi-Touch UI Toolkit from Thomas Hansen on Vimeo.
"PyMT is a python module for developing multi-touch enabled media rich applications. Currently the aim is to allow for quick and easy interaction design and rapid prototype development. PyMT is written in Python, based on pyglet toolkit."
PyMT Programming Guide
PyMT Website
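Under the hood, multi-touch toolkits like PyMT turn pairs of moving touch points into pinch and rotate gestures. The snippet below is a generic sketch of that standard two-finger math, not PyMT's actual source code:

```python
import math

def pinch_rotate(p1_old, p2_old, p1_new, p2_new):
    """Derive the scale factor and rotation angle (degrees) implied
    by two touch points moving from old to new positions: the core
    math behind two-finger pinch/rotate gestures."""
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    scale = dist(p1_new, p2_new) / dist(p1_old, p2_old)
    rotation = math.degrees(angle(p1_new, p2_new) - angle(p1_old, p2_old))
    return scale, rotation

# Fingers spread from 100 px to 200 px apart: 2x zoom, no rotation.
scale, rot = pinch_rotate((0, 0), (100, 0), (0, 0), (200, 0))

# One finger sweeps a quarter turn around the other: 90-degree rotation.
scale2, rot2 = pinch_rotate((0, 0), (100, 0), (0, 0), (0, 100))
```

Fed touch positions frame by frame, this same calculation drives the familiar photo zoom-and-rotate interaction; a toolkit like PyMT wraps it in widgets so you rarely write it yourself.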
Note:
Christopher, author of The Space Station blog, is a member of the NUI-Group, and is building his own multi-touch table running his PyMT-based applications. Christopher is a student in Koblenz, Germany, studying computational visualistics, known as information visualization in the US.
"Image Reveal" application for the SMART Table, by Vectorform.
The SMART Table from SMART Technologies now features Image Reveal, an application created by Vectorform that supports multi-touch, multi-user collaborative learning activities for children. Image Reveal is the first third-party application published for the SMART Table, and is available for free from the SMART website.
"Vectorform was eager to collaborate with SMART to create an early learning application for the SMART Table, which it feels is a groundbreaking technology product. Image Reveal enables young users to collaborate and answer a series of multiple choice questions in a chosen subject area. Each correct answer uncovers part of a hidden image until it is fully visible. Alternatively, students can guess what the hidden image is at any time to win the game. Using the SMART Table Toolkit, teachers can customize content, including subject area, hidden image, questions and answers, and use images to tailor questions and answers for pre-literate learners." -SMART Tech Press Release
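The game mechanic the press release describes (each correct answer uncovers part of the hidden image until it is fully visible) can be modeled in a few lines. This is a guess at the underlying logic, not Vectorform's code:

```python
class ImageReveal:
    """Toy model of the Image Reveal mechanic: the hidden image is
    split into tiles, and each correct answer uncovers one more."""

    def __init__(self, num_tiles):
        self.num_tiles = num_tiles
        self.revealed = 0

    def answer(self, correct):
        # Only correct answers reveal a tile; wrong answers do nothing.
        if correct and self.revealed < self.num_tiles:
            self.revealed += 1

    def fully_visible(self):
        return self.revealed == self.num_tiles

game = ImageReveal(num_tiles=9)
for is_correct in [True, True, False, True]:
    game.answer(is_correct)
# Three correct answers so far: 3 of 9 tiles uncovered.
```

The teacher-facing customization the press release mentions (subject area, hidden image, questions and answers) would simply be data fed into a structure like this.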
SMART Table Introductory Video:
It is good to see that SMART Technologies is providing new applications for the SMART Table. There is much room for growth in this field. However, the applications still have the look and feel of electronic workbooks, with a few interactive media bells and whistles tossed in to ensure that the system appeals to young learners. I wonder if the application supports teaching the skills children need to work together successfully, such as turn-taking, negotiating with other children in a group situation, or settling differences of opinion.
Classrooms in elementary schools now contain a growing number of students who have autism spectrum disorders, as well as other disabilities that interfere with social interaction. For this reason, it would be important to learn if SMART Table applications follow the guidelines for Universal Design for Learning (UDL).
RELATED
Cross-posted in Tech Psych
Video: DROID & Interactive Display in Times Square; Droid Voice-activated Search
The video below shows people in NYC's Times Square using their Verizon Droid phones to interact with the Verizon Wireless digital signage billboard:
The Droid offers a voice-activated search feature. Users can ask a question, and the search engine, powered by Google, will provide the search results from the web or from items stored on the phone. One feature I like is that it provides turn-by-turn directions from Google Maps, as well as other helpful geographic information. This would be a great tool for city dwellers and visitors alike.
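Turn-by-turn directions ultimately rest on basic geographic math. As an illustration (a textbook formula, not Google's code), the haversine formula gives the great-circle distance between two coordinates, the kind of calculation a navigation app performs constantly:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Times Square to the Empire State Building, roughly one kilometer:
d = haversine_km(40.7580, -73.9855, 40.7484, -73.9857)
```

Real routing adds street-network graphs and traffic data on top, but distance calculations like this one sit underneath every "in 200 meters, turn left."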
The video below is a demonstration of how the Google Maps Navigation feature works on Android-based phones:
RELATED
Verizon, Motorola Unveil the Droid
Marguerite Reardon 10/29/09 CNN Tech
Verizon Droid Gets New Google Innovation: Real-Time Internet-Linked Navigation
Michael Hickins, BNET
Announcing Google Maps Navigation for Android 2.0
Google Mobile Blog
DROID by Motorola Fact Sheet
Cross posted-The World Is My Interface
Nov 19, 2009
Become a Facebook fashionista with interactive augmented reality in Tobi's virtual dressing room.
Tobi, an online shopping website, has a virtual dressing room with hundreds of dresses waiting to be tried on. Take a snapshot, share it on Facebook, and the process becomes a form of social fashionista networking.
The video below explains it all:
I'm not sure if the Tobi website will be offering a virtual dressing room for men.
Link from David Tan, ImmersiveTech
Posted by
Lynn Marentette
Multi-touch & Gesture Interaction News: NUITEQ's Snowflake Suite 1.7 compatible with Windows 7 and 3M Touch Systems, N-trig and Lumio
"This video demonstrates the N-trig DuoSense true multi-touch solution utilizing up to four fingers. The video features various multi-touch enabled applications, including how to pan and rotate using up to four fingers on Google Earth, a demonstration of how to play various onscreen musical instruments using the Snowflake Suite Music application, and a new hands-on way to play Sudoku. The Corel Paint it!™ application shows how existing images can be transformed using multi-touch, and a 3D desktop organizer application from BumpTop demonstrates new and innovative ways in which to organize your desktop using up to four fingers" -avitaintrig's YouTube description
Snowflake Suite and NextWindow Plugin Information
NUITEQ in the media
3M Touch Systems
N-trig
Lumio
Bumptop
NextWindow
(Snowflake Suite 1.7 works on NextWindow screens.)
I'll post more news and information about the natural interface/interaction biz very soon!
Posted by
Lynn Marentette
Nov 18, 2009
The Social and Technological Innovations in Social Media: Video of recent panel presentations moderated by Henry Jenkins at the USC Annenberg School of Communication and Journalism
Times are changing faster than we can change the buzzwords that convey this change. Social Media. Spreadability. Immersive Journalism.
Henry Jenkins, the Provost's Professor of Communication, Journalism, and Cinematic Arts at the University of Southern California, recently moderated a panel on the topic of social and technological innovations in social media. If you are in the mood for reflection, the videos below of the panel presentations are worth a look. Topics covered include on-line social networks, 3D virtual worlds, immersive journalism, social computing research, "stickiness moving to spreadable", and more.
If you are in a rush, the following article provides an overview of the panel discussions, along with key quotes from the various participants:
Annenberg panels explore "Social Media: Platform or Provocation for Innovation?"
Lara Levin, Student Writer, USC Annenberg News 11/16/09
Session 1- Video Social Media: Platform or Provocation for Innovation?
Session 2 - Implications of Social Media for Business, Learning and Institutional Development
Description from the USC Annenberg YouTube Channel:
Nov. 5, 2009: "Implications of Social Media for Business, Learning and Institutional Development"
"As part of the week-long visit by and dialogue with Annenberg Innovator in Residence Dr. Irving Wladawsky-Berger, Dean Ernest J. Wilson III hosts a half-day conference titled "Social Media: Platform or Provocation for Innovation?" In this panel, "Implications of Social Media for Business, Learning and Institutional Development" experts from USC Annenberg and IBM will explore recent innovations and future trends in the social media space as well as industry responses to these developments. The rate of innovation in social media has been staggering in recent years. The result is a substantially different media landscape than one confronted by media organizations even five years ago. The conversation will focus on both the demands of the new media marketplace and the barriers that organizations are likely to face in attempting to meet these demands. In addition to Wladawsky-Berger, panelists include USC Annenberg faculty members Henry Jenkins, Jonathan Taplin, Dmitri Williams, Marc Cooper, executive in residence David Westphal and research fellow Nonny de la Peña. They will be joined by IBM's Steve Canepa, general manager for media and entertainment; Julia Grace, software engineer; and Melissa Cefkin, ethnographer and research scientist."
Posted by
Lynn Marentette
iStethoscope: Use Your iPhone, Listen to Your Heart - via Peter J. Bentley, the Undercover Scientist : Health Care Reformers, Listen to This!
I moved the contents of this post to another page:
iStethoscope: Use Your iPhone, Listen to Your Heart
Posted by
Lynn Marentette
Nov 17, 2009
TECH NEWS: HP TouchSmart DevZone; HP Interactive Solutions ISV Partner Program; Info from NextWindow
The following information is from this HP press release:
HP Announces HP TouchSmart Software Development
"HP today announced the HP TouchSmart software development programs that allow software developers to create consumer and commercial applications for HP TouchSmart PCs and touch-enabled digital signage displays. The programs are designed to significantly increase the utility of TouchSmart products for consumers and businesses."
“HP TouchSmart development programs allow developers to uncover new market opportunities while users can discover a whole new world of possibilities on their touch products,” said James Taylor, director, Experience Marketing, Personal Systems Group, HP. “HP’s unique multitouch user interface, combined with native applications, provide the most advanced software platform for touch-enabled PCs and digital signage.”
HP connects with developers at TouchSmart DevZone
"Developers looking to create applications for consumer TouchSmart PCs can visit the TouchSmart DevZone at www.touchsmartdevzone.com to download the HP TouchSmart software development kit free of charge. The kit includes a complete set of code samples, documentation and application programming interfaces that make creating applications for HP TouchSmart products easy and fast"
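HP's SDK documentation on the DevZone is the authoritative reference for its APIs. Purely as an illustration of the kind of plumbing such a kit provides, here is a minimal, hypothetical touch-event dispatcher; every name in it is invented for the example and is not HP's API:

```python
class TouchDispatcher:
    """Minimal, hypothetical touch-event dispatcher: an illustration
    of the plumbing a touch SDK typically provides, not HP's API."""

    def __init__(self):
        self.handlers = {}  # event name -> list of callbacks

    def on(self, event, callback):
        """Register a callback for a named touch event."""
        self.handlers.setdefault(event, []).append(callback)

    def fire(self, event, x, y):
        """Deliver a touch event at screen position (x, y) to every
        registered handler, in registration order."""
        for callback in self.handlers.get(event, []):
            callback(x, y)

log = []
d = TouchDispatcher()
d.on("touch_down", lambda x, y: log.append(("down", x, y)))
d.fire("touch_down", 120, 340)
```

A real SDK layers hit-testing, multi-touch point IDs, and gesture recognition on top of a dispatch loop like this one.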
HP Interactive Solutions ISV Partner Program Overview
"The HP Interactive Solutions ISV (Independent Software Vendor) Partner Program allows ISVs to register with HP to access technical resources and support for building tailor-made business solutions for business TouchSmart PCs and touch-enabled and non-touch digital signage displays. In partnering with ISVs, HP provides its business customers more choices and provides a more complete solution for their needs."
Note:
I have an HP TouchSmart - one of the reasons I bought it is that the touch-screen component was made by NextWindow. NextWindow was the company responsible for the large touch-screen display I used for a couple of projects when I was taking HCI and Ubicomp classes during the first part of 2007. It had the best resolution and touch response of all of the displays I could get my hands on at the time.
NextWindow screens can be found in new computers and all-in-ones, including the Dell Studio One 19, the Dell SX2210T, the HP TouchSmart 300, 600, & 9100, the Sony L Series, the Medion X9613, NEC's ValueStar W All-in-One, and the Lenovo A70z All-in-One PC. More information about NextWindow can be found on the company's press release page.


-Photos from NextWindow
Posted by
Lynn Marentette
Subscribe to:
Posts (Atom)
