Showing posts with label interaction. Show all posts

Sep 26, 2009

More Multi-touch and Gesture-based Natural User Interfaces: Bamboo Wacom Tablet; Multi-touch PresTop Kiosk and Snowflake Suite software

Wacom Tablets Get Multi-Touch, Gestures
(Charlie Sorrel, Wired, 9/24/09)
"For the tech-curious, the new tablets have 512 pressure levels in the pen tip and the active area of the tablet is 5.8 x 3.6 inches, and all lose the in-pack mouse (for obvious reasons). The Touch and the Pen models are both $70, and the Pen & Touch is $100. Also, if you were thinking of buying Photoshop Elements 7 for the same price, get a tablet instead — Elements comes in the box."




http://www.wired.com/images_blogs/gadgetlab/2009/09/cth460k_3-660x371.jpg

Official Wacom Video

"Bamboo Touch is new type of computer input device by Wacom that lets you navigate and perform commands like zoom, scroll, rotate and more with a series of simple finger taps and hand gestures. Bamboo Touch brings Multi-Touch capability to your Mac or PC"

Video from a Wacom user:

A nice alternative to a mouse.  I'm going to get one for my laptop!


Multi-touch Kiosks!
Press release:  Dutch touchscreen supplier PresTop partners with Natural User Interface (NUITEQ)
 
http://prestop.nl/images/gallery/products/st_UU_zuil_wit.png
http://prestop.nl/images/gallery/products/st_DSC02106.png

RELATED

I couldn't find any video clips of PresTop's multi-touch interaction. From what I can tell, PresTop's multi-touch screens will be using the SnowFlake Suite from Natural User Interface Technologies AB.

How-to: SnowFlake Suite Flash multi-touch Interactable component (NUIversity)

Without a single line of code, you can do quite a bit with Snowflake Suite.

"This video covers how to make a rotatable and scalable image. The beauty about this is, that we have developed a Flash mouse input simulator, so that there is no need for multi-touch hardware in order to develop your applications. Simply simulate multiple mouse inputs for multi-touch.This project is still in alpha phase and a download will become available with the next release of Snowflake Suite 1.7 for the NextWindow platform and camera based multi-touch solutions."


Below is a video of single-touch interaction for PresTop, from Omnivision:


PresTop offers interactive hardware and software solutions that can be used indoors as well as in outdoor environments.

Sep 22, 2009

Impress: A cool flexible interface project by Silke Hilsing

Impress, a cool flexible interface project by  Silke Hilsing:


impress - flexible display from Silke Hilsing on Vimeo.

From Silke Hilsing's website:

"Impress is the deliverance of the touch screen from its technical stiffness, coldness and rigidity. It breaks the distance in the relationship of human and technology, because it is not any longer the user which is subjected to technology, but in this case the display itself has to cave in to the human. Impress is a chance of approach of user and technology, above all, from technology.


It is a matter of a flexible display consisting of foam and force sensors which is deformable and feels pleasantly soft. Impress works with the parameters position and time like other touch screens as well, but in addition to that, it reacts, above all, on the intensity of pressure."

The application was created with Arduino and Processing.


http://www.silkehilsing.de/impress/blog/wp-content/uploads/2009/03/final_4.jpg



Thanks to Richard Van Tol for the link!

Sep 19, 2009

The World Is My Interface

THE WORLD IS MY INTERFACE is the new title of my Technology-Supported Human-World Interaction blog. The name was just too long!

The first post written under the new title has a variety of links about ubiquitous computing, off-the-desktop interaction design, use of technology in public spaces, and so forth.  There are a few links to scholarly research and related projects, too.

Sep 16, 2009

The Touch Research Project, re/Touch, and Near Field Communication Touch Interaction




http://farm3.static.flickr.com/2360/1616057288_fa6c86a991.jpg
The Touch Project is based in the Interaction Design department of the Oslo School of Architecture and Design in Norway. "Touch is a research project that investigates Near Field Communication (NFC), a technology that enables connections between mobile phones and physical things. We are developing applications and services that enable people to interact with everyday objects and situations through their mobile devices. Touch consists of an inter-disciplinary team involved in social and cultural enquiry, interaction/industrial design, rapid prototyping, software, testing and exhibitions."

Nearness Explores Interaction Without Touching

Nearness from timo on Vimeo.

iPhone RFID: Object-based media

iPhone RFID: object-based media from timo on Vimeo.

reTouch Info Sheet (pdf)

reTouch is part of the Touch project, and it "brings together hundreds of cross-cultural examples of social norms and values involving touch—all categorised according to actions related to touching. Using quotes from ethnographic accounts written between the late 1800s and the present, re/touch encourages designers and researchers to explore how touch is used by people to relate to one another and the worlds in which we live. Browse re/touch to create design briefs, refine interaction scenarios, devise game play, or otherwise think, make and do things touch-related." -reTouch web info, Anne Galloway.


One of the members of the research team is Anne Galloway, a social researcher and the author of the  purselipsquarejaw blog, which she recently resurrected after taking a year off from blogging.

Anne also contributes to the Space and Culture journal. I've followed Anne's writing for a while. Over the course of her Ph.D. studies, she has thought deeply about the intersections of technology, space, and culture, including cross-cultural meanings of touch.

RELATED
Inspiring Touch Related Interaction Design

Original Design Thinking Approach for Researching RFID
-Nicolas Nova, Pasta and Vinegar


Note: Nicolas Nova's Pasta and Vinegar blog is worth a look if you are interested in design, UX, emerging technologies, and pervasive/ubiquitous computing.


About Nicolas and his blog:
"User Experience researcher at LIFTlab. My work is about studying how people use various technologies and turn them into insights, ideas, prototypes or recommendations to inform design and foresight.This blog is a selection of the material that I collect, especially in fields such as mobility, urban environments, digital entertainment and new interfaces. I am also part of the near future laboratory."

Sample of Pasta and Vinegar Posts:
(Touch) Interaction Vocabulary
A Graphic Language for Touch-based Interactions

Sep 12, 2009

Exploring high-quality, low-cost, and open-source eye-gaze and face-tracking technologies

Update:  More information about this topic is posted on the TechPsych blog.

I'm working with a few students with multiple special needs who need assistive communication technologies that are not expensive. Although my first idea was to look at eye-gaze technology, the possibility of face tracking is also appealing. Take a look at the following video from Seeing Machines, a commercial company:



I have a couple of webcams, and I don't mind a trip to RadioShack. I can program a bit, too, but I have very limited time. I welcome any advice you might have.
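For anyone tinkering along the same lines, here is a minimal sketch, in Python, of one small piece of the puzzle: smoothing the jittery (x, y) coordinates a webcam-based face or eye tracker produces before using them to drive a pointer. This is purely illustrative; the tracker itself (e.g., an OpenCV face detector) is assumed, not shown, and all names here are invented.

```python
# Hypothetical sketch: smooth noisy (x, y) positions from a webcam
# face/eye tracker before using them to move an on-screen pointer.

def make_smoother(alpha=0.3):
    """Return a function that applies an exponential moving average.

    alpha near 1.0 follows the tracker quickly but keeps the jitter;
    alpha near 0.0 is steadier but lags behind head movement.
    """
    state = {"pos": None}

    def smooth(x, y):
        if state["pos"] is None:
            state["pos"] = (x, y)      # first sample passes through
        else:
            px, py = state["pos"]
            state["pos"] = (px + alpha * (x - px), py + alpha * (y - py))
        return state["pos"]

    return smooth

smooth = make_smoother(alpha=0.5)
print(smooth(100, 100))  # (100, 100)
print(smooth(110, 100))  # halfway toward the new reading: (105.0, 100.0)
```

The trade-off in `alpha` is the whole design question: assistive pointing needs a steady cursor, so a lower value is usually worth the extra lag.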

Sep 7, 2009

Stantum's Mobile Phone Multi-touch Interface: Demonstration of Precise Interactions on a Resistive Touch Screen



For more information, read Paul Miller's article (Engadget), Stantum's mind-blowing multitouch interface on video!

http://www.blogcdn.com/www.engadget.com/media/2009/02/stantum-top-001.jpg

Paul Miller says:

"The PMatrix multitouch firmware allows for unlimited inputs, detection of any contacting object (a finger, a stylus or even a paintbrush) and pressure sensitivity. We played with the demo unit for a bit and were frankly blown away, it's far and away the best touch experience we've ever seen or felt, and the multitouch functionality is just gravy on top. Stantum is targeting resistive touchscreens because they're still considerably cheaper to build than capacitive ones, and from our perspective there seems to be zero tradeoff -- for sensitivity and accuracy this destroys everything else we've seen on the market, capacitive or not."

How it works:  Tech Specs from Stantum:

PMatrix: The unparalleled Multi-Touch IP core and detection firmware (pdf)


"PMatrix™ is the most powerful software engine for Multi-Touch technology. It runs as a chip firmware or software on controllers orany embedded processing devices. PMatrix scans and delivers an exact representation of what’s happening on the touch-panel. The Multi-Touch driver dynamically updates the cursor list, enabling any application to control the Graphical User Interface objects."

Stantum's rationale for choosing resistive rather than capacitive touch technology:

Why resistive?
In comparison to capacitive sensing, resistive technology brings:

• Versatility: it detects fingers, gloves, styluses and any other contact object
• A lower solution cost
• Lower power consumption
• A faster scanning rate
• Lower overall latency
• Linear precision (the same in the center and at the borders)
• Performance unaffected by the size of the touch-panel or the number of touches
• Transparency and soft touch, enabling a user experience equivalent to the best capacitive touch-panels

I'll update this post when I have more information!


RELATED

For in-depth, detailed information about resistive multi-touch technology, read the following posts via Touch User Interface:

How Stantum's Resistive Multi-Touch Screen Works
Pressure Sensitive Multi-touch Surfaces


Aug 24, 2009

Multi-touch on Firefox: Felipe Gomes's Work

Felipe Gomes is a 4th-year computer engineering student at the University of Sao Paulo, Brazil. He's been working on a multi-touch project that enables the Firefox web browser to respond to multi-touch interaction and gestures. Below is a demonstration of this work:

Multitouch on Firefox from Felipe on Vimeo.




Code snippets for the demo can be found on the Mozilla Wiki.
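As a rough illustration of the geometry behind two-finger pinch and rotate gestures like those in the demo, here is a hypothetical Python sketch (not Felipe's actual code, which is browser-side JavaScript): given where two fingers start and end, compute how much the content should zoom and turn.

```python
import math

def pinch_and_rotate(start, end):
    """start, end: ((x1, y1), (x2, y2)) pairs of touch points.

    Returns (scale, degrees): the zoom factor and rotation implied by
    the two fingers moving from the start frame to the end frame.
    """
    def dist(pair):
        (x1, y1), (x2, y2) = pair
        return math.hypot(x2 - x1, y2 - y1)

    def angle(pair):
        (x1, y1), (x2, y2) = pair
        return math.atan2(y2 - y1, x2 - x1)

    scale = dist(end) / dist(start)          # finger spread ratio
    degrees = math.degrees(angle(end) - angle(start))
    return scale, degrees

# Fingers spread from 100 px apart to 200 px apart, no rotation:
print(pinch_and_rotate(((0, 0), (100, 0)), ((0, 0), (200, 0))))  # (2.0, 0.0)
```

A real gesture recognizer runs this incrementally on every touch-move event, but the distance-ratio and angle-difference idea is the same.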

Aug 17, 2009

Microsoft's Project Natal & "Fun" Natural User Interfaces for the Office


This looks great!

I came across the picture below today, and discovered more about it in a CNET article:

"Gesture-recognition technology, such as that seen in Project Natal, has changed the face of gaming, but Microsoft's Craig Mundie (pictured) believes it will also transform the office." (Ina Fried, CNET)


This approach looks like it incorporates visual thinking and conceptualization in a nice and fluid way.

RELATED:

"Microsoft: Future desktop PC's will transform the office" (Ina Fried, CNET UK, 7/31/09).


According to Fried, author of the linked CNET article, Craig Mundie demonstrated the set-up in the above picture at the Microsoft Research Faculty Summit this past July. The demonstration "...included hologram-like videoconferencing, a virtual digital assistant, and multiple surface computers, along with voice-, touch- and gesture-recognition technology. The desk in the demo was a multitouch surface computer, and the office's walls were also a display that could easily switch from being a virtual window or collection of digital photos to a corkboard of sticky notes or various workspaces. In one case, Mundie also used Project Natal-like depth cameras to put himself in the middle of an architectural demo, essentially putting himself inside a building that was not yet built."

Craig Mundie of Microsoft on the Future of Software: Digital Assistants, Natural User Interfaces, and Room Computing (Gregory T. Huang, Xconomy Seattle, 7/13/09)

The author of the above article was also at the Microsoft Research Faculty Summit and touched on the concept of natural user interfaces:

“All the things we talk about as natural user interfaces have been largely used one at a time as enhancements to [graphical user interfaces],” Mundie said. Gesture recognition, expressive responses, immersive 3-D virtual environments, and understanding of context—these advances in computing algorithms will lead to software that is “better at anticipating what you might want.”

Xbox's Project Natal Revamped for Offices? (J. Nicholas Hoover, 7/30/09)

When I find a video of the Project Natal/Surface/Natural User Interface Office in action, I'll be sure to post it!

Why this is important:

Natural user interfaces have the potential to support more efficient and effective collaboration among people in the workplace, and the visual display of information may also support better decision-making and problem solving.

Jul 27, 2009

RENCI at UNC-Charlotte has a Multi-Touch Table in the Visualization Center!

uncc_setup_1
uncc_setup_4
The Urbanization Explorer Touch Table at UNC-Charlotte
(Pictures from the RENCI blog)

I wish this multi-touch table had been around when I last took a class at UNC-C! With help from the RENCI Europa team, a table that was once only a dream was finally built.

The following information was posted on the RENCI Vis Group Multi-Touch Blog:

"Jason Coposky and Warren Ginn from RENCI Europa delivered UNC-Charlotte’s Multi-touch Table to the
Charlotte Visualization Center last week. Dubbed the Urbanization Explorer Touch Table, the device’s first role will be to display the Urban Growth Model, developed by the Center for Applied Geographic Information Science (CAGIS) and UNC-Charlotte’s Urban Institute. By accessing historical patterns of growth in the region, this application will provide forecasts on how much growth is expected to take place based on these historical patterns. Using satellite imagery for the 24-county region around Mecklenburg, for four time periods: 1976, 1985, 1996 and 2006, the Urban Growth Model tracks the advance of impervious surfaces, a key indicator of development, in expansion across the area since 1976, and estimates the extent of urbanization through 2030. With interfaces developed by collaborators at the Charlotte Visualization Center, multiple users will be able to select areas of interest, zoom, pan, and navigate the colorful, large-format maps using only their fingertips and on-screen digital tools."

"First introduced at North Carolina State University’s Institute for Emerging Issues annual forum this past Februrary, this multi-touch table represents the next leap in performance in touch tracking. As opposed to the previous Direct Illumination (DI) technique employed in the original table, this table employs Diffused Surface Illumination (DSI). By employing a sheet of Cyro Acrylite EndLighten with polished edges and LED Edge-View Ribbon Flex from Environmental Lights, we’ve been able to distribute the IR illumination more evenly."

uncc_envir_lights
"Infrared LEDs on a trip from Environmental Lights is applied to the inside perimeter of the frame where the polished Endlighten acrylic sheet will be installed."

I want one to try out my touch-screen experiments!

RELATED
RENCI Displays the Urbanization Explorer Touch Table

At the 0:48 mark, you can see the RENCI Charlotte Urbanization Explorer Tool:

RENCI - Emerging Issues Forum 2009, Raleigh, NC in HD from Renaissance Computing Institute on Vimeo.

UNC-Charlotte RENCI website

Hierarchical multi-touch selection techniques for collaborative geospatial analysis (pdf)
Thomas Butkiewicz, Dong Hyun Jeong, Zachary Wartell, William Ribarsky, and Remco Chang
University of North Carolina at Charlotte

Jul 4, 2009

Haptic Tactile Feedback for the iPhone? MacRumors says, "YES!"

According to a recent MacRumors.com article written by Arnold Kim, Apple's future versions of the iPhone may offer haptic tactile feedback:

"Multi-touch display screen with localized tactile feedback"

"Apple proposes including a grid of piezoelectronic actuators that can be activated on command. By fluctuating the frequency of these actuators, the user will "feel" different surfaces as their finger moves across it. As an example, a display could include a virtual click wheel which vibrates at a different frequency as the center. Users could easily sense the difference and use the click wheel without having to look at it."

Arnold links to another patent application that would enable the use of fingerprint signatures to control and interact with a device. Apple will have all devices cornered, from what I can tell...

According to the description, "The present invention can be employed in any electronic device or system that permit user control, including any portable, mobile, hand-held, or miniature consumer electronic device. Illustrative electronic devices or systems can include, but are not limited to, music players, video players, still image players, game players, other media players, music recorders, video recorders, cameras, other media recorders, radios, medical equipment, calculators, cellular phones, other wireless communication devices, personal digital assistances, programmable remote controls, pagers, laptop computers, printers, computer mice, other computer accessories, cars or portions thereof, or any combination thereof. Miniature electronic devices may have a form factor that is smaller than that of hand-held devices. Illustrative miniature electronic devices can include, but are not limited to, watches, rings, necklaces, belts, accessories for belts, headsets, accessories for shoes, virtual reality devices, other wearable electronics, accessories for sporting equipment, accessories for fitness equipment, key chains, or combinations thereof."

There is a provision for multi-user interaction:

"Furthermore, while the processes described above illustratively register, detect, and respond to the fingerprints of one user per device, a device of the present invention can be configured to register, detect, and respond to the fingerprints of multiple users."


Arnold links to a third patent application:

Touch Screen RFID Tag Reader


Comment:
The US Patent & Trademark Office's Patent Application Full-Text and Image Database is a great place to explore if you have lots of time on your hands!

(Cross-posted on the Technology-Supported Human-World Interaction blog).

TEI '10: Fourth International Conference on Tangible, Embedded and Embodied Interaction

I am really excited about TEI '10, which will be held in Cambridge, MA on January 25-27, 2010. If you are not familiar with tangible and embedded computing/interaction/interfaces, it is worth taking some time exploring this field.

Hiroshi Ishii, of the MIT Media Lab, is one of the TEI '10 conference chairs, and is known as the father of tangible computing. His scholarly publications have influenced the direction of human-computer interaction research, and as a consequence, the direction of emerging technologies.

Two of Ishii's articles, in my opinion, are "must-reads":

Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms
pdf (CHI 1997)

Tangible Bits: Beyond Pixels
pdf (TEI 2008)

Additional Information:
"TEI, the conference on tangible, embedded, and embodied interaction, is about HCI, design, interactive art, user experience, tools and technologies, with a strong focus on how computing can bridge atoms and bits into cohesive interactive systems."

You can follow TEI '10 on Twitter.

Here is more from the TEI '10 website:

About TEI

"TEI'10 is the fourth international conference dedicated to presenting the latest results in tangible, embedded, and embodied interaction."

"The work presented at TEI addresses HCI issues, design, interactive art, user experience, tools and technologies, with a strong focus on how computing can finally bridge atoms and bits into cohesive interactive systems. The intimate size of this single-track conference provides a unique forum for exchanging ideas and presenting innovative work through talks, interactive exhibits, demos, hands-on studios, posters, art installations and performances"

Jun 6, 2009

Information about Touch Screens, Multi-Touch, & Gesture Interaction is Spreading

Since the news about Windows 7's multi-touch capabilities has spread, I haven't had enough time to keep up with all of the information related to multi-touch interaction. Fortunately, a few bloggers out there are doing a great job filling in the gaps.

The Touch User Interface blog has a wealth of information in the form of pictures, video clips, slides, and links that I'd like to share.

The following slideshow/videos were highlighted in the Touch User Interface blog post, "Touch UI: HCI Viewpoint":

Untold Stories of Touch, Gesture, & NUI

Joe Fletcher, Design Manager, Microsoft Surface

Touch and Gesture Computing, What You Haven't Heard
Dan Saffer



Other posts of interest on the Touch User Interface blog:
Touch screens and vision impairment
Link: Designing the Palm Pre: An Interview with Michelle Koh

Touch User Interface Overview

I've added some additional information about UX, interactive multimedia, multi-touch, and gesture interaction to my Multimedia and Interaction Resources page, which is a work in progress.


Mar 30, 2009

Softkinetic 3D Gesture Recognition for Games and Rehabilitative Play

Taking 3D interaction further, Softkinetic has developed middleware that uses a 3D camera to support full-body gesture interaction with games and other applications. No controllers or devices are needed!



The following video is narrated in Portuguese, I think, but you can understand the content in any language. If you love the Wii, you'll probably like this!


Here is a video that demonstrates how Softkinetic and Silverfit partnered to develop rehabilitative games for the elderly and others:


The following table is from the Silverfit website:
Game: Movement trained
• Puzzle: While sitting down, bend the whole body left and right, and stand up. Cognitive/visual component.
• Mole: Balance exercise by stepping with one leg while standing.
• Catching grapes: Walking movement left and right.
• Walking: Walking in place, while avoiding obstacles and thresholds. Activity of Daily Life (ADL) component.
• Arm exercise: Arm stretching and reaching in all directions with one or both arms. ADL component.
• Picking flowers: Walking backwards, forwards and sideways. Optionally, bending down.
• Memory: Arm stretching left, right, forwards and upwards. Cognitive component.

RELATED

Softkinetic and Silverfit Introduce Senior-Targeted Gaming

(Danny Cowan, Gamasutra, 12/19/08)

Softkinetic's Gesture-Based Interactive TV Action:

Interesting 3D Interactive Gesture Recognition System by Atracsys: BeMerlin Product Demonstration of Nespresso Products

Via Core 77:

The following video clip is an example of BeMerlin, an infrared gesture recognition system developed by Atracsys for demonstrating 3D representations of Nespresso's espresso machines.


The video was taken at the Home + Housewares 2009 show in Switzerland.

Mar 18, 2009

More for Multi-touch: NextWindow Plug-in for Natural User Interface's Snowflake Multi-touch Software, and More



Those of you who have an HP TouchSmart, a Dell Studio One PC, or a NextWindow display might be interested in the new NUI plug-in that supports NUI Suite Snowflake software. Here are the features of the plug-in, according to information from the Natural User Interface website:
  • Detailed user manual included with FAQ
  • Developed on fast and reliable C++ platform
  • Intuitive
  • Customizable
  • Gesture recognition library
  • TUIO/OSC (Open Sound Control) support (sending and receiving events)
  • Low level API
  • Hardware accelerated rendering
  • Support for wide variety of media types
  • Advanced window handler that supports scaling and rotation
  • Suitable for Windows® XP and Windows® Vista (Mac OSX and Linux can be developed on request)
  • Audio support
  • Single, dual support
  • Multi-threaded resource handler (For fast data visualization)

"NUI has partnered up with NextWindow™, an international leader in the development of optical multi-touch technology and the manufacturer of optical multi-touch screens, overlays and OEM touch components."

"NextWindow™'s integrated technology allows for natural and intuitive interaction of digital content on flat TFT, LCD and Plasma solutions."

"The NUI NextWindow™ plug-in can be used with any programming language that supports TUIO, i.e. C/C++/C#, Java, Flash, Python, VVVV etc, meaning that software developers can run their own applications on NextWindow™, utilizing the NUI NextWindow™ plug-in."

Comment:
I became a fan of NextWindow touch-screen displays in early 2007 when I worked on a couple of touch-screen projects in my HCI and Ubicomp classes at UNC-Charlotte.


I've been using my HP TouchSmart PC at work with students with disabilities. I'm experimenting with the NUI Suite SnowFlake on my TouchSmart, and found that interacting with the Particles application delighted students with severe autism. The activities provided opportunities to establish joint attention. I also noticed an increase in the number of vocalizations and/or verbalizations among the students. Of course, this was NOT a scientific study.

RELATED
Definition of Joint Attention from UConn:

"Joint Attention is the process of sharing one’s experience of observing an object or event, by following gaze or pointing gestures. It is critical for social development, language acquisition, cognitive development…"

http://eigsti.psy.uconn.edu/jt_attn.JPG


Establishing joint attention is an important step in the development of social interaction skills among young people who have autism spectrum disorders.

More about joint attention:

Joint Attention Study Has Implication for Understanding Autism
Science Daily, 9/29/07

Asperger-Advice: Joint Attention

Autism Games: Joint Attention and Reciprocity

Why is joint attention a pivotal skill in autism?
Tony Charman
Philos Trans R Soc Lond B Biol Sci. 2003 February 28; 358(1430): 315–324.
doi: 10.1098/rstb.2002.1199.

Mar 6, 2009

Interaction with the Web of Things; LIFT '09

I recently came across the Web of Things blog and found it to be a gem. Vlad Trifa and Dominique Guinard are the brains behind the blog. They have lots of interesting ideas that are at the intersection of WSN (Wireless Sensor Networks) and HCI (Human Computer Interaction).

Vlad and Dominique recently presented at LIFT '09. If you haven't heard of Lift, take some time and visit the website:

"LIFT is a series of events to inspire and connect the community of doers and thinkers exploring the social impact of new technologies. Each LIFT conference is a three days experience made of talks, workshops, interactive art and discussions to understand and anticipate the most important social changes, and meet the people behind them."


Everything on the Web of Things blog inspired me to write two posts, back to back, on my Technology-Supported Human-World Interaction blog:

More cool things from the Web of Things blog


The Web of Things, Wireless Sensor Networks, Embedded Systems, and (Everyware) Health Care
(The above post was inspired by my experience at the Cleveland Clinic, spending time every day with my father, who has been in the cardio ICU since his surgery a few days ago.)

Feb 27, 2009

Tangible User Interfaces Part I: Siftables

In 1997, the vision of tangible user interfaces, also known as TUIs, was outlined by Hiroshi Ishii and Brygg Ullmer of the Tangible Media Group at MIT, in their paper, "Tangible Bits: Towards Seamless Interfaces between People, Bits, and Atoms" (pdf). According to this vision, "the goal of Tangible Bits is to bridge the gaps between both cyberspace and the physical environment, as well as the foreground and background of human activities." This article is a must-read for anyone interested in "new" interactive technologies.

The pictures in the article of the metaDesk, transBoard, activeLENS, and ambientRoom, along with the references, make this seminal work worth at least a glance.


Another must-read is Hiroshi Ishii's 2008 article, Tangible Bits: Beyond Pixels (pdf). In this article, Ishii provides a good overview of TUI concepts as well as the contributions of his lab to the field since the first paper was written.

Related to the Tangible User Interface research is the work of the Fluid Interfaces Group at MIT. The Fluid Interfaces Group was formerly known as the Ambient Intelligence Group, and many of the group's projects incorporate concepts related to TUI and ambient intelligence. According to the Fluid Interfaces website, the goal of this research group is to "radically rethink the human-machine interactive experience. By designing interfaces that are more immersive, more intelligent, and more interactive we are changing the human-machine relationship and creating systems that are more responsive to people's needs and actions, and that become true "accessories" for expanding our minds."

The Siftables project is an example of how TUI and fluid interface (FI) interaction can be combined. Siftables is the work of David Merrill and Pattie Maes, in collaboration with Jeevan Kalanithi, and was brought to popular attention through David Merrill's recent TED talk:

David Merrill's TED Talk: Siftables - Making the digital physical
-Grasp Information Physically

"Siftables aims to enable people to interact with information and media in physical, natural ways that approach interactions with physical objects in our everyday lives. As an interaction platform, Siftables applies technology and methodology from wireless sensor networks to tangible user interfaces. Siftables are independent, compact devices with sensing, graphical display, and wireless communication capabilities. They can be physically manipulated as a group to interact with digital information and media. Siftables can be used to implement any number of gestural interaction languages and HCI applications....
Siftables can sense their neighbors, allowing applications to utilize topological arrangement
..No special sensing surface or cameras are needed."
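To illustrate the neighbor-sensing idea, here is a hypothetical Python sketch of the topological information a Siftables-style application receives: which tiles sit edge-to-edge. Real tiles detect this in hardware; the grid coordinates and tile names below are invented for the example.

```python
# Sketch: derive tile adjacency from grid positions, the kind of
# topological arrangement a Siftables-style app can react to.

def neighbors(tiles):
    """tiles: dict of tile name -> (col, row) grid cell.

    Returns the set of unordered pairs of tiles touching edge-to-edge
    (Manhattan distance of exactly 1).
    """
    pairs = set()
    for a, (ax, ay) in tiles.items():
        for b, (bx, by) in tiles.items():
            if a < b and abs(ax - bx) + abs(ay - by) == 1:
                pairs.add((a, b))
    return pairs

# Three tiles in an L shape: A and B side by side, C below A.
print(sorted(neighbors({"A": (0, 0), "B": (1, 0), "C": (0, 1)})))
# [('A', 'B'), ('A', 'C')]
```

An application like the music sequencer in the video can then treat each adjacency pair as a connection, e.g., chaining sound tiles left to right.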



Siftables Music Sequencer from Jeevan Kalanithi on Vimeo.

http://web.media.mit.edu/~dmerrill/images/music-against-wood-320x213.jpg

More about Siftables:

Rethinking display technology (Scott Kirsner, Boston Globe, 7/27/08)
TED: Siftable Computing Makes Data Physical
Siftables: Toward Sensor Network User Interfaces
(pdf)

It seems that people either really like the Siftables concept or they don't see the point. I found the following humorous critique of Siftables on YouTube:

"Imagine if all the little programs you had on your iphone were little separate chicklets in your pocket.
You'd lose em.
Your cat would eat em.
You'd vacuum them up.
They'd fall down in the sofa.
They'd be all over the car floor.
You'd throw them away by mistake..."


In my opinion, it is exciting to learn that some of this technology has the potential to become mainstream.

Jan 20, 2009

More Multi-touch Multimedia: Video demonstration of applications created with Snowflake and Flash



This video showcases the work of Natural User Interface AB, using NUI Suite 1.0 Snowflake and Flash.

Here is the plug from the company's website:
"Natural User Interface (NUI) is a Swedish innovative emerging technology company specializing in commercially available advanced multi-touch software, hardware and service solutions. NUI's solutions can convert an ordinary surface into an interactive, appealing and intelligent display that creates a stunning user experience."

For more information and links:

For Techies and the Tech Curious: Multi-touch/Gesture from the NUI-Group

Search this blog!

Jan 13, 2009

Interacting and Communicating with HP TouchSmart Notes: Photo, Video, Audio, and More

The HP TouchSmart Notes application can be used in a variety of interesting ways. I have an HP TouchSmart PC, and I like how easy it is to slide the notes about the screen using this multi-modal application. It allows the user to capture pictures, video, or audio to create notes, as well as traditional "stickies" that allow for typed or finger-written input. You can even draw or write over photos. It is an application that has many accessibility features. It also supports many of the principles of Universal Design for Learning (UDL).

The various notes can be put together in a variety of ways, allowing for video or photo storytelling and interesting ways of leaving messages, even a song or two.


I did some hunting and found more information about some non-verbal students with autism who are learning verbal skills through this technology at Hope Technology School in Palo Alto, California.

Video of student using the HP TouchSmart Notes application at Hope Technology School:







Here are a few "How-To" videos that demonstrate the TouchSmart Notes features and interactions:

HP TouchSmart Voice Notes


TouchSmart Text Notes


The pictures below are from the HP TouchSmart Notes Application Review, by Peter Redmer (11/14/08).



Text/Drawing Note Creation Panel


Icons for selecting input method for notes



Photo notes input panel:



You can draw on your photo-notes: