Showing posts with label multi-user. Show all posts

Nov 21, 2009

Want to make some multi-touch? Try PyMT - Python Multitouch. Featured in Make. (via Sharath Patali)

Sharath Patali, a member of the NUI-Group, has been working with Python Multitouch, otherwise known as PyMT, to create multi-touch applications.  He shared a link to a recent post in Make, featuring PyMT.  Sharath is the author of the UI Addict blog, and is currently doing his internship at NUITEQ (Natural User Interface Technologies).

I've been told that the beauty of PyMT is that it makes it "easy" to create multi-touch prototype applications using very few lines of code, which is great for trying out different ideas in a short period of time.  It helps if you already know Python!


PyMT - A post-WIMP Multi-Touch UI Toolkit from Thomas Hansen on Vimeo.

"PyMT is a python module for developing multi-touch enabled media rich applications. Currently the aim is to allow for quick and easy interaction design and rapid prototype development. PyMT is written in Python, based on pyglet toolkit."


PyMT Programming Guide


PyMT Website

Note: 
Christopher, author of The Space Station blog, is a member of the NUI-Group, and is building his own multi-touch table running his PyMT-based applications. Christopher is a student in Koblenz, Germany, studying computational visualistics, known as information visualization in the US.

Oct 21, 2009

The WSN-Bar: Ambient Intelligence + Wireless Sensor Network + Interactive Touch Technology + Art



Graduate students at the Center for Art and Technology at the Taipei National University of the Arts created an interactive touch installation, called the WSN-BAR, based on the concept of ambient intelligence and utilizing a wireless sensor network and vision-based tracking technologies. The video below demonstrates two modules, the Garden of Light and Vivacious Bushes.

According to information from the WSN-Bar website, the installation detects changes in the brightness of the environment, the temperature, the CO2 density of the outdoor air, and the movement of people within a building. These technologies work in harmony to support the artistic focus of the WSN-BAR. It provides a means of looking at environmental factors, and the relationships between humans and nature, in an innovative way.
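Just to illustrate the idea of ambient intelligence driving visuals, here is a purely hypothetical Python sketch of how sensor readings like these might be folded into a single visual parameter. The ranges and the "bloom" mapping are invented for the example; this is not the WSN-Bar's actual code.

```python
def normalize(value, low, high):
    """Clamp a raw sensor reading into [0, 1] over its expected range."""
    return max(0.0, min(1.0, (value - low) / (high - low)))

def bloom_intensity(brightness_lux, temperature_c, co2_ppm):
    # Brighter, warmer, lower-CO2 conditions -> more "blooming" in a
    # garden-of-light style scene (mapping invented for illustration).
    light = normalize(brightness_lux, 0, 1000)
    warmth = normalize(temperature_c, 10, 35)
    air = 1.0 - normalize(co2_ppm, 350, 1000)
    return round((light + warmth + air) / 3, 2)

print(bloom_intensity(800, 25, 400))
```

The point is the pattern, not the numbers: each sensor stream is normalized, then blended into one parameter the visuals can respond to continuously.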

WSN-Bar


The WSN-Bar was created by Jiun-Shian Lin, SuChuHsu, and Ying-Chung Chen. The artwork was by Chiung-Fang Tsao, Chia-Wen Chen, Yu-Hsiung Huang and Yi-Wei Chia.  I'm not sure who created the relaxing ambient background music in the video.


 
Interactive WSN-Bar


Wireless Sensor Networks: a building block for Mass Creativity and Learning (pdf)
(To appear in the Proceedings of ACM Creativity & Cognition 2009 - Understanding the Creative Conversation)

Thanks to Kevin O'Mahony for the link!

RELATED
ACM Creativity & Cognition 2009
Everyday Creativity: Shared Languages & Collective Action
October 27-30, 2009
Berkeley Art Museum & UC Berkeley
 
KEYNOTE SPEAKERS
Mihály Csíkszentmihályi Professor of Psychology & Management
Claremont Graduate University [California, USA]

JoAnn Kuchera-Morin Director, Allosphere Research Laboratory
Nanosystems Institute [California, USA]


Jane Prophet Professor of Interdisciplinary Computing
Goldsmiths University of London [London, UK]

Oct 15, 2009

BEN: Breakable Experimental Network Simulation on a Multi-touch Wall (RENCI)

BEN, or Breakable Experimental Network, is a mix of network visualization and multi-touch technology on a very large screen. The project aims to provide better network monitoring and management tools in a user-friendly, intuitive, and efficient manner. It also looks like a great tool for learning about network engineering.

Hands-on networking!

BEN is part of the GENI project.

The music in the video is by Crystal Castles and Black Moth Super Rainbow.

Cross-posted: The World Is My Interface

Jul 29, 2009

Update on Multi-Touch, Multi-User NuMTyPYsics

A few months ago, I wrote a post about multi-touch "NuMTyPYsics". Here is an update:

Below is a video that shows how to use two single-touch Nokia Internet tablets to emulate multi-touch input for NuMTyPYsics, a drawing game based on Numpty Physics:



Here's the description from Perlinet's YouTube channel:
"Using two single-touch tablets and a laptop running NuMTyPYsics, it's possible to emulate multi-touch input (max. 2 cursors - one from each tablet). This allows for precise input control and testing multi-touch applications. More info: http://thpinfo.com/2009/mt/"

The game is also based on Crayon Physics, using the Box2D engine.

Jul 16, 2009

Convergence: Mobile content, Interactive Displays, Digital Out of Home, & Danoo

Danoo: "Location scouting and scalable web technologies combined to create a targeted DOOH experience".

If you are tech-savvy and work in the digital signage/interactive display/AV end of things, you probably know what this means. If not, you should. Things are pairing, merging, and converging in the world of "out and about" technology.

Technology, especially digital signage, is ubiquitous, and has targeted you and your neighborhood.


So what's the fuss? Danoo, a company from China, is ramping things up in the Digital Signage/DOOH (Digital Out of Home) arena.
Fast.

Danoo has rapidly teamed up with existing companies, such as National CineMedia, Blue Bite, Electronic Arts, and the HISTORY channel, to create and push off-the-desktop and off-the-TV content and interactive user experiences. Bill Gerba of Wirespring, Manolo Almagro, CTO of Show + Tell in NYC (think Times Square digital signage experiences), and others have recently written about this phenomenon, right after Danoo released the following press announcement:

Danoo Inc Announces a Breakthrough in Mobile Content Delivery (7/11/09, Business Wire)
"Danoo partnered with mobile technology company Blue Bite to enable the campaigns, and is actively rolling out its mobile interactivity capability to all of its venues. The first 200 Danoo locations will be live in Los Angeles and New York by July 1st. At full deployment, Danoo will give advertisers the ability to get their content into the hands of more than 200,000 consumers in an average two-week campaign. In addition to content downloads, Danoo offers multiple ways to pair its screens with mobile interactivity to maximize campaign effectiveness, such as SMS call-to-actions, social gaming and mobile application promotion."

"Visitors to Danoo locations viewed video content on Danoo’s digital screens accompanied by an on-screen prompt to download exclusive content such as sneak peeks and ringtones from their Bluetooth or Wi-Fi-enabled devices via the Danoo network."



Here are a few related articles:
How Significant is the Danoo-IdeaCast-National CineMedia Deal?
(7/14/09, Bill Gerba, Wirespring)
A Watershed Moment for DOOH Media
(7/12/09, Bill Collins, Daily DOOH)

Pictures from Danoo's Website:

Danoo's IdeaCast
- "Captive TV".

I've probably encountered Danoo's technologies numerous times, judging from my archive of pictures of displays and related technologies that have crossed my path. The TV screen on the treadmill looks just like the one I saw on a treadmill in the fitness center of a Princess cruise ship last year. The picture in the middle looks like it was taken in an airport. The picture on the right is of a system in a coffeehouse. According to information on the Danoo website, the system is interactive and includes "content downloads, social gaming, mobile couponing and more via SMS, Bluetooth and Wi-Fi".


Blue Bite's website is worth looking at. It demonstrates their concept very well through animation:

Blue Bite: "Take the Ad With You"



Things are ramping up in my region, too.
T1 Tapas, a restaurant north of Charlotte, N.C., in Birkdale Village in Huntersville, has majority owners with a technology background. Mike Feldman and Jim Morris started up Digital Optics Corporation, which focused on optics for computers and imaging, and after they sold that company, they teamed up with Denise Feldman to establish T1 Visions.

T1 Connection Booth with Multi-User Touch Screen, HD TV, Sound System, Computer, & more:

"T1 Connection Booth seating gives you access to music, photos, and videos through built-in touchscreen tabletops, brilliant monitors, speakers, and computers." -Picture and taken from the T 1 Tapas website

The restaurant serves as a test bed for T1 Visions to try out their software and hardware designed to enhance the digitally connected dining experience. The restaurant was featured on CNN in May of this year:

Here is the link to the video: "High Tech Tapas"

Jul 4, 2009

Haptic Tactile Feedback for the iPhone? MacRumors says, "YES!"

According to a recent MacRumors.com article written by Arnold Kim, Apple's future versions of the iPhone may offer haptic tactile feedback:

"Multi-touch display screen with localized tactile feedback"

"Apple proposes including a grid of piezoelectronic actuators that can be activated on command. By fluctuating the frequency of these actuators, the user will "feel" different surfaces as their finger moves across it. As an example, a display could include a virtual click wheel which vibrates at a different frequency as the center. Users could easily sense the difference and use the click wheel without having to look at it."

Arnold links to another patent application that would enable the use of fingerprint signatures to control and interact with a device. Apple will have all devices cornered, from what I can tell...

According to the description, "The present invention can be employed in any electronic device or system that permit user control, including any portable, mobile, hand-held, or miniature consumer electronic device. Illustrative electronic devices or systems can include, but are not limited to, music players, video players, still image players, game players, other media players, music recorders, video recorders, cameras, other media recorders, radios, medical equipment, calculators, cellular phones, other wireless communication devices, personal digital assistances, programmable remote controls, pagers, laptop computers, printers, computer mice, other computer accessories, cars or portions thereof, or any combination thereof. Miniature electronic devices may have a form factor that is smaller than that of hand-held devices. Illustrative miniature electronic devices can include, but are not limited to, watches, rings, necklaces, belts, accessories for belts, headsets, accessories for shoes, virtual reality devices, other wearable electronics, accessories for sporting equipment, accessories for fitness equipment, key chains, or combinations thereof."

There is a provision for multi-user interaction:

"Furthermore, while the processes described above illustratively register, detect, and respond to the fingerprints of one user per device, a device of the present invention can be configured to register, detect, and respond to the fingerprints of multiple users."


Arnold links to a third patent application:

Touch Screen RFID Tag Reader


Comment:
The US Patent & Trademark Office's Patent Application Full-Text and Image Database is a great place to explore if you have lots of time on your hands!

(Cross-posted on the Technology-Supported Human-World Interaction blog).

May 15, 2009

iPod Touch Apps, WiiMote Whiteboards, 3D multi-user environments in education, and a teacher's video of the SMARTTable in action.

I thought I'd share the last two posts from my TechPsych blog here, since they focus on newer technologies that involve multi-touch or multi-user interaction.

A teacher explores the multi-touch, multi-user SMARTtable in his classroom

From what I can see, multi-touch, multi-user applications are ideal for students to learn collaborative, cooperative social skills at the same time they learn academic skills. Smart Technologies, well-known in the education world for interactive whiteboards, has unleashed a few tables, known as SMART Tables, in classrooms. One teacher, Tom Barrett, is sharing his journey with technology, including the SMART Table, online via his blog, SPACE FOR ME TO EXPLORE.

The following is a video of young children doing math on a multi-touch SMART Table. In order to solve the finger-arithmetic problems, the students must work cooperatively.


Addition App - Set to multi-touch finger counts from Tom Barrett on Vimeo.

(In the video, you will see some shapes that Tom mistakenly added, so disregard them as you view the video.)


Here is a quote from Tom's blog about his experience with the addition application:

"I was most pleased with the level of engagement from the children and although on the surface this seems to be a simple application, it definitely requires a level of teamwork that you often do not get.

It is intriguing watching the children’s first attempts and how they realise they need to work together. As the challenge is small scale, once they have been successful they begin to refine their approach, communicate better and so get to later answers quicker."
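The core logic of an activity like this could be surprisingly small. Here is a toy Python sketch (hypothetical, not SMART's code) where each simultaneous touch point counts as one finger, and the group succeeds when the number of touches on the table matches the target sum.

```python
def check_answer(touch_points, a, b):
    """touch_points: list of (x, y) contacts currently on the table.
    The addition problem a + b is solved when the number of fingers
    down equals the sum."""
    return len(touch_points) == a + b

# Problem: 3 + 4. Two students put down 3 and 4 fingers together.
touches = [(10, 20), (15, 25), (20, 30),
           (70, 20), (75, 25), (80, 30), (85, 35)]
print(check_answer(touches, 3, 4))  # True
```

Since no single child has seven fingers on one hand, the check itself quietly enforces the teamwork Tom describes.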


Educational iPod Touch Apps for Students and Teachers: Eric Sailers' blog
Eric Sailers is a speech and language pathologist and assistive technology specialist who explores new technologies that he's found useful in schools. Below is Eric's demonstration of applications such as "I Write Words", Wikipanion, Preschool Adventure, Twitterific, Google Mobile, and the calendar.

To demonstrate the iPod Touch, Eric uses an Elmo document camera that projects onto a screen. Note that as Eric demonstrates the Twitterific application, he navigates to a link to a blog of one of his colleagues, which highlights the way one school is using the Wii as an augmentative communication tool and also as an assessment tool for occupational therapy.



Take some time to explore Eric's Speech-Language Pathology Sharing blog. It is full of great information!

Update: Here are two video clips Eric created to prepare for an interview as a finalist for the Cox Communication Innovation in Special Education award. In one of the videos, Eric discusses the EduSim application, a 3D multi-user virtual world platform and authoring toolkit intended for classroom interactive whiteboards.

Interactive Applications for Special Education: Wiimote Whiteboards and iPod Touch in Special Education, Part I


Wiimote Whiteboards and iPod Touch in Special Education, Part II

Mar 24, 2009

Struktable Multi-touch Installation at TOCA ME Design Conference






Struktable Multitouch Installation from Gregor Hofbauer on Vimeo.


Strukt is a design studio in Vienna, Austria, that specializes in interactive and generative design for a variety of purposes, such as interactive environments and installations, ambient intelligent environments, games, and multi-touch tables, screens, and walls. The video is a demonstration of applications that were presented at the March 2009 TOCA ME Design Conference in Munich, Germany. The applications were developed using vvvv. (More information regarding vvvv can be found at the end of this post.)


MT Table 01

INFO FOR THE TECH-SAVVY OR TECH-CURIOUS:

According to information from the vvvv website, vvvv is a "toolkit for real time video synthesis. It is designed to facilitate the handling of large media environments with physical interfaces, real-time motion graphics, audio and video that can interact with many users simultaneously. vvvv is a visual programming interface. Therefore it provides a graphical programming language for easy prototyping and development. vvvv is real time, where many other languages have distinct modes for building and running programs, vvvv only has one mode, run-time. vvvv is free for non-commercial use."

VVVV Screenshots

VVVV's Propaganda Page
Other projects using VVVV
Struktable: the 70-inch Multitouch Table

STRUK ON A SPHERE: Interactive installation at a Mercedes Benz conference

Feb 18, 2009

Ready for the SMARTTable?

The Smart Table is now available for purchase!



Here is the plug:

"The world's first multitouch, multiuser table for primary education - the SMART Table - is now available for purchase.Order the SMART Table"

"As a collaborative learning center, the SMART Table enables engaging and motivating small-group learning experiences. Up to eight students can use their fingers intuitively to sweep, slide and spin objects on the interactive screen. The SMART Table's ready-made activities help primary students gain and further their skills in areas like counting and reading."

"The SMART Table also makes an ideal complement to whole-class activities on the SMART Board interactive whiteboard. It helps reinforce concepts in a small-group setting and ensures students can participate in interactive and creative learning experiences."

(Cross-posted on the TechPsych and Technology-Supported Human-World Interaction blogs.)

Dec 7, 2008

Demo of Duke University's multi-touch wall at RENCI, running the Cobalt Metaverse Browser

The video below shows the "pre-alpha" version of the Cobalt Metaverse Browser:



"This video shows the Cobalt metaverse browser being tested on a 13-foot by 5-foot multi-touch visualization wall equipped with six high-definition projectors located at the Renaissance Computing Institute engagement center at Duke University. The input drivers are being developed by Dr. Xunlei Wu so that users can directly manipulate high-resolution data using both hands and multiple fingers for a more natural and intuitive data exploration experience. In the video, Dr. Wu is using both gesture and touch to navigate through, and rearrange content between, two Cobalt virtual worlds."

Related:
The Open Cobalt Project (on ning)
Cobalt Website & link to download the latest pre-alpha build
Cobalt Community,
Cobalt Google Group
EduSim (A 3D multi-user virtual world platform and authoring toolkit for K-12 interactive whiteboards. The latest version is powered by Cobalt)

Cobalt can import objects from the Google 3D warehouse as well as Google Sketch-up:
Video Tutorial: Using Google 3D Warehouse to build Cobalt & Edusim Virtual Worlds

People:
Julian Lombardi, Duke University
Xunlei Wu, Senior Visualization Researcher, RENCI, Duke University
Rich White, EduSim; Greenbush Education Service Center, Girard, KS

Kids using Cobalt-based EduSim on desktop computers, via Rich White:

Nov 13, 2008

RENCI at Duke University: Multi-Touch Collaborative Wall and Table utilizing TouchLib; More about UNC-C's Viz lab...

RENCI is a multi-disciplinary collaboration between several universities in North Carolina, with centers located at the Europa Center, Duke University, N.C. State, UNC Chapel Hill, East Carolina University, UNC-Asheville, UNC-Charlotte, and the Health Sciences Library at UNC-Chapel Hill. Many of the centers focus on visualization and collaborative technologies, and have been involved in multi-touch "surface" computing.

The pictures below are from the RENCI center at Duke University:


Duke Multi-Touch Collaborative Wall

The multi-touch wall is 13 x 5 feet, utilizes six high-definition projectors for a combined resolution of 5760 x 2160, and supports multiple users. According to information on the RENCI website, the design is scalable and applicable to non-flat surfaces. The wall system runs on Windows and Linux.
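The numbers line up neatly if you assume the six projectors form a 3-wide by 2-high grid of standard 1920 x 1080 tiles (RENCI's page gives only the totals, so the grid layout is my assumption):

```python
def tiled_resolution(tile_w, tile_h, cols, rows):
    """Combined resolution of a grid of identical projector tiles,
    ignoring any edge blending overlap between tiles."""
    return tile_w * cols, tile_h * rows

# Assumed layout: six 1920x1080 projectors in a 3 x 2 grid.
print(tiled_resolution(1920, 1080, 3, 2))  # (5760, 2160)
```

In practice tiled walls often overlap projector edges for blending, which would lower the effective total slightly, but the arithmetic shows where the headline figure comes from.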

Duke Multitouch Wall. (Photo credit: Josh Coyle)


The Wall is positioned at the end of the primary collaboration space. (Photo credit: Josh Coyle)


DI, or Direct Illumination, is used for touch detection in both the wall and the table. A separate instance of Touchlib runs for each of the 8 cameras used to detect touch, with each camera handled separately for image processing and blob tracking. A gesture engine then interprets the information about touches on the screen as gesture events.
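As a rough sketch of what merging eight per-camera Touchlib instances might involve (my assumption about the pipeline, not RENCI's code), each camera reports blobs in its own normalized frame, and a merger offsets them into shared wall coordinates based on which tile of the wall that camera watches:

```python
WALL_W, WALL_H = 13.0, 5.0   # wall size in feet
COLS, ROWS = 4, 2            # assumed 4 x 2 camera grid (8 cameras)

def to_wall_coords(cam_index, u, v):
    """Map (u, v) in [0, 1] within one camera's view to feet on the wall.
    Cameras are numbered left-to-right, top row first."""
    col, row = cam_index % COLS, cam_index // COLS
    tile_w, tile_h = WALL_W / COLS, WALL_H / ROWS
    return (col + u) * tile_w, (row + v) * tile_h

print(to_wall_coords(0, 0.5, 0.5))  # center of the first camera's tile
print(to_wall_coords(7, 1.0, 1.0))  # far corner of the wall
```

A real system would also need per-camera calibration (lens distortion, overlap between views) and de-duplication of blobs seen by two adjacent cameras, which is what makes a unified gesture engine over eight trackers genuinely hard.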

Direct Illumination (DI)

Graphics from the RENCI Vis Group Multi-Touch Blog

The Duke Multi-Touch Wall System

Here is a cool picture of the "Multi-touch Calibration Device", which uses a built-in TouchLib utility.

Calibrating using the utility built into TouchLib.

Additional information can be found on the RENCI Vis Group Multi-Touch Blog.

FYI

Touchlib is a multi-touch development kit that can be found on the NUI-Group website.

"Touchlib is a library for creating multi-touch interaction surfaces. It handles tracking blobs of infrared light, and sends your programs these multi-touch events, such as 'finger down', 'finger moved', and 'finger released'. It includes a configuration app and a few demos to get you started, and will interace with most types of webcams and video capture devices. It currently works only under Windows but efforts are being made to port it to other platforms."

If you are interested in creating your own multi-touch table, the NUI-Group website and forums are a great place to start.

Related:

If you follow my blog, you probably know that I've taken several graduate courses at UNC-Charlotte. Some of my professors and a classmate or two have been involved in some exciting visualization research over the past year. (If you are serious about multi-touch and other visually-based applications, it is worth taking some time to familiarize yourself with visualization and interaction research.)

News from the UNC-Charlotte Vis Center:

At the University of North Carolina at Charlotte, RENCI is a collaboration between the UNC Charlotte Urban Institute, the Center for Applied Geographic Information Science, and the Charlotte Visualization Center.

11/06/2008
Robert Kosara's group wins two awards at IEEE VisWeek: Caroline Ziemkiewicz and Robert Kosara won Honorable Mention (the second-highest award) at the IEEE InfoVis Conference for their paper, “The Shaping of Information by Visual Metaphors”. Also, Alex Godwin, Kosara's student, won Best Poster for his submission, “Visual Data Mining of Unevenly-Spaced Event Sequences”.

The Vis Center is pretty fascinating, as you can see by the group of visitors at an open house.

If you are just as fascinated by this stuff as the guys in the picture, here are links to some recent papers by UNC-Charlotte faculty affiliated with the Vis Center:

The Shaping of Information by Visual Metaphors (Caroline Ziemkiewicz and Robert Kosara)

Evaluating the Relationship Between User Interaction and Financial Visual Analysis (Don Hyun Jeong, Wenwen Dou, Felsia Stukes, William Ribarsky, Heather Richter Lipford, Remco Chang)

Visual Analytics for Complex Concepts Using a Human Cognition Model (Tera Marie Green, William Ribarsky, and Brian Fisher)

Aug 16, 2008

Microsoft Research project: MouseMischief - Multi-user, Multi-Mice Interaction on Large Displays

This is an interesting demonstration of the use of multiple mice, controlled by children on an interactive whiteboard. The collaborative application uses Microsoft's Multi-Point technology. For more information and free downloads, go to MouseMischief.org.


Mar 17, 2008

Look, touch, listen, and play: Seth Sandler's Interactive Audio Touch Table Video; NUI Group and Google's Summer of Code

Seth Sandler's most recent video of the Audio Touch interactive table provides a good demonstration of how multi-touch on a table can work with music.



Seth is a member of the NUI group (Natural User Interface). He is finishing a Bachelors degree in Interdisciplinary Computing and the Arts, with an emphasis on Music, at the University of California, San Diego. His research and development work centers around multi-touch, multi-user musical interfaces.

Here is an update about the NUI group:

"Natural User Interface or ~ NUI Group is an interactive media group researching and creating open source machine sensing techniques to benefit artistic and educational applications."

"We offer a collaborative environment for developers that are interested in learning and sharing new HCI (Human Computer Interaction) methods and concepts. This may include topics such as: voice/handwriting/gesture recognition, touch computing, computer vision, and information visualization."

The NUI group has been selected as a mentoring organization for Google's Summer of Code, for those of you who are interested in working on open-source code for multi-touch systems. The student application process begins Monday, March 24, 2008, and ends Monday, March 31, 2008.

The NUI group's project ideas page outlines the requirements for the application, which include a 7500-word project proposal. The project page has a long list of ideas to spark some thinking for potential Summer of Code applicants.

For those of you who aren't into coding, I encourage you to take a look at the NUI Group's project ideas page just to get an idea of the interesting ideas that are being explored. The page has a list of links to other good resources.

Spread the word to anyone who might be interested in the NUI Group's projects for the Summer of Code. We need to get more people interested in STEM careers, and the project ideas outlined by the NUI group look enticing.