Showing posts with label kinect. Show all posts

Feb 6, 2011

Another close encounter with in-store digital display marketing at Best Buy, bad internet TV controllers, bowling with the Kinect, and more...

Not long ago I visited a Best Buy. While I was there, I wanted to play around with the Kinect, but the demo system hadn't arrived.  As I wandered around the store, I encountered quite a few digital displays, part of  Best Buy's recent in-store digital media marketing effort.  I later shared my experience in a blog post,  Close Encounter with "Best Buy On": Example of a multi-channel marketing approach using in-store digital media that includes an on-line magazine.  


During today's visit to Best Buy,  I noticed that there were more display centers in various departments in the store, and many of the displays had useful and informative content. The Kinect demo was up and running, too.


Within the store,  I noticed a strong emphasis on HDTVs with internet capabilities. I  was hoping that the new Internet HDTVs would come with user-friendly touch-screen controllers, or at least an app for use on touch-screen smartphones, iPads, or other touch-screen tablets. 

What I found was disappointing.  On display were traditional-looking multi-button remote controllers, controllers that looked like PC keyboards,  and of course, Sony's confusing multi-featured contribution to the Internet TV scene.


None of the controllers seemed easy to use, or capable of supporting web navigation and other web interactions on HDTVs from a distance, especially when the goal is to watch movies and video from a recliner in a darkened family room.


What sort of user-centered design or usability studies were in place during the hatching of Sony's Internet TV controller?!  (For more about bad remote controls, one of my pet peeves, see "Oh! No! Sony's 'Mother of Remote Controls' for Google TV, 74 Buttons and Counting")


Below is a partially annotated slideshow of pictures I took of some of the displays and other things I encountered at Best Buy.  Enjoy!



Close Encounter with Kinect Bowling
It wasn't easy trying to bowl and take video with a phone at the same time! 
(Please excuse the shaky video effects and the view of my fingers.)

Jan 21, 2011

MIT Media Lab's DepthJS: Now your web page can interact with the Microsoft Kinect using JavaScript (Link to code, more)

MIT Media Lab's DepthJS website

Info from the DepthJS website:
"Navigating the web is only one application of the framework we built - that is, we envision all sorts of applications that run in the browser, from games to specific utilities for specific sites. The great part is that now web developers who specialize in Javascript can work with the Kinect without having to learn any special languages or code. We believe this will allow a new set of interactions beyond what we first developed."


DepthJS is open source under the AGPL license. Code: https://github.com/doug/depthjs
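For the tech-curious: DepthJS pipes Kinect hand data into the browser, where plain JavaScript takes over. Here's a rough sketch of the kind of glue code a web developer might write on top of it. Note that the event name and payload fields in the comments are my guesses, not DepthJS's actual API - check the GitHub repo for the real thing.

```javascript
// Sketch only: DepthJS's real event names and payloads may differ; treat
// "handmove" and the {x, y, z} fields below as assumptions.

// Map a normalized hand position (0..1 from the depth camera) to
// viewport pixels, so a hand can drive an on-screen cursor.
function handToViewport(hand, width, height) {
  const clamp = (v) => Math.min(1, Math.max(0, v));
  return {
    x: Math.round(clamp(hand.x) * width),
    y: Math.round(clamp(hand.y) * height),
  };
}

// Classify a "push" (click-like) gesture from a sudden drop in hand depth.
function isPush(prevZ, currZ, threshold = 0.15) {
  return prevZ - currZ > threshold;
}

// In a browser page, this might be wired up roughly like:
//   window.addEventListener('handmove', (e) => {
//     const p = handToViewport(e.detail, innerWidth, innerHeight);
//     cursor.style.left = p.x + 'px';
//     cursor.style.top = p.y + 'px';
//   });
```

The nice part, as the DepthJS team says, is that none of this requires learning a new language - it's just event handling.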

RELATED
Gestures that your TV Will Understand (Features information about PrimeSense)
Tom Simonite, MIT Technology Review, 1/21/11


Hackers Take the Kinect to New Levels
Timothy Carmody, MIT Technology Review 12/2/10


Microsoft Kinect: How the device can respond to your voice and gestures
Erica Naone, MIT Technology Review, January/February 2011






Comment:
I went to Best Buy today and almost bought a Kinect.  Unfortunately, the demo hadn't come in yet, so I decided to wait until I could give it a try.  I'm curious to experiment with what it can do. 

Dec 12, 2010

LM3LAB's Useful Map of Interactive Gesture-Based Technologies: Tracking fingers, bodies, faces, images, movement, motion, gestures - and more

Nicolas Loeillot, of LM3LABS, has been ahead of the natural user interaction/interface game for many years as his company has expanded. He's done quite a bit of deep thinking about the work of his company, and has used this wisdom to create a nice concept map that describes how LM3LABS' solutions fit into the world of gesture-based control and interaction:




In my opinion, this chart would make a great template for mapping out other natural interaction applications and products!


Here is the description of the concepts outlined in the chart:


"If all of them belong to the “gesture control” world, the best segmentation is made from 4 categories:
  • Finger tracking: precise finger tracking, it can be single touch or multi-touch (this latest not always being a plus). Finger tracking also encompasses hand tracking which comes, for LM3LABS products, as a gestures.
  • Body tracking: using one’s body as a pointing device. Body tracking can be associated to “passive” interactivity (users are engaged without their decision to be) or “active” interactivity like 3D Feel where “players” use their body to interact with content.
  • Face tracking: using user face as a pointing device. It can be mono user or multiple users. Face tracking is a “passive” interactivity tool for engaging user in an interactive relationship with digital content.
  • Image Tracking: Augmented Reality (AR) lets users use images (flyers, real products, t-shirts, faces,…) to interact with digital content. AR can be markerless or marker-based. Markerless technology has advantages but marker-based AR is easier for users to understand. (Please note here that Markerless AR is made in close collaboration with AR leader Total Immersion)."  -LM3LABS
If you are interested in this subject and want to view some good examples of off-the-desktop interfaces and interactions, take a look at the LM3LABS blog, as well as Nicolas Loeillot's Vimeo channel.  Also take a look at the sample of posts I've written about LM3LABS over the last few years - the links are at the end of this post.

I love LM3LABS' Interactive Balloon:

Interactive balloons from Nicolas Loeillot on Vimeo.


Interactive Balloons v lm3 labs v2 (SlideShare)



Background
I first discovered LM3LABS when I was taking a VR class and researching interactive, immersive large displays in 2005 or 2006.  Back then, there wasn't much information about this sort of technology.  A lot has changed since then!


I've learned quite a bit from watching LM3LABS (and others) grow, given my passion for postWIMP interactive technology and my commitment to blogging about this subject.   Nicolas has really worked hard in this arena.  As early as 2005, LM3LABS was working with Scala to provide "smart" interactive displays, and his company's applications have been supported by computer vision technologies for many years, allowing for gesture-based, or "touch-less" interaction, as demonstrated by the Catchyoo Interactive Table.  This application caught my eye back in early 2007, when I was working on projects for large interactive displays for my HCI and Ubicomp classes, and was thinking about creating a table-top application.


My hunch is that LM3LABS has set the foundation for further growth in the future, given the lessons they've learned by taking risks with postWIMP technologies over the past few years!


Previous Blog Posts Related to LM3LABS:
Interactive Retail Book (celebrating the history of Christian Dior from 1948-2010) (video)
Ubiq Motion Sensor Display at Future Ready Singapore (video)
Interactive Virtual DJ on a Transparent Pane, by LM3LABS and Brief Ad
LM3LABS' Catchyoo Interactive Koi Pond: Release of ubiq'window 2.6 Development Kit and Reader
A Few Things from LM3LABS
LM3LABS, Nicolas Loeillot, and Multi-touch
More from LM3LABS: Ubiq'window and Reactor.cmc's touch screen shopping catalog, Audi's touch-less showroom screen, and the DNP Museum Lab.


About LM3LABS
"Founded in 2003 by a team of passionate researchers, engineers, designers, and marketers from various international backgrounds, focused on fast transformation of innovation into unique products, LM3LABS is a recognized pioneer in computer vision-based interactivity solutions. Keeping a strong customer focus, LM3LABS' team of unique people pioneers new directions, explores new concepts, new technologies and new interactions.  Engaging, playful and magic, LM3LABS' products and solutions are always scalable and reliable"

info@lm3labs.com

Note to readers:
Over the past couple of years there has been an explosion of postWIMP technologies and applications, and with this pace, it has been difficult for me to keep abreast of it all. There is quite a bit I miss, given my full time job and daily life!

I welcome information about postWIMP interactive technologies and applications from my readers.  Due to time constraints, not interest, I am not always able to post about a topic as soon as I'd like.  That is OK, as my intention is not to be the first blogger to spread the latest tech news.  I like to dig in deep when I can and make connections between innovative, interesting technologies and the people and ideas behind them. 




Dec 6, 2010

UPDATE: Demo 2 of the Kinect Theremin, Therenect, by Martin Kaltenbrunner

I recently posted about the Therenect, a gesture-controlled digital theremin created for Microsoft's Kinect by Martin Kaltenbrunner - Therenect: Theremin for the Kinect! (via Martin Kaltenbrunner)  It looks like Martin has been busy polishing up the application over the past few days, as you can see from the video below:

Therenect - Kinect Theremin - 2nd Demo from Martin Kaltenbrunner on Vimeo.

RELATED
Virtual Theremin Made with Kinect; Real Thereminists Will Make it Useful
Peter Kirn, Create Digital Music, 11/30/10

Interactive Information Visualization for the Kinect? Something like Jer Thorp's "Just Landed - 36 Hours" might work nicely if revamped!

I follow the O'Reilly Radar blogs and came across a recent post about an information visualization created by blprnt two years ago using Processing. I think it would have great potential if it was re-purposed for use on the Kinect! In the article, Edd Dumbill discusses the advantages of using Processing to create data and information visualizations.  


One example of the power of Processing is an information visualization, "Just Landed - 36 Hours," created by Jer Thorp.  Jer gathered tweets from Twitter that included the phrase "just landed," along with location information for each tweet, within a 36-hour period, to create the visualization.


"Just Landed - 36 Hours" is a great 3D visualization of air travel on our planet.  I especially like the different views that the application provides. As soon as I watched the Just Landed video, I thought it would be great if it could be revamped for use on the Kinect!   (Leave a comment if you know of anyone working on a project in this area.)
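For the tech-curious, the tweet-harvesting step Jer describes is simple enough to sketch. Here's a toy JavaScript version of the phrase matching - the function and the regex are my illustration, not Jer's actual Processing code:

```javascript
// Pull the destination out of tweets like "just landed in Tokyo".
// Illustrative sketch only; the original project's matching may differ.
function extractDestination(tweet) {
  const m = /just (?:landed|arrived) in ([\w .'-]+)/i.exec(tweet);
  return m ? m[1].trim() : null;
}
```

Each hit would then be geocoded (the original used MetaCarta's Location Finder API) and paired with the user's home location scraped from their Twitter page, giving one arc of the visualization.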


Just Landed - 36 Hours from blprnt on Vimeo.


Information about the video from blprnt's Vimeo site:


"I was discussing H1N1 with a bioinformatics friend of mine last weekend, and we ended up talking about ways that epidemiologists model transmission of disease. I wondered how some of the information that is shared voluntarily on social networks might be used to build useful models of various kinds...I'm also interested in visualizing information that isn't implicitly shared - but instead is inferred or suggested...This piece looks for tweets containing the phrases 'just landed in...' or 'just arrived in...'. Locations from these tweets are located using MetaCarta's Location Finder API. The home location for the traveling users are scraped from their Twitter pages. The system then plots these voyages over time...I'm not entirely sure where this will end up going, but I am reasonably happy with the results so far.   Built with Processing (processing.org) You can read more about this project on my blog - blog.blprnt.com"


RELATED
Strata Gems:  Write your own visualizations:  The Processing language is an easy way to get started with graphics
Edd Dumbill, O'Reilly Radar, 12/3/10

Air Presenter Plus, for the Kinect, for Presentations, developed by Evoluce and So touch

As soon as the Kinect was released by Microsoft, there was a flurry of app development. Evoluce and So touch partnered to create a presentation application for the Kinect that could be used in work settings. Take a look!


Information about Air Presenter Plus, from So touch's YouTube channel:

"So touch, the leading creative software company for new digital technologies, in partnership with Evoluce, the leading provider of advanced multi-touch screen technologies, present: So touch Air Presenter for Kinect. The world's first presentation software optimized for Kinect.

Turn your corporate presentations, welcome areas, trade show booths and point of sales into mind boggling experiences, controlling your presentation with multi-touch gestures leveraging So touch Air Presenter gestures software and Evoluce Kinect Windows 7 software.

Integrate your usual PDF, Power point, JPG and video materials into So touch multi-touch minority report's style interface and control it with gestures in the air.

So touch Air Presenter is delivered with a very graphic player, featuring a multi-touch zoom mode and an integrated video player as well as a very easy to use content manager.

So touch Air Presenter content, sourced locally or from the network, can be played on multiple screens at the same time. So touch Air Presenter content manager can deliver customize or generic content to each player.

So touch Air Presenter packaged with Evoluce Kinect Windows 7 software will be released soon. So touch Air Presenter is already available for TUIO based gestures devices. To know more and download a free trial version, visit http://www.so-touch.com/air-presenter"




So touch
Evoluce

Dec 3, 2010

More gesture and multi-touch interaction! Windows 7 navigation with Kinect; product browser by Immersive Labs

Here are a couple of new natural user interface videos.  The first video, by Evoluce, demonstrates gesture interaction/navigation in Windows 7 applications supported by Kinect. The second video, by Immersive Labs, shows multi-touch product browsing interaction on a large display.

Kinect Treatment of Windows 7, by Evoluce

Evoluce: Leading Surface Technologies


Immersive Labs - Multi-touch Product Browser

Immersive Labs

Nov 30, 2010

TuioKinect, by Martin Kaltenbrunner: "A simple TUIO hand gesture tracker for Kinect"

More Kinect from Martin Kaltenbrunner:


Martin Kaltenbrunner's description of TuioKinect:
"TuioKinect tracks simple hand gestures using the Kinect controller and sends control data based on the TUIO protocol. This allows the rapid creation of gesture enabled applications with any platform or environment that supports TUIO tuio.org/​ You can download the application from: code.google.com/​p/​tuiokinect/​ Music: Jabon Jabon by El Club de los Astronautas (Institut Fatima)"


I've played around with TUIO and OpenFrameworks, but it has been a while.  I can't wait until I have time to dig into this with a Kinect. I think this has great potential for supporting learning and communication among students with special needs.
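For the tech-curious: because TUIO is just a small set of OSC messages, a client mostly maintains a table of live cursors keyed by session ID. Here's a toy sketch of that idea in JavaScript - the function names are mine, and see tuio.org for the real protocol spec:

```javascript
// A TUIO 2D cursor "set" message carries the argument list
// [sessionID, x, y, dx, dy, accel], with x/y normalized to 0..1.
// This is a sketch of what a TuioKinect client does with those
// arguments, not TuioKinect's actual code.
function decodeCursorSet(args) {
  const [sessionId, x, y, dx, dy, accel] = args;
  return { sessionId, x, y, dx, dy, accel };
}

// A client keeps a table of live cursors and updates it on every "set";
// this simple state model is what makes TUIO such an easy foundation
// for gesture-enabled applications.
function applySet(cursors, args) {
  const c = decodeCursorSet(args);
  cursors.set(c.sessionId, c);
  return cursors;
}
```

With TuioKinect on the sending side, each tracked hand simply shows up as one of these cursors.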

RELATED/SOMEWHAT RELATED
TuioKinect:  TUIO Hand Tracker for Kinect
Martin Kaltenbrunner, Tangible Interaction Frameworks 11/27/10
Therenect: Theremin for the Kinect! (via Martin Kaltenbrunner)
Xbox Kinect Interactive Puppet Prototype, from Theo Watson and Emily Gobeille, creators of Funky Forest
Hacked Kinect Multitouch using libFreenect and libTISCH (via Florian Echtler)

Nov 29, 2010

Tech Product Placement & Embedded Advertising: Cisco Telepresence, Surface, Kinect, Windows 7 Phone, Apple, Apple iAd - Videos, Links - plus legal & ethical concerns

I was watching a DVR'd episode of NCIS tonight with my husband and noticed that Cisco's TelePresence video conferencing system was a player in the story line, as was a Cisco Cius touch-screen tablet.  It seems that on TV, laptops, desktops, and old-fashioned cell phones are history.  "Emerging" technologies are woven into the story lines of more television episodes, including CSI, NCIS, Bones, Grey's Anatomy, and others.

The original intention of this post was to discuss the concept of emerging technologies and product placement/embedded advertising in television programs and movies, and to share a few interesting examples related to this topic.  I quickly realized that there is much more to this story. Why?  More people access video and interactive multimedia content when they are on the go, using laptops, smart phones, iPads, and similar tablets.  New televisions, such as Sony Internet TV, are internet-enabled, and many people already access web content on their televisions through devices such as game consoles or Apple TV.

It is a marketer's dream. 

Unfortunately, we might not have ways to "opt out" of all of the indirect (and direct) advertising that will come our way as we access video and related content across multiple platforms.   It won't be as easy as blocking pop-up ads or fast-forwarding the DVR!

Below are some examples of ways some emerging technologies are "placed" in television and film, grouped by company.  In the "Apple" section, I've included video of Steve Jobs introducing iAds. Near the end of this post, I've included links that relate to legal and ethical issues regarding product placement and embedded advertising.

Food for thought.   I'm still digesting what I've found!

Cisco
The following links about CISCO's product placement are from CISCO on TV and in the Movies:
Cisco TelePresence and Video Phone on NCIS (links to video clips)
Cisco TelePresence Conferencing on 30 Rock
Cisco Telepresence on CSI: NY


MICROSOFT 
Microsoft Surface on Grey's Anatomy


Kinect on Chuck


Kinect on Entourage


Kinect is a New Advertising Platform for Microsoft
David Erickson, e-StrategyBlog 11/22/10

WINDOWS 7 PHONE
Windows 7 Phone Product Placement on Bones


APPLE
iPad Gets Half Hour of Product Placement on Modern Family



Apple iAd: Mobile advertising that delivers interaction and emotion, 1 billion ad impressions a day, within your app. Apple's iAd isn't really product placement. It is about ads embedded in your mobile devices.
"Who wants to get yanked out of their ad?"-Steve Jobs




"iAd is a breakthrough mobile advertising platform from Apple. With it, apps can feature rich media ads that combine the emotion of TV with the interactivity of the web. For developers, it means a new, easy-to-implement source of revenue. For advertisers, it creates a new media outlet that offers consumers highly targeted information." -Apple


iAd for Brands       iAd for Developers

HP TouchSmart
Annalyn Censky, CNN Money, 5/28/10
New Black Eyed Peas Video...or is it an AD for HP?
Duncan Riley, The Inquisitr, 4/19/10

RELATED

Ben Shaw, BBH, 7/16/10

Engaget's ScreenGrabs Posts
BrandCameo-Films  
Brand Cameo-Brands
On this website, you can search for product placement by brand, by film, and by year.
3D Technology: The End of Product Placement As We Know It?
Dan Nosowitz, Fast Company 3/5/10
Discusses the technical difficulties of embedding products in 3D movies.


Legal and Ethical Issues
As I searched for more information about product placement and embedded advertising, I came across a few posts/websites suggesting that in some circles, this is a hot/controversial topic:

Paul A. Cicelski, Common Law Center 9/2/10
Protection of Children Prompts FCC Regulation of Internet and Wireless Video Programming and Enhanced State Privacy Rules
David Oxenford, Broadcast Law Blog, 8/26/09
Joseph Lewczak and Ann DiGiovanni, WLF Legal Backgrounder, 4/9/10
FIT Media FAQs (FIT= Fairness and Integrity in Telecommunications Media)
"FIT Media is a non-partisan coalition of health, media and child advocacy organizations and professionals supporting transparency and child protection in embedded TV advertising."
This is an interesting website - FIT Media covers topics such as "Advernews", "Embedded Propaganda", "Deceptive Advertising", and ways that embedded advertising might be harmful.
Week ahead:  FCC meeting, Do Not Track hearing
Cecilia Kang, Washington Post 11/29/10
Giselle Tsirulnik, Mobile Marketer, 7/22/10
What is Apple's New Privacy Policy? "Amidst all the glitz of releasing a new mobile operating system and iPhone, Apple quietly updated their privacy policy. Why?"
Michael Kassner, Tech Republic, 6/28/10
FYI: If you have an iPhone running iOS 4 and wish to opt out of iAd, you can do so at http://oo.apple.com

COMMENT
Xerox's new technology enables the alteration of content within a video or television program, based on specific information about the viewer/user:

Xerox Brings Behavioral Targeting To Television  (Interesting use of technology)
Go Rumors, 1/13/10
Xerox Patent Filing Make Product Placement Addressable
The Media Buyer, 1/12/10
"The patent describes the system (via GoRumors) as having the ability to alter content within a program based on the viewer. For example, if a character on a show mentions Macy’s, that content could be shown to general viewers. But that small portion of the broadcast could be “marked,” and the content could be changed so that the character instead says the name of sporting goods store Modell’s. That portion of the broadcast would be served to viewers who are into sports. Similarly, if the storefront was shown during the program, general audiences would see the Macy’s store, while sports fans would see the Modell’s store."

Nov 20, 2010

Xbox Kinect Interactive Puppet Prototype, from Theo Watson and Emily Gobeille (design.io) & Update on Funky Forest

I came across the following video in a post on the Creative Applications blog about Theo Watson and Emily Gobeille's recent work with OpenFrameworks, the Kinect, and an interactive puppet prototype:

Interactive Puppet Prototype with Xbox Kinect from Theo Watson on Vimeo.

I'm not surprised that Theo Watson and Emily Gobeille (design.io) decided to experiment with the Xbox Kinect, and I can't wait to see what they will create for the Kinect in the future, based on their previous work.  A couple of years ago I wrote a post about their enchanting interactive installation, "Funky Forest":  OpenFrameworks & Interactive Multimedia:  Funky Forest Installation for CinKid


You can see from the video of their Funky Forest installation (2007 CinKid) that they create engaging interactive applications:

Funky Forest - Interactive Ecosystem from Theo Watson on Vimeo.
(Information from Theo Watson's Vimeo site)
"Funky Forest is a wild and crazy ecosystem where children manage the resources to influence the environment around them. By using their bodies or pillow "rocks" and "logs", water flowing from the digital stream on the floor can be dammed and diverted to the forest to make different parts grow. If a tree does not receive enough water it withers away but by pressing their bodies into the forest children create new trees based on their shape and character. As children explore and play they discover that the environment is inhabited by a variety of sonic life forms and creatures who appear and disappear depending on the health of the forest. As the seasons change the creatures also go through a metamorphosis."
Credits:
Theodore Watson
Emily Gobeille
Project Page:
muonics.net/​site_docs/​work.php?id=41
zanyparade.com/​v8/​projects.php?id=12


Note:  A version of Funky Forest, Moomah Edition, is permanently installed in New York City at the Moomah Children's Cafe. This version includes interactive environments related to the four seasons of the year, including an interactive particle system with falling leaves and snow.


FOR THE TECH-CURIOUS
Libfreenect for OS X, by Theodore Watson

XBox Kinect running on OS X ( with source code ) from Theo Watson on Vimeo.
The following information and links were taken from the Vimeo description of the above video:  
"This is a port/adaptation of Hector Martin's libfreenect for OS X made by Theodore Watson.
Hector Martin's Libfreenect project page is here:  git.marcansoft.com/​?p=libfreenect.git
Most of the code is unchanged but there are some changes to libusb which was needed to get it running (and a few extra libusb commands) as well as some tweaking of the transfer sizes.
It should be self contained and you shouldn't need to install libusb (the app links it directly ).
Grab the Source Code:
UPDATE:
(Tested on OS X 10.6.3 - 32bit now and with fixes)
theo.tw/​deliver/​kinect/​001-libfreenect-modded-osx-updated.zip
ofxKinect for OF users - thanks Dan!
openframeworks.cc/​forum/​viewtopic.php?p=24948#p24948
Tips:
- Try both usb ports.
- Try not to have too many other devices plugged in (or any)."


(Check the Vimeo website to see if there are updates)


OpenFrameworks Forum
http://www.openframeworks.cc/forum/


RELATED
Presentation about Funky Forest  (ThisHappened)

Emily Gobeille & Theo Watson talk about Funky Forest from This happened – Utrecht on Vimeo.


Cross-posted on the TechPsych blog.

Nov 13, 2010

HACKED KINECT MULTITOUCH using libFreenect and libTISCH (via Florian Echtler)

MULTI-TOUCH WITH HACKED KINECT
Here is NUI Group member Florian Echtler's proof-of-concept HD video of using a hacked Kinect camera for multitouch-like interaction.  The application was built on Ubuntu Linux using marcan42's libfreenect and Florian's own creation, libTISCH.



Florian decided to use picture-browsing interaction to demonstrate the proof of concept, so "everybody can focus on more interesting things :-)"
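For the tech-curious: the core trick behind depth-camera "multitouch" is depth thresholding - the Kinect knows how far away the flat surface is, so any pixel sitting just in front of that plane can be treated as a fingertip touching it. Here's a toy sketch of that idea (the numbers and names are mine, not from libfreenect or libTISCH):

```javascript
// depth: array of per-pixel depth values (e.g. millimeters).
// surfaceDepth: the measured depth of the flat surface.
// A pixel "touches" if it sits within a small band just above the
// surface plane. Toy sketch with made-up values, not Florian's code.
function touchMask(depth, surfaceDepth, band = 30) {
  return depth.map((d) => d < surfaceDepth && d > surfaceDepth - band);
}
```

A real implementation would then cluster the masked pixels into blobs and track them frame to frame, which is roughly the job libTISCH's lower layers take on.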


(I have SO many ideas for this!  I'll throw a few out there in an upcoming post....maybe someone can run with them!)


RELATED
Hacked Kinect taught to work as multitouch interface
Paul Miller, engadget, 11/11/10


FOR THE TECH-CURIOUS:
TISCH stands for Tangible Interactive Surfaces for Collaboration between Humans, and is a cross-platform, cross-device multi-touch development framework.  You can download the source package for Windows, Mac OS X, and Linux from the TISCH Sourceforge website. The Ubuntu Lucid/Karmic version has "superquick installation via PPA" - the instructions can be found on the TISCH Sourceforge website.


LibFreenect- Open Source PC Drivers for Kinect
Xan Tium, XBLOG 360 11/10/10

Marcan is Hector Martin Cantero, the author of the Abort, Retry, Hack? blog.

For your convenience, I've reposted something I wrote about libTISCH back in 2009:

For techies (and the tech-curious) who like technologies that support collaboration and multi-touch interaction,  this is great news!

Florian Echtler announced the first stable release of libTISCH, a multi-touch development framework, which can be found on Sourceforge.  TISCH stands for Tangible Interactive Surfaces for Collaboration between Humans.  libTISCH, a C++ software framework, is included in this project.  It provides a means for creating GUIs based on multi-touch and/or tangible input devices.

Here is how it works:

Architecture Layers

Here is information from the libTISCH announcement:

Highlights of this release are, among others, the following features:

- ready-to-use multitouch widgets based on OpenGL
- reconfigurable, hardware-independent gesture recognition engine
- support for widely used, pre-defined gestures (move, scale, rotate..) as well as custom-defined gestures

- hardware drivers for FTIR, DI, Wiimote, DiamondTouch..
- TUIO converters: source and sink

- cross-platform: Linux, MacOS X, Windows (32 and 64 bit)
- cross-language: C++ with bindings for C#, Java, Python

libTISCH has a lot to offer for the multitouch developer. For example, 
the textured widgets enable rapid development of applications for many
kinds of multi-touch or tangible interfaces. The separate gesture
recognition engine allows the translation of a wide range of highly
configurable gestures into pre-defined or custom events which are then
acted on by the widgets. While the lower layers of libTISCH provide
functionality similar to tbeta, touche etc. (you can interface existing
TUIO-based software with libTISCH in both directions), it goes far
beyond.
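To make the gesture-engine idea concrete, here's a toy sketch of how raw contact positions might be translated into a pre-defined "scale" event that a widget could then act on. The names and structure here are mine for illustration, not libTISCH's actual C++ API:

```javascript
// Distance between two contact points.
function dist(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

// Turn two tracked contacts, observed at two moments in time, into a
// pre-defined "scale" gesture event: moving apart means factor > 1,
// pinching together means factor < 1. A widget receiving this event
// would simply resize itself by the factor.
function recognizeScale(prev, curr) {
  // prev/curr: [{x, y}, {x, y}] -- the same two contacts over time.
  const factor = dist(curr[0], curr[1]) / dist(prev[0], prev[1]);
  return { type: 'scale', factor };
}
```

The appeal of the libTISCH design is exactly this separation: the recognition engine emits events like the one above, and the widgets never need to see raw input.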

More information about the library and underlying architecture can be found on http://tisch.sf.net/ and in the Sourceforge wiki at
http://sourceforge.net/apps/mediawiki/tisch/


Note:
Dr. Florian Echtler is on the scientific staff at the Technische Universität München in Germany. Be sure to check out his webpage.

I especially like the concept of the MeTaTop, "A Multi-Sensory Table Top System for Medical Procedures," which is linked from Florian's website.

