I recently posted about the Therenect, a gesture-controlled digital theremin for Microsoft's Kinect, created by Martin Kaltenbrunner - Therenect: Theremin for the Kinect! (via Martin Kaltenbrunner) It looks like Martin has been busy polishing the application over the past few days, as you can see from the video below:
Therenect - Kinect Theremin - 2nd Demo from Martin Kaltenbrunner on Vimeo.
RELATED
Virtual Theremin Made with Kinect; Real Thereminists Will Make it Useful
Peter Kirn, Create Digital Music, 11/30/10
Focused on interactive multimedia and emerging technologies to enhance the lives of people as they collaborate, create, learn, work, and play.
Dec 6, 2010
Interactive Information Visualization for the Kinect? Something like Jer Thorp's "Just Landed - 36 Hours" might work nicely if revamped!
I follow the O'Reilly Radar blogs and came across a recent post about an information visualization created by blprnt two years ago using Processing. I think it would have great potential if it were repurposed for use on the Kinect! In the article, Edd Dumbill discusses the advantages of using Processing to create data and information visualizations.
One example of the power of Processing is the information visualization "Just Landed - 36 Hours," created by Jer Thorp. Jer gathered tweets from Twitter that included the statement "just landed," along with location information for each tweet, within a 36-hour period, to create the visualization.
"Just Landed - 36 Hours" is a great 3D visualization of air travel on our planet. I especially like the different views that the application provides. As soon as I watched the Just Landed video, I thought it would be great if it could be revamped for use on the Kinect! (Leave a comment if you know of anyone working on a project in this area.)
Just Landed - 36 Hours from blprnt on Vimeo.
Information about the video from blprnt's Vimeo site:
"I was discussing H1N1 with a bioinformatics friend of mine last weekend, and we ended up talking about ways that epidemiologists model transmission of disease. I wondered how some of the information that is shared voluntarily on social networks might be used to build useful models of various kinds...I'm also interested in visualizing information that isn't implicitly shared - but instead is inferred or suggested...This piece looks for tweets containing the phrases 'just landed in...' or 'just arrived in...'. Locations from these tweets are located using MetaCarta's Location Finder API. The home location for the traveling users are scraped from their Twitter pages. The system then plots these voyages over time...I'm not entirely sure where this will end up going, but I am reasonably happy with the results so far. Built with Processing (processing.org) You can read more about this project on my blog - blog.blprnt.com"
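To get a feel for the first step of Jer's pipeline, here is a small sketch of the tweet-mining stage: scan tweet text for the trigger phrases and pull out the destination. This is my own illustration, not Jer Thorp's code, and it skips the geocoding step (the original used MetaCarta's Location Finder API):

```python
import re

# Match "just landed in ..." or "just arrived in ..." and capture the
# destination phrase up to the next punctuation mark. An illustrative
# pattern only; the real project's matching logic is not published here.
PATTERN = re.compile(r"just (?:landed|arrived) in ([^,.!?]+)", re.IGNORECASE)

def extract_destination(tweet_text):
    """Return the destination phrase from a tweet, or None if absent."""
    match = PATTERN.search(tweet_text)
    return match.group(1).strip() if match else None

tweets = [
    "Just landed in Vancouver, time for coffee",
    "just arrived in Reykjavik. What a flight!",
    "Watching the game tonight",
]
destinations = [d for d in (extract_destination(t) for t in tweets) if d]
print(destinations)  # ['Vancouver', 'Reykjavik']
```

In the full piece, each destination would then be geocoded and paired with the tweeter's home location to draw one voyage arc.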
RELATED
Strata Gems: Write your own visualizations: The Processing language is an easy way to get started with graphics
Edd Dumbill, O'Reilly Radar, 12/3/10
Air Presenter Plus, for the Kinect, for Presentations, developed by Evoluce and So touch
As soon as the Kinect was released by Microsoft, there was a flurry of app development. Evoluce and So touch partnered to create a presentation application for the Kinect that could be used in work settings. Take a look!
Information about Air Presenter Plus, from So touch's YouTube channel:
"So touch, the leading creative software company for new digital technologies, in partnership with Evoluce, the leading provider of advanced multi-touch screen technologies, present: So touch Air Presenter for Kinect. The world's first presentation software optimized for Kinect.
Turn your corporate presentations, welcome areas, trade show booths and point of sales into mind boggling experiences, controlling your presentation with multi-touch gestures leveraging So touch Air Presenter gestures software and Evoluce Kinect Windows 7 software.
Integrate your usual PDF, Power point, JPG and video materials into So touch multi-touch minority report's style interface and control it with gestures in the air.
So touch Air Presenter is delivered with a very graphic player, featuring a multi-touch zoom mode and an integrated video player as well as a very easy to use content manager.
So touch Air Presenter content, sourced locally or from the network, can be played on multiple screens at the same time. So touch Air Presenter content manager can deliver customize or generic content to each player.
So touch Air Presenter packaged with Evoluce Kinect Windows 7 software will be released soon. So touch Air Presenter is already available for TUIO based gestures devices. To know more and download a free trial version, visit http://www.so-touch.com/air-presenter"
So touch
Evoluce
Posted by
Lynn Marentette
Dec 5, 2010
Video: DaVinci Surface Physics Illustrator Interface on Xbox Kinect, with gesture interaction, by Razorfish
DaVinci prototype on Xbox Kinect from Razorfish - Emerging Experiences on Vimeo.
RELATED
Razorfish ports DaVinci interface to Kinect, makes physics cool (video)
Tim Stevens, Engadget, 12/5/10
Razorfish
(I love this website.)
Posted by
Lynn Marentette
Labels:
daVinci,
gesture interaction,
interactive surface,
interface,
kinect,
NUI,
razorfish,
Xbox
Dec 4, 2010
Top 10 Interactive Multimedia Technology Blogposts, November, 2010
Here are the "top ten" posts for this blog, according to the number of visitors during the month of November, 2010:
Hacked Kinect Multitouch using libFreenect and libTISCH (via Florian Echtler)
11/13/2010
Power to the Pixel Cross-Media Forum Streaming Live from London Today!
10/12/2010
Revised Post 8/1/06: Interactive multimedia for social skills, understanding feelings, relaxation, coping strategies, etc.
7/12/06
Interactive Touch Screen Technology, Participatory Design, and "Getting It": REVISED
11/11/2010
Microsoft Surface Light and Physics App for Kids at the Smithsonian
11/24/2010
Interactive iPad Apps for Kids with Autism: Could some of these be transformed for multi-touch tabletop activities?
11/06/10
New Version of Surface from Microsoft?
11/10/10
Tech Product Placement & Embedded Advertising: Cisco Telepresence, Surface, Kinect, Windows 7 Phone, Apple, Apple iAd - Videos, Links - plus legal & ethical concerns
11/29/10
Digital Newspaper from News Corp, for the iPad (via physog, Guardian)
10/18/10
Online Physics Games for Interactive Whiteboards and Touch Screens (including mobile devices)
10/18/10
Posted by
Lynn Marentette
Dec 3, 2010
More gesture and multi-touch interaction! Windows 7 navigation with Kinect; product browser by Immersive Labs
Here are a couple of new natural user interface videos. The first video, by Evoluce, demonstrates gesture interaction/navigation in Windows 7 applications supported by Kinect. The second video, by Immersive Labs, shows multi-touch product browsing interaction on a large display.
Kinect Treatment of Windows 7, by Evoluce
Evoluce: Leading Surface Technologies
Immersive Labs - Multi-touch Product Browser
Immersive Labs
Posted by
Lynn Marentette
Labels:
evoluce,
gesture,
immersive labs,
kinect,
multi-touch,
NUI,
product browser,
touch,
Windows 7
Nov 30, 2010
TuioKinect, by Martin Kaltenbrunner: "A simple TUIO hand gesture tracker for Kinect"
More Kinect from Martin Kaltenbrunner:
Martin Kaltenbrunner's description of TuioKinect:
"TuioKinect tracks simple hand gestures using the Kinect controller and sends control data based on the TUIO protocol. This allows the rapid creation of gesture enabled applications with any platform or environment that supports TUIO tuio.org/ You can download the application from: code.google.com/p/tuiokinect/ Music: Jabon Jabon by El Club de los Astronautas (Institut Fatima)"
I've played around with TUIO and openFrameworks, but it has been a while. I can't wait until I have time to dig into this with a Kinect. I think this has great potential for supporting learning and communication among students with special needs.
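TUIO data travels as OSC messages (over UDP port 3333 by default), and each `/tuio/2Dcur` "set" message carries a session id plus normalized x/y coordinates. As a rough sketch of what a TUIO-enabled application consumes (my own illustration, not code from TuioKinect; an OSC library such as python-osc would handle the actual network transport), here is a minimal decoder for such messages:

```python
def parse_tuio_cursor(args):
    """Decode the argument list of a /tuio/2Dcur OSC message.

    TUIO "set" messages look like:
        ("set", session_id, x, y, x_velocity, y_velocity, acceleration)
    with x and y normalized to the 0.0-1.0 range. Returns (session_id, x, y),
    or None for the "alive" and "fseq" bookkeeping messages.
    """
    if len(args) >= 4 and args[0] == "set":
        return args[1], args[2], args[3]
    return None

# Simulated messages, in the shape an OSC library would hand to a callback:
messages = [
    ("alive", 12),
    ("set", 12, 0.25, 0.75, 0.0, 0.0, 0.0),
    ("fseq", 101),
]
cursors = [c for c in (parse_tuio_cursor(m) for m in messages) if c]
print(cursors)  # one tracked hand at (0.25, 0.75)
```

Anything that can decode these messages, in any language, can react to the hand positions TuioKinect broadcasts, which is exactly why TUIO makes rapid prototyping so easy.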
RELATED/SOMEWHAT RELATED
TuioKinect: TUIO Hand tracker for Kinect
Martin Kaltenbrunner, Tangible Interaction Frameworks, 11/27/10
Therenect: Theremin for the Kinect! (via Martin Kaltenbrunner)
Xbox Kinect Interactive Puppet Prototype, from Theo Watson and Emily Gobeille, creators of Funky Forest
Hacked Kinect Multitouch using libFreenect and libTISCH (via Florian Echtler)
Posted by
Lynn Marentette
Therenect: Theremin for the Kinect! (via Martin Kaltenbrenner)
Yet another reason why I need to get a Kinect!
Martin Kaltenbrunner's video demonstrates how the Kinect can be transformed into a virtual Theremin.
Therenect - Kinect Theremin from Martin Kaltenbrunner on Vimeo.
Here's Martin's description of the Therenect:
"The Therenect is a virtual Theremin for the Kinect controller. It defines two virtual antenna points, which allow controlling the pitch and volume of a simple oscillator. The distance to these points can be controlled by freely moving the hand in three dimensions or by reshaping the hand, which allows gestures that are quite similar to playing an actual Theremin."
"This musical instrument has been developed by Martin Kaltenbrunner at the Interface Culture Lab at the University of Art and Industrial Design in Linz, Austria. The software has been developed using the Open Frameworks and OpenKinect libraries."
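To make the mapping Martin describes concrete, here is a small sketch of the idea: two fixed virtual antenna points, with the hand's distance to one controlling pitch and to the other controlling volume. The point positions and frequency range below are illustrative guesses, not values from Kaltenbrunner's implementation:

```python
import math

# Two hypothetical "antenna" points in the Kinect's coordinate frame (metres).
PITCH_ANTENNA = (0.8, 0.5, 1.0)
VOLUME_ANTENNA = (-0.8, 0.5, 1.0)

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def hand_to_control(hand):
    """Map a 3D hand position to (frequency_hz, volume)."""
    d_pitch = distance(hand, PITCH_ANTENNA)
    d_vol = distance(hand, VOLUME_ANTENNA)
    # Like a real theremin: pitch rises as the hand nears the pitch antenna.
    frequency = 220.0 + 880.0 / (1.0 + d_pitch)
    # Volume fades out as the hand nears the volume antenna; clamp to [0, 1].
    volume = max(0.0, min(1.0, d_vol))
    return frequency, volume
```

The returned frequency and volume would then drive the oscillator; the Kinect's depth image supplies the hand position on every frame.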
Posted by
Lynn Marentette
Nov 29, 2010
Tech Product Placement & Embedded Advertising: Cisco Telepresence, Surface, Kinect, Windows 7 Phone, Apple, Apple iAd - Videos, Links - plus legal & ethical concerns
I was watching a DVR'd episode of NCIS tonight with my husband and noticed that a Cisco TelePresence video conferencing system was a player in the story line, along with a Cisco Cius touch-screen tablet. It seems that on TV, laptops, desktops, and old-fashioned cell phones are history. "Emerging" technologies are woven into the story lines of more and more television episodes, including CSI, NCIS, Bones, Grey's Anatomy, and others.
The original intention of this post was to discuss the concept of emerging technologies and product placement/embedded advertising in television programs and movies, and to share a few interesting examples related to this topic. I quickly realized that there is much more to this story. Why? More people access video and interactive multimedia content on the go, using laptops, smart phones, iPads, and similar tablets. New televisions, such as Sony Internet TV, are internet-enabled, and many people already access web content on their televisions through devices such as game consoles or Apple TV.
It is a marketer's dream.
Unfortunately, we might not have ways to "opt out" of all of the indirect (and direct) advertising that will come our way as we access video and related content across multiple platforms. It won't be as easy as blocking pop-up ads or fast-forwarding the DVR!
Below are some examples of the ways some emerging technologies are "placed" in television and film, grouped by company. In the "Apple" section, I've included video of Steve Jobs introducing iAds. Near the end of this post, I've included links that relate to legal and ethical issues regarding product placement and embedded advertising.
Food for thought. I'm still digesting what I've found!
Cisco
The following links about Cisco's product placement are from Cisco on TV and in the Movies:
Cisco TelePresence and Video Phone on NCIS (links to video clips)
Cisco TelePresence Conferencing on 30 Rock
Cisco Telepresence on CSI: NY
MICROSOFT
Microsoft Surface on Grey's Anatomy
Kinect on Chuck
Kinect on Entourage
Kinect is a New Advertising Platform for Microsoft
David Erickson, e-StrategyBlog 11/22/10
WINDOWS 7 PHONE
Windows 7 Phone Product Placement on Bones
APPLE
iPad Gets Half Hour of Product Placement on Modern Family
Apple iAd: mobile advertising that delivers interaction and emotion, 1 billion ad impressions a day, within your app. Apple's iAd isn't really product placement; it is about ads embedded in your mobile devices.
"Who wants to get yanked out of their ad?"-Steve Jobs
"iAd is a breakthrough mobile advertising platform from Apple. With it, apps can feature rich media ads that combine the emotion of TV with the interactivity of the web. For developers, it means a new, easy-to-implement source of revenue. For advertisers, it creates a new media outlet that offers consumers highly targeted information." -Apple
iAd for Brands iAd for Developers
HP TouchSmart
Annalyn Censky, CNN Money, 5/28/10
New Black Eyed Peas Video...or is it an AD for HP?
Duncan Riley, The Inquisitr, 4/19/10
RELATED
Ben Shaw, BBH, 7/16/10
Engaget's ScreenGrabs Posts
BrandCameo-Films
Brand Cameo-Brands
On this website, you can search for product placement by brand, by film, and by year.
3D Technology: The End of Product Placement As We Know It?
Dan Nosowitz, Fast Company 3/5/10
Discusses the technical difficulties of embedding products in 3D movies.
Legal and Ethical Issues
As I searched for more information about product placement and embedded advertising, I came across a few posts/websites suggesting that in some circles, this is a hot, controversial topic:
Paul A. Cicelski, Common Law Center 9/2/10
Protection of Children Prompts FCC Regulation of Internet and Wireless Video Programming and Enhanced State Privacy Rules
David Oxenford, Broadcast Law Blog, 8/26/09
Joseph Lewczak and Ann DiGiovanni, WLF Legal Backgrounder, 4/9/10
FIT Media FAQs (FIT= Fairness and Integrity in Telecommunications Media)
"FIT Media is a non-partisan coalition of health, media and child advocacy organizations and professionals supporting transparency and child protection in embedded TV advertising."
This is an interesting website - FIT Media covers topics such as "Advernews", "Embedded Propaganda", "Deceptive Advertising", and ways that embedded advertising might be harmful.
Week ahead: FCC meeting, Do Not Track hearing
Cecilia Kang, Washington Post 11/29/10
Giselle Tsirulnik, Mobile Marketer, 7/22/10
What is Apple's New Privacy Policy? "Amidst all the glitz of releasing a new mobile operating system and iPhone, Apple quietly updated their privacy policy. Why?"
Michael Kassner, Tech Republic, 6/28/10
FYI: If you have an iPhone running iOS 4 and wish to opt-out of iAD, you can do so at http://oo.apple.com
COMMENT
Xerox's new technology enables the alteration of content within a video or television program, based on specific information about the viewer/user:
Xerox Brings Behavioral Targeting To Television (Interesting use of technology)
Go Rumors, 1/13/10
Xerox Patent Filing Makes Product Placement Addressable
The Media Buyer, 1/12/10
"The patent describes the system (via GoRumors) as having the ability to alter content within a program based on the viewer. For example, if a character on a show mentions Macy’s, that content could be shown to general viewers. But that small portion of the broadcast could be “marked,” and the content could be changed so that the character instead says the name of sporting goods store Modell’s. That portion of the broadcast would be served to viewers who are into sports. Similarly, if the storefront was shown during the program, general audiences would see the Macy’s store, while sports fans would see the Modell’s store."
Posted by
Lynn Marentette
Labels:
Apple,
behavioral targeting,
cisco,
embedded advertising,
FCC,
FIT Media,
iAd,
kinect,
NCIS,
privacy,
product placement,
security,
Xerox
No comments:
Nov 20, 2010
Xbox Kinect Interactive Puppet Prototype, from Theo Watson and Emily Gobeille (design.io) & Update on Funky Forest
I came across the following video in a post on the Creative Applications blog about Theo Watson and Emily Gobeille's recent work with openFrameworks, the Kinect, and an interactive puppet prototype:
Interactive Puppet Prototype with Xbox Kinect from Theo Watson on Vimeo.
I'm not surprised that Theo Watson and Emily Gobeille (design.io) decided to experiment with the Xbox Kinect. I can't wait to see what they will create for the Kinect in the future, based on their previous work. A couple of years ago I wrote a post about their enchanting interactive installation, "Funky Forest": OpenFrameworks & Interactive Multimedia: Funky Forest Installation for CinKid.
You can see from the video of their Funky Forest installation (2007 CinKid) that they create engaging interactive applications:
Funky Forest - Interactive Ecosystem from Theo Watson on Vimeo.
(Information from Theo Watson's Vimeo site)
"Funky Forest is a wild and crazy ecosystem where children manage the resources to influence the environment around them. By using their bodies or pillow "rocks" and "logs", water flowing from the digital stream on the floor can be dammed and diverted to the forest to make different parts grow. If a tree does not receive enough water it withers away but by pressing their bodies into the forest children create new trees based on their shape and character. As children explore and play they discover that the environment is inhabited by a variety of sonic life forms and creatures who appear and disappear depending on the health of the forest. As the seasons change the creatures also go through a metamorphosis."
Credits:
Theodore Watson
Emily Gobeille
Project Page:
muonics.net/site_docs/work.php?id=41
zanyparade.com/v8/projects.php?id=12
Note: A version of Funky Forest, Moomah Edition, is permanently installed in New York City at the Moomah Children's Cafe. This version includes interactive environments related to the four seasons of the year, including an interactive particle system with falling leaves and snow.
FOR THE TECH-CURIOUS
Libfreenect for OS X, by Theodore Watson
XBox Kinect running on OS X ( with source code ) from Theo Watson on Vimeo.
The following information and links were taken from the Vimeo description of the above video:
"This is a port/adaptation of Hector Martin's libfreenect for OS X made by Theodore Watson.
Hector Martin's Libfreenect project page is here: git.marcansoft.com/?p=libfreenect.git
Most of the code is unchanged but there are some changes to libusb which was needed to get it running (and a few extra libusb commands) as well as some tweaking of the transfer sizes.
It should be self contained and you shouldn't need to install libusb (the app links it directly ).
Grab the Source Code:
UPDATE:
(Tested on OS X 10.6.3 - 32bit now and with fixes)
theo.tw/deliver/kinect/001-libfreenect-modded-osx-updated.zip
ofxKinect for OF users - thanks Dan!
openframeworks.cc/forum/viewtopic.php?p=24948#p24948
Tips:
- Try both usb ports.
- Try not to have too many other devices plugged in (or any)."
(Check the Vimeo website to see if there are updates)
OpenFrameworks Forum
http://www.openframeworks.cc/forum/
RELATED
Presentation about Funky Forest (ThisHappened)
Emily Gobeille & Theo Watson talk about Funky Forest from This happened – Utrecht on Vimeo.
Cross-posted on the TechPsych blog.
Posted by
Lynn Marentette
Nov 13, 2010
HACKED KINECT MULTITOUCH using libFreenect and libTISCH (via Florian Echtler)
MULTI-TOUCH WITH HACKED KINECT
Here is NUI Group member Florian Echtler's proof-of-concept HD video of using a hacked Kinect camera for multitouch-like interaction. The application runs on Ubuntu Linux and was written using libfreenect, by marcan42, and Florian's own creation, libTISCH.
Florian decided to use picture-browsing interaction to demonstrate the proof of concept, so "everybody can focus on more interesting things :-)"
(I have SO many ideas for this! I'll throw a few out there in an upcoming post....maybe someone can run with them!)
RELATED
Hacked Kinect taught to work as multitouch interface
Paul Miller, engadget, 11/11/10
FOR THE TECH-CURIOUS:
TISCH stands for Tangible Interactive Surfaces for Collaboration between Humans, and is a cross-platform, cross-device multi-touch development framework. You can download the source package for Windows, Mac OS X, and Linux from the TISCH Sourceforge website. The Ubuntu Lucid/Karmic version has "superquick installation via PPA"; the instructions can be found on the TISCH Sourceforge website.
LibFreenect- Open Source PC Drivers for Kinect
Xan Tium, XBLOG 360 11/10/10
Marcan is Hector Martin Cantero, the author of the Abort, Retry, Hack? blog.
For your convenience, I've reposted something I wrote about libTISCH back in 2009:

For techies (and the tech-curious) who like technologies that support collaboration and multi-touch interaction, this is great news!
Florian Echtler announced the first stable release of libTISCH, a multi-touch development framework, which can be found on Sourceforge. TISCH stands for Tangible Interactive Surfaces for Collaboration between Humans. libTISCH, a C++ software framework, is included in this project. It provides a means for creating GUIs based on multi-touch and/or tangible input devices.
Here is how it works:

Here is information from libTISCH announcement:
Highlights of this release are, among others, the following features:
- ready-to-use multitouch widgets based on OpenGL
- reconfigurable, hardware-independent gesture recognition engine
- support for widely used, pre-defined gestures (move, scale, rotate..) as well as custom-defined gestures
- hardware drivers for FTIR, DI, Wiimote, DiamondTouch..
- TUIO converters: source and sink
- cross-platform: Linux, MacOS X, Windows (32 and 64 bit)
- cross-language: C++ with bindings for C#, Java, Python
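The "move, scale, rotate" gestures listed above reduce to simple geometry on pairs of touch points. As a rough illustration of the idea (a Python sketch, not libTISCH's actual C++ implementation), here is how translation, scale, and rotation can be derived from two fingers sampled at two moments in time:

```python
import math

def two_finger_gesture(p1_old, p2_old, p1_new, p2_new):
    """Derive translation, scale, and rotation from two touch points
    sampled at two moments in time. Points are (x, y) tuples."""
    # Translation: movement of the midpoint between the two fingers.
    mid_old = ((p1_old[0] + p2_old[0]) / 2, (p1_old[1] + p2_old[1]) / 2)
    mid_new = ((p1_new[0] + p2_new[0]) / 2, (p1_new[1] + p2_new[1]) / 2)
    translation = (mid_new[0] - mid_old[0], mid_new[1] - mid_old[1])

    # Scale: ratio of the distances between the fingers.
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])
    scale = dist(p1_new, p2_new) / dist(p1_old, p2_old)

    # Rotation: change in the angle of the line joining the fingers.
    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])
    rotation = angle(p1_new, p2_new) - angle(p1_old, p2_old)
    return translation, scale, rotation
```

For example, two fingers at (0,0) and (2,0) spreading apart to (-1,0) and (3,0) yield no translation or rotation and a scale factor of 2. A real gesture engine like libTISCH's also handles more than two contacts and maps these measurements onto configurable events.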
libTISCH has a lot to offer for the multitouch developer. For example,
the textured widgets enable rapid development of applications for many
kinds of multi-touch or tangible interfaces. The separate gesture
recognition engine allows the translation of a wide range of highly
configurable gestures into pre-defined or custom events which are then
acted on by the widgets. While the lower layers of libTISCH provide
functionality similar to tbeta, touche etc. (you can interface existing
TUIO-based software with libTISCH in both directions), it goes far
beyond.
More information about the library and the underlying architecture can be found at http://tisch.sf.net/ and in the Sourceforge wiki at http://sourceforge.net/apps/mediawiki/tisch/
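TUIO, mentioned in the converter bullet above, is an OSC-based protocol in which a tracker broadcasts "alive", "set", and "fseq" messages each frame. As a minimal sketch of the receiving ("sink") side, here is a hypothetical Python tracker that operates on already-decoded /tuio/2Dcur messages; the OSC transport and decoding are assumed to happen elsewhere:

```python
class TuioCursorTracker:
    """Minimal tracker for TUIO 1.1 /tuio/2Dcur messages.
    Each message is a list whose first element is the command
    ("alive", "set", or "fseq"), followed by its arguments."""

    def __init__(self):
        self.cursors = {}  # session_id -> (x, y) in normalized [0..1] coordinates

    def handle(self, message):
        cmd, *args = message
        if cmd == "set":
            # set <session_id> <x> <y> <x_vel> <y_vel> <accel>
            session_id, x, y = args[0], args[1], args[2]
            self.cursors[session_id] = (x, y)
        elif cmd == "alive":
            # "alive" lists the session ids still present; drop the rest.
            alive = set(args)
            self.cursors = {s: p for s, p in self.cursors.items() if s in alive}
        elif cmd == "fseq":
            # Frame sequence number; a real sink would use it to discard
            # out-of-order or stale frames.
            pass
```

A "source" converter does the reverse: it turns a framework's internal contact events back into set/alive/fseq messages, which is what lets libTISCH talk to existing TUIO software in both directions.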
Note:
Dr. Florian Echtler is on the scientific staff at the Technische Universität München in Germany. Be sure to check out his webpage.
I especially like the concept of the MeTaTop: "A Multi-Sensory Table Top System for Medical Procedures" that is linked from Florian's website.
Posted by
Lynn Marentette
Labels:
florian echtler,
kinect,
libfreenect,
libtisch,
mac,
mac os x,
MacOsX,
microsoft,
microsoft kinect,
MicrosoftKinect,
multitouch,
NUI hack,
port,
theo watson,
TheoWatson,
tisch,
UI,
video
Jun 22, 2010
Kinect Sensor for Xbox 360 Offers Full-Body and Gesture Interaction: No controllers or remotes!
Project Natal was the code name for the Kinect Sensor for Xbox 360. For $149.99 you can pre-order your very own system from the Microsoft Store, which will allow you to interact with video games with your body alone. No need for controllers or 'motes!
Presentation about the fitness benefits of the Kinect Sensor for Xbox 360:
This video is a preview of a dance game for the Xbox using the Kinect Sensor:
It would be great if I could do my Zumba moves with Kinect Sensor system and a great Xbox application!
Here's another video that explains the system in more detail, with brief interviews of innovators from Microsoft:
Here is a copy of my previous post about Project Natal:
How It Works: Microsoft's Project Natal for the Xbox 360 video from Scientific American
Microsoft gathered a wealth of biometric data to recognize the range of human movement in order to develop an algorithm for the next generation of controller-less gaming. "Natal will consist of a depth sensor that uses infrared signals to create a digital 3-D model of a player's body as it moves, a video camera that can pick up fine details such as facial expressions, and a microphone that can identify and locate individual voices."
The technology behind Natal has the potential for a range of uses beyond gaming.
Scientific American article:
Binary Body Double: Microsoft Reveals the Science Behind Project Natal for Xbox 360
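For the tech-curious: the depth sensor described above delivers, per pixel, an 11-bit raw value rather than a physical distance. A conversion formula that circulated in the early libfreenect community approximates the distance in meters; here is a Python sketch (treat the constants as a community calibration, not an official Microsoft conversion):

```python
def raw_depth_to_meters(raw):
    """Convert an 11-bit Kinect raw depth value to an approximate
    distance in meters, using a calibration formula that circulated
    in the early libfreenect community. Returns None when the sensor
    reports no reading for the pixel."""
    if raw >= 2047:  # 2047 marks "no reading" in the 11-bit depth stream
        return None
    return 1.0 / (raw * -0.0030711016 + 3.3309495161)
```

With this approximation, a raw value around 1000 comes out to roughly 3.8 meters; applying it across the whole 640x480 depth image is what yields the digital 3-D model of the player's body that the Scientific American piece describes.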
Posted by
Lynn Marentette