Showing posts sorted by relevance for query: gesture

Dec 13, 2011

Kinect in Education! (kinectEDucation)

Although I'm currently exploring the world of interactive HTML5, interactive video, and related technologies, I think I just might make "kinecteducation" the focus of my tech hobbies. I have some experience with game programming - one of my computer courses required a project using XNA - and I know quite a bit about gesture, multitouch, and multi-user interaction, so it wouldn't be too much of a stretch.


My motivation?

As a school psychologist, my main assignment is a school/program for students with disabilities, including about 40 who have autism spectrum disorders. Yesterday, the principal of the school attended a demonstration of the Kinect and requested that our school be considered for piloting it. One of my other assignments is a magnet high school for technology and the arts, and rumor has it that it will be offering a game programming curriculum. I'd love to co-sponsor an after-school game club and encourage the students to program educational apps for the Kinect sometime in the near future!


I'm also working as a client, in collaboration with some of my educator colleagues, with a team of university students who are creating a communication/social skills game suite geared for students with autism and related disabilities.


I'm inspired by the possibilities!


We have large SMARTboards in each classroom and in other locations around the building, and we have a Wii set up in the large therapy room adjacent to my office. The Wii has proven to be very useful in helping the students develop social and leisure skills that they can use in and outside of the school settings, but some of the students have difficulty manipulating the buttons on the controllers.


You can get Kinect-based apps from the Kinect Education website.

You can also get additional information from the Microsoft in Education "Kinect in the Classroom" website.

Below are a few videos to give you an overview of how open-source applications designed for the Kinect can be used in education: 






Jul 24, 2011

Video: Kinect SoundWall, links to info and code!





Here is information about the project from the KinectHacks SoundWall site:

"Kinect sound machines become prettier and easier with each development! The Kinect SoundWall is a drum beat music machine controlled by gestures and voice commands. This video displays the digital music machine at work and shows how, through various gesture and voice commands, users can create awesome beats to dance to. In the video, the user gestures to certain blocks on the screen in order to create a beat there or render the beats void. Through various voice commands, the beat can start, increase in tempo, stop, etc. Through the proper integration of both voice and gesture commands, the Kinect SoundWall sets the standard for a great and efficient sound machine on the Kinect!"
"For more information about the Kinect SoundWall visit the project’s website."
RELATED
Vertigo SoundWall CodePlex Project Site

Apr 29, 2011

Musical Multitouch/Gestural Interfaces by Osmosis

I've been pleasantly surprised by the increase in interesting multi-touch and gesture-based applications developed for musical interaction on large displays. This topic is dear to my heart - I took a computer music technology class back in 2003, and I think it would have been great if this sort of thing had been available back then. Who wants to point and click around a music app for hours on end?!


The video below shows what's new from Osmosis, a company based in NY that focuses on the development of multi-touch and gesture-based software for a range of uses, including music applications.


Performance Systems for Stage/Studio from Osmosis on Vimeo.

Transparent Stage System Specs
Design
• Floating, transparent HD displays from 32"
• Haptic surface with tempered glass backing
• High gain image with wide viewing angle
• Rugged aircraft-grade aluminum build
• Enclosed projector and computer
• Minimalist style, compact footprint
• Disassembles for easy transportation


Interactivity
• Projected capacitive foil or IR bezel options
• Up to 32 simultaneous touch points
• Precise, responsive touch tracking (3mm)
• Immune to external light conditions
• Use of fingers, gloves or stylus


Technical
Display:
• DLP portable projector
• 1280×800 HD resolution
• 2500 ANSI-lumen, 1800:1 contrast
Computer:
• Mini-ITX, Core 2 Quad, 4GB RAM
• ATI Radeon X1250 graphics card
• Wireless keyboard and mouse
• Windows 7 Pro

Stuart McClean, the founder of Osmosis, shared the following information about his company:

"Osmosis is a consulting firm based in the NYC area with deep experience in interactive technology. Although we cater to a range of markets, we’re especially passionate about music production and performance. Working closely with artists, we build customized interactive systems for stage and studio. HCI technology is integrated into a range of designs including stands, desktop rigs, tables, carts and vertical screens. Interfaces are tailored to specific artist needs and combine controls, generative audio and visuals, instruments, and gestural input. Our unique and flexible systems take full advantage of multi-touch interaction and offer seamless control of Ableton Live, Traktor, or other DAWs via midi and OSC..."
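For the tech-curious: the OSC (Open Sound Control) messages that systems like Osmosis's use to drive Ableton Live and other DAWs are small binary packets with a simple layout. The sketch below hand-encodes a minimal OSC message in Python to show the wire format; the address "/live/play" is just a hypothetical example, not an address Osmosis actually uses, and a real project would use an OSC library rather than rolling its own.

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Pad to a multiple of 4 bytes, as the OSC spec requires."""
    return data + b"\x00" * (4 - len(data) % 4 if len(data) % 4 else 0)

def encode_osc_int(address: str, value: int) -> bytes:
    """Encode an OSC message carrying a single int32 argument."""
    addr = osc_pad(address.encode("ascii") + b"\x00")  # null-terminated, 4-byte aligned
    tags = osc_pad(b",i" + b"\x00")                    # type tag string: one int32
    return addr + tags + struct.pack(">i", value)      # big-endian payload

# A hypothetical transport-control message:
msg = encode_osc_int("/live/play", 1)
```

Every field is padded to a 4-byte boundary, which is why OSC is so easy to parse on embedded and audio hardware.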




For more information about the applications developed by the Osmosis team, take a look at their showcase page.  

Apr 22, 2011

Pervasive Retail Part I: Web UX Meets Retail CX - Screens Large and Small at the Mall, Revisited

If you follow my blog(s), you know that I have a passion for interactive displays in public spaces, and that I enjoy watching how various technologies converge, jump across platforms and devices, inter-operate, and re-purpose over time.  

The best places for watching this unfold, in my opinion, are airports, malls, shopping districts, and larger "big box" establishments, where the Web meets Digital Out of Home (DOOH), old-fashioned kiosks morph into multi-touch screens and gesture-based windows, and visual merchandising meets technology, digital culture, architecture, and consumer metrics. At the center of it all is the user/consumer - regular people, moms, dads, kids, teens, the elderly, the disabled, the hurried and the worried. Adding to the complexity is that an increasing number of people who are out-and-about are tethered to various mobile devices.

In scholarly tech circles, the concept of DOOH is known as "Pervasive Retail". The explosion of mobile devices and ubiquitous screens has fueled the fire for research, and pervasive retail is the focus of the current issue of IEEE Pervasive Computing.

Despite the influx of technology, no one is exactly sure how to do it quite right. (I have some ideas, which I'll save for a future post.)

If you are interested in learning more about concepts related to "pervasive retail", the Retail Customer Experience website is a treasure trove of information related to DOOH, digital signage, multi-channel retailing, in-store media, kiosks, interactive touch screens and windows, related metrics, and more, with stories about real-life technology implementation.


Mall Video
The following video, taken with my handy HTC Incredible, provides a quick sampling of the screens I encountered during a recent visit to South Park Mall, in Charlotte, N.C.  The last screens in the clip were taken in the Brookstone store, and will be included in another clip that focuses solely on all of the screens that were scattered about the retail space.  


I have a hunch that some of the smaller displays in the Brookstone store were iPads. iPads and tablets have great potential for shelf-level in-store interactive visual merchandising deployments, given the right apps and mounting systems. (See iPads as Cheap Digital Signage, by Tony Hymes of DOOHSocial, and the video about Premier's iPad mounts, for more information.)

Much of what you'll see in the following video, taken at the same mall in December of 2009, wasn't around during my most recent trip:
Screens Large and Small at the Mall

Interactive Coke Machine and Kid at the Mall

I was sad to see that the interactive screen on the Coke machine had been replaced by an ordinary one. Part of the problem, I think, is that the interactive display was too busy and, as a consequence, made the goal of getting a quick drink a bit too complicated for the average thirsty customer, as seen in the video below:


Touch Screen Coke Machine at the Mall: 90 seconds to get a coke!

RELATED


References and Resources (Partial List)
Ron Brunt, InTouch with Retailing Whitepaper, 1/15/06
Brian Monahan, IPG Emerging Media Blog, 4/15/11
When all the world is a screen (The video is worth taking the time to watch.)
Narayanswami, C., Kruger, A., and Marmasse, N., "Pervasive Retail," IEEE Pervasive Computing, vol. 10, no. 2, April-June 2011, pp. 16-18.
References from the Pervasive Retail article:
Mobile Retail Blueprint, Nat'l Retail Federation; www.nrf.com/modules.php?name=Pages&op=viewlive&sp_id=1268.
G. Belkin, Pervasive Retail Business Intelligence, Aberdeen Group, Apr. 2010; www.slideshare.net/AxiomConsultingAustralia/pervasive-retail-business-intelligence.
R. Wasinger, A. Krüger, and O. Jacobs, "Integrating Intra and Extra Gestures into a Mobile and Multimodal Shopping Assistant," Proc. 3rd Int'l Conf. Pervasive Computing (Pervasive), Springer, 2005, pp. 297-314.
A. Meschtscherjakov et al., "Enhanced Shopping: A Dynamic Map in a Retail Store," Proc. 10th Int'l Conf. Ubiquitous Computing (UbiComp 08), ACM Press, 2008, pp. 336-339.
C. Stahl and J. Haupert, "Taking Location Modelling to New Levels: A Map Modelling Toolkit for Intelligent Environments," Proc. Int'l Workshop Location- and Context-Awareness (LoCA), LNCS 3987, Springer, 2006, pp. 74-85.

Feb 24, 2011

Vision-Based Hand-Gesture Applications: Video from Communications of the ACM



The latest edition of Communications of the ACM, via "snail mail", was the inspiration for this post:



Vision-Based Hand-Gesture Applications
Juan Pablo Wachs, Mathias Kolsch, Helman Stern and Yael Edan

"Body posture and finger pointing are a natural modality for human-machine interaction, but first the system must know what it's seeing."


More to come!

Dec 11, 2010

Gesture "multitouch" 12 x 7 interactive video wall provides tours of I/O Data Centers' facilities

I came across this demonstration of I/O Data Centers' 12-foot by 7-foot interactive video wall that makes playing around with views of data center modules...interesting! The display is a gesture-based "multi-touch" system. (I'll update this post when I get more information.)



Here is the description from the Datacenter YouTube channel:


"Instead of hauling a 40-foot long modular data center to a trade show, i/o Data Centers is taking a high-tech approach to customer tours of their i/o Anywhere modular data center. The i/o team has created a 12-foot by 7-foot touchscreen video wall to provide interactive tours of the company's facilities. Selecting a "hot spot" pops up a virtual data center, complete with cross sections and product info, following the concept of the touch screens in the sci-fi movie "Minority Report.""


FYI: I/O Data Centers has an application that runs on the Surface.

UPCOMING:
Stay tuned for my upcoming posts! 


News about LM3LABS (Previous post)
Interactive Surveillance: Celine Latulipe (technologist) and Annabel Manning (artist)

Dec 3, 2010

More gesture and multi-touch interaction! Windows 7 Navigation with Kinect; Product Browser by Immersive Labs

Here are a couple of new natural user interface videos.  The first video, by Evoluce, demonstrates gesture interaction/navigation in Windows 7 applications supported by Kinect. The second video, by Immersive Labs, shows multi-touch product browsing interaction on a large display.

Kinect Treatment of Windows 7, by Evoluce

Evoluce: Leading Surface Technologies


Immersive Labs - Multi-touch Product Browser

Immersive Labs

Nov 30, 2010

TuioKinect, by Martin Kaltenbrunner: "A simple TUIO hand gesture tracker for Kinect"

More Kinect from Martin Kaltenbrunner:


Martin Kaltenbrunner's description of TuioKinect:
"TuioKinect tracks simple hand gestures using the Kinect controller and sends control data based on the TUIO protocol. This allows the rapid creation of gesture enabled applications with any platform or environment that supports TUIO (tuio.org). You can download the application from code.google.com/p/tuiokinect/. Music: Jabon Jabon by El Club de los Astronautas (Institut Fatima)"


I've played around with TUIO and OpenFrameworks, but it has been a while. I can't wait until I have time to dig into this with a Kinect. I think this has great potential for supporting learning and communication among students with special needs.
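For the tech-curious: on the receiving side, a TUIO client mostly boils down to bookkeeping over the protocol's "set" messages (which carry each cursor's position) and "alive" messages (which list every live session ID). Below is a minimal sketch of that bookkeeping in plain Python; the class and method names are my own invention, not TuioKinect's API, and real code would receive these messages over UDP via an OSC library.

```python
class TuioCursorTracker:
    """Tracks TUIO 2D cursors ("/tuio/2Dcur") from decoded set/alive messages."""

    def __init__(self):
        self.cursors = {}  # session id -> (x, y), normalized 0..1

    def on_set(self, session_id, x, y):
        # A "set" message carries the current position of one cursor (hand)
        self.cursors[session_id] = (x, y)

    def on_alive(self, *session_ids):
        # An "alive" message lists every live session; anything absent has lifted off
        self.cursors = {s: p for s, p in self.cursors.items() if s in session_ids}

tracker = TuioCursorTracker()
tracker.on_set(1, 0.25, 0.5)   # hand 1 appears mid-left
tracker.on_set(2, 0.75, 0.5)   # hand 2 appears mid-right
tracker.on_alive(2)            # hand 1 drops out of view
```

This "alive-list" design is what lets TUIO clients recover cleanly from lost packets: state is re-synchronized on every frame.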

RELATED/SOMEWHAT RELATED
TuioKinect: TUIO Hand Tracker for Kinect
Martin Kaltenbrunner, Tangible Interaction Frameworks, 11/27/10
Therenect: Theremin for the Kinect! (via Martin Kaltenbrunner)
Xbox Kinect Interactive Puppet Prototype, from Theo Watson and Emily Bobeille, creators of Funky Forest
Hacked Kinect Multitouch using libFreenect and libTISCH (via Florian Echtler)

Nov 13, 2010

HACKED KINECT MULTITOUCH using libFreenect and libTISCH (via Florian Echtler)

MULTI-TOUCH WITH HACKED KINECT
Here is NUI Group member Florian Echtler's proof-of-concept HD video of using a hacked Kinect camera for multitouch-like interaction. The application was built on Ubuntu Linux using libfreenect, by marcan42, and Florian's own creation, libTISCH.



Florian decided to use picture-browsing interaction to demonstrate the proof of concept, so "everybody can focus on more interesting things :-)"
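For the tech-curious: the core trick behind Kinect-as-multitouch is depth thresholding - capture a depth image of the bare surface as a background, then treat any pixel that sits within a thin band above that surface as a "touch". The NumPy sketch below illustrates the idea; the band values are made-up assumptions, and in the real setup libfreenect does the capture while libTISCH does the blob tracking on top.

```python
import numpy as np

def touch_mask(depth_mm, background_mm, near_mm=5, far_mm=25):
    """True wherever something hovers within [near_mm, far_mm] of the surface."""
    height = background_mm.astype(np.int32) - depth_mm.astype(np.int32)
    return (height >= near_mm) & (height <= far_mm)

background = np.full((4, 4), 1000, dtype=np.uint16)  # flat table, 1 m from the Kinect
frame = background.copy()
frame[1, 1] = 990   # fingertip 10 mm above the surface -> counts as a touch
frame[2, 2] = 800   # forearm 200 mm above -> too far, ignored
mask = touch_mask(frame, background)
```

In practice the mask is then fed to a connected-component/blob tracker to produce stable touch points with IDs, which is exactly the layer a framework like libTISCH provides.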


(I have SO many ideas for this!  I'll throw a few out there in an upcoming post....maybe someone can run with them!)


RELATED
Hacked Kinect taught to work as multitouch interface
Paul Miller, engadget, 11/11/10


FOR THE TECH-CURIOUS:
TISCH stands for Tangible Interactive Surfaces for Collaboration between Humans, and is a cross-platform, cross-device multi-touch development framework. You can download the source package for Windows, MacOS X, and Linux from the TISCH Sourceforge website. The Ubuntu Lucid/Karmic version has "superquick installation via PPA" - the instructions can be found on the same site.


LibFreenect- Open Source PC Drivers for Kinect
Xan Tium, XBLOG 360 11/10/10

Marcan is Hector Martin Cantero, the author of the Abort, Retry, Hack? blog.

For your convenience, I've reposted something I wrote about libTISCH back in 2009:

For techies (and the tech-curious) who like technologies that support collaboration and multi-touch interaction,  this is great news!

Florian Echtler announced the first stable release of libTISCH, a multi-touch development framework, which can be found on Sourceforge. TISCH stands for Tangible Interactive Surfaces for Collaboration between Humans. libTISCH, a C++ software framework, is included in this project. It provides a means for creating GUIs based on multi-touch and/or tangible input devices.

Here is how it works:

Architecture Layers

Here is information from the libTISCH announcement:

Highlights of this release are, among others, the following features:

- ready-to-use multitouch widgets based on OpenGL
- reconfigurable, hardware-independent gesture recognition engine
- support for widely used, pre-defined gestures (move, scale, rotate, ...) as well as custom-defined gestures
- hardware drivers for FTIR, DI, Wiimote, DiamondTouch, ...
- TUIO converters: source and sink
- cross-platform: Linux, MacOS X, Windows (32 and 64 bit)
- cross-language: C++ with bindings for C#, Java, Python

libTISCH has a lot to offer for the multitouch developer. For example, the textured widgets enable rapid development of applications for many kinds of multi-touch or tangible interfaces. The separate gesture recognition engine allows the translation of a wide range of highly configurable gestures into pre-defined or custom events which are then acted on by the widgets. While the lower layers of libTISCH provide functionality similar to tbeta, touche, etc. (you can interface existing TUIO-based software with libTISCH in both directions), it goes far beyond them.
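The gesture-engine idea described above - translating raw touch input into pre-defined events like move, scale, and rotate - can be illustrated with a toy example. The sketch below is plain Python and is not libTISCH's actual API; it just shows how one of those pre-defined gestures, a two-finger scale (pinch/spread), reduces to a ratio of finger distances.

```python
import math

def scale_gesture(start_points, end_points):
    """Scale factor implied by a two-finger pinch/spread motion."""
    def spread(points):
        (x0, y0), (x1, y1) = points
        return math.hypot(x1 - x0, y1 - y0)
    return spread(end_points) / spread(start_points)

# Fingers move from 1 unit apart to 2 units apart: a 2x zoom-in
factor = scale_gesture([(0, 0), (1, 0)], [(0, 0), (2, 0)])
```

A gesture engine layers bookkeeping on top of arithmetic like this - matching touch IDs across frames and deciding which widget receives the resulting event - which is precisely what makes a framework worth using over hand-rolled code.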

More information about the library and underlying architecture can be found on http://tisch.sf.net/ and in the Sourceforge wiki at
http://sourceforge.net/apps/mediawiki/tisch/


Note:
Dr. Florian Echtler is on the scientific staff at the Technische Universität München in Germany. Be sure to check out his webpage.

I especially like the concept of the MeTaTop: "A Multi-Sensory Table Top System for Medical Procedures" that is linked from Florian's website.


MeTaTop A Multi Sensory Table Top System for Medical Procedures

Oct 31, 2010

Microsoft is acquiring Canesta, Inc., a developer of 3-D electronic perception technology for natural user interaction, gaming, and more.

Microsoft to Acquire 3-D Chip Firm Canesta
Michael Baron, TheStreet 10/29/10

Thanks to Harry Van Der Veen, of NUITEQ, for this link!

RELATED
The following video is from the Canesta3D YouTube channel. It demonstrates the 3D input sensor in action, with four people moving around in a living room. The chip used in the system depicted in the video was the precursor to the current chip, called the "Cobra 320x200".


Below is a demo of gesture interaction using Canesta3D technology to control and select information and content on a large display. In my opinion, this will change the way we interact with our TVs, at least for those of us who hate using bad remotes! Microsoft's acquisition of Canesta is good news, especially if they allow this technology to be used by the masses. I'm pretty sure it is capable of supporting interaction with HD TVs that are internet-ready, and can support GoogleTV, Leanback, and Vimeo's Couch Mode.




Canesta Announces Definitive Agreement to be Acquired by Microsoft
Press Release, 10/29/10, Canesta

About Canesta (From the Canesta website)
"Canesta (www.canesta.com) is the inventor of revolutionary, low cost electronic perception technology and leading provider of single chip CMOS 3-D sensors that fundamentally change the relationship between devices and their users. This capability makes possible true 3-D perception as input to everyday devices, rather than the widely understood 3-D representational technologies as output. Canesta’s 3-D input technology, based upon tiny, CMOS 3-D imaging chips or “sensors”, enables fine-grained, 3-dimensional depth-perception in a wide range of applications. Products based on this capability can then react on sight to the actions or motions of individuals and objects in their field of view, gaining levels of functionality and ease of use that were simply not possible in an era when such devices were blind. Canesta’s focus is on mass market consumer electronics, but many applications exist in other markets as well. Canesta is located in Sunnyvale, CA. The company has filed in excess of fifty patents, 44 of which have been granted so far."
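For the tech-curious: Canesta's "electronic perception" chips are time-of-flight sensors - depth comes from how long modulated light takes to bounce off the scene and return. The underlying arithmetic is tiny, as the sketch below shows. It is illustrative only: real sensors measure a phase shift of the modulated light per pixel rather than a raw round-trip time, and the modulation frequency used here is a made-up example.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def depth_from_round_trip(t_seconds):
    """Light travels out and back, so depth is half the round-trip distance."""
    return C * t_seconds / 2.0

def depth_from_phase(phase_rad, mod_freq_hz):
    """Phase-shift ToF: depth = c * phase / (4 * pi * f_mod)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

# A round-trip delay of 10 nanoseconds puts the object about 1.5 m away:
d = depth_from_round_trip(10e-9)
```

The phase formulation also shows why these sensors have a maximum unambiguous range: once the phase wraps past 2*pi, a far object aliases to a near one, so the modulation frequency is a trade-off between range and depth resolution.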


Canesta Corporate Fact Sheet (pdf)
Videos: http://canesta.com/applications/consumer-electronics/gesture-controls

I posted some videos about Canesta's technologies in the following post, including two videos that show how Canesta's 3D depth camera works with a Hitachi flat-panel display: Interactive Displays 2009 Conference

For more information about interactive TV, GoogleTV, Leanback and Couch Mode, see the second section of my recent post:
Philipp Geist: Blending the Physical with the Digital;  Google TV/Leanback, Vimeo's new Couch Mode, oh..and ViewSonic's 3D (glasses-less) pocket camcorder...

Sep 18, 2010

Interactive 360 Degree Glass-less 3D Video Display with Gesture Sensor: Demo of Sony's RayModeler

The video below gives a demo of Sony's RayModeler, "a 360-degree display that doesn't require glasses". It shows how the auto-stereoscopic 3D content is filmed, and how items within the display respond to gesture interaction. The first prototype was introduced in 2009 and then brought out at the SIGGRAPH conference this summer.



According to an article written by Richard Lawler, Core77 created "Breakout" for the RayModeler, a game similar to Pong.  I'll have to think more about this technology before I form an opinion!

RELATED
Sony's 360-degree RayModeler 3D display brings its glasses-free act to LA, plays Breakout
Richard Lawler, Engadget 7/28/10

Sony's 360-degree 3D display prototype makes virtual pets more lifelike, expensive
Thomas Ricker, Engadget 10/19/09

Jul 6, 2010

Samsung Transparent OLED + Wedge Camera, Glassless 3D, Telepresence, Mid-air Interaction: Applying Science at Microsoft

The Microsoft Applied Sciences Group has been working on several projects that have the potential to change how we interact with various displays and surfaces in the very near future. Here's some of what came across my RSS feeds and Google Alerts this morning:
INAVATE  July 5, 2010

According to an article in InAVate, "Microsoft has combined Samsung’s transparent OLED with a sub-two-inch camera to revolutionize the Microsoft Surface platform. The touchless telepresence screen creates a 3D gesture-control interface that tracks movement by seeing through the display. The company’s Applied Sciences Group has also added its recently revealed wedge shaped lens, that InAVate reported on last month, to deliver glasses-free 3D content...the latest breakthrough could revolutionize the Surface concept, taking touch away from the display and projecting the images in 3D." - InAVate, 7/5/2010


3D Gesture Interaction

"In this demonstration, we've placed the Microsoft Applied Science's wedge technology behind Samsung's transparent OLED display. This enables a camera to image through the display, see the user's hand above it, and alter the image based upon her gestures." -Microsoft Applied Sciences Group

3D Without the Glasses: A new type of display from Microsoft produces multiple images and tracks the viewer's eyes - Kate Greene, MIT Technology Review (6/11/2010)

According to an article in MIT's Technology Review, "the new lens, which is thinner at the bottom than at the top, steers light to a viewer's eyes by switching light-emitting diodes along its bottom edge on and off. Combined with a backlight, this makes it possible to show different images to different viewers, or to create a stereoscopic (3-D) effect by presenting different images to a person's left and right eye. "What's so special about this lens is that it allows us to control where the light goes," says Steven Bathiche, director of Microsoft's Applied Sciences Group." -Kate Greene, Technology Review

Steerable Multi-view Display

"In this demonstration, we use head tracking to determine where multiple users are. Then, with the Microsoft Applied Sciences' wedge technology, we steer completely independent images to each user. In the video, one user is seeing a sun while at the same time another is seeing a rocket. This is maintained even as the users change positions relative to each other." -Microsoft Applied Sciences Group
Transparent Display for Telepresence

"In this demonstration, we've placed the Microsoft Applied Science's wedge technology behind Samsung's transparent OLED display. This enables a camera to image directly through the display. In the video, objects held up to the screen are captured and shown to the user on the other side of the telepresence communication (the other monitor in the video), while far away from the screen, the display shows the user a view dependent image."-Microsoft Applied Sciences Group
Steerable 3D Auto Stereo Display

"In this demonstration, we use head tracking to determine where a user's eyes are. Then, with the Microsoft Applied Sciences' wedge technology, we steer different views of the scene to each eye to produce a 3D image without the need for glasses or for fixing the location of the user." -Microsoft Applied Sciences Group
Mid-air Interactive Display

"In this demonstration, we illuminate objects above the display with infrared light. We capture the reflection using the Microsoft Applied Sciences' wedge technology. This enables us to see above the display while keeping the form factor small. Seeing above the display allows us to track the interaction between direct contacts on the display. In the video, the user associates a function (color choice) with one hand and a different function (zoom/rotation) with the other hand. This tool persistence is maintained regardless of the relative positions of the hands." -Microsoft Applied Sciences Group

RELATED
About Microsoft Applied Sciences Group
"The Applied Sciences Group (ASG) is an applied research and development team dedicated to creating the next generation of computer interaction technologies. The interdisciplinary group focuses on the synergy between optics, electronics and software to create novel human computer interfaces. The ASG is part of the Entertainment and Devices Division at Microsoft Corp. and mainly supports projects for Microsoft Hardware, Xbox, and Microsoft Surface. It also works closely with Microsoft Research."