Showing posts with label kinect. Show all posts

Aug 10, 2013

Kinect Interaction to support people with disabilities: DoubleFine's Happy Action Theater/Kinect Party; OAK Air Switch and Face Switch, resources

I've been experimenting with gesture and touch-based applications for many years and I'm excited to see how things have unfolded over the past couple of years, especially in the field of special education.

Last week I downloaded DoubleFine's Kinect-based Happy Action Theater/Kinect Party to use during some group activities with students I work with who have significant disabilities (including severe autism).  I wish I had discovered this suite of games sooner!

I had loads of fun with students and colleagues as we explored some of the 36 creative, and sometimes zany, minigames.  I had heard that DoubleFine had launched something special, but didn't realize how awesome it was until I spent some serious playful time with it at home last weekend. I then tried it out at work this past week.  

If you are planning to explore Happy Action Theater/Kinect Party, keep in mind that it plays best with at least two people and an audience to cheer everything along.  Through the use of blob detection algorithms, the games can handle up to 6 players at a time, which is perfect for small-group special classes.

The following trailer gives just a little hint of what this suite of mini-games is all about!


I noted that many of the games were effective in helping students become more aware of their peers. They began to play and interact with one another in ways I hadn't previously imagined.  I especially liked the fact that many of the mini-games made it possible for students in wheelchairs to participate.

I look forward to exploring more of the games over the next few months and will follow up with a future post after I get more input from my colleagues (and students).

I learned about Kinect Party through my contact with people involved with the GestureSEN wiki. The wiki was created as part of a Professional Learning Community (PLC) for people who work with students in specialized schools, similar to the school where I work, and contains a wealth of information about the use of newer and emerging technologies, such as the iPad, Leap Motion, the Kinect, and eye-gaze systems, to support young people with significant disabilities, including autism. Some members of the GestureSEN wiki have learned to code or are in the process of doing so, motivated by what they've experienced so far with their students.  (More information and links are listed in the "RELATED" section of this post.)

OAK

OAK was developed by RCAST at the University of Tokyo in collaboration with Microsoft Japan Co., Ltd. It uses the motion-tracking capabilities of Microsoft's Kinect sensor to create non-contact switches for people with limited mobility, giving them access to computers and other electronic devices and systems.  The video below provides a nice overview of the OAK system.

The OAK Pro bundle includes the following applications:

The Air Switch software uses the distance/depth capabilities of the Kinect sensor to support gestures of the head, hands, or larger body parts to turn things off or on. The Kinect's infrared sensing also allows the Air Switch to work in the dark.  The color mode function captures movements from smaller parts of the body, such as a fingertip.

The Face Switch software uses facial recognition to track movements of the face, mouth, tongue, and eyes.  It can identify facial parts that have moved significantly, and it records motion data.

The Motion History software observes the movement of a person's body using the video component of the Kinect sensor.  This customizes the system to the individual and ensures the accuracy of the switch.  Movements are color-coded, giving the person setting up the system a means to fit it to the specific capabilities and needs of the user.
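The core idea behind a depth-based switch like the Air Switch can be illustrated in a few lines: the switch fires when part of the body enters a calibrated depth zone in front of the sensor. The sketch below is my own toy illustration of that idea in Python, with an invented function name and a made-up 4x4 depth frame; it is not OAK's actual software or API.

```python
def air_switch_triggered(depth_frame, roi, near_mm, far_mm):
    """Return True if any pixel in the region of interest (top, left,
    bottom, right) falls inside the activation zone, i.e. between
    near_mm and far_mm from the sensor."""
    top, left, bottom, right = roi
    for row in depth_frame[top:bottom]:
        for depth in row[left:right]:
            if near_mm <= depth <= far_mm:
                return True
    return False

# Toy 4x4 depth frame in millimetres; one "hand" pixel at 800 mm,
# everything else (the background wall) at 2000 mm.
frame = [
    [2000, 2000, 2000, 2000],
    [2000,  800, 2000, 2000],
    [2000, 2000, 2000, 2000],
    [2000, 2000, 2000, 2000],
]

print(air_switch_triggered(frame, (0, 0, 4, 4), 500, 1000))  # True
print(air_switch_triggered(frame, (2, 0, 4, 4), 500, 1000))  # False
```

Restricting the check to a region of interest is what makes the switch fit the user: a small ROI near a fingertip gives a fine-motor switch, while a large ROI gives a gross-motor one.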

The OAK system can be enhanced by the use of peripherals, such as a USB 4-channel relay box, an IR remote control device or outlet, or other on/off switches/outlets.

The Assist-i corporation has made the OAK system and peripherals available on Amazon Japan.  From what I can tell from the company's website, the OAK software can be downloaded free for a 30-day trial.   I'd love to see how it would work with some of the students I work with who have difficulty accessing conventional switches!  It would be wonderful to come up with ways for these students to access a wider range of digital media activities and games.

RELATED
University adapting videogame technology to help physically disabled computer users
Philip Kendall, Japan Today, 10/10/12
OAK Air Switch (PC Kinect)
OneSwitch.org.uk 4/30/13
OAK Air Switch, Face Switch, Motion History Pro Bundle (pdf)
Assist-i Corporation
Amazon Ai store: Assist-i Corporation (Prices are in Yen.)

Below is a partial list of links to resources related to using or creating engaging interactive applications and games for people with special needs: 

Using Kinect in Special Ed Classrooms: Advice from Loudoun County, Virginia Teachers
Microsoft in Education Team, Microsoft in Education Blog, 6/1/12

KinectSEN-Kinect and Special Educational Needs round-up
Greg Dunan, Microsoft Coding4Fun, 10/11/12

Monkeying Around with Autism Assessments: Kinect-based game by Vectorform and Kaiser Permanente therapists offers a barrel of possibilities!
Lynn Marentette, Interactive Multimedia Technology, 7/23/13

Behind the Scenes: Creating Marty the Monkey (The character from Vectorform's autism assessment app) John Einselen, Vectorform Blog, 7/24/13


Kinect Party Review: More Fun from the Fun Kings
Casey Lynch, IGN, 12/20/12


The Power of Kinect in Special Needs Education
Willemijn de Lint,  Hans Smeele, mytylschool De Ruimte

Sign Language Recognition and Translation with Kinect (pdf)
Ming Zhou, et al.

Cool Kinect move: Reading sign language in real time
Christopher MacManus, CNET, 7/18/13

Anthony Rhys, Trinity Fields ICT

James Winchester, SENClassroom blog

PMLD Eyegaze Project at Trinity Fields

Kinect hacking using Processing

Kinect SEN and Processing Resources
Keith Manville, Oak Grove College OpenSEN

Mat's Classroom Blog

GestureSEN Wiki
KinectSEN Wiki; KinectSEN News
ProcessingSEN wiki
LeapSEN Wiki
EyegazeSEN Wiki

SEN Students and Coding
OpenSEN, 3/5/13

Processing2

Kinect for Windows Blog

Kinect For Windows
DoubleFine

Understanding Engagement, Module 3.2: Training materials for teachers of learners with severe, profound and complex learning difficulties, UK Dept. for Education



May 21, 2013

Xbox One and Kinect 2 for the Playground of the Future


The big news in tech today is the unveiling of the new Xbox One/Kinect 2 system.  For now, the video below might be the closest you'll get to the system.  Wired's senior editor, Peter Rubin, had a chance to interview Microsoft's Scott Evans as he demonstrated the fascinating technical details in a family-room-type setting.

Wired's interview of Scott Evans and demo of the new Xbox One and Kinect 2, using Active IR technology.



From what I learned, the new Kinect sensor has six times the fidelity of the previous version. Paired with the new Xbox One, it can do amazing things.  Engineers from around the world collaborated on this project, providing expertise in facial recognition, digital signal processing, speech recognition, machine learning, and computer vision.  The Xbox One is powered by an 8-core x86 processor and 8GB of RAM, which should handle even the most demanding gamer's needs. It also includes a 500GB hard drive and a Blu-ray player.


The new system was designed to enhance the gaming/user experience. The 1080p camera provides a field of view that is 60 degrees larger than its predecessor's, and it can handle a high level of detail.  It provides a better means of interpreting movement and orientation, and it processes skeleton and hand movements more precisely.  The system features "muscle man," a human-based physics model layered over the skeleton and depth map that senses and calculates the forces the player uses while moving in a game.

What I find interesting is that the camera can detect the player's pulse by measuring subtle changes of the skin that can't be perceived by the naked eye.  It also can quickly identify each player (it handles up to six), and identify facial expressions.  The active IR (infrared) system provides the system with better accuracy than the original Kinect. 
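The camera-based pulse detection described above resembles remote photoplethysmography: the heartbeat leaves a tiny periodic fluctuation in the frame-averaged color of the skin, and the pulse rate is the dominant frequency of that fluctuation. Below is my own toy sketch of the idea (not Microsoft's method), using an invented function name and a synthetic brightness signal, and recovering the rate by counting rising zero-crossings.

```python
import math

def estimate_bpm(samples, fps):
    """Estimate pulse rate from frame-averaged skin brightness by
    counting rising zero-crossings of the mean-removed signal."""
    mean = sum(samples) / len(samples)
    centered = [s - mean for s in samples]
    rising = sum(1 for a, b in zip(centered, centered[1:]) if a < 0 <= b)
    duration_s = len(samples) / fps
    return rising / duration_s * 60.0

# Synthetic ~10-second signal: a 1.2 Hz oscillation (72 beats per
# minute) sampled at 30 frames per second, standing in for the real
# frame-to-frame skin brightness measurements.
fps = 30
signal = [math.sin(2 * math.pi * 1.2 * t / fps + 0.1)
          for t in range(10 * fps + 1)]
print(round(estimate_bpm(signal, fps)))  # 72
```

A real system would also have to filter out motion and lighting changes, which is exactly why the claim that the sensor can do this reliably at living-room distance is impressive.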

I wasn't able to find out much information regarding privacy issues with this system.  This is a concern, since it can sense your physiological responses, movement patterns, and facial expressions.  Over time, a good deal of very personal information would be gathered about each user. I shudder to think about the consequences if the data fell into the wrong hands.  

Possibilities for Special Needs Populations

I can see that the Xbox One + Kinect 2 system has the potential for games and other interactive applications for use in physical rehabilitation and fitness.  Since it can interpret facial expressions, it could also provide a way to support social skills learning among children and teens who have autism spectrum disorders.

RELATED

Microsoft invests a good deal of attention in proof-of-concept projects that may or may not become part of a commercial product.  Below is an example of IllumiRoom:


Hrvoje Benko, of Microsoft Research, discusses the IllumiRoom concept during an interview at CHI 2013.


Xbox One Website
The new Xbox One Kinect tracks your heart rate, happiness, hands and hollers
Matthew Panzarino, The Next Web, 5/22/13
Kinect 2 Full Video Walkthrough: The Xbox Sees You Like Never Before
Kyle Wagner, Gizmodo, 5/21/13
Hands-on with prototypes of the Xbox One and New Kinect Sensor
Ben Gilbert, engadget, 5/21/13
Efficient Human Pose Estimation from Single Depth Images
Shotton, J., Girshick, R., Fitzgibbon, A., Sharp, T., Cook, M., Finocchio, M., Moore, R., Kohli, P., Criminisi, A., Kipman, A., Blake, A.   Video
Consumer Depth Cameras for Computer Vision:  Research Topics and Applications
Fossati, A., Gall, J., Grabner, H., Ren, X., Konolige, K. (Eds.)
Xbox One: Microsoft's supergeeks reveal what's inside the hardware
Dean Takahashi, VentureBeat, 5/21/13
Next Xbox Will Face New Array of Rivals
Nick Wingfield, New York Times, 5/21/13

Mar 16, 2013

UPDATE: What's New for Kinect? Fusion, real-time 3D digitizing, design considerations, and more.

The Evolution of Microsoft Kinect

I've been following the evolution of Microsoft's Kinect, and recently discovered a few interesting videos that show how far the system has come. According to Josh Blake, the founder of the OpenKinect community and author of the Deconstructing the NUI blog,  the Kinect for Windows SDK v1.7 will be released on Monday, March 18th, from http://www.kinectforwindows.com.  More details about this version can be found on Josh's blog as well as the official Kinect for Windows blog.


It is possible to create applications for desktop systems that work with the Kinect in interesting ways, as you'll see in the following videos. I think there is potential here for use in education/edutainment!

Below is a video of Toby Sharp, of Microsoft Research, Cambridge, demonstrating Kinect Fusion.  The software allows you to use a regular Kinect camera to reconstruct the world in 3D.



KinEtre: A Novel Way to Bring Computer Animation to Life
According to information from the YouTube description, "KinÊtre is a research project from Microsoft Research Cambridge that allows novice users to scan physical objects and bring them to life in seconds by using their own bodies to animate them. This system has a multitude of potential uses for interactive storytelling, physical gaming, or more immersive communications."




The following videos are quite long, so feel free to re-visit this post when you have time to relax and take it all in!

Kinect Design Considerations
This video covers Microsoft's Human Interface Guidelines, scenarios for interaction and use, and best practices for user interactions.  It also includes a preview of the next major version of the Kinect SDK. 


Kinect for Windows Programming Deep Dive
This video discusses how to build Windows Desktop apps and experiences with the Kinect, and also previews some future work.




RELATED
Kinect for Windows Developer Downloads
Kinect for Windows Blog
Deconstructing the NUI Blog (Josh Blake)
Microsoft Kinect Learns to Read Hand Gestures, Minority Report-Style Interface Now Possible
Celia Gorman, IEEE Spectrum, 3/13/13
Kinect hand recognition due soon, supports pinch-to-zoom and mouse click gestures.
Tom Warren, The Verge, 3/6/13
Microsoft's KinEtre Animates Household Objects
Samuel K. Moore, IEEE Spectrum, 8/8/12
Kinect Fusion Lets You Build 3-D Models of Anything
Celia Gorman, IEEE Spectrum, 3/6/13
Description of Kinect sessions at Build 2012
Kinect for every developer!
Tom Kerkhove, Kinecting for Windows, 2/15/13
Kinect in the Classroom
Kinect Education

Note: Although I recently received my developer kit for Leap Motion, another gesture-based interface, I haven't lost interest in following news for Kinect.

Mar 11, 2013

Leap Motion: My Dev Kit Arrived - Now What?! Thoughts About "NUI" Child-Computer-Tech-Interaction - and More



My Leap Motion developer kit arrived last week. I carefully unboxed the small device and tried out the demo apps that came with the SDK.  I'm doing more looking than leaping at this point.

I'd like to create a simple cause-and-effect music, art and movement application for my 2-year-old grandson, knowing that he'll be turning three near the end of this year.  It would be nice if my app could provide young children with enough scaffolding to support gameplay and learning over a few years of development.

Now that I'm a grandmother, I've spent some time thinking about what the evolution of NUI will mean for young children like my grandson.   Family and friends captured his first moments after birth with iPhones and shared them across the Internet.  Born into the iWorld, he knows how to use an iPad or smart phone to view his earlier digital self on YouTube, without ever touching a mouse or a physical keyboard.

The little guy is pretty creative in his method of interacting with technology, as I've informally documented on video.   He was seven months old when he first encountered my iPad.  It was fingers-and-toes interaction from the start.

In the first picture below,  he's playing with NodeBeat.  In the second picture, he's 27 months old, experimenting with hand and foot interaction, on a variety of apps.

My grandson is new to motion control applications, so I'm just beginning to learn what he likes, and what he is capable of doing.  A couple of weeks ago, we played River Rush, from the Kinect Adventures game. He loved jumping up and down as he tried to hit the adventure pins. Most of the time, he kept jumping right out of the raft!  (I think next time we'll try Kinect Sesame Street TV or revisit Kinectimals.)


One of the steps I'm taking to prepare for my Leap Motion adventure is taking a look at what people have done with it so far.  At least 12,000 developer kits have been released, so hopefully there will be some interesting apps to go along with the retail version of Leap Motion when it arrives at Best Buy on May 19th of this year.

One app I really like is Adam Somers' AirHarp, featured in the video clip below:


I also like the idea behind the following app, developed by undergraduate students:

Social Sign: Multi-User sign language gesture translator using the Leap Motion Controller (git.to/socialSign)
 
"Built at the PennApps Spring 2013 hackathon, Social Sign is a friendly tool for learning sign language! By using the Leap Motion device, the BadApples team implemented a rudimentary machine learning algorithm to track and identify American Sign Language from a user's hand gestures."

"Social Sign visualizes these hand gestures and broadcasts them in textual and visual representations to other signers in a signing room. In a standard chat room fashion, the interface permits written communication but with the benefit of enhanced learning in mind. It's all about learning a new way to communicate."-BadApples Team



A few NUI-focused tech companies have experimented with Leap Motion. Today, I received a link from Joanna Taccone, of IntuiLab, to the following video clip featuring their most recent work:
Gesture recognition with Leap Motion using IntuiFace Presentation

"Preview of our work with the Leap Motion controller. In the same spirit as our support for Microsoft Kinect, we have encoded true gesture support, not just mouse emulation, for the creation of interactive applications by non-programmers. The goal is to hide complexity from designers using our product, IntuiFace Presentation (IP). Through the use of IP's trigger/action syntax, designers simply select a gesture as a trigger - Swipe Left, Swipe Right, Point, etc. - and associate that gesture with an action like "turn the page" or "rotate the carousel". As you can see in this video, it works quite well. :-) We will offer Leap support as soon as it ships." -IntuiLab



Below is a demonstration of people playing Drop Chord, a collaboration between Leap Motion and Double Fine.  From the video, you can tell that they had a blast!

Here is an excerpt from the chatter:  "The thing is that everyone just looks cool..Yeah, I know, it doesn't matter what you are doing...it's got the right amount of speed-up-slow-down stutter-y stuff...it is like a blend of art and science.."

According to the website, Drop Chord is "a music-driven score challenge game for the Leap Motion controller, coming soon for PC, Mac, & iOS from the creators of Kinect Party."

The following video is a demonstration of the use of Leap Motion to control an avatar and other interaction in Second Life:



Below are a few more videos featuring Leap Motion:


Control Your Computer With a Chopstick: Leap Motion Hands On (Mashable)


The Leap Motion Experience at SXSW 2013


LEAP Motion demo: Visualizer, Windows 8, Fruit Ninja, and More...



RELATED
Air Harp for Leap Motion, Responsive Interaction
Leap Motion and Double Fine team on Dropchord, give air guitar skills an outlet
John Fingas, Engadget, 3/7/13
Leap Motion Controller Set To Ship May 13 for Global Pre-Orders, In Best Buy Stores May 19.
Hands on With Leap Motion's Controller
Lance Ulanoff, Mashable, 3/10/13
Leap Motion website
Social Sign
IntuiLab
Leap Motion: Low Cost Gesture Control for Your Computer Display

SOMEWHAT RELATED
Kinect for Windows Academic: Kaplan Early Learning
"3 years & up. Hands-on play with a purpose -- the next generation way. This unique learning tool uses your body as the game controller making it a great opportunity to combine active play and learning all in one. Use any surface to actively engage kinesthetic, visual, and audio learners. Bundle includes the following software: Word Pop, Directions, Patterns, and Shapes."

Comment:
I've been an enthusiastic supporter of natural user interfaces and interaction for years - back in 2007 I worked on touch-screen applications for large displays as a graduate student and became an early member of the NUI Group.  I'm also a school psychologist, and from my experience, I understand how NUI-based applications and technologies, such as interactive whiteboards and touch tablets like the iPad, can support the learning, communication, and leisure needs of students who have significant special needs.   It looks like Leap Motion and similar technologies have the potential to support a wide range of applications that target special populations of all ages.

Jan 17, 2013

XBox Kinect in the OR: Kinect supports gesture interaction with 3D imaging of the patient, while operating.

Here's an interesting use of technology for health - the Xbox Kinect in the OR!

Thanks to Harry van der Veen for the link!


RELATED
Kinect sensor poised to leap into everyday life
Niall Firth, NewScientist, 1/17/13

For the tech-curious:
PrimeSense (Company that developed the 3D depth sensor that powers the Kinect, the sensor in Ava, a healthcare robot by iRobot, and more.)

OpenNI (Framework for the development of 3D sensing middleware libraries and applications.)

NiTE: Natural Interface Technology for End User (Perception algorithms layer for 3D computer vision, allows for hand locating, tracking, analyzing scenes, and tracking skeleton joints.)

Aug 12, 2012

Tech and Stuff shared by my FB friends.

It seems that the weekend is ripe for sharing interesting things on Facebook, judging from what I've seen from my FB friends.  These are just a few that came my way:


The picture below is from the World is Beautiful FB page. Where?  The Igloo Village of Hotel Kakslauttanen, in Finland.  The igloos are made of glass and, according to the description, provide views of the Aurora Borealis:



In case you missed this--- at about 1:45 the dolphins appear.  Beautiful!

The Blue from Mark Peters on Vimeo.

17 minute video from LEGO about the history of the company:


Context-Aware Computing, by Albrecht Schmidt:


iGlass, shared by Pixelonomics:


Patent application for "peripheral treatment for head-mounted displays", for the above device.

Michael Husted's post:


Shared by Barbara Bray, via Smart Apps for Kids, via Success in Learning



My comment:
"It doesn't hurt to take a few self-defense classes.  I took kickboxing for the exercise and I do not feel defenseless.  As adults, we encounter criminals who are beyond the bully stage, who don'e care if they hurt (or kill) when they want to engage in illegal activities.  It makes sense to do the things that make us strong, healthy, fit, and safe.  This means having the strength to help others during a crisis, such as the shootings at the movie theater and other seemingly "random" acts of local terrorism."

I shared the following picture on Facebook:  
I set up the XBox 360 and the Kinect in the Activities of Daily Living room (it is also the music room), and when I went to take a picture of my rafting adventure, the system took a picture of me!
Photo caption: We got the Kinect working at school. Here's a picture of me taking a picture of the screen as the in-game camera took a picture of me trying to ride the rapids...

Shared by World Sepsis Day - the German delegation's presentation at the Project Fair of the International Federation of Medical Students Association August meeting.



RELATED
Albrecht Schmidt's blog
Interaction Design Foundation:  "Free educational materials - made by the world's technology elite"
Mashable

Jan 3, 2012

"Kinect-based Telepresence with Room Sized 3D Capture and Life Sized Display", Includes Behind-the Scene "how-to" (UNC Chapel Hill)

Jim Spadaccini, of Open Exhibits, recently told me about a project that involves the real-time, interactive 3-D capture of people in a room.  As the viewer moves around the screens, the depth-detecting feature of the Kinect is harnessed to set the stage for a realistic telepresence experience.  


Take the time to view the video, which contains some interesting views of how the system works:





Thanks, Jim, for the link!

RELATED/SOMEWHAT RELATED
Kinect Real-Time Room Telepresence
Kinecthacks, 1/3/12
From the project's website:
Maimone, A. and H. Fuchs. "A First Look at a Telepresence System with Room-Sized Real-Time 3D Capture and Large Tracked Display." The 21st International Conference on Artificial Reality and Telexistence (ICAT) (Osaka, Japan, November 28-30, 2011) [paper] [video]
Maimone, A. and H. Fuchs. "Encumbrance-free Telepresence System with Real-time 3D Capture and Display using Commodity Depth Cameras." The IEEE International Symposium on Mixed and Augmented Reality (ISMAR) 2011 (Basel, Switzerland, October 26-29, 2011) [paper] [video]
Open Exhibits

Dec 13, 2011

Kinect in Education! (kinectEDucation)

Although I'm currently exploring the world of interactive HTML5, interactive video, etc., I think I just might make "kinecteducation" the focus of my tech hobbies. I have some experience with game programming (one of my computer courses required a project using XNA), and I know quite a bit about gesture and multitouch, multi-user interaction, so it wouldn't be too much of a stretch.


My motivation?

As a school psychologist, my main assignment is a school/program for students with disabilities, including about 40 or so who have autism spectrum disorders. Yesterday, the principal of the school attended a demonstration of the Kinect and requested that our school be considered for piloting it. One of my other assignments is a magnet high school for technology and the arts, and rumor has it that it will be offering a game programming curriculum.  I'd love to co-sponsor an after-school game club and encourage the students to program educational apps for the Kinect sometime in the near future! 


I'm also working as a client, in collaboration with some of my educator colleagues, with a team of university students who are creating a communication/social skills game suite geared for students with autism and related disabilities....


I'm inspired by the possibilities!


We have large SMARTboards in each classroom and in other locations around the building, and we have a Wii set up in the large therapy room adjacent to my office. The Wii has proven to be very useful in helping the students develop social and leisure skills that they can use in and outside of the school settings, but some of the students have difficulty manipulating the buttons on the controllers.


You can get Kinect-based apps from the Kinect Education website! Below are selected links from the website:

You can also get additional information from the Microsoft in Education "Kinect in the Classroom" website.

Below are a few videos to give you an overview of how open-source applications designed for the Kinect can be used in education: 






Jul 24, 2011

Video: Kinect SoundWall, links to info and code!





Here is information about the project from the KinectHacks SoundWall site:

"Kinect sound machines become prettier and easier with each development! The Kinect SoundWall is a drum beat music machine controlled by gestures and voice commands. This video by  displays this digital music machine at work and how through various gesture and voice commands, users can create awesome beats to dance to. In the video, the user gestures to to certain blocks in the screen in order to create a beat there or render the beats void. Through various voice commands, the beat can start, increase tempo, stop etc. Through the proper integration of both voice and gesture commands, the Kinect SoundWall sets the standard for a great and efficient sound machine of the Kinect!"
"For more information about the Kinect SoundWall visit the project’s website."
RELATED
Vertigo SoundWall CodePlex Project Site

Jul 18, 2011

Emerging Interactive Multimedia, New Models of HCI for Museum Exhibits (Course offered by Ideum's Jim Spadaccini, plus info about the MT55 multi-touch table)

Jim Spadaccini, the director and founder of Ideum, will be teaching a course on exhibit development through the University of Victoria, "Emerging Exhibits: Exploring New Models of Human Computer Interaction (HCI)."  The excerpt from the course description provides a glimpse of how emerging technologies are beginning to change the museum experience:


"Computer-based interactive exhibits are undergoing a major transformation. The lone, single-user kiosk is now being replaced by multitouch tables and walls, motion-capture spaces, networked installations, and RFID-based exhibits. Advances in augmented reality, voice recognition, eye tracking, and other technologies promise even more radical change for exhibits in the near future."


I've been following Jim's journey with Ideum, a multimedia design firm that collaborates with museums and related non-profits, for many years, and I am impressed with the work of this company.  In addition to his work at Ideum, Jim serves as the Principal Investigator of a National Science Foundation-sponsored open-source exhibit software project, Open Exhibits, which provides a free software development kit that supports the creation of multi-touch and multi-user software applications for museums and educational settings.


I'm happy to put in a plug for Ideum's latest product, the MT55 Platform Multi-Touch Table. It incorporates a range of features that I'm sure will meet the needs of museum visitors.  In my opinion, this table would be a fantastic resource for all types of libraries, including those in K-12 settings.

The MT55 Platform Multi-touch Table, from Ideum

The MT55 Platform Multitouch Table from Ideum on Vimeo.  (Note: This video features music by Moby, the track "Sevastopol" from his current album, Destroyed. The music was used with the artist's permission. Learn more at: moby.com) - Ideum

"The thinnest, largest, most powerful multitouch table available.The MT55 Platform multitouch table houses a powerful computer and a 55-inch interactive LCD display that responds to 32 touch-points, inside a rugged aluminum body."

"The bright 55″ 1920×1080 HD display has a 5,000,000:1 contrast ratio. A wide 178-degree viewing angle accommodates multiple users around the table. The optical multitouch system supports 32 simultaneous touch points for collaborative interaction. The system is multitouch-enabled from start-up, and runs Windows 7 64-bit professional edition."


"The integrated computer is packed with power. It contains an Intel® Hyper-Threaded DualCore i5® which runs at 2.66 GHz, 8GB of RAM, and a 128 GB solid-state drive (upgradeable to an i7®)."


"The table comes complete with WIFI, Bluetooth, and Ethernet connectivity. It also has multiple HDMI outputs that allow you to easily mirror the table's display, extend the desktop, or connect to and display from another computer or HDMI device."

"Convenient, but secure ports: CAT5, HDMI, and USB 2.0 are available on both the side on bottom the table."

"The MT55 Platform includes blue LED under-lights to illuminate the area beneath the interactive surface (custom LED colors are available). Every MT table includes a sophisticated internal cooling cell to maintain operating conditions that exceed the optimum environmental specifications for the internal components."

"The interactive surface of the MT55 Platform protected by a sheet of hardened, crystal clear, low-iron 5mm tempered glass surface...
As an option, we offer Sevasa HapticGlas®, produced exclusively for Ideum. Micro-etched HapticGlas® provides tactile feedback, reduces fingerprints, increases scratch resistance, and directs user focus." -Ideum
 

RELATED
High-res photos of the MT55 Platform

GestureWorks Software
Open Exhibits
Ideum
Open Exhibits Tuio Kinect

Jun 17, 2011

In case you missed this: Microsoft Releases Kinect SDK Beta for PC

Kinect for Windows SDK Beta!   IT IS TRUE!!!!!


My Kinect and PC are waiting for my summer project.    What a great opportunity to "practice" programming over my 5-week summer break..... I already know C#, and I've done a little game programming (i.e., AI for Game Development, using XNA Game Studio Express - it has been a while).
Skeleton tracking image
-Photo credit: Microsoft Research


I have some cool ideas for basic games that might be good for the students I work with who have autism spectrum disorders... and some ideas that might be fun for my grand-baby.  I can't wait to have time to code again!   


Here's some info from the Microsoft Kinect for Windows SDK Beta website:


"The Kinect for Windows SDK beta is a programming toolkit for application developers. It enables the academic and enthusiast communities easy access to the capabilities offered by the Microsoft Kinect device connected to computers running the Windows 7 operating system."


"The Kinect for Windows SDK beta includes drivers, rich APIs for raw sensor streams and human motion tracking, installation documents, and resource materials. It provides Kinect capabilities to developers who build applications with C++, C#, or Visual Basic by using Microsoft Visual Studio 2010."


This SDK includes the following features:


Raw sensor streams
Access to raw data streams from the depth sensor, color camera sensor, and four-element microphone array enables developers to build upon the low-level streams that are generated by the Kinect sensor.
Skeletal tracking
The capability to track the skeleton image of one or two people moving within the Kinect field of view makes it easy to create gesture-driven applications.
Advanced audio capabilities
Audio processing capabilities include sophisticated acoustic noise suppression and echo cancellation, beam formation to identify the current sound source, and integration with the Windows speech recognition API.
Sample code and documentation
The SDK includes more than 100 pages of technical documentation. In addition to built-in help files, the documentation includes detailed walkthroughs for most samples provided with the SDK.
Easy installation
The SDK installs quickly, requires no complex configuration, and the complete installer size is less than 100 MB. Developers can get up and running in just a few minutes with a standard standalone Kinect sensor unit (widely available at retail outlets)."
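The skeletal tracking feature above is what makes gesture-driven applications easy: once the SDK hands you per-joint 3D positions, a gesture like "hand raised above the head" is a simple coordinate comparison. The sketch below is my own illustration of that check, using an invented dictionary of joint positions; the real SDK exposes a typed skeleton structure to C++, C#, or Visual Basic rather than this layout.

```python
def hand_raised(skeleton, margin=0.1):
    """True when either hand is at least `margin` metres above the
    head (y axis pointing up, coordinates in metres)."""
    head_y = skeleton["head"][1]
    return any(
        skeleton[joint][1] > head_y + margin
        for joint in ("hand_left", "hand_right")
    )

# Invented joint positions (x, y, z) for a player 2 m from the sensor,
# right hand raised above the head.
skeleton = {
    "head":       (0.0, 1.6, 2.0),
    "hand_left":  (-0.3, 1.0, 2.0),
    "hand_right": (0.3, 1.85, 2.0),
}
print(hand_raised(skeleton))  # True
```

Checks like this, run against each frame of tracking data, are the building blocks of the gesture-driven educational games I'd like to prototype.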



Nicholas Kolakowski, Application Development News, 6/16/11


May 31, 2011

Top 10 All-Time Posts on the Interactive Multimedia Technology Blog

I'm finishing up the last couple weeks of the school year, so I'll have little time to post this week.  I hope you enjoy exploring the following links!


Revised Post 8/1/06: Interactive multimedia for social skills, understanding feelings, relaxation and coping strategies


Teliris Interact TouchTable and TouchWall: Immersive Collaboration & Telepresence; DVE's Holographic Tele-Immersion Room


Games to lift stress away: Flower, flOw, (and Cloud), from thatgamecompany


Power to the Pixel Cross-Media Forum Streaming Live from London Today #PttP


HACKED KINECT MULTITOUCH using libFreenect and libTISCH (via Florian Echtler)


Link to iTV Doctor Rick Howe's post about 2D to 3D, 3D TV data points, and 3D content distributers


Temple Grandin - A gifted visual thinker, who also has autism, featured in HBO movie starring Claire Danes.  Update: Video of Claire Danes' acceptance of a Golden Globe for her performance


Algodoo physic app. for the SMART Board 800 series, supports multi-user interaction!


Wii Just Dance 2 and Kinect Dance Central:  UI and Usability Approaches; Challenges for Developing Accessible Games


Interactive Touch-Screen Technology, Participatory Design, and "Getting It" -Revisited