I came across an interesting video clip while researching haptic and tactile interfaces.
Chris Harrison, a Ph.D. student, and his advisor Scott E. Hudson are researchers at Carnegie Mellon's Human-Computer Interaction Institute. They've been working on a variety of methods to provide tactile properties for visual displays.
The paper can be accessed on Chris Harrison's website.
This work was also highlighted in a recent article in Popular Science:
Your Next Touchscreen Might Bulge With Inflatable Buttons: Latex overlays inflate to provide the tactile sensation of pressing a button. Dan Smith, 7/21/09
I'm not very fond of buttons, but if something like this could be configured for my remote control, I could control my TV effortlessly while viewing in the dark!
Focused on interactive multimedia and emerging technologies to enhance the lives of people as they collaborate, create, learn, work, and play.
Jul 24, 2009
Haptic/Tactile Interface: Dynamically Changeable Physical Buttons on a Visual Display
Posted by
Lynn Marentette
Aug 10, 2013
Kinect Interaction to support people with disabilities: DoubleFine's Happy Action Theater/Kinect Party; OAK Air Switch and Face Switch, resources
I've been experimenting with gesture and touch-based applications for many years and I'm excited to see how things have unfolded over the past couple of years, especially in the field of special education.
Last week I downloaded DoubleFine's Kinect-based Happy Action Theater/Kinect Party, to use during some group activities with students I work with who have significant disabilities (including severe autism). I wish I had discovered this suite of games sooner!
I had loads of fun with students and colleagues as we explored some of the 36 creative, and sometimes zany, minigames. I had heard that DoubleFine had launched something special, but didn't realize how awesome it was until I spent some serious playful time with it at home last weekend. I then tried it out at work this past week.
If you are planning to explore Happy Action Theater/Kinect party, keep in mind that it plays best when there are at least two people and an audience to cheer everything along. Through the use of blob detection algorithms, the games can handle up to 6 players at a time, which is perfect for small-group special classes.
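DoubleFine hasn't published how Kinect Party's player tracking works beyond the phrase "blob detection," but the underlying idea, counting connected foreground regions in a binary depth silhouette with one blob per player, is easy to sketch. The function name and grid format below are my own illustration, not DoubleFine's code:

```python
from collections import deque

def count_blobs(mask):
    """Count connected foreground regions (4-connectivity) in a binary
    grid. In a Kinect-style silhouette image, each blob would roughly
    correspond to one tracked player."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                blobs += 1                      # found a new region
                q = deque([(r, c)])
                seen[r][c] = True
                while q:                        # flood-fill the region
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((ny, nx))
    return blobs

# Two separate silhouettes, so two "players":
frame = [
    [1, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 0, 0, 1],
]
print(count_blobs(frame))  # → 2
```

A real system would first threshold the depth image to build the mask and would filter out blobs too small to be a person; the cap of six players presumably reflects the Kinect SDK's tracking limits rather than the blob counting itself.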
The following trailer gives just a little hint of what this suite of mini-games is all about!
I noted that many of the games were effective in helping students become more aware of their peers. They began to play and interact with one-another in ways I hadn't previously imagined. I especially liked the fact that many of the mini-games made it possible for students in wheelchairs to participate.
I look forward to exploring more of the games over the next few months and will follow up with a future post after I get more input from my colleagues (and students).
I learned about Kinect Party through my contact with people involved with the GestureSEN wiki. The wiki was created as part of a Professional Learning Community (PLC) for people who work with students in specialized schools, similar to the school where I work, and contains a wealth of information about the use of newer and emerging technologies, such as the iPad, Leap Motion, the Kinect, and eye-gaze systems, to support young people with significant disabilities, including autism. Some members of the GestureSEN wiki have learned to code or are in the process of doing so, motivated by what they've experienced so far with their students. (More information and links are listed in the "RELATED" section of this post.)
OAK
OAK was developed by RCAST at the University of Tokyo in collaboration with Microsoft Japan Co., Ltd. It uses the motion-tracking capabilities of Microsoft's Kinect sensor to create non-contact switches for people with limited mobility, enabling them access to computers and other electronic devices and systems. The video below provides a nice overview of the OAK system.
The OAK Pro bundle includes the following applications:
The Air Switch software uses the distance/depth capabilities of the Kinect sensor to support gestures of the head, hands, or larger body part to turn things off or on. The infrared from the Kinect also supports the use of the Air Switch in the dark. The color mode function captures movements from smaller parts of the body, such as a fingertip.
The Face Switch software uses facial recognition to track the movements of the face, mouth, tongue, and eyes. It identifies facial parts that have moved significantly and records motion data.
The Motion History software observes the movement of a person's body using the video component of the Kinect sensor. This customizes the system to the individual and ensures accuracy of the switch. Movements are color coded and provide the person who is setting up the system a means to fit the system to the specific capabilities and needs of the user.
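OAK's internals aren't documented in this post, but the Air Switch concept, triggering a switch event when something enters a depth zone in front of the Kinect, can be illustrated with a small sketch. The function name, region format, and thresholds here are my assumptions, not the OAK API:

```python
def air_switch(depth_mm, region, threshold_mm):
    """Return True when any pixel inside `region` is closer to the
    sensor than `threshold_mm`, i.e. the user has reached into the
    virtual switch zone.

    depth_mm:  2D list of per-pixel distances from the depth camera (mm).
    region:    (row0, row1, col0, col1) bounding box of the switch zone.
    """
    r0, r1, c0, c1 = region
    for row in depth_mm[r0:r1]:
        for d in row[c0:c1]:
            if 0 < d < threshold_mm:   # 0 usually means "no reading"
                return True
    return False

# A hand at 600 mm crosses an 800 mm trigger plane inside the zone:
frame = [
    [1500, 1500, 1500],
    [1500,  600, 1500],
    [1500, 1500, 1500],
]
print(air_switch(frame, (0, 3, 0, 3), 800))  # → True
print(air_switch(frame, (0, 1, 0, 3), 800))  # → False
```

Because the trigger is purely depth-based, it works in the dark, which matches the infrared behavior described above. A production system would add debouncing so a single reach produces one switch event, not dozens.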
The OAK system can be enhanced by the use of peripherals, such as a USB 4-channel relay box, an IR remote control device or outlet, or other on/off switches/outlets.
The Assist-i corporation has made the OAK system and peripherals available on Amazon Japan. From what I can tell from the company's website, the OAK software can be downloaded free for a 30-day trial. I'd love to see how it would work with some of the students I work with who have difficulty accessing conventional switches! It would be wonderful to come up with ways for these students to access a wider range of digital media activities and games.
RELATED
University adapting videogame technology to help physically disabled computer users
Philip Kendall, Japan Today, 10/10/12
OAK Air Switch (PC Kinect)
OneSwitch.org.uk 4/30/13
OAK Air Switch, Face Switch, Motion History Pro Bundle (pdf)
Assist-i Corporation
Amazon Ai store: Assist-i Corporation (Prices are in Yen.)
Below is a partial list of links to resources related to using or creating engaging interactive applications and games for people with special needs:
Using Kinect in Special Ed Classrooms: Advice from Loudoun County, Virginia Teachers
Microsoft in Education Team, Microsoft in Education Blog, 6/1/12
KinectSEN-Kinect and Special Educational Needs round-up
Greg Dunan, Microsoft Coding4Fun, 10/11/12
Monkeying Around with Autism Assessments: Kinect-based game by Vectorform and Kaiser Permanente therapists offers a barrel of possibilities!
Lynn Marentette, Interactive Multimedia Technology, 7/23/13
Behind the Scenes: Creating Marty the Monkey (The character from Vectorform's autism assessment app) John Einselen, Vectorform Blog, 7/24/13
Kinect Party Review: More Fun from the Fun Kings
Casey Lynch, IGN, 12/20/12
The Power of Kinect in Special Needs Education
Willemijn de Lint, Hans Smeele, mytylschool De Ruimte
Sign Language Recognition and Translation with Kinect (pdf)
Ming Zhou, et al.
Cool Kinect move: Reading sign language in real time
Christopher MacManus, CNET, 7/18/13
Anthony Rhys, Trinity Fields ICT
James Winchester, SENClassroom blog
PMLD Eyegaze Project at Trinity Fields
Kinect hacking using Processing
Kinect SEN and Processing Resources
Keith Manville, Oak Grove College OpenSEN
Mat's Classroom Blog
GestureSEN Wiki
KinectSEN Wiki; KinectSEN News
ProcessingSEN wiki
LeapSEN Wiki
EyegazeSEN Wiki
SEN Students and Coding
OpenSEN, 3/5/13
Processing2
Kinect for Windows Blog
Kinect For Windows
DoubleFine
Understanding Engagement, Module 3.2, Training materials for teachers of learners with severe, profound and complex learning difficulties, UK Dept. for Education
Posted by
Lynn Marentette
Nov 29, 2009
Tabletop Conference in Banff: Martin Kaltenbrunner's post on the Tangible Interaction Frameworks blog
I'd like to share with you a link to a great post covering the recent Interactive Tabletops and Surfaces conference in Banff, Canada. The post was written by Martin Kaltenbrunner, author of the Tangible Interaction Frameworks blog, and is packed with info and interesting links:
Tabletop Conference in Banff
Martin was impressed by the iLab at the University of Calgary. He also mentioned the work of researchers from the Media Computing Group at RWTH Aachen University, known for SLAP, and the Media Interaction Lab at the Upper Austria University of Applied Sciences, known for CRISTAL.
Info about Martin from his website:
"Martin Kaltenbrunner, co-founder of Reactable Systems, is a Ph.D. candidate at the Pompeu Fabra University in Barcelona, Spain. His research concentrates on tangible user interfaces and human computer interaction in general, topics he has been also teaching at the Kunstuniversität Linz, Universitat Pompeu Fabra and UCP Porto. Recently he has been mainly working on the human computer interaction concepts of the Reactable - an electronic musical instrument with a tangible user interface. He is author of the open source tangible interaction framework reacTIVision and the related TUIO protocol, which has been widely adopted for open source multi-touch applications."
For an overview of what Interactive Tabletops and Surfaces is about, take a careful look at the video from the 2008 conference, credits listed below:
Video Credits (in order of appearance, Tabletop 2008):
1. System Design for the WeSpace: Linking Personal Devices to a Table-Centered Multi-User, Multi-Surface Environment. Jiang, H., Wigdor, D., Forlines, C., Shen, C.
2. Group Coordination and Negotiation through Spatial Proximity Regions around Mobile Devices on Augmented Tabletops, Kray C., Rohs, M., Hook, J. Kratz, S.
3. Tabletop AgilePlanner: A Tabletop-Based Project Planning Tool for Agile Software Development Teams
Wang, X., Maurer, F.
4. TableTrays: Temporary, Reconfigurable Work Surfaces for Tabletop Groupware. Pinelle, D., Stach, T., Gutwin, C.
5. IntuPaint: Bridging the Gap Between Physical and Digital Painting. Vandoren, P., Laerhoven, T., Claesen, L., Taelman, J., Raymaekers, C., Reeth, F.
6. Experiences with Building a Thin Form-Factor Touch and Tangible Tabletop. Izadi, S., Butler, A., Hodges, S., West, D., Hall, M., Buxton, B., Molloy, M.
7. ShapeTouch: Leveraging Contact Shape on Interactive Surfaces, Cao, X., Wilson, A., Balakrishnan, R., Hinckley, K., Hudson, S.
8. PocketTable: Mobile Devices as Multi-Touch Controllers for Tabletop Application Development, Hafeneger, S., Weiss, M., Herkenrath, G., Borchers, J.
9. Presenting using Two-Handed Interaction in Open Space, Vlaming, L., Smit, J., Isenberg, T.
10. DepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions on and Above the Interactive Surface Benko, H., and Wilson, A.
11. Pokey: Interaction Through Covert Structured Light. Wren, C., Ivanov, Y., Beardsley, P., Kaneva, B., Tanaka, S.
12. Creating Malleable Interactive Surfaces using Liquid Displacement Sensing. Hilliges, O. Kim, D., Izadi, S.
13. Collaborative Interaction and Integrated Spatial Information and Services in Disaster Management, Fruijtiera, S., Dulkb, P., Diasc, E.
Posted by
Lynn Marentette
Oct 25, 2009
Interactive multi-touch for sound design, dj-ing, and music creation
NUI-Group member Christian Bannister, a musician, designer, and developer behind Subcycle Labs, has been experimenting with music and multi-touch technology and interactions. Here's a demonstration of what he's come up with so far:
multi-touch the storm - interactive sound visuals - subcycle labs from christian bannister on Vimeo.
"The big picture goal of this project is to bridge the gap between sound visualization and musical instrument. With multi-touch interaction it is possible to manipulate multiple characteristics of a sound—directly and visually. Right now a lot of electronic music involves staring at the back of the performers laptop. This is a shame because in many cases a lot of really interesting things are happening on the computer that the audience is completely unaware of. This project hopes to create a common visual language and experience for the electronic musician and the audience by enhancing the perception of sound and music on both sides...These sketches are built with Processing and Max/MSP networked with OSC on a single computer..." -Christian Bannister
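Bannister mentions networking Processing and Max/MSP with OSC (Open Sound Control), which rides on UDP. To show the plumbing the quote alludes to, here is a minimal OSC message encoder in plain Python; the address pattern `/touch/1/xy` and port 7400 are arbitrary examples of mine, not from Subcycle's sketches:

```python
import socket
import struct

def osc_message(address, *floats):
    """Encode a minimal OSC message: a null-terminated address pattern,
    a type-tag string, then 32-bit big-endian floats. OSC pads every
    string to a multiple of 4 bytes."""
    def pad(b):
        # Append at least one NUL, then pad to a 4-byte boundary.
        return b + b"\x00" * (-len(b) % 4 or 4)
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(floats)).encode("ascii"))
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

# e.g. a touch point's x/y sent from the visual layer to Max/MSP,
# which would receive it with a [udpreceive] object on the same port:
packet = osc_message("/touch/1/xy", 0.25, 0.75)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("127.0.0.1", 7400))
```

In practice one would use an OSC library (oscP5 in Processing, or python-osc in Python), but the wire format really is this small, which is part of why OSC became the lingua franca between visual front-ends and audio engines.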
JazzMutant Lemur
The multi-touch tech company now known as Stantum evolved from JazzMutant. JazzMutant was founded in 2002 by Guillaume Largillier and Pascal Joguet, who were joined by Julien Olivier in 2003. The original focus was to create a multi-touch controller for music applications, and the Lemur was born. It is now in version 2, with features such as a gesture object that provides three ways for people to interact with sound, extended scripting abilities, and remote control of your computer's mouse cursor or keyboard. Stantum recently developed a next-gen multi-touch screen system for use in mobile devices such as smart phones and netbooks. (I'll discuss this further in a future post.)
Mapping Ableton Live to Jazz Mutant's multi-touch Lemur for sound design:
Additional tutorials can be found on the JazzMutant YouTube channel.
Ableton Live, the software used in the above video, will include Max. Max/MSP is now known as MAX 5:
"In use for over twenty years by performers, composers, artists, scientists, teachers, and students, Max is the way to make your computer do things that reflect your individual ideas and dreams. Version 5 is a dramatic transformation of Max that begins a new chapter in the program's history."
RELATED
Video: Max for Live
Max is a product of cycling74
Cycling74 created the Make Controller Kit, which includes fully programmable controllers. The kit is network-based, is capable of working with actuators, and can read sensor information into Max.
Jamoma, a platform for interactive art-based research and performance. Jamoma is the prototyping environment for SpatDIF, the Spatial Sound Description Interchange Format, and GDIF, the Gesture Description Interchange Format.
GDIF: Gesture Description Interchange Format, a tool for music related movements, actions, and gestures
Stantum's Mobile Phone Multi-touch Interface: Demonstration of precise interactions on a resistive touch screen
How the Stantum's Resistive Multi-Touch Screen Works
Posted by
Lynn Marentette
Feb 15, 2009
Interactive Displays 2009 Conference: Tuesday, April 21 -Thursday April 23, Hilton San Jose, California
The Interactive Displays Conference, sponsored by Intertech Pira, will highlight an interesting mix of existing and emerging interactive display technologies and applications. The conference will be held at the Hilton in San Jose, California, from Tuesday, April 21st through Thursday, April 23rd.
The pre-conference seminar will feature Sakuya Morimoto, of CANESTA, who will present his company's innovative single-chip 3D image sensor technology that supports gesture interaction.
Keynote speakers will be Jeff Han, of Perceptive Pixel, and Steven Bathiche, of Microsoft US.
Some Highlights:
Pre-conference Seminar: Gesture Navigation in the World of Digital Contents, Enabled by a Single-Chip 3D Image Sensor Presenter: Sakuya Morimoto, Senior Director, Business Development in Asia, CANESTA, Japan
Related:
Hitachi at CES 2009: Use of Canesta's 3D sensor to control television and home systems using hand gestures.
"With the wave of a hand, with the shake of a hand, you can control volume, you can actually change the channels, watch your favorite program...the most exciting thing, I think, is that you can actually control your temperature and the lighting in the room, the environmental lighting. So..it is very unique technology that is out there.."
Another demonstration of Hitachi's gesture interaction using Canesta's 3D depth camera:
When a TV Remote is Just Too Much Effort, Wave -Jennifer Bergen, PC Magazine
CANESTA Corporate Fact Sheet (pdf)
How does Canesta's Electronic Perception Technology Work?
"Canesta’s electronic perception technology forms 3-D, real time moving images in a single chip through patented methods which use light photons to “range” the image, similar to radar. The silicon sensor chip develops 3-D depth maps at a rate in excess of 30 frames per second, and then performs additional processing on these depth maps to resolve the images into application specific information that can easily be processed by embedded processor(s) in the end-use device or machine. Since Canesta’s software starts with a three-dimensional view of the world, provided immediately by the hardware, it has a substantial advantage over classical image processing software that struggles to construct three-dimensional representations using complex mathematics, and images from multiple cameras or points of view. This dramatic reduction in complexity makes it possible to embed the processing software directly into the chips themselves so they may be used in the most cost-conscious applications."
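Canesta doesn't spell out its per-pixel math, but the basic "ranging" arithmetic the quote alludes to is simple: emitted light travels to the object and back, so the sensor halves the round-trip distance. A sketch of that conversion (my own illustration, not Canesta's implementation):

```python
C_MM_PER_NS = 299.792458  # speed of light, in millimetres per nanosecond

def tof_distance_mm(round_trip_ns):
    """Convert a time-of-flight round-trip time to distance.
    The light covers the sensor-to-object distance twice, so halve it."""
    return round_trip_ns * C_MM_PER_NS / 2.0

# Light returning after ~6.67 ns corresponds to a surface about 1 m away:
print(round(tof_distance_mm(6.67)))  # → 1000
```

The tiny timescales involved (a few nanoseconds per metre) are why the quote stresses doing this in dedicated silicon: at 30+ depth frames per second, every pixel needs this measurement resolved on-chip rather than in software.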
I will highlight some of the featured presentations in future blog posts:
Steven Bathiche, Director of Research, Applied Sciences Group, Entertainment and Devices Division MICROSOFT, US
Guillaume Largillier, Chief Strategy Officer and Co-Founder, STANTUM, France
Jeff Han, PERCEPTIVE PIXEL, US
Mark Fihn, Publisher, VERITAS ET VISUS, US
Derek Mitchell, Conference Producer, INTERTECHPIRA, US
Vinita Jakhanwal, Principal Analyst, Small/Medium Displays, ISUPPLI CORPORATION, US
Joseph Carsanaro, President and CEO F-ORIGIN, US
Tommi Ilmonen, CEO MULTITOUCH OY, Finland
Stephen Sedaker, Director of Component Sales WACOM TECHNOLOGY CORPORATION, US
Brad Gleeson, Managing Director, Business Development TARGETPATH GLOBAL LLC., US
Henry Kaufman, President and Founder, TACTABLE, US
Christophe Ramstein, Chief Technology Officer, IMMERSION CORPORATION, US
Mary Lou Jepsen, CEO, PIXEL QI, US
John Newton, Chief Technology Officer, NEXTWINDOW, New Zealand
Herve Martin, CEO, SENSITIVE OBJECT, France
Scott Hagermoser, Gaming Business Unit Manager 3M TOUCH SYSTEMS, US
Bob Cooney, Vice President, Business Development, ECAST, US
Brent Bushnell, Chief Technology Officer UWINK, US
Stephan Durach, Head, Technology Office, BMW GROUP, US
Jeff Doerr, Senior Manager, Business Development Self Service Solutions Group, FLEXTRONICS, US
Andy Wilson, Senior Researcher, Adaptive Systems and Interaction Group, MICROSOFT, US
Mats W. Johansson, Chief Executive Officer, EON REALITY, US
Lenny Engelhardt, Vice President for Business Development, N-TRIG, Israel
Dr Paul Diefenbach, Director, RePlay Lab, DREXEL UNIVERSITY, US
Andrew Hsu, Technical Marketing and Strategic Partnerships Manager, SYNAPTICS, US
Dean LaCoe, Business Development Manager, GESTURETEK, Canada
Keith Pradhan, Global Director of Product Management, TYCO ELECTRONICS, ELO TOUCHSYSTEMS, US
Jerry Bertrand, Managing Member/Acting CEO, MICROSCENT, LLC, US
Frederic Kaplan, CEO and Co-Founder, OZWE, Switzerland
Related
Visionary Jeff Han and Microsoft's Steven Bathiche to Keynote at Interactive Displays 2009
The pre-conference seminar will feature Sakuya Morimoto, of CANESTA, who will present his company's innovative single-chip 3D image sensor technology that supports gesture interaction.
Keynote speakers will be Jeff Han, of Perceptive Pixel, and Steven Bathiche, of Microsoft US.
Some Highlights:
Pre-conference Seminar: Gesture Navigation in the World of Digital Contents, Enabled by a Single-Chip 3D Image Sensor Presenter: Sakuya Morimoto, Senior Director, Business Development in Asia, CANESTA, Japan
Related:
Hitachi at CES 2009: Use of Canesta's 3D sensor to control television and home systems using hand gestures.
"With the wave of a hand, with the shake of a hand, you can control volume, you can actually change the channels, watch your favorite program...the most exciting thing, I think, is that you can actually control your temperature and the lighting in the room, the environmental lighting. So..it is very unique technology that is out there.."
Another demonstration of Hitachi's gesture interaction using Canesta's 3-D depth camera:
When a TV Remote is Just Too Much Effort, Wave -Jennifer Bergen, PC Magazine
CANESTA Corporate Fact Sheet (pdf)
How does Canesta's Electronic Perception Technology Work?
"Canesta’s electronic perception technology forms 3-D, real time moving images in a single chip through patented methods which use light photons to “range” the image, similar to radar. The silicon sensor chip develops 3-D depth maps at a rate in excess of 30 frames per second, and then performs additional processing on these depth maps to resolve the images into application specific information that can easily be processed by embedded processor(s) in the end-use device or machine. Since Canesta’s software starts with a three-dimensional view of the world, provided immediately by the hardware, it has a substantial advantage over classical image processing software that struggles to construct three-dimensional representations using complex mathematics, and images from multiple cameras or points of view. This dramatic reduction in complexity makes it possible to embed the processing software directly into the chips themselves so they may be used in the most cost-conscious applications."
I will highlight some of the featured presentations in future blog posts:
Steven Bathiche, Director of Research, Applied Sciences Group, Entertainment and Devices Division MICROSOFT, US
Guillaume Largillier, Chief Strategy Officer and Co-Founder, STANTUM, France
Jeff Han, PERCEPTIVE PIXEL, US
Mark Fihn, Publisher, VERITAS ET VISUS, US
Derek Mitchell, Conference Producer, INTERTECHPIRA, US
Vinita Jakhanwal, Principal Analyst, Small/Medium Displays, ISUPPLI CORPORATION, US
Joseph Carsanaro, President and CEO F-ORIGIN, US
Tommi Ilmonen, CEO MULTITOUCH OY, Finland
Stephen Sedaker, Director of Component Sales WACOM TECHNOLOGY CORPORATION, US
Brad Gleeson, Managing Director, Business Development TARGETPATH GLOBAL LLC., US
Henry Kaufman, President and Founder, TACTABLE, US
Christophe Ramstein, Chief Technology Officer, IMMERSION CORPORATION, US
Mary Lou Jepsen, CEO, PIXEL QI, US
John Newton, Chief Technology Officer, NEXTWINDOW, New Zealand
Herve Martin, CEO, SENSITIVE OBJECT, France
Scott Hagermoser, Gaming Business Unit Manager 3M TOUCH SYSTEMS, US
Bob Cooney, Vice President, Business Development, ECAST, US
Brent Bushnell, Chief Technology Officer UWINK, US
Stephan Durach, Head, Technology Office, BMW GROUP, US
Jeff Doerr, Senior Manager, Business Development Self Service Solutions Group, FLEXTRONICS, US
Andy Wilson, Senior Researcher, Adaptive Systems and Interaction Group, MICROSOFT, US
Mats W. Johansson, Chief Executive Officer, EON REALITY, US
Lenny Engelhardt, Vice President for Business Development, N-TRIG, Israel
Dr Paul Diefenbach, Director, RePlay Lab, DREXEL UNIVERSITY, US
Andrew Hsu, Technical Marketing and Strategic Partnerships Manager, SYNAPTICS, US
Dean LaCoe, Business Development Manager, GESTURETEK, Canada
Keith Pradhan, Global Director of Product Management, TYCO ELECTRONICS, ELO TOUCHSYSTEMS, US
Jerry Bertrand, Managing Member/Acting CEO, MICROSCENT, LLC, US
Frederic Kaplan, CEO and Co-Founder, OZWE, Switzerland
Related
Visionary Jeff Han and Microsoft's Steven Bathiche to Keynote at Interactive Displays 2009
Posted by
Lynn Marentette
Jul 23, 2013
Monkeying Around with Autism Assessments: Kinect-based game by Vectorform and Kaiser Permanente therapists offers a barrel of possibilities!
"The goal was to build a game that is extremely accessible, non-complex and includes simple mechanics that children with autism can quickly understand to retain their attention, prevent over-stimulation, encourage play, and prevent frustration." -Vectorform "Monkey Business" Team
I recently had the chance to visit the team at Vectorform's Royal Oak (MI) headquarters and experience a demo of one of their newest applications, a Kinect-based game developed to assist in the assessment of children with autism. The game, known as Monkey Business, was designed in collaboration with physical, occupational, and speech therapists from Kaiser Permanente, a large health care service provider in California. The project was supported by Kaiser Permanente's Innovation Fund for Technology group.
I was impressed!
As I watched the demo, it was clear that much time and attention was given to the input of the therapists at Kaiser Permanente. The avatar in the game is a friendly monkey that embodies the mannerisms of a curious child. The artwork is visually appealing and not too busy. The sound effects relate to the interaction at hand and provide feedback to the child as they engage in various game-like tasks designed to assess visual-motor, gross motor, and basic receptive language skills. Another good feature of this application is the simplicity of the screens designed for entering the child's information and viewing data regarding progress.
In my opinion, the Monkey Business assessment game has potential for use in school settings, and would be enhanced by additional assessment activities within the application. Expanded activities within the Monkey Business environment could support a range of treatment, education, and related intervention efforts. In addition to supporting assessment and intervention of children with autism, the Monkey Business concept would be especially useful in working with children who have experienced a traumatic brain injury.
Below are screen shots and descriptions of a few activities from the Monkey Business game:
Lily Pad Jumping Test
In the jumping test, the child is asked to jump across a pond, from one lily pad to another. The child is provided with a set of toy lily pads on the floor. As the child jumps, the Kinect sensor tracks the movement and mirrors each jump on the screen through the monkey avatar.
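The Kinect SDK exposes tracked skeleton joints frame by frame, so a jump can be inferred when a joint such as the hip center rises sharply above its standing baseline. The sketch below is a hypothetical simplification, not Vectorform's implementation; the joint stream, baseline window, and threshold are all assumptions.

```python
from collections import deque

class JumpDetector:
    """Flag a jump when the tracked hip height rises a threshold
    above a rolling baseline of recent standing frames."""

    def __init__(self, threshold_m=0.15, baseline_frames=30):
        self.threshold = threshold_m
        self.baseline = deque(maxlen=baseline_frames)
        self.airborne = False

    def update(self, hip_y_m):
        """Feed one frame's hip height (meters); return True once per
        jump, on the frame the takeoff is first detected."""
        if len(self.baseline) < 5:
            self.baseline.append(hip_y_m)    # warm up the baseline
            return False
        floor = sum(self.baseline) / len(self.baseline)
        if not self.airborne and hip_y_m > floor + self.threshold:
            self.airborne = True
            return True                      # takeoff detected
        if self.airborne and hip_y_m < floor + self.threshold / 2:
            self.airborne = False            # landed; re-arm the detector
        if not self.airborne:
            self.baseline.append(hip_y_m)    # only standing frames update baseline
        return False

# Simulated hip heights: standing at ~0.9 m, one jump to 1.15 m, then landing.
frames = [0.90] * 10 + [1.00, 1.10, 1.15, 1.05, 0.92] + [0.90] * 5
det = JumpDetector()
jumps = sum(det.update(y) for y in frames)
print(jumps)   # one jump counted
```

The re-arm step matters: without it, every airborne frame would count as a new jump, inflating the score the therapist sees.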
Balance Beam Test
The following screen shot depicts the monkey avatar crossing a stream on a balance beam. The monkey's movements reflect the child's movements across a real balance beam in front of the Kinect sensor and a large screen display.
Block Building Test
In the block building test, the monkey avatar encourages the child to build a tower of blocks to match the one displayed on the large screen. The child uses real blocks, placed on a table, to build a tower while the monkey provides encouragement. Objects other than blocks can be used, as the Kinect sensor is capable of identifying a variety of objects that can be used for building and stacking.
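Because the sensor sees depth, estimating the height of a stacked tower does not require recognizing the blocks at all. The sketch below is a toy illustration of that idea, not Vectorform's method; the table distance, block height, and frame layout are all assumptions.

```python
import numpy as np

TABLE_MM = 1000.0      # assumed distance from sensor to the empty tabletop
BLOCK_MM = 50.0        # assumed height of one toy block

def tower_height_blocks(depth_mm):
    """Estimate how many blocks are stacked by finding the nearest
    surface above the table and dividing by the block height."""
    above_table = depth_mm.copy()
    above_table = TABLE_MM - above_table    # positive where something sits on the table
    peak = above_table.max()
    if peak < BLOCK_MM / 2:                 # nothing tall enough to be a block
        return 0
    return int(round(peak / BLOCK_MM))

# A 240x320 frame of bare table, with a three-block tower in the middle.
frame = np.full((240, 320), TABLE_MM)
frame[100:120, 150:170] = TABLE_MM - 3 * BLOCK_MM
print(tower_height_blocks(frame))   # 3
```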
Clinician Screens
The screens for entering and reviewing data regarding the child are designed with simplicity and ease of use in mind, as shown in the pictures below:
In the present version of Monkey Business, a TV remote is used to control the application. It is possible that future versions of Monkey Business could be integrated with a smartwatch to input student information and control the activities in the application.
Comment: As a school psychologist who works with children, teens, and young adults with autism and other significant disabilities, I know how difficult it can be to conduct assessments using traditional test materials. If a student has difficulty interacting with unfamiliar adults, has communication challenges, or has motor difficulties, it may not be possible to administer an assessment that generates meaningful or valid results. Even if the student is capable of completing some of the traditional test tasks, much time and effort is required on the part of the evaluator to sustain their interest, effort, and attention.
To get a better picture of a student's emerging skills, I use interactive multimedia applications during my assessments of children with autism. Newer technologies such as interactive whiteboards, larger touch-screen monitors, and tablets have proven to be useful tools in assessment. A significant drawback to my approach is that it is cobbled together and might be difficult to replicate by other evaluation teams. I see a growing need for a range of technologically-enhanced tools for assessment, including applications that offer opportunities for playful engagement. Monkey Business holds potential to fill the bill.
What next?
I'd like to share some stories about the people who make things come to life at Vectorform. During my visit, I learned that they are involved in a number of projects related to health care and are always brainstorming new ways to harness technology for their various clients.
Over the past several years, I've followed Vectorform's journey as they've jumped off the desktop, transforming from web developers into post-WIMP explorers of natural user interfaces and interaction. Their most recent area of exploration is Google Glass.
Intrigued by Google Glass? So am I.
During my visit to Vectorform, I had a chance to see the world - or rather Vectorform's basement conference room - through Google Glass. As soon as I put them on I was flooded with ideas, and will expand on my ideas, and those of others, in future posts.
In the meantime, take the time to read an excellent post by Kevin Foreman, Vectorform's Director of Product Vision. In "The Glass Experience", Kevin provides an in-depth reflection on his experience wearing Google Glass throughout his daily routine. He also explains the inner workings of the hardware, the strengths and limitations of the Glass system, details about the user interface, the "on-board" accessories, and what it is like to be a new Glass-wearing celebrity.
RELATED
Here are just three of the innovators I met during my recent visit to Vectorform, left to right: Jennifer Tonio, Marketing Manager, Kevin Foreman, Director of Product Vision, and Patric Samona, Director of Health Solutions.
Below are a few links related to the use of games for the assessment and intervention of autism, along with links to information about the use of games and emerging technologies for rehabilitation, health care and health promotion:
Vectorform Partners with Kaiser Permanente to Assess Autism Spectrum Disorder Through Innovation -Patrick Samona, Vectorform Blog, 7/10/13
Using Kinect and Happy Action Theater as therapy: how one school is reaching autistic children -Ben Kuchera, The Penny Arcade Report, 11/19/12
Hacking for autism: Apps to help everyone on the spectrum -Donna Tam, CNET, 2/22/13
Italian Team Uses the Kinect to Treat Autistic Children -Andrea Lorini, Epoch Times, 12/13/12
Microsoft Surface Multi-touch Application for Pediatric Neuropsychology Assessment (Featuring Vectorform) -Lynn Marentette, TechPsych Blog, 8/26/09
Video: http://vimeo.com/4141208
Microsoft Gaming Technology Helps Children with Cerebral Palsy -eHealthNews, 8/16/12
Should We Integrate Video Games Into Home-Based Rehabilitation Therapies for Cerebral Palsy? -Elaine Biddiss, Future Neurology, Medscape Today, 7/2012
Xbox One, Kinect 2.0 and the future of health technology -Marcelo Calbucci, Mobihealth News, 5/26/13
Accessible Games for Health and K-12 Education: Lessons from the Classroom (SlideShare) -Lynn Marentette, Games for Health Conference Presentation, 5/9/08
Researchers: Microsoft Kinect is a Money-Saving Telemedicine Device -Gabriel Perna, Healthcare Informatics, 2/15/13
Lowes LP, Alfano LN, Yetter BA, Worthen-Chaudhari L, Hinchman W, Samona P, Flanigan KM, Mendell JR. Proof of Concept of the Ability of the Kinect to Quantify Upper Extremity Function in Dystrophinopathy. PLoS Curr. 2013 Mar 14;5. doi: 10.1371/currents.md.9ab5d872bbb944c6035c9f9bfd314ee2
Posted by
Lynn Marentette