Showing posts sorted by relevance for query "remote control".

Nov 29, 2009

Tabletop Conference in Banff: Martin Kaltenbrunner's post on the Tangible Interaction Frameworks blog

I'd like to share with you a link to a great post covering the recent Interactive Tabletops and Surfaces conference in Banff, Canada. The post was written by Martin Kaltenbrunner, author of the Tangible Interaction Frameworks blog, and is packed with info and interesting links:

Tabletop Conference in Banff

Martin was impressed by the iLab at the University of Calgary.  He also mentioned the work of researchers from the Media Computing Group at RWTH Aachen University, known for SLAP, and the Media Interaction Lab at the Upper Austria University of Applied Sciences, known for CRISTAL.

Info about Martin from his website:

"Martin Kaltenbrunner, co-founder of Reactable Systems, is a Ph.D. candidate at the Pompeu Fabra University in Barcelona, Spain. His research concentrates on tangible user interfaces and human computer interaction in general, topics he has been also teaching at the Kunstuniversität Linz, Universitat Pompeu Fabra and UCP Porto. Recently he has been mainly working on the human computer interaction concepts of the Reactable - an electronic musical instrument with a tangible user interface. He is author of the open source tangible interaction framework reacTIVision and the related TUIO protocol, which has been widely adopted for open source multi-touch applications."
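Martin's TUIO protocol rides on top of OSC: a tracker application repeatedly sends `/tuio/2Dcur` messages, where "set" updates a touch point and "alive" lists the session IDs still on the surface. Below is a minimal, illustrative sketch of the client-side state update in Python. The tuple representation of messages is a simplification for this example; a real client would decode actual TUIO/OSC packets with a library such as reacTIVision's client implementations.

```python
# Illustrative sketch of tracking TUIO 2D cursor state on the client side.
# A "set" message updates one cursor; an "alive" message lists the session
# IDs still touching the surface, so anything missing has been lifted.

def apply_tuio_messages(cursors, messages):
    """Update {session_id: (x, y)} from a batch of simplified /tuio/2Dcur messages."""
    for msg in messages:
        command, args = msg[0], msg[1:]
        if command == "set":
            session_id, x, y = args[0], args[1], args[2]
            cursors[session_id] = (x, y)   # normalized 0..1 surface coordinates
        elif command == "alive":
            alive = set(args)
            for sid in list(cursors):
                if sid not in alive:
                    del cursors[sid]       # finger lifted off the surface
    return cursors
```

Keeping state on the client this way is what lets TUIO survive packet loss: every "alive" message restates the full set of active touches, so a dropped "set" is corrected on the next update.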


RELATED




For an overview of what Interactive Tabletops and Surfaces is about, take a careful look at the video from the 2008 conference; credits are listed below:


Video Credits (in order of appearance, Tabletop 2008):
1. System Design for the WeSpace: Linking Personal Devices to a Table-Centered Multi-User, Multi-Surface Environment. Jiang, H., Wigdor, D., Forlines, C., Shen, C.
2. Group Coordination and Negotiation through Spatial Proximity Regions around Mobile Devices on Augmented Tabletops. Kray, C., Rohs, M., Hook, J., Kratz, S.
3. Tabletop AgilePlanner: A Tabletop-Based Project Planning Tool for Agile Software Development Teams. Wang, X., Maurer, F.
4. TableTrays: Temporary, Reconfigurable Work Surfaces for Tabletop Groupware. Pinelle, D., Stach, T., Gutwin, C.
5. IntuPaint: Bridging the Gap Between Physical and Digital Painting. Vandoren, P., Laerhoven, T., Claesen, L., Taelman, J., Raymaekers, C., Reeth, F.
6. Experiences with Building a Thin Form-Factor Touch and Tangible Tabletop. Izadi, S., Butler, A., Hodges, S., West, D., Hall, M., Buxton, B., Molloy, M.
7. ShapeTouch: Leveraging Contact Shape on Interactive Surfaces. Cao, X., Wilson, A., Balakrishnan, R., Hinckley, K., Hudson, S.
8. PocketTable: Mobile Devices as Multi-Touch Controllers for Tabletop Application Development. Hafeneger, S., Weiss, M., Herkenrath, G., Borchers, J.
9. Presenting using Two-Handed Interaction in Open Space. Vlaming, L., Smit, J., Isenberg, T.
10. DepthTouch: Using Depth-Sensing Camera to Enable Freehand Interactions on and Above the Interactive Surface. Benko, H., Wilson, A.
11. Pokey: Interaction Through Covert Structured Light. Wren, C., Ivanov, Y., Beardsley, P., Kaneva, B., Tanaka, S.
12. Creating Malleable Interactive Surfaces using Liquid Displacement Sensing. Hilliges, O., Kim, D., Izadi, S.
13. Collaborative Interaction and Integrated Spatial Information and Services in Disaster Management. Fruijtier, S., Dulk, P., Dias, E.


Oct 25, 2009

Interactive multi-touch for sound design, DJing, and music creation

NUI Group member Christian Bannister, a musician, designer, and developer behind Subcycle Labs, has been experimenting with music and multi-touch technology and interactions. Here's a demonstration of what he's come up with so far:


multi-touch the storm - interactive sound visuals - subcycle labs from christian bannister on Vimeo.
"The big picture goal of this project is to bridge the gap between sound visualization and musical instrument. With multi-touch interaction it is possible to manipulate multiple characteristics of a sound—directly and visually. Right now a lot of electronic music involves staring at the back of the performers laptop. This is a shame because in many cases a lot of really interesting things are happening on the computer that the audience is completely unaware of. This project hopes to create a common visual language and experience for the electronic musician and the audience by enhancing the perception of sound and music on both sides...These sketches are built with Processing and Max/MSP networked with OSC on a single computer..." -Christian Bannister
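The "networked with OSC" part of Christian's setup is worth unpacking: OSC (Open Sound Control) messages are just byte-packed bundles of an address pattern, a type-tag string, and binary-encoded arguments, which is what lets Processing and Max/MSP exchange data over a local socket. As a sketch of the OSC 1.0 wire layout, here is a minimal pure-Python encoder for a message carrying float arguments (the `/filter/cutoff` address is a hypothetical example, not from Christian's patch):

```python
import struct

def osc_message(address, floats):
    """Encode a minimal OSC message carrying float32 arguments (OSC 1.0 layout)."""
    def padded(s):
        b = s.encode("ascii") + b"\x00"       # OSC strings are null-terminated...
        return b + b"\x00" * (-len(b) % 4)    # ...and zero-padded to a 4-byte boundary
    typetag = "," + "f" * len(floats)         # e.g. ",ff" for two floats
    payload = b"".join(struct.pack(">f", f) for f in floats)  # big-endian float32
    return padded(address) + padded(typetag) + payload
```

The resulting bytes could be sent over a UDP socket to whatever port the Max patch listens on; in practice most people reach for an OSC library rather than hand-rolling the encoding, but the format itself is this simple.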



JazzMutant Lemur
The multi-touch tech company now known as Stantum evolved from JazzMutant. JazzMutant was founded in 2002 by Guillaume Largillier and Pascal Joguet, who were joined by Julien Olivier in 2003. The original focus was to create a multi-touch controller for music applications, and the Lemur was born. It is now in version 2, with features such as a gesture object that provides three ways for people to interact with sound, extended scripting abilities, and remote control of your computer's mouse cursor or keyboard.  Stantum recently developed a next-generation multi-touch screen system for use in mobile devices such as smartphones and netbooks.  (I'll discuss this further in a future post.)



Mapping Ableton Live to JazzMutant's multi-touch Lemur for sound design:


Additional tutorials can be found on the JazzMutant YouTube channel.


Ableton Live, the software used in the above video, will include Max. Max/MSP is now known as Max 5:
"In use for over twenty years by performers, composers, artists, scientists, teachers, and students, Max is the way to make your computer do things that reflect your individual ideas and dreams. Version 5 is a dramatic transformation of Max that begins a new chapter in the program's history."

RELATED
Video:  Max for Live
Max is a product of Cycling '74
Cycling '74 created the Make Controller Kit, which includes fully programmable controllers. The kit is network-based, can work with actuators, and can read sensor information into Max.
Jamoma, a platform for interactive art-based research and performance. Jamoma is the prototyping environment for SpatDIF, the Spatial Sound Description Interchange Format, and GDIF, the Gesture Description Interchange Format.
GDIF: Gesture Description Interchange Format, a tool for describing music-related movements, actions, and gestures
Stantum's Mobile Phone Multi-touch Interface:  Demonstration of precise interactions on a resistive touch screen

How Stantum's Resistive Multi-Touch Screen Works


Feb 15, 2009

Interactive Displays 2009 Conference: Tuesday, April 21 - Thursday, April 23, Hilton San Jose, California

The Interactive Displays Conference, sponsored by Intertech Pira, will highlight an interesting mix of existing and emerging interactive display technologies and applications. The conference will be held at the Hilton in San Jose, California, from Tuesday, April 21st through Thursday, April 23rd.

The pre-conference seminar will feature Sakuya Morimoto of CANESTA, who will present his company's innovative single-chip 3D image sensor technology that supports gesture interaction. Keynote speakers will be Jeff Han of Perceptive Pixel and Steven Bathiche of Microsoft.

Some Highlights:

Pre-conference Seminar: Gesture Navigation in the World of Digital Contents, Enabled by a Single-Chip 3D Image Sensor. Presenter: Sakuya Morimoto, Senior Director, Business Development in Asia, CANESTA, Japan

Related:
Hitachi at CES 2009: Use of Canesta's 3D sensor to control television and home systems using hand gestures.



"With the wave of a hand, with the shake of a hand, you can control volume, you can actually change the channels, watch your favorite program...the most exciting thing, I think, is that you can actually control your temperature and the lighting in the room, the environmental lighting. So..it is very unique technology that is out there.."

Another demonstration of Hitachi's gesture interaction using Canesta's 3-D depth camera:



When a TV Remote is Just Too Much Effort, Wave - Jennifer Bergen, PC Magazine
CANESTA Corporate Fact Sheet (pdf)

How does Canesta's Electronic Perception Technology Work?
"Canesta’s electronic perception technology forms 3-D, real time moving images in a single chip through patented methods which use light photons to “range” the image, similar to radar. The silicon sensor chip develops 3-D depth maps at a rate in excess of 30 frames per second, and then performs additional processing on these depth maps to resolve the images into application specific information that can easily be processed by embedded processor(s) in the end-use device or machine. Since Canesta’s software starts with a three-dimensional view of the world, provided immediately by the hardware, it has a substantial advantage over classical image processing software that struggles to construct three-dimensional representations using complex mathematics, and images from multiple cameras or points of view. This dramatic reduction in complexity makes it possible to embed the processing software directly into the chips themselves so they may be used in the most cost-conscious applications."
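The key claim in that passage is that starting from a per-pixel depth map makes the downstream processing cheap: the nearest surface (say, a hand raised toward the sensor) can be isolated with a single threshold rather than the multi-camera stereo matching classical vision needs. As a toy illustration of that idea (not Canesta's actual pipeline), here is a sketch that segments the nearest object in a depth map, given as a 2D grid of millimeter readings, and returns its centroid:

```python
def nearest_object_centroid(depth_map, max_range_mm):
    """Centroid of the pixels belonging to the surface nearest the camera.

    depth_map: list of rows of depth readings in mm (0 = no return).
    Pixels within 100 mm of the nearest valid reading are treated as one
    object -- a crude stand-in for real segmentation, chosen to show how
    per-pixel depth turns "find the hand" into a simple threshold.
    """
    nearest = min(d for row in depth_map for d in row if 0 < d <= max_range_mm)
    band = nearest + 100   # assumed object thickness: 100 mm (illustrative)
    pixels = [(x, y)
              for y, row in enumerate(depth_map)
              for x, d in enumerate(row)
              if 0 < d <= band]
    cx = sum(x for x, _ in pixels) / len(pixels)
    cy = sum(y for _, y in pixels) / len(pixels)
    return cx, cy
```

Running something like this per frame at 30 fps on low-resolution depth maps is modest enough to live in embedded firmware, which is the cost argument the fact sheet is making.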



I will highlight some of the featured presentations in future blog posts:

Steven Bathiche, Director of Research, Applied Sciences Group, Entertainment and Devices Division, MICROSOFT, US
Guillaume Largillier, Chief Strategy Officer and Co-Founder, STANTUM, France
Jeff Han, PERCEPTIVE PIXEL, US
Mark Fihn, Publisher, VERITAS ET VISUS, US
Derek Mitchell, Conference Producer, INTERTECHPIRA, US
Vinita Jakhanwal, Principal Analyst, Small/Medium Displays, ISUPPLI CORPORATION, US
Joseph Carsanaro, President and CEO, F-ORIGIN, US
Tommi Ilmonen, CEO, MULTITOUCH OY, Finland
Stephen Sedaker, Director of Component Sales, WACOM TECHNOLOGY CORPORATION, US
Brad Gleeson, Managing Director, Business Development, TARGETPATH GLOBAL LLC., US
Henry Kaufman, President and Founder, TACTABLE, US
Christophe Ramstein, Chief Technology Officer, IMMERSION CORPORATION, US
Mary Lou Jepsen, CEO, PIXEL QI, US
John Newton, Chief Technology Officer, NEXTWINDOW, New Zealand
Herve Martin, CEO, SENSITIVE OBJECT, France
Scott Hagermoser, Gaming Business Unit Manager, 3M TOUCH SYSTEMS, US
Bob Cooney, Vice President, Business Development, ECAST, US
Brent Bushnell, Chief Technology Officer, UWINK, US
Stephan Durach, Head, Technology Office, BMW GROUP, US
Jeff Doerr, Senior Manager, Business Development, Self Service Solutions Group, FLEXTRONICS, US
Andy Wilson, Senior Researcher, Adaptive Systems and Interaction Group, MICROSOFT, US
Mats W. Johansson, Chief Executive Officer, EON REALITY, US
Lenny Engelhardt, Vice President for Business Development, N-TRIG, Israel
Dr Paul Diefenbach, Director, RePlay Lab, DREXEL UNIVERSITY, US
Andrew Hsu, Technical Marketing and Strategic Partnerships Manager, SYNAPTICS, US
Dean LaCoe, Business Development Manager, GESTURETEK, Canada
Keith Pradhan, Global Director of Product Management, TYCO ELECTRONICS, ELO TOUCHSYSTEMS, US
Jerry Bertrand, Managing Member/Acting CEO, MICROSCENT, LLC, US
Frederic Kaplan, CEO and Co-Founder, OZWE, Switzerland


Related

Visionary Jeff Han and Microsoft's Steven Bathiche to Keynote at Interactive Displays 2009

Jul 23, 2013

Monkeying Around with Autism Assessments: Kinect-based game by Vectorform and Kaiser Permanente therapists offers a barrel of possibilities!

"The goal was to build a game that is extremely accessible, non-complex and includes simple mechanics that children with autism can quickly understand to retain their attention, prevent over-stimulation, encourage play, and prevent frustration." -Vectorform "Monkey Business" Team


I recently had the chance to visit the team at Vectorform's Royal Oak (MI) headquarters and experience a demo of one of their newest applications, a Kinect-based game developed to assist in the assessment of children with autism. The game, known as Monkey Business, was designed in collaboration with physical, occupational, and speech therapists from Kaiser Permanente, a large health care service provider in California. The project was supported by Kaiser Permanente's Innovation Fund for Technology group.

I was impressed!

As I watched the demo, it was clear that much time and attention had been given to the input of the therapists at Kaiser Permanente. The avatar in the game is a friendly monkey who embodies the mannerisms of a curious child.  The artwork is visually appealing and not too busy. The sound effects relate to the interaction at hand and provide feedback to the child as they engage in various game-like tasks designed to assess visual-motor, gross motor, and basic receptive language skills.  Another good feature of this application is the simplicity of the screens for entering the child's information and viewing progress data.

In my opinion, the Monkey Business assessment game has potential for use in school settings, and would be enhanced if additional assessment activities were provided in the application.  Expanded activities within the Monkey Business environment could support a range of treatment, education, and related intervention efforts. In addition to supporting assessment and intervention for children with autism, the Monkey Business concept would be especially useful in working with children who have experienced a traumatic brain injury.


Below are screen shots and descriptions of a few activities from the Monkey Business game:

Lily Pad Jumping Test
In the jumping test, the child is asked to jump across a pond, from one lily pad to another.  The child is provided with a set of toy lily pads on the floor.  As the child jumps, the Kinect sensor picks up movement and displays the jumps on the screen in the form of the monkey.
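The underlying signal here is simple: the Kinect's skeletal tracking reports joint positions every frame, so a jump shows up as the hip joint rising above its standing baseline and coming back down. The sketch below is my own minimal illustration of how such a detector could work, not Vectorform's actual algorithm; the heights, units, and 0.15 m threshold are assumptions for the example.

```python
def count_jumps(hip_heights, baseline, threshold=0.15):
    """Count jumps in a stream of per-frame hip-joint heights (meters).

    A jump is registered each time the hip rises more than `threshold`
    above the standing baseline and then returns to (or below) it.
    The two-state machine avoids double-counting frames mid-air.
    """
    jumps, airborne = 0, False
    for h in hip_heights:
        if not airborne and h > baseline + threshold:
            airborne = True          # lift-off detected
        elif airborne and h <= baseline:
            airborne = False         # landing: one complete jump
            jumps += 1
    return jumps
```

In a real assessment tool the baseline would be calibrated per child during a standing pose, and the raw joint stream smoothed before thresholding to suppress tracking jitter.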

Balance Beam Test
The following screen shot depicts the monkey avatar crossing a stream on a balance beam. The monkey's movements reflect the child's movements across a real balance beam in front of the Kinect sensor and a large screen display.
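A balance task like this also yields a quantifiable signal: side-to-side drift of the tracked hip or spine joint while the child crosses the beam. As a hypothetical illustration (again, not the actual Monkey Business metric), a root-mean-square sway score could be computed from the per-frame lateral positions:

```python
def lateral_sway(hip_x_positions):
    """Root-mean-square deviation of hip x-position across frames.

    A simple, hypothetical balance metric: 0.0 means perfectly steady;
    larger values mean more side-to-side drift while crossing the beam.
    """
    mean = sum(hip_x_positions) / len(hip_x_positions)
    variance = sum((x - mean) ** 2 for x in hip_x_positions) / len(hip_x_positions)
    return variance ** 0.5
```

Scores like this, logged per session, are the kind of objective progress data the clinician screens described below could chart over time.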

Block Building Test
In the block building test, the monkey avatar encourages the child to build a tower of blocks to match the one displayed on the large screen.  The child uses real blocks, placed on a table, to build a tower while the monkey provides encouragement.  Objects other than blocks can be used, as the Kinect sensor is capable of identifying a variety of objects that can be used for building and stacking.

Clinician Screens
The screens for inputting and reviewing data about the child are designed with simplicity and ease of use in mind, as shown in the pictures below:

In the present version of Monkey Business, a TV remote is used to control the application. Future versions of Monkey Business could be integrated with a smartwatch to input student information and control the activities in the application.


Comment: As a school psychologist who works with children, teens, and young adults with autism and other significant disabilities, I know how difficult it can be to conduct assessments using traditional test materials.  If a student has difficulty interacting with unfamiliar adults, has communication challenges, or has motor difficulties, it may not be possible to administer an assessment that generates meaningful or valid results.  Even if the student is capable of completing some of the traditional test tasks, much time and effort is required on the part of the evaluator to sustain their interest, effort, and attention.   

To get a better picture of a student's emerging skills, I use interactive multimedia applications during my assessments of children with autism.  Newer technologies such as interactive whiteboards, larger touch-screen monitors, and tablets have proven to be useful tools in assessment.   A significant drawback to my approach is that it is cobbled together and might be difficult to replicate by other evaluation teams.  I see a growing need for a range of technologically-enhanced tools for assessment, including applications that offer opportunities for playful engagement.  Monkey Business holds potential to fill the bill.

What next?

I'd like to share some stories about the people who make things come to life at Vectorform. During my visit, I learned that they are involved in a number of projects related to health care and are always brainstorming new ways to harness technology for their various clients.  

Over the past several years, I've followed Vectorform's journey as they've jumped off of the desktop and transformed from web developers into post-WIMP explorers of natural user interfaces and interaction.  Their most recent area of exploration is Google Glass.

Intrigued by Google Glass?  So am I.  

During my visit to Vectorform, I had a chance to see the world - or rather, Vectorform's basement conference room - through Google Glass.  As soon as I put it on, I was flooded with ideas; I'll expand on them, and those of others, in future posts.

In the meantime, take the time to read an excellent post by Kevin Foreman, Vectorform's Director of Product Vision.  In "The Glass Experience", Kevin offers an in-depth reflection on his experience wearing Google Glass throughout his daily routine.  He also explains the inner workings of the hardware, the strengths and limitations of the Glass system, the user interface, and the "on-board" accessories, and comments on what it is like to be a new Glass-wearing celebrity.


RELATED
Here are just three of the innovators I met during my recent visit to Vectorform, left to right: Jennifer Tonio, Marketing Manager, Kevin Foreman, Director of Product Vision, and Patric Samona, Director of Health Solutions.

Below are a few links related to the use of games for the assessment and intervention of autism, along with links to information about the use of games and emerging technologies for rehabilitation, health care, and health promotion:

Microsoft Kinect and Autism (SlideShare) Susan McCarthy, Little Angel's School, 2/09/13
Italian Team Uses the Kinect to Treat Autistic Children Andrea Lorini, Epoch Times, 12/13/12
Microsoft Surface Multi-touch Application for Pediatric Neuropsychology Assessment (Featuring Vectorform) Lynn Marentette, TechPsych Blog, 8/26/09
Xbox One, Kinect 2.0 and the future of health technology Marcelo Calbucci, Mobihealth News, 5/26/13
Accessible Games for Health and K-12 Education: Lessons from the Classroom (SlideShare) Lynn Marentette, 5/9/08, Games for Health Conference Presentation
Researchers: Microsoft Kinect is a Money-Saving Telemedicine Device Gabriel Perna, Healthcare Informatics, 2/15/13
Lowes LP, Alfano LN, Yetter BA, Worthen-Chaudhari L, Hinchman W, Samona P, Flanigan KM, Mendell JR. Proof of Concept of the Ability of the Kinect to Quantify Upper Extremity Function in Dystrophinopathy. PLoS Curr. 2013 Mar 14;5. doi: 10.1371/currents.md.9ab5d872bbb944c6035c9f9bfd314ee2

Jan 24, 2010

IEEE One Voice: Sharing the accomplishments and potential of people involved in technology and engineering. Great for sharing with students to encourage STEM studies!

John R. Vig, IEEE President and CEO, recently wrote a column about the need to share with the world the importance of the accomplishments of people who work in technological fields. In his column, he refers to the IEEE One Voice video, which I think is something all young people should view as they gather information for making important decisions for the future.

IEEE is a professional organization that is not limited to the stereotype of an "engineer". If you are a parent, teacher, school counselor, or mentor to a young person, take the time to learn more about the range of careers related to engineering and technology.

If you know a seventh- or eighth-grade student, take a look at IEEE's Future City Competition, designed to give young people an opportunity to learn more about STEM (Science, Technology, Engineering, and Mathematics) fields while teaming with volunteer engineer mentors to create computer models, and then three-dimensional models, of a future city. The Future City program is part of the National Engineers Week Foundation.

RELATED

Engineering Your Life (a great site for encouraging girls to consider engineering and related fields; useful for career exploration activities, as it covers a variety of engineering disciplines)

Below is a list of the various IEEE societies, taken from the IEEE website.  If we want to encourage our young people to consider STEM fields, it is important that we know what we're talking about!



IEEE Aerospace & Electronic Systems Society
IEEE Antennas and Propagation Society
IEEE Broadcast Technology Society
IEEE Circuits and Systems Society
IEEE Communications Society
IEEE Components Packaging and Manufacturing Technology Society
IEEE Computational Intelligence Society
IEEE Computer Society
IEEE Consumer Electronics Society
IEEE Control Systems Society
IEEE Dielectrics and Electrical Insulation Society
IEEE Education Society
IEEE Electromagnetic Compatibility Society
IEEE Electron Devices Society
IEEE Engineering in Medicine and Biology Society
IEEE Geoscience & Remote Sensing Society
IEEE Industrial Electronics Society
IEEE Industry Applications Society
IEEE Information Theory Society
IEEE Instrumentation and Measurement Society
IEEE Intelligent Transportation Systems Society
IEEE Magnetics Society
IEEE Microwave Theory and Techniques Society
IEEE Nuclear and Plasma Sciences Society
IEEE Oceanic Engineering Society
IEEE Photonics Society
IEEE Power and Energy Society
IEEE Power Electronics Society
IEEE Product Safety Engineering Society
IEEE Professional Communication Society
IEEE Reliability Society
IEEE Robotics & Automation Society
IEEE Signal Processing Society
IEEE Society on Social Implications of Technology
IEEE Solid-State Circuits Society
IEEE Standards Association
IEEE Systems Man and Cybernetics Society
IEEE Technology Management Council
IEEE Ultrasonics Ferroelectrics and Frequency Control Society
IEEE Vehicular Technology Society
IEEE Women in Engineering