Showing posts sorted by date for query kinect.

Jul 12, 2012

CFP for Special Issue of Personal and Ubiquitous Computing on Educational Interfaces, Software, and Technology (EIST)



Overview 
One of the primary goals of teaching is to prepare learners for life in the real world. In this ever-changing world of technologies such as mobile interaction, cloud computing, natural user interfaces, and gestural interfaces like the Nintendo Wii and Microsoft Kinect, people have a greater selection of tools for the task at hand. Given the potential of these new interfaces, software, and technologies as learning tools, as well as the ubiquitous application of interactive technology in formal and informal learning environments, there is a growing need to explore how next-generation technologies will impact education in the future. 


As a community of Human-Computer Interaction (HCI) and educational researchers, we need to theorize and discuss how new technologies should be integrated into the classrooms and homes of the future. In the last three years, three CHI workshops have provided a forum to discuss key issues of this sort, particularly in the context of next-generation education. The aim of this special issue of Personal and Ubiquitous Computing is to summarize the potential design challenges and perspectives on how the community should handle next-generation technologies in the education domain for both teachers and students. 

We invite authors to present position papers about potential design challenges and perspectives on how the community should handle the next generation of HCI in education. Topics of interest include but are not limited to: 

  • Gestural input, multitouch, large displays 
  • Mobile devices, response systems (clickers) 
  • Tangible, VR, AR & MR, multimodal interfaces 
  • Console gaming, 3D input devices 
  • Co-located interaction, presentations 
  • Educational pedagogy, learner-centric, child computer interaction 
  • Empirical methods, case studies 
  • Multi-display interaction 
  • Wearable educational media
Important Dates
  • Full papers due: November 9, 2012
  • Initial reviews to authors: January 18, 2013
  • Revised papers due: March 15, 2013
  • Final reviews to authors: April 26, 2013
  • Final papers due: June 14, 2013
Submission Guidelines
Submissions should be prepared according to the Word template located at the bottom of this page. All manuscripts are subject to peer review. Manuscripts must be submitted as a PDF to the EasyChair submission system. Submissions should be no more than 8,000 words in length.

Guest Editors and Contact Information
  • Syed Ishtiaque Ahmed, Cornell University
  • Quincy Brown, Bowie State University
  • Jochen Huber, Technische Universität Darmstadt
  • Si Jung “Jun” Kim, University of Central Florida
  • Lynn Marentette, Union County Public Schools, Wolfe School
  • Max Mühlhäuser, Technische Universität Darmstadt
  • Alexander Thayer, University of Washington 
  • Edward Tse, SMART Technologies

Information about the Journal of Personal and Ubiquitous Computing

Jul 8, 2012

PO-MO, a creative group that combines digital art, interaction, movement, and play to create engaging surfaces and spaces.

I recently learned more about PO-MO, a relatively new start-up tech company based in Winnipeg, Canada. According to the company's information, PO-MO "specializes in interactive digital display solutions, including gesture and motion based interactivity, interactive display content creation and management, and large interactive display and projection services for advertisers, educators, and events."


Po-Motion was a finalist in an elevator pitch video contest last fall, and it has several advantages over potential competitors. The system is easy to use and priced within a range affordable for schools, museums, and other cost-conscious groups who would like to provide technology-supported immersive interactive experiences for people of all ages. The Po-Motion software designed for interactive floors and walls starts at $39.99, and works on any computer with any USB web camera and a projector. Other applications make use of Kinect sensors.


I especially like one of PO-MO's recent projects, the Impossible Animals exhibit, created using Unity 3D for the Manitoba Children's Museum. How does it work? Children create a colored egg using crayons and paper, which is then scanned and digitally embedded into the exhibit, which includes an interactive wall and floor. When the egg is touched, it is activated to hatch, and then becomes a motion-reactive animal. The environment includes things like water, landscapes, and even a spaceship. The system has a "reset world" button for museum staff to use when needed.

Impossible Animals Exhibit

Impossible Animals Interactive Museum Installation from PO-MO Inc. on Vimeo.



The following video explains how the PO-MO system works:


PO-MO is also involved in promotional projects, assisting retailers, ad agencies, and brand managers with creative ways to engage customers and clients:
Ragpickers Kinect-based Window Display

Ragpickers Kinect Window Display from PO-MO Inc. on Vimeo.


The following video provides a scrolling description about PO-MO's work, including promising data collected during implementation:

Other products and services provided by PO-MO include mobile app development. I especially like the augmented reality business card depicted in the following video clip:

Augmented Reality Business Card from PO-MO Inc. on Vimeo


Imagine if your local shopping centers, museums, libraries, or even schools offered this level of immersive interaction on a regular basis!

RELATED
The PO-MOtion system has a wide range of uses. It is currently used in an educational setting in a sensory room for students with special needs, something that I'd like to try out in the near future with students at Wolfe School. I plan to share more about this in another post.


PO-MO Case Studies


PO-MO Bios:
Meghan Athavale – Director/CEO, PO-MO Inc.
"Meghan has been a professional designer and animator since graduating from Red River College in 1997. After graduation, she moved to Calgary, where she spent almost two years directing projects at Aurenya Studios, a start-up animation company. In 2001, Meghan was engaged by Community Connections to support community-based IT development projects in rural Manitoba and in Winnipeg’s inner city.  In 2008, Meghan joined Manlab, developing educational interactive games and resources for Immigrate Manitoba. She also launched Meghan PO-MO Project, a sole proprietorship which provided sound reactive visuals for DJs and venues across Canada. In 2009, Meghan was contracted as the User Experience Designer at Tipping Canoe, a multinational internet marketing company.

In 2010, Meghan formed PO-MO Inc. in partnership with Curtis Wachs. She began working exclusively for the company in December, 2010. Today, Meghan is the driving force behind PO-MO Inc."


Curtis Wachs – Technical Director/COO, PO-MO Inc.
"Curtis graduated from Assiniboine Community College in 2003 where he studied object oriented programming. Directly upon graduating, Curtis was hired by Assiniboine Community College to help design and develop software for online classes. Curtis relocated to Winnipeg in 2006 to create interactive training material for sales staff at E.H. Price. During the course of his work, Curt was apprenticed in 3D modelling and animation by Liem Ngyuen, a former Frantic Films resident. In 2008, Curtis joined Manlab, where he created online educational games for Travel Manitoba, Immigrate Manitoba, and other clients. In 2010, Curtis formally joined PO-MO Project, and the company became a partnership. In June 2010, PO-MO Inc. was founded.

Curtis is currently the technical director at PO-MO Inc., overseeing the project management and workflow of contracted and R&D development projects."


May 21, 2012

Leap Motion: Low Cost Gesture Control for Your Computer Display

Jessica Vascellaro, of the Wall Street Journal, reports on gesture, motion, and even object control for computers, highlighting the work of Leap Motion and Flutter.




Apparently the Leap Motion sensor is less expensive than Microsoft's Kinect. It can track fingers and hand movements down to 1/100 of a millimeter, within an interaction space of about 8 cubic feet.


Below is a video from the Leap Motion website:






RELATED
Leap FAQs
Leap Motion Developer Kit Application
Leap Motion: 3D hands-free motion control, unbound
Daniel Terdiman, CNET, 5/20/12
FYI:  Do a search and you'll find many more articles and posts about Leap Motion!

May 19, 2012

Johnny Chung Lee's Recent Words of Wisdom & Google's Open-Source Ceres Non-Linear Least Squares Solver


I have been a fan of Johnny Chung Lee since 2007 or 2008, before he finished his Ph.D. in Human-Computer Interaction. Johnny went on to work at Microsoft (Kinect) and then Google, where he works as a Rapid Evaluator.


Johnny is known for his experiments with the Wii Remote, which he introduced to the world during a TED Talk in 2008. He continues to maintain his Procrastineering blog, and from time to time uses it to share his take on the world of technology. The following quote is a good example of his viewpoint, taken from his post, "Technology as a Story":


"...what saddens me is when I encounter technologists with the brilliance to create new and wonderful things, but lack a sense of what is beautiful to people. Technology is most often known for being ugly and unpleasant to use, because technologists most often build technology for other technologists.
...But to touch millions of people, you have to tell a story - a story that they can believe in, a story that can inspire them. Technology is a tool by which new stories can be crafted."



Today, I came across Johnny's most recent post, which asks, "So, what exactly is a 'non-linear least squares solver'? And why should you care?" Take a moment to read his post, "Ceres: solving complex problems using computing muscle". Google just open-sourced the Ceres Non-Linear Least Squares Solver.


If Johnny Chung Lee thinks that this is "probably the most interesting code library" that he's had a chance to work with, it probably has some value. 


Even if you don't have a clue about the Ceres Non-Linear Least Squares Solver, you might appreciate Johnny's examples of how it would be useful. In today's rapidly accelerating, technology-supported world, you just might need it in your future!


Here are a few examples:
  • Making sense of sensor data from multiple locations (see video "SLAM 1: Viewed at 6X speed")
  • Figuring out the position of a camera and the objects in view (see video "Parallel Tracking and Mapping for Small AR Workspaces")
  • Combining GPS data with vehicle sensors in cars (see video "Street View Sensor Fusion with Ceres")
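To make "non-linear least squares" a bit more concrete, here is a toy curve-fitting problem of the kind Ceres is built for, solved with a hand-rolled Gauss-Newton loop. To be clear, this is not Ceres itself (Ceres is a C++ library); it is just a sketch of the underlying idea, and the data and function names are my own.

```python
import math

# Fit the model y = exp(k * x) to data by minimizing the sum of
# squared residuals r_i = y_i - exp(k * x_i), using Gauss-Newton.
def gauss_newton_fit(xs, ys, k=0.0, iters=25):
    for _ in range(iters):
        # Residuals and the Jacobian of each residual with respect to k
        r = [y - math.exp(k * x) for x, y in zip(xs, ys)]
        J = [-x * math.exp(k * x) for x in xs]
        JtJ = sum(j * j for j in J)
        Jtr = sum(j * ri for j, ri in zip(J, r))
        # Gauss-Newton step: delta = -(J^T J)^-1 (J^T r)
        k += -Jtr / JtJ
    return k

xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [math.exp(0.3 * x) for x in xs]   # noise-free data, true k = 0.3
k_hat = gauss_newton_fit(xs, ys, k=0.25)
```

The same structure (residuals, Jacobians, a solver iterating toward a minimum) is what Ceres handles at scale for problems like the SLAM and sensor-fusion examples above.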


RELATED
Johnny Chung Lee's Website
Excerpt from a post I wrote about Johnny Chung Lee four years ago:
I wish I could be Johnny Chung Lee for a Day! 3/2/08
I've mentioned in previous posts that I am a fan of Johnny Chung Lee, a Ph.D. student in the Human-Computer Interaction department at Carnegie Mellon University. Johnny expects to complete his Ph.D. this year. Johnny recently presented his innovative work at TED 2008.


What impresses me about Johnny is the way that he has documented his intellectual journey in a very accessible way, by using YouTube and his well-organized, appealing website. Johnny has taken interesting ideas that most would dismiss as silly or impractical, and transformed them into useful, usable applications that hold great promise for future work. 


In my opinion, many of Johnny's "hacks" will spark ideas related to the design and development of universally designed technologies and applications that will meet the technology needs of a wider range of people. This is important, especially now that an increasing number of "connected" interactive displays and kiosks (known by the marketing industry as interactive digital signage) are appearing in public spaces.


January 2011 post:
"Hi, Google. My name is Johnny Chung Lee": Johnny Chung Lee Leaves Microsoft. (I still wish I could be Johnny Chung Lee for a day.)

Jan 14, 2012

You Know You've Secretly Wanted to Learn To Code! (Info, links, video!)

You know you've secretly wanted to learn to code. Just do it! 


"Make your New Year's resolution learning to code. Sign up on Code Year to get a new interactive programming lesson sent to you each week and you'll be building apps and web sites before you know it." -Code Year  http://codeyear.com/


BTW, coding skills are needed beyond the world of apps and websites. Take a look at some of the posts and links on this blog - your imagination just might be sparked!  If you already know how to code, why not commit to learning something new?  
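If you're wondering what "code" even looks like at this stage, here is a tiny beginner-style exercise in Python. This is my own sketch of the kind of thing an intro lesson covers, not an actual Code Year assignment:

```python
# A classic first exercise: convert a list of Fahrenheit readings to Celsius.
def fahrenheit_to_celsius(f):
    return (f - 32) * 5 / 9

readings_f = [32, 68, 98.6, 212]
readings_c = [round(fahrenheit_to_celsius(f), 1) for f in readings_f]
print(readings_c)  # prints [0.0, 20.0, 37.0, 100.0]
```

Even a five-line program like this exercises the core ideas: defining a function, applying it to a list, and checking the results.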


COMMENT: There is much more to coding than what you'll learn through Code Academy's Code Year process. If you are serious about learning more about coding, computer science, and software systems, take an introductory course at your local community college or university extension program, preferably with a friend. Ask the instructor if some of the assignments can be done using the "pair programming" technique. It is more fun and social than the traditional way of coding!


Of all the textbooks, videos, and coding/programming self-help books I've come across, the series that has made the most sense to me is the brain-friendly "Head First" series from O'Reilly. It explains things well for beginners. Although it contains text and code, there are many pictures, diagrams, and humorous visual representations of basic concepts that are much more engaging than traditional "learn-to-code" tomes.


It is not too late to learn to code! 
There are so many great resources available to us now, in 2012, that there is no excuse to ignore your inner geek. If some (or all) of your hair is grey, why do crossword puzzles or Sudoku when you can be creative with code?


I took my first programming class about 8 years ago, when my youngest daughter was in high school. It was daunting at first, because the textbook was dry, the programming labs were tedious, and some of my mostly-male classmates already knew how to code. In my case, I was motivated because I wanted to create games at the time, and this got me over the hump. I soon learned that coding is both a science and an art, and that learning to code opens up a whole new way of thinking. (See the video of Jeannette Wing's presentation about computational thinking at the end of this post.)


RELATED
Why your 2012 New Year's Resolution Should Be Learning to Code
Sarah J., SPOTLIGHT on Digital Media and Learning Blog 1/9/12
PLAYBACK:  Pedagogy, Coding and Teaching Kids to Think Deeply
Sarah J., SPOTLIGHT on Digital Media and Learning 1/13/12
Center for Computational Thinking
Computational Thinking (pdf) (Jeannette Wing)
Code Academy
http://codeyear.com/
Head First Labs
Coding4Fun
Coding4Fun Blog
Coding4Fun KinectToolkit
Kinect for Windows SDK
CSTA:  ACM K-12 Computer Science Model Curriculum, 2nd Edition
CS Model Curriculum, 2nd Edition (pdf)
Jeannette M. Wing's Vision: "Computational thinking will be a fundamental skill used by everyone in the world by the middle of the 21st Century"

For 2012, my goal is to brush up on my previously learned coding skills and learn a few new ones related to the Kinect. I also want to become comfortable with HTML5.


Jan 3, 2012

"Kinect-based Telepresence with Room-Sized 3D Capture and Life-Sized Display", Includes Behind-the-Scenes "How-to" (UNC Chapel Hill)

Jim Spadaccini, of Open Exhibits, recently told me about a project that involves the real-time, interactive 3D capture of people in a room. As the viewer moves around the screens, the depth-detecting feature of the Kinect is harnessed to set the stage for a realistic telepresence experience.


Take the time to view the video, which contains some interesting views of how the system works:





Thanks, Jim, for the link!

RELATED/SOMEWHAT RELATED
Kinect Real-Time Room Telepresence
Kinecthacks, 1/3/12
From the project's website:
Maimone, A. and H. Fuchs. "A First Look at a Telepresence System with Room-Sized Real-Time 3D Capture and Large Tracked Display." The 21st International Conference on Artificial Reality and Telexistence (ICAT) (Osaka, Japan, November 28-30, 2011) [paper] [video]
Maimone, A. and H. Fuchs. "Encumbrance-free Telepresence System with Real-time 3D Capture and Display using Commodity Depth Cameras." The IEEE International Symposium on Mixed and Augmented Reality (ISMAR) 2011 (Basel, Switzerland, October 26-29, 2011) [paper] [video]
Open Exhibits

Dec 13, 2011

Kinect in Education! (kinectEDucation)

Although I'm currently exploring the world of interactive HTML5, interactive video, etc., I think I just might make "kinecteducation" the focus of my tech hobbies. I have some experience with game programming (one of my computer courses required a project using XNA), and I know quite a bit about gesture and multitouch, multi-user interaction, so it wouldn't be too much of a stretch.


My motivation?

As a school psychologist, my main assignment is a school/program for students with disabilities, including about 40 or so who have autism spectrum disorders. Yesterday, the principal of the school attended a demonstration of the Kinect and requested that our school be considered for piloting it. One of my other assignments is a magnet high school for technology and the arts, and rumor has it that it will be offering a game programming curriculum.  I'd love to co-sponsor an after-school game club and encourage the students to program educational apps for the Kinect sometime in the near future! 


I'm also working as a client, in collaboration with some of my educator colleagues, with a team of university students who are creating a communication/social skills game suite geared for students with autism and related disabilities...


I'm inspired by the possibilities!


We have large SMARTboards in each classroom and in other locations around the building, and we have a Wii set up in the large therapy room adjacent to my office. The Wii has proven to be very useful in helping the students develop social and leisure skills that they can use in and outside of the school settings, but some of the students have difficulty manipulating the buttons on the controllers.


You can get Kinect-based apps from the Kinect Education website! Below are selected links from the website:

You can also get additional information from the Microsoft in Education "Kinect in the Classroom" website.

Below are a few videos to give you an overview of how open-source applications designed for the Kinect can be used in education: 






Dec 12, 2011

UPDATE POST: Educational Interfaces, Software, and Technology: 2012 ACM-CHI Workshop Call for Papers/Presentations

There is still time left to submit your paper!


CALL FOR PAPERS
EDUCATIONAL INTERFACES, SOFTWARE, AND TECHNOLOGY 2012
3rd Workshop on UI Technologies and Educational Pedagogy
May 5-6, 2012
in conjunction with ACM-CHI 2012, Austin, Texas

This will be our third annual workshop in conjunction with CHI 2012.



One of the primary goals of teaching is to prepare learners for life in the real world. In this ever changing world of technologies such as mobile interaction, cloud computing, natural user interfaces, and gestural interfaces like the Nintendo Wii and Microsoft Kinect, people have a greater selection of tools for the task at hand. Teachers and students can leverage these tools to improve learning outcomes. Educational interfaces and software are needed to ensure that new technologies serve a clear purpose in the classrooms and homes of the future.



Since teachers are always looking for creative ways to engage 21st century learners, there needs to be an academic venue for researchers to discuss novel educational tools and their role in improving learning outcomes. This workshop aims to fill that void by combining the pedagogical expertise of the cooperative learning and learning sciences communities with the technical creativity of the CHI, UIST, and interactive surfaces communities. The objective of this workshop is to become a conference within two years.


We invite authors to present position papers about potential design challenges and perspectives on how the community should handle the next generation of HCI in education. 




Topics of interest include:

  • Gestural input, multitouch, large displays
  • Mobile devices, response systems (clickers)
  • Tangible, VR, AR & MR, multimodal interfaces
  • Console gaming, 3D input devices
  • Co-located interaction, presentations
  • Educational pedagogy, learner-centric, child computer interaction
  • Empirical methods, case studies
  • Multi-display interaction
  • Wearable educational media
Submission:  The deadline for workshop paper submissions is Dec 20, 2011. Interested researchers should submit a 4-page position paper in the ACM CHI adjunct proceedings style to the workshop management system. Acceptance notifications will be sent out February 20, 2012. The workshop will be held May 5-6, 2012 in Austin, Texas. Please note that at least one author of an accepted position paper must register for the workshop and for one or more days of the CHI 2012 conference.

Website: http://smarttech.com/eist2012
Contact: Edward Tse, SMART Technologies, edwardtse@smarttech.com


RELATED
Educational Interfaces, Software, and Technology Workshop Organizers
Edward Tse, SMART Technologies
Lynn V. Marentette, Union County Public Schools
Syed Ishtiaque Ahmed, Cornell University
Alex Thayer, University of Washington
Jochen Huber, Technische Universität Darmstadt
Max Mühlhäuser, Technische Universität Darmstadt
Si Jung “Jun” Kim, University of Central Florida
Quincy Brown, Bowie State University

Oct 11, 2011

Hacking Autism: Touch Technology for Young People with Autism Spectrum Disorders (October 11 is the Hackathon!)

October 11, 2011 is a special day. A number of software programmers will be working to develop "innovative, touch-enabled applications for the autism community and make this software available for free on HackingAutism.org." Take a moment to watch the following video clip, and then explore the Hacking Autism website!
"When touch-enabled computing was introduced to the world, no one could have anticipated that this technology might help open up a new world of communication, learning and social possibilities for autistic children. Yet it has. Hacking Autism is a story of technology and hope and the difference it's making in the lives of some people who need it most. Hacking Autism doesn't seek to cure autism, but rather it aims to facilitate and accelerate technology-based ideas to help give those with autism a voice." -hackingautism.org
Touch technology + people with autism spectrum disorders = one of the reasons why I returned to school to take computer courses and explore natural user interfaces and interaction.

RELATED
Interacting with HP TouchSmart Notes: Photo, Video, Audio and More
Interactive Visual Supports for Children with Autism:  Gillian Hayes' Work at the Social and Technology Action Research Group
Open Source Multi-touch Software for Young People with Autism
Interactive iPad Apps for Kids with Autism: Could some of these be transformed for multi-touch tabletop activities?
iPad Apps: Supporting Communication for Young People with Autism (links to Moms with Apps)
Reflections about interactivity in my present world (Aug. 2010)
Interactive Multi-touch for Children with Autism Spectrum Disorders: Research and Apps by Juan Pablo Hourcade, Thomas Hanson, and Natasha Bullock-Rest, University of Iowa
Open Autism Software "Where Social Skills and Interest in Computers Meet"
Sen H. Hirano, Michael T. Yeganyan, Gabriela Marcu, David H. Nguyen, Lou Anne Boyd, Gillian R. Hayes. vSked: Evaluation of a System to Support Classroom Activities for Children with Autism. In CHI 2010 (Atlanta, GA, 2010). (pdf)
Gillian R. Hayes, Sen Hirano, Gabriela Marcu, Mohamad Monibi, David H. Nguyen, and Michael Yeganyan. Interactive Visual Supports for Children with Autism. Personal and Ubiquitous Computing. April 2010.
Monibi, M., Hayes, G.R. Mocotos: Mobile Communication Tools for Children with Special Needs. Proceedings of Interaction Design and Children, pages 121-124. ACM, 2008.
SOMEWHAT RELATED
Hope Technology School
Do2Learn JobTips
Autism Research Group at Georgia Tech
Immersive Cocoon Interaction: "It's people who are now the interface"
Today I hooked up a Wii to the IWB in the school's therapy room. Next, a Kinect?
(IWBs + Games + Social Skills)

Jul 27, 2011

Apple's iOS 5 facial recognition feature opens up interactive possibilities

I've been thinking about creating my first iPad app, and as I was searching for information, I came across a few articles related to Apple's new iOS 5 that I found interesting.  


Because my target user group includes young people with autism spectrum disorders (ASD),  I was intrigued by the possibility that the facial recognition APIs might provide a means of assessing mood or emotional states.   Most of us understand that our faces function as mirrors to feelings, and we use our facial expressions to communicate our feelings to others.  Unfortunately, this is a concept that is difficult for young people with ASD to understand.  My hunch is that there is an "app for that".   


I'd love to create a little iPad app for young people with ASD for education, intervention, and/or communication activities that incorporates the facial recognition feature!


Apple's iOS facial recognition could lead to Kinect-like interaction
Darrell Etherington, GigaOm/Reuters, 7/27/11
Here is a quote from the above article:
"You could create apps that track a user’s eye movement and dynamically change content accordingly, for instance. App developers might even be able to use data gathered from facial recognition APIs to identify so-called “hotspots,” providing insight about where a user is looking most within an app and arranging content accordingly. In time, an iPhone app might even be able to assess the emotional state of the user, based on whether they’re frowning or smiling, and address the user in a manner appropriate to their mood. It might also be able to tell how engaged users are with mobile ads and content, which might be useful for iAd customers, among others."


Stan Schroeder, Mashable, 7/26/11
Below is a video from the above post which demonstrates an app developed by Polar Rose, a company that was purchased by Apple.
RELATED
Apple plans native panorama functionality in iOS5
Seth Weintraub, 9TO5Mac, 7/8/11
iOS 5's final release may include "Assistant" speech-to-text feature
Chris Rawson, TUAW, 7/23/11

Jul 24, 2011

Video: Kinect SoundWall, links to info and code!





Here is information about the project from the KinectHacks SoundWall site:

"Kinect sound machines become prettier and easier with each development! The Kinect SoundWall is a drum beat music machine controlled by gestures and voice commands. This video displays this digital music machine at work and shows how, through various gesture and voice commands, users can create awesome beats to dance to. In the video, the user gestures to certain blocks on the screen in order to create a beat there or render the beats void. Through various voice commands, the beat can start, increase tempo, stop, etc. Through the proper integration of both voice and gesture commands, the Kinect SoundWall sets the standard for a great and efficient sound machine for the Kinect!"
"For more information about the Kinect SoundWall visit the project’s website."
RELATED
Vertigo SoundWall CodePlex Project Site

Jul 18, 2011

Emerging Interactive Multimedia, New Models of HCI for Museum Exhibits (Course offered by Ideum's Jim Spadaccini, plus info about the MT55 multi-touch table)

Jim Spadaccini, the director and founder of Ideum, will be teaching a course on exhibit development through the University of Victoria, "Emerging Exhibits: Exploring New Models of Human Computer Interaction (HCI)." The following excerpt from the course description provides a glimpse of how emerging technologies are beginning to change the museum experience:


"Computer-based interactive exhibits are undergoing a major transformation. The lone, single-user kiosk is now being replaced by multitouch tables and walls, motion-capture spaces, networked installations, and RFID-based exhibits. Advances in augmented reality, voice recognition, eye tracking, and other technologies promise even more radical change for exhibits in the near future."


I've been following Jim's journey with Ideum, a multimedia design firm that collaborates with museums and related non-profits, for many years, and I am impressed with the work of this company. In addition to his work at Ideum, Jim serves as the Principal Investigator of a National Science Foundation-sponsored open-source exhibit software project, Open Exhibits, which provides a free software development kit that supports the creation of multi-touch and multi-user software applications for museums and educational settings.


I'm happy to put in a plug for Ideum's latest product, the MT55 Platform Multi-Touch Table. It incorporates a range of features that I'm sure will meet the needs of museum visitors.  In my opinion, this table would be a fantastic resource for all types of libraries, including those in K-12 settings.

The MT55 Platform Multi-touch Table, from Ideum

The MT55 Platform Multitouch Table from Ideum on Vimeo. (Note: This video features music by Moby, the track "Sevastopol" from his current album, Destroyed. The music was used with the artist's permission. Learn more at: moby.com) - Ideum

"The thinnest, largest, most powerful multitouch table available. The MT55 Platform multitouch table houses a powerful computer and a 55-inch interactive LCD display that responds to 32 touch-points, inside a rugged aluminum body."

"The bright 55″ 1920×1080 HD display has a 5,000,000:1 contrast ratio. A wide 178-degree viewing angle accommodates multiple users around the table. The optical multitouch system supports 32 simultaneous touch points for collaborative interaction. The system is multitouch-enabled from start-up, and runs Windows 7 64-bit professional edition."


"The integrated computer is packed with power. It contains an Intel® Hyper-Threaded Dual-Core i5®, which runs at 2.66 GHz, 8GB of RAM, and a 128 GB solid-state drive (upgradeable to an i7®)."


"The table comes complete with WIFI, Bluetooth, and Ethernet connectivity. It also has multiple HDMI outputs that allow you to easily mirror the table's display, extend the desktop, or connect to and display from another computer or HDMI device."

"Convenient, but secure ports: CAT5, HDMI, and USB 2.0 are available on both the side and bottom of the table."

"The MT55 Platform includes blue LED under-lights to illuminate the area beneath the interactive surface (custom LED colors are available). Every MT table includes a sophisticated internal cooling cell to maintain operating conditions that exceed the optimum environmental specifications for the internal components."

"The interactive surface of the MT55 Platform is protected by a sheet of hardened, crystal clear, low-iron 5mm tempered glass...
As an option, we offer Sevasa HapticGlas®, produced exclusively for Ideum. Micro-etched HapticGlas® provides tactile feedback, reduces fingerprints, increases scratch resistance, and directs user focus." -Ideum
 

RELATED
High-res photos of the MT55 Platform

GestureWorks Software
Open Exhibits
Ideum
Open Exhibits Tuio Kinect

Jul 12, 2011

Summer Break: Music Apps, Multimedia, Kinect, My New iPad2, Tech-reading, Google+, Dancing...

I'm on summer break, which for me means that I spend an increased amount of time playing and creating music and doing all the other fun stuff I don't have much time for during the school year. I'm still exploring what I can do with my new iPad2 - there are so many music apps! My favorite at the moment is GarageBand. It keeps me engaged for hours, and I can take it with me anywhere I go. I'm also exploring iPad apps for education and students with special needs, since many of the young people I work with have autism spectrum disorders. They all really love music.


Today, I came across turntable.fm, a "social-djing" website, from a link shared by Dimitri Diakopoulos.  I think it would be fun to play with.


Turntable.FM, The Fastest-Growing Music Service You're Not Using
William Fenton, PC Magazine, 6/23/11
Social DJing with Turntable.FM
Andrew Mager, 5/28/11


I'm still plowing through technology journals and zines from previous months - I had to skip over my stack to read the cover article of the most recent Communications of the ACM:
Michael Edwards, University of Edinburgh, 2011


I LOVE the design of this cover. It would make for a nice interactive interface for an iPad music app. Or a larger touch-screen display. Or even a SMARTBoard! (BTW, My first computer-related course was Computer Music Technology, in 2003. My undergraduate honors research (psychology), years ago, focused on constructive cognition and music recognition/memory. This topic is dear to my heart.)

I've spent some quality time with my first grand-baby this summer.  Although his "screen time" is limited, given his age of 7 1/2 months, he enjoys playing with music on my iPad.  He likes the drums found in the iPad GarageBand application.  Here he is playing with NodeBeat, an app created by Seth Sandler and Justin Windle:

Most of his time is spent off-screen:





Over the last few months, there has been a surge of interactive touch-enabled apps for education, including some for young people with special needs.  This will be the topic of a few of my future posts.