Showing posts sorted by relevance for query "natural user interface".

Jan 8, 2009

For Techies and the Tech Curious: Multi-Touch/Gesture from the NUI-Group

If you are a new visitor to this blog and are interested in interactive multimedia, you'll want to know more about the NUI Group. The Natural User Interface Group, or NUI Group, is an interactive media community researching and creating open source machine sensing techniques to benefit artistic and educational applications.
(For related information, please read my recent post, Usability, Accessibility, and User Experience in a Win7 Environment.)

Seth Sandler, of the NUI-Group, sent out a great email with links and resources for people interested in multi-touch/gesture interaction, hardware, and/or software development. NUI-Group members who have completed projects are listed below, with links to project websites as well as related threads on the NUI-Group forum.

(The information can be found on the NUI-Group Wiki, which boasts a nice icon-based front page.)

Thanks, Seth, for organizing this wealth of information!

The following projects are divided up by type. (Links to information about the various types of multi-touch and gesture systems can be found near the end of this message.)

FTIR - Frustrated Total Internal Reflection

http://img144.imageshack.us/img144/8105/shemeftir2yu7.jpg
Name: Seth (cerupcat)
Project Name: AudioTouch
Project Website: http://ssandler.wordpress.com
Project Thread: http://nuigroup.com/forums/viewthread/1352/ http://nuigroup.com/forums/viewthread/2309/

Name: bassmang5
Project Name: Æ-table
Project Thread: http://nuigroup.com/forums/viewthread/3144/

Name: Daniel (Zin)
Project Name: Prometheus
Project Thread: http://nuigroup.com/forums/viewthread/2612/

Name: Carsten (carschdn)
Project Name: aTRACKtive
Project Thread: http://nuigroup.com/forums/viewthread/3223/

LLP - Laser Light Plane

http://www.codelaboratories.com/images/LLP/HiPressure.jpg

Name: Denis Santelli(dsan)
Project Website: http://www.touchwall.fr/
Project Thread: http://nuigroup.com/forums/viewthread/3051/


LED-LP - Light-Emitting Diode Laser Plane

Name: Nolan (PeauProductions)
Project Name: PeauProductions (LCD)
Project Website: http://peauproductions.blogspot.com/
Project Thread: http://nuigroup.com/forums/viewthread/3291/

DI - Diffused Illumination

http://img359.imageshack.us/img359/1143/shemedi2bx3.jpg
Name: Seth (cerupcat)
Project Name: MTmini
Project Website: http://ssandler.wordpress.com/MTmini
Project Thread: http://nuigroup.com/forums/viewthread/1731/

Name: Fairlane
Project Name: ORION v2
Project Website: http://orionmultitouch.blogspot.com/
Project Thread: http://nuigroup.com/forums/viewthread/1709/

Name: Abdullah (EfeNDy)
Project Name: EfeNDy’s Diffused Illumination MT
Project Thread: http://nuigroup.com/forums/viewthread/3176/

Name: Sandor
Project Name: EXPO REAL 2008
Project Website: http://vimeo.com/2240537
Project Thread: http://nuigroup.com/forums/viewthread/3576/

Name: Matthew (MatthewW)
Project Name: Design Garage
Project Website: http://www.gotuasciencecenter.org/
Project Thread: http://nuigroup.com/forums/viewthread/3730/


Touchless

Name: Jimi Hertz
Project Name: Touchless Wall
Project Website: http://sassexperience.org/projettouchwall.html
(try: http://sassexperience.org/multitouch/inprogress.html )
Project Thread: http://nuigroup.com/forums/viewthread/2414/

YouTube: http://www.youtube.com/watch?v=KCFbWPf37jw

Other Types:
DSI - Diffused Screen Illumination

http://iad.projects.zhdk.ch/multitouch/wp-content/uploads/2008/06/sheme_dsi_web.jpg



RELATED LINKS

Tips for Success

Resources
Thanks to all NUI-Group members who have been working so hard at this mission!

I'll end this post with a YouTube video created by Jimi Hertz, a NUI-Group member. "MULTI-TOUCHLESS WALL HOW TO?"

I especially like the music!


Aug 29, 2009

Natural User Interface Surfaces: TouchCo - IFSR technology company, founded by NYU Media Research Lab's Ken Perlin and Ilya Rosenberg

Earlier this year, I had a very vivid dream about haptic touch-screen overlays. That's how things go for me sometimes. At any rate, the dream inspired a blog post, "Last night I dreamt about haptic touch-screen overlays...". Since this is one of my interests, I try to keep up with developments in this field when I can.

It is amusing to see how seemingly unrelated things have similar roots. Today I came across information about a cool mobile phone concept for the visually impaired, designed by Samsung. The touch technology involved in the concept phone incorporated something called Electric Active Plastic, or EAP.



This sparked my curiosity, and while searching for more information, I came across an article in MIT's Technology Review that caught my eye:


A Touch of Ingenuity: Inexpensive pressure-sensitive pad could make surfaces smarter
(Kate Green, MIT Technology Review September/October 2009)



UPDATE: As of February 2010, TouchCo no longer exists; it was acquired by Amazon.



"...Now that more and more smart phones and MP3 players have touch-screen interfaces, people have grown accustomed to interacting with gadgets using only taps and swipes of their fingers. But on the 11th floor of a downtown Manhattan building, New York University researchers Ilya Rosenberg and Ken Perlin are developing an interface that goes even further. It’s a thin pad that responds precisely to pressure from not only a finger but a range of objects, such as a foot, a stylus, or a drumstick. And it can sense multiple inputs at once."

Ken Perlin and Ilya Rosenberg developed this system as part of their human-computer interaction work at NYU's Media Research Lab, and with the success of their work, went on to start a new company, TouchCo, to commercialize and market this concept, known as IFSR (Interpolating Force-Sensitive Resistance) technology.

"The IFSR sensor revolutionizes the field of multi-touch human-computer interaction by out-performing existing solutions in both price and capabilities. Unlike traditional capacitive sensors, our patent pending system can detect any object - not just a finger - and can determine how much pressure is being applied to every point on a sensor simultaneously. IFSR sensors are natively multi-touch, use less power than capacitive sensors, and are much less expensive to produce, making them a highly disruptive technology with widespread market applications."
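For the technically curious, the interpolation idea in the passage above can be sketched in a few lines. To be clear, this is not TouchCo's patented algorithm; it's a generic, hypothetical illustration of how pressure spreading across neighboring cells of a coarse sensor grid lets software recover a touch position finer than the sensor pitch. The grid values are made up.

```python
# Hypothetical sketch: estimating a touch position from a coarse grid of
# force-sensitive resistor (FSR) readings via a force-weighted centroid.
# Pressure spread over neighboring cells lets the estimated position fall
# between cells, i.e. finer than the physical sensor pitch.

def touch_centroid(grid, threshold=0.05):
    """Return (x, y, total_force) of the touch, or None if nothing pressed."""
    total = 0.0
    sx = sy = 0.0
    for y, row in enumerate(grid):
        for x, force in enumerate(row):
            if force > threshold:
                total += force
                sx += x * force
                sy += y * force
    if total == 0:
        return None
    return (sx / total, sy / total, total)

# A finger pressing between columns 1 and 2 of a tiny 3x3 pad:
pad = [
    [0.0, 0.1, 0.1],
    [0.0, 0.4, 0.4],
    [0.0, 0.1, 0.1],
]
print(touch_centroid(pad))  # x lands near 1.5, between the two columns
```

A real interpolating sensor does much more (multiple simultaneous touches, noise rejection, calibration), but the weighted-centroid idea is the core of it.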


The TouchCo website had a wealth of information regarding this technology. (Update: the website was taken down in February 2010.)

Other members of TouchCo include Nadim Awad, Fang Cheng, Julien Beguin, and Christopher Nam.

SOMEWHAT RELATED
I know of Ken Perlin through his Games for Learning work, and also through his on-line application, Responsive Face, which I use in my work with students who have autism.




Ken Perlin's Blog
Games for Learning Institute

TechPsych Post:
Samsung Braille TouchPhone Prototype is Cool: Uses Electric Active Plastic for functional tactile feedback.

Jul 23, 2013

Monkeying Around with Autism Assessments: Kinect-based game by Vectorform and Kaiser Permanente therapists offers a barrel of possibilities!

"The goal was to build a game that is extremely accessible, non-complex and includes simple mechanics that children with autism can quickly understand to retain their attention, prevent over-stimulation, encourage play, and prevent frustration." -Vectorform "Monkey Business" Team


I recently had the chance to visit the team at Vectorform's Royal Oak (MI) headquarters and experience a demo of one of their newest applications, a Kinect-based game developed to assist in the assessment of children with autism. The game, known as Monkey Business, was designed in collaboration with physical, occupational, and speech therapists from Kaiser Permanente, a large health care service provider in California. The project was supported by Kaiser Permanente's Innovation Fund for Technology group.

I was impressed!

As I watched the demo, it was clear that much time and attention was given to the input of the therapists at Kaiser Permanente. The avatar in the game is a friendly monkey who embodies the mannerisms of a curious child.  The artwork is visually appealing and not too busy. The sound effects relate to the interaction at hand and provide feedback to the child as they engage in various game-like tasks designed to assess visual-motor, gross motor, and basic receptive language skills.  Another good feature of this application is the simplicity of the screens designed for entering the child's information and viewing progress data.

In my opinion, the Monkey Business assessment game has the potential for use in school settings, and would be enhanced if additional assessment activities were provided in the application.  Expanded activities within the Monkey Business environment could support a range of treatment, education, and related intervention efforts. In addition to supporting assessment and intervention of children with autism, the Monkey Business concept would be especially useful in working with children who have experienced a traumatic brain injury.


Below are screen shots and descriptions of a few activities from the Monkey Business game:

Lily Pad Jumping Test
In the jumping test, the child is asked to jump across a pond, from one lily pad to another.  The child is provided with a set of toy lily pads on the floor.  As the child jumps, the Kinect sensor picks up the movement and displays the jumps on screen through the monkey avatar.
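For techie readers: the kind of jump detection described above can be sketched from skeleton-tracking output. The Kinect SDK reports joint positions each frame; a jump can be flagged when a joint such as the hip rises past a threshold above its resting height and comes back down. This is purely illustrative; the function, thresholds, and data below are my own assumptions, not Vectorform's code.

```python
# Hypothetical sketch of jump detection from skeleton-tracking data, in the
# spirit of the lily-pad task. The hip-joint heights (meters) would come
# from the Kinect's per-frame skeleton stream; here they are faked.

def count_jumps(hip_heights, rest_height, rise=0.15):
    """Count jumps in a sequence of hip-joint heights (meters)."""
    jumps = 0
    airborne = False
    for h in hip_heights:
        if not airborne and h > rest_height + rise:
            airborne = True          # hips rose enough: takeoff
        elif airborne and h <= rest_height + rise / 2:
            airborne = False         # back near resting height: landing
            jumps += 1
    return jumps

# Two simulated jumps in a stream of hip heights:
stream = [0.9, 0.9, 1.1, 1.2, 1.0, 0.9, 0.9, 1.15, 0.95, 0.9]
print(count_jumps(stream, rest_height=0.9))  # -> 2
```

A production version would smooth the joint stream and calibrate the resting height per child, but the state machine above is the gist.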

Balance Beam Test
The following screen shot depicts the monkey avatar crossing a stream on a balance beam. The monkey's movements reflect the child's movements across a real balance beam in front of the Kinect sensor and a large screen display.

Block Building Test
In the block building test, the monkey avatar encourages the child to build a tower of blocks to match the one displayed on the large screen.  The child uses real blocks, placed on a table, to build a tower while the monkey provides encouragements.  Objects other than blocks can be used, as the Kinect sensor is capable of identifying a variety of objects that can be used for building and stacking.
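One way a depth camera could check the tower is by comparing the tabletop depth against the depth of the top of the stack. Again, this is a hypothetical sketch of the general idea, not the actual Monkey Business implementation; the block size and depth values are invented.

```python
# Hypothetical sketch of the tower-matching check: a depth camera reports
# distance in millimeters (smaller depth = closer to the sensor = higher
# stack), so the number of stacked blocks can be inferred from the gap
# between the tabletop depth and the stack-top depth.

def blocks_stacked(table_depth_mm, stack_top_depth_mm, block_height_mm=40):
    """Infer the block count from two depth readings."""
    stack_height = table_depth_mm - stack_top_depth_mm
    return max(0, round(stack_height / block_height_mm))

def tower_matches(table_mm, top_mm, target_blocks):
    """Does the child's tower match the target shown on screen?"""
    return blocks_stacked(table_mm, top_mm) == target_blocks

print(blocks_stacked(1200, 1080))     # a 120 mm tall stack -> 3 blocks
print(tower_matches(1200, 1080, 3))   # -> True
```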

Clinician Screens
The screens designed for inputting and reviewing data about the child are designed with simplicity and ease of use in mind, as shown in the pictures below:

In the present version of Monkey Business, a TV remote is used to control the application. It is possible that future versions of Monkey Business could be integrated with a smartwatch to input student information and control the activities in the application.


Comment: As a school psychologist who works with children, teens, and young adults with autism and other significant disabilities, I know how difficult it can be to conduct assessments using traditional test materials.  If a student has difficulty interacting with unfamiliar adults, has communication challenges, or has motor difficulties, it may not be possible to administer an assessment that generates meaningful or valid results.  Even if the student is capable of completing some of the traditional test tasks, much time and effort is required on the part of the evaluator to sustain their interest, effort, and attention.   

To get a better picture of a student's emerging skills, I use interactive multimedia applications during my assessments of children with autism.  Newer technologies such as interactive whiteboards, larger touch-screen monitors, and tablets have proven to be useful tools in assessment.   A significant drawback to my approach is that it is cobbled together and might be difficult to replicate by other evaluation teams.  I see a growing need for a range of technologically-enhanced tools for assessment, including applications that offer opportunities for playful engagement.  Monkey Business holds potential to fill the bill.

What next?

I'd like to share some stories about the people who make things come to life at Vectorform. During my visit, I learned that they are involved in a number of projects related to health care and are always brainstorming new ways to harness technology for their various clients.  

Over the past several years, I've followed Vectorform's journey as they've jumped off the desktop and transformed from web developers into post-WIMP explorers of natural user interfaces and interaction.  Their most recent area of exploration is Google Glass.

Intrigued by Google Glass?  So am I.  

During my visit to Vectorform, I had a chance to see the world - or rather Vectorform's basement conference room - through Google Glass.  As soon as I put them on, I was flooded with ideas; I'll expand on those ideas, and those of others, in future posts.

In the meantime, take the time to read an excellent post by Kevin Foreman, Vectorform's Director of Product Vision.  In "The Glass Experience", Kevin provides an in-depth reflection on his experience wearing Google Glass throughout his daily routine.  He also explains the inner workings of the hardware, the strengths and limitations of the Glass system, details of the user interface and the "on-board" accessories, and what it is like to be a new Glass-wearing celebrity.


RELATED
Here are just three of the innovators I met during my recent visit to Vectorform, left to right: Jennifer Tonio, Marketing Manager, Kevin Foreman, Director of Product Vision, and Patric Samona, Director of Health Solutions.

Below are a few links related to the use of games for the assessment and intervention of autism, along with links to information about the use of games and emerging technologies for rehabilitation, health care and health promotion:

Microsoft Kinect and Autism (SlideShare) - Susan McCarthy, Little Angel's School, 2/09/13

Italian Team Uses the Kinect to Treat Autistic Children - Andrea Lorini, Epoch Times, 12/13/12

Microsoft Surface Multi-touch Application for Pediatric Neuropsychology Assessment (Featuring Vectorform) - Lynn Marentette, TechPsych Blog, 8/26/09

Xbox One, Kinect 2.0 and the future of health technology - Marcelo Calbucci, Mobihealth News, 5/26/13

Accessible Games for Health and K-12 Education: Lessons from the Classroom (SlideShare) - Lynn Marentette, Games for Health Conference Presentation, 5/9/08

Researchers: Microsoft Kinect is a Money-Saving Telemedicine Device - Gabriel Perna, Healthcare Informatics, 2/15/13

Lowes LP, Alfano LN, Yetter BA, Worthen-Chaudhari L, Hinchman W, Samona P, Flanigan KM, Mendell JR. Proof of Concept of the Ability of the Kinect to Quantify Upper Extremity Function in Dystrophinopathy. PLoS Curr. 2013 Mar 14;5. doi:10.1371/currents.md.9ab5d872bbb944c6035c9f9bfd314ee2

Feb 20, 2013

AirHarp for Leap Motion, a Responsive Musical Natural User Interface

I like this demonstration of Adam Somers' AirHarp music application for use with the Leap Motion 3D controller:


AirHarp is being developed in C++ using Adam Somers' audio processing toolkit, MusKit.  This looks interesting!  Things have changed since I last took a computer music technology course (back in 2003).
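For readers curious about how an air-harp interface might work in principle: a tracked fingertip sweeps across virtual string positions, and a pluck fires when the finger crosses one between frames. The sketch below is not AirHarp's actual code (which is C++ on the Leap SDK); it's a hypothetical Python illustration with an invented tuning, using faked finger coordinates instead of real tracking data.

```python
# Hypothetical sketch (not AirHarp's code): mapping a tracked fingertip's
# horizontal position to virtual harp strings. A string is "plucked" when
# the finger crosses its x position between two consecutive frames.

def string_positions(n_strings, width=1.0):
    """Evenly spaced string x-coordinates across the play area."""
    return [(i + 0.5) * width / n_strings for i in range(n_strings)]

def string_frequency(index, base_hz=220.0):
    """Major-pentatonic tuning per string (an assumption, not AirHarp's)."""
    semitones = [0, 2, 4, 7, 9][index % 5] + 12 * (index // 5)
    return base_hz * (2 ** (semitones / 12))

def plucked(prev_x, curr_x, strings):
    """Return indices of strings crossed by the finger this frame."""
    lo, hi = sorted((prev_x, curr_x))
    return [i for i, s in enumerate(strings) if lo < s <= hi]

strings = string_positions(5)        # strings at x = 0.1, 0.3, 0.5, 0.7, 0.9
print(plucked(0.05, 0.45, strings))  # finger sweeps right, plucking strings 0 and 1
```

In a real system the per-frame positions would come from the Leap controller's tracking data, and each pluck would trigger a synthesized string voice.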

Adam Somers is a senior software engineer at Universal Audio.  He has a graduate degree in music technology from Stanford, and a background in computer science, electronics, human-computer interaction, and signal processing.

Leap Motion is a motion-control software and hardware start-up company located in San Francisco, California. According to promotional information from the website, the company's first product, the Leap Motion controller, is 200 times more sensitive than existing technologies.  It will be interesting to see how this plays out.  (I'm still waiting for my pre-order.)

RELATED
AirHarp (links to GitHub)
Leap FAQs
Leap Motion Website
Leap Motion Developer Portal
Leap Motion Leadership Team
Leap Motion goes retail: Motion controller sold exclusively at Best Buy
Michael Gorman, engadget, 1/16/13

Leap Motion: Low Cost Gesture Control for your Computer Display
Asus partners up with Leap Motion, PCs with 3D motion control to debut in 2013
Michael Gorman, engadget, 1/3/13
Stanford Center for Computer Research in Music and Acoustics


Nov 4, 2011

Le Chal: Smart shoes for the visually impaired, with haptic/vibrational feedback, sensors and GPS


Le Chal is a haptic/vibrational feedback shoe designed for the blind and visually impaired, one of the projects of Anirudh Sharma, a member of the NUI Group (Natural User Interface Group).  Le Chal was conceptualized at the 2011 MIT Media Lab Design and Innovation Workshop at COEP.
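The feedback loop behind a navigation shoe like this can be sketched simply: compare the wearer's heading with the bearing to the next GPS waypoint, and vibrate the left or right shoe to steer them. This is my own hypothetical illustration of the concept, not Le Chal's actual firmware; the coordinates and tolerance are invented.

```python
# Hypothetical sketch of haptic GPS steering (not Le Chal's firmware):
# compute the bearing to the next waypoint, compare it with the wearer's
# compass heading, and pick which shoe should vibrate.

import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def steer(heading_deg, target_bearing_deg, tolerance=15):
    """Which shoe vibrates to turn the wearer toward the waypoint."""
    diff = (target_bearing_deg - heading_deg + 180) % 360 - 180
    if abs(diff) <= tolerance:
        return "none"      # close enough: stay on course
    return "right" if diff > 0 else "left"

# Wearer faces north (0 degrees); the waypoint lies due east of them:
b = bearing_deg(40.0, -74.0, 40.0, -73.9)
print(steer(0, b))  # -> "right"
```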
Take Me There Shoe / Le Chal

Shoe for visually impaired - Le Chal Himanshu Khanna, 10/31/11

RELATED
touchaddict blog

Feb 20, 2011

Human-Computer Interaction (HCI) Is Changing the World: BLUR Conference, February 22-23, Omni Orlando Resort (Includes video)

Tuesday, February 22, 2011 at 8:00 AM - Wednesday, February 23, 2011 at 5:00 PM (ET)
Omni Orlando Resort, 1500 Masters Blvd., ChampionsGate, Florida 33896. Phone: (407) 390-6664
Blur Conference

ABOUT BLUR 
(from the conference website)

"It’s easy to forget that the computer mouse is over 45 years old."

"What’s not as easy to forget is that we’re now collectively getting used to interacting with computers via means and interfaces that have moved way beyond the keyboard and the mouse — the iPhone and Wii being the most prominent examples."

"The truth is that we stand on the verge of a major revolution in the models of Human Computer Interaction (HCI). A revolution that will fly right past academic and into a world of retail, medical, gaming, military, public event, sporting, personal and marketing applications."

"From multi-touch to motion capture to spatial operating environments, over the next 10 years, everything we know about HCI will change."

"Blur is the only conference that is exploring the line of interaction between computers and humans in a substantive, real-world and hands-on way."

"At Blur, vendors, strategists, buyers and visionaries assemble to not only discuss the larger issues of HCI, but also to lay their hands on the latest in HCI technology. Blur is the only forum for a focused, hands-on exploration of the varied technologies evolving in the HCI."

"Come play, investigate, learn and apply at Blur — where we’re changing how you interact with computers forever." -Blur




BLUR Conference Agenda
(Note:   I added the links to conference participants and/or their organizations. Feel free to leave a comment if you know of any corrections or better links!)
Keynotes:

Neuroergonomics: How an Understanding of the Brain is Changing the Practice of Human Factors Engineering - Dr. Kay Stanney, Design Interactive
When Computers Feel: Understanding Human Emotional Measurement  - Hans Lee, EmSense
A Quick Hit on Mobility and HCI - Juan Pons, Swype
Panel Discussion: Haptics- The Beginnings and Future of Touch  - Nimish Mehta
Why HCI will lead the biggest tech revolution yet - Andrew Tschesnok, Organic Motion
Location as a Primary Interface Input - Matt Galligan, SimpleGeo; Nick Brachet, Skyhook Wireless
Robotics, Gaming and The Future of Entertainment- Paul Berberian, Orbotix
Virtual Coaches in Healthcare: A Vision of the Future - Dan Siewiorek, Carnegie Mellon University
10 reasons to be happy about giving computers emotion sensing - Dr. Rosalind Picard, MIT
Commercializing HCI Technology - Dr. Paul Kedrosky, Ewing Marion Kauffman Foundation and Dr. Gerry Barnett


Breakout Sessions:
Human Instrumentation - James Park, FitBit; Ben Rubin, Zeo; Jason Jacobs, RunKeeper;
Steve Larsen, moderator

New Museum Experiences: Learning from Multitouch and Multiuser Installations - Jim Spadacinni, Ideum
Kinect Hacks - Jonathan C. Hall; Lonergan Harrington; Jim Spadacinni, Sean Kean, moderator
Interactive Ads and Consumer Experiences - Alessio Signorini, Immersive Labs; Jon Fox, Helios Interactive
Augmented Reality - Ready for Primetime? - Vikas Reddy, Occipital; Carlin Getliffe, Omniar; Edwin Rivera, Credelis; Dan Rua, moderator
Building an Interface for Endangered Language Learners - Finn Thye and Kelson Adams, Univ of Colorado - Boulder
Alternative Interface Inputs - Gary Clayton, Nuance; Nick Langdale-Smith, Seeing Machines; RJ Auburn, Voxeo; Steve Larsen, moderator
Ewing Marion Kauffman Foundation "Idea Hack" - Commercializing HCI Technology: A Discussion - led by Paul Kedrosky
Building Natural User Interfaces - Thomas Peterson, SoftKinetic; Ohad Shvueli, Prime Sense; David Minnen, Oblong
3D Interactive Design for the Human Body - Albert Hwang
Panel: Will the Kinect Change the HCI Industry Forever? A Group Discussion
HCI in the 21st Century:  Technologies for Extending and Amplifying the Human Experience (pdf) - Dr. Charlie Hughes, UCF; Dr. David Pratt; Dr. Joseph LaViola;  moderated by Steve Fiore, UCF


Some Videos of HCI/Tech featured at Blur 2011 
360 Panorama occipitalhq


"Illuminous" Eric Gradman



"Standard Gravity" Eric Gradman, OpenKinect (libfreenect/python)


Organic Motion Markerless Motion Capture


Advisory:
Steve Fiore, University of Central Florida
Bob Allen, Disney R&D
Kay Stanney, Design Interactive
Capt. Dylan Schmorrow, USN