Seth Sandler's most recent video of the Audio Touch interactive table provides a good demonstration of how multi-touch on a table can work with music.
Seth is a member of the NUI Group (Natural User Interface). He is finishing a Bachelor's degree in Interdisciplinary Computing and the Arts, with an emphasis on Music, at the University of California, San Diego. His research and development work centers on multi-touch, multi-user musical interfaces.
Here is an update about the NUI group:
"Natural User Interface or ~ NUI Group is an interactive media group researching and creating open source machine sensing techniques to benefit artistic and educational applications."
"We offer a collaborative environment for developers that are interested in learning and sharing new HCI (Human Computer Interaction) methods and concepts. This may include topics such as: voice/handwriting/gesture recognition, touch computing, computer vision, and information visualization."
The NUI Group has been selected as a mentoring organization for Google's Summer of Code, for those of you who are interested in working on open-source code for multi-touch systems. The student application process begins Monday, March 24th, 2008, and ends Monday, March 31st, 2008.
The NUI Group's project ideas page outlines the requirements for the application, which include a 7500-word project proposal. The project page has a long list of ideas to spark some thinking for potential Summer of Code applicants.
For those of you who aren't into coding, I encourage you to take a look at the NUI Group's project ideas page just to get an idea of the interesting ideas that are being explored. The page has a list of links to other good resources.
Spread the word to anyone who might be interested in the NUI Group's projects for the Summer of Code. We need to get more people interested in STEM careers, and the project ideas outlined by the NUI Group look enticing.
Focused on interactive multimedia and emerging technologies to enhance the lives of people as they collaborate, create, learn, work, and play.
Mar 17, 2008
Jun 6, 2013
Interactive Displays and "Billboards" in Public Spaces; Pervasive Displays 2013
The 2013 International Symposium on Pervasive Displays (PerDis 2013) recently convened in Mountain View, California. Since I couldn't attend this conference, I was happy to learn from Albrecht Schmidt that the conference proceedings were recently uploaded to the ACM Digital Library. There are many exciting things going on in this interdisciplinary field!
Researchers involved with the Instant Places project, described in the video below, presented their work at PerDis 2013. The Instant Places project was part of PD-Net, a series of research efforts exploring the future of pervasive display networks in Europe. (See the "Related" section for additional references and links.)
Instant Places: Tools and Practices for Situated Publication in Display Networks
Below is information from the Instant Places video and website:
"The video describes a novel screen media system that explores new practices for individual publication and identity projection in public digital displays."
"Instant Places has been developed by the Ubicomp group of the Information Systems Department, at the University of Minho, and has been funded within the scope of pd-net: Towards Future Pervasive Display Networks, by the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement no. 244011."
Saul Greenberg was the keynote speaker at PerDis 2013. His keynote, "Proxemic Interactions: Displays and Devices that Respond to Social Distance", highlights how far off-the-desktop our digital/physical lives have become, and how this has influenced recent research in human-computer interaction. Saul is a professor at the University of Calgary and leads research in Human Computer Interaction, Computer Supported Cooperative Work, and Ubiquitous Computing.
Although the video of Saul Greenberg's presentation below is not from PerDis 2013, it touches on the same topics and is worth taking an hour to watch. In this video, Greenberg presents an overview of the history of human-computer interaction. He also discusses how an understanding of social theory, the perception of spatial relationships, and embodied interaction can be applied to the design of natural user interfaces and interactive systems. Useful examples of interaction design explorations, within an ecological context, are provided later in the video.
Proxemic Interactions: the New Ubicomp?
RELATED
My Backstory
Regular readers of this blog know that the subject of interactive displays in public spaces holds my interest. When I was taking computer courses during the mid-2000s, I focused some of my energy on projects designed for large interactive displays, inspired by reading articles like "Physically Large Displays Improve Performance on Spatial Tasks" (Desney S. Tan, Darren Gergle, Peter Scupelli, and Randy Pausch) and "Dynamo: public interactive surface supporting the cooperative sharing and exchange of media" (Shahram Izadi, Harry Brignull, Tom Rodden, Yvonne Rogers, Mia Underwood).
Jeff Han's 2006 TED talk was another inspiration. I remember my excitement as I watched his demonstration of an interactive multi-touch screen the size of a drafting board, before the iPhone/iPad was born. Another inspiration was Hans Rosling's TED Talk about health statistics, with his animated interactive data visualizations presented on a huge screen.
The following year, I stumbled upon the NUI Group while searching for information about multi-touch displays, and was inspired by many of the early members of the group. I also became acquainted with a world-wide network of people who share similar interests, such as Albrecht Schmidt and his team of researchers at the University of Stuttgart. This busy group recently presented at PerDis 2013 and at CHI 2013 and is involved in a wide range of ongoing projects.
INTERACTIVE DISPLAYS
Alt, F., Sahami, A., Kubitza, T., Schmidt, A. Interaction Techniques for Creating and Exchanging Content with Public Displays. Proceedings of the 2013 ACM Annual Conference on Human Factors in Computing Systems (CHI 2013)
Hinrichs, U., Carpendale, S., Valkanova, N., Kuikkaniemi, K., Jacucci, G., Vande Moere, A. Interactive Public Displays. IEEE Computer Graphics and Applications, Vol. 33(2), 2013 (25-27)
PerDis 2013 Program
Sample Papers:
Otero, N., Muller, M., Alissandrakis, A., and Milrad, M. Exploring video-based interactions around digital public displays to foster curiosity about science in the schools. PerDis 2013 (pdf)
Alt, F., Schneegass, S., Girgis, M., Schmidt, A. Cognitive Effects of Interactive Public Display Applications. Proceedings of the 2nd ACM International Symposium on Pervasive Displays. 2013
Langheinrich, M., Schmidt, A., Davies, N., and José, R. A practical framework for ethics: the PD-net approach to supporting ethics compliance in public display studies. Proceedings of the 2nd ACM International Symposium on Pervasive Displays. 139-143
Note: Members of ACM have access to all of the proceedings of PerDis 2013 in the ACM Digital Library. Non-members have access to the abstracts.
PD-NET
PD-Net
PD-NET Publications - a great reference list, with links to many papers
Reading List on Pervasive Public Displays
About Instant Places
About the Living Lab for Screens Set
DOOH-DIGITAL OUT-OF-HOME
Digital Out of Home (DOOH): Screens Large and Small at the Mall (and some touch-screen Coke machines!) Interactive Multimedia Technology, 1/1/10
Daily Digital Out of Home post "Billboards That Look Back": Could miniature cameras embedded in ads lead to Big Brother at the mall? The World Is My Interactive Interface, 5/28/08
J. Müller et al., "Looking Glass: A Field Study on Noticing Interactivity on a Shop Window," Proc. 2012 SIGCHI Conf. Human Factors in Computing Systems (CHI 12), ACM, 2012, pp. 297–306
Michelis, D., Meckel, M. Why Do We Want to Interact With Electronic Billboards in Public Space? First Workshop on Pervasive Advertising, Pervasive 2009, 5/11/09
The Rage of Interactive Billboards
The Print Innovator, 11/28/12
10 Brilliant Interactive Billboards (Videos)
Amy-Mae Elliot, Mashable, 8/21/11
SOME INTERESTING EARLIER WORK
Jeff Han's 2006 TED Talk (This is worth revisiting, as it came out before the iPhone, iPad, etc.)
Tan, D.S., Gergle, D., Scupelli, P., Pausch, R. Physically large displays improve performance on spatial tasks. ACM Transactions on Computer-Human Interaction, V13(1) 2006 (71-99)
Revisiting promising projects: Dynamo an application for sharing information on large interactive displays in public spaces (blog post)
Lynn Marentette, Interactive Multimedia Technology, 09/16/07
Brignull, H., Izadi, S., Fitzpatrick, G., Rogers, Y., Rodden, T. The introduction of a shared interactive surface into a communal space. Proceedings of the 2004 ACM conference on Computer supported cooperative work (CSCW'04), Chicago, ACM Press, 2004 (pdf)
Izadi, S., Brignull, H., Rodden, T., Rogers, Y. and Underwood, M. Dynamo: public interactive surface supporting the cooperative sharing and exchange of media. In Proc. User Interface Software and Technology (UIST '03), Vancouver, ACM Press, 2003, 159-168. (pdf)
Proxemics (Wikipedia)
Why Do We Want to Interact With Electronic Billboards in Public Space?
Posted by
Lynn Marentette
Dec 14, 2008
Links to videos of natural user interfaces - multi-touch & gesture
Here are links to two recent posts about multi-touch and gesture interaction on the Technology-Supported Human-World Interaction blog:
Jonathan Brill's Point& Do: "Your Guide to Natural User Interfaces"
More from Lm3labs: Ubiq'window & Reactor.cmc's touch screen shopping catalog, AUDI's touch-less interactive showroom screen, and the DNP Museum Lab
Oct 31, 2009
Sensory-Mind's Ring Wall, an interactive multi-touch wall you don't even have to touch!
Sensory-Minds is a small company in Germany that focuses on research and design in the field of Natural User Interfaces. If you visit the SENSORY-MINDS website, you'll find that it has been designed for touch interaction.
ring°wall from SENSORY-MINDS on Vimeo.
Information about the Ring Wall from Sensory-Mind's Vimeo site:
".....The two-piece ring°wall consists of a LED display and a multitouch information-wall and impresses by its size: a total surface of 425 square meters, which equals more than 6000 computer displays, is the biggest of its kind. An interactive World emerges out of 34 million pixels generated by 15 high definition projectors and is supported by 30 directional speakers.
Multitouch sensors basing on laser technology, convert the usage of the natural user interface into an experience. By direct touching, more than 80 users can simultaneously get informed about news and activities around the ringworld.
The interactive wall is not only a central information system, but also an innovative advertising tool and medium for public viewings."
Heiko Hoffman of Sensory-Minds recently joined the NUI-Group. Here is a response to a question on the NUI-group forum about the way the system's sensors work:
"The sensor works like spinning radar gun, or like a wiper in a car, that means like a radar gun you get the distance to the object. It’s not quiet difficult when you know the position of the sensor and the degree from the beam to get the X,Y position. Yes, you don’t need to touch the surface but we arranged it that the radar beam is very close to the screen so it seems that you have to touch the surface.
At the moment the system (has) limitations because each sensor (gets) coordinates and this means that you got shadows. This is no problem because the people interact very fast. We are working on a system where the data from the sensors were put together and that would be the solution for this problem."
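In other words, each sensor sweeps a laser beam just above the screen surface and reports, for each sweep angle, the distance to the first object the beam hits; converting that polar reading to a screen position is basic trigonometry. Here is a minimal sketch of the idea (the sensor mounting coordinates are hypothetical, not ring°wall's actual geometry):

```python
import math

def beam_to_xy(distance, angle_deg, sensor_x=0.0, sensor_y=0.0):
    """Convert a radar-style beam reading (distance along the beam at a
    given sweep angle) into an X,Y touch position on the wall.

    sensor_x/sensor_y are the sensor's mounting position -- hypothetical
    coordinates for illustration only.
    """
    angle = math.radians(angle_deg)
    return (sensor_x + distance * math.cos(angle),
            sensor_y + distance * math.sin(angle))
```

The "shadow" limitation Hoffman mentions follows directly from this model: a beam only reports the first object it reaches, so a hand nearer the sensor hides anything behind it along the same angle, which is why fusing data from multiple sensors would fix it.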
Note:
I'll be writing a few more posts updating the work of past and present NUI-group members as well as news from the commercial multi-touch & gesture community. Be sure to check back soon, or subscribe to this blog!
May 6, 2010
Link to Innovative Interactivity (II) & post: SMART Technologies' acquisition of NextWindow: A "smart window" to the world
I am happy to announce that I will be contributing a post bi-weekly, on the 1st and 3rd Thursday of each month, for the Innovative Interactivity (II) blog.
In my first post, I discuss interactive technologies in education and the explosion in the number of interactive whiteboards making their way into classrooms in the US and around the globe. The motivation for this post came from last week's announcement of SMART Technologies' acquisition of NextWindow. Not long after the announcement, I had a chance to speak with Al Monro, CEO of NextWindow, and Nancy Knowlton, CEO of SMART Technologies. I share some of their insights in my post:
SMART Technologies' acquisition of NextWindow: A "smart window" to the world
About Innovative Interactivity (II):
"Innovative Interactivity serves as an open forum for multimedia producers, web developers and all other visualization junkies. Content focuses on the dynamics and theory of how people receive and react to different forms of information on the Web, both through visual, multimedia storytelling and interactive data visualization."
"The goal is that this blog will provide an outlet for those in the online realm, whether you are interested in learning about multimedia, interactivity, programming languages, data visualization, or all of the above. Hopefully you will be inspired from what you read here to surpass your current web standards in order to develop highly effective multimedia interactives for the digital community."
Tracy Boyer | Founder & Managing Editor
Tracy Boyer is an award-winning multimedia producer, specializing in interactive Web development and multimedia storytelling. Currently, she is a dual master's candidate (MBA/MSIS) at UNC-Chapel Hill where she is studying Human-Computer Interaction in the School's Information Science program and Entrepreneurship at Kenan-Flagler Business School.
Previously, she was a multimedia producer at Roanoke.com, served as the UNC correspondent for CNN.com and interned with The Atlanta Journal-Constitution. In 2007, she was selected to participate in the Poynter Summer Fellowship. Boyer graduated with a multimedia degree from UNC's School of Journalism and Mass Communication. Her passions lie in travel and multimedia production with a focus on video, audio and interactive graphics. See more of her work at www.tracynboyer.com.
Boyer is available for speaking engagements and seminars. Please contact her for more information.
Andrea Ballocchi | Spanish Editor & Social Media Manager
Andrea Ballocchi is a Chilean journalist. She studied video production at the Art Institute in Fort Lauderdale, Fla., and worked as an assistant producer for Sábado Gigante on channel 13 in Santiago, Chile, and as a producer at Anticipa, an internet company also in Chile.
In 2005, she participated in the multimedia project "The Ancient Way", in Spain. Since then she has participated in several other projects, including "Chasing Crusoe," "Atacama Stories," "Special Olympics in Shanghai and Idaho," and "South of Here." She has also taught and coordinated projects at Universidad de los Andes, Chile.
Andrea Ballocchi is currently a multimedia journalism graduate student at the University of Miami, Fla. and works at The Knight Center for International Media.
Ellen Peck | Contributor, Philanthropic Multimedia
Posts publish bi-weekly on the 2nd and 4th Thursdays of each month
Ellen Peck works as a consultant with a focus on the non-profit sector. Ellen worked for over 15 years with Save the Children as a Director of Development, and also created and managed their Emergencies and Crisis Leadership Council. She has worked with start-up non-profits, and first-time capital campaigns for more established organizations. She has been involved in fundraising and strategic partnerships with individuals, corporations, foundations and organizations, and in strategic planning with artists, agency project teams and senior management at organizations for new initiative development.
Ellen also serves in the role of producer and creative/content advisor to documentary film, music and other projects in the arts associated with social and environmental issues. Her projects include the film and soundtrack for "Born into Brothels," (USA) which garnered the Oscar for Best Documentary in 2005, and for "Stolen," (Australia) currently showing at international film festivals.
Ellen is a Liberal Arts grad with a degree from Amherst College.
Lynn Marentette | Contributor, Academic Multimedia
Posts publish bi-weekly on the 1st and 3rd Thursdays of each month
Lynn Marentette has over 20 years of professional experience as a school psychologist, and has used interactive multimedia applications and games with students who have disabilities since the early 1990s. She is passionate about emerging collaborative technologies, natural user interactions and interfaces, and how the power of interactive multimedia can be harnessed in education. In addition to her work as a psychologist, she blogs about accessible off-the-desktop natural-user interfaces at "Interactive Multimedia Technology."
Lynn has presented on topics such as multimedia thinking and learning, universal usability of interactive multimedia, and universal design and accessibility for games. She returned to school a few years ago to learn how to make interactive multimedia applications and games for the web, handheld devices, and large displays, and has taken graduate courses such as game design, ubiquitous computing, and virtual reality in preparation for a potential PhD in Information Technology at UNC-Charlotte.
Mar 11, 2013
Leap Motion: My Dev Kit Arrived - Now What?! Thoughts About "NUI" Child-Computer-Tech-Interaction - and More
I'd like to create a simple cause-and-effect music, art and movement application for my 2-year-old grandson, knowing that he'll be turning three near the end of this year. It would be nice if my app could provide young children with enough scaffolding to support gameplay and learning over a few years of development.
Now that I'm a grandmother, I've spent some time thinking about what the evolution of NUI will mean for young children like my grandson. Family and friends captured his first moments after birth with iPhones, and shared them across the Internet. Born into the iWorld, he knows how to use an iPad or smart phone to view his earlier digital self on YouTube, without ever touching a mouse or a physical keyboard.
The little guy is pretty creative in his method of interacting with technology, as I've informally documented on video. He was seven months old when he first encountered my first iPad. It was fingers-and-toes interaction from the start.
In the first picture below, he's playing with NodeBeat. In the second picture, he's 27 months old, experimenting with hand and foot interaction, on a variety of apps.
My grandson is new to motion control applications, so I'm just beginning to learn what he likes, and what he is capable of doing. A couple of weeks ago, we played River Rush, from the Kinect Adventures game. He loved jumping up and down as he tried to hit the adventure pins. Most of the time, he kept jumping right out of the raft! (I think next time we'll try Kinect Sesame Street TV, or revisit Kinectimals.)
One of the steps I'm taking to prepare for my Leap Motion adventure is to take a look at what people have done with it so far. At least 12,000 developer kits have been released, so hopefully there will be some interesting apps to go along with the retail version of Leap Motion when it is released at Best Buy on May 19th of this year.
One app I really like is Adam Somer's AirHarp, featured in the video clip below:
I also like the idea behind the following app, developed by undergraduate students:
Social Sign: Multi-User sign language gesture translator using the Leap Motion Controller (git.to/socialSign)
"Built at the PennApps Spring 2013 hackathon, Social Sign is a friendly tool for learning sign language! By using the Leap Motion device, the BadApples team implemented a rudimentary machine learning algorithm to track and identify American Sign Language from a user's hand gestures."
"Social Sign visualizes these hand gestures and broadcasts them in textual and visual representations to other signers in a signing room. In a standard chat room fashion, the interface permits written communication but with the benefit of enhanced learning in mind. It's all about learning a new way to communicate."-BadApples Team
There are a few NUI-focused tech companies that have experimented with Leap Motion. Today, I received a link to the following video clip from Joanna Taccone, of IntuiLab, featuring their most recent work:
Gesture recognition with Leap Motion using IntuiFace Presentation
"Preview of our work with the Leap Motion controller. In the same spirit as our support for Microsoft Kinect, we have encoded true gesture support, not just mouse emulation, for the creation of interactive applications by non-programmers. The goal is to hide complexity from designers using our product, IntuiFace Presentation (IP). Through the use of IP's trigger/action syntax, designers simply select a gesture as a trigger - Swipe Left, Swipe Right, Point, etc. - and associate that gesture with an action like "turn the page" or "rotate the carousel". As you can see in this video, it works quite well. :-) We will offer Leap support as soon as it ships." -IntuiLab
Below is a demonstration of guys playing Drop Chord, a collaboration between Leap Motion and Double Fine. From the video, you can tell that they had a blast!
Here is an excerpt from the chatter: "The thing is that everyone just looks cool..Yeah, I know, it doesn't matter what you are doing...it's got the right amount of speed-up-slow-down stutter-y stuff...it is like a blend of art and science.."
According to the website, Drop Chord is "a music-driven score challenge game for the Leap Motion controller, coming soon for PC, Mac, & iOS from the creators of Kinect Party."
The following video is a demonstration of the use of Leap Motion to control an avatar and other interaction in Second Life:
Below are a few more videos featuring Leap Motion:
Control Your Computer With a Chopstick: Leap Motion Hands On (Mashable)
The Leap Motion Experience at SXSW 2013
LEAP Motion demo: Visualizer, Windows 8, Fruit Ninja, and More...
RELATED
Air Harp for Leap Motion, Responsive Interaction
Leap Motion and Double Fine team on Dropchord, give air guitar skills an outlet
John Fingas, Engadget, 3/7/13
Leap Motion Controller Set To Ship May 13 for Global Pre-Orders, In Best Buy Stores May 19.
Hands on With Leap Motion's Controller
Lance Ulanoff, Mashable, 3/10/13
Leap Motion website
Social Sign
IntuiLab
Leap Motion: Low Cost Gesture Control for Your Computer Display
SOMEWHAT RELATED
Kinect for Windows Academic: Kaplan Early Learning
"3 years & up. Hands-on play with a purpose -- the next generation way. This unique learning tool uses your body as the game controller making it a great opportunity to combine active play and learning all in one. Use any surface to actively engage kinesthetic, visual, and audio learners. Bundle includes the following software: Word Pop, Directions, Patterns, and Shapes."
Comment:
I've been an enthusiastic supporter of natural user interfaces and interaction for years - back in 2007 I worked on touch-screen applications for large displays as a graduate student, and became an early member of the NUI Group. I'm also a school psychologist, and from my experience, I understand how NUI-based applications and technologies, such as interactive whiteboards and touch tablets like the iPad, can support the learning, communication, and leisure needs of students who have significant special needs. It looks like Leap Motion and similar technologies have the potential to support a wide range of applications that target special populations of all ages.
Posted by
Lynn Marentette
Labels:
AirHarp,
App,
controller,
Double Fine,
Dropchord,
game,
gesture,
HCI,
Intuilab,
kinect,
Leap Motion,
Linden,
Motion Controller,
music,
NUI,
second life,
Social Sign,
SXSW,
technology
Jun 26, 2010
A few links: GizmoWatch's 10 Interactive User Interfaces for the Future, CNN's Eatocracy, EVA 2010 and More!
Here's a quick link to a recent GizmoWatch post, Ten Interactive User Interfaces for the Future. Bharat, the author, reviews a variety of interfaces, input methods, and interaction techniques, such as Skinput, a water-based touch screen, a muscle-computer interface, air gestures, brain-computer systems, and even a mud-tub interface.
I was fortunate to see some of these interaction techniques and interfaces when I attended CHI 2010 this past April, and plan to share some of my photos and video clips from the conference on this blog soon.
Totally Unrelated
Online connection for foodies
Eatocracy is a new website within the CNN pages that provides news (and more) about all things related to food. The categories on the site include "main", "news", "bite", "sip", "make", "think", and "buzz". The best part, in my opinion, is the heirloom recipe collection index, where people can upload and share family recipes and the stories behind them.
Here is the description of Eatocracy from the website:
Eatocracy "is your online home for smart, passionate conversation and information about food news, politics, culture. We'll highlight regional and family recipes, dive into restaurants and food shopping, chat with celebrity and local chefs, and show you what's for dinner around the world tonight. Grab a place at the table and read with your mouth full."
Enjoy!
(The above is a repost from The World is My Interactive Interface)
Coming Soon
--More about 3D TV and Interactive TV
--Highlights from CHI 2010 (better late than never!)
--My experiments- SMARTTable, a game, interactive timeline prototype pictures...
--A post about Lieven van Velthoven's interesting Post-WIMP explorations - here are some links that he recently sent me:
- 3D birthday invitation: http://www.youtube.com/watch?v=pNOP5aIoS9o
- Virtual overgrowth in the real world: http://www.youtube.com/watch?v=nuJPoUpBIKU
- Make race tracks for virtual cars with any real stuff you can find: http://www.youtube.com/watch?v=e1K3og8qLcA
- Smoke Pong: Pong, ONLY for smokers: http://www.youtube.com/watch?v=zwrsVo5p2_E
- Webcam Airmouse: http://www.youtube.com/watch?v=57kTrKJHOjM
- MusicalNodes: the visual music library: www.musicalnodes.com, http://www.youtube.com/watch?v=czQdhfA__UY (will be published at EVA 2010)
As I took a peek at Lieven's video links, I noticed an interesting video mash-up Lieven created using the open-source code from Radiohead's House of Cards music video and his One Million Particles app. I'll post it soon.
I'll try to get video, pictures, and commentary about EVA 2010. EVA stands for Electronic Visualization and the Arts: "Electronic Information, the Visual Arts, and Beyond."
FYI
I'm in the process of sorting through and re-organizing my blogs, which have been around for over four years! During this time, my blogs have attracted a growing number of readers. Because of this, I'd like to make things a bit more user-centered. So expect to see little changes here and there. I promise I'll give my readers advance warning if I make any serious changes!
If you are new to this blog, you should know that my blogs started out as on-line filing cabinets, open to the world. Although there is a bit of overlap of material and some cross-posting between the blogs, they are arranged to serve as a paper-less way of keeping track of things that I've learned through my coursework, conference attendance, readings, and research. Since emerging technologies are high on my list of interests, I also use my blogs to share interesting things that cross my path.
I changed the name of my World Is My Interface blog to The World Is My Interactive Interface. "Off-the-desktop natural user interfaces, interaction, and user experience" are the main topics of the blog. It sometimes includes information about ubiquitous computing and DOOH, otherwise known as Digital Out Of Home.
I plan to tinker with my TechPsych blog later on. It focuses on topics that are useful to psychologists, educators, special education teachers, speech and language therapists, health and wellness professionals, and parents.
Feel free to leave comments, as I welcome your input.
Posted by
Lynn Marentette
Jul 12, 2012
TechCrunch Charlotte Highlights: T1 Vision; inTouch Collaborative Software
Yesterday evening I attended a meetup of TechCrunch and Charlotte-area techies, held at the uptown Packard Place entrepreneurial center. It was jam-packed with people all abuzz with tech start-up fever, fueled by awesome food provided by Zen Fusion. Although my main purpose for attending the TechCrunch meet-up was to learn more about innovative technology start-ups in my region, I was also hoping to capture a few shots of interesting people. I like to keep my eye open for tee-shirt slogans, and one worn by a young gentleman caught my eye, proclaiming that he'd seen the future, and it is in his browser. On the back of his tee-shirt was a bright HTML5 logo, something that is dear to my heart, as I am moving from HTML4 to HTML5. He was polite and agreed to pose for a couple of photographs:

It turned out that the HTML5 guy was at the TechCrunch event with one of his colleagues from T1 Visions, a social touchscreen solutions company that I've featured previously on this blog. They caught me up on the growth of this start-up company, which now has 15 employees and has broadened its reach beyond table-top restaurant applications to the healthcare, education, corporate, retail, and broader hospitality sectors.
What I like about table-top systems is that they provide support for "natural user interaction". They allow for multiple modes of interaction with, and presentation of, multimedia content. Over the past several years, these systems have proven useful to a wide range of people and settings. Interfaces that support touch and gesture interaction are no longer viewed as novelties, given the pervasiveness of touch-phones and tablets and their ease of use for most people.
To learn more about what T1 Visions has to offer, take a few minutes to view the following videos and follow the links at the end of this post!
Demonstration of how the collaboration table can work within a business environment:
Demonstration of the T1VISION touch wall:
T1 Visions Gallery
T1 Visions: Social Touchscreen Solutions
Interactive tabletops bring people together
Marty Minchin, Charlotte Observer, South Charlotte News, 2/20/12
Interactive Technology in the Carolinas: T-1 Visions Update
NOTE:
TechCrunch is a technology media group founded in 2005 that focuses on innovative technologies. This summer, a group of TechCrunchers are visiting cities in the south that were previously not under their radar, such as Savannah and my home region, Charlotte, N.C. The Charlotte TechCrunch meetup was held on Wednesday, July 11, 2012. I plan to devote a few more blog posts to share what I learned.
Posted by
Lynn Marentette
Labels:
Charlotte,
interactive systems,
inTouch,
media,
multimedia,
Packard Place,
social,
south,
start-up,
T1,
T1Visions,
techcrunch,
technology,
touchscreen,
Visions
Aug 21, 2010
Reflections About Interactivity in my Present World
Background: For those of you who are new to this blog, I work as a school psychologist in a public school district in North Carolina. During the mid-2000s, I took a variety of computer courses, leading up to a decision to work part time while taking graduate courses in human-computer interaction, games, information visualization, ubiquitous computing, VR, and related topics. I returned to work full time in 2008, due to the economic downturn.
I continue to maintain my passion for technology that supports natural user interaction, and share my interests on this blog and to a lesser extent, two other blogs. I initially started blogging because it was a requirement for one of my courses, and found that my blogs served me well as on-line filing cabinets.
Update: I don't usually post much about my work on this blog, but I have some exciting things to share.
New School Year, New SMARTBoard: I am excited about the upcoming school year because my "home" school, Wolfe, a program that serves mostly teens and young adults up to age 22 who have more severe disabilities (including autism), now has an interactive whiteboard in every classroom.
School started for Wolfe students on July 26th, and three more whiteboards were installed. One of the new SMARTboards is in the speech and language room, along with a multi-touch SMARTTable the school received last April, one is in the Art/Work Adjustment room, and one is in the room I use with small groups of students, which is used for music one day a week.
Although I have used interactive whiteboards in my work with students for several years, this is the first year that I have had one available to me most of the time in a school. Since most of the teachers received their IWBs over the past year, there is a pressing need to figure out what works, why it works, and how to create useful interactive content that is appropriate for the needs of students who have severe or multiple disabilities. We really can't wait.
Why is this important to me?
Observations of positive changes:
Students who previously had limited interest in computer applications, or a limited attention span for such activities, demonstrate a high degree of interest and engagement with the IWB activities we've tried so far. Many students who do not attend to learning materials, or attend only for brief periods of time, actively look at the IWB screen, interact with it, and smile. Some students who rarely (or never) speak sometimes vocalize more often during IWB activities! Some are making connections between learning materials and what they interact with on the screen, something that simply did not happen previously. Some students who rarely established eye contact with others or demonstrated joint attention do so during IWB activities, something I discussed in a post on the TechPsych blog.
Technology and my role as a school psychologist:
Designing effective interventions is part of my job as a school psychologist, and mixing interactive whiteboards and interventions is fairly uncharted territory. I know that there are probably a few other school psychologists, counselors, social workers, and speech/language therapists out there who are wondering what to make of this technology for the students they serve directly, or indirectly through consultation. There isn't much literature about this topic, so it is difficult to know what is truly "evidence based".
We are in the process of discovery. Within a school that has an IWB in every classroom, serving students with the most complex needs within the school district, how can I best follow professional school psychology practices (pdf) regarding the provision of direct and indirect services? Some of these expectations are listed below:
"School psychologists promote the development and maintenance of learning environments that support resilience and academic growth, promote high rates of academic engaged time, and reduce negative influences on learning and behavior."
"School psychologists use information and assistive technology resources to enhance students’ cognitive and academic skills."
"School psychologists work with other school personnel to develop, implement, and evaluate effective interventions for increasing the amount of time students are engaged in learning."
"School psychologists facilitate design and delivery of curricula to help students develop effective behaviors, such as self-regulation and self-monitoring, planning/ organization, empathy, and healthy decision-making."
I work closely with my colleague Kelly Cross, a speech and language therapist who serves Wolfe school and two classrooms of students with severe autism at a "regular" elementary school. Kelly also serves as the assistive technology and augmentative communication consultant for our school district. She's used interactive applications and web resources for many years along with more traditional "hands-on" materials, but like me, has had limited access to IWB's until recently. She now has a SMARTboard in her room, along with the portable multi-touch SMARTTable Wolfe received last April. We work with many of the same students, so with the influx of IWBs in our school, we've stepped up our collaboration.
One of Kelly's challenges is to figure out ways for teachers to integrate assistive technology and/or augmentative communication systems into their work with students during activities that involve interactive whiteboards. Most of the research in the area of assistive technology/aug com focuses on the use of technology to access applications as they are displayed on laptops or computer monitors, or on factors related to the use of individual communication devices. The vSked project, led by Dr. Gillian Hayes at the University of California, Irvine, is on the right track in that it incorporates a large interactive display used with students with autism, along with smaller hand-held communication systems; I've discussed it in a previous post.

vSKED in action
Workshop Presentation
This past week, Kelly and I presented at a workshop held in the Arlington VA school district that focused on the use of interactive whiteboards and related applications and web resources with students with special needs. We shared some of the exciting things we've noticed with our students and shared "before and after" examples. Included in the presentation were some of the video clips I quickly put together during the first days of school that had an impressive positive impact with students when they were used during interactive whiteboard activities. (Some of these video clips can be found in a previous post: Video clips that help students with autism learn and feel calm at the same time!)
During the workshop, we discussed a few guiding principles, such as the "least dangerous assumption" and the concept of Universal Design for Learning. We also provided a sampling of resources previously used on the "small screen" that appear to be powerful when students interact with them on the larger whiteboard screens. Below are a few of the resources we shared:
Clicker5 is an application to support reading and writing that provides natural voice output and a child-friendly word processor. It works well on interactive whiteboards and can incorporate several communication symbol systems. It is a good tool for informal assessment of students with communication and other related needs.

Sensory World provides an intuitive navigation system for students to use as they explore various areas of a house and engage in learning and related activities. The site map for all of the areas of the house, as well as the activities, is graphical.
We shared resources from the National Gallery of Art's Kids Zone, a great website that offers plenty of interactive content:
For teachers who want a limited browsing environment for their students, the Zac Browser is great for use on IWBs and the new large all-in-one touch screen displays:
"The intention is that symbol readers, teachers and carers will contribute, and we are delighted at the number of contributors that have already sent us their work"
We shared a recent find, Taptu, which can be used on an IWB. Although Taptu focuses on the mobile touchable web, accessed through smartphones with internet access, it provides links to a wide range of websites designed with intuitive touch interaction in mind, perfect for students with special needs. In many cases, Taptu web navigation resources work fine on the larger screen of an IWB. The advantage is that an icon/visual-based touch navigation system supports students who can't type, as well as those who have very limited reading skills.

Sports Navigation for Taptu
Below is a slideshow about the touchable web:
Taptu: Virtual Roundtable
Update: SMARTTable
At Wolfe, we'd like to create a conversation wheel with related activities for the SMARTTable that co-ordinate with SMARTBoard activities. We'll probably experiment with the conversation wheel concept on the SMARTboard first.
Considerations:
One thing we've learned over the past few months is that it is a bit more complicated to create content and activities for the SMARTTable than for a SMARTBoard notebook or Promethean flipchart. We hope to customize it to support scaffolded classroom activities for students with special needs, but it might take more time than we had anticipated. The school's immediate need is for touchable interactive content, suitable for our students, for all of the classroom IWBs.
Complications:
Programming for the SMARTTable, for those who have 64-bit computers, requires Windows 7, Visual Studio 2008, and Expression Blend 3. My HP Touchsmart, my home computer, is 64 bit, but runs Vista. My school-issued laptop runs Windows XP and is also 64-bit. To upgrade it, I will have to send it to the tech department for several weeks. It will be slow-going, since I have to plan for my day-to-day activities and evaluations with students first.
Possible SMARTTable Solutions:
I'm hoping that some students from UNC-Charlotte might want to try their hand at multi-touch programming and help us out! We'd welcome volunteers from other universities as well. Wolfe's principal, Mary Jo Breckenridge, is very supportive of the use of innovative technologies with students with special needs, and would figure out a way to make a collaboration happen.
Upcoming Interactive Multimedia Technology Posts
I have about 6 posts in draft form about interesting interactive technologies, research, people, and companies involved with multi-touch, interactive multimedia, and natural user interface/interaction applications. I'll get them up as soon as possible.
One post is about 3M Touch Systems. Another is about an interesting multi-touch start-up company, Osmosis. I revisited Ballantyne Village to update the use of interactive touch screen technology, something I discussed in detail, along with photos and video clips, in a 2008 post, Technology Supported Shopping and Entertainment User Experience at Ballantyne Village: "A" for concept, "D" for touch-screen usability. I got some inside scoop about the rationale behind the changes at the upscale center from a seasoned owner of a high-end audio-visual store whom I interviewed about 3D TV. I have some interesting information about current research in "glasses-less" 3D displays. Sharp will be coming out with a 3D cell-phone camera AND a glasses-less cell-phone display in the not-too-distant future.
I'm preparing for posts that highlight a few of my favorite blogs. Until then, take a look at the recent posts on Tracy Boyer's Innovative Interactivity blog, and also InteractiveTV Today.
RELATED & SOMEWHAT RELATED
"The AAC-RERC conducts a comprehensive program of research, development, training, and dissemination activities that address the NIDRR priorities and seek to improve technologies for individuals who rely on augmentative and alternative communication (AAC) technologies. The mission of the AAC-RERC is to assist people who rely on augmentative and alternative communication to achieve their goals by advancing and promoting AAC technologies and supporting the individuals who use, manufacture, and recommend them."
Janet Light, Kathy Drager, Penn State University
Posted by
Lynn Marentette