
Mar 15, 2014

Graphene, Nanotechnology, and Programmable Interfaces; Samsung Galaxy Demo


I've been intrigued by graphene's multiple possibilities for the future. It is a flexible, programmable material that harnesses nanotechnology to create flexible touch screens, "wearables", efficient energy storage systems, and more. The following videos provide just two examples of graphene's potential.

The details?  If you are curious, follow the links at the end of this post.  




Here is a short clip of a demo of a graphene touch screen on a Samsung Galaxy:


RELATED
Graphene nanoribbons could be the savior of Moore's Law
Ryan Whitwam, Extreme Tech, 2/17/14
High-Performance Multifunctional Graphene Yarns: Toward Wearable All-Carbon Energy Storage Textiles
ACS NANO, 2/11/14
Hydrogenation-Assisted Graphene Origami and Its Application in Programmable Molecular Mass Uptake, Storage, and Release
Shuze Zhu and Teng Li, University of Maryland, ACS Nano, 2/24/14
Teng Li Group, University of Maryland
Chemically and structurally functionalized graphene for real-world applications
Marko Spasenovic, Graphenea, 3/06/14
Nanoscale graphene origami cages set world record for densest hydrogen storage
Kurzweil Newsletter, 3/14/14
Auto-switchable graphene bio-interface with a 'zipper' nanoarchitecture
Onur Parlak, Anthony P.F. Turner, Ashutosh Tiwari, Nano Werk 10/31/13
Samsung files patent for graphene-based touch screen
Marko Spasenovic, Graphene Tracker, 3/7/14
Graphene: Wikipedia
Graphene: Flexible touch screen, made from a sheet of carbon the thickness of one atom!
Lynn Marentette, Interactive Multimedia Technology blog, 6/23/10


May 27, 2013

Leap Motion and Google Earth Experiment: Cute Doggie Photo-globe Mashup


I finally experimented with my Leap Motion controller and Google Earth, using a mashup I created a few years ago with pictures of cute dogs from my Flickr photo-stream.  In the video below, you can see that my gesture navigation skills still need some practice!

I should have watched the following video of Leap Motion in action with Google Earth before trying this experiment at home : )  

I am pretty sure that developers will be able to tweak Leap Motion + Google Earth interaction in the near future.  I'd like to adapt it for use with kids as well as adults who have mild motor impairments.
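One adaptation worth sketching is a "dead zone" around the resting hand position, so that small tremors don't move the globe. The function below is a hypothetical illustration of that idea in Python; the coordinate conventions loosely follow the Leap controller's millimeter-based hand positions, but the thresholds and the pan/zoom command names are my own assumptions, not the actual Leap Motion or Google Earth API.

```python
# Hypothetical sketch: turning a palm position into a coarse Google
# Earth navigation command. The Leap controller reports hand positions
# in millimeters above the device; here, ~200 mm is treated as the
# neutral resting height. The dead zone and command names are
# illustrative assumptions.

def interpret_palm(x_mm, y_mm, dead_zone=20):
    """Map a palm position to a navigation command.

    A dead zone around the neutral position keeps small hand tremors
    from moving the globe -- useful for users with mild motor
    impairments.
    """
    if abs(x_mm) <= dead_zone and abs(y_mm - 200) <= dead_zone:
        return "hold"
    # Whichever axis the hand has moved farther along wins.
    if abs(x_mm) > abs(y_mm - 200):
        return "pan_east" if x_mm > 0 else "pan_west"
    return "zoom_in" if y_mm < 200 else "zoom_out"

print(interpret_palm(5, 205))   # small wobble: stays put
print(interpret_palm(80, 210))  # clear sideways motion: pans
```

Widening `dead_zone` is the obvious knob to turn for users whose movements are less precise.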





















Cute Doggies Photo-Globe Mash-up using Google Earth and a Flickr Set (How-to)

If you'd like to make your very own photo-globe using Google Earth and Flickr photos, here are the directions, ported and updated from a previous post:


This photo is a screen shot of photos of just about every dog I know, and some that happened to cross my path. In this post, I'll share some information about how to create a photo-globe in Google Earth. 

The first step is to make sure you have lots of pictures related to your theme uploaded to a site such as Flickr.  (You can also create a photo-globe using pictures from your computer's hard drive.)

To get the pictures into Google Earth, I used the Image Overlay feature, and in the "link" textbox, I entered the image URL for each picture that I'd previously loaded as a set in Flickr.



To prepare for this, go to the "View" tab in the upper left-hand section of your screen and make sure that "Toolbar" is checked. Also make sure that "Grid" is selected, as this will make it easier to arrange and align your pictures. You can turn off this feature later. Near the top of the screen, click on the Image Overlay icon. (I've highlighted it in the picture.)



You'll have to enter the URL of the image you'd like to add to the globe in the "Link" textbox, which I've highlighted in the above picture.  In this case, I've used a link to one of my pictures in a Flickr set I created for this project.

One thing to keep in mind is that the picture will take up a much larger space than you might prefer, so you'll have to adjust the size using the green markers:

Positioning the Overlay in the Viewer
The following directions are from the "Positioning the Imagery in the Viewer" section of Google Earth's help documentation:


  1. Use the center cross-hair marker to slide the entire overlay on the globe and position it from the center. (Tip: do this first.)
  2. Use the triangle marker to rotate the image for better placement.
  3. Use any of the corner cross-hair markers to stretch or skew the selected corner. If you press the Shift key when selecting this marker, the image is scaled from the center.
  4. Use any of the four side anchors to stretch the image in or out from the selected side. If you press the Shift key when doing this, the image is scaled from the center.

TIP:  Try positioning the center of the image as a reference point first, and then use the Shift key in combination with one of the anchors to scale the image for best positioning.

Directions updated to reflect latest version of Flickr, as of 5/27/13:

To find the image URL for a photo in Flickr that you wish to link on your photo-globe, select the photo, then right-click and choose "Copy Image URL".
















Put your cursor in the Link section of the "New Image Overlay" dialog box in Google Earth, and right-click to select "Paste" from the drop-down menu.















Then repeat the process.  It helps to name each picture so that you can find it easily in Google Earth.

To enhance your mash-up, you can add place-marks that contain URLs that link to additional information about the subject of a picture, such as blog posts with embedded videos and/or text related to a picture, and so forth. Directions can be found in Google Earth's help section.

The process of building a photo-globe in Google Earth is a bit tedious.  If someone has a short-cut to share, please let me know!
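One possible shortcut: the Image Overlay dialog ultimately produces KML, and a KML GroundOverlay is just an image URL plus a bounding box, so a small script can generate the overlays for many photos at once. The sketch below builds one such overlay as a string; the coordinates and URL are illustrative placeholders, and you would still need to wrap the results in a KML document and choose positions for each photo.

```python
# Minimal sketch: generate the KML <GroundOverlay> element that
# Google Earth's Image Overlay dialog creates behind the scenes.
# <Icon><href> holds the image URL (e.g., a Flickr image URL) and
# <LatLonBox> gives the placement on the globe. The coordinates
# below are illustrative placeholders.

def ground_overlay(name, image_url, north, south, east, west):
    """Return a KML GroundOverlay snippet for one photo."""
    return f"""<GroundOverlay>
  <name>{name}</name>
  <Icon><href>{image_url}</href></Icon>
  <LatLonBox>
    <north>{north}</north><south>{south}</south>
    <east>{east}</east><west>{west}</west>
  </LatLonBox>
</GroundOverlay>"""

snippet = ground_overlay(
    "doggie-01",
    "https://example.com/photo.jpg",  # substitute your Flickr image URL
    north=36.0, south=35.0, east=-78.0, west=-79.0)
print(snippet)
```

Looping over a list of photo URLs, writing the snippets into a `<Document>` inside a `.kml` file, and opening that file in Google Earth would replace most of the clicking, though you would still fine-tune positions with the green markers.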


RESOURCES
Google Earth
Flickr
Programmable Web (My hunch is that this site might provide some information about shortcuts for creating a photo-globe in Google Earth.)
LEAP Motion

May 21, 2013

Xbox One and Kinect 2 for the Playground of the Future


The big news in tech today is the unveiling of the new Xbox One/Kinect 2 system. For now, the video below might be the closest you'll get to the system. Wired's senior editor, Peter Rubin, had a chance to interview Scott Evans of Microsoft as he demonstrated the fascinating technical details in a family-room type setting.

Wired's interview of Scott Evans and demo of the new Xbox One and Kinect 2, using Active IR technology.



From what I learned, the new Kinect sensor has six times the fidelity of the previous version. Paired with the new Xbox One, it can do amazing things. Engineers from around the world collaborated on this project, providing expertise in facial recognition, digital signal processing, speech recognition, machine learning, and computer vision. The Xbox One is fueled by an 8-core x86 processor, supported by 8GB of RAM, which should handle the most demanding gamer's needs. It also includes a 500GB hard drive and a Blu-ray player.


The new system was designed to enhance the gaming/user experience. The 1080p camera provides a field of view that is 60 percent larger than its predecessor's, and can handle a high level of detail. It provides a better means of interpreting movement and orientation, and it processes skeleton and hand movements more precisely. The system features "muscle man", a human-based physics model that is layered over the skeleton and depth map. It senses and calculates the forces the player uses while moving in a game.

What I find interesting is that the camera can detect the player's pulse by measuring subtle changes in the skin that can't be perceived by the naked eye. It can also quickly identify each player (it handles up to six) and recognize facial expressions. The active IR (infrared) system gives it better accuracy than the original Kinect.
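The basic idea behind camera-based pulse detection can be sketched simply: average the brightness of the skin pixels in each frame, then estimate the dominant frequency of that signal over time. The toy example below does this with a synthetic signal standing in for real video frames; it is an illustration of the general technique, not Microsoft's actual algorithm, and the frame rate and zero-crossing method are my own assumptions.

```python
# Toy illustration of camera-based pulse estimation: treat the
# per-frame average skin brightness as a time series, then count
# heartbeat cycles via upward zero crossings of the mean-removed
# signal. A synthetic 1.2 Hz (72 bpm) flicker stands in for video.
import math

FPS = 30  # assumed camera frame rate

def estimate_bpm(brightness, fps=FPS):
    """Estimate beats per minute from a brightness time series."""
    mean = sum(brightness) / len(brightness)
    centered = [b - mean for b in brightness]
    # Indices where the signal crosses zero going upward.
    crossings = [i for i in range(1, len(centered))
                 if centered[i - 1] < 0 <= centered[i]]
    if len(crossings) < 2:
        return 0.0
    beats = len(crossings) - 1
    seconds = (crossings[-1] - crossings[0]) / fps
    return beats / seconds * 60

# 10 seconds of "video": constant skin tone plus a faint 1.2 Hz flicker.
signal = [100 + 0.5 * math.sin(2 * math.pi * 1.2 * t / FPS + 0.5)
          for t in range(10 * FPS)]
print(round(estimate_bpm(signal)))  # -> 72
```

A real system would first isolate skin pixels and filter out motion and lighting changes; the flicker here (amplitude 0.5 on a 0-255 brightness scale) hints at why the naked eye can't see it.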

I wasn't able to find out much information regarding privacy issues with this system.  This is a concern, since it can sense your physiological responses, movement patterns, and facial expressions.  Over time, a good deal of very personal information would be gathered about each user. I shudder to think about the consequences if the data fell into the wrong hands.  

Possibilities for Special Needs Populations

I can see that the Xbox One + Kinect 2 system has the potential for games and other interactive applications for use in physical rehabilitation and fitness.  Since it can interpret facial expressions, it could also provide a way to support social skills learning among children and teens who have autism spectrum disorders.

RELATED

Microsoft invests a good deal of attention in proof-of-concept projects that may or may not become part of a commercial product. Below is an example of IllumiRoom:


Hrvoje Benko, of Microsoft Research, discusses the IllumiRoom concept during an interview at CHI 2013.


Xbox One Website
The new Xbox One Kinect tracks your heart rate, happiness, hands and hollers
Matthew Panzarino, The Next Web, 5/22/13
Kinect 2 Full Video Walkthrough: The Xbox Sees You Like Never Before
Kyle Wagner, Gizmodo, 5/21/13
Hands-on with prototypes of the Xbox One and New Kinect Sensor
Ben Gilbert, engadget, 5/21/13
Efficient Human Pose Estimation from Single Depth Images
Shotton, J., Girshick, R., Fitzgibbon, A., Sharp, T., Cook, M., Finocchio, M., Moore, R., Kohli, P., Crinisi, A., Kipman, A., Blake, A.   Video
Consumer Depth Cameras for Computer Vision:  Research Topics and Applications
Fossati, A., Gall, J., Grabner, H., Ren, X., Konolige, K. (Eds.)
Xbox One: Microsoft's supergeeks reveal what's inside the hardware
Dean Takahashi, VentureBeat, 5/21/13
Next Xbox Will Face New Array of Rivals
Nick Wingfield, New York Times, 5/21/13

Feb 20, 2013

AirHarp for Leap Motion, a Responsive Musical Natural User Interface

I like this demonstration of Adam Somers' AirHarp music application for use with the Leap Motion 3D controller:


AirHarp is being developed in C++ using Adam Somers' audio processing toolkit, MusKit. This looks interesting! Things have changed since I last took a computer music technology course (back in 2003).

Adam Somers is a senior software engineer at Universal Audio.  He has a graduate degree in music technology from Stanford, and a background in computer science, electronics, human-computer interaction, and signal processing.

Leap Motion is a motion-control software and hardware start-up company located in San Francisco, California. According to promotional information from the website, the company's first product, the Leap Motion controller, is 200 times more sensitive than existing technologies.  It will be interesting to see how this plays out.  (I'm still waiting for my pre-order.)

RELATED
AirHarp (links to GitHub)
Leap FAQs
Leap Motion Website
Leap Motion Developer Portal
Leap Motion Leadership Team
Leap Motion goes retail: Motion controller sold exclusively at Best Buy
Michael Gorman, engadget, 1/16/13

Leap Motion: Low Cost Gesture Control for your Computer Display
Asus partners up with Leap Motion, PCs with 3D motion control to debut in 2013
Michael Gorman, engadget, 1/3/13
Stanford Center for Computer Research in Music and Acoustics


Feb 14, 2013

Affinity+: Semi-Structured Brainstorming on Large Displays, from Pacific Northwest National Laboratory

The Affinity+ concept has the potential to be useful in educational settings such as schools, museums, and libraries. Although it was designed to support collaborative activities among software designers/developers, it could support a wide range of collaborative project-based learning activities. The clearly narrated video below was produced by a team from the Pacific Northwest National Laboratory.




"Affinity diagraming is a powerful method for encouraging and capturing lateral thinking in a group environment. The Affinity+ Concept was designed to improve the collaborative brainstorm process through the use of large display surfaces in conjunction with mobile devices like smart phones and tablets. The system works by capturing the ideas digitally and allowing users to sort and group them on a large touch screen manually. Additionally, Affinity+ incorporates theme detection, topic clustering, and other processing algorithms that help bring structured analytic techniques to the process without requiring explicit leadership roles and other overhead typically involved in these activities." -PNNL



RELATED

Affinity+ Semi-Structured Brainstorming on Large Displays
Russ Burtner, Richard May, Randy Scarberry, Ryan LaMothe, Alex Endert
Pacific Northwest National Laboratory

Information Visualization Core Area:  Natural User Interactions
Information Visualization Core Area:  User Experience
Pacific Northwest National Laboratory

Large Displays: Will it ever be enough? (pdf)

Richard May, Jim Thomas, Pacific Northwest National Laboratory

Although this paper is from 2006, it contains a discussion of the "Top Ten Research Challenges" associated with  large high-resolution displays:
A Survey of Large High-Resolution Display Technologies, Techniques, and Applications (pdf)
Tao Ni, Greg S. Schmidt, Oliver G. Staadt, Mark A. Livingston, Robert Ball, Richard May
IEEE Virtual Reality Conference 2006, pp. 223-226, Virginia Tech, 2006

Advanced Visualization and Interaction Techniques for Large High-Resolution Displays (pdf)

Sebastian Thelen, in Ariane Middel, Inga Scheler, and Hans Hagen (eds.), Visualization of Large and Unstructured Data Sets - Applications in Geospatial Planning, Modeling and Engineering (IRTG 1131 Workshop), VLUDS 2010, March 19-21, 2010, Bodega Bay, CA, USA. DOI: 10.4230/OASIcs.VLUDS.2010.73

Affinity Diagramming

Usability Net


Nov 3, 2012

iPad3 and iPad Mini: Hands-on Side-by-Side Comparison Video, by Eric Sailers (quick post)

Here is a good side-by-side "hands-on" comparison of the new iPad3 and the new iPad Mini by Eric Sailers:



Eric Sailers is a speech and language pathologist who has co-created apps for iOS devices since 2009. His website has a wealth of information about iOS devices and apps for education, especially for children with special needs.

May 15, 2012

NUITEQ's Latest Multitouch Showreel: Snowflake Suite

I've been following a number of people who have been working in the area of natural user interfaces and interaction for many years. An example of this work is NUITEQ, a company led by Harry van der Veen. Below is NUITEQ's most recent show reel of Snowflake Suite, an off-the-shelf multitouch suite and SDK.


Here is the description of the software from the naturaluserinterface YouTube channel:


"NUITEQ's award-winning multitouch software product Snowflake Suite comes off the shelf with 30+ apps, a free SDK to develop your own multitouch software apps and its content is easy to customize. The solution is offers high performance, stability, quality and comes with dedicated support. Apps includes presentation, productivity and creativity tools as well as games. The software can be used in different scenarios such as corporate presentations, exhibitions, entertainment, education, public spaces, consumer electronics, retail and digital signage."

FYI: Tutorials about the use of Snowflake Suite can be found on the naturaluserinterface YouTube channel.


Harry van der Veen has been sharing his NUI journey since 2007 on his Multitouch blog.






Sep 16, 2011

MindHabits Online Demo: Useful Serious Game for Social Skills Group Activities



I'd like to share the online demo of MindHabits' suite of serious games, which I've found useful in my work with teens and young adults who need support in the area of social-emotional skills.


What I like about the online demo is that it adjusts to the player's responses. This feature made it fun to use during the last few social skills groups I facilitated at work, since it could be played by students with a range of cognitive abilities. I had students take turns playing the game using a SMART Board, and found that all of the students paid attention to what was going on. In my opinion, using the interactive whiteboard supported "over-the-shoulder" learning among the students who were not at the board.


MindHabits is available for Windows and Mac, and the full version is just $19.99 and provides 100 game levels. The full version tracks progress and includes four games.


Here's some information from the company's website:  
"Based on social intelligence research conducted at McGill University, these stress busting, confidence boosting games use simple, fun-to-play exercises designed to help players develop and maintain a more positive state of mind." 
 "Based on the principles of social intelligence: Inhibition - uses game mechanics to promote positive habits; Association - connects personal info to positive feedback; Activation - uses personal references"






Apr 10, 2011

Immersive Cocoon Interaction: "It's people who are now the interface" (Updated, with videos, photos, links.)

"It's people who are now the interface." -Ole Bowman, cultural and architectural historian


I found the above quote on the Immersive Cocoon website and smiled.


When I first learned about the Immersive Cocoon in 2008, I thought it was just another technological fancy that probably would not come to market anytime soon.  Although it still is in the concept stage, I think it has a chance of making it, given the rapid advances in interactive technology over the past few years.

It wouldn't surprise me to see i-Cocoons finding a place in libraries, educational settings, museums, and other public spaces within the next 5-8 years, given an economic turnaround.


What is the Immersive Cocoon?
"The Immersive Cocoon is a future concept study by Tino Schaedler with design collective NAU; an idea to push the envelope and provoke a new conception of interface technology...Directed and 3D CG by Oliver Zeller. More info, behind the scenes and full credits at i-cocoon.com.-adNAU"


What is inside the cocoon?



Photo: arch.nau.coop

Photo: arch.nau.coop
Teaser Video:


"Please play fullscreen and LOUD! ...This spec teaser reveals an evolution in computing interaction, within a setting inspired by the penultimate scene from Stanley Kubrick's 2001: A Space Odyssey...Starring that film's lead actor, Keir Dullea; "2011" was developed over a two year period. Live action was filmed multi-camera, against green screen atop a backlit plexi floor on a shoestring budget. Mr. Dullea was then integrated into an entirely digitally created CG set rendered at 1080HD."


Here are some previous videos about the iCocoon concept:




RELATED
Immersive Cocoon Concept Website
Designers developing virtual-reality 'Cocoon'
Mark Tutton, 9/12/08, Telepresence Options /Human Productivity Lab
Immersive Cocoon-Facebook
"NAU is an international, multidisciplinary design firm, spanning the spectrum from architecture and interior design to exhibitions and interactive interfaces. As futurists creating both visual design and constructed projects, NAU melds the precision of experienced builders with the imagination and attention to detail required to create innovative exhibits, public events and architecture."
FYI:
Concerning interactive technology, things have changed a bit in my corner of the world - as I write this post, there is a Kinect beckoning me to dance in my bonus room. The Kinect came to market much sooner than I expected. I'll have an iPad 2 sometime in the near future - another example of how rapidly things are evolving. I skim the news by touching/swiping my now-outdated HTC Incredible. My 88-year-old aunt has used Skype more than once to "chat" with her baby great-nephew across the miles.


I use a Wii at work at least once a week to support social interaction skills with some students who have moderate-to-severe autism. Every classroom in the main school I serve has a huge, immersive, interactive whiteboard that relies on touch and kinesthetic interaction - my colleagues can't imagine going back to teaching without them.



Jan 12, 2011

Multi-modal Interactive Maps for People with Visual Impairments: Featuring a Stantum multitouch screen with a tactile layer.

To learn more about this project, take a look at the video and related publications below. This is a great example of a team that is harnessing emerging technologies to improve the lives of people with disabilities.


Video: "Multimodal Maps for Blind People"


Website


Publications
Anke Brock, Philippe Truillet, Bernard Oriola, Christophe Jouffrais (IRIT CNRS and Université de Toulouse), Usage of Multimodal Maps for Blind People: Why and How
ITS’10, November 7–10, 2010, Saarbrücken, Germany
Paper: http://www.irit.fr/~Philippe.Truillet/projects/doc/MultimodalMapsForTheBlind-ITS10.pdf
Poster: http://www.irit.fr/~Philippe.Truillet/projects/doc/Poster-ITS10.pdf

Stantum (Multi-touch screen used for the application.)
Ivy Middleware (Used in this application.)

Cross-posted on the TechPsych blog.

Dec 6, 2010

UPDATE: Demo 2 of the Kinect Theremin, Therenect, by Martin Kaltenbrunner

I recently posted about the Therenect, a gesture-controlled digital theremin created for Microsoft's Kinect by Martin Kaltenbrunner - Therenect: Theremin for the Kinect! (via Martin Kaltenbrunner). It looks like Martin has been busy polishing up the application over the past few days, as you can see in the video below:

Therenect - Kinect Theremin - 2nd Demo from Martin Kaltenbrunner on Vimeo.

RELATED
Virtual Theremin Made with Kinect; Real Thereminists Will Make it Useful
Peter Kirn, Create Digital Music, 11/30/10

Nov 4, 2010

USB MIDI on the iPad: Video demonstration of iOS 4.2, which supports wireless MIDI


Video from the MooCowMusic YouTube Channel

RELATED
On iPad, iPod touch and iPhone, New MIDI Support, Via Wires, Wireless
Pete Kirn, Create Digital Music, 11/3/10
MIDI on the iPAD
Display Blog, 11/4/10

Thanks to Johannes Schöning for the link!
(FYI: Johannes will be at the Interactive Tabletops and Surfaces conference, held in Saarbrücken, Germany, from November 8-10.)

Oct 6, 2009

I want to play with mice! Microsoft's Multiple Multi-touch Mice Preview

Hot off the press from Microsoft's Applied Sciences Group at UIST 2009!
Mouse 2.0: Multi-touch Meets the Mouse

"In this paper we present novel input devices that combine the standard capabilities of a computer mouse with multi-touch sensing. Our goal is to enrich traditional pointer-based desktop interactions with touch and gestures. To chart the design space, we present five different multi-touch mouse implementations. Each explores a different touch sensing strategy, which leads to differing form-factors and hence interactive possibilities. In addition to the detailed description of hardware and software implementa-tions of our prototypes, we discuss the relative strengths, limitations and affordances of these novel input devices as informed by the results of a preliminary user study."

The following video is courtesy of Microsoft's Applied Sciences Group:



Music: "Motion Blur", by Björn Hartmann. (Björn is an HCI researcher and electronic musician.)

The researchers on the team: 
Nicolas Villar, Shahram Izadi, Dan Rosenfeld, Hrvoje Benko, John Helmes, Jonathan Westhues, Steve Hodges, Eyal Ofek, Alex Butler, Xiang Cao and Billy Chen


Here is a video preview/demo of the multi-touch mice prototypes from Microsoft's Applied Sciences Group Lab, courtesy of CrunchGear:





meese
-CrunchGear

Apr 26, 2009

Giving Healthcare a Digital "Touch" via Microsoft

According to the press release from Microsoft, Texas Health Resources and Microsoft partner Infusion Development have developed a prototype to assist with doctor-patient communication and collaboration:





http://www.microsoft.com/presspass/images/features/2009/04-06THROnSurface_lg.jpg

Medhost has created an emergency department dashboard that can help medical professionals make decisions more efficiently:

http://www.microsoft.com/presspass/images/features/2009/04-06SurfaceDashboardFade_lg.jpg



Vectorform developed an application to assist children in rehabilitation at the Cook Children's Health Care System in Fort Worth, Texas. The application allows rehabilitation specialists to design their own evaluations for patients:

http://www.microsoft.com/presspass/images/features/2009/04-06TracingApp_lg.jpg

The only drawback is that the Surface has a very high price tag. I think I'll stick with HP TouchSmart PC projects with my students!

Mar 24, 2009

Struktable Multi-touch Installation at TOCA ME Design Conference






Struktable Multitouch Installation from Gregor Hofbauer on Vimeo.


Strukt is a design studio in Vienna, Austria, that specializes in interactive and generative design for a variety of purposes, such as interactive environments and installations, ambient intelligent environments, games, and multi-touch tables, screens, and walls. The video is a demonstration of applications that were presented at the March 2009 TOCA ME Design Conference in Munich, Germany. The applications were developed using vvvv. (More information regarding vvvv can be found at the end of this post.)


MT Table 01

INFO FOR THE TECH-SAVVY OR TECH-CURIOUS:

According to information from the vvvv website, vvvv is a "toolkit for real time video synthesis. It is designed to facilitate the handling of large media environments with physical interfaces, real-time motion graphics, audio and video that can interact with many users simultaneously. vvvv is a visual programming interface. Therefore it provides a graphical programming language for easy prototyping and development. vvvv is real time, where many other languages have distinct modes for building and running programs, vvvv only has one mode, run-time. vvvv is free for non-commercial use."

VVVV Screenshots

VVVV's Propaganda Page
Other projects using VVVV
Struktable: the 70-inch Multitouch Table

STRUK ON A SPHERE: Interactive installation at a Mercedes Benz conference