
Oct 31, 2010

Microsoft is acquiring Canesta, Inc., a developer of 3-D electronic perception technology for natural user interaction, gaming, and more.

Microsoft to Acquire 3-D Chip Firm Canesta
Michael Baron, TheStreet 10/29/10

Thanks to Harry Van Der Veen, of NUITEQ, for this link!

RELATED
The following video is from the Canesta3D YouTube channel. It demonstrates the 3D input sensor in action, with four people moving around in a living room. The chip used in the system depicted in the video was the precursor to the current chip, called the "Cobra 320x200".


Below is a demo of gesture interaction using Canesta3D technology to control and select information and content on a large display.  In my opinion, this will change the way we interact with our TVs, at least for those of us who hate using bad remotes!  Microsoft's acquisition of Canesta is good news, especially if Microsoft allows this technology to be used by the masses.  I'm pretty sure it can support interaction with internet-ready HD TVs, including GoogleTV, Leanback, and Vimeo's Couch Mode.




Canesta Announces Definitive Agreement to be Acquired by Microsoft
Press Release, 10/29/10, Canesta

About Canesta (From the Canesta website)
"Canesta (www.canesta.com) is the inventor of revolutionary, low cost electronic perception technology and leading provider of single chip CMOS 3-D sensors that fundamentally change the relationship between devices and their users. This capability makes possible true 3-D perception as input to everyday devices, rather than the widely understood 3-D representational technologies as output. Canesta’s 3-D input technology, based upon tiny, CMOS 3-D imaging chips or “sensors”, enables fine-grained, 3-dimensional depth-perception in a wide range of applications. Products based on this capability can then react on sight to the actions or motions of individuals and objects in their field of view, gaining levels of functionality and ease of use that were simply not possible in an era when such devices were blind. Canesta’s focus is on mass market consumer electronics, but many applications exist in other markets as well. Canesta is located in Sunnyvale, CA. The company has filed in excess of fifty patents, 44 of which have been granted so far."


Canesta Corporate Fact Sheet (pdf)
Videos: http://canesta.com/applications/consumer-electronics/gesture-controls

I posted some videos about Canesta's technologies in the following post; two of them show how Canesta's 3D depth camera works with a Hitachi flat-panel display: Interactive Displays 2009 Conference

For more information about interactive TV, GoogleTV, Leanback and Couch Mode, see the second section of my recent post:
Philipp Geist: Blending the Physical with the Digital;  Google TV/Leanback, Vimeo's new Couch Mode, oh..and ViewSonic's 3D (glasses-less) pocket camcorder...

Oct 12, 2010

Oh! No! Sony's "Mother of Remote Controls" for Google TV. 74 Buttons and Counting.

Today we switched from DISH to Time Warner Cable, and tonight I had to battle with a new remote control, the UR5U-8780L.  The experience with this remote sent me searching for alternatives. What a coincidence!  In this day and age of touch screens, I was hoping for something more elegant than....


Sony's Mother of Remote Controls!

-From the SonyStyle website: Television, meet internet.

I first learned of this complex addition to the world of TV/internet surfing from a recent Engadget article and video:  Sony's Google TV controller outed on ABC's Nightline (video), Ross Miller, 10/5/10.  Harry Brignull also posted about the new controller: Sony, Sony, what have you done?
(Harry is a UX consultant at Madgex, and author of the 90 Percent of Everything blog.)



ABC Video, via engadget

It is 2010, and with TVs connected to the internet, we'll be interacting with content in ways we could only dream of in the recent past.  Interactive TV is here.  Do I really have to push a lot of buttons in order to have the best "interactive" experience?  




OTHER OPTIONS
Xfinity Remote Prototype for the iPad


Turn Your iPhone into a TV Remote Samuel Axon, Mashable/Apple
L5 Remote: Turn your iPhone or iPod touch into a universal remote control:
L5 remote


Not Yet Available:  Vizio's XRT100 touchscreen remote

My Fancy New Remote, Instructions Included:

Sep 22, 2010

PD-NET Project: "Exploring..large scale networks of pervasive public displays..."

I love the concept of interactive, networked public displays!   The PD-NET project has a fairly new website and a Facebook page.  If this interests you, take the time to learn about the PD-NET project, a collaborative effort between researchers from several different universities in Europe. I'd like to see this take hold in the US.  


PD-NET Project Objectives (From the PD-NET Website):
  • To create enabling technologies for large-scale pervasive display networks through the design, development, and evaluation of a robust, scalable, distributed and open platform for interconnecting displays and their sensors.
  • To establish Europe as the international centre for work on pervasive display networks.
  • To address key scientific challenges that may inhibit the widespread adoption of pervasive display network technology: tensions between privacy and personalization, situated displays, business and legislative requirements, and user interaction.
Here is a list of participating universities and researchers, taken from the PD-NET website:


Participants from Lancaster University (UK):
Participants from University of Duisburg-Essen (Germany):
Participants from University of Lugano (Italy):
Participants from University of Minho (Portugal):
Publication
J. Müller, F. Alt, D. Michelis, and A. Schmidt, "Requirements and Design Space for Interactive Public Displays," in Proceedings of the 18th annual ACM international conference on Multimedia (Multimedia 2010), to appear, 2010.

COMMENT
If you are familiar with this blog, you know that I am passionate about interactive displays, especially in public spaces.  I've devoted numerous posts to this topic on this blog, and also on  The World Is My Interactive Interface, a blog I started a few years ago when I was working on projects for large interactive displays.  


I am interested in how interactive displays, of any size, can be accessible and universally designed, especially those that can inter-operate with mobile devices, including devices that support communication for people with special needs.


SOMEWHAT RELATED
Here are links to some of my previous blogposts related to the topic of interactive displays in public spaces.  Some have links to scholarly publications.  
What is DOOH and Why Should We Care? (DOOH- Digital Out-Of-Home)
Interactive Displays and Interaction (Presentation via Daniel Michelis)
Thoughts about technology on a cruise ship, and other reflections
Multi-touch and gesture interaction out-and-about
UPDATED: More News, Videos, and Links about Multi-Touch and Screen Technologies
Technology-Supported Shopping and Entertainment User Experience at Ballantyne Village:  "A" for concept, "D" for touch screen usability.
Usability/Interaction Hall of Shame (In a Hospital)
Think Globally, Act Locally:  Exploring the Problem Space - Top-down, bottom-up, local and the global...

Sep 18, 2010

Interactive Tabletops and Surfaces: 2010 ACM Conference, Nov. 7-10, Saarbrücken, Germany. Wish I could go!

If you are new to this blog, you should know that I'm passionate about interactive tables and surfaces of all sizes!   Although this technology has been around for a while, it is a new concept to most people.  The researchers and practitioners involved in the upcoming 2010 Interactive Tabletops and Surfaces Conference have been an important influence in the way people think about interacting with technology, and have made significant contributions to this emerging field over the past several years.   It hasn't been an easy road, given that most of us have minds brainwashed through years of forced keyboard-and-mouse interaction and traditional WIMP (Windows, Icons, Menus, Pointers) interfaces.

I first learned about the first Interactive Tabletops conference, held in 2006, in early 2007.  At the time, I was working on projects for my HCI and Ubiquitous Computing classes, trying to learn everything I could about natural user interaction, large touch-screen displays, tabletop computing, and multi-touch.   I was inspired by the interesting work going on in this field.  This was before the first iPhone was introduced, before Microsoft's multi-touch Surface was unveiled, and three years before Apple broke out with the iPad.

Many of the people involved with the 2010 Interactive Tabletops and Surfaces Conference are (or have been) affiliated with the NUI Group. NUI stands for Natural User Interface, or Natural User Interaction; the NUI Group is "a global research community focused on the open discovery of natural user interfaces."  I joined the NUI Group in 2007 when I was looking for more information about the nuts and bolts of multi-touch programming and systems, and have been encouraged to see how things have evolved since then.

Members of another group, sparkon, are also participating in the Interactive Tabletops and Surfaces conference.  Sparkon is an on-line community that includes people involved with  interactive technologies, including tabletop and surface computing. "On sparkon, you'll find projects demonstrating the latest interactive techniques, applications, software frameworks, case studies, and blog articles relating to creative and emergent technology."  (I'm also a member of Sparkon.)


Here's the information from the conference website:

ACM Interactive Tabletops and Surfaces, Saarbrücken, Germany:  7-10 November, 2010
"ITS 2010 is a premier venue for presenting research in the design and use of new and emerging tabletop and interactive surface technologies. As a young community, we embrace the growth of the discipline in a wide variety of areas, including innovations in ITS hardware, software, interaction design, and studies expanding our understanding of design considerations of ITS technologies and of their applications in modern society. ITS 2010 will bring together top researchers and practitioners who are interested in both the technical and human aspects of interactive tabletop and surface technologies. It is our hope that we will be able to achieve increased synergy of approaches between the disciplines engaged in the research in the area of interactive tabletops and surfaces, Design, HCI, UbiComp, Psychology, MobileHCI and other related fields. More directly, we intend to encourage immediate interdisciplinary collaboration on future research topics. Young scholars and Ph.D. students are especially encouraged to submit papers and participate in the doctoral colloquium."


Johannes Schöning, DFKI GmbH
Antonio Krüger, DFKI GmbH
Conference General Chairs



KEYNOTE SPEAKER:  W. Bradford Paley

"Bio: W. Bradford Paley uses computers to create visual displays with the goal of making readable, clear, and engaging expressions of complex data. He did his first computer graphics in 1973, founded Digital Image Design Incorporated in 1982, and started doing financial & statistical data visualization in 1986. He has exhibited at the Museum of Modern Art; he created TextArc.org; he is in the ARTPORT collection of the Whitney Museum of American Art; has received multiple grants and awards for both art and design, and his designs are at work every day in the hands of brokers on the floor of the New York Stock Exchange. He is an adjunct associate professor at Columbia University, and is director of Information Esthetics: a fledgling interdisciplinary group exploring the creation and interpretation of data representations that are both readable and esthetically satisfying."


SAMPLE TOPICS



  • Applications
  • Gesture-based interfaces
  • Multi-modal interfaces
  • Tangible interfaces
  • Novel interaction techniques
  • Data handling/exchange on large interactive surfaces
  • Data presentation on large interactive surfaces
  • User-interface technology
  • Computer supported collaborative systems
  • Middleware and network support
  • Augmented reality
  • Social protocols
  • Information visualizations
  • Interactive surface hardware, including sensing and input technologies with novel capabilities
  • Human-centered design & methodologies





RELATED
Previous Conferences








PLUGS
From the conference website -Links to the conference sponsors:




We appreciate the generous support of the following sponsors, without whom this conference would not be possible. Click on the logos to learn more about our generous supporters, and let us know if you are interested in becoming a sponsor.

Champions:

Benefactors:

Donors:

Contributors:

Academic Sponsors:

Interactive 360 Degree Glass-less 3D Video Display with Gesture Sensor: Demo of Sony's RayModeler

The video below gives a demo of Sony's RayModeler, "A 360-Degree Display that doesn't require glasses". The video shows how the auto-stereoscopic 3D content is filmed. It also shows how items within the display respond to gesture interaction. The first prototype was introduced in 2009 and then brought out at the SIGGRAPH conference this summer.



According to an article written by Richard Lawler, Core77 created "Breakout" for the RayModeler, a game similar to Pong.  I'll have to think more about this technology before I form an opinion!

RELATED
Sony's 360-degree RayModeler 3D display brings its glasses-free act to LA, plays Breakout
Richard Lawler, Engadget 7/28/10

Sony's 360-degree 3D display prototype makes virtual pets more lifelike, expensive
Thomas Ricker, Engadget 10/19/09

Aug 21, 2010

Reflections About Interactivity in my Present World

Background:  For those of you who are new to this blog, I work as a school psychologist in a public school district in North Carolina. During the mid-2000s, I took a variety of computer courses, leading up to a decision to work part-time while taking graduate courses in human-computer interaction, games, information visualization, ubiquitous computing, VR, and related topics. I returned to work full time in 2008, due to the economic downturn.

I continue to maintain my passion for technology that supports natural user interaction, and share my interests on this blog and to a lesser extent, two other blogs.  I initially started blogging because it was a requirement for one of my courses, and found that my blogs served me well as on-line filing cabinets.

Update:  I don't usually post much about my work on this blog, but I have some exciting things to share.

New School Year, New SMARTBoard: I am excited about the upcoming school year because my "home" school, Wolfe, a program that serves mostly teens and young adults up to age 22 who have more severe disabilities (including autism), now has an interactive whiteboard in every classroom.

School started for Wolfe students on July 26th, and three more whiteboards were installed.  One of the new SMARTboards is in the speech and language room, along with a multi-touch SMARTTable the school received last April, one is in the Art/Work Adjustment room, and one is in the room I use with small groups of students, which is used for music one day a week.


Although I have used interactive whiteboards in my work with students for several years, this is the first year that I have had one available to me most of the time in a school. Since most of the teachers received their IWBs over the past year, there is a pressing need to figure out what works, why it works, and how to create useful interactive content that is appropriate for the needs of students who have severe or multiple disabilities. We really can't wait.  

Why is this important to me?

Observations of positive changes:
Students who previously had limited interest in applications on the computer, or a limited attention span for such activities, demonstrate a high degree of interest and engagement with the activities on the IWBs we've tried so far.  Many students who do not attend to learning materials, or do so only for brief periods of time, actively look at the IWB screen, interact with it, and smile.   Some students who rarely (or never) speak vocalize more often during IWB activities!   Some are making connections between learning materials and what they interact with on the screen, something that simply did not happen previously.  Some students who rarely established eye contact with others or demonstrated joint attention do so during IWB activities, something I discussed in a post on the TechPsych blog.

Technology and my role as a school psychologist:
Designing effective interventions is part of my job as a school psychologist, and mixing interactive whiteboards and interventions is fairly uncharted territory.  I know that there are probably a few other school psychologists, counselors, social workers, and speech/language therapists out there who are wondering what to make of this technology for the students they serve, directly or indirectly through consultation.  There isn't much literature about this topic, so it is difficult to know what is truly "evidence based".

We are in the process of discovery. Within a school that has an IWB in every classroom, serving students with the most complex needs in the school district, how can I best follow professional school psychology practices (pdf) regarding the provision of direct and indirect services? Some of these expectations are listed below:

"School psychologists promote the development and maintenance of learning environments that support resilience and academic growth, promote high rates of academic engaged time, and reduce negative influences on learning and behavior."

"School psychologists use information and assistive technology resources to enhance students’ cognitive and academic skills."

"School psychologists work with other school personnel to develop, implement, and evaluate effective interventions for increasing the amount of time students are engaged in learning."

"School psychologists facilitate design and delivery of curricula to help students develop effective behaviors, such as self-regulation and self-monitoring, planning/ organization, empathy, and healthy decision-making."

Working towards solutions:
I work closely with my colleague Kelly Cross, a speech and language therapist who serves Wolfe school and two classrooms of students with severe autism at a "regular" elementary school. Kelly also serves as the assistive technology and augmentative communication consultant for our school district.  She's used interactive applications and web resources for many years along with more traditional "hands-on" materials,  but like me, has had limited access to IWB's until recently.  She now has a SMARTboard in her room, along with the portable multi-touch SMARTTable Wolfe received last April.  We work with many of the same students, so with the influx of IWBs in our school, we've stepped up our collaboration.

One of Kelly's challenges is to figure out ways for teachers to integrate assistive technology and/or augmentative communication systems into their work with students during activities that involve interactive whiteboards.   Most of the research in the area of assistive technology/aug com focuses on the use of technology to access applications as they are displayed on laptops or computer monitors, or factors related to the use of individual communication devices. The vSked project, led by Dr. Gillian Hayes at the University of California at Irvine, is on the right track in that it incorporates the use of a large interactive display that was used with students with autism, along with smaller hand-held communication systems, which I've discussed in a previous post.


vSked in action

Workshop Presentation
This past week, Kelly and I presented at a workshop held in the Arlington VA school district that focused on the use of interactive whiteboards and related applications and web resources with students with special needs.  We shared some of the exciting things we've noticed with our students and shared "before and after" examples.  Included in the presentation were some of the video clips I quickly put together during the first days of school that had an impressive positive impact with students when they were used during interactive whiteboard activities. (Some of these video clips can be found in a previous post: Video clips that help students with autism learn and feel calm at the same time!)

During the workshop, we discussed a few guiding principles, such as the "least dangerous assumption" and the concept of Universal Design for Learning.   We also provided a sampling of resources previously used on the "small screen" that appear to be powerful when students interact with them on the larger whiteboard screens.  Below are a few of the resources we shared:

Clicker5 is an application to support reading and writing that provides natural voice output and a child-friendly word processor. It works well on interactive whiteboards and can incorporate several communication symbol systems. It is a good tool for informal assessment of students with communication and other related needs.




Sensory World provides an intuitive navigation system for students to use as they explore various areas of a house and engage in learning and related activities. The site map for all of the areas of the house, as well as the activities, is graphical.




We shared resources from the National Gallery of Art's Kids Zone, a great website that offers plenty of interactive content:


For teachers who want a limited browsing environment for their students, the Zac Browser is great for use on IWBs and the new large all-in-one touch-screen displays:
Zac Browser

Another website that is good for visual-based browsing is Symbol World:

"The intention is that symbol readers, teachers and carers will contribute, and we are delighted at the number of contributors that have already sent us their work"

We shared a recent find, Taptu, which can be used on an IWB.  Although Taptu focuses on the mobile touchable web, accessed through smartphones with internet access, it provides links to a wide range of websites designed with intuitive touch interaction in mind, perfect for students with special needs.  In many cases, Taptu web navigation resources work fine on the larger screen of an IWB.  The advantage is that an icon/visual-based touch navigation system supports students who can't type, and also those who have very limited reading skills.


Sports Navigation for Taptu

Below is a slideshow about the touchable web:
Taptu: Virtual Roundtable


Update: SMARTTable
At Wolfe, we'd like to create a conversation wheel with related activities for the SMARTTable that coordinate with SMARTBoard activities.  We'll probably experiment with the conversation wheel concept on the SMARTBoard first.


Considerations:
One thing we've learned over the past few months is that it is a bit more complicated to create content and activities for the SMARTTable than for a SMARTBoard notebook or Promethean flipchart.   We hope to customize the SMARTTable to support scaffolded classroom activities for students with special needs, but it might take more time than we had anticipated.  The school's immediate need is for touchable interactive content, suitable for our students, for all of the classroom IWBs.

Complications:
Programming for the SMARTTable, for those who have 64-bit computers, requires Windows 7, Visual Studio 2008, and Expression Blend 3.   My HP Touchsmart, my home computer, is 64 bit, but runs Vista.  My school-issued laptop runs Windows XP and is also 64-bit.  To upgrade it, I will have to send it to the tech department for several weeks.   It will be slow-going, since I have to plan for my day-to-day activities and evaluations with students first.

Possible SMARTTable Solutions:
I'm hoping that some students from UNC-Charlotte might want to try their hand at multi-touch programming and help us out!  We'd welcome volunteers from other universities as well.   Wolfe's principal, Mary Jo Breckenridge, is very supportive of the use of innovative technologies with students with special needs, and would figure out a way to make a collaboration happen.










Upcoming Interactive Multimedia Technology Posts
I have about 6 posts in draft form about interesting interactive technologies, research, people, and companies involved with multi-touch, interactive multimedia, and natural user interface/interaction applications. I'll get them up as soon as possible.

One post is about 3M Touch Systems. Another is about an interesting multi-touch start-up company, Osmosis.  I revisited Ballantyne Village to update the use of interactive touch screen technology, something I discussed in detail, along with photos and video clips, in a 2008 post, Technology Supported Shopping and Entertainment User Experience at Ballantyne Village:  "A" for concept, "D" for touch-screen usability.   I got some inside scoop about the rationale behind the changes at the upscale center from a seasoned owner of a high-end audio-visual store I interviewed about 3D TV.   I have some interesting information about current research in "glasses-less" 3D displays. Sharp will be coming out with a 3D cell-phone camera AND a glasses-less cell-phone display in the not-too-distant future.

I'm preparing for posts that highlight a few of my favorite blogs. Until then, take a look at the recent posts on Tracy Boyer's Innovative Interactivity blog, and also InteractiveTV Today.

RELATED & SOMEWHAT RELATED
"The AAC-RERC conducts a comprehensive program of research, development, training, and dissemination activities that address the NIDRR priorities and seek to improve technologies for individuals who rely on augmentative and alternative communication (AAC) technologies. The mission of the AAC-RERC is to assist people who rely on augmentative and alternative communication to achieve their goals by advancing and promoting AAC technologies and supporting the individuals who use, manufacture, and recommend them."



Janet Light, Kathy Drager, Penn State University




Jeff Higginbotham's Bibliography:  Viewing AAC Through Authentic Social Interactions (pdf)

ProLoquo2Go (iPhone/iPod Touch/iPad communication system)


Also posted on the TechPsych blog

Jul 6, 2010

Multi-touch and NUI News: NUITEQ's Multi-touch Snowflake Suite for Windows 7

NUITEQ is a company that has been involved in multi-touch technologies for the past few years, and works diligently to keep up with the evolution of interactive displays and software.  The company's Snowflake Suite is now available for Windows 7, and is compatible with 3M Touch Systems, N-trig, NextWindow, Lumio, and other third-party platforms.

Here is the current compatibility list, and also a list of Snowflake Suite for Windows 7 features:


Compatibility
* NextWindow platforms like HP TouchSmart IQ500, IQ770, IQ800 series and the Dell Studio One (all 2 touch)
* N-trig platforms like HP TouchSmart TX2, Dell Latitude XT/XT2 and others
* 3M Touch Systems multi-touch kit
* Lumio platforms
* dreaMTouch
* PQ Labs
* Wacom

Snowflake Suite for Windows 7 Features

* Detailed user manual included with FAQ
* Developed on fast and reliable C++ platform
* Intuitive
* Customizable backgrounds, logos, images, 3D models, videos, etc.
* Gesture recognition library
* TUIO / OSC (Open Sound Control) support (sending and receiving events)
* Low level API
* Hardware accelerated rendering
* Support for wide variety of media types (jpg,jpeg,tga,png,dds,gif,tif, tiff, bmp,mov,avi,wav,ogg)
* Advanced window handler that supports scaling and rotation
* Audio support
* Multi-threaded resource handler (For fast data visualization)
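The TUIO/OSC support in the feature list above is what lets Snowflake talk to third-party trackers: TUIO trackers send cursor and object events as OSC messages over UDP, so anything that can encode the OSC wire format can feed (or simulate) multi-touch input. Below is a minimal, stdlib-only Python sketch of that encoding; the `/tuio/2Dcur` "set" message is just an illustration of the TUIO 1.x cursor profile, and is not specific to NUITEQ's implementation.

```python
import struct

def _pad(b: bytes) -> bytes:
    """Null-pad to a 4-byte boundary (OSC strings always end in >=1 null)."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args) -> bytes:
    """Encode a minimal OSC message supporting int, float, and str arguments."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)       # 32-bit big-endian int
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)       # 32-bit big-endian float
        elif isinstance(a, str):
            tags += "s"
            payload += _pad(a.encode())           # padded ASCII string
        else:
            raise TypeError(f"unsupported OSC type: {type(a)}")
    return _pad(address.encode()) + _pad(tags.encode()) + payload

# A TUIO cursor "set" message: session id, then normalized x/y position,
# x/y velocity, and acceleration (values here are arbitrary examples).
msg = osc_message("/tuio/2Dcur", "set", 42, 0.5, 0.5, 0.0, 0.0, 0.0)
```

Sending `msg` over UDP to a TUIO listener's port would deliver one cursor update; a real tracker also wraps messages like this in OSC bundles with "alive" and "fseq" messages per frame.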


RELATED 
"The 3M multi-touch displays support up to 20 finger touches simultaneously and come with a one year warranty. NUITEQ's Snowflake Suite, is included at no additional charge, offering the benefits of a complete bundled product of hardware and software, being the perfect solution to get started with multi-touch technology."


Photo Credit: NUITEQ

Case Study
"NUITEQ freely assisted handicapped children with their education, by applying multi-touch technology for Action For Kids, a UK-based national charity working with children and young people with physical and learning disabilities. The children used NUITEQ’s award-winning multi-touch software product Snowflake Suite, to enhance their way of learning, by offering a more playful and intuitive solution, in comparison to conventional learning methods."

Samsung Transparent OLED + Wedge Camera, Glassless 3D, Telepresence, Mid-air Interaction: Applying Science at Microsoft

The Microsoft Applied Sciences Group has been working on several projects that have the potential to change how we interact with various displays and surfaces in the very near future.   Here's some of what came across my RSS feeds and Google Alerts this morning:
INAVATE  July 5, 2010

According to an article in InAVate, "Microsoft has combined Samsung’s transparent OLED with a sub-two-inch camera to revolutionize the Microsoft Surface platform. The touchless telepresence screen creates a 3D gesture-control interface that tracks movement by seeing through the display. The company’s Applied Sciences Group has also added its recently revealed wedge-shaped lens, that InAVate reported on last month, to deliver glasses-free 3D content...the latest breakthrough could revolutionize the Surface concept, taking touch away from the display and projecting the images in 3D." -InAVate, 7/5/2010


3D Gesture Interaction

"In this demonstration, we've placed the Microsoft Applied Sciences' wedge technology behind Samsung's transparent OLED display. This enables a camera to image through the display, see the user's hand above it, and alter the image based upon her gestures." -Microsoft Applied Sciences Group

3D Without the Glasses: A new type of display from Microsoft produces multiple images and tracks the viewer's eyes - Kate Greene, MIT Technology Review (6/11/2010)

According to an article in MIT's Technology Review, "the new lens, which is thinner at the bottom than at the top, steers light to a viewer's eyes by switching light-emitting diodes along its bottom edge on and off. Combined with a backlight, this makes it possible to show different images to different viewers, or to create a stereoscopic (3-D) effect by presenting different images to a person's left and right eye. "What's so special about this lens is that it allows us to control where the light goes," says Steven Bathiche, director of Microsoft's Applied Sciences Group." -Kate Greene, Technology Review
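The mechanism Bathiche describes boils down to a lookup from tracked eye position to which LED along the lens's bottom edge to switch on. As a rough illustration of that control problem (with made-up geometry throughout: the 400 mm screen width, 64 LEDs, and ±30° steering range are my assumptions, not figures from the article), a toy model might look like:

```python
import math

def led_index_for_viewer(eye_x_mm: float, eye_z_mm: float,
                         screen_width_mm: float = 400.0,
                         num_leds: int = 64) -> int:
    """Map a tracked eye position to the nearest LED column.

    A real wedge lens folds the optical path; here the mapping is
    approximated as linear in the eye's horizontal angle from the
    screen's center line.
    """
    # Horizontal angle of the eye relative to the screen's center.
    angle = math.atan2(eye_x_mm - screen_width_mm / 2, eye_z_mm)
    # Assume the backlight can steer across a +/-30 degree range.
    max_angle = math.radians(30)
    frac = (angle + max_angle) / (2 * max_angle)   # 0..1 across the LED strip
    return max(0, min(num_leds - 1, round(frac * (num_leds - 1))))

# A viewer centered on the screen, 500 mm away, maps to mid-strip.
middle = led_index_for_viewer(200.0, 500.0)
```

The real optics are far more involved, but the control loop has this shape: head-tracker coordinates in, LED selection out, updated every frame (and run once per eye for the stereoscopic case).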

Steerable Multi-view Display

"In this demonstration, we use head tracking to determine where multiple users are. Then, with the Microsoft Applied Sciences' wedge technology, we steer completely independent images to each user. In the video, one user is seeing a sun while at the same time another is seeing a rocket. This is maintained even as the users change positions relative to each other." -Microsoft Applied Sciences Group
Transparent Display for Telepresence

"In this demonstration, we've placed the Microsoft Applied Sciences' wedge technology behind Samsung's transparent OLED display. This enables a camera to image directly through the display. In the video, objects held up to the screen are captured and shown to the user on the other side of the telepresence communication (the other monitor in the video), while far away from the screen, the display shows the user a view-dependent image." -Microsoft Applied Sciences Group
Steerable 3D Auto Stereo Display

"In this demonstration, we use head tracking to determine where a user's eyes are. Then, with the Microsoft Applied Sciences' wedge technology, we steer different views of the scene to each eye to produce a 3D image without the need for glasses or for fixing the location of the user." -Microsoft Applied Sciences Group
Mid-air Interactive Display

"In this demonstration, we illuminate objects above the display with infrared light. We capture the reflection using the Microsoft Applied Sciences' wedge technology. This enables us to see above the display while keeping the form factor small. Seeing above the display allows us to track the interaction between direct contacts on the display. In the video, the user associates a function (color choice) with one hand and a different function (zoom/rotation) with the other hand. This tool persistence is maintained regardless of the relative positions of the hands." -Microsoft Applied Sciences Group

RELATED
About Microsoft Applied Sciences Group
"The Applied Sciences Group (ASG) is an applied research and development team dedicated to creating the next generation of computer interaction technologies.  The interdisciplinary group focuses on the synergy between optics, electronics and software to create novel human computer interfaces.  The ASG is part of the Entertainment and Devices Division at Microsoft Corp. and mainly supports projects for Microsoft Hardware, Xbox, and Microsoft Surface.  It also works closely with Microsoft Research."