Josh Blake is the Tech Lead of the InfoStrat Advanced Technology Group in DC. He has been creating multi-touch applications for Microsoft Surface multi-user tabletops for a while. Recently, his team built a suite of applications designed for use by young children at a museum. Below is a video demonstration of some of this work. It really looks exciting!
NIME 2011 OSLO: The International Conference on New Interfaces for Musical Expression - NIME is an outgrowth of a workshop held at CHI 2001 (Human Factors in Computing Systems). "The NIME conference draws a varied group of participants, including researchers (musicology, computer science, interaction design, etc.), artists (musicians, composers, dancers, etc.) and developers (self-employed and industrial). The common denominator is the mutual interest in groundbreaking technology and music, and contributions to the conference cover everything from basic research on human cognition through experimental technological devices to multimedia performances." Just take a look at all of the presentations that were at NIME 2011! NIME 2011 Program (pdf)
Touch the Web 2011: 2nd International Workshop on Web-Enabled Objects, June 20-24, 2011, Paphos, Cyprus (in conjunction with the International Conference on Web Engineering, ICWE) "The vision of the Internet of Things builds upon the use of embedded systems to control devices, tools and appliances. With the addition of novel communications capabilities and identification means such as RFID, systems can now gather information from other sensors, devices and computers on the network, or enable user-oriented customization and operations through short-range communication. When the information gathered by different sensors is shared by means of open Web standards, new services can be defined on top of physical elements. In addition, the new generation of mobile phones enables a true mobile Internet experience. These phones are today's ubiquitous information access tool, and the physical token of our "Digital Me". These meshes of things and "Digital Me" will become the basis upon which future smart living, working and production places will be created, delivering services directly where they are needed."
Upcoming Conferences and Workshops
Ubicomp 2011, September 17-21, 2011, Beijing, China
"Ubicomp is the premier outlet for novel research contributions that advance the state of the art in the design, development, deployment, evaluation and understanding of ubiquitous computing systems. Ubicomp is an interdisciplinary field of research and development that utilizes and integrates pervasive, wireless, embedded, wearable and/or mobile technologies to bridge the gaps between the digital and physical worlds. The Ubicomp 2011 program features keynotes, technical paper sessions, specialized workshops, live demonstrations, posters, video presentations, panels, industrial exhibition and a Doctoral Colloquium."
"Given its multi-disciplinary nature, Ubicomp has developed a broad base of audience over the past 12 years. Key audience communities are: Human Computer Interaction, Pervasive Computing, Distributed and Mobile Computing, Real World Modeling, Sensors and Devices, Middleware and Systems research, Programming Models and Tools, and Human Centric Validation and Experience Characterization. More detailed information about the topical focus of Ubicomp can be found in the Call for Papers."
Eurodisplay 2011: XXXI International Display Research Conference, September 19-22, Bordeaux-Arcachon, France "Eurodisplay 2011 in Bordeaux/Arcachon provides researchers, engineers, and technical managers a unique opportunity to present their results and update their knowledge in all display-related research fields... The two keynote addresses on the first day of the Eurodisplay 2011 Symposium on September 20th will be a unique chance to hear a global overview on the future of the display market (Samsung LCD) and on display applications in the automotive industry (Daimler AG).
Eurodisplay '11 in Bordeaux-Arcachon will be the right spot to learn more about the latest results in display research and related fields such as organic electronics. Cognitive science will give a new vision of the impact of displays on our day-to-day life, not only from a perception standpoint but from an information standpoint, since SID deals with "Information Display" and not only technologies. The two invited talks, by Prof. Bernard Claverie and Dr. Lauren Palmateer, will focus on this aspect... A dedicated business-oriented session will address two important aspects of our business, from display company creation by Thierry Leroux to end-user trend analysis for the TV market by Dr. Jae Shin. This conference will provide a global vision of our display world, including not only the well-known display leaders but also the very active BRIC countries... Dr. V. Pellegrini Mammana will give a vivid illustration of this display industry dynamic in Brazil."
UIST Symposium, October 16-19, 2011, Santa Barbara, California "UIST (ACM Symposium on User Interface Software and Technology) is the premier forum for innovations in the software and technology of human-computer interfaces. Sponsored by ACM's special interest groups on computer-human interaction (SIGCHI) and computer graphics (SIGGRAPH), UIST brings together researchers and practitioners from diverse areas that include traditional graphical & web user interfaces, tangible & ubiquitous computing, virtual & augmented reality, multimedia, new input & output devices, and CSCW. The intimate size, single track, and comfortable surroundings make this symposium an ideal opportunity to exchange research results and implementation experiences."
VisWeek 2011: Vis, InfoVis, VAST, October 23-28, Providence, RI "Computer-based information visualization centers around helping people explore or explain data through interactive software that exploits the capabilities of the human perceptual system. A key challenge in information visualization is designing a cognitively useful spatial mapping of a dataset that is not inherently spatial and accompanying the mapping by interaction techniques that allow people to intuitively explore the dataset. Information visualization draws on the intellectual history of several traditions, including computer graphics, human-computer interaction, cognitive psychology, semiotics, graphic design, statistical graphics, cartography, and art. The synthesis of relevant ideas from these fields with new methodologies and techniques made possible by interactive computation are critical for helping people keep pace with the torrents of data confronting them."
6th Annual ACM Conference on Interactive Tabletops and Surfaces (ITS 2011), November 13-16, 2011, Portopia Hotel, Kobe, Japan "The Interactive Tabletops and Surfaces 2011 Conference (ITS) is a premier venue for presenting research in the design and use of new and emerging tabletop and interactive surface technologies. As a new community, we embrace the growth of the discipline in a wide variety of areas, including innovations in ITS hardware, software, design, and projects expanding our understanding of design considerations of ITS technologies and of their applications."
AFFINE: 4th International Workshop on Affective Interaction in Natural Environments (ICMI 2011), November 17, 2011, Alicante, Spain (CFP deadline is August 19, 2011) Scope: "Computer gaming has been acknowledged as one of the computing disciplines which proposes new interaction paradigms to be replicated by software engineers and developers in other fields. The abundance of high-performance, yet lightweight and mobile devices and wireless controllers has revolutionized gaming, especially when taking into account the individual affective expressivity of each player and the possibility to exploit social networking infrastructure. As a result, new gaming experiences are now possible, maximizing users' skill level while also maintaining their interest in its challenges, resulting in a state which psychologists call flow: "a state of concentration or complete absorption with the activity at hand and the situation". The result of this amalgamation of gaming, affective and social computing has brought increased interest in the field in terms of interdisciplinary research... Natural interaction plays an important role in this process, since it gives game players the opportunity to leave behind traditional interaction paradigms, based on keyboards and mice, and control games using the same concepts they employ in everyday human-human interaction: hand gestures, facial expressions and head nods, body stance and speech. These means of interaction are now easy to capture, thanks to low-cost visual, audio and physiological signal sensors, while models from psychology, theory of mind and ergonomics can be put to use to map features from those modalities to higher-level concepts, such as desires, intentions and goals.
In addition to that, non-verbal cues such as eye gaze and facial expressions can serve as valuable indicators of player satisfaction and help game designers provide the optimal experience for players: games which are not frustratingly hard, but still challenging and not boring... Another aspect which makes computer gaming an important field for multimodal interaction is the new breed of multimodal data it can generate: besides videos of people playing games in front of computer screens or consoles, which include facial, body and speech expressivity, researchers in the field of affective computing and multimodal interaction may benefit from mapping events in those videos (e.g. facial signs of frustration) to specific events in the game (large number of enemies or obstacles close to the player) and infer additional user states such as engagement and immersion. Individual and prototypical user models can be built based on that information, helping produce affective and immersive experiences which maintain the concept of 'flow'. This workshop will cover real-time and off-line computational techniques for the recognition and interpretation of multimodal verbal and non-verbal activity and behaviour, modelling and evolution of player and interaction contexts, and synthesis of believable behaviour and task objectives for non-player characters in games and human-robot interaction. The workshop also welcomes studies that provide new insights into the use of gaming to capture multimodal, affective databases, low-cost sensors to capture user expressivity beyond the visual and speech modalities and concepts from collective intelligence and group modelling to support multi-party interaction."
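As a toy illustration of the mapping the call for papers describes (my own sketch, not code from the workshop), aligning time-stamped expression detections with game events is enough to flag which in-game moments coincide with signs of frustration:

```python
def frustrating_events(game_events, expressions, window=2.0):
    """Pair each game event with any 'frustration' expression detected
    within `window` seconds after it. Both inputs are lists of
    (timestamp_seconds, label) tuples; the event names, labels, and
    window size are made up for illustration."""
    flagged = []
    for t_event, event in game_events:
        for t_expr, label in expressions:
            if label == "frustration" and 0 <= t_expr - t_event <= window:
                flagged.append(event)
                break  # one matching expression is enough to flag the event
    return flagged

# An enemy swarm at t=0 is followed 1.5 s later by a frustrated face;
# the pickup at t=10 is followed only by a smile, so it is not flagged.
events = [(0.0, "enemy swarm"), (10.0, "power-up pickup")]
expressions = [(1.5, "frustration"), (11.0, "smile")]
flagged = frustrating_events(events, expressions)
```

A real system would of course use probabilistic models rather than a hard time window, but the basic idea of joining the two event streams on time is the same.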
"Major Topics of Interest to IUI include: Intelligent interactive interfaces, systems, and devices, Ubiquitous interfaces, Smart environments and tools, Human-centered interfaces, Mobile
interfaces, Multimodal interfaces, Pen-based interfaces, Spoken and natural language interfaces, Conversational interfaces, Affective and social interfaces, Tangible interfaces, Collaborative multiuser interfaces, Adaptive interfaces, Sensor-based interfaces, User modeling and interaction with novel interfaces and devices, Interfaces for personalization and recommender systems, Interfaces for plan-based systems, Interfaces that incorporate knowledge- or agent-based approaches, Help interfaces for complex tasks, Example- and demonstration-based interfaces, Interfaces for intelligent generation and presentation of information, Intelligent authoring systems, Synthesis of multimodal virtual characters and social robots, Interfaces for games and entertainment, learning based interactions, health informatics, Empirical studies and evaluations of IUI interfaces, New approaches to designing Intelligent User Interfaces, and related areas"
IXDA: Interaction 12, Dublin, Ireland, February 1-4, 2012 "Interaction|12 is an ideal venue to showcase the most inspiring and original stories in interaction design. We have a world of creative talent to tap into, so our conference roster will fill up fast. We’re on the lookout for thoughtful, original proposals that will inspire our community of interaction designers from all over the world. Do you have an uncommon or enlightening design story, valuable lessons learned from hands-on experience and want to be a part of the programme at Interaction|12? You do? Great!"
Sensory-Minds is a small company in Germany that is focused on research and design in the field of Natural User Interfaces. If you visit the SENSORY-MINDS website, you'll find that it has been designed for touch interaction.
Information about the Ring Wall from Sensory-Minds' Vimeo site:
".....The two-piece ring°wall consists of a LED display and a multitouch information-wall and impresses by its size: a total surface of 425 square meters, which equals more than 6000 computer displays, is the biggest of its kind. An interactive World emerges out of 34 million pixels generated by 15 high definition projectors and is supported by 30 directional speakers.
Multitouch sensors based on laser technology turn use of the natural user interface into an experience. By direct touch, more than 80 users can simultaneously get informed about news and activities around the ringworld.
The interactive wall is not only a central information system, but also an innovative advertising tool and medium for public viewings."
Heiko Hoffman of Sensory-Minds recently joined the NUI-Group. Here is a response to a question on the NUI-group forum about the way the system's sensors work:
"The sensor works like spinning radar gun, or like a wiper in a car, that means like a radar gun you get the distance to the object. It’s not quiet difficult when you know the position of the sensor and the degree from the beam to get the X,Y position. Yes, you don’t need to touch the surface but we arranged it that the radar beam is very close to the screen so it seems that you have to touch the surface.
At the moment the system has limitations because each sensor gets its own coordinates, and this means that you get shadows. This is no problem because the people interact very fast. We are working on a system where the data from the sensors are put together, and that would be the solution to this problem."
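The geometry Heiko describes is simple to implement: each reading from a spinning rangefinder is a distance along a beam at a known angle, so converting it to screen coordinates is basic trigonometry. A minimal sketch (my own illustration, not Sensory-Minds' code), assuming the sensor sits at a known origin on the screen plane:

```python
import math

def beam_to_xy(distance, angle_deg, sensor_x=0.0, sensor_y=0.0):
    """Convert one rangefinder reading (distance along the beam, beam
    angle in degrees) into X,Y coordinates on the screen plane."""
    angle = math.radians(angle_deg)
    x = sensor_x + distance * math.cos(angle)
    y = sensor_y + distance * math.sin(angle)
    return x, y

# A touch detected 2.0 m out along a beam at 30 degrees,
# measured from a sensor mounted at the wall's origin:
x, y = beam_to_xy(2.0, 30.0)
```

The shadowing problem he mentions follows directly from this model: a single sensor only sees the nearest object along each beam, so merging readings from two sensors at different positions is the natural fix.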
Note:
I'll be writing a few more posts updating the work of past and present NUI-group members as well as news from the commercial multi-touch & gesture community. Be sure to check back soon, or subscribe to this blog!
When I first started writing about interactive touch-screen technology and multi-touch, there really wasn't much going on. Things have changed over the past couple of years.
Below is a video clip of Paper Four, via Touch User Interface, of a capacitive touch paper developed by researchers at Mid Sweden University. Sensing electrodes are printed on the paper with conductive ink:
Natural User Interface will be demonstrating multi-touch applications at the upcoming DreamHack Winter 2008 festival November 27-30. DreamHack is known as the world's largest LAN party and computer festival. For more information, see my post on the Technology Supported Human World Interaction blog.
Multi-touch & Surface Computing Thoughts:
Despite the downturn in the global economy, I truly feel in my heart that collaborative multi-touch, or surface computing, will be an important "movement", on many levels. I think that it will enable people to gain a deeper understanding and enjoyment of ideas & concepts.
It is a new way of expression and communication, with the potential to support learning, exploring, music, art, cooperation, gaming, and more effective information visualization & data-driven decision making.
Not too many people really "get" the concept of multi-touch - surface computing. Yet.
Those who do seem to be creative, forward thinking, thoughtful, and optimistic. Important characteristics to have during these trying times!
"(pdf). According to this vision, "the goal of Tangible Bits is to bridge the gaps between both cyberspace and the physical environment, as well as the foreground and background of human activities." This article isis a must-read for anyone interested in "new" interactive technologies.
The pictures in the article of the metaDesk, transBoard, activeLENS, and ambientRoom, along with the references, make this seminal work worth at least a glance.
Another must-read is Hiroshi Ishii's 2008 article, Tangible Bits: Beyond Pixels (pdf). In this article, Ishii provides a good overview of TUI concepts as well as the contributions of his lab to the field since the first paper was written.
Related to the Tangible User Interface research is the work of the Fluid Interfaces Group at MIT. The Fluid Interfaces Group was formerly known as the Ambient Intelligence Group, and many of the group's projects incorporate concepts related to TUI and ambient intelligence. According to the Fluid Interfaces website, the goal of this research group is to "radically rethink the human-machine interactive experience. By designing interfaces that are more immersive, more intelligent, and more interactive we are changing the human-machine relationship and creating systems that are more responsive to people's needs and actions, and that become true "accessories" for expanding our minds."
The Siftables project is an example of how TUI and fluid interface (FI) interaction can be combined.
Siftables is the work of David Merrill and Pattie Maes, in collaboration with Jeevan Kalanithi, and was brought to popular attention through David Merrill's recent TED talk, "Siftables - Making the Digital Physical": "Siftables aims to enable people to interact with information and media in physical, natural ways that approach interactions with physical objects in our everyday lives. As an interaction platform, Siftables applies technology and methodology from wireless sensor networks to tangible user interfaces. Siftables are independent, compact devices with sensing, graphical display, and wireless communication capabilities. They can be physically manipulated as a group to interact with digital information and media. Siftables can be used to implement any number of gestural interaction languages and HCI applications... Siftables can sense their neighbors, allowing applications to utilize topological arrangement... No special sensing surface or cameras are needed."
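To give a feel for the kind of topological reasoning an application can do once tiles report who their neighbors are, here is a small sketch of my own (a simplification: real Siftables detect neighbors directly with short-range sensing on each edge, not via a shared coordinate grid):

```python
def neighbors(tiles):
    """Given {tile_id: (col, row)} grid positions, return the set of
    orthogonally adjacent tile pairs -- the topological arrangement a
    Siftables-style application could react to. Purely illustrative."""
    pairs = set()
    ids = list(tiles)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            ax, ay = tiles[a]
            bx, by = tiles[b]
            if abs(ax - bx) + abs(ay - by) == 1:  # touching edge-to-edge
                pairs.add(frozenset((a, b)))
    return pairs

# Tiles A and B sit side by side; C is off on its own.
arrangement = {"A": (0, 0), "B": (1, 0), "C": (3, 3)}
adjacent = neighbors(arrangement)
```

An application built on such a primitive can then interpret a row of tiles as a word, a sum, or a sequence of video clips, which is essentially what the Siftables demos show.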
"Imagine if all the little programs you had on your iphone were little separate chicklets in your pocket. You'd lose em. Your cat would eat em. You'd vacuum them up. They'd fall down in the sofa. They'd be all over the car floor. You'd throw them away by mistake..."
In my opinion, it is exciting to learn that perhaps some of this technology has the potential of becoming main-stream.
This flight-tracker application for the Surface looks fun to use. As I watched the video, I realized that it wasn't the application itself that I liked. It was the music that accompanied the video: a track from the '80s synth-pop band Art of Noise. Since I'm a music lover, the music got me thinking.
Wouldn't it be great if productivity/work-related applications like Flight Tracker could be developed to provide a means for incorporating a sound-track?
Several thoughts and ideas flashed into my mind:
Surface and related natural user interface/interaction (NUI) applications have the potential to transform routine, ho-hum work tasks into activities that are a bit more pleasant. Since people often listen to music while they work, it stands to reason that NUI productivity applications should incorporate a music component, at least as an option.
To support a user-centered music platform for NUI applications, the application could incorporate a "smart" music library within the system, with the capability of integrating music libraries and playlists from users' mobile devices, as well as the web, effortlessly. (Of course, there are privacy/security and firewall issues to address, but that is another story.)
Users could have a choice of listening to their own music playlists (including a shuffle option), selecting from a variety of presets, or going for something like the iTunes Genius effect: listening to music generated from an algorithm that takes into account music preferences and user interaction with the productivity application over time.
Since many Surface/NUI applications are designed to support collaborative work and interaction between two or more people, the music situation could get a bit complicated, since people have differing tastes. If co-workers disagreed about the music selection, the program would automatically default to generic elevator music, or silence.
NUI applications might even pave the way for a new genre of music. This concept isn't too far-fetched. Think of all the music we've come to love over the years that was composed for movies and even video games!
If you know of anyone that is working on this concept, or would like to collaborate with me sometime in the future on this concept, please let me know. I'm slowly working on an interactive timeline prototype, and I have some ideas about adding a music/sound track component.
RELATED Art of Noise - Close To The Edit (Version 1):
I was fortunate to see some of these interaction techniques and interfaces when I attended CHI 2010 this past April, and plan to share some of my photos and video clips from the conference on this blog soon.
Totally Unrelated
Online connection for foodies
Eatocracy is a new website within the CNN pages that provides news, and more, about all things related to food. The categories on the site include "main", "news", "bite", "sip", "make", "think", and "buzz". The best part, in my opinion, is the heirloom recipe collection index, where people can upload and share family recipes and the stories behind them.
Here is the description of Eatocracy from the website:
Eatocracy "is your online home for smart, passionate conversation and information about food news, politics, culture. We'll highlight regional and family recipes, dive into restaurants and food shopping, chat with celebrity and local chefs, and show you what's for dinner around the world tonight. Grab a place at the table and read with your mouth full."
As I took a peek at Lieven's video links, I noticed an interesting video mash-up Lieven created from the open-source code for Radiohead's House of Cards music video and his One Million Particles app. I'll post them soon.
I'll try to get video, pictures, and commentary about EVA 2010. EVA stands for Electronic Visualization and the Arts: "Electronic Information, the Visual Arts, and Beyond."
FYI
I'm in the process of sorting through and re-organizing my blogs, which have been around for over four years! During this time, my blogs have attracted a growing number of readers. Because of this, I'd like to make things a bit more user-centered. So expect to see little changes here and there. I promise I'll give my readers warning in advance if I make any serious changes!
If you are new to this blog, you should know that my blogs started out as on-line filing cabinets, open to the world. Although there is a bit of overlap of material and some cross-posting between the blogs, they are arranged to serve as a paper-less way of keeping track of things that I've learned through my coursework, conference attendance, readings, and research. Since emerging technologies are high on my list of interests, I also use my blogs to share interesting things that cross my path.
I changed the name of my World Is My Interface blog to The World Is My Interactive Interface. "Off-the-desktop natural user interfaces, interaction, and user experience" are the main topics of the blog. It sometimes includes information about ubiquitous computing and DOOH, otherwise known as Digital Out Of Home.
I plan to tinker with my TechPsych blog later on. It focuses on topics that are useful to psychologists, educators, special education teachers, speech and language therapists, health and wellness professionals, and parents.
Feel free to leave comments, as I welcome your input.
"NextWindow, the leader in optical touch technology, released a touch screen driver for Windows 7 developers this week. The introduction of multi-touch gestures to this operating system will generate applications giving users more flexibility when engaging with content."
"NextWindow introduced this technology at Microsoft's Professional Developer's Conference (PDC) in October and will be presenting it at the Windows Hardware Engineering Conference (WinHEC) November 4 - 6, 2008. At this event, NextWindow will be showcasing their latest in optical touch solutions including the integrated touch display found on the HP TouchSmart All-in-One computer."
"Providing a touch screen solution for Windows 7 will give end-users an intuitive method to interact with their computers," said Anthony Uhrick, VP of Sales, NextWindow. "We've created a driver that enables developers to take full advantage of multi-touch functionality."
Software developers interested in touch solutions for Windows 7 are invited to contact NextWindow or download the driver directly from http://www.nextwindow.com/windriver." More information about NextWindow is available at nextwindow.com.
"Multi-Touch and Pen Technology to Humanize the Computer Interface and Create New Markets"
KFAR SABA, Israel & AUSTIN, Texas--(BUSINESS WIRE)--"N-trig, providers of DuoSense™ technology, combining pen and capacitive touch in a single device, brings the power of technology and the human touch together to begin a new era in interface technologies and lead the Hands-on computing evolution. With the industry's only combined pen and multi-touch capabilities, N-trig is transforming the way people interact with computers." "...Realizing the power of the human interface, N-trig's DuoSense digitizers are designed to integrate easily, support any type of screen, keep devices slim, light and bright, and can support numerous applications from small notebooks to large LCDs. Combined, pen and touch enable users to open files, manipulate pictures and browse the desktop as they would the files on their desk... Currently available on the Dell Latitude XT, with additional OEM designs planned to come to market in early 2009, N-trig is opening a window onto a world where multi-touch is the accepted standard for computer interfaces." Related
Globacore Globacore used four 72" Blue Ocean Low Gain Screens 16:9 to create their twenty-foot multi-touch wall, which I think was also used for the "holographic" screen in the first video. The tracker and base Flash API from the NUI-group were used in this system. I hadn't heard of Blue Ocean screens before. Much to my surprise, the company behind Blue Ocean is Nippura, located in Japan. Nippura has another office in Charlotte, N.C. For years, Nippura's core business has been focused on creating "the world's largest acrylic panels, tunnels, cylinders and spheres for the professional aquarium industry", but it has moved into screen systems as well.
US Nippura/Blue Ocean Projection Screens Blue Ocean screens were used in the second video of this post. Below is some information about the screens from the company's website: "Blue Ocean® Projection Screens are the first ever to cast the screen into a hi-tolerance viewing plane centered between two ultra-clear panels of cell-cast acrylic, the same type of material used in hi-optic fighter jet canopies and the world's largest aquariums. Blue Ocean® Projection Screens also offer something never before found in screens: self-rigidity. The thickness of each screen can be fabricated so that it maintains its form without external frames or tension systems. Frameless high definition projected images without borders amplify the infinite depth viewing experience. Because of its durability, outdoor applications of entertainment around the garden and pool are also possible. Normal wear and tear scratching has no long-term effect on the screens' performance, and a ten year old screen, simply re-polished, will literally look and perform exactly as it did the day it left the factory." Super High Definition Stereo Image using 4K Projection Systems (pdf) Tetsuro Ogi, Hiroaki Daigo, So Sato, Yoshisuke Tateyama, Yasuaki Nishida, International Conference on Artificial Reality and Telexistence 2008
NUITEQ NUITEQ offers single and duo-touch plugins for Snowflake Suite that enable the software to run on NEXIO and NextWindow screens, including NextWindow's screens that are integrated in the HP TouchSmart and Dell Inspiron all-in-one PCs, among others. Snowflake Suite is also compatible with 3M Touch Systems, N-Trig, and Lumino. You can view the Snowflake Suite application in action on NUITEQ's YouTube channel. "NUITEQ's software product Snowflake Suite, which received a Stevie Awards finalist recognition for Best New Product or Service of the Year 2009 - Media & Entertainment, is available to end clients, software developers, system integrators, VAR's and OEM's. Snowflake operates on a wide variety of hardware technology platforms including rear camera based solutions, IR overlay systems, bending wave, surface acoustic wave (SAW), projected and surface capacitive, resistive and other touch technologies for small hand held mobile devices and larger scale interactive displays." Recent news: NUITEQ Assists LTU Skelleftea at SACO event with new educational multi-touch desk
NextWindow and All-In-One PC's: NextWindow, the company that made the large single-touch display I used for a couple of class projects nearly three years ago, has really grown over the past few years. NextWindow provides optical touch screens for the HP TouchSmart PC, and also for the Dell Inspiron Studio One All-In-One PC, and the Dell Sx2210T desktop monitor. The next Dell with a NextWindow touch screen will be the Inspiron One 19, an all-in-one PC. You can find NextWindow touch screens in the Medion X9613, the NEC ValueStar W, and the Sony L Series. (NextWindow has Windows 7 certification for touch screens)
STANTUM and GUNZE USA Stantum has partnered with Gunze USA, a manufacturer of resistive touch-panels, to co-design and deliver multi-touch products. "Gunze USA will provide these custom multi-touch modules to the North American market for use in industrial, automotive, defense, aerospace, medical, kiosk/point-of-sales, and other professional applications. Unlike capacitive multi-touch, Stantum’s technology is particularly well suited to these professional markets, thanks to its:
immunity to EMI issues, via easy coexistence with Wi-Fi or Bluetooth antennas around the screen;
stylus input and high precision, enabling handwriting recognition; and
ability to work flawlessly with gloves."
According to the press release, Stantum's multi-touch technologies provide advantages such as handling 10 or more touches at a time, finger and stylus input, finger-pressure sensing, and native Windows 7 support, with no need for a driver. Developers can order the "beta" version of the Slate PC, based on the Dell Inspiron Mini netbook platform, from the Stantum website.
GESTURETEK The video below is of an interactive 3D application created for the Gondwana Das Praehistorium in Schiffweiler, Germany, and provides a simulation of what it might be like to be a flying ancient pterosaur or giant dragonfly. Two gamers can play at the same time. If you want to fly, you flap your arms, and navigate by leaning. You crouch to dive, and increase your speed by flapping your arms even faster: GestureTek's 3D Interactive Flying and Virtual Game Experience for Gondwana Museum
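GestureTek hasn't published how the exhibit is implemented, but gesture mappings like this are typically a thin layer that converts tracked body measurements into game commands. A rough sketch of that layer, with made-up thresholds (not GestureTek's actual code):

```python
def flight_controls(flaps_per_second, lean_deg, crouch_depth_m):
    """Map tracked body gestures to flight commands, in the spirit of
    the Gondwana exhibit: faster flapping gives more thrust, leaning
    steers, and crouching triggers a dive. All constants are invented
    for illustration."""
    thrust = min(flaps_per_second / 4.0, 1.0)     # saturates at 4 flaps/s
    turn = max(-1.0, min(lean_deg / 30.0, 1.0))   # full turn at +/-30 degrees
    dive = crouch_depth_m > 0.25                  # crouch deeper than 25 cm
    return {"thrust": thrust, "turn": turn, "dive": dive}

# A player flapping hard while leaning far left and crouching:
commands = flight_controls(flaps_per_second=8.0, lean_deg=-45.0,
                           crouch_depth_m=0.3)
```

The interesting engineering is upstream of this function, in estimating flap rate and lean angle robustly from camera tracking; the command mapping itself stays deliberately simple so players can discover it in seconds.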
This company always has something going on that looks fun and interactive. To get an idea of this company's interactive work, take a look at the information on the GestureTek website links below. GestureTek has a variety of product videos on the website.
SNIBBE INTERACTIVE Snibbe Interactive was founded by Scott Snibbe to develop social immersive media as a communication medium that incorporates the language of cinema and engages participation and interaction with others (and technology) through unencumbered body movements.
Scott Snibbe's Social Immersive Media presentation at Stanford's Seminar on People, Computers and Design, 5/2009
'Scott Snibbe of Sona Research discusses a distinct form of augmented reality focused on social interaction: social immersive media. He discusses design principles and interaction techniques to create strong emotional responses and social engagement through visceral interaction. He also describes approaches to clearly communicate cultural and scientific ideas through the medium, and how to promote specific distinct social behavior in users.' -YouTube description
RELATED Globacore "Globacore was founded by Ben Unsworth and Jeff Williams out of a passion for technology, art, and business. Globacore began as a web hosting and development company supporting the technical needs of sports apparel startup VC Ultimate and its associated companies."
"Globacore Inc. was officially incorporated in April 2003. In 2007 Ben and Jeff were joined by programmer Kyle Davis and graphic designer (and trail biker) Matt Venhuizen. In 2008, the Flexmaster Lukasz Ruminski joined us and continues to be a huge member of our team.Update: We'd like to welcome our newest team member: Developer Dave Rapin!"
Globacore Interactive Technologies 67 Mowat Ave. Suite 431 Toronto, ON M6K 3E3 Phone: +1 (416) 925-2082 Tollfree: +1 (888) 309-7658 Fax: +1 (647) 477-3270 General Email: info@globacore.com
Natural User Interface Asia Pte Ltd (Asia office)
12 Little Road
#06-01 Lian Cheong Industrial Building
Singapore 536986
Phone: +65 6748 5245
Fax: +65 6748 7961
FINGERTAPPS Contact: James Bell-Booth
james[at]fingertapps.com
+64 21 226 6399 www.fingertapps.com/
GUNZE USA Product Overview Contact: Gunze Electronics USA 2113 Wells Branch Parkway, #5400 Austin, Texas 78728 Tel: 512-990-3400 Fax: 512-252-1181
LM3LABS Website: http://lm3labs.com/ENG/index_eng.html info@lm3labs.com Europe: Antipolis Innovation Campus 300, route des Cretes 06560 Sophia-Antipolis France Hong Kong, China +852-8198.8406 Australia +61(02)8011.3342 USA: +1(646)461.6146 TGarden Square 409 3-5-26 Funabori Edogawa-ku 134-0091 Tokyo, Japan +81(0)3.3877.0779
Outdoor Media Company (China) Products Contact: Kuo Lin (?) 42x46 Multi-touch LCD Display
HIWAD TOUCH - Beijing Yhjn Science and Technology Co. LTD
Interactive Sandbox "An interactive sand box or sand table for use in tactical training includes an automated control system and image generator so that infinitely variable visual images may be projected onto the contoured surface of the sand for training and preparation for action in the military field and disaster control, industrial planning, environmental protection, and similar situation."
Interactive Holographic Touch Screen
Beijing Yihejianing Science and Technology Co., Ltd
Address: Room 621, Jinyuan Modern shopping Mall, NO.1 Yuanda Road, Haidian District, Beijing, China
Tel: 8610-88892897, 88892895
Fax: 8610-88892867-801
Email: hxrong2001@hotmail.com http://www.cnhci.cn