"The SMART Table interactive learning center lets students get hands-on with collaborative activities – and the more hands the merrier. The SMART Table is a versatile learning center where groups of students can create, explore and discover together on the durable screen. The interface is so intuitive that even the youngest child can play games and plunge into learning activities in no time. You can choose from a wide variety of ready-to-go lesson activities, or customize them to suit your own lessons."
I know that Gates predicts that every surface will one day be a computer. Maybe his new company will be addressing this concept behind the scenes? "Federal trademark filings provide more clues – describing bgC3 as a think tank, under a generic trademark classification that corresponds broadly to areas including "scientific and technological services," "industrial analysis and research," and "design and development of computer hardware and software."...The “bg” is Bill Gates, the insider says, but the “C” stands for “catalyst.” The idea is that Gates will play that role as he brings together new people and ideas." For more information about surface computing, read my previous post, "The Internet of Surfaces? Microsoft's Pete Thompson discusses screens and surfaces of all sizes"
If you've heard more about the focus of bgC3 LLC, leave a comment!
Food for thought: "While a child using Surface for the first time will tend to use his or her whole hand to interact with objects, adults learn to use just a fingertip because they quickly realize that essentially the computer is only recognizing a single point for each "touch." "
CNET video about Microsoft's surface - tabletop computing:
According to the Ambient Information Systems Workshop (Ubicomp 08), ambient info systems are "a large set of applications that publish information in a highly non-intrusive manner, following on from Mark Weiser's concept of calm technology."
"The current research in pervasive and ubiquitous computing suggests a future in which we are surrounded by innumerable information sources, all competing for our attention. These information sources may manifest as both novel devices and as devices embedded in common objects, such as refrigerators, automobiles, toys, furniture, clothes, and even our own bodies."
"While this vision of the future has prompted great advancements in context-aware computing, wireless connectivity, multi-sensor platforms, smart materials, and location-tracking technologies, there is a concern that this proliferation of technology will increasingly overwhelm us with information. Our belief is that information should move seamlessly between the periphery and the center of one’s attention, and that good technology is highly transparent. We see ambient information systems as a way to support these ideas."
Ambient Information Systems Pics linked from Infosthetics:
Real-time data panoramas: "Once the stock market opens, our 3D simulation comes to life & people start 'breathing' business information" via Bashiba.com
" BASHIBA Panorama exploits the visual perceptual capabilities of the human brain. It harnesses untapped brain power."
Ambient Devices Energy Joule, via Ambient Devices: "Save Money. Help the Planet. Track energy prices and your energy use with a simple night light." This device provides you with the weather forecast for the day, the day's high temperature, the current cost of electricity, your current energy usage, rewards for using less energy, and signal strength.
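The ambient-display idea behind a device like the Energy Joule can be boiled down to mapping a continuous value onto a few glanceable color bands. Here is a minimal Python sketch of that mapping; the thresholds and color names are my own illustrative assumptions, not Ambient Devices' actual values:

```python
def price_to_color(cents_per_kwh, low=8.0, high=15.0):
    """Map an electricity price to a glanceable color band.

    The low/high thresholds (cents per kWh) are illustrative
    assumptions, not the device's real calibration.
    """
    if cents_per_kwh < low:
        return "green"   # cheap: a good time to run the dishwasher
    elif cents_per_kwh < high:
        return "yellow"  # average pricing
    return "red"         # peak pricing: shed load if you can

print(price_to_color(6.5))   # green
print(price_to_color(12.0))  # yellow
print(price_to_color(18.0))  # red
```

The point of the design is that no numbers need to be read at all: the information sits in the periphery, exactly as the calm-technology idea above suggests.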
Affective Diary:
The sensor-based Affective Diary is a collaborative project between Microsoft Research and the Interaction Lab at SICS. You can download the code from the application page. The source code is available as a Visual Studio 2005 solution. Here is a screenshot:
"To expand on the ways in which we creatively engage in diary-keeping, we have designed an affective diary that captures some of the physical, bodily aspects of experiences and emotions—what we refer to as “affective body memorabilia”. The affective diary assembles sensor data, captured from the user and uploaded via their mobile phone, to form an ambiguous, abstract colourful body shape. With a range of other materials from the mobile phone, such as text and MMS messages, photographs, etc., these shapes are made available to the user. Combining these materials, the diary is designed to invite reflection and to allow the user to piece together their own stories."
Ambient Persuasive Mug, via Ads of the World: "The external surface of the cup is printed with a second layer of heat sensitive ink that is revealed when hot water is poured into the cup."
"a system comprised of a series of physical objects designed as individual playthings, but wirelessly networked via RF to act as both input and output devices for a collective visualization of distributed activity. These hand-held, translucent silicone toys have embedded sensors (for input) and 3 colors of LEDs (for output) which allow them to be reactive to both sound and touch. Action around one of the nimios will cause the others to glow in different patterns and colors. The interaction design is deliberately open-ended, in order to allow the emergence of distinctive patterns of collaborative engagement in real groups." Interactive Waterfall Interactive waterfall -Charles Forman, from Setpixel. (includes a video with nice ambient music.) The project was produced for the Children's Center at the Hackensack University Medical Center.
Wearables from the Reach project at the Interactive Institute, Design Goteborg: Scarf that reveals messages when it heats up.
Temperature changing scarves
Bag with sensors that measure sound level, light, and temperature.
The atracTable is a multi-touch presentation system developed by the Swiss engineering and development group, Atracsys. It is similar to Microsoft's Surface. Interaction on the table can be triggered by laying objects on the table.
From the Atracsys Website - "How does it work": "atracTable is the combination of a video-based movements tracking system, a computer, a beamer [projector], and a screen.
When you lay an object on the screen, the tracking system recognizes the object with a visual tracking tag on the base of the product. At the same time, the tracking system detects the positions and movements of your fingers and of the objects on the screen.
The whole pieces of information concerning the product(s) and the different locations and movements are sent to the computer. The data is processed and sent back to the beamer.
The processing is performed instantaneously. The real-time interaction is obtained by continuous detection of fingers and objects movements. The whole technology fits in the table and is invisible for customers".
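The detect-and-project loop Atracsys describes depends on mapping points from camera coordinates into projected-screen coordinates. A minimal sketch of that calibration step, assuming a simple linear scaling (real trackers use a full homography to correct perspective distortion, and the resolutions here are illustrative assumptions):

```python
def camera_to_screen(cam_x, cam_y, cam_res=(640, 480), screen_res=(1280, 960)):
    """Map a tracked point from camera coordinates to projector
    ('beamer') coordinates with a simple linear calibration."""
    sx = screen_res[0] / cam_res[0]   # horizontal scale factor
    sy = screen_res[1] / cam_res[1]   # vertical scale factor
    return (cam_x * sx, cam_y * sy)

# A finger seen at the centre of the camera image lands at the
# centre of the projected screen:
print(camera_to_screen(320, 240))  # (640.0, 480.0)
```

Every frame, each detected fingertip or tag position is pushed through a mapping like this before the application redraws, which is what makes the projected graphics appear to track your hand.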
Another creation by Atracsys is beMerlin, a gesture-based interactive system that plays out as an interactive window. Although it is used for visual merchandising, it looks like it has potential for other uses, such as wayfinding, building directories, interactive museum exhibits, and education.
When I first started writing about interactive touch-screen technology and multi-touch, there really wasn't much going on. Things have changed over the past couple of years.
Below is a video clip, via Touch User Interface, of Paper Four, a capacitive touch paper developed by researchers at Mid Sweden University. Sensing electrodes are printed on the paper with conductive ink:
Natural User Interface will be demonstrating multi-touch applications at the upcoming DreamHack Winter 2008 festival November 27-30. DreamHack is known as the world's largest LAN party and computer festival. For more information, see my post on the Technology Supported Human World Interaction blog.
Multi-touch & Surface Computing Thoughts:
Despite the downturn in the global economy, I truly feel in my heart that collaborative multi-touch, or surface computing, will be an important "movement", on many levels. I think that it will enable people to gain a deeper understanding and enjoyment of ideas & concepts.
It is a new way of expression and communication, with the potential to support learning, exploring, music, art, cooperation, gaming, and more effective information visualization & data-driven decision making.
Not too many people really "get" the concept of multi-touch - surface computing. Yet.
Those who do seem to be creative, forward thinking, thoughtful, and optimistic. Important characteristics to have during these trying times!
Long Zheng, from the I Started Something blog, was privy to a survey from Microsoft about "Oahu" (via someone named Kerien).
The following description of Oahu is a quote from Long Zheng's website, and reportedly was the introductory section of Microsoft's survey: "The following questions refer to a computing device called “Oahu” that has an innovative multi-touch screen. Oahu is a flat screen that sits horizontally like a table top. You can interact with Oahu by touching the screen, instead of using a mouse, and more than one person can interact with Oahu at the same time. You and others can move objects on the screen with your hands and touch icons to open up programs, games, or music. People using the device can also use their fingertips to expand and shrink objects on the screen. The screen recognizes people’s hand movements and touches and reacts accordingly. You can bring up an on-screen keyboard to input information. Oahu also works with other devices (such as digital cameras, cell phones, and MP3 players) by getting information from or sending information to them. Oahu is on with no waiting time to start up. Oahu can come as a freestanding table, placed into a piece of furniture, or built into a countertop. The type of Oahu devices we are asking about today are not portable but if they are furniture or tables, they can be placed anywhere in your home." mmmm.... sounds just like a Surface....
The price of Oahu quoted in the survey? $1,499.00. A substantial savings, considering that the price of Microsoft's Surface is $10,000.00.
This price approaches the affordable range for schools. I wonder if any questions in the survey addressed the learning aspects of the Oahu, other than helping children with homework. With the upcoming Windows 7 OS and its multi-touch capabilities, I'm sure we'll be seeing the spread of this technology.
FYI: Long Zheng is working on a Business Commerce and Multimedia Systems double degree at Monash University in Australia. His purpose in blogging is to stay on the cutting edge of breaking technology news.
This is a picture of Gesturetek's interactive multi-touch installation for the NASA Moon Station exhibit, at the Marshall Space Flight Center in Huntsville, Alabama. Inhance Digital Corporation collaborated with Gesturetek on this project to create a multi-user game that teaches players about one of NASA's lunar outposts.
Read the press release for more information and links related to Gesturetek's multi-touch, multi-point display. Inhance Digital Corporation's website is worth the visit. The company is involved with a variety of interactive projects, including the "Connected Home", in cooperation with Cisco Systems.
According to an article in the Wall Street Journal (10/16/08), Hewlett-Packard will unveil a touch-screen notebook. Frog Design, the company that developed the touch-screen browser interface on the HP TouchSmart PC, is working on this product.
To learn more about Frog Design and HP touch-screen technology, go to the source and read this post on GeekTieGuy's blog: The cat's finally out of the bag.
I happen to love my HP TouchSmart. (Secret: It can handle at least dual-touch input, since it has a NextWindow touch screen interface.) More from NextWindow:
103-inch display for election coverage in New Zealand.
Collaborative project between NextWindow and Nemes for a museum application.
It is good to see more touch screen technology in museums... Next step, the classroom?
I have a TouchSmart, and if I wasn't so busy at my job, I would love to spend my time digging into the core of the system.
Thanks, RKVS Raman, for your explanation about the way to identify the NextWindow touch screen as an input device in the HP TouchSmart PC using UbuntuStudio. This was something I was curious about.
Here is RKVS Raman's "how-to", from his RKVS Raman Blogs: "This gotcha explains how we made it work.
1) Installed xserver-xorg-input-evdev.
2) The HP TouchSmart uses a NextWindow TouchScreen. The tricky part was to find which input device it was connected to. A cat of /proc/bus/input/devices pointed wrongly to event2.
3) After googling a lot, we found evtest. Downloaded evtest.c and compiled it.
4) Ran evtest as root with various input devices, like /dev/input/event[1...X].
5) One of the runs showed the input device as "NextWindow TouchScreen". That was the one. The device was /dev/input/event4.
Input driver version is 1.0.0
Input device ID: bus 0x3 vendor 0x596 product 0x1 version 0x200
Input device name: "NextWindow TouchScreen"
Supported events:
  Event type 0 (Reset)
    Event code 0 (Reset)
    Event code 1 (Key)
    Event code 3 (Absolute)
  Event type 1 (Key)
    Event code 330 (Touch)
  Event type 3 (Absolute)
    Event code 0 (X) Value 0 Min 0 Max 32687
    Event code 1 (Y) Value 0 Min 0 Max 32687
6) Made the necessary changes to our xorg.conf according to this page. Changed /dev/input/event1 to the device that our evtest runs found connected to the touchscreen (/dev/input/event4).
7) Saved xorg.conf and rebooted. And Voila, touchscreen works.
Now that touchscreen works, let me load Sugar, it's gonna be fun."
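Raman's trickiest step, figuring out which /dev/input/eventX node belongs to the touchscreen, can also be done by parsing /proc/bus/input/devices directly: each device block has an "N: Name=" line and an "H: Handlers=" line listing its eventX handler. A hedged Python sketch of that lookup, run here against an illustrative sample rather than the real file:

```python
def find_event_node(devices_text, name_fragment):
    """Scan a /proc/bus/input/devices-style listing for a device whose
    N: Name line contains name_fragment; return its eventX node path."""
    current_name = ""
    for line in devices_text.splitlines():
        if line.startswith("N: Name="):
            current_name = line                      # remember the device name
        elif line.startswith("H: Handlers=") and name_fragment in current_name:
            for token in line.split():
                if token.startswith("event"):
                    return "/dev/input/" + token
    return None

# Illustrative sample; on a real system you would read the file:
#   open("/proc/bus/input/devices").read()
sample = """\
N: Name="AT Translated Set 2 keyboard"
H: Handlers=kbd event1
N: Name="NextWindow TouchScreen"
H: Handlers=mouse0 event4
"""
print(find_event_node(sample, "NextWindow"))  # /dev/input/event4
```

This automates steps 2-5 of the how-to above, so the right device path can be dropped straight into xorg.conf.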
Sugar is the core of the One Laptop Per Child (OLPC) human interface. It can be emulated on a PC, but doesn't have all of the features of the original.
Of interest to computer students and armchair technologists:
RKVS Raman teaches data structures (and artificial intelligence?), at least from what I gather from his blog posts.
NUI (Natural User Interface) recently unveiled Gravitoy, a multi-touch physics application that offers two modes of interaction. Below is a picture of children engaged in collaborative play on an interactive table running Gravitoy in the Sandbox mode. The children can draw objects that interact using principles of physics. (Photos are from NUI co-founder Harry van der Veen's blog)
Below are table-shots of the Sandbox mode:
See the triangle?
The second mode of Gravitoy is "Stacks". Two players play a game of stacking objects of various sizes. The difficulty level can be adjusted by changing gravity options.
This would make science lab engaging for students!
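A physics sandbox like Gravitoy rests on a simple loop: integrate gravity each frame and resolve collisions. Below is a minimal sketch (not NUI's actual code) using Euler integration, with gravity as a parameter echoing Gravitoy's adjustable-gravity options:

```python
def step(bodies, gravity=9.8, dt=0.1, floor=0.0):
    """Advance each body one Euler-integration time step; bodies
    stop when they reach the floor."""
    for b in bodies:
        b["vy"] -= gravity * dt       # accelerate downward
        b["y"] += b["vy"] * dt        # move
        if b["y"] <= floor:           # crude floor collision
            b["y"], b["vy"] = floor, 0.0
    return bodies

blocks = [{"y": 5.0, "vy": 0.0}, {"y": 0.5, "vy": 0.0}]
for _ in range(20):                   # two seconds of simulated time
    step(blocks)
print([b["y"] for b in blocks])       # [0.0, 0.0] -- both have landed
```

Changing the gravity argument changes the difficulty of a stacking game in exactly the way the "Stacks" mode describes: lower gravity gives players more time to react.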
In addition, NUI developed a 3D model viewer application, which looks like it would be fun for budding architects and designers.
If you are interested in learning more about developing multi-touch applications and systems, visit the NUI Group website for open-source resources, code, and "how-to" support forums.
Video clips and links related to the work of NUI-Group members, such as Christian Moore, Harry van der Veen, and Seth Sandler, can be found on a previous post: More Multi-touch from Members of NUI Group!
Touchless Multi-touch... I recently learned of another open-source, touch-less "multi-touch" resource. The TouchlessSDK was developed as a college project by Mike Wasserman, a Columbia University student.
You will need Visual Studio 2008, the code, and a webcam to get started. From what I can tell, the code will work within the XNA development environment.
The TouchlessSDK website has links to a video demonstration, demo code, FAQs, and more. The discussion list has a lot of good information.
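From the description, camera-based touchless input boils down to tracking a marker in each webcam frame and treating its centroid as the cursor position. A toy Python sketch of that core idea, operating on a tiny grayscale "frame" (the TouchlessSDK itself is a .NET library and far more sophisticated; this only illustrates the principle):

```python
def track_marker(frame, threshold=200):
    """Return the centroid (row, col) of pixels brighter than
    threshold -- i.e., track a bright marker and use its centre
    as the pointer position."""
    rows, cols, n = 0, 0, 0
    for r, row in enumerate(frame):
        for c, value in enumerate(row):
            if value >= threshold:
                rows += r
                cols += c
                n += 1
    if n == 0:
        return None        # marker not visible this frame
    return (rows / n, cols / n)

# A 4x4 'frame' with a bright 2x2 marker in the lower-right corner:
frame = [[0, 0, 0, 0],
         [0, 0, 0, 0],
         [0, 0, 255, 255],
         [0, 0, 255, 255]]
print(track_marker(frame))  # (2.5, 2.5)
```

In a real pipeline you would run this (or a color-keyed version of it) on every webcam frame and feed the centroid into the same event stream a mouse or touch point would use, which is how "touchless" becomes "multi-touch" with several markers.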
For those of you who would like to know how to handle the stress regarding our economy as it relates to children, teens, and family, take a look at my recent post on the TechPsych blog: Children, Teens, Families, and the Economic Crisis
(I am still looking for more resources for children and families, so feel free to add a link in the comments.)
I posted a variety of links to data and information visualization resources that might be helpful to those of you who are interested in exploring our current economic crisis:
I'm using the Economic Sights and Sounds blog as a repository of information and links for use in a collaborative interactive multimedia time-line of events surrounding the current situation.
The above picture is from Jim Spadaccini's post about Ideum's multi-touch table prototype, using NUI (Natural User Interface) software, "Snowflake". Ideum is working on gesture-recognition software for use in future projects.
Ideum will be designing for museum installations, and will include user-testing as they work towards applications that will support collaboration among museum visitors as they interact around the exhibits.
Stimulant, a company that has partnered with Microsoft to develop applications for the Surface multi-touch table: "What do you get when you mash-up Microsoft Surface with a Wii Balance Board? Pressure-sensitive surface computing! Yes, the Surface is sitting directly on the Balance Board (it supports 600 pounds, we checked). Here, Josh demonstrates a simple application that lets users create bubbles of various sizes and roll them around the table using pressure on the edges of the Surface. You also get a sneak peek at the WPF/Silverlight physics engine we've been working on as well. Pressure sensitivity adds an extra dimension to the Surface experience and opens new doors on an already highly advanced platform."
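A Wii Balance Board reports the load on four corner sensors, so "pressure on the edges of the Surface" can be reduced to a tilt vector for rolling objects around. A speculative sketch of how such a mapping might work (Stimulant hasn't published their approach; the sensor names and scaling here are my own assumptions):

```python
def tilt_from_sensors(tl, tr, bl, br):
    """Turn four corner load-cell readings (top-left, top-right,
    bottom-left, bottom-right) into an (x, y) 'tilt' vector.
    Positive x means more weight on the right edge, positive y
    more weight on the bottom edge."""
    total = tl + tr + bl + br
    if total == 0:
        return (0.0, 0.0)   # nothing pressing: no tilt
    x = ((tr + br) - (tl + bl)) / total
    y = ((bl + br) - (tl + tr)) / total
    return (x, y)

# Pressing the right edge of the table:
print(tilt_from_sensors(10, 30, 10, 30))  # (0.5, 0.0)
```

An app could feed this vector into a physics engine as a gravity direction, so the bubbles roll "downhill" toward wherever users lean on the table.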
I'd like to learn more about Stimulant's other Surface experiments!
Raja Manohar, from Hexolabs sent me a link to his interactive YouTube + click games, and I thought I'd share it with my readers. Enjoy saving the car from a disastrous crash!
Hexolabs, located in India, is involved in a variety of projects, including mobile games for health.
As the economy spun out of control, I decided to gather information from a range of sources and disciplines, to help me understand the complexity of the events leading to the crisis.
I came across this interesting video clip of Audience, a "performance" of 64 mirrors programmed to turn together toward a passer-by, as if paying attention. The installation was created by Chris O'Shea, author of the Pixelsumo blog, and rAndom International, as part of an exhibit at Deloitte Ignite08, a three-day festival of performances and experiences created by artists and scientists at the Royal Opera House in London.
Wayne McGregor, the choreographer for the Royal Ballet, was the director of Ignite08. The following is McGregor's vision behind the festival.
"Explaining the ethos behind such a diverse festival programme, McGregor reveals he was influenced by Greek philosopher Aristotle’s definition of the five senses. ‘We experience everything through our senses and, although we are by nature sight-dominated, it is through a complex network and inter-relatedness of these multiple senses that we can perceive events, artistic endeavour included,’ he says."
The "What's On" section of the Ignite08 website includes links to short interviews and written profiles of the various Ignite08 participants.
For more information about Wayne McGregor, see Mind in Motion (Times).
Since I've been playing Spore and thinking about the artificial life/genetic algorithm used to create the Black Shoals Stock Market Planetarium (see my previous post), I thought I'd visit Philippe Rennard's interactive genetic algorithms website. I first visited his site when I was taking a course in artificial intelligence for game design.
The first time I visited the site, I viewed it from a small laptop screen. This time, I explored the site on my 22-inch HP TouchSmart. What a difference! I can't wait until the school week to see how this looks on an interactive whiteboard.
Although the website was created several years ago, it is still a fun site to visit, especially when you watch all of your little things morph according to the variables you choose.
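The "morphing little things" on a site like Rennard's are driven by a genetic algorithm: score a population of candidates, keep the fittest, and reproduce them with mutation. A minimal Python version of that loop, matching bit strings against a target (the representation and all parameters are purely illustrative, not taken from Rennard's demos):

```python
import random

def evolve(target, pop_size=30, mutation=0.1, generations=200, seed=1):
    """Minimal genetic algorithm: evolve bit strings toward a target."""
    random.seed(seed)
    length = len(target)

    def fitness(ind):
        return sum(a == b for a, b in zip(ind, target))

    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == length:
            break                                   # perfect match found
        parents = pop[:pop_size // 2]               # truncation selection
        children = [[bit ^ (random.random() < mutation) for bit in p]
                    for p in parents]               # per-bit mutation
        pop = parents + children                    # elitism: parents survive
    return fitness(max(pop, key=fitness)) / length  # best fitness, normalized

print(evolve([1, 0, 1, 1, 0, 1, 0, 0]))
```

The "variables you choose" on an interactive GA site correspond to knobs like pop_size and mutation here; turning them changes how quickly (or whether) the population converges.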
Another website I recently revisited was the NERO website. NERO stands for Neuro-Evolving Robotic Operatives. The original NERO project was developed at the University of Texas at Austin a few years ago. The latest version can be found on the NERO 2.0 website.
If you have some time on your hands, visit the site and download the program. You use genetic algorithms to train soldiers through real-time neuroevolution. See what happens if you let them evolve overnight on your computer!
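In miniature, the keep-what-works loop behind neuroevolution looks like this hill-climbing sketch, which mutates the weights of a one-neuron network until it learns the AND function. (NERO itself evolves whole network topologies with the NEAT algorithm in real time; this only shows the core idea of searching weight space by mutation and selection.)

```python
import random

def neuro_evolve(samples, steps=2000, seed=0):
    """Hill-climbing in weight space: mutate a one-neuron network's
    weights and keep changes that score at least as well."""
    random.seed(seed)
    w = [random.uniform(-1, 1) for _ in range(3)]  # two weights + a bias

    def output(w, x):
        return 1 if w[0] * x[0] + w[1] * x[1] + w[2] > 0 else 0

    def score(w):
        return sum(output(w, x) == y for x, y in samples)

    best = score(w)
    for _ in range(steps):
        cand = [wi + random.gauss(0, 0.3) for wi in w]  # mutate weights
        if score(cand) >= best:                         # keep what works
            w, best = cand, score(cand)
    return best / len(samples)

# Teach the neuron logical AND:
AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
print(neuro_evolve(AND))
```

Scale the same idea up to populations of multi-layer networks scored on combat behavior, and you have roughly what happens when NERO soldiers "evolve overnight".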
"Evolve your own robot army by tuning their artificial brains for challenging tasks, then pit them against your friends' teams in online competitions" Somewhat Related
While I was at Rennard's website, I linked to his page about a book he edited in 2006 that I want to run out and read ASAP: "Handbook of Research on Nature Inspired Computing for Economics and Management". I think this book might help me with my quest to better inform myself about the events surrounding the economic situation.
I'm sharing what I find on my Technology Supported Human-World Interaction blog, using one of my recent posts to store my collection of pictures, media, quotes, articles, references, and links related to the quest. I hope to incorporate this, with the help of others, into a dynamic, interactive web-based time-line of the history we are living through right now. For now, here is the link to the "repository":
Take a look at my recent post on the TechPsych blog to learn more about the math activities created by Spencer Riley for use on interactive whiteboards, then visit his website. The activities are free!
The most frequently visited post on this blog is Interactive Multimedia for Social Skills, Understanding Feelings, Relaxation, and Coping Strategies, and the hits are increasing. For that reason, I'm working on an extensive update, which is very involved process, given the increase in research in this area. Until then, I thought that it would be helpful to provide a mini-update. Below are some resources and links for those of you who are interested in digital multimedia story telling or digital social stories with young people. Although some of the resources are specific to children or teens with autism spectrum disorders, I've also included information that is appropriate for use in regular classroom settings. (If this is your first visit to this blog, it might be a good idea to first check out the resources from the above link, and then return to this post.)
This nine-page primer is useful for anyone interested in learning how to create digital stories or develop digital storytelling activities with young people. The information was provided by David Jakes, an instructional technology coordinator for Community High School District 99 in Downers Grove, IL. It makes a good case for digital storytelling and outlines the process of implementing related activities at the high school level.
David Jakes has a website, Jakesonline.org, that contains additional resources about digital storytelling, including strategies for instruction. The website also provides information about collaborative tools and an extensive collection of web resources.
Process:
• Team identifies the need for behavior intervention.
• Functional assessment is completed.
• Social stories are included in the behavior plan.
• Social story is written.
• Social story is introduced and progress is monitored with data.
• Success is evaluated with data.
Note: The authors cite an article written in 1999 that mentions that the use of multimedia social stories has not been demonstrated to be effective, as little research had been completed in this area at the time. Take this with a grain of salt, as there are recent studies (see below) regarding the effective use of multimedia applications, including computer/video games and VR, for social skills training as well as counseling.
This link provides extensive information about on-line resources for programs that simulate social interaction. It also includes information about the use of social stories with students, and resources for putting together multimedia social stories. Included are some summaries of research about multimedia social stories and the use of multimedia for instructional activities.
Be sure to explore the rest of the CITEd site when you have the time.