I subscribe to InteractiveTV Today, which focuses on news and information about interactive multi-platform television. Brian David Johnson, an Intel futurist, writes the Jet Set column in ITVT. His most recent column, Get Your Geek On, provides a good overview of Intel's Developer Forum (IDF).
In my opinion, the best part of the article was Johnson's links to his new book, Screen Future: The Future of Entertainment, Computing, and the Devices We Love, and to a video interview with key players in the future of interactive television and media. About Screen Future: "Screen Future is about the people, technology, and economics that are shaping the evolution of entertainment. Blending social and computer sciences, media history, and engaging conversations with industry experts, author Brian David Johnson provides an informed and illuminating vision for what happens when TV and entertainment are transformed by the power and intelligence of computers."
A New Vision of Television

In the video below, Brian David Johnson discusses the future of TV with LeVar Burton and Mark Wolfe of Burton/Wolfe, and Asra Rasheed, CEO of RRKidz:
Futurist and Director, Future Casting and Experience Research
Intel Corporation
The future is Brian David Johnson's business. As a futurist at Intel Corporation, his charter is to develop an actionable vision for computing in 2020. His work is called "future casting"—using ethnographic field studies, technology research, trend data, and even science fiction to provide Intel with a pragmatic vision of consumers and computing. Along with reinventing TV, Johnson has been pioneering development in artificial intelligence, robotics, and using science fiction as a design tool. He speaks and writes extensively about future technologies in articles and scientific papers as well as science fiction short stories and novels (Fake Plastic Love, Nebulous Mechanisms: The Dr. Simon Egerton Stories and the forthcoming This Is Planet Earth). He has directed two feature films and is an illustrator and commissioned painter.
Brief background: I've been following developments in intelligent tutoring systems for a while, and find it interesting to see how researchers are combining artificial intelligence, learning theory, affective computing, and sensor networks to create applications that might prove to be useful and effective.
The advantage of using intelligent tutoring applications in some cases is that it provides students with additional support and feedback the moment it is needed, something that is difficult for teachers to provide to students in large classrooms. With the increase in use of smartphones and other mobile devices such as the iPad, there is a good chance that this sort of technology will be used to support learning anywhere, anytime.
Although most intelligent tutoring systems are geared for 1:1 computing, I think some components could be tweaked and transferred to create intelligent "tutoring" systems for collaborative learning. Students like game-based learning, and what could be more fun than playing AND learning with a partner or group of peers? (I plan to revisit the research in this area in an upcoming post.)
Some thoughts: I envision a system that could support learning as well as skills useful to students in life beyond the school walls, such as positive social interaction, teamwork, and problem-solving. The path of least resistance? Most likely applications that support pairs or small groups of students working at one display. However, in this era of the "21st Century Learner", there is a growing need for applications that can support small groups of students engaged in collaborative and project-based learning activities.
A few applications have been developed for collaborative learning activities around a multi-touch table, such as the SMART Table or the Microsoft Surface, and more are needed. Also needed are intelligent systems that can support video conferencing and collaborative learning between students who are not physically co-located.
Some problems have yet to be solved. For example, the use of multiple sensors in an application designed for young people might be too intrusive, and there are serious issues related to privacy and security. Who would have access to data regarding a student's emotional or physiological state? How would this data be utilized? How would it be protected? Many school districts have security vulnerabilities, so this information could be misused if it fell into the wrong hands.
Below I've highlighted two "intelligent" tutoring systems that incorporate sensors in one form or another to generate information about student learning in a way that simulates what good teachers do every day. The ClassmateAssist application was developed by researchers at Intel, in collaboration with several universities. The Wayang Outpost application was developed by researchers at UMass Amherst, and is aligned with the principles of Universal Design for Learning.
CLASSMATE ASSIST
ClassmateAssist is an application developed by Intel's Everyday Sensing and Perception team. Here is the description of the application from Intel Research: "The advent of 1:1 computing in the classroom opens the door for teachers to set up individualized learning for their students who have a wide spectrum of interests and skills. ClassmateAssist technology uses computer vision and image projection to assist and guide students in a 1:1 learning environment, helping them to independently accomplish tasks at their own pace, while at the same time allowing teachers to be apprised of student progress."
In the following video, Richard Beckwith, a developmental psychologist at Intel, demonstrates a prototype that uses video sensing to track a student's hand movements during a coin-sorting lesson. The application provides feedback to the student and tracks data about the student's progress that can be transformed into a report for the teacher. The system can also monitor the student's facial expressions, note attention levels, and provide feedback.
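To make the feedback loop concrete, here is a minimal sketch of the kind of logic such a system might run once its vision module has detected a placement. This is purely illustrative and is not Intel's ClassmateAssist code; the coin-sorting task, function names, and report format are all my own invention.

```python
# Illustrative sketch only -- NOT ClassmateAssist code. A vision module is
# assumed to emit (coin, bin) placement events; the tutor checks each one,
# gives the student immediate feedback, and logs results for the teacher.
from dataclasses import dataclass, field

COIN_VALUES = {"penny": 1, "nickel": 5, "dime": 10, "quarter": 25}

@dataclass
class ProgressReport:
    correct: int = 0
    errors: list = field(default_factory=list)

    def summary(self) -> str:
        total = self.correct + len(self.errors)
        return f"{self.correct}/{total} coins sorted correctly"

def check_placement(coin: str, bin_label: str, report: ProgressReport) -> str:
    """Return feedback for one detected placement and log it for the teacher."""
    if coin == bin_label:
        report.correct += 1
        return f"Nice! The {coin} goes in the {bin_label} bin."
    report.errors.append((coin, bin_label))
    return f"Look again: a {coin} is worth {COIN_VALUES[coin]} cents."

# Simulated placement events from the (assumed) vision module:
report = ProgressReport()
for coin, bin_label in [("penny", "penny"), ("dime", "nickel"), ("quarter", "quarter")]:
    print(check_placement(coin, bin_label, report))
print(report.summary())  # -> "2/3 coins sorted correctly"
```

The real system's value, of course, is in the computer-vision front end and the planning model (the POMDP work cited below); the point here is just how per-event feedback and a teacher-facing report can fall out of the same event stream.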
Theocharous, G., Beckwith, R., Butko, N., & Philipose, M. (2009). Tractable POMDP Planning Algorithms for Optimal Teaching in "SPAIS". International Joint Conference on Artificial Intelligence (IJCAI) Workshop on Plan, Activity, and Intent Recognition (PAIR), Pasadena, California, July 2009.
Wayang Outpost: Web-based Interactive Math/Intelligent Tutoring System, with Sensors.
I've followed the work of Beverly P. Woolf and her colleagues for some time. Much of their research has centered on Wayang Outpost, a web-based intelligent tutoring system that incorporates multimedia and animated adventures while providing activities designed to prepare teens for standardized math tests, such as the SAT and state-mandated end-of-course exams.
In recent years, the team has been using non-invasive sensors in their research: a camera that views facial expressions, a posture-sensing device located in the seat of the student's chair, a pressure-sensitive mouse, and a wireless skin conductance wristband. Data collected through these sensors can provide useful information about student learning. The system can also note when students try to "game" the system.
Related Publications
Woolf, B.P., Arroyo, I., Muldner, K., Burleson, W., Cooper, D., Dolan, R., & Christopherson, R.M. (2010). The Effect of Motivational Learning Companions on Low Achieving Students and Students with Disabilities (pdf). International Conference on Intelligent Tutoring Systems, Pittsburgh.
Abstract: "We report the results of a randomized controlled evaluation of the effectiveness of pedagogical agents as providers of affective feedback. These digital learning companions were embedded in an intelligent tutoring system for mathematics, and were used by approximately one hundred students in two public high schools. Students in the control group did not receive the learning companions. Results indicate that low-achieving students—one third of whom have learning disabilities—had higher affective needs than their higher achieving peers; they initially considered math problem-solving more frustrating, less exciting, and felt more anxious when solving math problems. However, after they interacted with affective pedagogical agents, low-achieving students improved their affective outcomes, e.g., reported reduced frustration and anxiety."
Arroyo, I., Cooper, D.G., Burleson, W., Woolf, B.P., Muldner, K., & Christopherson, R. (2009). Emotion Sensors Go To School. AIED 2009, pp. 17-24. IOS Press.

Low-tech description of Wayang Outpost, the math application used in the above publication: Paul Franz, Recoder.Com, 5/16/09

Cooper, D.G., Arroyo, I., Woolf, B.P., Muldner, K., Burleson, W., & Christopherson, R. (2009). Sensors Model Student Self-Concept in the Classroom (pdf). UMAP 2009, UMass Amherst, June 22, 2009.
The following video is an "Animate" of a talk by creativity expert Sir Ken Robinson at the RSA (Royal Society for the encouragement of Arts, Manufactures, and Commerce). How do we educate our children to take their place in the economies of the 21st century, given that we can't anticipate what the economy will look like at the end of next week?
FYI: An Animate is a video in which a talented illustrator draws images related to the content of a speaker's presentation. (It is a great way to engage visual thinkers, in my opinion.)
The video explains it all.
RELATED
The following video is the longer original presentation by Sir Ken Robinson, responding to the question about how change can happen in education, and what we might do to make it last:
"For over 250 years the Royal Society for the encouragement of Arts, Manufactures and Commerce (RSA) has been a cradle of enlightenment thinking and a force for social progress. Our approach is multi-disciplinary, politically independent and combines cutting edge research and policy development with practical action.
- Our projects generate new models for tackling the social challenges of today.
- Our work is supported by a 27,000 strong Fellowship - achievers and influencers from every field with a real commitment to progressive social change."
Josh Blake is the Tech Lead of the InfoStrat Advanced Technology Group in DC. He has been creating multi-touch applications for Microsoft's Surface multi-user tabletops for a while. Recently, his team built a suite of applications designed for use by young children at a museum. Below is a video demonstration of some of this work. It looks really exciting!
Here is artist Steven Lopez, sharing his NextWindow experience:
Here is a video from NextWindow's "Digital Art Day"- watch how four artists use various tools, including brushes, to interact with a NextWindow screen:
“A common method for creating digital art has incorporated pen-based tablet devices, connected to a PC. This approach required artists to look down at the tablet while they drew and look up at the image that was developing on the larger screen. Although artists adapted to this tech-heavy approach, it did not come close to the sensory experience of sketching on paper with pencil or using acrylics and watercolors to create a vast array of effects on canvas. Now, there are sophisticated touch screen PCs and applications that are helping artists bridge the "digital divide" between traditional art-making and computer-based art. A group of visual artists visited the NextWindow offices in 2010 for the opportunity to play with a touch PC and see how it affected the creative process.” -NextWindow/Polly Traylor
I'm playing around with my sea life video clips to get ideas for creating interactive educational videos in the future for use on interactive whiteboards and other large touch-screen surfaces.
Here's my YouTube description:
This is a medley of several video clips of sea life, mostly living in aquariums. Awesome jellyfish can be viewed from 1:34 to 3:45. The music consists of clips from iMovie, followed by William Orbit's "Sea Green" and "Surfin". I created this video for students I work with who have more complex disabilities, such as severe autism.
Earlier this week, I shared my wish to attend the Playful Interface Cultures exhibition at the Ars Electronica Festival. Since I can't get to Linz, Austria to see the festival first hand, I'll share a video of the iRiS project that is part of the exhibition and festival - the link to the video was sent to me by Johannes Schoening, one of the iRiS project team members. iRiS stands for Immediate Remote Interaction System, and is the outcome of a joint research project of the University of Saarbrucken and the University of Munich, in Germany.
If you happen to be at the festival, you can see the project in action between 10 and 11 P.M. on September 5th and September 6th.
I am always inspired by Bill Buxton's words of wisdom. If you haven't heard of him, know that he's been around for a very long time, coming to the world of computer science and IT through his passion for music. According to his on-line biography, "Bill Buxton is a relentless advocate for innovation, design, and - especially - the appropriate consideration of human values, capacity, and culture in the conception, implementation, and use of new products and technologies. This is reflected in his research, teaching, talks, and writing - including his column on design and innovation for BusinessWeek.com, and his 2007 book, Sketching User Experiences."
A Little Tale about Touch

"It's about the physical and the social context where it is happening" (Bill Buxton)
RELATED

“The only true voyage of discovery is not to go to new places, but to see the world through different eyes.” - Bill Buxton, quoting Proust (1913, Remembrance of Things Past)
Bill Buxton: After the Show (Interviewed by Caroline Goles, Worldwide Partner Group Business Manager)
"The thesis of this chapter is that we should consider technology in terms of the fidelity with which it reflects human capabilities on three levels:
* physical: how we are built and what motor/sensory skills we possess;
* cognitive: how we think, learn, solve problems and what cognitive skills we possess;
* social: how we relate to our social milieu, including group structure and dynamics, power, politics, and what social skills we possess.
Our metaphor is one of three separate mirrors, each reflecting one of these levels. In order to be judged acceptable, designs must provide an acceptable degree of fidelity in how they reflect each of these three aspects of human makeup and activity. The benefit is in how the model can provide a simple but valuable test that can be used during the design process. We now look at each of these mirrors in turn."
One of the schools I serve as a school psychologist has a number of students with severe autism. Over the past year or so, each classroom was outfitted with a new interactive white board (IWB). We've found that multimedia content displayed on these large screens is much more engaging than what is viewed on computers with much smaller displays.
The exciting news is that some of the students who have minimal interest in computers pay a great deal of attention to activities on the IWBs. Students who have self-stimulatory or repetitive behaviors that interfere with their ability to engage in traditional activities don't seem to exhibit these patterns nearly as much when the focus is on the IWB. In my opinion, IWBs are great tools for reaching and teaching young people who have autism spectrum disorders!
I'm in the process of creating a variety of short video clips that teachers can embed in learning activities that also provide a way for students to reduce their levels of anxiety, agitation, and/or repetitive behaviors.
The following videos are my first experiments, and are not as polished as I'd like. They are best viewed on a large-screen display or IWB, set to high definition. Various students have viewed these videos a few times this week, and they were a great hit. In some cases, we found it useful to loop a video, especially for students who require repetition of content.
The videos were shot using a small hand-held Panasonic HD camcorder, and quickly edited in iMovie. The music was either taken from the iMovie music library or created using riffs in Garage Band, an Apple iLife product.
Enjoy!
Lily Pond and Music
In this video, I used a few subtitles to direct the viewer to points of interest, such as the little grasshopper hiding in the pink lily and a dragonfly, which appears near the end of the video.
Butterflies and Flowers Set to Music
I set this version to piano music for a student who listens to piano music as a coping strategy. I plan to create another version with other genres of music. This particular score was created with piano riffs from Garage Band.
Up Close at the Charleston Aquarium - with relaxing music
Even the most inattentive students paid maximum attention to this video when it was looped! I think they liked the variety of sea life, especially the turtles. Tip: If you plan to capture video at an aquarium, visit at a time when there are few visitors!
I can't wait to take my video camera to the Atlanta aquarium.
Minnows and Music
The minnows swimming around in the murky green tank are a little boring, but things get slightly more exciting when the bait-shop owner feeds them. The music makes up for what the video lacks. The students didn't mind at all.
Jellyfish at the Discovery Place Aquarium, Charlotte, NC - with music
I'd like to visit Discovery Place at a less-crowded time and re-capture the jellyfish in action from a better vantage point.
UPDATE
I'd like to see if there are similar videos set to music by William Orbit.
I found this video on YouTube of William Orbit's "Sea Green", set to video created by a fan:
Below is "Surfin", great for chilling - needs some video...
Interactive off-the-desktop technologies are taking off in many regions, including the Carolinas. I'd like to share some of this "post-WIMP" goodness with my readers. There are lots of interesting things developing in my own backyard!
About a year ago, I wrote a post about T1 Visions, the company behind the interactive technology at the T1 Tapas restaurant in Huntersville, NC. In May of this year (2010), Mike Feldman, the president of T1 Visions, LLC, and inventor of the T1 Connection Booth, participated in a panel discussion about innovative technologies for the restaurant sector at the National Restaurant Association (NRA) Show in Chicago, Illinois.
The T1 Connection Booth is an integrated system that includes a durable interactive multi-user touch screen table, powered by a Mac. It also includes a high-quality audio system and a high-definition video screen. It can accommodate a variety of mobile devices, such as iPods, iPhones, MP3 players, and laptops. If you have digital photos, you can easily connect to the system and view them on the video screen.
The T1 Connection Booth was designed for use in restaurants, bars, and nightclubs. In the Charlotte area, booths can be found at Harper's Restaurant in South Park, and soon will be ready for action at a new restaurant, Cowfish. According to a press release, the system is more affordable than the Microsoft Surface.
The T1 Connection Booth looks like it could be repurposed for a variety of settings. I've experienced the booth "hands on" a few times while visiting T1 Tapas, and I'm sure that it has potential for use in museums, libraries, and other public spaces. T1 Visions also provides high-definition display systems that coordinate with the T1 Connection Booths. T1 Visions' displays and digital signs are in use at the newly opened NASCAR Hall of Fame museum, and also at the Charlotte Convention Center.
T1 Visions Touchscreen Experience
2009 Five Ventures Conference: T1 Visions, Inc.
The TechnoFiles, CNN Video Feature of T-1 Vision's High Tech Restaurant, T1 Tapas (2009):
Previous IMT post about T1 Tapas, July 16, 2009:

T1 Tapas, a restaurant north of Charlotte, N.C., in Birkdale Village in Huntersville, has majority owners with a technology background. Mike Feldman and Jim Morris started up Digital Optics Corporation, which focused on optics for computers and imaging; after they sold that company, they teamed up with Denise Feldman to establish T1 Visions.

T1 Connection Booth with Multi-User Touch Screen, HD TV, Sound System, Computer, & more: "T1 Connection Booth seating gives you access to music, photos, and videos through built-in touchscreen tabletops, brilliant monitors, speakers, and computers." - Picture and text taken from the T1 Tapas website

The restaurant serves as a test bed for T1 Visions to try out software and hardware designed to enhance the digitally connected dining experience. The restaurant was featured on CNN in May of this year.
I recently came across the INAVATE website, and I've found several articles on topics that I'm sure will interest my blog readers. The article that intrigued me today was about the AquaLux 3D project by researchers at Carnegie Mellon University. In the videos, you can see how text, video, and images can be created with light and water droplets. The researchers used this technology to create a multi-dimensional Tetris game!
A Multi-Layered Display with Water Drops (non-technical version)
"Using a projector and rows of water drop emitters, we create a multi-layer display that can be used for text, videos, and interactive games."
How the Water-drop Multi-display Works: SIGGRAPH 2010 Submission:
Is your product, service, or company awesome in the eyes of your beholders? Do you help foster "self-awesomeness" among your users, customers, readers, or audience?
The wisdom of a social media consultant won't sustain core innovation. In this video, Kathy Sierra talks about how to reverse engineer sustainable passion. What superpowers do you give your users? Watch the video to learn about a model that helps people get better.
The video is about an hour, but worth your time:
RELATED
Kathy Sierra is involved with the Creating Passionate Users website, where the bloggers "are all fascinated by brains, minds and what science can tell us about the practice of making users passionate about their lives and tools." She is one of the co-creators of the Head First books from O'Reilly. I first learned about Kathy Sierra when I was taking a Java programming class online and needed more resources. She created the JavaRanch website, "a friendly place for Java greenhorns."
By the way, the Head First books are great resources for people who are tech-curious and have an urge to learn to code. I love Head First books and have several on my reference shelves.
Intel's Oasis system uses object recognition to trigger applications that generate such things as shopping lists and recipes. The system can handle more than one item of food. It includes videos of how to prepare meals, a great feature for people just starting out, or those who are learning to prepare healthier meals.
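To give a feel for the recognition-to-suggestion flow described above, here is a tiny hedged sketch. It is not Intel's Oasis code; the recipe data, function names, and ranking rule are invented, and the real system's hard part (the computer-vision recognition itself) is simply assumed to have already produced a set of item names.

```python
# Illustrative sketch only -- NOT Oasis code. Given items a (hypothetical)
# recognizer has identified on the counter, rank recipes by how few
# ingredients are still missing, and turn the gaps into a shopping list.
RECIPES = {
    "stir fry": {"chicken", "broccoli", "soy sauce"},
    "omelette": {"eggs", "cheese"},
}

def suggest(recognized: set) -> list:
    """Return (recipe, missing ingredients) pairs, fewest missing first."""
    ranked = [(name, needed - recognized) for name, needed in RECIPES.items()]
    return sorted(ranked, key=lambda pair: len(pair[1]))

for name, missing in suggest({"eggs", "chicken", "broccoli"}):
    print(f"{name}: add {sorted(missing)} to the shopping list")
```

The handling of multiple items falls out naturally here: recognition yields a set, and every recipe is scored against the whole set at once.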
In my opinion, this sort of application would be useful to people with disabilities that affect memory.