Jul 28, 2011

How Algorithms Shape Our World: Kevin Slavin's thought-provoking TEDTalk video - great visuals!




"Kevin Slavin argues that we're living in a world designed for -- and increasingly controlled by -- algorithms. In this riveting talk from TEDGlobal, he shows how these complex computer programs determine: espionage tactics, stock prices, movie scripts, and architecture. And he warns that we are writing code we can't understand, with implications we can't control." -TED



An interesting fact shared during the presentation: 2,000 or so physicists work on Wall Street, engaged in algo-trading. Remember the "flash crash"?

OK Go's Interactive HTML5 Music/Dance/Message Video, Featuring Pilobolus - behind the scenes and more. Delight.

Not long after the earthquake and nuclear disaster in Japan, the creative team at Google Japan collaborated with the band OK Go and the dance troupe Pilobolus to explore the capabilities of HTML5 and multimedia, while at the same time giving people from all over the world a means to send a message to others, including the people of Japan.


I encourage you to visit the link to the interactive version of All Is Not Lost. For your convenience, here is the traditional version:



Before the video begins, viewers can enter a message in a text box.  At the end of the video, the message will appear, with a few options for sharing the message. At a loss for words, I quickly typed "I love summer", and was delighted to see how the dancers gathered to form my message, as pictured in the screen shot.


Quite a bit of creative thinking went into the making of All Is Not Lost, as described in the following article and video clips:
Behind the Work: Trish Sie on OK Go's "All Is Not Lost"
Shareen Pathak, Creativity, 7/27/11
"The trifecta also partnered with Google to create an HTML5-powered interactive experience, which leverages the Chrome browser to showcase the clip through 12 separate windows of dances that shift along with the music. Users can also type in messages in Roman letter or Japanese and watch the band create the letters with their bodies. The result? A visually arresting intersection of dance, technology and direction." -Shareen Pathak

A LOVE LETTER TO JAPAN....


THE MAKING OF ALL IS NOT LOST


RELATED
All Is Not Lost lyrics
OK Go's Human Kaleidoscope, All Is Not Lost, And How It Translates Into Sales
Teressa Iezzi, Fast Company, 7/28/11
O.K. to Go Dance Crazy
Tony Gervino, NY Times Magazine 7/31/11



Jul 27, 2011

Link: Google updating Chrome for Apple's Lion Multi-touch

Here are two quick links related to the Chrome update for Apple's latest operating system, Lion:

Google Updating Chrome for Lion Multi-touch
Stephen Shankland, CNET News, 7/27/11


RELATED
Chrome Dev Channel Release
7/26/11

Apple's iOS 5 facial recognition feature opens up interactive possibilities

I've been thinking about creating my first iPad app, and as I was searching for information, I came across a few articles related to Apple's new iOS 5 that I found interesting.  


Because my target user group includes young people with autism spectrum disorders (ASD), I was intrigued by the possibility that the facial recognition APIs might provide a means of assessing mood or emotional states. Most of us understand that our faces function as mirrors of our feelings, and we use our facial expressions to communicate those feelings to others. Unfortunately, this is a concept that is difficult for young people with ASD to grasp. My hunch is that there is an "app for that".


I'd love to create a little iPad app for young people with ASD for education, intervention, and/or communication activities that incorporates the facial recognition feature!


Apple's iOS facial recognition could lead to Kinect-like interaction
Darrell Etherington, GigaOm/Reuters, 7/27/11
Here is a quote from the above article:
"You could create apps that track a user’s eye movement and dynamically change content accordingly, for instance. App developers might even be able to use data gathered from facial recognition APIs to identify so-called “hotspots,” providing insight about where a user is looking most within an app and arranging content accordingly. In time, an iPhone app might even be able to assess the emotional state of the user, based on whether they’re frowning or smiling, and address the user in a manner appropriate to their mood. It might also be able to tell how engaged users are with mobile ads and content, which might be useful for iAd customers, among others."


Stan Schroeder, Mashable, 7/26/11
Below is a video from the above post which demonstrates an app developed by Polar Rose, a company that was purchased by Apple.
RELATED
Apple plans native panorama functionality in iOS5
Seth Weintraub, 9TO5Mac, 7/8/11
iOS 5's final release may include "Assistant" speech-to-text feature
Chris Rawson, TUAW, 7/23/11

Link to Innovative Interactivity's List: "Thirty multimedia people to follow on Google+"

Tracy Boyer Clark, founding manager and editor of Innovative Interactivity,  recently posted a nicely curated list of interesting people to follow on Google+.  


Thirty multimedia people to follow on Google+


Tracy is a fantastic resource for all things related to multimedia. Be sure to add her to the list. BTW, my Google+ Multimedia circle includes everyone on Tracy's list, and a few more.


Thanks, Tracy, for sharing this list!

Jul 26, 2011

Eric Sailers' Lists of iPad, iPod, and iPhone Apps for (special) education


Update: Here is a link to Eric Sailers' resources on Scribd:
http://www.scribd.com/doc/24470331/iPhone-iPad-and-iPod-touch-Apps-for-Special-Education



Update: I think Eric Sailers' blog has moved. For now, here is a link to his website:
http://ericsailers.com/iresources.html

Eric Sailers is a speech/language pathologist who has an extensive amount of experience using "apps" in educational settings.  He's compiled a great list of apps and related resources on his blog:  http://slpsharing.com/app-resources/

Eric spends some of his time giving presentations about the use of Apple's mobile devices in education. He's co-created a few apps, too!

OPEN SOURCE MULTI-TOUCH SOFTWARE FOR YOUNG PEOPLE WITH AUTISM SPECTRUM DISORDERS- Featured in the WSJ! (Video and links)

Well before the iPad entered the market, researchers such as Dr. Juan Hourcade at the University of Iowa began to develop multi-touch software to support young people with autism spectrum disorders and other disabilities. I've been following this research for a while, and was happy to discover that Dr. Hourcade's work was featured in a recent Wall Street Journal article and video.


If you are a parent or work with a young person with special needs, you might be interested in learning more about this Windows 7-based touch-screen software. The software shown in the video below, as well as related resources, is available for free on the Open Autism Software website.




RELATED

Article:
Shalini Ramachandran, Wall Street Journal, 7/26/11

Previous Post: Interactive Multi-Touch for Children with Autism Spectrum Disorders:  Research and Apps by Juan Pablo Hourcade, Thomas Hanson, and Natasha Bullock-Rest, University of Iowa 4/26/10




Why is this important to me?   
I work with young people with a range of disabilities, including autism, as a school psychologist. One of my schools is a technology-rich program. Although we don't have iPads, there is a SMARTBoard in every classroom and in other rooms around the school. We have a Wii and a few multi-touch all-in-one PCs, and we recently received two Dell multi-touch tablet PCs.

I noticed that Dr. Hourcade is looking for people who'd like to create similar apps for the iPad. I'd be interested in hearing from anyone who might be working on this. I'd like to help in some way, but I don't have a lot of free time to do this on my own.

I have a new iPad2, and after playing around with a ton of apps, I was inspired to download Xcode 4 and think about what I could create. I'm a Mac person at heart, but I learned to program within a Windows environment (C#). I need a way to maintain my programming skills and gradually learn a few new ones. This sort of project would be useful to me in my "day job", too!




SOMEWHAT RELATED
3D Interactive Heart, by 22Miles, on a Dell Latitude Xt2


Jul 24, 2011

Video: Kinect SoundWall, links to info and code!





Here is information about the project from the KinetHacks SoundWall site:

"Kinect sound machines become prettier and easier with each development! The Kinect SoundWall is a drum beat music machine controlled by gestures and voice commands. This video by  displays this digital music machine at work and how through various gesture and voice commands, users can create awesome beats to dance to. In the video, the user gestures to to certain blocks in the screen in order to create a beat there or render the beats void. Through various voice commands, the beat can start, increase tempo, stop etc. Through the proper integration of both voice and gesture commands, the Kinect SoundWall sets the standard for a great and efficient sound machine of the Kinect!"
"For more information about the Kinect SoundWall visit the project’s website."
RELATED
Vertigo SoundWall CodePlex Project Site

Jul 22, 2011

Quicklinks: Cute video about the need for Google+, Spielberg on 3D, Tactile Pixels, Touch Screen Steering Wheel, and More

Here are a few interesting links  and a couple of videos.  Enjoy exploring!


Comic-Con 2011: Steven Spielberg Gives His Thoughts on 3D
Jason Barr, Collider, 7/22/11



Kwame Opam, Gizmodo, 7/9/11

Albrecht Schmidt, User Interface Engineering Blog, 7/17/11


Potential to improve some user experiences, using HTML 5
HCI 596 Course Blog, Iowa State University, 7/11/11


PBS Launches LearningMedia, a New Digital Repository for Educational Content
Audrey Watters, Hack Education, 6/27/11


Link to Microsoft Surface 2.0 SDK and Resources
Luis Cabrera, Surface Blog, 7/12/11


AI (Artificial Intelligence) Demonstrates Natural Learning, Applies New Skills To Civilization
Devin Coldewey, Tech Crunch, 7/13/11


21 Google+ Privacy Tips: the Ultimate Guide
Craid Kanalley, Huffington Post, 7/21/11


iPad K-12 Sales Outpace Mac Products
Ian Quillen, Education Week, 7/20/11


Wearable lab coat TV packs thousands of LEDs, heads for Burning Man
Zach Honig, Engadget, 7/13/11

(I'd like a job where I can do tech experiments, silly ones, too!)




Jul 18, 2011

Emerging Interactive Multimedia, New Models of HCI for Museum Exhibits (Course offered by Ideum's Jim Spadaccini, plus info about the MT55 multi-touch table)

Jim Spadaccini, the director and founder of Ideum, will be teaching a course on exhibit development through the University of Victoria, "Emerging Exhibits: Exploring New Models of Human Computer Interaction (HCI)." This excerpt from the course description provides a glimpse of how emerging technologies are beginning to change the museum experience:


"Computer-based interactive exhibits are undergoing a major transformation. The lone, single-user kiosk is now being replaced by multitouch tables and walls, motion-capture spaces, networked installations, and RFID-based exhibits. Advances in augmented reality, voice recognition, eye tracking, and other technologies promise even more radical change for exhibits in the near future."


I've been following Jim's journey with Ideum, a multimedia design firm that collaborates with museums and related non-profits, for many years, and I am impressed with the work of this company. In addition to his work at Ideum, Jim serves as the Principal Investigator of a National Science Foundation-sponsored open-source exhibit software project, Open Exhibits, which provides a free software development kit that supports the creation of multi-touch and multi-user software applications for museums and educational settings.


I'm happy to put in a plug for Ideum's latest product, the MT55 Platform Multi-Touch Table. It incorporates a range of features that I'm sure will meet the needs of museum visitors.  In my opinion, this table would be a fantastic resource for all types of libraries, including those in K-12 settings.

The MT55 Platform Multi-touch Table, from Ideum

The MT55 Platform Multitouch Table from Ideum on Vimeo. (Note: This video features music by Moby, the track "Sevastopol" from his current album, Destroyed. The music was used with the artist's permission. Learn more at moby.com.) -Ideum

"The thinnest, largest, most powerful multitouch table available.The MT55 Platform multitouch table houses a powerful computer and a 55-inch interactive LCD display that responds to 32 touch-points, inside a rugged aluminum body."

"The bright 55″ 1920×1080 HD display has a 5,000,000:1 contrast ratio. A wide 178-degree viewing angle accommodates multiple users around the table. The optical multitouch system supports 32 simultaneous touch points for collaborative interaction. The system is multitouch-enabled from start-up, and runs Windows 7 64-bit professional edition."


"The integrated computer is packed with power. It contains an Intel® Hyper-Threaded DualCore i5® which runs at 2.66 GHz, 8GB of RAM, and a 128 GB solid-state drive (upgradeable to an i7®)."


"The table comes complete with WIFI, Bluetooth, and Ethernet connectivity. It also has multiple HDMI outputs that allow you to easily mirror the table's display, extend the desktop, or connect to and display from another computer or HDMI device."

"Convenient, but secure ports: CAT5, HDMI, and USB 2.0 are available on both the side on bottom the table."

"The MT55 Platform includes blue LED under-lights to illuminate the area beneath the interactive surface (custom LED colors are available). Every MT table includes a sophisticated internal cooling cell to maintain operating conditions that exceed the optimum environmental specifications for the internal components."

"The interactive surface of the MT55 Platform protected by a sheet of hardened, crystal clear, low-iron 5mm tempered glass surface...
As an option, we offer Sevasa HapticGlas®, produced exclusively for Ideum. Micro-etched HapticGlas® provides tactile feedback, reduces fingerprints, increases scratch resistance, and directs user focus." -Ideum
 

RELATED
High-res photos of the MT55 Platform

GestureWorks Software
Open Exhibits
Ideum
Open Exhibits Tuio Kinect

Tools for Knowledge Junkies: Document Cloud, CALAIS , Linked Data

For all of the knowledge junkies out there, here are a few things that might fuel your passion for interactive information exploration and consumption. Although the tools below were first adopted by people in journalism and news-related fields, I think they have potential for use in education and other fields. I'm interested in learning more about how people from different disciplines currently use these tools, and I'd also like to add more to this list.

CALAIS: "We want to make all the world's content more accessible, interoperable and valuable. Some call it Web 2.0, Web 3.0, the Semantic Web or the Giant Global Graph - we call our piece of it Calais. Calais is a rapidly growing toolkit of capabilities that allow you to readily incorporate state-of-the-art semantic functionality within your blog, content management system, website or application." CALAIS is published by Thomson Reuters.
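To make the idea of semantic tagging concrete, here is a toy Python sketch of the input/output shape of a service like Calais: plain text goes in, typed entities come out. Calais itself is a hosted web service backed by natural-language processing; the gazetteer lookup and the entries below are invented stand-ins, used only to illustrate the shape of the result.

```python
# Toy sketch of what a semantic-tagging service such as Calais returns:
# given text, a list of recognized entities with their types. This simple
# dictionary lookup is NOT how Calais works; it only mimics the output shape.

GAZETTEER = {
    "Thomson Reuters": "Company",
    "Wall Street": "Place",
}

def tag_entities(text: str):
    """Return (entity, type) pairs for every gazetteer name found in `text`."""
    return [(name, etype) for name, etype in GAZETTEER.items() if name in text]

entities = tag_entities("Calais is published by Thomson Reuters.")
```

A blog or content management system could attach these typed tags to each post, which is what makes the content "more accessible, interoperable and valuable" in the quote above.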
MORE ABOUT CALAIS
PR Video for CALAIS:

How CALAIS Works:

CALAIS for Publishers
CALAIS for Bloggers
CALAIS for Software Providers
CALAIS for Content Managers
CALAIS for Developers
Open CALAIS Content Maps by Jer Thorp

"Jer Thorp is a software artist, writer, and educator. He is a contributing editor for Wired UK. He is currently Data Artist in Residence at the New York Times."

DOCUMENTCLOUD
"DocumentCloud runs every document you upload through OpenCalais, giving you access to extensive information about the people, places and organizations mentioned in each."

DocumentCloud Blog from Knight Foundation on Vimeo.
MORE ABOUT DOCUMENT CLOUD
DocumentCloud Merging with IRE 
Amanda Hickman, DocumentCloud, 6/9/11 (Note:  IRE = Investigative Reporters and Editors)
Newsrooms using Document Cloud
Document Cloud: Not Just For Journalists?  (Excellent article!) Konrad Lawson, The Chronicle of Higher Education, 7/8/11 (Konrad is a Ph.D student in the history department at Harvard University.)
For Techies:  Document Cloud's VisualSearch.js | Autocomplete Faceted Search Queries
Losterium Posterious, 7/9/11


LINKED DATA
Part of the Linking Open Data (LOD) Project Cloud Diagram; click for full and historical versions...
-linkeddata.org
"Linked Data is about using the Web to connect related data that wasn't previously linked, or using the Web to lower the barriers to linking data currently linked using other methods. More specifically, Wikipedia defines Linked Data as "a term used to describe a recommended best practice for exposing, sharing, and connecting pieces of data, information, and knowledge on the Semantic Web using URIs and RDF.""


Tim Berners-Lee: The Year Open Data Went Worldwide

"....Tim Berners-Lee invented the World Wide Web. For his next project, he's building a web for open, linked data that could do for numbers what the Web did for words, pictures, video: unlock our data and reframe the way we use it together." At TED2009, Tim Berners-Lee called for "raw data now" -- for governments, scientists and institutions to make their data openly available on the web. At TED University in 2010, he shows a few of the interesting results when the data gets linked up."-TED

LDOW2011: Linked Data on the Web workshop


COMMENT
I plan to share more information related to this topic in future posts.  I welcome input and links from my readers!



Jul 14, 2011

Multi-touch Update from Stantum

The people at Stantum have been working hard to improve multi-touch technology, focusing on smaller tablet-sized systems.  Stantum is a company I've been following for several years, from the time it was known as Jazz Mutant.  I have been impressed by Stantum's focus on the needs of people as well as the company's careful attention to important details.


I'm pleased to see that the company has an idea of how its multi-modal technology can support multi-touch in education:   "Ambidexterity and multi-modality are the two pillars of Stantum's core project – making the use of touch-enabled devices more creative and productive. Amongst others, there is one field of application where we truly see a soaring need for ambidexterity and multi-modality – augmented textbooks." -Guillaume Largillier


At the Society for Information Display's Display Week exhibition this past May, Stantum introduced a new palm rejection feature for its Interpolated Voltage Sensing Matrix (IVSM) technology. This technology provides users with a more natural way to interact with the interface and application content on tablets. The technology supports Android's multi-touch framework and is also Windows 7 certified. The palm rejection feature will be a welcome improvement for future multi-touch applications designed for education settings, where it is likely that more than one hand - or person - might be interacting with content on the screen at the same time.
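Stantum hasn't published the details of its palm rejection, but one common strategy, sketched below in Python, is to discard touch contacts whose reported area is too large to be a fingertip. The threshold value and the contact format here are assumptions for illustration; real touch controllers tune such parameters per device.

```python
# Illustrative sketch, NOT Stantum's implementation: filter out touch
# contacts whose contact area is too large to plausibly be a fingertip,
# so a palm resting on the screen doesn't generate spurious touches.

FINGER_MAX_AREA_MM2 = 150.0  # assumed fingertip-size cutoff, for illustration

def reject_palms(contacts):
    """Keep only contacts small enough to plausibly be fingertips.

    Each contact is a dict like {"x": 100, "y": 40, "area_mm2": 60.0}.
    """
    return [c for c in contacts if c["area_mm2"] <= FINGER_MAX_AREA_MM2]

touches = [
    {"x": 100, "y": 40, "area_mm2": 60.0},   # fingertip
    {"x": 180, "y": 55, "area_mm2": 900.0},  # resting palm
]
fingers = reject_palms(touches)
```

Filtering like this is what lets a student rest a wrist on the tablet while writing or drawing - exactly the classroom scenario described above.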


Below are two videos that provide a glimpse of Stantum's innovations:




Stantum's technology can enable ten simultaneous touches, is highly responsive, and supports high-resolution content. According to a May press release, "Palm rejection is available as an API (application programming interface) to Windows and Android operating systems on x86 and ARM platforms. IVSM touch modules are offered to OEMs through the company's Qualified Manufacturers Partners, comprising tier-one touch-screen manufacturers with high-volume production capabilities. More information is available at info@stantum.com."


RELATED
Stantum's TouchPoints Newsletter (July 2011)


Stantum Whitepapers:
How to Evaluate Multi-Touch While Standing in a Store (pdf) - a great source of information.
Jim Meador, Pascal Auriel, Gauthier Chastan, Stantum
Specifying and Characterizing Tactile Performances for Multi-touch Panels: Toward a User-Centric Metrology (pdf) - outlines some important points!
Guillaume Largillier, Pascal Joguet, Cyril Recoquillon, Pascal Auriel, Axel Balley, Jim Meador, Julien Olivier, Gauthier Chastan





News from the HCI lab at UNC-Charlotte - Creative Interactions (Videos)

In the video below, Nathan Nifong discusses and demonstrates his independent study work through the HCI Lab (Human-Computer Interaction) at UNC-Charlotte. He focused his work on creating a motion-responsive musical instrument that integrates movement, sounds, and graphics.


"We're trying to research anything and everything that will help to make computers easier to use." -Nathan Nifong


Thanks to Celine Latulipe for the link!


RELATED
To get a better idea about the HCI lab, a small, yet sparkling gem on the UNC-Charlotte campus, take a look at the following video:




"I get to create things, and to me, you just can't beat that!"- Celine Latulipe


Dr. Latulipe is known for her work with the Dance.Draw project, Interactive Surveillance, and tools that support things like bimanual interaction, collaboration, and creativity.

Jul 12, 2011

Summer Break: Music Apps, Multimedia, Kinect, My New iPad2, Tech-reading, Google+, Dancing...

I'm on summer break, which for me means that I spend an increased amount of time playing and creating music and doing all of the other fun stuff I don't have much time for during the school year. I'm still exploring what I can do with my new iPad2 - there are so many music apps! My favorite at this moment is GarageBand. It keeps me engaged for hours, and I can take it with me anywhere I go. I'm also exploring iPad apps for education and students with special needs, since many of the young people I work with have autism spectrum disorders. They all really love music.


Today, I came across turntable.fm, a "social-djing" website, from a link shared by Dimitri Diakopoulos.  I think it would be fun to play with.


Turntable.FM, The Fastest-Growing Music Service You're Not Using
William Fenton, PC Magazine, 6/23/11
Social DJing with Turntable.FM
Andrew Mager, 5/28/11


I'm still plowing through technology journals and zines from previous months - I had to skip over my stack to read the cover article of the most recent Communications of the ACM:
Michael Edwards, University of Edinburgh, 2011


I LOVE the design of this cover. It would make for a nice interactive interface for an iPad music app. Or a larger touch-screen display. Or even a SMARTBoard! (BTW, My first computer-related course was Computer Music Technology, in 2003. My undergraduate honors research (psychology), years ago, focused on constructive cognition and music recognition/memory. This topic is dear to my heart.)

I've spent some quality time with my first grand-baby this summer.  Although his "screen time" is limited, given his age of 7 1/2 months, he enjoys playing with music on my iPad.  He likes the drums found in the iPad GarageBand application.  Here he is playing with NodeBeat, an app created by Seth Sandler and Justin Windle:

Most of his time is spent off-screen:

Over the last few months, there has been a surge of interactive touch-enabled apps for education, including some for young people with special needs.  This will be the topic of a few of my future posts.

Jul 10, 2011

Link: "The Old Internet Neighborhoods" -Message Boards, Forums, Chat Rooms, RIP? +more

In skimming through my RSS feeds today, I noticed that Google+ has generated quite a bit of reflection about the evolution/revolution of online communities over the past week or so. In the article below, Virginia Heffernan reminisces about the days of digital networking before the spread of broadband connections and smartphones:


The Old Internet Neighborhoods
Virginia Heffernan, Opinionator, New York Times 7/10/11


Thanks to Bonnie Bracey Sutton for the link!


Here are some visual reflections related to Google+:

-Stephen Downes


-Brooks Bayne

-Abid Shaikh
-Abid Shaikh


SOMEWHAT RELATED

Keith Kleiner, Singularity Hub, 4/6/11
What is Your Plan for Google+ Experimentation and Exploration?
Beth Kantor, Beth's Blog: (How Networked Nonprofits Are Using Social Media to Power Change)
7/7/11
The Evolution of Cyberspace: Virtual Worlds
Craig Harm, Sendsonline.org, 1/22/11
Throwback to the 90's: How Social Networking is Moving Back to Private
Lydia Leavitt, The Next Web, 10/17/10
Google+Facebook Extension Integrates Facebook Viewing and Updating on Google+
David Galloway, Lifehacker, 7/9/11
Note: Apparently there were some problems with the extension, as mentioned in the update of the above article: "Update: We've learned that this browser extension has a lot of privacy violations and are retracting our recommendation of this extension. Please visit this Reddit post to find out detailed info on the problems and how to uninstall the extension from your system."
David Galloway linked the following article from his post:
Facebook blocks Google Chrome extension for exporting friends
Emil Protalinski, ZDNet, 7/5/11

Jul 9, 2011

What Do You Love: A Cute Multi-stream Search Engine from Google. Key in what you "love" and see what comes up!



What Do You Love (WDYL) is a multi-search engine from Google that, in my opinion, aims to facilitate topic exploration by providing people with a visual/multimedia array of "widgets" offering tempting glimpses of a range of options for digging deeper into a topic.


Words really can't explain it. I suggest you try it out. It is fun to see what comes up. Below is a screenshot of part of what came up when I keyed in "Interactive Multimedia Technology". (If you are a regular reader of this blog, you'll notice that the picture widget includes pictures I've posted on this blog over the past several years.)


Key in your name and variations of your name to see what comes up.  Apparently there are a bazillion pictures of me in cyberspace, taken by my husband while I was relaxing on a cruise, one of my favorite pastimes:


RELATED
CHI 2011 WDYL Search Results
Google's What Do You Love (WDYL) Multi-Search
Rob D. Young, Search Engine Watch, 6/29/11
Google Asks "What Do You Love?" With New Service
Brennon Slattery, PCWorld, 6/28/11
What do you love? Google knows.
Regina Hope Sinsky, Venture Beat, 6/28/11


Thanks to Farhad Javidi for the link!

"Rise and Shine": TED video featuring Simon Lewis and his recovery from a serious brain injury, and how cutting-edge technology helped. (Video includes interesting visuals)

Simon Lewis was in an automobile accident that resulted in severe injuries to his body, including a very severe head injury that left him in a coma. As a result of his experience, he wrote a book, "RISE AND SHINE," covering his journey over 15 years of recovery and regeneration. In the book, Lewis shares what he learned along the way, and how cutting-edge technology and some non-traditional thinking helped him move forward.


Simon Lewis had the opportunity to discuss his journey at a TED partner event in India in December 2010. The following video of his talk is about 22 minutes long, but worth taking the time to watch. A variety of visuals illustrate his journey and the research he did along the way. Near the end of the video, Simon Lewis demonstrates some of the technologies he wears that support his functioning.


Here is the blurb about the book from the Borders website:

"An impassioned tale of survival and recovery, this inspirational story recounts the author’s horrific car accident, his subsequent coma, and the more than 15 years of cutting-edge treatments and therapies endured during convalescence. With specific details of the rigorous rehabilitation process that ensued, including numerous breakthrough and experimental surgeries, the book also provides practical insight into navigating the treacherous world of insurance and how to differentiate between the often conflicting medical opinions offered. In addition to describing the numerous procedures undergone, the author tells not only of his pain, frustration, and despair, but also of his childlike wonder at the beauty and miracle of creation. A first-person account of sudden, unexpected tragedy and life-affirming courage, this remarkable tale of regeneration imparts lessons both medical and spiritual."



The Rise and Shine website includes many of the graphics used in Simon's video, and is worth taking some time to explore. Some of the graphics are interactive. (Since the website relies on Flash, it won't work if you try to access it using an iPad.)


Comment: As a school psychologist with additional training in neuropsychology - specifically, assessment and intervention for children and teens who have experienced traumatic brain injury (TBI) - this topic is important to me. I'm watching the video a second time, and I plan to read his book.

(Cross-posted on the TechPsych blog.)

Best Practices For Designing Websites for Kids - Quick Link

Best Practices For Designing Websites for Kids
David Morrison, Smashing Magazine, 7/6/11


Here is a short quote from the article:


"According to Jacob Nielsen’s research on teenagers, interactive website features (such as forums, mini-games, polls, ranking systems, competitions and 3-D interfaces) are valued by kids if they build a sense of community and foster participation. Bolting such features on will not likely prove effective, because kids will soon see the gaps and re-evaluate the website, despite any initial interest."


Quick link: "14 burdens placed on the user" by Jeff Sauro, Measuring Usability Blog


It is 2011 and usability issues continue to plague us.


I am sure most of us encounter many of the usability annoyances outlined in the following article several times a week, if not several times a day!


14 burdens placed on the user
Jeff Sauro, Measuring Usability, 7/5/11


via Smashing Magazine


Jul 8, 2011

Off topic: 134 Launches in 134 Minutes - video clip of space shuttle launches over the years

Today I had the chance to watch the final launch of the space shuttle Atlantis. After the shuttle was safely launched, CNN broadcast the video "134 Launches in 134 Minutes". While I watched the video, I thought about the thousands of people who have worked for NASA, and all of the technological and scientific advances that have taken place over the years.


Here is a version of the video I found on YouTube:




My hope is that we will continue to make scientific gains in the future, but with continued economic problems, this might take place at a much slower rate.

RELATED

Reflections about the last space shuttle launch by Jeremy Perez-Cruz (Tea and Letter blog)

Jul 7, 2011

I want to travel around the globe and attend all of the cool conferences about innovative interactive technologies. Any sponsors? (Yes, I'm daydreaming.)

Here are a few I missed:


NIME 2011 OSLO: The International Conference on New Interfaces for Musical Expression - NIME is an outgrowth of a workshop held at CHI 2001 (Human Factors in Computing Systems).
"The NIME conference draws a varied group of participants, including researchers (musicology, computer science, interaction design, etc.), artists (musicians, composers, dancers, etc.) and developers (self-employed and industrial). The common denominator is the mutual interest in groundbreaking technology and music, and contributions to the conference cover everything from basic research on human cognition through experimental technological devices to multimedia performances." Just take a look at all of the presentations that were at NIME 2011!  NIME 2011 Program (pdf)


Touch the Web 2011: 2nd International Workshop on Web-Enabled Objects June 20-24, 2011, Paphos, Cyprus (in conjunction with the International Conference on Web Engineering ICWE)
"The vision of the Internet of Things builds upon the use of embedded systems to control devices, tools and appliances. With the addition of novel communications capabilities and identification means such as RFID, systems can now gather information from other sensors, devices and computers on the network, or enable user-oriented customization and operations through short-range communication. When the information gathered by different sensors is shared by means of open Web standards, new services can be defined on top of physical elements. In addition, the new generation of mobile phones enables a true mobile Internet experience. These phones are today’s ubiquitous information access tool, and the physical token of our “Digital Me”. These meshes of things and “Digital Me” will become the basis upon which future smart living, working and production places will be created, delivering services directly where they are needed."


Upcoming Conferences and Workshops


Ubicomp 2011, September 17-21, 2011, Beijing, China
"Ubicomp is the premier outlet for novel research contributions that advance the state of the art in the design, development, deployment, evaluation and understanding of ubiquitous computing systems. Ubicomp is an interdisciplinary field of research and development that utilizes and integrates pervasive, wireless, embedded, wearable and/or mobile technologies to bridge the gaps between the digital and physical worlds. The Ubicomp 2011 program features keynotes, technical paper sessions, specialized workshops, live demonstrations, posters, video presentations, panels, industrial exhibition and a Doctoral Colloquium."

"Given its multi-disciplinary nature, Ubicomp has developed a broad base of audience over the past 12 years. Key audience communities are: Human Computer Interaction, Pervasive Computing, Distributed and Mobile Computing, Real World Modeling, Sensors and Devices, Middleware and Systems research, Programming Models and Tools, and Human Centric Validation and Experience Characterization. More detailed information about the topical focus of Ubicomp can be found in the Call for Papers."



Eurodisplay 2011: XXXI International Display Research Conference, September 19-22, 2011, Bordeaux-Arcachon, France
"Eurodisplay 2011 in Bordeaux/Arcachon provides researchers, engineers, and technical managers a unique opportunity to present their results and update their knowledge in all display-related research fields...The two keynote addresses on the first day of the Eurodisplay 2011 Symposium on September 20th will be a unique chance to hear a global overview on the future of the display market (Samsung LCD) and on display applications in the automotive industry (Daimler AG)...


Eurodisplay ’11 in Bordeaux-Arcachon will be the right spot to learn more about the last results on display research and related fields such as organic electronics. The cognitic science will give a new vision on the impact of display in our day to day life, not only from a perception but from the Information standpoint since SID deals with “Information Display” and not only technologies. The two invited talks of Pr. Bernard Claverie and Dr. Lauren Palmateer will focus on this aspect...A dedicated business oriented session will address two important aspects of our business, from display company creation by Thierry Leroux to end user trends analysis for the TV market by Dr. Jae Shin. This conference will provide a global vision of our Display World including not only the well-known display leaders but also the very active BRIC countries...Dr. V. Pellegrini Mammana will give a vivid illustration of this display industry dynamic in Brazil."

UIST Symposium, October 16-19, 2011, Santa Barbara, California
"UIST (ACM Symposium on User Interface Software and Technology) is the premier forum for innovations in the software and technology of human-computer interfaces. Sponsored by ACM's special interest groups on computer-human interaction (SIGCHI) and computer graphics (SIGGRAPH), UIST brings together researchers and practitioners from diverse areas that include traditional graphical & web user interfaces, tangible & ubiquitous computing, virtual & augmented reality, multimedia, new input & output devices, and CSCW. The intimate size, single track, and comfortable surroundings make this symposium an ideal opportunity to exchange research results and implementation experiences."


VisWeek 2011: Vis, InfoVis, VAST, October 23-28, 2011, Providence, RI
"Computer-based information visualization centers around helping people explore or explain data through interactive software that exploits the capabilities of the human perceptual system. A key challenge in information visualization is designing a cognitively useful spatial mapping of a dataset that is not inherently spatial and accompanying the mapping by interaction techniques that allow people to intuitively explore the dataset. Information visualization draws on the intellectual history of several traditions, including computer graphics, human-computer interaction, cognitive psychology, semiotics, graphic design, statistical graphics, cartography, and art. The synthesis of relevant ideas from these fields with new methodologies and techniques made possible by interactive computation are critical for helping people keep pace with the torrents of data confronting them."

6th Annual ACM Conference on Interactive Tabletops and Surfaces (ITS 2011)
November 13-16, 2011, Portopia Hotel, Kobe, Japan
"The Interactive Tabletops and Surfaces 2011 Conference (ITS) is a premiere venue for presenting research in the design and use of new and emerging tabletop and interactive surface technologies. As a new community, we embrace the growth of the discipline in a wide variety of areas, including innovations in ITS hardware, software, design, and projects expanding our understanding of design considerations of ITS technologies and of their applications."


AFFINE: 4th International Workshop on Affective Interaction in Natural Environments (ICMI 2011)  November 17, 2011, Alicante, Spain (CFP deadline is August 19, 2011)
Scope: "Computer gaming has been acknowledged as one of the computing disciplines which proposes new interaction paradigms to be replicated by software engineers and developers in other fields. The abundance of high-performance, yet lightweight and mobile devices and wireless controllers has revolutionized gaming, especially when taking into account the individual affective expressivity of each player and the possibility to exploit social networking infrastructure. As a result, new gaming experiences are now possible, maximizing users’ skill level, while also maintaining their interest in the challenges in the game, resulting in a state which psychologists call flow: “a state of concentration or complete absorption with the activity at hand and the situation”. The result of this amalgamation of gaming, affective and social computing has brought increased interest in the field in terms of interdisciplinary research...Natural interaction plays an important role in this process, since it gives game players the opportunity to leave behind traditional interaction paradigms, based on keyboards and mice, and control games using the same concepts they employ in everyday human-human interaction: hand gestures, facial expressions and head nods, body stance and speech. These means of interaction are now easy to capture, thanks to low-cost visual, audio and physiological signal sensors, while models from psychology, theory of mind and ergonomics can be put to use to map features from those modalities to higher-level concepts, such as desires, intentions and goals.
In addition to that, non-verbal cues such as eye gaze and facial expressions can serve as valuable indicators of player satisfaction and help game designers provide the optimal experience for players: games which are not frustratingly hard, but still challenging and not boring...Another aspect which makes computer gaming an important field for multimodal interaction is the new breed of multimodal data it can generate: besides videos of people playing games in front of computer screens or consoles, which include facial, body and speech expressivity, researchers in the field of affective computing and multimodal interaction may benefit from mapping events in those videos (e.g. facial signs of frustration) to specific events in the game (large number of enemies or obstacles close to the player) and infer additional user states such as engagement and immersion. Individual and prototypical user models can be built based on that information, helping produce affective and immersive experiences which maintain the concept of ‘flow’. This workshop will cover real-time and off-line computational techniques for the recognition and interpretation of multimodal verbal and non-verbal activity and behaviour, modelling and evolution of player and interaction contexts, and synthesis of believable behaviour and task objectives for non-player characters in games and human-robot interaction. The workshop also welcomes studies that provide new insights into the use of gaming to capture multimodal, affective databases, low-cost sensors to capture user expressivity beyond the visual and speech modalities and concepts from collective intelligence and group modelling to support multi-party interaction."

Intelligent User Interfaces (IUI 2012) Lisbon, Portugal, February 14-17 (pdf) (CFP Submission Deadline is October 21, 2011)
"Major Topics of Interest to IUI include: Intelligent interactive interfaces, systems, and devices, Ubiquitous interfaces, Smart environments and tools, Human-centered interfaces, Mobile interfaces, Multimodal interfaces, Pen-based interfaces, Spoken and natural language interfaces, Conversational interfaces, Affective and social interfaces, Tangible interfaces, Collaborative multiuser interfaces, Adaptive interfaces, Sensor-based interfaces, User modeling and interaction with novel interfaces and devices, Interfaces for personalization and recommender systems, Interfaces for plan-based systems, Interfaces that incorporate knowledge- or agent-based approaches, Help interfaces for complex tasks, Example- and demonstration-based interfaces, Interfaces for intelligent generation and presentation of information, Intelligent authoring systems, Synthesis of multimodal virtual characters and social robots, Interfaces for games and entertainment, learning based interactions, health informatics, Empirical studies and evaluations of IUI interfaces, New approaches to designing Intelligent User Interfaces, and related areas"


IxDA: Interaction 12, Dublin, Ireland, February 1-4, 2012
"Interaction|12 is an ideal venue to showcase the most inspiring and original stories in interaction design. We have a world of creative talent to tap into, so our conference roster will fill up fast. We’re on the lookout for thoughtful, original proposals that will inspire our community of interaction designers from all over the world. Do you have an uncommon or enlightening design story, valuable lessons learned from hands-on experience and want to be a part of the programme at Interaction|12? You do? Great!"


More to come!


BTW, I'd like to go to a few Urban Screens or Media Facades festivals:

Media Facades Festival Europe 2010 from MediaFacades on Vimeo


Of course, I'd like to go to educational technology, school psychology, and special education conferences...