Nov 3, 2010

Oracle's Cloud Interoperability API, Intel's Cloud 2015 vision, Open Data Center Alliance, "Cloud Builders" (videos and links, just in case you wanted to know)


Oracle submits cloud interoperability API Larry Dignan, ZDNet, 11/3/10
Oracle Announces Oracle Cloud Resource Model Application Programming Interface
Oracle Press Release 11/3/10
Oracle Cloud Computing Strategy (Oracle Site)
Oracle Cloud Computing FAQs


An IT Cloud Computing Roadmap from Intel IT

Intel, Other Top Companies Unveil New Cloud Mission:  Open Data Centers (Intel Newsroom)
Intel plays ringleader as Open Data Center Alliance launches ZDNet, 10/27/10
Intel's Cloud Computing Solutions

Open Data Center Alliance Members
Open Data Center Usage Roadmap

Lots of things have changed since I took a database class!


RELATED
Microsoft Research: CloudFaster 

The User Experience (UX) of the 21st Century Classroom - Adaptive Path

Adaptive Path, a User Experience company, submitted two concepts to Slate's recent Classroom Redesign challenge.  As part of this project, Slate asked students and teachers to take pictures of their schools.  The pictures were part of the "Through Your Lens" exhibit. The project highlights the fact that many of our schools simply aren't geared for what is required of learning and teaching in the 21st century, as many were built to meet the demands of a society from a long-ago era. 


I think that Adaptive Path's "User Experience of Education" spin might generate some innovative thinking about what we need to do differently in our nation's classrooms, including the process of teaching and learning.   I especially liked Adaptive Path's Touch Screen Desks of the Future. Here is the concept overview, taken from Kim Cullen's Adaptive Path blog post, "The 21st Century Classroom, from a UX Perspective":


"Our futuristic concept focuses on the students’ desks, and emphasizes the importance of being able to quickly reconfigure the classroom for different activities. Each student has his or her own square desk, the entire surface of which is an edge-to-edge touch screen. Each desk connects to the wireless network, allowing the teacher to distribute digital lesson materials from a handheld device.
The desks operate independently to support individual student work, or can be combined in groups of two or four. In a group setting, the combined desks form one large screen to support collaborative work. To support certain activities, the desks can also interact with physical objects when placed and moved around their surface.
The surface of the desk is hinged, so that a student can store his or her personal items inside of it. Additionally, when a desk is opened to a fully vertical position, it functions as a stand-up workspace for delivering presentations. If multiple desks are opened to this position, they form a large contiguous touch screen wall."

In schools, flexibility is key, and I think that Adaptive Path's concept for connected desks is right on target!  


Here is a challenge for Adaptive Path and other folks involved in UX: Design a system to support teachers and learners as education "information architects".  I am sure the cross-pollination between UX, IA, and Ed Tech would be fruitful.


RELATED
Descriptions of Adaptive Path's submissions to the Slate competition:
Touch Screen Desks of the Future
Concept 2: The Modern Trophy Case
All entries to the 21st Century Classroom challenge
About Adaptive Path

Thanks to Jonathan Brill for the link!

Nov 2, 2010

EyeTube for YouTube! Eye-gaze interaction software, free and downloadable from GazeGroup

Gaze interaction systems provide access to computers and the rich content now available on the web for many people with disabilities.  Unfortunately, commercial gaze tracking systems are very expensive and at times, difficult to calibrate.  There is hope!


Following up on my recent post, "Open-source Eye-tracking: The ITU Gaze Tracker 2.0 Beta", I thought I'd share the GazeGroup's EyeTube for YouTube interface.  


What is great about EyeTube for YouTube is that it provides two different interfaces. The simplified version is icon-based and looks good for younger children or people with cognitive disorders.  The second version is appropriate for people who can navigate through more complex visual representations of content. 


EyeTube currently requires a Windows-based system and .NET 3.5. It can be downloaded from the GazeGroup website.  If you plan to download the application, you must also make sure you have a YouTube account. To get the application up and running, you'll need to change the settings file (EyeTubeSettings.xml) to match your account.   (If you don't know much about changing settings or XML, ask someone you know who works in IT.)
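If you'd rather script the settings change than edit the file by hand, here is a minimal sketch of how that could look. Note that the element name YouTubeAccount is an assumption for illustration only; open your actual EyeTubeSettings.xml to see the real field names before adapting this.

```python
# Sketch: update an account field in an EyeTube-style XML settings file.
# The "YouTubeAccount" element name is hypothetical -- check the real file.
import os
import tempfile
import xml.etree.ElementTree as ET

def set_youtube_account(settings_path, username):
    """Set (or create) the assumed <YouTubeAccount> element and save the file."""
    tree = ET.parse(settings_path)
    node = tree.getroot().find("YouTubeAccount")
    if node is None:
        node = ET.SubElement(tree.getroot(), "YouTubeAccount")
    node.text = username
    tree.write(settings_path, encoding="utf-8", xml_declaration=True)

# Demo against a stand-in settings file, not the real application's copy
demo_path = os.path.join(tempfile.mkdtemp(), "EyeTubeSettings.xml")
with open(demo_path, "w") as f:
    f.write("<Settings><YouTubeAccount>olduser</YouTubeAccount></Settings>")
set_youtube_account(demo_path, "mychannel")
updated = ET.parse(demo_path).getroot().find("YouTubeAccount").text
```

As always with hand-edited configuration, keep a backup copy of the original file before changing it.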


Below is the icon-based version of the eye-gaze interface for YouTube:
EyeTube - Gaze Interaction for YouTube (simplified version)


Feature-rich version of the EyeTube interface for YouTube:
EyeTube - Gaze Interaction for YouTube

From the GazeGroup site:

"The EyeTube prototype offers a feature-rich eye-controlled interface for the popular YouTube service. Instead of emulating a mouse pointer and interacting with a web browser, the EyeTube interface is especially designed to be driven by gaze input. It offers a wide range of features such as keyword searching, popular video feeds, favorites, and social aspects such as subscriptions, friends, and commenting on videos. The highly optimized interface allows for a streamlined interaction that is alleviated of the Midas Touch problem. In most previous gaze interfaces, selection is made by a dwell-time activator, e.g., fixate on a button for a specific amount of time and it will execute the function. In the EyeTube interface, a fixation on a UI element will highlight it, and a second fixation on the activation button is required to execute the function. This removes the stress of having to constantly move the eyes to avoid unintentional activation."
"The EyeTube also exists in another, simplified incarnation developed for users who are distracted by a larger number of options. It supports basic features such as browsing categories, optional keyword searching, and favorites."
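The two-fixation scheme described above is easy to picture as a tiny state machine: the first fixation only arms (highlights) an element, and nothing executes until a second fixation lands on a dedicated activate button. Here is a rough sketch of that logic; all names are illustrative, not taken from EyeTube's actual code.

```python
# Sketch of two-fixation selection, the GazeGroup's answer to the
# "Midas Touch" problem of dwell-time activation. Names are illustrative.

class TwoFixationSelector:
    ACTIVATE = "activate_button"  # hypothetical dedicated activation target

    def __init__(self):
        self.highlighted = None  # element armed by the most recent fixation

    def fixate(self, target):
        """Process one completed fixation; return the selected element, if any."""
        if target == self.ACTIVATE and self.highlighted is not None:
            # Second fixation on the activate button fires the selection
            selected, self.highlighted = self.highlighted, None
            return selected
        if target != self.ACTIVATE:
            # First fixation only highlights -- looking never executes by itself
            self.highlighted = target
        return None

ui = TwoFixationSelector()
first = ui.fixate("video_thumbnail_3")    # highlight only, nothing executes
second = ui.fixate("activate_button")     # now the highlighted video is selected
```

The payoff is that simply looking around the interface can never trigger an action, which is exactly the stress the quote says dwell-time interfaces impose.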

RELATED
The GazeGroup
(The individuals mentioned below may currently be working elsewhere, but remain involved in gaze research in some way.)

GazeGroup Research Areas

COGAIN (Communication by Gaze Interaction)

ACM CHI Conference Articles
San Agustin, J., Skovsgaard, H., Hansen, J. P., and Hansen, D. W. 2009. Low-cost gaze interaction: ready to deliver the promises. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 4453-4458. DOI= http://doi.acm.org/10.1145/1520340.1520682
San Agustin, J., Hansen, J. P., Hansen, D. W., and Skovsgaard, H. 2009. Low-cost gaze pointing and EMG clicking. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 3247-3252. DOI= http://doi.acm.org/10.1145/1520340.1520466 
Tall, M., Alapetite, A., San Agustin, J., Skovsgaard, H. H., Hansen, J. P., Hansen, D. W., and Møllenbach, E. 2009. Gaze-controlled driving. In Proceedings of the 27th international Conference Extended Abstracts on Human Factors in Computing Systems (Boston, MA, USA, April 04 - 09, 2009). CHI EA '09. ACM, New York, NY, 4387-4392. DOI= http://doi.acm.org/10.1145/1520340.1520671

UPDATE

Eye-controlled games and leisure applications from the COGAIN wiki: http://www.cogain.org/wiki/Leisure_Applications
  • EyeArt - EyeArt eye-drawing program, developed by Andre Meyer and Markus Dittmar, Technical University of Dresden, Applied Cognitive Research Unit, Germany.
  • GazeTrain - Gaze-controlled action oriented puzzle game, developed by Lasse Farnung Laursen, Technical University of Denmark
  • Puzzle - Simple puzzle game that can be played with eye movements, developed by Vytautas Vysniauskas, Siauliai University, Lithuania
  • Road to Santiago - Gaze-controlled adventure game (full game), developed by Javier Hernandez Sanchiz, Universidad Publica de Navarra, Spain
  • Snap Clutch - An application that uses eye gaze data to generate key and mouse events for playing games such as World of Warcraft and Second Life.
  • ASE: Accessible Surfing Extension for Firefox - Follow this link to access ASE, an Accessible Surfing Extension for Firefox, developed by Emiliano Castellina and Fulvio Corno at Politecnico di Torino. (Note that this is a beta version.)
  • Eye Gaze Music (SAW Selection Sets) - Point and Play – eye gaze (direct pointing) musical activities, developed by DART. Please note that the SAW (Special Access to Windows) framework application is needed to play these 15 music selection sets. SAW is available for free at http://www.oatsoft.org/Software/SpecialAccessToWindows
  • EyeTube - Gaze interaction for YouTube - Follow this link to get more information and download EyeTube at ITU GazeGroup's web pages
  • Eye3D and other head eye mouse software - Eye3D for education, and a collection of links to free software that works with head or eye mouse. Includes links to downloads and original sites.
  • Gaze-controlled Breakout - Follow this link to access a modified version of the LBreakout2 game which can be operated by an SMI eye tracker, developed by Michael Dorr et al. at University of Luebeck
  • Oleg Spakov's Freeware games for MyTobii - Follow this link to access MyTobii compatible games developed by Oleg Spakov, University of Tampere, Finland
  • Free ITU Gaze Tracker and applications - Download a webcam based open-source gaze tracker and several applications that work with it, developed at IT University of Copenhagen
  • GameBase - Check out the Eye-Gaze Games category at the SpecialEffect GameBase!
  • More information about Gaze-Controlled Games - Follow this link to see a list of online information resources on using gaze for the control of games and other leisure applications

Nov 1, 2010

3D Browser-based Science Games from Muzzy Lane: The ClearLab Project

"ClearLab is a project to create innovative 3D science games for middle school students. ClearLab games will be immersive and educational, and can be played in the browser - at school, the library, at home - anywhere with access to the internet. Teachers will be able to assign, manage and assess student game play from the web."


"ClearLab is being developed by Muzzy Lane Software, Inc., in partnership with the Federation of American Scientists, curriculum developers from K12, Inc. and science teachers around the country. The project's primary goal is to develop games that improve student performance on standardized assessment and that foster lifelong passion for science."


The ClearLab Project is funded by DARPA.  It is an open development project.


Thanks to Eliane Alhadeff, of Serious Games Market, for the link!


ClearLab Blog



Cross-posted on the TechPsych blog.

Open-source Eye-tracking: The ITU Gaze Tracker 2.0 Beta Via Martin Tall, NUI-Group Member

I came across the first version of the open-source ITU Gaze Tracker on the NUI Group forum in April of 2009 and played around with it a bit.  I was impressed.  I'm happy to say that the new version looks even better, although I haven't had the time to try it out.  Below are two recent videos that will give you a better understanding about gaze tracking.  


If you're tech-curious, make sure you take the time to view the second video!  Links to info & code are below.


GT2 High speed remote eye tracking "Pushing the limits"


Technical Demonstration


Info about  the ITU Gaze Tracker 2.0 Beta from the NUI Group Forum, posted by Martin Tall:



Introducing the ITU Gaze Tracker 2.0 Beta
"We’ve made great progress since the initial release; today we open the doors for version 2.0. Internally, we’ve rewritten major parts of the platform to gain flexibility and higher performance.  The first version was DIY playtime; this version is nothing short of a screamer: high-performance, very accurate tracking. People are telling us we are crazy giving it away, but we’re dedicated to the mission: accessible eye tracking for all, regardless of nationality and means. We’re making it happen."
Important highlights for GT2.0b:
- Supports three modes of operation: head-mounted, remote monocular, and remote binocular
- Vastly improved performance: 500+ fps head-mounted, 170+ fps remote binocular (both eyes)
- Awesome accuracy: avg. 0.3 - 0.7 degrees of visual angle (remote binocular)
- New UI, looks so... 2010
- Automatic tuning (optimization of algorithm parameters)
- Relatively low CPU utilization and memory footprint (12%, 170 MB, Core i7 860, Win7-64)
- Many enhancements, bug fixes, etc.
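To get an intuition for what "0.3 - 0.7 degrees of visual angle" means in practice, you can convert angular error to on-screen pixels with basic trigonometry. The viewing distance and monitor figures below are my own assumptions (a typical ~60 cm distance and a 24-inch 1080p panel), not numbers from the GazeGroup.

```python
# Back-of-the-envelope: angular gaze error -> on-screen pixel error.
# Assumed setup: 600 mm viewing distance, 1920 px across a ~510 mm wide panel.
import math

def error_px(deg, distance_mm=600, px_per_mm=1920 / 510):
    """On-screen error in pixels for a gaze error of `deg` degrees."""
    error_mm = 2 * distance_mm * math.tan(math.radians(deg) / 2)
    return error_mm * px_per_mm

low, high = error_px(0.3), error_px(0.7)  # roughly a 12-28 px error circle
```

Under these assumptions the quoted accuracy corresponds to an error circle of roughly 12 to 28 pixels, which helps explain why gaze UIs like EyeTube use large, widely spaced targets instead of ordinary mouse-sized controls.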

Unlocking the Future of Cities through Multi-Touch Interactive Visualization at RENCI (UNC-Charlotte)

Here is a link to an article that was in the SciTech section of my morning paper today!


Unlocking the Future of Cities:  UNCC scientists work across disciplines to predict how urban areas will use open land. Tyler Dukes, Charlotte Observer, 10/31/10


"As part of a three-year, $286,000 grant from the National Science Foundation, the group of scientists from UNC Charlotte is researching the complex relationship between the Queen City and its surrounding forest and pastoral lands. Using a combination of social, natural and computer science, they're working to build an interactive map-based simulation capable of showing the impact of future development and policy on land use....It's a project requiring Meentemeyer's team to peel back multiple layers of cultural and economic values surrounding land in the South. The research will have implications beyond the Charlotte area...By allowing the public to explore those possibilities visually on anything from a laptop to a touch-screen table, the research team is hoping its work will mean more informed decisions about how people use the land around them."  -Charlotte Observer




Image Source: Charlotte Observer


Wouldn't this be a great tool to use to support collaborative learning projects in the schools?


RELATED
RENCI at UNC-Charlotte has a Multi-touch Table in the Visualization Center
RENCI Visualization Center Update
Visualization Resources at RENCI UNC-Charlotte
RENCI at UNC Charlotte
Multi-Touch at RENCI
Research by Touch:  RENCI Multitouch Table Gives Computer Science Research an Intuitive Interface