Sep 26, 2013

ACO VIRTUAL: Interactive virtual chamber orchestra, with 3D musicians

The Australian Chamber Orchestra has unveiled ACO VIRTUAL, an installation that gives music-lovers the opportunity to experience what it is like to be surrounded by thirteen performing musicians, with control over what is heard. The installation was created in collaboration with Mod Productions, an interactive multimedia company.

This promises to become a fantastic music education and performance tool!  To get a better understanding of ACO VIRTUAL, view the following video clips:

ACO VIRTUAL Trailer


"Each instrumentalist was filmed at Fox Studios, ACO artistic director and lead violin Richard Tognetti said, using the same "bullet-time" camera equipment used to shoot The Matrix, in both 2D and 3D. Visitors can enjoy the full stereoscopic experience with provided 3D glasses." -CNET

ACO Virtual Launch of Touring Installation

ACO VIRTUAL launch at MCA from Mod Productions on Vimeo.


CNET Interview: Building an Interactive Virtual Chamber Orchestra



How they did it:



The experience is controlled by a touch-pad app:
Picture of the touch pad app used to control the virtual musicians
Credit: Will Huxley, ACO VIRTUAL



(Information for exhibitors)






Sep 15, 2013

Librarians as Data Scientists: Data Science for All


"The ability to take data - to be able to understand it, to process it, to extract value from it, to visualize it, to communicate it - that's going to be a hugely important skill in the next decades. " - Hal Varian, McKinsey Quarterly

I came across the above quote on the Data Scientist Training for Librarians website.  It provides motivation for librarians - and others - to consider learning more about data science, a discipline that applies in some way to just about every field of study, as well as most professions.


Credit: Dirk Tiede/DST4L



Librarians are well positioned to become early adopters of data science: they are interested in information, understand a wide range of scholarly (and not-so-scholarly) topics, and serve as the human interface bridging people and ideas. Wearing many hats, they provide resources for people from all walks of life in community libraries, schools, universities, government, industry, and research centers.  With the power of today's computers and software tools, they have the potential to play an important role in the way we discover, curate, understand, and communicate knowledge. 

Chris Erdmann, the head librarian of the John G. Wolbach Library at the Harvard-Smithsonian Center for Astrophysics, in collaboration with like-minded others, organized the first Data Scientist Training for Librarians (DST4L) course. The first round of the course included a number of noted speakers from a variety of disciplines, and covered a number of topics, as summarized by Jennifer Prentice (Simmons Graduate School of Library and Information Science Tech Lab) on the course website.

The second DST4L course was opened to all librarians, with a solid list of topics outlined on the syllabus.  The course began last July with a two-hour introduction, captured in the video below: 



I'd like to share that Chris Erdmann is a member of the Seamless Astronomy Group, a multi-disciplinary collaboration of astronomers, computer scientists, librarians, visualization specialists, and information scientists.  The group researches and develops systems that "seamlessly integrate scientific data and literature", along with the semantic interlinking and annotation of scientific resources, the study of the impact of social media and networking sites on scientific dissemination, and the analysis and visualization of astronomical data and research communities.  

As I looked at the infographic posted on the Seamless Astronomy website (below), it occurred to me that this conceptualization could apply to a number of disciplines.  I look forward to learning more about the tools, methods, and processes that this group generates!
Credit: Seamless Astronomy Group

One of the projects that interests me is the WorldWide Telescope, a collaboration between Microsoft Research and the Seamless Astronomy Group. The WorldWide Telescope is visualization software that enables people to explore data and high-resolution images stored in the "cloud" through a web browser or desktop application. According to Chris Erdmann, a student project exploring the use of the new Oculus Rift virtual reality headset with the WorldWide Telescope is in the works.   

RELATED AND SOMEWHAT RELATED

Data Scientist Training for Librarians 
Data Scientist Training For Librarians Course Details

Data Science: What's in it for the New Librarian?
jmstanto, Information Space, 7/16/12

Cracking the Code: Librarians Acquiring Essential Coding Skills
Matt Enis, The Digital Shift, 3/6/13 (Links to tutorials)

OpenRefine: Data Scientist Trainings for Librarians Presentation (SlideShare) 
Tom Morris, 3/28/13

Librarians Are Becoming Big Data Experts
John Edwards, BigData Republic, 8/9/13

Code {4}lib Issue 21

Prototyping as a Process for Improved User Experience with Library and Archives Websites
Shaun Ellis, Maureen Callahan, code{4}lib Journal, Issue 18, 10/3/12

Crafting Linked Open Data for Cultural Heritage: Mapping and Curating Tools for the Linked Jazz Project
M. Cristina Pattuelli, Matt Miller, Leanora Lange, Sean Fitzell, Carolyn Li-Madeo, code{4}lib Journal, 7/15/13

Collaborative Librarianship News

Confederation of Open Access Repositories

Public Libraries Add Multimedia Learning to Digital Mission
Nora Fleming, Education Week, 9/15/13

Life of an emerging technologies librarian in the health sciences (must-read)
Patricia F. Anderson, UMSI Monthly, 1/2013


Data and Coding for Journalists


Although journalists aren't librarians, the profession has expanded to include data journalism, a growing field fueled by the visual nature of the web and the rising popularity of mobile devices such as smartphones and tablets. Books and newspapers are still with us, but technology has changed the way journalists do their jobs.

I'm not sure whether journalists and librarians are collaborating with each other as they venture into coding and data analysis.  If not, they should!  The links below should be of interest to librarians, journalists, and other knowledge junkies.


Video: Journalism in the Age of Data

Journalism in the Age of Data from Geoff McGhee on Vimeo.

Coding for the Future: The Rise of Hacker Journalism
Maryanne Reed, Media Shift, 5/2/13
The above post includes a number of video and audio clips on the topic, plus a number of links, such as the following:
Data scraping tool for non-coding journalists launches     (Import.io)
Sarah Marshall, Journalism.Co.Uk, 9/9/13

Data Journalism Handbook: A Five Minute Field Guide
Data Journalism Handbook Chapter:  Getting Data


ODDS AND ENDS
A little history about the idea of seamlessness in computing:
The Computer for the 21st Century
Mark Weiser, Scientific American, September 1991 (online draft)


My thoughts on the need for data/information visualization not long after the world-wide financial crisis:
Reflections: Need for Interactive Infoviz for the Financial Biz, Business Leaders, Government Officials, Educators and the Rest of US... 
Lynn Marentette, Interactive Multimedia Technology, 2/1/09



Since I'm a school psychologist (in my day job), I thought I'd slip this in: 
Why every (psychology) student should learn to code 
Matt Wall, Computing for Psychologists, 1/13/13
Many of Matt Wall's links would be of interest to anyone thinking about learning more about technology, data analysis, and related coding.


Sep 6, 2013

Narcolepsy Awareness Infographic, FDA's Public Meeting

One of my daughters was diagnosed with narcolepsy during her senior year of high school. It is one of those "hidden" diseases that doesn't get much attention. The good news is that, thanks to advocate Julie Flygare and others, the FDA has chosen narcolepsy as the focus of a public meeting on September 24, 2013, to learn more about the disease straight from the people who are challenged by it every day. 

This initiative is part of the Patient-Focused Drug Development initiative. "For each disease area, the Agency will conduct a public meeting to discuss the disease and its impact on patients' daily lives, the types of treatment benefit that matter most to patients, and patients' perspectives on the adequacy of the available therapies. These meetings will include participation of FDA review divisions, the relevant patient community, and other interested stakeholders."

For additional information see the Unite Narcolepsy website.  If you have narcolepsy or are close to someone who has it, you can complete an on-line survey to provide input to the FDA on the impact of this disease.



Eye Tribe Eye Tracker Dev Kit, $99; Open Source ITU Gaze Tracker Grows Up!

The Eye Tribe Eye Tracker developer kit is available for pre-order for $99.00. The kit comes with an SDK for C++, C#, and Java, with full source code included.  

I've been waiting for a while to see this happen! 

The Eye Tribe Eye Tracker is an outgrowth of the work of a group of researchers at the IT University of Copenhagen.  At the time, it was known as the open-source ITU Gaze Tracker. 
I came across it a few years ago in a NUI-Group forum, and later wrote a post about it when the 2.0 version was released. 

Although the Eye Tribe Tracker was originally developed to meet the needs of people with disabilities who could not otherwise access computers, it has potential for a number of other uses that were not practical before the spread of mobile technologies such as touch-screen tablets and smartphones. 

To get a better understanding of eye-gaze/tracking technology, take a look at the following videos and follow the related links.



Below is a demonstration of the gaze UI on an Android smartphone:


Here is another look at this technology running on a Windows 8 Tablet:





RELATED
The Eye Tribe (website)
Eye Tribe starts taking pre-orders for $99 Windows eye tracker
Senseye will let you control your mobile phone with your eyes
Martin Bryant, The Next Web, 12/2/11
Open-source Eye-tracking: The ITU Gaze Tracker 2.0 Beta, via Martin Tall, NUI-Group Member
Lynn Marentette, Interactive Multimedia Technology, 11/1/10
ITU GazeGroup
Gaze Tracker Development
GazeGroup Forum
Martin Tall


RELATED VIDEOS
Eye Tribe was formerly known as Senseye. Below is an earlier video that shows how it worked with a web-cam on a mobile device:



Open-Source ITU Gaze Tracker

ITU Gaze Tracker from ITUcph on Vimeo.


Earlier Videos of the ITU Gaze Tracker:
Technical Demonstration 




Seeking Sustainable Innovation

Aug 23, 2013

Allison Druin and the HCIL Team Win Emmy for Nick App: Outstanding Creative Achievement in Interactive Media-User Experience and Visual Design

I'd like to give a shout-out to Allison Druin and the team at the University of Maryland Human-Computer Interaction Lab (HCIL) for winning the Outstanding Creative Achievement in Interactive Media - User Experience and Visual Design award!  This is a new category of award for the Emmys.


The Nick App is free and available for the iPad, iPhone, iPod Touch, Windows 8, and Xbox Live.




Here is the scoop from the Emmys website:

"The Nick App is a branded experience that allows kids to watch and play Nick in unprecedented ways. This free App features a moveable tile layout that can be swiped in any direction, promoting discovery and exploration and offering kids instant and on-demand access to more than 1,000 pieces of Nickelodeon-themed content. It includes short-form videos of original skits, sketch and comedic bits, behind-the-scenes clips and photos from Nick stars and animated characters, full episodes, polls, new games, and surprising random hilarity. The Nick App supports the full Nickelodeon on-air line up as well as specials such as the annual Kids' Choice Awards. The App boasts new content daily and includes fun and funny interactive elements such as the "Do Not Touch" button that triggers an array of disruptive comedy and surprises. Nickelodeon's goal was to go beyond a typical app that offers free video viewing and instead offer more interactive content, games, and video not seen on television — whenever and wherever the user wants it." 

RELATED

Kid Design at the HCIL: Human Computer Interaction Lab, University of Maryland 
At the HCIL, children participate as co-designers, and are members of the Kidsteam. The Nick App was created with their input!

Allison Druin is an iSchool Professor and Chief Futurist for the Division of Research at the University of Maryland. She previously served as director of the HCIL and has devoted much of her career to children and technology. 

Release:  Immersive and Interactive Digital Media Programs to Receive Emmys

Emmys Category Descriptions:  Outstanding Creative Achievement in Interactive Media 
(Multiplatform Storytelling; Original Interactive Program; Social TV Experience; User Experience and Visual Design)

Aug 14, 2013

Let's Get Visual: Visual Communication with Smartphones, Getty's Open Content, Art Everywhere and Visual Perception Research

Let's Get Visual....

It is difficult to imagine a world without Instagram, Flickr, YouTube, Pinterest, or Facebook photo-sharing.  According to a press release from ABI Research in 2012, over one billion cameras were shipped in smartphones and tablets.  Never before has it been this easy to shoot and share pictures and video clips in the moment.  Why talk on the phone for twenty minutes with one person when you can share a (visual) moment with your entire extended family and network of friends, colleagues, and acquaintances with just one touch?

We are communicating in ways that were not possible a decade ago.

Qmee, an online search rewards company, put together an infographic showing what transpires online in just 60 seconds: 72 hours of video are uploaded to YouTube, 20 million photos are viewed on Flickr, 104,000 photos are shared on Snapchat, and 3,600 photos are uploaded every second to Instagram.


Infographic of what transpires on the Internet in just 60 seconds.
Credit: Qmee

According to a 2012 study published by the Pew Internet & American Life Project, 46% of adult internet users took pictures or videos and posted them online, and 41% reported finding photos or videos online and re-posting them to share with others.  If you have a Facebook account, you know that there are people who delight in taking digital content sharing to the next level. Websites such as the Programmable Web's Mashup Dashboard and Know Your Meme are examples of this trend. 


Getty Museum's Open Content Program

The Getty Museum recently launched the Open Content Program with a release of about 4,600 high-resolution images of public domain artworks in the museum's collection, free for anyone to use, modify, or publish.  The Getty plans to release more images in the future. The images can be accessed via the Getty Search Gateway.

Here is the motivation behind the release of Getty's digital content:


"Why open content?  Why now?  The Getty was founded on the conviction that understanding art makes the world a better place, and sharing our digital resources is the natural extension of that belief.  This move is also an educational imperative.  Artists, students, teachers, writers, and countless others rely on artwork images to learn, tell stories, exchange ideas, and feed their own creativity.  In its discussion of open content, the most recent Horizon Report, Museum Edition stated "it is now the mark - and social responsibility - of world-class institutions to develop and share free cultural and educational resources."" 


It will be interesting to see how this trend unfolds.  We are now past the clip-art era.


Art Everywhere

"It's all about flooding our streets with art this summer, and turning the UK into the world's largest art gallery" - Richard Reed

Art Everywhere launched in the UK on August 12th and runs through August 25th of this year (2013).  The art is on display in public spaces on thousands of billboards and digital screens across the UK, as well as on buses and cabs.  The works were chosen by the public, and the project was funded by its sponsors and donations.  

Photos of the artwork chosen for the Art Everywhere exhibition, along with related information about each selection, can be viewed on the Art Everywhere website. Richard Reed, who spearheaded Art Everywhere, provides an overview in the video below:




All of the Art Everywhere posters come to life when viewed through a smartphone. Using the free Blippar app, viewers can scan a poster with the smartphone's camera to access more information about the artwork.  Viewers are also encouraged to enter the Art Everywhere photo competition by taking pictures of the outdoor exhibition for a chance to win a camera and daily prizes.

Art Everywhere's partners include the Reed Page Foundation, the Art Fund, Tate, Posterscope, Vizeum, 101, EasyArt, Blippar, Arts Media People, Clear Channel, CBS Outdoor, JCDecaux, Ocean, and Primesight.

Neuroscience Research: Visual Perception Process

"The brain is plastic for life." - Michael Merzenich*

There have been concerns that the internet and new technologies might have a significant impact, good or bad, on the human brain and child development, as discussed by a panel held at the University of Colorado Annual Conference on World Affairs.  

The good news is that neuroscientists have new technologies to explore how the brain works, and we now have evidence that our brains retain more plasticity as we age than previously thought. Since the technology used in brain research is so new, it might be a while before we know exactly what happens when our brains are "on the internet".

Theories of how the brain functions are changing as new discoveries are made. One example is the work of Dr. Randy Bruno, a neuroscientist at Columbia University's Kavli Institute for Brain Science.  He has discovered that in rats, sensory signals are processed simultaneously in two parts of the cortex, as separate circuits.  The upper layers of the cortex might be integrating sensation with context or experience, as described in the following video:



Dr. Bruno's research aims to understand how experience causes structural changes in the brain. His work shows that neuronal connections between the thalamus and the cortex are very plastic and responsive. Although he studies brain functioning in rats, the cortical systems involved are similar to those of other mammals, including humans.

"The key is to be actively engaged in a task and that there be meaningful elaboration on the information taken in at higher levels in order to maintain associations between different pieces of information. Such active engagement, as opposed to passive acquisition of what we experience with our senses, may enhance changes in the brain." - Randy Bruno*

From what I can see, it might actually be good for us to access the web via smartphones and engage in social visual communication activities.  If we are excited by a visually-based experience and share it with others, or if we take in visual information, use it to create a mashup, and send it out for others to enjoy, our brains are actively engaged. If the neuroscientists are right, we aren't really wasting our time online.  

We are growing dendrites!

RELATED

The Fantastic Plastic Brain*
The Kavli Foundation, 7/12

Study Advances New Theory of How the Brain Is Wired
Claudia Wallace, Columbia University Research, 6/27/13

The Internet and the Human Brain Panel (Video-1 hour, 14 minutes)
University of Colorado Annual Conference on World Affairs

Instagram at the Museum: Communicating the Museum Experience through Social Photo Sharing
Alexandra Weilenmann, Thomas Hillman, Beata Junselius, CHI 2013, ACM

Open Content, An Idea Whose Time Has Come
James Cuno, The Getty Iris, 8/12/13

Getty Search Gateway

Horizon Report 2012 Museum Edition





Blippar Brings World's Largest Art Exhibition 'Art Everywhere' to Life
Blippar News Release, 8/8/13

Art Everywhere turns UK's streets into world's largest art show
Caroline Davies, The Guardian, 8/8/13

The Rise of Visual Socialization
The Halo Group Effect, 2013

The Unstoppable Trend of Visual Communications
David Amerland, HP InputOutput, 2/11/12

The Rise of Visual Social Media
Ekaterina Walter, Fast Company, 8/28/12

Research Modules: Next Generation Mobile Devices, ABI Research

Mashup Dashboard, Programmable Web

Know Your Meme

Prancercise video memes 

This Brain Discovery May Overturn a Century-Old Theory
Ben Thomas, Scientific American Blog, 8/8/13

Aug 13, 2013

VISUAL LEARNING: EMOTION-EXPRESSION-EXPLANATION CALL FOR PAPERS - Deadline for abstracts Aug. 15th

CALL FOR PAPERS


VISUAL LEARNING: EMOTION — EXPRESSION — EXPLANATION


Conference to take place in Budapest, November 15-16, 2013 
(Note: The deadline for the CFP for the following conference is August 15th, 2013. An abstract of a maximum of 200 words and a short bio of a maximum of 100 words is required by the deadline.)

Conference organized by the Visual Learning Lab, Department of Technical Education,
Budapest University of Technology and Economics. (http://www.vll.mpt.bme.hu)
Speakers will include:
Contributions are invited from educational, communication, and media theorists, philosophers, linguists, psychologists, and other interested scholars on the following and related topics:
  • educational theory and visual learning
  • images, aesthetics, art history
  • images and media theory
  • metaphor and emotion
  • images in psychotherapy
  • visual rhetoric
  • pictorial meaning
  • pictorial communication
  • visual imagery
  • visual intelligence
  • the visual mind
  • the image problem in the history of philosophy
  • resemblance, representation, reality
  • visual argument
  • scientific visualization
  • visualization and higher education
  • information visualization
  • image and language
  • image and creativity
  • images in the network age
A slot of altogether 35 minutes is planned for each presentation. We envisage an ensuing volume of selected papers (vol. 4 in the series Visual Learning, ed. by Andras Benedek and Kristof Nyiri).
Submission of abstracts (max. 200 words) and short biographical statements (max. 100 words) by Aug. 15, 2013.  Please send your submissions simultaneously to Prof. Andras Benedek <benedek.a@eik.bme.hu> (Head, Department of Technical Education) and to Kristof Nyiri <knyiri@t-email.hu> (Professor of Philosophy, Department of Technical Education). Those submitting abstracts will be notified of the decision concerning acceptance by Sept. 1, 2013.
No conference fees will be charged. Participants are encouraged to arrange their own accommodation. The conference venue (Budapest University of Technology and Economics, 1117 Budapest, Magyar Tudosok korutja 2, Bld. Q, Wing A) is located near downtown Budapest.

Series Visual Learning
Volume 1
András Benedek, Kristóf Nyíri (eds.) 2011: Images in Language: Metaphors and Metamorphoses (Frankfurt am Main, Berlin, Bern, Bruxelles, New York, Oxford, Wien, 2011)

Volume 2
András Benedek, Kristóf Nyíri (eds.) 2012: The Iconic Turn in Education (Frankfurt am Main, Berlin, Bern, Bruxelles, New York, Oxford, Wien, 2012)

Volume 3 of the series will be published in September, 2013: "How To Do Things With Pictures: Skill, Practice, Performance"

Aug 10, 2013

Kinect Interaction to support people with disabilities: DoubleFine's Happy Action Theater/Kinect Party; OAK Air Switch and Face Switch, resources

I've been experimenting with gesture and touch-based applications for many years and I'm excited to see how things have unfolded over the past couple of years, especially in the field of special education.

Last week I downloaded Double Fine's Kinect-based Happy Action Theater/Kinect Party to use during group activities with students I work with who have significant disabilities (including severe autism).  I wish I had discovered this suite of games sooner!

I had loads of fun with students and colleagues as we explored some of the 36 creative, and sometimes zany, minigames.  I had heard that DoubleFine had launched something special, but didn't realize how awesome it was until I spent some serious playful time with it at home last weekend. I then tried it out at work this past week.  

If you are planning to explore Happy Action Theater/Kinect Party, keep in mind that it plays best when there are at least two people and an audience to cheer everything along.  Through the use of blob-detection algorithms, the games can handle up to six players at a time, which is perfect for small-group special classes.

The following trailer gives just a little hint of what this suite of mini-games is all about!


I noted that many of the games were effective in helping students become more aware of their peers. They began to play and interact with one another in ways I hadn't previously imagined.  I especially liked the fact that many of the mini-games made it possible for students in wheelchairs to participate.  

I look forward to exploring more of the games over the next few months and will follow up with a future post after I get more input from my colleagues (and students).

I learned about Kinect Party through my contact with people involved with the GestureSEN wiki. The wiki was created as part of a Professional Learning Community (PLC) for people who work with students in specialized schools, similar to the school where I work. It contains a wealth of information about the use of newer and emerging technologies, such as the iPad, Leap Motion, the Kinect, and eye-gaze systems, to support young people with significant disabilities, including autism.  Some members of the GestureSEN wiki have learned to code, or are in the process of doing so, motivated by what they've experienced so far with their students.  (More information and links are listed in the "RELATED" section of this post.)

OAK

OAK was developed by RCAST at the University of Tokyo in collaboration with Microsoft Japan Co., Ltd. It uses the motion-tracking capabilities of Microsoft's Kinect sensor to create non-contact switches for people with limited mobility, enabling them to access computers and other electronic devices and systems.  The video below provides a nice overview of the OAK system.

The OAK Pro bundle includes the following applications:

The Air Switch software uses the distance/depth capabilities of the Kinect sensor, supporting gestures of the head, hands, or larger body parts to turn things on or off. Because the Kinect senses depth with infrared light, the Air Switch also works in the dark.  The color-mode function captures movements of smaller parts of the body, such as a fingertip.

The Face Switch software uses facial recognition to track the movements of the face, mouth, tongue, and eyes.  It can identify facial parts that have moved significantly, and records motion data.

The Motion History software observes the movement of a person's body using the video component of the Kinect sensor.  Movements are color-coded, giving the person setting up the system a way to fit it to the specific capabilities and needs of the user, which customizes the system to the individual and ensures the accuracy of the switch.  
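
The switch logic itself is conceptually simple. As a rough illustration only (this is not OAK's actual code, and all names and threshold values below are made up for the example), a depth-based "air switch" boils down to comparing a tracked depth reading against a calibrated threshold. A Processing-style sketch of the idea:

```processing
// Illustrative sketch only -- not OAK's implementation.
// A depth-based "air switch" compares the tracked body part's
// distance from the sensor (in millimeters) to calibrated
// thresholds. Using two thresholds (hysteresis) keeps the switch
// from flickering when the hand hovers near the trigger distance.
float onThresholdMM  = 600;  // closer than this: switch turns ON
float offThresholdMM = 650;  // farther than this: switch turns OFF
boolean switchOn = false;

boolean updateAirSwitch(float depthMM) {
  if (!switchOn && depthMM < onThresholdMM) {
    switchOn = true;   // body part moved close enough to the sensor
  } else if (switchOn && depthMM > offThresholdMM) {
    switchOn = false;  // body part withdrew past the release distance
  }
  return switchOn;
}
```

In a real system the depth value would come from the Kinect's depth stream for the tracked body part, and the thresholds would be calibrated to each user's range of motion.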

The OAK system can be enhanced by the use of peripherals, such as a USB four-channel relay box, an IR remote control device or outlet, or other on/off switches and outlets.

Assist-i Corporation has made the OAK system and peripherals available on Amazon Japan.  From what I can tell from the company's website, the OAK software can be downloaded free for a 30-day trial.  I'd love to see how it would work with some of the students I work with who have difficulty accessing conventional switches!  It would be wonderful to come up with ways for these students to access a wider range of digital media activities and games.



RELATED
University adapting videogame technology to help physically disabled computer users
Philip Kendall, Japan Today, 10/10/12
OAK Air Switch (PC Kinect)
OneSwitch.org.uk 4/30/13
OAK Air Switch, Face Switch, Motion History Pro Bundle (pdf)
Assist-i Corporation
Amazon Ai store: Assist-i Corporation (Prices are in Yen.)

Below is a partial list of links to resources related to using or creating engaging interactive applications and games for people with special needs: 

Using Kinect in Special Ed Classrooms: Advice from Loudoun County, Virginia Teachers
Microsoft in Education Team, Microsoft in Education Blog, 6/1/12

KinectSEN-Kinect and Special Educational Needs round-up
Greg Dunan, Microsoft Coding4Fun, 10/11/12

Monkeying Around with Autism Assessments: Kinect-based game by Vectorform and Kaiser Permanente therapists offers a barrel of possibilities!
Lynn Marentette, Interactive Multimedia Technology, 7/23/13

Behind the Scenes: Creating Marty the Monkey (The character from Vectorform's autism assessment app) John Einselen, Vectorform Blog, 7/24/13


Kinect Party Review: More Fun from the Fun Kings
Casey Lynch, IGN, 12/20/12


The Power of Kinect in Special Needs Education
Willemijn de Lint,  Hans Smeele, mytylschool De Ruimte

Sign Language Recognition and Translation with Kinect (pdf)
Ming Zhou, et al.

Cool Kinect move: Reading sign language in real time
Christopher MacManus, CNET, 7/18/13

Anthony Rhys, Trinity Fields ICT

James Winchester, SENClassroom blog

PMLD Eyegaze Project at Trinity Fields

Kinect hacking using Processing

Kinect SEN and Processing Resources
Keith Manville, Oak Grove College OpenSEN

Mat's Classroom Blog

GestureSEN Wiki
KinectSEN Wiki; KinectSEN News
ProcessingSEN wiki
LeapSEN Wiki
EyegazeSEN Wiki

SEN Students and Coding
OpenSEN, 3/5/13

Processing2

Kinect for Windows Blog

Kinect For Windows
DoubleFine

Understanding Engagement, Module 3.2: Training materials for teachers of learners with severe, profound and complex learning difficulties, UK Dept. for Education



Aug 3, 2013

Tablets at Work: AT&T videoclips, other examples, in case you were wondering.

If you have a tablet or an iPad, it might be difficult to imagine life without one. AT&T knows that there are plenty of people who might be ripe for buying one. To give them a nudge, it has created a series of videos centered around a couple looking for a new home, and the people who help them move toward their dream.

The first video shows how real estate agents can use the photo and video features of a tablet to enhance the home search process, saving time for all.


In the second video, the couple is connected with a tech-savvy mortgage broker, who has a short interchange with another broker who is lost in a pile of paperwork.



The third video shows how tablets are used during the home inspection process as well as planning out the interior design. 



The last video shows how the tablets can be used by the movers to complete a payment transaction and plan for their next stop. The couple orders a pizza, and a scene from the pizzeria shows how a tablet can be used to quickly complete the transaction for the order.




RELATED
Latista Named 2013 Constructech Top 50 Products, LATISTA Field Management Software
Hailey Lee, 7/23/13


How the iPad helps scientists do their jobs
Joel Mathis, Macworld, 7/9/12
Black Mana Studios Launches Interior Design for iPad: Bringing Home Designs to Life
PR Web, 12/19/12
Home Inspector Pro for iPad is now available
Scott Patterson, Inspection News, 12/29/12
New iPad App Make Bridge Inspection Management Mobile, Streamlined
Luke Abaffy, ENR.com, 4/1/13
13 New Mobile Agriculture Apps for 2013
Matt Hopkins, CropLife 3/25/13
HTML5 iPad App Eases Charting for Dentists (Uses Sencha Touch)
Brian T. Horowitz, eWeek, 7/4/12
Children's Care Alliance put iPads in hands of school nurses
Sara K. Stullo, The Express-Times, 2/4/13
100 Boston Police Detectives Getting iPads For Their Jobs
CBS Boston, 6/29/13
Electrical Apps for Electricians and Electrical Engineers
ProCerts Software
Apps in Therapy (Physical Therapists)
Margaret Schmidt, RehabCareAdvantage, 1/8/13
10 Best iPad Apps for Hairstylists
TechiCore, 10/31/12
Technology That serves to Enhance, Not Distract  (Use of iPads in museums and an interactive touch screen wall.)
Fred A. Bernstein, New York Times, 3/20/13

Jul 27, 2013

Drawing with Sounds, an Interactive Musical Drawing App Created with Processing

This is my first creation using the Drawing with Sounds application. This cause-and-effect application was created with Processing 1.5.1 and produces musical sounds and patterns, along with random abstract shapes, triggered by movements of a mouse. The application has been adapted for switch users, making it accessible to those who have motor impairments.

Processing is an open-source programming language and integrated development environment built on Java that is easy to learn for people with limited technical backgrounds. It is used by people from a range of fields, such as art, music, journalism, and data visualization.

Because Processing is easy to use, people with just a little coding knowledge can adapt a variety of open-source applications and tweak a few variables to suit their needs.  Using the Drawing with Sounds app as an example, I wanted to change the size of the interactive canvas.  With a few keystrokes, I changed the original size of the screen from 800 x 800 to something larger.

I then typed in another variable for the background color, and was pleased with the results.
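
For readers new to Processing, both tweaks live in the sketch's setup() function: size() sets the canvas dimensions and background() takes an RGB color. The Drawing with Sounds source isn't reproduced here, so the sketch below is only a minimal illustration of the two edits, with made-up replacement values:

```processing
// Minimal illustration (not the Drawing with Sounds source itself)
// of the two one-line tweaks described above, for Processing 1.5.1.
void setup() {
  // Originally size(800, 800); enlarging the interactive canvas:
  size(1280, 960);

  // A new background color, given as RGB components (0-255):
  background(20, 20, 80);  // the exact color is illustrative
}
```

Running the sketch after each change shows the effect immediately, which is part of what makes Processing so approachable for beginners.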

The application was developed from Andrew R. Brown's tutorial on "SoundCipher", based on the SoundCipher library for Processing.  According to the description of the application, "The sketch is intended for use....as a sensory stimulus using sound, shape and color to create engagement". 




Below is a videoclip created by Keith Manville, demonstrating what happens when you interact with Drawing with Sounds: 


I look forward to sharing more on this topic in future blog posts.  I'll be sure to include basic "how-to" information for my "low-tech" readers and colleagues who desire to learn a bit more about using and creating basic interactive applications that appeal to young people with special needs.  


RELATED
Note:  SEN is the UK acronym for "Special Educational Needs"

James Winchester, SEN Classroom Blog, 10/31/13

Keith Manville, opensen blog, 10/27/12
"Open-source, Sensory and Interactive Technology in SEN"

(An accessible switch-enabled version of Drawing with Sounds can be downloaded from the above link.)

Free sensory applications built in Processing 1.5.1
"Download links for sensory applications built in Processing 1.5.1, designed to assist SEN students in engaging in learning through the use of ICT, interactive “Smart” boards, touchscreen or tablet technologies.  Our experience is that using applications on these devices or with natural user interfaces such as the Microsoft Kinect can increase the opportunities for engagement and social communication with many students."


I will be posting more information about emerging interactive technologies for special needs in the future. In the meantime, take a look at the following resources:

The SHAPE Project, technologies to enhance learning for young people on the Autism Spectrum
iPadSEN
KinectSEN Wiki
"Using Kinect in Special Schools for Pupils with Severe Learning Difficulties"
iGazeSEN
OPENSEN Videos