Showing posts with label Intel. Show all posts

Jun 25, 2012

Ph.D. Student Positions: Intel Collaborative Research Institute on Sustainable Connected Cities

Thanks to Johannes Schöning for sharing information about this opportunity!
This might be of interest to some of my IMT readers:


EngD/PhD Positions within the Intel Collaborative Research Institute on Sustainable Connected Cities (ISCCI) at University College London (UCL)

The Department of Computer Science at UCL is inviting applications for up to 6 Research Student Positions (1 EngD of 4 years, and up to 5 PhDs of 3 years), starting September 24th 2012 or January 7th 2013.


With 6.3 billion people expected to dwell in cities by 2050, the aim of the ISCCI is to create and realize a compelling vision of a sustainable future made possible by adaptive technologies that optimize resource efficiency, enable new services, and support the quality of life of urban inhabitants. The Institute is located within a rich external ecosystem of companies and researchers, both locally and globally, investing in this important domain. The ISCCI is led by Prof. Yvonne Rogers at UCL.


We are looking for students willing to pursue a doctoral degree in computer science around the following broad topics:
•       How technology can help recognize, leverage, and support the out-of-sight, hidden or forgotten resources of urban environments, ranging from volunteers to subterranean water systems and other underlying city infrastructures.
•       How communities can encourage sustainable behaviours over time, for example, through meaningful visualizations and feedback about resource usage to individuals and groups.
•       How technology can give us an opportunity to develop new ideas of place and identity, considering the diversification and proliferation of new types of communities in cities, with the aim of increasing quality of living and lowering the barriers to mobility in our future connected cities.


The applicants should possess a good honours MSc degree (1st Class or 2:1 minimum) in Computer Science, Psychology, Human-Computer Interaction, or related disciplines. Candidates will be expected to work in teams comprising computer scientists, social scientists, and psychologists, so an open attitude towards interdisciplinary research and teamwork is important. Candidates should have an interest in at least two of the following research fields (as well as a good command of the English language):
•       Human-Computer-Interaction,
•       Augmented or Mixed Reality,
•       Interactive 3D Computer Graphics,
•       Interaction Design,
•       Perceptual Psychology,
•       Cognitive Sciences,
•       Ethnography,
•       Data Mining, Machine Learning,
•       Crowdsourced Data,
•       Data Visualization, Cartography,
•       Geoinformatics, and/or
•       Big Data.


Fees are fully paid. Salary for the 4-year EngD position is £18,090 tax free p.a., and £15,590 tax free p.a. for the 3-year PhD positions. The closing date for applications is 5pm on 18th July 2012. Interviews will be held on July 26th and 27th.


The start date is September 24th 2012 (though it can be postponed to January 7th 2013). No part-time option is available. Please download the application form at http://www.ucl.ac.uk/uclic/phd_studentships/Intel_studentship_application_form/ and email the completed form to Louise Gaynor (l.gaynor@ucl.ac.uk) as a single PDF document by 5pm on Wed 18th July 2012. Please indicate in your application whether you wish to start in Sept 2012 or Jan 2013.


Dec 12, 2011

Near Field Communication (NFC) for SmartPhones at the Museum of London, plus related video and NFC links

I recently came across the following video demonstrating the use of NFC (Near Field Communication) at the Museum of London.  I wouldn't be surprised if this technology takes root in a variety of applications in the very near future!


Museum of London's use of Near Field Technology with Nokia


London History Museum Adopts Technology of Future
Claire Swedberg, RFID Journal, 8/16/11
"Museum of London has installed NFC RFID tags in its two locations, to provide the growing number of NFC-enabled phone users with additional exhibit information, access to social Web sites and vouchers."
Byte at the museum
Trevor, Conversations by Nokia, 9/26/11


RELATED 
Apple, Microsoft Reported to Include NFC in 2012 Smartphones
Christina Bonnington, Wired Gadget Lab, 11/22/11

Gamification and NFC top 2012 Digital Trends
Wenlei Ma, AdNews, 12/12/11
Microsoft: NFC already supported in Windows Phone
Gareth Beavis, TechRadar.com, 12/9/11

Near Field Communication
Android Developers
Intel joins NFC Forum board
Telcompaper, 12/8/11

Join the NFC (Near Field Communication) Revolution Today
NFC Data Inc, Sacramento Bee, 11/30/11
"Pledge support. Set a new mobile wallet standard for security, convenience, and affordability! Think outside the phone!"

NFC Near Field Communication Tutorial
Radio-Electronics.com
"A summary overview or tutorial of Near Field Communications, detailing what is NFC and its technology"
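The tags in deployments like these typically store their link in the NFC Forum's NDEF format. As a rough illustration, here is a minimal decoder for a single short-form NDEF URI record (a sketch based on the public NDEF and URI record-type specs; the example bytes and museum URL are invented, and real tags often carry multi-record messages this does not handle):

```python
# Minimal decoder for a single short-form NDEF URI record, the kind of
# payload a museum "tap here" tag might carry. Sketch based on the public
# NFC Forum NDEF and URI record-type specs; example bytes are invented.

URI_PREFIXES = {0x00: "", 0x01: "http://www.", 0x02: "https://www.",
                0x03: "http://", 0x04: "https://"}

def decode_uri_record(data: bytes) -> str:
    header = data[0]
    assert header & 0x10, "only short records (SR=1) are handled here"
    type_len = data[1]
    payload_len = data[2]
    rec_type = data[3:3 + type_len]
    assert rec_type == b"U", "not a URI record"
    payload = data[3 + type_len:3 + type_len + payload_len]
    # First payload byte is an abbreviation code for a common URI prefix.
    prefix = URI_PREFIXES.get(payload[0], "")
    return prefix + payload[1:].decode("utf-8")

# A hypothetical tag: header 0xD1 (MB, ME, SR, well-known type), type "U",
# prefix code 0x01 ("http://www.") followed by the rest of the URL.
tag = bytes([0xD1, 0x01, 0x0F, ord("U"), 0x01]) + b"museum.org/x12"
print(decode_uri_record(tag))  # http://www.museum.org/x12
```

On an NFC-enabled phone, the OS does this decoding for you and simply opens the browser; the point here is how little data a tag needs to hold.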
Vending Machines Now Take Google Wallet Payments
Wall Street Journal, MarketWatch



Sep 22, 2011

Intel's 'Museum of Me': Moving 3D visualizations of your Facebook life.



(I missed this one when the buzz started. Thanks to Liz Dorland for the link.)


Museum of Me is a website created by Intel that provides a variety of interesting information visualizations of your photos and other information from Facebook.  Below is a video from Intel that gives a preview of what Museum of Me is about.  Better yet, take a look at all of the 'Museum of Me' videos posted on YouTube.

Video Demo: Museum of Me


RELATED
Intel's 'Museum of Me' is Cool, Creepy Facebook Fun
Sarah Jacobsson Purewal, PCWorld, 6/1/11


Intel's Facebook 'Museum of Me' is a Must-Try
Don Reisinger, CNET News, The Digital Home, 6/2/11

Intel on Facebook


May 3, 2011

Futurecasting and Science Fiction at Intel: Video of Brian David Johnson, Futurist.



"The Tomorrow Project" is an international program that explores and creates science fiction based on science fact. The project features science fiction stories, comics and short screenplays based on current research and emerging technologies and examines their effect on our future. -Intel
The Tomorrow Project-Seattle

RELATED
Intel's Context Aware Computing - Justin Rattner, Intel Chief Technology Officer
Intel Developers Forum, 2010


Mar 16, 2011

Cute TV Commercial! Intel Gesture Recognition Technology

This was the first television commercial I saw on TV this morning. I had to stop my morning routine to watch it : )



RELATED
From ChannelIntel: Interview of Francis MacDougall, CTO of GestureTek

Nov 11, 2010

Interactive Touch-Screen Technology, Participatory Design, and "Getting It", Revisited

I've been planning on updating one of my popular posts, "Interactive Touch-Screen Technology, Participatory Design, and Getting It" for a while. 


Here is a compromise - since much still rings true two years since I wrote it, the bulk of this post remains the same.   I've updated a few sections with additional video examples of interactive touch-screen applications, good and bad, along with a few links and resources, located at the bottom of the post. 


(The missing piece of information?  An update about apps for the iPad and similar touch-screen tablets.)

Sit back and enjoy!


-Images: HP; Wired

There's been some discussion over the reasons why so many people don't understand touch screen, or "surface" computing, even though research in this area has been going on for years.

As the new owner of the HP TouchSmart, I know that I get it.

The research I've conducted in this area suggests that people will "get it" only if there is a strong commitment to developing touch-screen "surface" applications through a user-centered, participatory design process. In my view, this should incorporate principles of ethnography and ensure that usability studies are conducted outside of the lab.


This approach was taken with Intel's Classmate PC. Intel has about 40 ethnographic researchers, and sent many of them to work with students and teachers in classrooms around the world. (A video regarding ethnographic research and the Intel Classmate project can be found near the end of this post.)

-Images: ClassmatePC


Where to start?
K-12 classrooms and media centers. Public libraries. Malls. Hospital lobbies and doctors' offices. Any waiting room. Staff lounges in medical centers, schools, and universities. Community festivities and events. Movie theater lobbies. Museums and other points of interest.


I believe we need to take a "touching is believing" approach.

Here are some thoughts:
When I try to explain my fascination with developing touch-screen interactive multimedia applications (interactive whiteboards, multi-touch displays and tables, and the like), many of my friends' and family members' eyes glaze over. This is particularly true for people I know who are forty-ish or over.

Even if you are younger, if you never saw the cool technology demonstrated in the movie Minority Report, or if you have limited experience with video games, or if you haven't come within touching distance of an interactive whiteboard, the concept might be difficult to understand.


The reality?
Even people who have the opportunity to use surface computing technology on large screens do not take full advantage of it. Multi-touch screens are often used as single-touch screens, and interactive whiteboards in classrooms often serve as expensive projector screens for teacher-controlled PowerPoint presentations.


Most importantly, there are few software developers who understand the surface computing approach, even with the popularity of the iPhone and iPod Touch. Most focus on traditional business-oriented or marketing applications, and have difficulty envisioning scenarios for surface computing.  There is a need for a breath of fresh air!

Another factor is that not all people entrusted to market surface or touch screen computing fully understand it.
Despite a cool website showing off the goods, Microsoft's Surface multi-touch table has been slow to take off, limiting hope of bringing the price tag down to a level most families or schools could afford. (The picture above depicts an application for the Surface designed for health care professionals, not K-12 science education.)

Although you can't buy a Surface table for your family room, it is possible to buy a TouchSmart.  
HP's TouchSmart website is engaging and highlights some examples of touch-screen interaction, but most people don't seem to know about it. (Since this post was first written, many more touch and multi-touch options have become available to the public, such as Dell's offerings, the iPad, etc.)

Unfortunately, you wouldn't have a clue that the HP TouchSmart exists browsing the aisles at Circuit City or Best Buy!

When I was shopping for my new TouchSmart, I noticed that from a distance, the TouchSmart looked just like the other large flat-screen monitors filling up the aisles. The salespeople at both stores were not well-informed about the system. The only reason I knew about the new TouchSmart was my obsession with interactive multimedia touch-screen applications: designing them, developing them, studying them, reading about them, blogging about them.... ; }

More thoughts:

After studying HCI (Human-Computer Interaction), and relating this knowledge to what I know as a psychologist, my hunch is that the "Windows, Icons, Mouse, Pointing-device" (WIMP) and keyboard input mind-set is embedded in our brains, to a certain extent. Like driving a car, it is automatic and expected. This is true for users AND developers. (Update - See The Post WIMP Explorers' Club: Update of the Updates for a review of a discussion among passionate post-WIMP folks.)

Think about it.

Suppose one day, you were told that you no longer were allowed to control your car by turning on the ignition, steering the wheel, or using your feet to accelerate, slow down, or stop the car! Instead, you needed to learn a new navigation, integration, and control system that involved waving your hands about and perhaps speaking a few commands.

For new drivers who'd never seen a car before, this new system would be user-friendly and intuitive. Perhaps it would be quite easy for 16-year-old kids to wrap their heads around this concept. For most of us, no. Imagine the disasters we would see on our streets and highways!

When we think about how newer technologies are introduced to people, we should keep this in mind.

In my mind, spreading the word about surface computing is not an "if you build it, they will come" phenomenon, like the iPhone. We can't ignore the broader picture.

From my middle-age woman's vantage point, I believe it is important that those involved with studying, developing, or marketing surface computing applications realize that many of us simply have no point of reference other than our experiences with ATMs, airline kiosks, supermarket self-serve lanes, and the like. (The video clip at the very end of this post provides a good example of touch-screen technology gone wrong.) -UPDATE: additional videos were added to this post.

Be aware that there are substantial numbers of people who might benefit from surface computing who prefer to avoid the ATMs, airline kiosks, and self-serve grocery shopping.

Realize that the collective experience with technology, in many cases, has not been too pretty. Many people have had such user-unfriendly experiences with productivity applications, forced upon them by their employers, that any interest or desire to explore emerging technologies has been zapped.

My own exposure to interactive "surface" related technology was somewhat accidental.
A few years ago, a huge box was deposited in the room where I worked a couple of days a week as a school psychologist at a middle school. After a week or so, I became curious and found out that it was a SmartBoard. Until then (2002!), I did not know that interactive whiteboards existed.

The box remained unopened in the room for the entire school year, but no worry. I played with the only other SmartBoard in the school, and found a couple at the high school where I also worked. I hunted for all of the applications and interactive websites I could find, and tried them out. That is when I was hooked. I could see all kinds of possibilities for interactive, engaging subject-area learning activities. I could see the SmartBoard's potential for music and art classes. With my own eyes, I saw how the SmartBoard engaged students with special needs in counseling activities. (By the way, if you are working with middle school students, PBS Kids' It's My Life website activities work great on an interactive whiteboard.)

A few years have passed, and reflecting on all of my fun experiences with interactive whiteboards, with and without students, I now understand that many teachers still have had limited exposure to this technology.

This school year, many teachers are finding themselves teaching in classrooms recently outfitted with interactive whiteboards, scrambling along with educational technology staff development specialists, to figure out how it works best with various groups of students, and what sort of changes need to be made regarding instructional practice.


For the very first time, interactive whiteboards were installed in two classrooms at one of the schools I work at. One of the teachers I know thanked me for telling her about interactive whiteboards and sharing my resources and links.

If I hadn't let her know about this technology, she wouldn't have volunteered to have one installed in her classroom. It has transformed the way she teaches special needs students.

In the few months that she's used the whiteboard, I can see how much it has transformed the way the students learn. They are attentive, more communicative, and engaged. The students don't spend the whole day with the whiteboard - the interactive learning activities are woven into lessons at various times of the day, representing true technology integration.

Now let's see what happens when all-in-one touch-screen PC's are unleashed in our schools!

UPDATE: Take a look at a post I wrote for Innovative Interactivity just after SMART Technologies acquired NextWindow - the post describes in detail how interactive whiteboards are transforming learning and teaching in a program for students with special needs.
SMARTTechnologies Acquires NextWindow: A "smart window" to the world


There are some interesting changes going on at the intersection of HCI and educational technology research.  I participated in a workshop at CHI 2010 last April and was impressed by what is going on in this area, around the globe:   Next Generation of HCI and Education

Value of ethnographic research:
Ethnographic Research Informed Intel's Classmate PC
"Intel looked closely at how students collaborate and move around in classroom environments. The new tablet feature was implemented so that the device would be more conducive to what Intel calls “micromobility”. Intel wants students to be able to carry around Classmate PCs in much the same way that they currently carry around paper and pencil." -via Putting People First and Ars Technica

The video below is from Intel's YouTube Channel. Information about Intel's approach to ethnographic research in classrooms during the development of the Classmate PC is highlighted. This approach uses participatory design and allows the set of applications developed for the Classmate PC to reflect the needs of local students and teachers. Schools from many different countries were included in this study.




FYI: TOUCH SCREEN DISPLAYS:  NEED FOR IMPROVEMENT!

Touch Screen Coke Machine at the Mall: 90 Seconds to get a Coke


User-Unfriendly Interactive Display in the Rain (Ballantyne Village Shopping Center)

User-Unfriendly Information Kiosk Interactive Map
I encountered this puzzling and frustrating interactive directory/map at the Cleveland Clinic.  When I went to visit a relative at the hospital a year or so later, the map was no longer there.


BETTER EXAMPLES OF INTERACTIVE SCREENS:
Here are some interesting pictures from lm3labs, which are in my interactive usability hall of fame:



Samsung's new Omnia SDG i900 was re-created in a much larger size, using lm3labs' Ubiq'window touchless technology. For more about lm3labs, including several video clips, take a look at one of my previous posts: Lm3Labs, Nicolas Leoillot, and Multimedia Interaction

Midwife Toad App on a Microsoft Surface, Discovery Place Science Center


TellTable:  Digital Storytelling on the Surface:  Microsoft Research, UK


DECEMBER 31, 2009 -Interactive Soda Machine for Fun

The interactive screen on the Coke machine attracted the attention of this young child. He loved spinning the image of the bottle. So did the dad! He said, "I'd like something like this for my home!" I told him about the HP TouchSmart - both the dad and the mom did not know that there were affordable all-in-one touch screens available, but they knew about SMARTboards, because their children's classrooms had them. Note: No one from this family actually purchased a soft drink. I was hoping to time how long it would take them to do so!



Some resources:
lm3labs (catchyoo, ubiq'window)
NUITeq
NUI Group (See members' links)
Sparkon (See members' links and multi-touch projects)

(More information and resources can be found by doing a "multitouch" or related search on this blog or The World Is My Interactive Interface.)

If you have plenty of time, take a look at my Post WIMP Explorers' Club YouTube playlist.
"Natural user interfaces, gesture interaction, multi-touch, natural interaction, post WIMP examples and more..."

FYI: I visited the Ballantyne Village shopping center a couple of months ago to follow up on the interactive displays, including the one I tried to use while it was raining.   The shopping center changed ownership, and the displays were replaced by the old-fashioned kind, pictured below:



Jul 2, 2010

Gesture and object recognition on your kitchen counter: The Oasis Project demo from Intel Labs

Intel's Oasis system uses object recognition to trigger various applications that generate such things as shopping lists and recipes. The system can handle more than one item of food. It includes videos of how to prepare meals, a great feature for people just starting out, or those who are learning to prepare healthier meals.
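As a thought experiment, once a vision system has labeled the items on the counter, the trigger logic is a simple set-matching step. Here is a purely illustrative sketch (not Intel's actual Oasis code; the recipes, ingredient names, and the idea of a detected-label set are all invented for this example):

```python
# Illustrative sketch (not Intel's Oasis code): map recognized objects on
# the counter to recipe suggestions. The real system's computer-vision
# step is assumed to have already produced a set of item labels.

RECIPES = {
    frozenset({"tomato", "basil"}): "caprese salad (add: mozzarella)",
    frozenset({"egg", "flour"}): "fresh pasta (add: olive oil)",
}

def suggest(detected: set[str]) -> list[str]:
    """Return recipes whose ingredients are all present on the counter."""
    return [recipe for ingredients, recipe in RECIPES.items()
            if ingredients <= detected]

print(suggest({"tomato", "basil", "knife"}))
# ['caprese salad (add: mozzarella)']
```

The "(add: ...)" notes hint at how a shopping-list feature could fall out of the same matching step: anything in a near-complete recipe that is not on the counter goes on the list.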

In my opinion, this sort of application would be useful to people with disabilities that affect memory.



(Previously posted on THE WORLD IS MY INTERACTIVE INTERFACE blog.)

Jun 24, 2010

SIGGRAPH and Intel's Visual Adrenaline: Look what I found in my email today about 3D lighting, UVs, and dynamic volumetric cloud rendering for games!

I thought I'd share a few links sent to me via Intel's Software Dispatch for Visual Adrenaline about the upcoming ACM SIGGRAPH 2010 conference. The links are to well-written "how-to" articles in pdf format.

"To keep your adrenaline pumping while waiting for SIGGRAPH to start, here's a few juicy bits of inspiration:

"Simulating Real-world Film Lighting Techniques in 3D. Light, shadows, and rendering play together to create realism in your 3D models. Learn to bring these art techniques into your 3D scenes to create compelling cinematic views."
"Creating UVs for Characters in Autodesk Maya*. Regardless of whether you're using a model in a real-time environment or as part of a rendered sequence, here are some tips and tricks to making UV mapping work efficiently."
"Dynamic Volumetric Cloud Rendering for Games on Multi-Core Platforms. Learn how the LuckyCloud demo implemented a solution for real-time dynamic simulation and illumination of clouds that didn't impact performance during game play."
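These articles build on standard real-time shading fundamentals. For instance, the diffuse (Lambertian) term that lighting tutorials typically start from fits in a few lines (a generic sketch, not code from the Intel articles):

```python
# The core of Lambertian diffuse shading: surface brightness falls off
# with the cosine of the angle between the surface normal and the light
# direction. A generic sketch, not code from the Intel articles.

import math

def lambert(normal, light_dir, intensity=1.0):
    """Diffuse brightness: intensity * max(0, N . L) for unit vectors."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return intensity * max(0.0, dot)  # clamp: no light from behind

# Light hitting a surface at 60 degrees off the normal -> half brightness.
n = (0.0, 0.0, 1.0)
l = (math.sin(math.radians(60)), 0.0, math.cos(math.radians(60)))
print(round(lambert(n, l), 3))  # 0.5
```

Film-style lighting setups (key, fill, rim) are, at bottom, several of these terms summed per light, which is why the article's stage-lighting metaphors translate so directly into 3D scenes.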
 
By the way, the conference will be held from July 25-29 at the Los Angeles Convention Center.

Apr 4, 2010

ASU's SMALLab, a Mixed Reality Learning Environment: Multi-modal and Collaborative; Links to Publications about Digital Media and Learning for the 21st Century

"Nothing's impossible, we just get smarter and smarter by the day." - Student, commenting about his experiences in the SMALLab environment. 


The research team at Arizona State University, led by David Birchfield, has worked to create embodied, multimodal, and collaborative mediated learning environments using mixed reality; SMALLab has been in use at Coronado High School with much success. SMALLab takes a learner-centered approach that provides multi-modal, multi-sensory activities that engage learners and result in deeper understanding of more complex concepts.


Here is some information from the SMALLab website:

SMALLab @ Arizona State University - 2009 from aisling kelliher on Vimeo.


Video of high school students describing their work in SMALLab (Coronado High School)
"Central to our work is the development of a new interactive mixed reality learning environment, the Situated Multimedia Art Learning Lab [SMALLab]. SMALLab is an environment developed by a collaborative team of media researchers from education, psychology, interactive media, computer science, and the arts. SMALLab is an extensible platform for semi-immersive mixed-reality learning. By semi-immersive, we mean that the mediated space of SMALLab is physically open on all sides to the larger environment. Participants can freely enter and exit the space without the need for wearing specialized display or sensing devices such as head-mounted displays (HMD) or motion capture markers. Participants seated or standing around SMALLab can see and hear the dynamic media, and they can directly communicate with their peers that are interacting in the space. As such, the semi-immersive framework establishes a porous relationship between SMALLab and the larger physical learning environment. By mixed-reality, we mean that there is an integration of physical manipulation objects, 3D physical gestures, and digitally mediated components. By extensible, we mean that researchers, teachers, and students can create new learning scenarios in SMALLab using a set of custom designed authoring tools and programming interfaces.

SMALLab supports situated and embodied learning by empowering the physical body to function as an expressive interface. Within SMALLab, students use a set of “glowballs” and peripherals to interact in real time with each other and with dynamic visual, textual, physical and sonic media through full body 3D movements and gestures. For example, working in the Spring Sling scenario, students are immersed in a complex physics simulation that involves multiple sensory inputs to engage student attention. They can hear the sound of a spring picking up speed, see projected bodies moving across the floor, feel a physical ball in their own hands and integrate how the projected ball moves in accordance with their own body movements to construct a robust conceptual model of the entire system."
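To give a flavor of what a scenario like Spring Sling computes behind the scenes, here is a toy mass-on-a-spring simulation (my own sketch, not SMALLab code; the constants, and the idea of the student's tracked hand standing in for the anchor point, are assumptions):

```python
# A toy version of the kind of real-time spring simulation a mixed-reality
# scenario might drive (a sketch, not SMALLab code). In SMALLab the anchor
# could be the student's tracked glowball position; here it is fixed at 0.

def step(x, v, anchor, k=4.0, m=1.0, dt=0.01):
    """Advance the mass one step with semi-implicit Euler integration."""
    a = -k / m * (x - anchor)   # Hooke's law: pull back toward the anchor
    v = v + a * dt              # update velocity first (keeps energy stable)
    x = x + v * dt              # then position
    return x, v

x, v = 1.0, 0.0                 # start stretched one unit from the anchor
for _ in range(1000):           # simulate 10 seconds of motion
    x, v = step(x, v, anchor=0.0)
print(round(x, 3))              # still oscillating within roughly +/-1
```

Feeding a moving anchor (the student's hand) into `step` each frame is what would make the projected spring respond to full-body movement.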


SLINK links the various installations of SMALLab, which are found in museums, schools, and the Institute of Play in NYC (used with students who attend the digital-games-based, technology-rich charter school, Quest to Learn).
Gaming SMALLab [IOP, ASU]: Lab install at Parsons DT
-Institute of Play's SMALLab

-Arizona State University: A SMALL way to keep up with technology (Sheilah Britton, 5/11/09)

How SMALLab Works:


SMALLab modules


More about the way it works: SMALLab-Situated Multimedia Arts Learning


About David Birchfield:
David Birchfield is "a media artist, researcher, and educator. He has created work that spans from interactive music performance to generative software to robotic installations to K-12 learning environments. In recent years, this work cuts across three areas of exploration: K-12 learning, media art installations, and live computer music performance."


Some publications:
  • Birchfield, D., Megowan-Romanowicz, Johnson-Glenberg, M., Next Gen Interfaces: Embodied Learning Using Motion, Sound, and Visuals – SMALLab. To appear in Proceedings of the American Educational Research Association Annual Conference; SIG Applied Research in Virtual Environments for Learning [ARVEL], San Diego, CA, April 2009.

  • Megowan-Romanowicz, M., Uysal, S., Birchfield, D., Growth in Teacher Self-Efficacy Through Participation in a High-Tech Instructional Design Community, to appear in proceedings of the National Association for Research in Science Teaching Annual Conference, Garden Grove, CA, April 2009.
  • Birchfield, D., Thornburg, H., Megowan-Romanowicz, C., Hatton, S., Mechtley, B., Dolgov, I., Burleson, W., Embodiment, Multimodality, and Composition: Convergent Themes Across HCI and Education for Mixed-Reality Learning Environments, Journal of Advances in Human-Computer Interaction, Volume 2008, Article ID 874563.

  • Dolgov, I., Birchfield, D., McBeath, M., Thornburg, H., Todd, C., Amelioration of Axis-Aligned Motion Bias for Active versus Stationary Judgments of Bilaterally Symmetric Moving Shapes’ Final Destinations, Perception and Psychophysics, in press 2008.

  • D. Birchfield, B. Mechtley, S. Hatton, H. Thornburg, Mixed-Reality Learning in the Art Museum Context, Proceedings of ACM SIG Multimedia, Vancouver, BC, October 27, 2008.

  • S. Hatton, D. Birchfield, M.C. Megowan, Learning Metaphor through Mixed-Reality Game Design and Game Play, Proceedings of ACM Sandbox Conference, Los Angeles, CA, August 10, 2008. [pdf]

Institute of Play's SMALLab contact:
          Katie Salen, Executive Director, Institute of Play; Associate Professor, Parsons The New School for Design


The Institute of Play, along with the Joan Ganz Cooney Center and others, has a number of publications related to technology and learning:

"The mission of The Joan Ganz Cooney Center is to catalyze and support research, innovation and investment in digital media technologies to advance children's learning. Nurturing foundational and "21st century" literacies:

The inaugural focus of the Center—given the national need—will be on determining how technology can help elementary-aged children develop the fundamental building blocks of literacy. These include the vital reading, writing, speaking and listening capabilities that all children must develop during the primary grades. A special emphasis of the Center will be on struggling readers who risk educational failure if they do not catch up to their peers by grade four...Another important focus of the Center is to leverage the potential of interactive media to promote "21st century" literacies that students will need to compete and cooperate in our connected world—competencies such as critical thinking and problem solving, second language competency, inter-cultural understanding and media literacy."

Here are a few background resources from the Institute of Play's resources page:
The following reports will bring you up-to-date about technology and "21st Century" learning.
Generation M2: Media in the Lives of 8- to 18-Year-Olds
The Kaiser Family Foundation - Report - 2010
Wallis, C. The Impacts of Media Multitasking on Children’s Learning & Development
The Joan Ganz Cooney Center - Meeting Report - 2010 (pdf)
Shuler, C. iLearn: A Content Analysis of the iTunes App Store’s Education Section
The Joan Ganz Cooney Center - Report - 2009 (pdf)
Wellings, J., Levine, M.H. The Digital Promise: Transforming Learning with Innovative Uses of Technology
The Joan Ganz Cooney Center - White Paper - 2009 (pdf)
Thai, A. M., Lowenstein, D., Ching, D., and Rejeski, D. Game Changer: Investing in Digital Play to Advance Children's Learning and Health
The Joan Ganz Cooney Center - Policy Brief - 2009 (pdf)
There are a number of similar articles published by CISCO and/or the Metiri Group:
Lemke, C., Coughlin, E., Reifsneider, D. (2009). Technology in the Schools: What Does the Research Say? (pdf)
Lemke, C. (2009) Multimodal Learning through Media: What the Research Says (pdf)
Williams, S.M. The Impact of Collaborative, Scaffolded Learning in K-12 Schools: A Meta-Analysis (pdf)