Showing posts with label ubiquitous computing.

Dec 2, 2012

EpiCollect: A mobile app, useful for photo + data-collection "in the wild".

EpiCollect is an open-source project developed at Imperial College London, funded by the Wellcome Trust.  According to information posted on the project's website, "EpiCollect is a generic data collection tool that allows you to collect and submit geotagged data forms (along with photos) to a central project website (hosted using Google's App Engine) from suitable mobile phones (Android or iPhone). For example, questionnaires, surveys, etc.  All data synchronised (ie a copy sent from the phone) from multiple phones can then be viewed/charted/filtered at the project website using Google Maps/Earth or downloaded. Furthermore, data can be requested and viewed/filtered from the project website directly on your phone using Google Maps." -EpiCollect

EpiCollect Overview  epicollect.net
(Credit:  EpiCollect Website)

EpiCollect makes use of web APIs such as Google Maps, Google Charts, Google Talk, and the KML specification, along with JavaScript libraries such as jQuery, script.aculo.us, ExtJS, and Mapstraction.  It runs on Google App Engine, and is available for Android and iPhone.
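To make the idea concrete, here is a rough Python sketch of what a single geotagged submission might look like. The form name, field names, and structure below are my own invention for illustration; they are not EpiCollect's actual schema or API.

```python
# Hypothetical sketch of a geotagged EpiCollect-style record.
# Field names and form structure are invented for illustration only.
import json

record = {
    "form": "bird_survey",
    "timestamp": "2012-12-02T10:15:00Z",
    "location": {"lat": 51.4988, "lon": -0.1749},  # geotag from the phone's GPS
    "answers": {"species": "European robin", "count": 2},
    "photo": "IMG_0042.jpg",  # photo attached on the phone
}

# Serialized for submission to a central project website:
payload = json.dumps(record)
```

A record like this, once collected from many phones, is what the project website can chart, filter, and plot on Google Maps.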

I think that EpiCollect would be a useful interactive tool in education, K-12 and above.  It would be ideal for students working on group projects, such as an environmental study.  For young children, a simple assignment might include taking pictures of and collecting data about birds, animals, trees, cloud formations, or even litter, as part of a class project.  Since the data includes photographs, the students could create an end product in the form of an interactive multimedia presentation, available for other students - as well as parents - to view on the web from any web-enabled device.

HCI research teams could use these tools when observing people using various technologies in public spaces, such as malls, airports, and special events, as well as in stores, eateries, and entertainment settings.

I would be interested in learning more about the use of this application in HCI and K-12 education!

RELATED
EpiCollect Website
EpiCollect Instructions
EpiCollect Instructions (pdf)
The Sight of Road Kill Makes a Pretty, Data-Rich Picture (NPR All Tech Considered)
Note: Audio from the above December 2, 2012 episode can be found on the NPR Weekend Edition Sunday website after 12:00 PM ET on 12/2/12
Mobile app sees science go global  (BBC article)
App for Android Puts Laboratories on Your Phone (Tree Hugger article)
Scientific Data Collection Goes Mobile (Discovery News article)

Paper: EpiCollect: Linking Smartphones to Web Applications for Epidemiology, Ecology and Community Data Collection (PLoS ONE 4(9), 2009)

David M. Aanensen, Derek M. Huntley, Edward J. Feil, Fada'a al-Own, Brian G. Spratt
Conclusion from the above paper:
"Data collection frameworks utilising mobile phones with data submission to and from central databases are widely applicable and can give a field worker similar display and analysis tools on their mobile phone that they would have if viewing the data in their laboratory via the web. We demonstrate their utility for epidemiological data collection and display, and briefly discuss their application in ecological and community data collection. Furthermore, such frameworks offer great potential for recruiting ‘citizen scientists’ to contribute data easily to central databases through their mobile phone."

Oct 8, 2012

Smartphone Use Infographic, via Pew Internet and American Life Project

The Pew Internet & American Life Project website is a treasure trove of statistics about the use of the internet and related technologies.  I especially like the following infographic which outlines how smartphone ownership has reached the "tipping point".  My hunch is that this will lead to some bigger changes in our future!

For more information, see Lee Rainie's article: Smartphone Ownership Update:  September 2012



Jun 25, 2012

Ph.D. Student Positions: Intel Collaborative Research Institute on Sustainable Connected Cities

Thanks to Johannes Schöning for sharing information about this opportunity!
This might be of interest to some of my IMT readers:


EngD/PhD Positions within the Intel Collaborative Research Institute on Sustainable Connected Cities (ISCCI) at University College London (UCL). The Department of Computer Science at UCL is inviting applications for up to 6 research student positions (1 EngD of 4 years, and up to 5 PhDs of 3 years), starting September 24th 2012 or January 7th 2013.


With 6.3 billion people expected to dwell in cities by 2050, the aim of the ISCCI is to create and realize a compelling vision of a sustainable future made possible by adaptive technologies that optimize resource efficiency, enable new services and support the quality of life of urban inhabitants. The Institute is located within a rich external ecosystem of companies and researchers, both locally and globally, investing in this important domain. The ISCCI is led by Prof. Yvonne Rogers at UCL.


We are looking for students willing to pursue a doctoral degree in computer science around the following broad topics:
•       How technology can help recognize, leverage, and support the out-of-sight, hidden or forgotten resources of urban environments, ranging from volunteers to subterranean water systems and other underlying city infrastructures.
•       How communities can encourage sustainable behaviours over time, for example, through meaningful visualizations and feedback about resource usage to individuals and groups.
•       How technology can give us an opportunity to reinvent new ideas of place and identity, considering the diversification & proliferation of new types of communities in cities, with the aim of increasing quality of living and lowering the barriers to mobility in our future connected cities.


Applicants should possess a good honours MSc degree (1st Class or 2:1 minimum) in Computer Science, Psychology, Human-Computer Interaction, or related disciplines. Candidates will be expected to work in teams comprising computer scientists, social scientists, and psychologists, so an open attitude towards interdisciplinary research and teamwork is important. Candidates should have an interest in at least two of the following research fields (as well as a good command of the English language):
•       Human-Computer Interaction
•       Augmented or Mixed Reality
•       Interactive 3D Computer Graphics
•       Interaction Design
•       Perceptual Psychology
•       Cognitive Sciences
•       Ethnography
•       Data Mining, Machine Learning
•       Crowd-Sourced Data
•       Data Visualization, Cartography
•       Geoinformatics
•       Big Data


Fees are fully paid. Salary for the 4-year EngD position is £18,090 tax free p.a., and £15,590 tax free p.a. for the 3-year PhD positions. The closing date for applications is 5pm on 18th July 2012. Interviews will be held on July 26th and 27th.


The start date is September 24th 2012 (though it can be postponed to January 7th 2013). No part-time option is available. Please download the application form at http://www.ucl.ac.uk/uclic/phd_studentships/Intel_studentship_application_form/ and email the completed form to Louise Gaynor (l.gaynor@ucl.ac.uk) as a single PDF document by 5pm on Wed 18th July 2012. Please indicate in your application whether you wish to start in Sept 2012 or Jan 2013.


Nov 26, 2011

Revisiting Good Blogs: User Interface Engineering (Albrecht Schmidt, University of Stuttgart)

Albrecht Schmidt is a professor at the University of Stuttgart. His main interests include novel user interfaces and innovative applications for ubiquitous computing. His blog, Albrecht Schmidt - User Interface Engineering, serves as his note-pad, full of ideas, deep musings, and great links, including links to good scholarly papers. Below are a few of his posts:

Guests in my multimodal interaction class   
I'd love to take Dr. Schmidt's Multimodal Interaction class!  In this post, Albrecht highlights the presentations of Nigel Davies, Marc Langheinrich, and Rui Jose, all members of the pd-net project.

Call for Papers:  Symposium on Pervasive Display Networks

Closing Keynote at AMI2011:  Beyond Ubicomp - Computing is Changing the Way we Live
(Pdf version - worth taking a look!)


Percom 2011 in Seattle, keynote


RELATED
PD-NET
"The PD-NET project aims to lay the scientific foundations for a new form of communications medium with the same potential impact on society as radio, television and the Internet. The goal is to explore the scientific challenges and to assess the new technologies required to enable the emergence of large scale networks of pervasive public displays and associated sensors. This display network will be designed and implemented to be open to applications and content from many sources and thus provide the foundation for work on a new global communications medium for information access and interaction." 


Note:  One of my interests is public displays in public spaces.  If you are interested, take a look at my Flickr collection, Ubiquitous Sightings of Urban Screens: Interactive Displays, Kiosks, and Digital Signage.

TWINE video and the Supermechanical Blog: Moving Toward the Web of People and Things!

"A wireless square with sensors and a simple web app to set rules, Twine tells you what your things are doing by email, text or Twitter."   I want one!

This project was developed by David Carr and John Kestner, the designer-engineers behind Supermechanical. They are passionate about creating connectable objects. They honed their skills in the interdisciplinary MIT Media Lab.


More information about Twine can be found on the KICKSTARTER website.  Here is a bit of info from the site for the tech-curious:

"Twine is a wireless module tightly integrated with a cloud-based service. The module has WiFi, on-board temperature and vibration sensors, and an expansion connector for other sensors. Power is supplied by the on-board mini USB or two AAA batteries (and Twine will email you when you need to change the batteries)."
"The Spool web app makes it simple to set up and monitor your Twines from a browser anywhere. You set rules to trigger messages — no programming needed. The rules are put together with a palette of available conditions and actions, and read like English: WHEN moisture sensor gets wet THEN tweet "The basement is flooding!" We'll get you started with a bunch of rule sets, and you can share rules you create with other Twine owners."
"Because the hardware and software are made for each other, setup is easy. There's nothing to install — just point Twine to your WiFi network. Sensors are immediately recognized by the web app when you plug them in, and it reflects what the sensors see in real time, which makes understanding and testing your rules easy."
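The WHEN/THEN rule idea is easy to sketch in code. This toy Python version is my own illustration of the concept; Twine's Spool service is a hosted web app with a visual rule palette, not anything like this snippet.

```python
# Toy sketch of Twine-style "WHEN condition THEN action" rules.
# Invented for illustration; Spool's real rule engine is cloud-hosted.

rules = []

def when(condition, then):
    """Register a rule: condition is a predicate over sensor readings,
    then is the action to take when the condition holds."""
    rules.append((condition, then))

def evaluate(readings):
    """Check every rule against the latest readings; return fired actions."""
    return [action(readings) for condition, action in rules
            if condition(readings)]

# WHEN moisture sensor gets wet THEN tweet "The basement is flooding!"
when(lambda r: r.get("moisture", 0) > 0.8,
     lambda r: 'tweet: "The basement is flooding!"')

messages = evaluate({"moisture": 0.95})
```

The appeal of the real product is exactly that end users compose such rules from a palette, with no programming at all.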

Jun 17, 2011

An Internet of Old Things as an Augmented Memory System: "Tales of Things" allows people to record multimedia stories about objects, linked via QR or RFID tags.

The following article about "Tales of Things", an interesting augmented memory app, was recently selected for review by Personal and Ubiquitous Computing:

An Internet of Old Things as an Augmented Memory System
Ralph Barthel, Kerstin Leder, Andrew Hudson-Smith, Angelina Karpovich, Martin de Jode, Chris Speed, University College London /  Loughborough University
After watching the following videos and exploring the Tales of Things website, I am sure that this sort of technology will have many uses in education.  Wouldn't it be fun to have a "tales of things" project for a homework assignment?!
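The core idea is simple enough to sketch: a tag ID printed as a QR code (or stored on an RFID tag) resolves to a story page, and anyone can add a memory to it. The names and URL scheme below are made up for illustration; they are not Tales of Things' actual service.

```python
# Hypothetical sketch of tagging an object and attaching stories to it.
# The URL scheme and data layout are invented, not Tales of Things' API.

stories = {}  # tag id -> the object's title and its recorded memories

def tag_object(tag_id, title):
    """Register an object; return the URL the QR/RFID tag would encode."""
    stories[tag_id] = {"title": title, "memories": []}
    return "https://example.org/tales/" + tag_id

def add_memory(tag_id, author, text):
    """Attach a new story to a tagged object."""
    stories[tag_id]["memories"].append({"author": author, "text": text})

url = tag_object("tot-0001", "Fisher Price teaching clock")
add_memory("tot-0001", "A former owner", "This clock taught me to tell the time.")
```

Scanning the tag simply opens that URL, so an old object accumulates stories over its lifetime.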

In the following video, Chris Speed discusses the Tales of Things project:


Tales of Things Beta: Connecting anything with any media, anywhere!


Below is a video of how an old Fisher Price teaching clock can become part of a "tale of things":


RELATED
Tales of Things: The Internet of "Old" Things:  Collecting Stories of Objects, Places, and Spaces (pdf)  Ralph Barthel, Andrew Hudson-Smith, Martin de Jode, Benjamin Blundell, CASA Centre For Advanced Spatial Analysis, University College London, London, United Kingdom







Mar 31, 2011

CALL FOR PAPERS: Touch the Web 2011 2nd International Workshop on Web-Enabled Objects


CALL FOR PAPERS: Touch the Web 2011
2nd International Workshop on Web-Enabled Objects
The workshop will be held on June 20-24, 2011 in Paphos, Cyprus, in conjunction with the:
The vision of the Internet of Things builds upon the use of embedded systems to control devices, tools and appliances. With the addition of novel communications capabilities and identification means such as RFID, systems can now gather information from other sensors, devices and computers on the network, or enable user-oriented customization and operations through short-range communication. When the information gathered by different sensors is shared by means of open Web standards, new services can be defined on top of physical elements. In addition, the new generation of mobile phones enables a true mobile Internet experience. These phones are today's ubiquitous information access tool, and the physical token of our "Digital Me". These meshes of things and "Digital Me" will become the basis upon which future smart living, working and production places will be created, delivering services directly where they are needed.

The fundamental questions in this workshop are:
How do we apply the lessons learned in Web Engineering to the Internet of Things?
What kind of Web-based interaction patterns for embedded devices (streaming, eventing, etc.) can we identify?
Applications, deployments, prototypes and evaluations of Web of things systems.
Human-things interaction models and paradigms (mobile interfaces, etc.)
User-oriented, context-aware discovery and dynamic search for the real world.
Compositions of Things and Meshes of Things.
What kinds of interfaces, business models, and scenarios will these smart things create, address, and modify?
How can the services in the surroundings be composed and orchestrated?

The research areas involved are many, including Web Engineering, Ubiquitous Computing, Product Engineering, Distributed Systems, Human-Computer Interaction, Economics, Artificial Intelligence, and more. In this workshop, we look forward to bringing together different areas of expertise to help us shape a vision of living and working surrounded by meshes of web-enabled things.

Workshop format
Expected number of participants is no more than 20, for a full-day workshop. A tentative program (to be adjusted following the number of participants) is organized as follows:
A short (15 min) opening session by the workshop organizers
Two full paper presentation sessions (each session with 3 papers, 20 min presentation each)
Lunch break
One short paper presentation session (up to 6 papers, 5 min presentation each)
Discussion session. A specific agenda to discuss and provide feedback upon will be proposed and the organizers will moderate the discussion.
The last session will be devoted to consolidating and presenting the results of the discussions. The workshop organizers will write a workshop summary and publish it as part of the workshop proceedings.

Oct 24, 2010

Augmented Reality Eyeglasses: I AM geeky enough to make this fashion statement!

THE AR WALKER, from NTT DoCoMo
AR-walker
Jason Sosa tweeted about the latest trend, the AR Walker, augmented reality for your glasses. This technology from NTT DoCoMo allows you to instantly access location-based information about the things around you. This would be very convenient for people walking about in urban areas, or for tourists. The system also provides easy access to other kinds of information. For example, if you look up towards the sky, you'll get weather information...beyond what you'd figure out on your own.

Of course, the fashionable get-up is in the prototype stage, so you won't be seen in public with the newest type of shades anytime soon. For more information, see the video below:


AR WALKER


RELATED
Total Immersion's Partner Success Story: Demo Reel of AR&Co in Indonesia

Cross posted on The World Is My Interactive Interface blog.

Jul 22, 2010

Connected Youth: Theme of the July-Sept issue of IEEE Pervasive Computing

July-Sept. 2010 IEEE Pervasive Computing Cover: Connected Youth

The guest editors of the July-September 2010 issue of IEEE's Pervasive Computing magazine are John Canny, an engineering professor at the University of California, Berkeley, and Jason Hong, an assistant professor at Carnegie Mellon University's Human-Computer Interaction Institute.  Their article provides an introduction to the publication's timely theme, Connected Youth.  The editors find that studying the current generation of children and teens provides useful information about the future of computing as it integrates more seamlessly into our daily lives, the way we learn, and our relationships and interactions with others across time, place, and generations.

I'm in the middle of reading this issue of Pervasive Computing.  The articles that have caught my eye so far include "Story Time for the 21st Century", which describes the research and implementation of an interactive book-reading system designed to connect children, family members, and grandparents who live in distant locations. The system involves videoconferencing with paper books and interactive content, and enables grandparents to read with their grandchildren via the Internet.



Additional information about Story Play, a system that is still under development, can be found on the Nokia Research Center website: "Family Story Play: Story Time for the 21st Century".  A follow-up project to Story Play is Story Visit:


I'll post more about the articles from the Connected Youth issue of Pervasive Computing in the future.

Jun 4, 2010

UPDATED iRiS: iPhone Remote Interaction System: Paint a multi-media facade on your favorite building! via Johannes Schoning (updated with links)



"We used Touch Projector to paint on a multimedia facade (here: Ars Electronica Center, Linz, Austria). Users simply aim their device at the facade causing the viewfinder to show the facade locally. In this local image, users can now draw with the selected color. The building then changes its color accordingly as if users directly painted on it."
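One piece such a system needs is a mapping from a touch on the phone's viewfinder to a pixel on the facade's display grid. This is only a cartoon of that step, assuming a fixed, axis-aligned view; the real iRiS/Touch Projector pipeline tracks the live camera image, and the grid resolution below is invented.

```python
# Hypothetical mapping from viewfinder coordinates to facade pixels.
# Assumes a fixed view of the whole facade; the real system tracks the
# camera image live. FACADE_W/FACADE_H are made-up numbers.

FACADE_W, FACADE_H = 120, 40  # hypothetical LED grid resolution

def viewfinder_to_facade(x, y, view_w, view_h):
    """Map a touch at (x, y) in a view_w x view_h camera image
    to a cell on the facade grid."""
    fx = int(x / view_w * FACADE_W)
    fy = int(y / view_h * FACADE_H)
    # Clamp so a touch on the very edge stays on the grid:
    return min(fx, FACADE_W - 1), min(fy, FACADE_H - 1)

# A touch in the middle of a 640x480 viewfinder lands mid-facade:
cell = viewfinder_to_facade(320, 240, 640, 480)
```

Painting is then just setting the color of that cell, as if the user had touched the building itself.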


NOTE:
Currently, Johannes Schöning works at the Innovative Retail Laboratory of the German Research Centre for Artificial Intelligence (DFKI) in Saarbruecken.  His research interests include new methods and interfaces for intuitively navigating spatial information, and ways new intelligent interfaces can help people solve daily tasks more effectively.  Johannes recently completed his Ph.D. at Saarland University.  He runs the multi-touch Google group (http://groups.google.de/group/multi-touch).  Johannes is involved in a range of interesting professional activities.  He's currently looking for new opportunities.


SOMEWHAT RELATED
Urban Screens Conference & Exhibition:  Calls for Interactive Media and Video Art (2010) 
More Urban Screens and 3D Media Facades
Book:  Media Facades:  History, Technology, and Content (M. Hank Haeusler)
3D Projections on Buildings: A distinctive way of communicating
Communicating Through Architecture:  Media Facades and the Digital Infrastructure  The Rathous (Contains an assortment of videos and pictures)
Art and Commerce Meet on Building's Interactive Media Facades
Kelsey Keith, Fast Company, 10/2/2009

Jun 1, 2010

Daily Stack: Tangible User Interface Exploration by Sebastian Rønde Thielke and Anders Højmose

"Daily Stack is a playful tool that helps you become more aware of your daily work-flow and time management. By creating a physical representation of your tasks, Daily Stack speaks subtly to your conscience and helps you manage your time through unobtrusive ambient feedback"


Via Jeremy Perez-Cruz:

Daily Stack from Anders Højmose on Vimeo.

Jeremy also had a link to memoryhouse -nice music. (Jeremy is an art director at Anson Stoner.  His blog is Tea and Letter)

May 30, 2010

2010 International Computer Music Conference in NY. I wish I could go!

I'm usually too busy during the last month or so of the school year to attend conferences.  One I'd really like to attend is the 2010 International Computer Music Conference in N.Y.  Music is an important component of interactive multimedia content, and new technologies have made things a lot easier for musicians who are technologically inclined.  Conferences like ICMC are a great way to see - and hear - what is going on.

Links:
ICMC Paper Schedule 
ICMC 2010 Poster/Demo Schedule
Thomas Erbe's ICMC Workshop:  Pure Data Object Programming
(see bio and plug below)

Intriguing Topics: 
"Gestural Shaping and Transformation in a Universal Space of Structure and Sound"
"SoundCatcher:  explorations in audio-looping and time-freezing using an open-air gestural controller"
"Sense/Stage - low cost, open source wireless sensor infrastructure for live performance and interactive, real-time environments"
"The Four M's:  Music, Mind, Motion, Machines"
"A Wireless, Real-time Social Music Performance System for Mobile Phones"
"Because we are all falling down: Physics, Gestures, and Relative Realities"
"Argos:  An open-source application for building multi-touch musical interfaces"
"Peacock: a non-haptic 3D performance interface"
"Head Tracking for 3D Audio using the Nintendo WII"
"The Avatar Initiative- An Interdisciplinary Approach to Digital Media Research and Education"
"Computer Controlled Video as a Multi-modal Interface in Live Acousmatic Music"
"The Machine Orchestra"
"Eye. Breathe. Music"
"Combining audiovisual mappings for 3D musical interaction"

ICMC Unconference Categories

PdBarCamp
Sensory Interaction in Composition and Performance
Language, Neurology, and Acoustics
Open Scores and Accessible, Consumer Devices
Issues in Computer Music Performance
Computer Music and Society:  Questions of Dissemination

Realistically,  I'd be happy with a bit more time to play my keyboard! 
(The very first class I took when I decided to return to school to take computer classes was computer music technology.)


My plug for Tom Erbe, from the ICMC website 
Instructor Bios:

"Tom Erbe has had an important role in American experimental and electronic music of the last 20 years. In addition to his pioneering and widely used program SoundHack, he has become one of the most sought after and respected sound engineers for contemporary music. In 2004 he rejoined the faculty of UCSD in the Department of Music and serves as Studio Director. Most recently Tom has released SoundHack Spectral Shapers, the first of a planned set of three plugin bundles to bring extreme spectral processing to the VST, AU and RTAS formats."

If you are thinking about experimenting with sound, Tom Erbe's SoundHack freeware is awesome. His spectral shapers are worth every penny.  (I used the +binaural filter to create a 3D effect of racing car sounds for one of the students I work with who has autism and loves racing cars.)

+binaural


This filter places a sound at a specific position around the listener's head. Use it with a reverb to create a virtual environment. When used with its LFO, +binaural can place various beats or parts of a loop in specific repeatable positions.
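The basic idea behind this kind of placement can be sketched in a few lines: the far ear hears the sound slightly later and slightly quieter than the near ear (interaural time and level differences). The toy Python below is only a cartoon of that principle, with invented gain numbers; SoundHack's +binaural uses proper HRTF filtering, not this.

```python
# Cartoon of binaural placement via interaural time/level differences.
# Invented constants and gains; real binaural filters use measured HRTFs.
import math

SAMPLE_RATE = 44100
HEAD_RADIUS = 0.0875      # metres, a common head-model approximation
SPEED_OF_SOUND = 343.0    # m/s

def place(mono, azimuth_deg):
    """Return (left, right) sample lists for a source at the given azimuth.
    Positive azimuth = source to the listener's right."""
    az = math.radians(azimuth_deg)
    # Interaural time difference (a simplified Woodworth-style estimate):
    itd = HEAD_RADIUS / SPEED_OF_SOUND * (abs(az) + math.sin(abs(az)))
    delay = int(round(itd * SAMPLE_RATE))      # extra samples at the far ear
    far_gain = 1.0 - 0.3 * abs(math.sin(az))   # quieter at the far ear
    near = mono
    far = [0.0] * delay + [s * far_gain for s in mono]
    if azimuth_deg >= 0:   # source on the right: right ear is the near ear
        return far[:len(mono)], near
    return near, far[:len(mono)]

left, right = place([1.0, 0.5, 0.25, 0.0], 90.0)
```

For a source hard right, the left channel starts with silence (the delay) while the right channel carries the dry signal, which is roughly the cue your brain reads as position.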







SOMEWHAT RELATED
When I get a moment, I'm reading my latest issue of IEEE Multimedia cover-to-cover:
Special Issue:  Mobile and Ubiquitous Multimedia
April–June 2010 MultiMedia Cover

(The second class I took after I returned to school to take computer classes was Computer and Internet Multimedia.)

Apr 9, 2010

Interesting journal: Aether- The Journal of Media Geography (A convergence of disciplines)

Aether is hosted by the College of Social and Behavioral Sciences at California State University, Northridge. The April issue is edited by Tristan Thielmann, and "explores the spatial turn in media studies and the media turn in geographical studies, providing a sketch of the subject area "geomedia" from a phenomenological perspective and the field of "media geography" from a disciplinary perspective".


Aether the Journal of Media Geography



Tristan Thielmann, from the University of Siegen, introduces this topic in "Locative Media and Mediated Localities: An Introduction to Media Geography" (pdf)


Thanks to Anne Galloway for the link!


By the way,  Anne Galloway has been blogging since 2002.  Her blog chronicles her intellectual path as she worked on her Ph.D., and more recently, her experiences teaching courses such as "Design Anthropology".    


Her dissertation, "A Brief History of the Future of Urban Computing and Locative Media" (pdf) was completed in 2008. If you are interested in ubiquitous computing and interesting theories, it's worth reading, especially if your brain needs some deep feeding.


I plan to re-read it this summer.


Here is an excerpt:

"The types of ubiquitous or pervasive computing of primary interest in my thesis are those that openly seek to create unique forms of inhabitable space and means of habitation—thereby raising issues of spatialisation, temporalisation, embodiment and affect. So-called mixed reality technologies are explicitly concerned with such questions, and mixed reality environments refer to spaces that combine elements of the physical and virtual worlds. According to Milgram et al. (1994:1), “rather than regarding the two concepts simply as antitheses, however, it is more convenient to view them as lying at opposite ends of a continuum, which we refer to as the Reality-Virtuality (RV) continuum.” At one end of the continuum are seen to be “real” objects that can be observed directly or "sampled and then resynthesized via some display device,” while at the other end are “virtual” objects that are “simulated” through “some sort of a description, or model, of the object” (Milgram and Kishino 1994:1).



SOMEWHAT RELATED
I plan to find out more about the editorial board of Aether:    

Paul C. Adams • University of Texas at Austin
Stuart C. Aitken • San Diego State University
David B. Clarke • Swansea University
Christina Dando • University of Nebraska, Omaha
Deborah Dixon • Aberystwyth University
Marcus Doel • Swansea University
Colin R. Gardner • University of California, Santa Barbara
Ken Hillis • The University of North Carolina at Chapel Hill
Sarah F. Ives • Stanford University
Ed Jackiewicz • California State University, Northridge
John Paul Jones III • University of Arizona
Christina Kennedy • Northern Arizona University
Minelle Mahtani • University of Toronto
Susan Mains • University of the West Indies, Mona
Kevin McHugh • Arizona State University
Christopher M. Moreno • San Diego State University
Wolfgang Natter • Virginia Polytechnic Institute
Joseph Palis • University of the Philippines


Brent J. Piepergerdes • University of Kansas
Rob Shields • University of Alberta
Amy Siciliano • University of Wisconsin
Paul F. Starrs • University of Reno, Nevada
Dan Sutko • North Carolina State University
Jonathan Taylor • California State University, Fullerton
Stefan Zimmermann • University of Mainz
Leo Zonn • University of Texas

Mar 19, 2010

Data-Information-Knowledge-Wisdom: IBM's video about the Internet of Things & the Smarter Planet

This video produced by IBM is an overview of the Internet of Things, and depicts how a vast amount of data is generated every day through sensors and monitors embedded in everything around us. The concept of "systems of systems" is depicted in a cute graphical manner.


In the video, a pyramid similar to the one below is discussed, outlining the data-to-wisdom process:
http://ranjeetwalunj.files.wordpress.com/2008/10/wisdom.png?w=559&h=342

Knowledge discovery to wisdom, based on Giarratano and Riley, 1998, found on Ranjeet Walunj's blog.
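The data-to-wisdom climb is easy to illustrate with a toy example (all numbers invented): raw readings are the data, a summary of them is information, a pattern found against a rule is knowledge, and acting on it is at least a step toward wisdom.

```python
# Toy illustration of the data -> information -> knowledge -> wisdom pyramid.
# All readings and thresholds are invented for illustration.

readings = [18.2, 18.4, 31.7, 18.3, 18.1]   # data: raw sensor values

average = sum(readings) / len(readings)      # information: a meaningful summary
spike = max(readings) - average > 5          # knowledge: a pattern, per a rule

# Wisdom (or at least a decision): act on what the pattern means.
decision = "inspect the sensor site" if spike else "no action"
```

Each step up the pyramid discards raw detail in exchange for something a person can actually act on.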

Questions: What are mere mortals to make of all of this data flowing around? How should educators respond to the reality of the Internet of Things and the wealth of data that will soon be available?

FYI: Previous posts related to the Internet of Things

The Internet of Things Europe 2009 Conference: Internet Rabbits, Mirrors, Stamps, and More!
The IBM Web, or Internet of Things
The UX of SmartGridCity: Control Your SmartHouse Remotely, Online
"Web Meets World": User Experience of Embedded Systems and Wireless Sensor Networks
Will NoSQL Rescue the World of Data in the Cloud? (Info and Links for the Tech Curious)
Interactive Mobile Multimedia
Tangible User Interfaces, Part I: Siftables

Feb 1, 2010

Apps with Geographic Data Can Make You Healthy: Bill Davenhall's TED Talk

Bill Davenhall is the health and human services marketing head at ESRI, a geographic information systems (GIS) software development company. In this recent TED Talk, he shows how geographic applications and mobile devices can help provide patients and doctors with useful information for health care planning and informed decision-making by harnessing the power of existing data maps.

The presentation includes a variety of interesting maps representing a "place history", something that is central to the field of geographic medicine. The video is about 9 minutes long, but worth viewing, especially if you care about your health!



Thanks to RealVision for the link.

For more information about this topic, see the International Journal of Health Geographics website.

Sep 19, 2009

The World Is My Interface

THE WORLD IS MY INTERFACE is the new title of my Technology-Supported Human-World Interaction blog.  The name was just too long!

The first post written under the new title has a variety of links about ubiquitous computing, off-the-desktop interaction design, use of technology in public spaces, and so forth.  There are a few links to scholarly research and related projects, too.