Showing posts with label webcam.

Feb 26, 2010

Web-based Technology-Supported Interaction: 6rounds mixes webcam, games, and social media

I've been so busy writing reports* that this almost passed me by!

I found out about 6rounds because the company uses Twitter as a promotional platform: I happened to notice it was following me and clicked through.

6rounds started out as an outgrowth of a speed dating website; the application was initially designed for people to use while waiting for speed dating sessions. According to the FAQ on the 6rounds website, "6rounds is a live meeting point, offering users a variety of experiences that they enjoy together using a combination of webcams, real-time games, social activities and media engagements."

Since I'm a happily married middle-aged woman, I'm not sure 6rounds is up my alley.  I think  social singles, college students, and others who don't mind flashing their faces through a webcam would like it.

If I had time, I might like to play around with GixOO, the open-source API that underpins 6rounds. GixOO lets developers build their own games and activities. The platform tracks each user's mouse movements and keeps everyone's view in sync, so friends see the same things as they interact online.
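GixOO's actual API isn't something I've dug into yet, but the core idea behind that "everyone sees the same thing" behavior, relaying each participant's cursor position to everyone else in a session, can be sketched with a toy session object. All class and method names below are my own invention for illustration, not GixOO's API:

```python
# Toy model of a shared "live session": each participant's cursor moves are
# fanned out to everyone in the session, so every client can render the same
# view. (Names invented for illustration; this is not GixOO's real API.)

class LiveSession:
    def __init__(self):
        self.cursors = {}        # user -> (x, y), latest known position
        self.listeners = []      # per-client callbacks fired on every update

    def join(self, user, on_update):
        self.cursors[user] = (0, 0)
        self.listeners.append(on_update)

    def move_cursor(self, user, x, y):
        self.cursors[user] = (x, y)
        # Broadcast so every client can draw all cursors in the same place.
        for notify in self.listeners:
            notify(user, x, y)

session = LiveSession()
seen = []                        # what "alice's client" has been told
session.join("alice", lambda u, x, y: seen.append((u, x, y)))
session.join("bob", lambda u, x, y: None)
session.move_cursor("bob", 120, 45)
# alice's client now knows where bob's cursor is: seen == [("bob", 120, 45)]
```

In a real deployment the broadcast step would go over the network (e.g. a WebSocket fan-out) rather than in-process callbacks, but the session-state bookkeeping is the same.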

6rounds looks like it might provide possibilities for collaborative projects in education, but I won't be sure until I give it a try.

So what is 6rounds?



FOR THE TECH-CURIOUS

The following information is quoted from the Openomics blog from Sun Microsystems' ISV Engineering:


"6rounds is the first product built on the GixOO live social platform, initially developed on the LAMP stack. As a member of the Sun Startup Essentials program, GixOO connected with Sun's ISV Engineering team to test the scalability of their platform on SAMP --the Solaris-based AMP stack, available in an integrated and optimized package from Sun, the Sun Glassfish Web Stack f.k.a. CoolStack. At the time, we ran the benchmark on a Sun SPARC Enterprise T5120 server --featuring the 64-way CoolThreads processor UltraSPARC T2-- running Solaris 10 and CoolStack 1.3. GixOO loved the DTrace kernel instrumentation of Solaris 10 --DTrace gives unique insights into how the application performs, live on a production system-- and the Containers technology a.k.a. Zones --this lightweight virtualization layer of Solaris allows multiple applications to run in isolation from each other on the same physical hardware--, and quickly adopted them for their internal use.
"At GixOO, we use a Sun SPARC-based server, powered by Solaris 10, for our R&D environment. The system gives us the required flexibility and components isolation that we need. Thanks to SPARC's great SMP abilities, we achieve high performance for many development environments running on one single 1U server.
Solaris Zones are very comfortable and simple to configure, and allow the full utilization of the great power hidden in this small machine, which makes Solaris 10 an excellent choice for system administrators. We are using Sun MySQL Server which gives our application high speed data storage solution, and in the future we might migrate to the MySQL Cluster solution to get even faster results."
-- Dmitry Shestak, CTO, GixOO
Somewhat Related

2/26/10: Oracle announced its acquisition of Sun in 2009; the deal closed in January 2010.
*For those new to this blog, I'm a school psychologist who returned to her day job full time a year and a half ago, when the economy was taking a nosedive. Before that, I was working part-time and taking computer and technology classes, initially to learn how to create interactive multimedia applications and games.


Since some of the kids and teens I work with have a range of abilities and disabilities, including autism, I developed an interest in accessibility.  How can universal design principles  be applied to games and emerging interactive technologies?   I'm also fascinated by interactive displays and surfaces of all sizes, especially ubiquitous systems that support cognition, collaboration and communication.  

One of my pet projects:

My vision? A collaborative multimedia, multi-modal interactive time-line might help us to understand  complex, interrelated factors and events more effectively. It would provide an opportunity for the inquisitive to view things from a broad perspective, and also explore things in rich detail. Ideally, the time-line would support multi-touch, multi-user interaction on larger displays and interactive whiteboards, and allow for people who are remotely located to participate in the process. 


Now that one of my schools will be getting a multi-touch SMART Table, I'd like to experiment with time-line concepts and interactions on a table surface. I'd also like to figure out how this can work seamlessly with the existing SMART Board in the classroom. Of course, this would have to take place after work hours!

May 23, 2009

Razorfish's Touch Framework "Razorfashion" - A lot like my idea for an in-home FashionMirrorAdvisor...

Razorfish recently unveiled the Razorfashion application, designed to provide shoppers with an engaging retail experience within the "multi-channel shopping ecosystem". I'm not the "shop till you drop" type of gal, but I can see that this concept could be useful in other situations, after a few tweaks.




As soon as I saw this Razorfish Touch "Fashion" demo video, it touched a nerve. I've been playing around with a similar idea, but for my personal use, in the form of an RFID-enabled system. I'd call it something like "FashionMirrorAdvisor".

Instead of showing skinny fashion models like the Razorfashion application does, I'd harness the power of a built-in webcam and mirror my own image on the screen. My mirror would dress me up in the morning when I'm way too foggy to think about matching colors and accessories.

My FashionMirrorAdvisor would be my friend. My "smart" friend, since all of my clothes would be RFID-tagged, along with my shoes, jewelry, and other accessories. My make-up, too.

It would be a no-brainer. I really could use this application - just ask my husband!

Most mornings I find myself staring at the clothes in my closet, frozen in time, unable to formulate a fashion thought. I might set my eyes on a favorite blouse, but blank out when I try to think about the rest of the steps I need to pull my look together.


I know I can't wear my reddish-pink camisole with my dusty-orange/brown slacks, but at 5:15 A.M., who has the time to think about this little detail? My friend, the FashionMirrorAdvisor, would prevent me from making this fashion faux pas.

No problem.

My FashionMirrorAdvisor would show me a few outfits, and dress my real-time moving image on the screen. Since she knows all things, she'd show me ONLY the articles of clothing that were clean, since my RFID system would keep up with all of that. It would be much more functional than a "virtual wardrobe" application.

I could try out different earrings without having to get them out.

If I couldn't find something, the RFID system would take care of this detail. My FashionMirrorAdvisor would know where I misplaced my clothes, accessories, and even my keys, since they would all be tagged. The mirror application would provide me with a nice little map of my house and car, and highlight the location of the item.

My FashionMirrorAdvisor would keep track of my laundry, too. This would be a great feature. So if my dirty laundry was piling up, and I wanted to wear outfit X, Y, or Z over the next few days, I'd receive a gentle reminder that I'd need to do some laundry first!
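For the tech-curious: the bookkeeping behind all this is simple. Every RFID-tagged item would carry a clean/dirty flag and a last-seen location, so the advisor could suggest only wearable outfits, nag about laundry, and answer "where did I leave it?". Here's a back-of-the-napkin sketch; every name and rule in it is hypothetical:

```python
# Hypothetical sketch of the FashionMirrorAdvisor's wardrobe bookkeeping:
# RFID-tagged items with a clean/dirty flag and a last-seen location.

from dataclasses import dataclass

@dataclass
class Item:
    tag: str          # RFID tag ID
    name: str
    clean: bool       # updated as items move between closet and hamper
    location: str     # last place the tag was read

def wearable(items):
    """Only clean items should ever appear in a suggested outfit."""
    return [i for i in items if i.clean]

def laundry_reminder(items, planned_names):
    """Warn if any item in a planned outfit is sitting in the hamper."""
    dirty = {i.name for i in items if not i.clean}
    return sorted(dirty & set(planned_names))

closet = [
    Item("t1", "reddish-pink camisole", clean=True, location="closet"),
    Item("t2", "dusty-orange slacks", clean=False, location="hamper"),
    Item("t3", "black blouse", clean=True, location="car back seat"),
]

print([i.name for i in wearable(closet)])
# -> ['reddish-pink camisole', 'black blouse']
print(laundry_reminder(closet, ["dusty-orange slacks", "black blouse"]))
# -> ['dusty-orange slacks']  (do laundry before wearing that outfit!)
print({i.name: i.location for i in closet}["black blouse"])
# -> 'car back seat'  (the "where did I misplace it?" feature)
```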

Another practical feature:

My FashionMirrorAdvisor would also serve as my health consultant, keeping track of my weight and BMI. This data, along with information gained from the webcam, would be combined so that my advisor would NEVER suggest an outfit that would be too...snug.

I could program the system to provide me with gentle reminders if my weight was an issue. My FashionMirrorAdvisor would show me images of myself "before" and "after", outfits included.

Information about the "after" outfits could be fed to the system from the web-catalogs of my favorite fashion retailers, and once I lost those 10 darned pounds, I'd find a nice parcel delivered to my door.

Thanks to my FashionMirrorAdvisor, I know that the outfit would be just right.


UPDATE 5/8/10:  The FashionMirrorAdvisor would be integrated with a mobile app - since I now have a smartphone, this would be quite useful in planning shopping trips centered around the purchase of new clothes, shoes, accessories, and coordinating cosmetics!  I created a little game  that I think would be ideal for this sort of thing, too.

I still want to work on this....someday.

Too many ideas, too little time!


RELATED
From the Razorfish site:
"The Razorfish Emerging Experiences team is a dedicated group of highly experienced professionals focused solely on emerging experiences and technologies. "Effective innovation" is our multifaceted approach to concepting and delivering pioneering solutions for our clients."

"Founded in 2008, Razorfish Emerging Experiences is a cross-functional team composed of strategists, artists, experience designers, and technologists. We’re part of the Razorfish Strategy & Innovation practice led by Shannon Denton. Jonathan Hull is the managing director of the team, Steve Dawson is the technology lead and Luke Hamilton is the creative lead."


Razorfish Emerging Experiences Portfolio

Mar 26, 2009

ElderGadget Blog

Via Technology-Supported Human World Interaction

Today I came across the ElderGadget Blog while searching for technology that might be helpful to my father, who spent the last few weeks in an intensive care unit.

Here are a few things I found on the blog, including what I found when I followed a link to the bindependent website:


Smart Table: A Broadband Communication System for the Elderly


ID Studiolab, Delft University of Technology, NL

"The smart system is aimed to prevent possible social isolation between elderly people and their social contacts. It consists of the Smart Table and the Smart Messenger, which provide easy and intuitive way for elderly people to benefit from the advantage of communication technology and improve the satisfaction of their social activities."

"During the design research, a working prototype is made and evaluated by a user test. The Smart Table offers people a way to use broadband communication without the need for a computer. The television will be used for video communication (web-cam) and the table as user-interface. On the table contact persons can be selected and images and video can be shared. The elderly person can capture images that will be shared instantly. The contact person can select images from his or her computer, which will be projected on the table of the elderly person". (TU Delft - Faculty Industrial Design Engineering - Master Design for Interaction - Project Interactive Technology Design - Vincent Steurs & Juin-Yi Huang)


Demy Kitchen Safe Touchscreen Reader


Photo Watch

Clarity-C900: Amplified Big Button Cell Phone
911 Guardian: "Tiny wearable speakerphone"

Big Digit: Extra Large TV Remote Control


SmartShopper: Electronic grocery-list maker



Bierley's ColorMouse Video Magnifier

Voice Pod: Digital recording and playback system

RELATED

Live from CES - Dakim's Brain Fitness Keeps Seniors Sharp

Dakim Brain Games


Previous post: Ubiquitous Computing - Grandpa and grandkids use a webcam and Skype across the miles


YouTube Vlogger: Geriatric1927

Geriatric Gripes and Grumbles


Nov 23, 2008

For the Tech Curious: "Get in Touch with Touchless": Multi-touch with just a webcam and the free demo application!

Via the Seattle Tech Report's Microsoft Blog:



You can find the demo code on the Codeplex website. Here is a quote:

"The Touchless SDK enables developers to create multi-touch based applications using a webcam for input. Touch without touching."

"Touchless started as Mike Wasserman’s college project at Columbia University. The main idea: to offer users a new and cheap way of experiencing multi-touch capabilities, without the need of expensive hardware or software. All the user needs is a camera, which will track colored markers defined by the user."
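The Touchless SDK itself is a .NET library, but the underlying trick is easy to picture: threshold each webcam frame against the user-chosen marker color, then take the centroid of the matching pixels as the marker's position. Here's a tiny illustration of that idea (NumPy, on a synthetic frame; this is my own sketch, not the SDK's code):

```python
# Core idea behind webcam marker tracking: find the pixels close to a
# reference color and use their centroid as the marker position.
# Illustration only -- not the Touchless SDK's actual implementation.

import numpy as np

def track_marker(frame, marker_rgb, tol=40):
    """frame: HxWx3 uint8 image. Returns (row, col) of marker, or None."""
    diff = frame.astype(int) - np.array(marker_rgb)
    mask = (np.abs(diff) < tol).all(axis=2)   # pixels near the marker color
    if not mask.any():
        return None                            # marker not visible
    rows, cols = np.nonzero(mask)
    return (int(rows.mean()), int(cols.mean()))

# Synthetic "frame": black image with a green 10x10 marker near (30, 50).
frame = np.zeros((120, 160, 3), dtype=np.uint8)
frame[25:35, 45:55] = (0, 200, 0)
print(track_marker(frame, marker_rgb=(0, 200, 0)))   # -> (29, 49)
```

A real implementation would run this per frame of the webcam feed (and track several differently colored markers to get multi-touch), but the per-frame math is no more than this.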


(I posted about the Touchless SDK previously, but I didn't have the video.)