
Feb 24, 2011

Vision-Based Hand-Gesture Applications: Video from Communications of the ACM



The latest edition of Communications of the ACM, via "snail mail", was the inspiration for this post:



Vision-Based Hand-Gesture Applications
Juan Pablo Wachs, Mathias Kolsch, Helman Stern and Yael Edan

"Body posture and finger pointing are a natural modality for human-machine interaction, but first the system must know what it's seeing."


More to come!

Jan 26, 2010

There is a need for multi-touch/gesture designers/developers!

If you are a talented interactive web designer/developer, game designer/developer, traditional programmer with a creative bent, or someone who is thinking about working with technology in the future as a programmer or designer, I urge you to consider designing/developing multi-touch applications in the near future.

In my opinion, there will be a need for multi-touch web applications as well as for multi-touch education and collaboration applications for the SMART Table, Microsoft's Surface,  multi-touch tablets like the rumored iTablet from Apple, and the multi-touch laptops and all-in-ones (Dell, HP, etc.).

Below are direct links to some of my blog posts related to multi-touch applications and screens. If you are fairly new to multi-touch, I'm sure that looking through some of my blog posts will be helpful.  All of the posts have links to resources, and most have photos and video clips of multi-touch in action.

If you are new to this blog, I have a great deal of information, links, photos, and video clips of various multi-touch screens and applications. The best way to find it is to enter a keyword in this blog's search box: multitouch, touch screen, gesture, multi-touch, etc.

Also do a search on my other blog: The World Is My Interface http://tshwi.blogspot.com

Here are some links:
Do you have an HP TouchSmart, Dell Studio One or NextWindow touch-screen? NUITech's Snowflake Suite upgrade provides a multi-touch plug-in
http://bit.ly/5tdlhc

The following blog post has a video clip that shows someone from Adobe painting with a multi-touch application in development:
More Multi-Touch!: Rumor of the mobile apple iTablet; AdobeXD & Multitouch; 10-finger Mobile Multitouch: http://bit.ly/4S9Upm

Ideum's GestureWorks: http://bit.ly/4C1p7M

Interactive Walls, Interactive Projection Systems, GestureTek's Motion-Based Games: http://bit.ly/6GRGtW

Intuilab's Interfaces: Multi-touch applications/solutions for presentations, collaboration, GIS, and commerce: http://bit.ly/7RK7qN

For software developers:
How to do Multitouch with WPF 4 in Visual Studio 2010: http://bit.ly/7c4YqC
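
The link above is WPF/C#-specific, but just to give a taste of what multi-touch programming looks like in general, here is a small sketch using the Python Kivy framework, which hands you each finger as its own touch event. This is only an illustration under that assumption, not the content of the linked tutorial.

from kivy.app import App
from kivy.uix.widget import Widget
from kivy.graphics import Color, Ellipse

class TouchCanvas(Widget):
    def on_touch_down(self, touch):
        # Each concurrent finger arrives as its own touch object,
        # so drawing a marker per touch "just works" with many fingers.
        d = 30
        with self.canvas:
            Color(1, 1, 0)
            touch.ud['marker'] = Ellipse(pos=(touch.x - d / 2, touch.y - d / 2), size=(d, d))
        return True

    def on_touch_move(self, touch):
        marker = touch.ud.get('marker')
        if marker is not None:
            d = 30
            marker.pos = (touch.x - d / 2, touch.y - d / 2)
        return True

class MultiTouchDemoApp(App):
    def build(self):
        return TouchCanvas()

if __name__ == '__main__':
    MultiTouchDemoApp().run()

Run it on a touch screen (or with Kivy's mouse-based multi-touch emulation) and each finger drags its own yellow dot around.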

Oct 8, 2009

The Virtual Autopsy Table: Interactive Health Science

Interactive Virtual Autopsy Table


The Virtual Autopsy Table from NorrköpingsVisualiseringscenter on Vimeo.


Virtual Autopsies from NorrköpingsVisualiseringscenter on Vimeo.




How it works: Information from a true case, from the Virtual Autopsy Table website.

"A living patient was treated for cerebral hemorrhaging. X-rays sent through the body during computed tomography grow weaker according to the density of the tissue through which they pass. By assigning density values with varying degrees of transparency and identifying colors, a sort of palette can be created by the computer to use in the imaging process. It becomes possible, for example, to remove clothing, skin or blood vessels. In this case the patient has been operated for a ruptured aneurysm in a small brain vessel. A metal clips has been added that can be seen in the image.
"

RELATED

The Norrkopings Visualization Center
"The installation is financed within the framework of the Visualization program and developed by Norrköping Vi sualization Center in cooperation with CMIV (Center for Medical Image Science and Visualization)."

May 23, 2009

Razorfish's Touch Framework "Razorfashion" - A lot like my idea for an in-home FashionMirrorAdvisor...

Razorfish recently unveiled the Razorfashion application designed to provide shoppers with an engaging retail experience within the "multi-channel shopping ecosystem". I'm not the "shop till you drop" type of gal, but I can see that this concept could be useful in other situations, after a few tweaks.




As soon as I saw this Razorfish Touch "Fashion" demo video, it touched a nerve. I've been playing around with a similar idea, but for my personal use, in the form of an RFID-enabled system. I'd call it something like "FashionMirrorAdvisor".

Instead of showing skinny fashion models like the Razorfashion application does, I'd harness the power of a built-in web-cam and mirror my own image on the screen. My mirror would dress me up in the morning when I'm way too foggy to think about matching colors and accessories.

My FashionMirrorAdvisor would be my friend. My "smart" friend, since all of my clothes would be RFID-tagged, along with my shoes, jewelry, and other accessories. My make-up, too.

It would be a no-brainer. I really could use this application - just ask my husband!

Most mornings I find myself staring at the clothes in my closet, frozen in time, unable to formulate a fashion thought. I might set my eyes on a favorite blouse, but blank out when I try to think about the rest of the steps I need to pull my look together.


I know I can't wear my reddish-pink camisole with my dusty-orange/brown slacks, but at 5:15 A.M., who has the time to think about this little detail? My friend, the FashionMirrorAdvisor, would prevent me from making this fashion faux pas.

No problem.

My FashionMirrorAdvisor would show me a few outfits, and dress my real-time moving image on the screen. Since she knows all things, she'd show me ONLY the articles of clothing that were clean, since my RFID system would keep up with all of that. It would be much more functional than a "virtual wardrobe" application.

I could try out different earrings without having to get them out.

If I couldn't find something, the RFID system would take care of this detail. My FashionMirrorAdvisor would know where I misplaced my clothes, accessories, and even my keys, since they would all be tagged. The mirror application would provide me with a nice little map of my house and car, and highlight the location of the item.

My FashionMirrorAdvisor would keep track of my laundry, too. This would be a great feature. So if my dirty laundry was piling up, and I wanted to wear outfit X, Y, or Z over the next few days, I'd receive a gentle reminder that I'd need to do some laundry first!
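
For fun, here is a minimal Python sketch of the bookkeeping a FashionMirrorAdvisor would have to do with its RFID data: every item carries a clean/dirty flag and a last-seen location, outfit suggestions only pair clean items, and misplaced things can be looked up. All of the names, fields, and the little color-clash rule are made up for illustration.

from dataclasses import dataclass

@dataclass
class WardrobeItem:
    name: str
    category: str   # "top", "bottom", "accessory", ...
    color: str
    clean: bool     # updated by the (hypothetical) laundry tracking
    location: str   # last place reported by the RFID readers

wardrobe = [
    WardrobeItem("favorite blouse", "top", "reddish-pink", True, "bedroom closet"),
    WardrobeItem("white blouse", "top", "white", True, "bedroom closet"),
    WardrobeItem("dusty-orange slacks", "bottom", "orange-brown", True, "bedroom closet"),
    WardrobeItem("navy slacks", "bottom", "navy", False, "laundry basket"),
    WardrobeItem("silver earrings", "accessory", "silver", True, "car"),
]

CLASHES = {("reddish-pink", "orange-brown")}  # toy color-matching rule

def suggest_outfits(items):
    """Pair clean tops with clean bottoms, skipping known color clashes."""
    tops = [i for i in items if i.category == "top" and i.clean]
    bottoms = [i for i in items if i.category == "bottom" and i.clean]
    return [(t, b) for t in tops for b in bottoms
            if (t.color, b.color) not in CLASHES and (b.color, t.color) not in CLASHES]

def locate(items, name):
    """Answer 'where did I leave it?' from the last RFID reading."""
    return next((i.location for i in items if i.name == name), "unknown")

for top, bottom in suggest_outfits(wardrobe):
    print(f"Wear the {top.name} with the {bottom.name}")
print("Earrings are in the:", locate(wardrobe, "silver earrings"))

The real work, of course, is in keeping those clean/location fields up to date from the tag readers; the suggestion logic on top of them is the easy part.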

Another practical feature:

My FashionMirrorAdvisor would also serve as my health consultant, keeping track of my weight and BMI. This data, along with information gained from the webcam, would be combined so that my advisor would NEVER suggest an outfit that would be too...snug.

I could program the system to provide me with gentle reminders if my weight was an issue. My FashionMirrorAdvisor would show me images of myself "before" and "after", outfits included.

Information about the "after" outfits could be fed to the system from the web-catalogs of my favorite fashion retailers, and once I lost those 10 darned pounds, I'd find a nice parcel delivered to my door.

Thanks to my FashionMirrorAdvisor, I know that the outfit would be just right.


UPDATE 5/8/10:  The FashionMirrorAdvisor would be integrated with a mobile app - since I now have a smartphone, this would be quite useful in planning shopping trips centered around the purchase of new clothes, shoes, accessories, and coordinating cosmetics!  I created a little game  that I think would be ideal for this sort of thing, too.

I still want to work on this....someday.

Too many ideas, too little time!


RELATED
From the Razorfish site:
"The Razorfish Emerging Experiences team is a dedicated group of highly experienced professionals focused solely on emerging experiences and technologies. "Effective innovation" is our multifaceted approach to concepting and delivering pioneering solutions for our clients"

"Founded in 2008, Razorfish Emerging Experiences is a cross-functional team composed of strategists, artists, experience designers, and technologists. We’re part of the Razorfish Strategy & Innovation practice led by Shannon Denton. Jonathan Hull is the managing director of the team, Steve Dawson is the technology lead and Luke Hamilton is the creative lead."


Razorfish Emerging Experiences Portfolio

Jun 17, 2006

Interactive literacy applications and on-line resources

Here are some literacy links:

STARFALL
This extensive, free resource works well in situations such as peer-tutoring, small group instruction, and home learning interventions. Good potential for reading intervention research, and also for interventions/support for ELL students. Impressive on an interactive whiteboard. From the website: "Starfall's online reading programs, together with Starfall's printed materials, are available free from our website as a public service. The website is easy to access and does not require special software to operate. Our printed materials are available for download free of charge. Starfall's educational objectives are consistent with the findings of National Research Council in their report on Preventing Reading Difficulties in Young Children."

EDUTOPIA ARTICLE ABOUT AUDIOBOOKS- "Listening to Literature"

EDUTOPIA ARTICLE ABOUT TECHNOLOGY AND READING - "Tech Teaches"


LEXIA LEARNING SYSTEMS
Lexia provides interactive software applications for reading. A Spanish version is available. The Lexia Cross-Trainer provides interactive games to develop visual-spatial problem-solving skills. For more information, visit the Lexia products link and the support library.


"HeadSprout's secure, web-based performance reports summarize the online instruction into practical information for teachers and administrators. Automated performance reports provide objective, empirical information about each student's individual strengths and weaknesses and progression within the Headsprout reading program."
Edmark and Riverdeep offer a range of educational applications geared towards special learning needs. Some of the validity research regarding these products is dated. (Research that compares the effectiveness of these products with other interactive reading and academic programs is needed.)

LITERACY RESOURCES FOR SPECIAL NEEDS

Resources from UConn to support reading and writing for students with special needs.

NORTH CENTRAL REGIONAL LIBRARY ARTICLE ON TECHNOLOGY AND LITERACY - "Critical Issue: Using Technology to Enhance Literacy Instruction"

READ PRINT
Good resources for reading materials. Includes biographical information about various authors. According to the website: "Our website offers thousands of free books for students, teachers, and the classic enthusiast."

VCOM3D INC

From the website: "This technology provides interactive, 3D animated characters that communicate through body language (including gesture, signs, and facial expression), lip-synched speech, and action. Their embedded behaviors reflect each character‚’s individual personality, and makes them appealing and lifelike."

"A striking increase in attention span, their level of engagement, and results. During an evaluation at the Florida School for the Deaf and Blind, the increase in comprehension when shifting from text-only to text accompanied by sign language using SigningAvatar technology was 17% to 67%."


INTERVENTION CENTRAL

A great on-line resource is Intervention Central, created by Jim Wright, a school psychologist and school administrator. On this site, you will find several on-line tools, downloads, and links to a variety of academic and behavioral intervention strategies. Many of the interventions focus on literacy. There are some technology links on the site.

Update 3/24/10:  READ 180

"READ 180 is a comprehensive reading intervention program proven to meet the needs of struggling readers in transitional Grades 3–12. The program directly addresses individual needs through differentiated instruction, adaptive and instructional software, high-interest literature and non-fiction, and direct instruction in reading, writing, and vocabulary skills."