
Jun 6, 2009

Interactive Touch Tables are Multiplying! Ideum's new 100" multi-touch museum display; Ubisoft's Ruse on a Surface....

http://www.ideum.com/blog/wp-content/uploads/100-multitouch-table-cropped-1024x724.jpg
Photo from the Museum Exhibit and Design News/Ideum Blog

The table above, which measures 100" and was created for the Sudekum Planetarium at Nashville's Adventure Science Center, is based on Ideum's mt2 table: "Built for museum environments, the high-resolution mt2 Table supports intuitive gestures, allowing visitors to interact and explore custom exhibits. With multiuser support, mt2 Table serves as digital gathering place, allowing visitors to interact with museum content and each other."

Two computers power this table. One runs Natural User Interface's Snowflake, which handles the optical sensing and touch tracking; the second runs the EM spectrum application, developed in Adobe Flash. It is a work in progress. For more information, visit the Ideum blog.

Here is a video trailer of Ubisoft's R.U.S.E. played on a multi-touch table. Much of the trailer appears to be a simulation. I liked the part where the battleships are traveling over the water. According to a blog post, the table depicted in the trailer is the Intuiface, by Intuilab.

Photos of Ubisoft's R.U.S.E. game, running on a multi-touch table:

http://www.blogcdn.com/www.joystiq.com/media/2009/06/rusesurfacea.jpg

http://static2.cdn.ubi.com/emea/gamesites/ruse/community/uk/RUSEBLOG.005.jpg
Photo from the Ubisoft Blog

Intuiface Information PDF
Intuilabs Website

If you visit the Intuilabs website, be sure to interact with the photos of the team! I couldn't get the pictures to change when I touched the screen of my HP TouchSmart PC, so I think they are activated by a mouse-over. It would be nice if they touch-enabled their website.

Information about Touch Screens, Multi-Touch, & Gesture Interaction is Spreading

Since the news about Windows 7's multi-touch capabilities spread, I haven't had enough time to keep up with all of the information related to multi-touch interaction. Fortunately, a few bloggers out there are doing a great job filling in the gaps.

The Touch User Interface blog has a wealth of information in the form of pictures, video clips, slides, and links that I'd like to share.

The following slideshow/videos were highlighted in the Touch User Interface blog post, "Touch UI: HCI Viewpoint":

Untold Stories of Touch, Gesture, & NUI

Joe Fletcher, Design Manager, Microsoft Surface

Touch and Gesture Computing, What You Haven't Heard
Dan Saffer



Other posts of interest on the Touch User Interface blog:
Touch screens and vision impairment
Link: Designing the Palm Pre: An Interview with Michelle Koh

Touch User Interface Overview

I've added some information about UX, interactive multimedia, multi-touch, and gesture interaction to my Multimedia and Interaction Resources page, which is a work in progress.


Jun 5, 2009

White House Blog: Interactive Debategraph for Open Government Transparency.

Infoviz for Government!

I came across the following graphic of the interactive "Debategraph" from the White House Blog:

Debategraph screenshot

When you visit the Debategraph website, you can click on each item to see the map reconfigured according to your selection. Clicking on the graphic above will take you to the site.

According to the White House Blog post, "Debategraph is a visual policy mapping tool that is being used for running citizen engagement on climate change in Europe. Debategraph translated our mindmap of the redacted transparency proposals into the interactive Debategraph. In this format, the different proposals are rateable, addressable, and open to collaborative editing. People can also add supporting and opposing arguments to the proposals. "The aim with visual policy maps of this kind is to collaboratively weave together all of the salient proposals and arguments dispersed through the community into a single rich, transparent structure—in which each idea and argument is expressed just once—so that anyone can explore quickly and gain a good sense of the perceived merits of the relevant choices," says David Price, Debategraph's co-founder."

More from Multitouch Barcelona!

Multitouch Barcelona is up to more great work!

Multitouch Space Invaders XL

Multitouch Space Invaders XL from Multitouch Barcelona on Vimeo.



Guten Touch, by Multitouch Barcelona

GUTEN TOUCH from Multitouch Barcelona on Vimeo.



"Designed for the Red Bull Music Academy 08, Guten Touch is an interactive installation that involves people into a natural relationship with technology. A two projected display system plus a 3m x 2m multitouch wall showcase applications designed to engage us into human friendly experiences rather than flashy and jaw-dropping visualizations. Space Invaders hit by foam balls, pixel paintings created with brushes and digital objects held by hands try to blur boundaries between real and digital."

Multi-touch Barcelona (new site)
"Multitouch Barcelona is a recently born interaction design group that explores natural communication between people and technology. They design touch sensitive environments where real world interactions move to a digital context. Interfaces where people touch, play, move, feel... Where senses play their natural role, where everything just 'happens' as it does in the real world."




RELATED
Offf 2009: International Festival for the Post-Digital Creation Culture

Jun 2, 2009

Updates about NextWindow and Stantum; Upcoming Emerging Display Technologies Conference

Here is a brief update about two companies that I follow:

NextWindow Granted Key Optical Touch Screen Patents (pdf)
Pleasanton, CA – June 1, 2009 – "NextWindow, the leader in optical touch screens for all-in-one PCs and large-format displays, has been awarded two key technology patents, one in the US and another in China. The newly granted patents, which refer to optical touch systems incorporating light emitters, reflectors and detection methods, help cement NextWindow's leadership positions in the important Chinese manufacturing and US sales markets."

"The US patent, number 7,538,759, issued by the United States Patent & Trademark Office on May 26, 2009, is titled, “Touch Panel Display System with Illumination and Detection Provided from a Single Edge.” NextWindow previously was granted this patent in Australia, and a request for patent is pending in Canada, Europe, Hong Kong and Japan..."

You can follow NextWindow on Twitter

Stantum Granted Key Patents on Its Multi-Touch Technology

BORDEAUX, France, June 1, 2009 – "Stantum, a pioneer developer of multi-touch solutions and systems since 2002, announced today that both the European Patent Office and the China Patent & Trademark Office have granted patents (EP1719047 and CN100447723C, respectively) to Stantum on its multi-touch technology."

"In 2004, under its former name, JazzMutant, Stantum became the first company ever to develop and bring to market a multi-touch electronic device – the award-winning Lemur remote controller for creative professionals. The recently granted European and Chinese patents extend the original patent filed in France in February 2004."

"The patents describe a method and a system for controlling electronic devices by manipulating graphic objects on a transparent multi-contact touch panel. Beyond the process enabling the detection and tracking of an unlimited number of simultaneous contact points on a touch screen, the patents disclose various multi-touch interaction techniques, such as applying specific behavior to graphic objects according to finger gestures...."

Stantum's Quarterly Newsletter

Here is an industry-related 1-day conference that looks interesting!

2009 Emerging Display Technologies Conference: Innovation for the Next Wave of Growth

"Emerging display technologies offer alternative performance, cost, design, and business models to mainstream display technologies. From touch screens, flexible displays, OLED displays, e-paper displays, and pocket projectors to 3D displays, this 1-day conference will explore how new display technologies can bring innovative form factors, attractive visual performance, power saving, and potentially drive growth in the near future."

Thursday, September 3, 2009 8:00 AM - 5:30 PM

San Jose Marriott
301 S. Market Street
San Jose, California 95113
USA
408-280-1300


Jun 1, 2009

3DV Systems Motion Sensing Camera and Microsoft's Xbox 360

A while ago, an article in Reuters discussed Microsoft's plans to buy 3DV Systems, a company that develops virtual reality imaging technology for digital cameras, called ZCams. The ZCams would be used as an accessory to the Xbox 360 system.

It is true, according to a recent article in Business Week:

Microsoft Moves onto Nintendo's Motion Turf: "The software giant's Xbox 360 gaming console will have a camera system that lets users bring body movements and voice commands into play"

Information about the camera from the 3DV website:
"The new ZCam™ (previously Z-Sense), 3DV's most recently completed prototype camera, is based on DeepC™ and is the company's smallest and most cost-effective 3D camera. At the size of a standard webcam and at affordable cost, it provides very accurate depth information at high speed (60 frames per second) and high depth resolution (1-2 cm). At the same time, it provides synchronized and synthesized quality color (RGB) video (at 1.3 M-Pixel). With these specifications, the new ZCam™ (previously Z-Sense) is ideal for PC-based gaming and for background replacement in web-conferencing. Game developers, web-conferencing service providers and gaming enthusiasts interested in the new ZCam™ (previously Z-Sense) are invited to contact us."


Video of 3DV Systems at CES 2008

Sony PlayStation EyeToy: Therapy for Burn Patients

Here is an excerpt from a recent article about the work of Dr. Joseph Haik, a plastic surgeon and burn specialist who has pioneered a therapy that utilizes Sony's PlayStation EyeToy:

"The EyeToy is a digital camera, similar to a webcam, whose technology uses computer vision and gesture recognition to process images, enabling players to interact with games using motion and color detection."

"With our method, patients look into the EyeToy and see their images projected on TV," Haik says. "The game recognizes their gestures and shows them to themselves on screen, helping them adjust to what they look like post-burn. That can help combat depression, improve self-image, and encourage patients to move on when other occupational therapies fail," he explains.


"A very important aspect of healing is coming to terms with scars on the face, hands and other exposed body parts. Depression and other symptoms associated with severe burns can make a full recovery more painful and difficult than it may need to be. That's why the EyeToy can be so useful in helping patients to take the first step in accepting a new self-image, Haik says."

SensoramaLab - Interactive Systems for Rehabilitation Applications

The SensoramaLab at Aalborg University Esbjerg is involved in a variety of research activities focusing on interactive systems, including virtual reality, for use in rehabilitation.

Pictures from the SensoramaLab website:

http://sensoramalab.aaue.dk/thumbs/sensorama_screen_silhouet_island_thumb.jpg
Immersion for therapy of acrophobia; Inside the Flying Cities; Flying Cities
http://sensoramalab.aaue.dk/thumbs/sensorama_screen_city_thumb_bis.jpg
Space Race; Magic Carpet
http://sensoramalab.aaue.dk/pics/sensorama_bigcow.jpg
Their stand at the Scandinavian Rehabilitation Messe 2008, Copenhagen

International Society for Virtual Rehabilitation

ArtAbilitation

May 31, 2009

RENCI at UNC-Chapel Hill: Serious Gaming and Simulation, Unity 3D Platform

VIDEO
Running the Unity 3D Game Engine in a Global Immersion 4-Projector Dome System at the RENCI@UNC Engagement Center



For more information:
Serious Gaming and Simulation at RENCI@UNC

"The application of game technology to teaching, training, and research has been a topic of interest at RENCI@UNC. More specifically, the adaptation of game engines and 3D environments to the specialized visualization environments at the engagement center is a promising area for experimentation and collaboration... We very recently obtained copies of Unity, and have been working through some of the basics. One of the first activities was to create specialized code to show Unity-built 3D environments in our Global Immersion dome system. This was successfully accomplished with a good bit of code-hacking to render the proper camera views and viewports for the four channel dome system. There are some known issues with the Windows stand-alone viewer that we were able to work around. We did some imaginative manipulation of the Unity Plug-in using a browser embedded within a WPF application that allows arbitrary screen resolutions, as well as spanning across multiple displays. Look for a post later with some technical details."

Here are some pictures from RENCI:

http://unc.renci.org/wordpress/wp-content/themes/renci2.0/images/sections/resources/visualization-resources-b.jpg

http://unc.renci.org/wordpress/wp-content/uploads/2009/03/dome-001-300x225.jpg
http://farm4.static.flickr.com/3085/3177205942_79d56b3481.jpg

May 30, 2009

Ball's Plight, an iPhone and iPod Touch Game App by PriPri Games (plug for a former classmate) -and more games-related info....

Priyesh Dixit, a game developer, has a free game app for the iPhone and iPod Touch. "Ball's Plight" can be downloaded from the iTunes Store. Here is the description:

"You are Ball, an alien with telekinetic powers and no limbs. An evil scientist has captured you and is forcing you to play his little game so he can unlock the secrets of your powers. You must survive in order to escape!"

"At its core, Ball's Plight is a classic game of ball and paddle. Touch the ball to the paddle to increase your score, reach the goal score to advance to the next level while earning points."

http://priyesh.org/pics/BallsPlight2.jpg
http://priyesh.org/pics/BallsPlight1.jpg

Priyesh was one of my classmates in Dr. Michael Youngblood's Artificial Intelligence for Game Design Class at UNC-Charlotte during the fall of 2006. He went on to work with Dr. Youngblood and Hunter Hale, a Ph.D. student, on the Common Games Understanding and Learning Toolkit (CGUL).

The mission of the CGUL project:

"To identify, develop, collect, and integrate a suite of tools and techniques for improving the understanding of interaction and behavior in all FPS/3PS games, incorporating knowledge as game artifacts, and learning how to advance game artificial intelligence through these mechanisms." (CGUL downloads)

About Priyesh:

Priyesh is a programmer at TimeGate Studios in Sugar Land, TX. He has an MS in Computer Science with a certificate in Game Design and Development.

Priyesh twitters as "madgamer7", if you'd like to follow him.


RELATED:


http://playground.uncc.edu/images/header.jpg
The Playground: UNC-Charlotte's Game Design and Development Program

http://playground.uncc.edu/GameIntelligenceGroup/GIG-L1.png
UNC-Charlotte's Game Intelligence Group, led by Dr. Michael Youngblood


http://playground.uncc.edu/Game2Learn/images/buttonlogo.png

UNC-Charlotte's Game2Learn Group, led by Dr. Tiffany Barnes

Publication:
D. Hunter Hale, G. Michael Youngblood, & Priyesh N. Dixit, "Automatically-generated Convex Region Decomposition for Real-time Spatial Agent Navigation in Virtual Worlds" (pdf), Association for the Advancement of Artificial Intelligence
(Voted Most Influential Game AI Publication of 2008)

Game:
Incursion, an Xbox game created by UNC-Charlotte Game Lab students, is available for download from the Xbox LIVE Marketplace website.

May 28, 2009

Multi-player multi-touch: "NuMTyPysics", based on Tim Edmond's Numpty Physics (similar to Crayon Physics)

I have Numpty Physics on my Nokia n800 internet tablet, and Crayon Physics on my HP TouchSmart PC. Both are designed for single touch, and are fun to play. Since my TouchSmart can handle duo-touch input, I wondered what Crayon Physics might be like if it supported two players at once.

http://www.tuxi.com.ar/wp-content/uploads/numpty-physics-tuxi.jpg

As you can see from the video, Thomas Perl and his colleagues have figured this out, at least with Numpty Physics!



The music is worth the watch. It's by Triplexity.

Numpty Physics and Crayon Physics both use the Box2D physics engine. Here is some information from the NuMTyPysics website:

"NuMTyPYsics are our enhancements to Tim Edmond's NumptyPhysics game. We added support for receiving TUIO messages from tbeta via PyTUIO by embedding an Python interpreter into the NumptyPhysics code. Currently, we simply emulate mouse input by pushing hand-crafted input events (SDL_Event) onto the SDL event queue (SDL_PushEvent). In the future, we plan to do bi-directional communication between the game engine (written in C++) and our multi-touch handling code, which will be written in Python."
-Thomas Perl
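The emulation approach Perl describes, receiving normalized TUIO cursor positions and pushing hand-crafted mouse events, can be sketched in plain Python. This is a simplified stand-in of my own (the real project uses PyTUIO and SDL_PushEvent), showing just the coordinate scaling and event construction:

```python
# Simplified sketch of TUIO-to-mouse emulation (hypothetical names).
# TUIO cursors report positions normalized to [0, 1]; a mouse-emulation
# layer scales them to window pixels and emits press/motion events,
# which is roughly what pushing SDL_Events onto the SDL queue achieves.

def tuio_to_mouse_event(cursor_id, x_norm, y_norm, width, height, pressed):
    """Convert one normalized TUIO cursor update into a mouse-event dict."""
    px = int(x_norm * width)   # scale normalized x to window pixels
    py = int(y_norm * height)  # scale normalized y to window pixels
    kind = "MOUSEBUTTONDOWN" if pressed else "MOUSEMOTION"
    return {"type": kind, "pos": (px, py), "cursor": cursor_id}

# A new touch at the center of an 800x600 window becomes a click at (400, 300):
print(tuio_to_mouse_event(0, 0.5, 0.5, 800, 600, True))
```

In the real code, each of these would be built as an SDL_Event and handed to SDL_PushEvent, as the quote above describes.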

Note:
I've used Crayon Physics Deluxe with several of the students I work with who have severe autism. It is amazing how well they can figure out solutions for the levels. It would be even better if it could be enabled for duo-touch. The game encourages joint attention, which is a very important social interaction skill for young people with autism to develop.

May 27, 2009

My Nabaztag:tag and Mir:ror Arrived!

http://regmedia.co.uk/2009/01/13/mirror_1.jpg

I'll update this post after I've played with my new Wi-Fi and RFID toys.

For more information:

violet: Let All Things Be Connected

"The Internet of Things starts here"

May 26, 2009

GestureTek's "Cube": A Compact, Interactive, Gesture-Based Display System

GestureTek's "Cube"




From the GestureTek website:

"Introducing The Cube - a compact, turnkey, 'plug and play' interactive display unit that brings the power of gesture control to a variety of display spaces. Project the interactive 80” diagonal display onto almost any floor, wall, table or counter for branding, advertising, entertainment and product promotion. The Cube will engage customers, turn heads and drive business results."

Brochure
(pdf)

May 24, 2009

Thorsten Blum and Johann Korndörfer's Multi-touch 360 Sphere for Art Installations- and Space Invaders!

I came across the Multi-touch 360 project today and I thought it would be the best surface for a prototype I worked on a couple of years ago. Multi-touch 360 would be great for travel planning and teaching geography. I could see one of these interactive spheres in the lobby of every cruise ship!

This... on this!

The Multi-touch 360 project is the work of Thorsten Blum and Johann Korndörfer, based on their previous work on the Space Invaders 360 game, featured later in this post. It is similar to Microsoft's Sphere project.

Multi-Touch 360 First Tests Video:




Picture Collage of Multi-Touch 360 In Process:

Multi-Touch Materials Research

SPACE INVADERS 360 VIDEO


SPACE INVADERS 360


Space Invaders 360: An Interactive Multiplayer Gaming Installation That Makes You Move (pdf)
MindTrek 2008 October 6-9, 2008, Tampere, FINLAND ACM

ABSTRACT
"Space Invaders 360 is a tribute to the video game Space Invaders designed by Tomohiro Nishikado in 1978. While the original version is a two-dimensional arcade shooter game, we have created an interactive gaming installation combining the fun and thrill of the classic video game with experimental concepts of display technologies and new ways of user interaction.
Beside the technical aspects we also want to discuss the social aspects of our work. How does the gameplay change in the 21st century? What new ways of user interaction can be thought of? The success of Nintendo Wii and Nintendo DS [7] shows the interest of professional and casual gamers in new methods of user interaction. While the last decades have been dominated by video games on fixed screens, oversized gamepads and realistic graphics, gamers are now moving towards video gaming with a focus on physical interaction."



Thorsten maintains another blog, which is worth a look. I like the links he posts.

More about Microsoft's Sphere prototype:

"Our prototype device builds on a commercially available globe projection system (Global Imagination’s Magic Planet). We have added touch-sensing capabilities with an infrared camera that shares the optical path with the projector. This novel configuration permits us to enclose both the projection and the sensing mechanism in the base of the device and allows for easy 360 degrees access to the device and high degree of interactivity without shadowing or occlusion problems. We have also developed a custom graphics projection pipeline that solves the distortion problems and enables projected data to be correctly visualized on the curved surface.

The applications developed for Sphere currently include a picture and video browser, interactive globe visualization, finger painting, globe visualization, and an omni-directional video-conferencing application (360 degree panoramic video from Microsoft RoundTable device). These applications are designed for highly interactive public space environments (e.g., museums, lobbies, information booths) and highlight the appeal of the device as well as its visualization and interaction capabilities." Hrvoje Benko, Andy Wilson, Ravin Balakrishnan, Billy Chen
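The custom projection pipeline mentioned above has to remap flat imagery onto the curved surface. As a minimal illustration of the kind of mapping involved (my own simplification, not Microsoft's actual pipeline), here is the standard equirectangular mapping from latitude/longitude to normalized texture coordinates:

```python
# Equirectangular mapping: latitude/longitude in degrees to texture
# coordinates (u, v) in [0, 1]. This is the simplest way to address
# imagery that will later be warped onto a spherical display.

def latlon_to_uv(lat_deg, lon_deg):
    u = (lon_deg + 180.0) / 360.0  # longitude spans the full image width
    v = (90.0 - lat_deg) / 180.0   # latitude runs top (north) to bottom (south)
    return u, v

# The equator at the prime meridian lands at the center of the image:
print(latlon_to_uv(0.0, 0.0))
```

A real pipeline would additionally compensate for the projector's optics and the mirror geometry inside the base, which is where the distortion correction the authors mention comes in.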

Note: Microsoft's RoundTable 360 video conferencing application is now known as the Polycom CX5000 Unified Conference Station, optimized for the Microsoft Office Communicator. More information can be found on the Polycom website.

http://research.microsoft.com/en-us/um/people/benko/projects/sphere/IMG_3834_small.jpg
Microsoft's Sphere

May 23, 2009

Razorfish's Touch Framework "Razorfashion" - A lot like my idea for an in-home FashionMirrorAdvisor...

Razorfish recently unveiled the Razorfashion application, designed to provide shoppers with an engaging retail experience within the "multi-channel shopping ecosystem". I'm not the "shop till you drop" type of gal, but I can see that this concept could be useful in other situations, after a few tweaks.




As soon as I saw this Razorfish Touch "Fashion" demo video, it touched a nerve. I've been playing around with a similar idea, but for my personal use, in the form of an RFID-enabled system. I'd call it something like "FashionMirrorAdvisor".

Instead of showing skinny fashion models like the Razorfashion application does, I'd harness the power of a built-in webcam and mirror my own image on the screen. My mirror would dress me up in the morning, when I'm way too foggy to think about matching colors and accessories.

My FashionMirrorAdvisor would be my friend. My "smart" friend, since all of my clothes would be RFID-tagged, along with my shoes, jewelry, and other accessories. My make-up, too.

It would be a no-brainer. I really could use this application - just ask my husband!

Most mornings I find myself staring at the clothes in my closet, frozen in time, unable to formulate a fashion thought. I might set my eyes on a favorite blouse, but blank out when I try to think about the rest of the steps I need to pull my look together.


I know I can't wear my reddish-pink camisole with my dusty-orange/brown slacks, but at 5:15 A.M., who has the time to think about this little detail? My friend, the FashionMirrorAdvisor, would prevent me from making this fashion faux-pas.

No problem.

My FashionMirrorAdvisor would show me a few outfits, and dress my real-time moving image on the screen. Since she knows all things, she'd show me ONLY the articles of clothing that were clean, since my RFID system would keep up with all of that. It would be much more functional than a "virtual wardrobe" application.

I could try out different earrings without having to get them out.

If I couldn't find something, the RFID system would take care of this detail. My FashionMirrorAdvisor would know where I misplaced my clothes, accessories, and even my keys, since they would all be tagged. The mirror application would provide me with a nice little map of my house and car, highlighting the location of the item.

My FashionMirrorAdvisor would keep track of my laundry, too. This would be a great feature. So if my dirty laundry was piling up, and I wanted to wear outfit X, Y, or Z over the next few days, I'd receive a gentle reminder that I'd need to do some laundry first!
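The core rule behind all of this, suggesting only outfits whose every tagged item is currently clean, is easy to sketch (a purely hypothetical data model; all names are mine):

```python
# Hypothetical sketch: an outfit qualifies only if every RFID-tagged
# item in it appears in the laundry tracker's set of clean tags.

def wearable_outfits(outfits, clean_tags):
    """Return the names of outfits whose items are all clean."""
    return [name for name, items in outfits.items()
            if all(tag in clean_tags for tag in items)]

wardrobe = {
    "office": ["blouse-01", "slacks-02", "shoes-03"],
    "casual": ["tee-04", "jeans-05", "shoes-03"],
}
clean = {"blouse-01", "slacks-02", "shoes-03", "jeans-05"}
print(wearable_outfits(wardrobe, clean))  # the "casual" tee is still in the hamper
```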

Another practical feature:

My FashionMirrorAdvisor would also serve as my health consultant, keeping track of my weight and BMI. This data, along with information gained from the webcam, would be combined so that my advisor would NEVER suggest an outfit that would be too...snug.

I could program the system to provide me with gentle reminders if my weight was an issue. My FashionMirrorAdvisor would show me images of myself "before" and "after", outfits included.

Information about the "after" outfits could be fed to the system from the web-catalogs of my favorite fashion retailers, and once I lost those 10 darned pounds, I'd find a nice parcel delivered to my door.

Thanks to my FashionMirrorAdvisor, I know that the outfit would be just right.


UPDATE 5/8/10: The FashionMirrorAdvisor would be integrated with a mobile app - since I now have a smartphone, this would be quite useful in planning shopping trips centered around the purchase of new clothes, shoes, accessories, and coordinating cosmetics! I created a little game that I think would be ideal for this sort of thing, too.

I still want to work on this....someday.

Too many ideas, too little time!


RELATED
From the Razorfish site:
"The Razorfish Emerging Experiences team is a dedicated group of highly experienced professionals focused solely on emerging experiences and technologies. 'Effective innovation' is our multifaceted approach to concepting and delivering pioneering solutions for our clients."

"Founded in 2008, Razorfish Emerging Experiences is a cross-functional team composed of strategists, artists, experience designers, and technologists. We’re part of the Razorfish Strategy & Innovation practice led by Shannon Denton. Jonathan Hull is the managing director of the team, Steve Dawson is the technology lead and Luke Hamilton is the creative lead."


Razorfish Emerging Experiences Portfolio