Showing posts sorted by date for query multitouch multi-touch.

Oct 13, 2013

UPDATE: NUITEQ's Snowflake ABC has been updated, improved, and retitled: Snowflake MultiTeach®

August 24, 2016 Update

This post was originally published on October 13, 2013, and focused on Snowflake ABC, a software suite of entertaining educational apps developed by NUITEQ®.

Fast forward to 2016, and you'll see that the NUITEQ® team has made many great improvements. Along with the changes came a new name: Snowflake MultiTeach®.


Snowflake MultiTeach® was designed for multi-touch, multi-user displays and tables, but can be used on single-touch displays or traditional computers/laptops with a mouse. It works well for "front-of-class" presentations and has many features that support collaborative group work.



So what is new?

The Lessons app was added to the suite of applications, along with Nodes, a concept-mapping and presentation application that supports project- and problem-based learning, STEM, and STEAM.

The Lessons app is a powerful addition to Snowflake MultiTeach®. Within this app, teachers (and students) can build a variety of interactive, engaging lessons, learning games, and quizzes with little effort.

Snowflake MultiTeach® can support several lessons and apps running at the same time, using the Zones feature, which makes it ideal for students working in pairs or small groups.

To complement the Lessons app, the online MultiTeach® Lessons Community was developed to provide a way for teachers to share lessons they've created. The lessons posted on the MultiTeach® Lessons Community can be accessed from within the Lessons home screen. Lessons can be searched by subject and level and imported directly into the Lessons home screen.

Lessons created with Snowflake MultiTeach® can be exported directly from the Lessons home screen to the MultiTeach® Lessons Community so they can be shared with others. Snowflake MultiTeach® and a MultiTeach® Lessons Community account are required for access.

To support teachers as they make the transition from single-touch whiteboards to multi-user displays, the MultiTeach® Pinterest site was created.  There you can find curated and regularly updated information about how to use Snowflake MultiTeach®, 21st century teaching and learning, free digital resources for the classroom, and much more.

For information about how teachers can integrate Snowflake MultiTeach® into their classrooms, take some time to read the following blog posts. Share with your peers!

Infuse Arts into STEM Projects Using Collaborative Touch Screen Technology

Collaborative STEM Education and Careers Exploration

How to MultiTeach®: Introduction to Snowflake MultiTeach® Nodes App

Differentiated instruction through Snowflake MultiTeach®


Like what you see?

If you are interested in replacing your current single-touch interactive whiteboard with a multi-touch display, contact a representative from NUITEQ® by following the email link on the website, and you will be connected to someone in your region who can assist you. 

Here is the old post for reference.  

FYI: Most of the ABC apps shown below can be found within the current version of Snowflake MultiTeach®.

NUITEQ is a multi-touch & gesture tech company based in Sweden focused on collaborative natural user interfaces and interaction. Instead of using a traditional mouse and keyboard, NUITEQ's products support the effective use of fingers and hands to interact and explore digital content. I've been following this company since it was launched by Harry van der Veen in 2007 and have been happy to share NUITEQ's updates on this blog over the years.

NUITEQ's products can be found in public spaces, such as museums, retail centers, and transportation hubs,  at special events, such as trade shows and exhibitions, in collaborative work settings, such as board rooms and design studios, and in educational settings.   

One of the latest innovations from NUITEQ is Snowflake ABC, multi-touch software for children and teens. It runs on a range of multi-touch hardware surfaces, including Lenovo, 3M Touch Systems,  Zytronic, Flatfrog, PQ Labs, Touch International, Lumio, and others.  



Multitouch software Snowflake ABC for kids (K-12) from multitouch software on Vimeo.

In my opinion, Snowflake ABC would be great for use in homes, libraries, and classrooms. Although the video shows children using the software on large displays and tables, it works well on smaller 27-inch touch screens, preferably those that support multi-touch interaction. A free trial version is available. For parents considering purchasing a home computer, the Lenovo IdeaCentre A520 and other multi-touch "all-in-ones" come loaded with a variety of Snowflake apps.

Below is a screenshot of various Snowflake ABC educational apps, from the Snowflake Suite website.


Credit: NUITEQ

RELATED
Multi-touch software Snowflake ABC 1.3 for kids released (K-13) - Now with 18 languages out of the box!  NUITEQ Blog, 9/26/13
Snowflake ABC Brochure (pdf)
NUITEQ Signs OEM Agreement with Lenovo for its Multitouch Software Platform Snowflake Suite PRWEB, 8/31/12
Review: Lenovo IdeaCentre A520
Jason Palaszewski, 8/22/13, Lenovo Blog

Snowflake Interactive Display at Skellefteå Airport (Snowflake Suite)

Jan 10, 2013

Gesture Markup Language (GML) for Natural User Interaction and Interfaces

Quick post:
"GML is an extensible markup language used to define gestures that describe interactive object behavior and the relationships between objects in an application.  Gesture Markup Language has been designed to enhance the development of multiuser multi-touch and other HCI device driven applications." -Gesture ML Wiki

GestureML was created and maintained by Ideum. 

More information to come!
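To give a flavor of how a markup language like GML can describe gestures and object behavior, here is a simplified, hypothetical GML-style snippet parsed with Python's standard library. The element and attribute names below are illustrative inventions, not GestureML's actual schema; see the Gesture ML Wiki for the real format.

```python
import xml.etree.ElementTree as ET

# A simplified, GML-style gesture definition (illustrative only --
# consult the GestureML wiki for the real schema).
GML_SNIPPET = """
<GestureSet>
  <Gesture id="two-finger-rotate" type="rotate">
    <Match minPoints="2" maxPoints="2"/>
    <Mapping property="rotation" target="object"/>
  </Gesture>
  <Gesture id="one-finger-drag" type="drag">
    <Match minPoints="1" maxPoints="1"/>
    <Mapping property="position" target="object"/>
  </Gesture>
</GestureSet>
"""

def load_gestures(xml_text):
    """Parse gesture ids and their touch-point requirements from a GML-style document."""
    root = ET.fromstring(xml_text)
    gestures = {}
    for g in root.findall("Gesture"):
        match = g.find("Match")
        gestures[g.get("id")] = {
            "type": g.get("type"),
            "min_points": int(match.get("minPoints")),
            "max_points": int(match.get("maxPoints")),
        }
    return gestures

print(load_gestures(GML_SNIPPET))
```

The appeal of this approach is that gesture behavior becomes declarative data rather than application code, so the same definitions can be reused across applications and devices.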
The Pano
Photo credit: Ideum

RELATED
Ideum Blog

OpenExhibits: Free multitouch and multiuser software initiative for museums, education, nonprofits, and students

GestureWorks: Multi-touch authoring for Windows 8 & Windows 7



Nov 17, 2012

Human Computer Interaction + Informal Science Education Conference (NUI News)

I recently learned of the HCI + ISE conference, funded by the National Science Foundation and organized by Ideum and Independent Exhibitions, which will provide the groundwork for the future design and development of interactive computer-based science exhibits.
Science museums have a long history of interactivity, well suited to groups of "explorers", such as families or students visiting on a field trip.  

What is really exciting is that new interactive applications and technologies have the power to transform the way people learn and understand science in a collaborative and social way. Innovations in the field of HCI (Human-Computer Interaction), such as multi-touch and gesture interaction, are well-suited to meet the goals of science education for all, beyond the school doors and wordy textbooks.

Below is a screenshot of the conference website, a description of the conference quoted from the site, and some related resources.



About the HCI+ISE Conference
"HCI technologies, such as motion capture, multitouch, augmented reality, RFID, and voice recognition are beginning to change the way computer-based science exhibits are designed and developed. Human Computer Interaction in Informal Science Education (HCI+ISE) is a first-of-its-kind gathering to explore and disseminate effective practices in developing a new generation of digital exhibits that are more intuitive, interactive, and social than their predecessors."
"The HCI+ISE Conference, to be held in Albuquerque, New Mexico June 11-14 2013, will bring together 60 museum exhibit designers and developers, learning researchers, and technology industry professionals to share effective practices, and to explore both the enormous potential and possible pitfalls that these new technologies present for exhibit development in informal science education settings."
"HCI+ISE will focus on the practical considerations of implementing new HCI technologies in educational settings with an eye on the future. Along with a survey of how HCI is shaping the museum world, participants will be challenged to envision the museum experience a decade into the future. The conference results will provide a concrete starting point for exhibit developers and informal science educators who are just beginning to investigate these emerging technologies and design challenges in creating these new types of exhibits."
Why HCI+ISE?
"Since the mid-1980s informal educational venues have increasingly incorporated computer-based exhibits into their science communication offerings in an effort to keep pace with public expectations and make use of the expanding opportunities these technologies provide. The advent and popularity of once novel HCI technologies are becoming commonplace: the Wii and Microsoft Kinect now allow for motion capture video games, tablet PCs have multitouch interaction, and smart phones and other devices come standard with voice recognition. Yet many museums are still developing single-touch and trackball-driven, single-user computer kiosks."
"Science museums have a long history of championing hands-on, physical, and inquiry-based activities and exhibits. This vast experience has only just begun to be applied to interactive computer interfaces. Along with seasoned science exhibit developers, the Conference will draw upon individuals outside of ISE who will provide fresh insight into the technologies, design issues, and audience expectations that these visitor experiences present."
Involvement and Findings
"HCI+ISE will bring together a diverse group of practitioners and other professionals to discuss (and in some cases share and prototype) new design approaches utilizing emerging HCI technology. Please see our Apply page to learn how you can participate. Conference news and findings will be distributed through a variety of ISE and museum websites, including this one."
"We welcome your questions and comments about the HCI+ISE Conference."
CONTACTS
Kathleen McLean of Independent Exhibitions
& Jim Spadaccini of Ideum
HCI+ISE Co-chairs
"Open Exhibits is a multitouch, multi-user tool kit that allows you to create custom interactive exhibits."
CML: Creative Markup Language
GML: Gesture Markup Language
GestureWorks
Ideum

Jul 29, 2012

Blast from the 2009 past: News, Videos, and Links about Multi-touch and Screen Technologies

One of the things I like to do is share updates about the world of multimedia, multi-touch, gesture, screen, surface, and interactive technologies, focusing on off-the-desktop applications and systems. When I started this blog, I had to put forth quite a bit of effort just to FIND interesting things to blog about.  


These days, there are so many sources that focus on emerging (and now commonplace) interactive technologies that my main challenge is to filter the noise. Where do I begin?


My archives are vast. I randomly picked the year 2009 and came across one of my previous posts, "News, Videos, and Links about Multitouch and Screen Technologies." The post is long and contains a number of videos and links that will probably be of value to a future curator of the history of technology.


I welcome comments from readers who might be able to help me update information about various applications and systems I've featured on this blog in the past. 

The pictures are screenshots from the results of an  image search for "interactivemultimediatechnology".  Over the past 6 years, I've posted quite a few!

Jul 21, 2012

Musings about NUI, Perceptive Pixel and Microsoft, Rapid Creative Prototyping (Lots of video and links) Revised

It just might be the right time for everyone to brush up on 21st century tech skills. iPads and touch-phones are ubiquitous. Touch-enabled interactive whiteboards and displays are in schools and boardrooms. With Microsoft's Windows 8 and the news that the company recently acquired Jeff Han's company, Perceptive Pixel, I think that there will be good support (and more opportunities) for designers and developers interested in moving from GUI to NUI.


In the video below, from CES 2012, Jeff Han provides a good overview of where things are moving in the future.  We are in a post-WIMP world and there is a lot of catching up to do!

CES 2012: Perceptive Pixel and the Future of Multitouch (IEEE Spectrum YouTube Channel)



During the video clip, Jeff explains how far things have come during the past few years:
 "Five and 1/2 years ago I had to explain to everybody what multi-touch was and meant. And then, frankly, we've seen some great products from folks like Apple, and really have executed so brilliantly, that everyone really sees what a good implementation can be, and have come to expect it.  I also think though, that the explosion of NUI is less about just multi-touch, but an awareness that finally people have that you don't have to use a keyboard and mouse, you can demand something else beside that.  People are now willing to say, "Oh, this is something I can try, you know, touch is something I can try as my friendlier interface"."

Who wouldn't want to interact with a friendlier interface? Steve Ballmer doesn't curb his enthusiasm about Windows 8 and Perceptive Pixel. Jeff Han is happy with how designs created in Windows 8 scale for use on screens large and small. He explains how Windows 8 can support collaboration. The Story Board application (7:58) on the large touchscreen display looks interesting.

I continue to be frustrated by the poor usability of many web-based and desktop applications. I like my iPad, but only because so many dedicated souls have given some thought to the user experience when creating their apps. I often meet with disappointment when I encounter interactive displays when I'm out and about during the day. It is 2012, and it seems that there are a lot of application designers and developers who have never read Don Norman's The Design of Everyday Things!



I enjoy making working prototypes and demo apps, but my skill set is stuck in 2008, the last year I took a graduate-level computer course.  I was thinking about taking a class next semester, something hands-on, creative, and also practical, to move me forward. I can only do so much when I'm in the DIY mode alone in my "lab" at home.  I need to explore new tools, alongside like-minded others.  


There ARE many more tools available to designers and developers than there were just four years ago. Some of them are available online, free, or for a modest fee. I was inspired by a link posted by my former HCI professor, Celine Latulipe, to her updated webpage devoted to Rapid Prototyping tools. The resources on her website look like a good place to start for people who are interested in creating applications for the "NUI" era. (Celine has worked on many interesting projects that explore how technology can support new and creative interaction, such as Dance.Draw.) Below is her description of her updated HCI resources:

"New HCI resource to share: I have created a few pages on my web site devoted to Rapid Prototyping tools, books, and methods. These pages contain reviews of various digital tools, including 7 different desktop prototyping apps, and including 8 different iPad apps for wireframing/prototyping. I hope it's useful to others. Feel free to share... and please send me comments and suggestions if you find anything inaccurate, or if you think there is stuff that I should be adding. I will be continuing to update this resource." -http://www.celinelatulipe.com (click on the rapid prototyping link at the top)



IDEAS
Below are just a few of my ideas that I'd like to implement in some way. I can't claim ownership of these ideas; they are mash-ups of what comes to me in my dreams, usually after reading scholarly publications from ACM or IEEE, or attending tech conferences.
  • An interactive timeline, (multi-dimensional, multi-modal, multimedia) for off-the-desktop interaction, collaboration, data/info analysis exploration.  It might be useful for medical researchers, historians, genealogists, or people who are into the "history of ideas".  Big Data folks would love it, too. It would handle data from a variety of sources, including sensor networks. It would be beautiful to use.
  • A web-based system of delivering seamless interactive, multi-modal, immersive experiences, across devices, displays, and surfaces. The system would support multi-user, collaborative interaction.  The system would provide an option for tangible interaction.
  • A visual/auditory display interface that presents network activity, including potential intrusions, malfunctions, or anything that needs immediate attention that would be likely to be missed under present monitoring methods. 
  • Interactive video tools for creation, collaboration, storytelling.  (No bad remote controllers needed.)
  • A "wearable" that provides new ways for people to express and communicate creatively, through art, music, dance, with wireless capability. (It can interact with wireless sensor networks.)*
  • A public health application designed to provide information useful in understanding and preventing sepsis. This application would utilize the timeline concept described at the top of this list. This concept could also be useful in analyzing other medical puzzles, such as autism.
Most of these ideas could translate nicely to educational settings, and the focus on natural user interaction and multi-modal i/o aligns with the principles of Universal Design for Learning, something that is important to consider, given the number of "at-risk" learners and young people who have disabilities.

I welcome comments from readers who are working on similar projects, or who know of similar projects. I also encourage graduate students and researchers who are interested in natural user interfaces to move forward with an off-the-desktop NUI project. I hope that my efforts can play a part in helping people make the move from GUI to NUI!



Below are a few videos of some interesting projects, along with a list of a few references and links.


SMALLab (Multi-modal embodied immersive learning)


PUPPET PARADE: Interactive Kinect Puppets(CineKid 2011)



MEDIA FACADES: When Buildings Start to Twitter

HUMANAQUARIUM (CHI 2012)

 

NANOSCIENCE NRC Cambridge (Nokia's Morph project)






 
Examples: YouTube Playlists
POST WIMP EXPLORERS' CLUB
POST-WIMP EXPLORER'S CLUB II

Web Resources
Celine Latulipe's Rapid Prototyping Resources 
Creative Applications
NUI Group: Natural User Interface Group
OpenFrameworks and Interactive Multimedia: Funky Forest Installation for CineKid
SMALLab Learning
OpenExhibits: Free multi-touch + multiuser software initiative for museums, education, nonprofits, and students.
OpenSense Wiki 
CINEKID 2012 Website 
Multitouch Systems I Have Known and Loved (Bill Buxton)
Windows 8
Perceptive Pixel
Books
Natural User Interfaces in .NET: WPF 4, Surface 2, and Kinect (Josh Blake, Manning Publications)
Chapter 1 pdf (Free)
Brave NUI World: Designing Natural User Interfaces for Touch and Gesture (Daniel Wigdor and Dennis Wixon)
Designing Gestural Interfaces (Dan Saffer)
Posts
Bill Snyder, ReadWrite Web, 7/20/12

I noticed some interesting tools on the Chrome web store - I plan to devote a few more posts to NUI tools in the future.

Dec 15, 2011

Christian Bannister's Interactive Multi-touch and Gesture-based Subcycle Project. Use your hands to shape sound, create, and manipulate music. Wow!!

I came across a link to Christian Bannister's Subcycle Labs website when I was taking a look at the Creative Applications Network website.


If you have an interest in music technology and innovative gesture/multitouch applications, you'll appreciate the details that are shared on the Subcycle website.  In the meantime, take the time to watch a few of Christian's videos.  Enjoy!


Blip Shaper Walkthrough

Blip Shaper Walkthrough from christian bannister on Vimeo.
"a) creating percussive patterns with monome b) shaping the individual sounds that make up the patterns with multitouch gestures c) recording touchscreen gestures as automation d) storing, duplicating and navigation patterns e) recording the resulting audio to a dynamic buffer f) manipulating the buffer with a multitouch cut-up approach g) visualizing everything with dual screens"
Subcycle Walkthrough

Subcycle Blip Shaper from christian bannister on Vimeo.


The following information describing the Blip Shaper is from Christian's Subcycle website/blog:
"For the drum sounds I have Drumaxx running for synthesized sounds and Battery running for sampled sounds.  These are running in parallel so for each voice there is a separate patch running in each VST.  The Parameters are modified with the touchscreen independently but in all cases a single touch gesture on the X-Axis will cross fade between the sampled version of the sound and the synthesized version of the sound.  I love this because I have never seen this before and I can never decide which technique I like better.  The synthesized drums are more malleable and have more interesting parameters to play with but the sampled sounds seem more substantial.  I will post a detailed list of parameters and gestures in the future."   


Christian currently uses Max for Live (Ableton) and codes with Processing/Java, using Eclipse for organization.
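The crossfade Christian describes, where a single touch's X position blends between the sampled and the synthesized version of the same drum voice, can be sketched with a standard equal-power crossfade. This is my own minimal illustration of the idea, not Subcycle's actual code:

```python
import math

def equal_power_crossfade(x):
    """Map a normalized touch X position (0.0-1.0) to a pair of gains.

    x = 0.0 -> full sampled sound, x = 1.0 -> full synthesized sound.
    Equal-power (cos/sin) curves keep perceived loudness roughly
    constant through the middle of the fade.
    """
    x = min(max(x, 0.0), 1.0)  # clamp to the touch surface
    gain_sampled = math.cos(x * math.pi / 2)
    gain_synth = math.sin(x * math.pi / 2)
    return gain_sampled, gain_synth

def mix(sampled, synth, x):
    """Blend two mono sample buffers according to touch position x."""
    gs, gy = equal_power_crossfade(x)
    return [gs * a + gy * b for a, b in zip(sampled, synth)]
```

In a real setup the gains would be sent to two parallel VST voices (as in Christian's Drumaxx/Battery pairing) rather than applied to raw buffers, but the mapping from touch position to gain pair is the same.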

Autopilot - Subcycle
autopilot - subcycle labs from christian bannister on Vimeo.
"sound visualization, multi-touch interface, break beat performance engine, autopilot, spacialized sound, dsp, max/MSP, fm synthesis, sonic navigation, sound storm visualization, time machine, granular, interactive sound sculpture, joystick array, more at subcycle.org"


SOMEWHAT RELATED 
Community Core Vision
NUI Group
BTW, I'm wondering if Christian or other NUI Group members would like to do something with some of my ideas that incorporate interactive music. To learn more, see my post, "It must be Voronoi: Looking for ideas for my music+art+dance+tech dream...."

Nov 28, 2011

FlatFrog Multitouch Videos: Point Separation, Multi-input, Multi-user input

FlatFrog Multitouch is a company based in Sweden, founded by Ola Wassvik and Christer Fåhraeus. Its technology supports 20+ simultaneous touches and recognizes object size, a useful feature. FlatFrog screens can be optimized for a wide range of light conditions. FlatFrog's multi-touch and gesture interaction is featured in the short video clips below.


FlatFrog is gearing up for commercial release. According to the FAQs on the website, "all sizes are possible, from 5" to 100" and upward." Promethean is one of the company's investors. There is a volume manufacturing agreement with Kortek Corporation, known for industrial and gaming displays.




Thanks Touch User Interface for sharing this information! (Touch User Interface is the blog for Sensible UI, known for the ArduMT, aka the Arduino Multi-touch Development Kit)

Aug 19, 2011

MULTITOUCH NEWS: NUITEQ's interactive display at a business park; Innovation Now's multitouch anatomy application

Multitouch Display for Business Science Park Aurorum, by NUITEQ


"NUITEQ developed a customized multi-touch software solution for Corporate Reception / Lounge Areas for Business Science Park Aurorum in Luleå, based on the award-winning Snowflake Suite framework. In addition to the software, NUITEQ delivered and installed a 32 touch points multitouch dreaMTouch LCD from Germany based Elektrosil."-NUITEQ

Harry van der Veen, Multitouch Blog, 8/19/11


Multitouch Anatomy by Innovation Now


"Virtual anatomy surface computer in the shape of autopsy table that show and enable the user to use hand gestures to do the anatomy process virtually." -Innovation Now

Homam Alghorani, CEO, Innovation Now Sdn Bhd

Jul 18, 2011

Emerging Interactive Multimedia, New Models of HCI for Museum Exhibits (Course offered by Ideum's Jim Spadaccini, plus info about the MT55 multi-touch table)

Jim Spadaccini, the director and founder of Ideum, will be teaching a course on exhibit development through the University of Victoria, "Emerging Exhibits: Exploring New Models of Human Computer Interaction (HCI)." The excerpt from the course description provides a glimpse of how emerging technologies are beginning to change the museum experience:


"Computer-based interactive exhibits are undergoing a major transformation. The lone, single-user kiosk is now being replaced by multitouch tables and walls, motion-capture spaces, networked installations, and RFID-based exhibits. Advances in augmented reality, voice recognition, eye tracking, and other technologies promise even more radical change for exhibits in the near future."


I've been following Jim's journey with Ideum, a multimedia design firm that collaborates with museums and related non-profits, for many years, and I am impressed with the work of this company. In addition to his work at Ideum, Jim serves as the Principal Investigator of a National Science Foundation-sponsored open-source exhibit software project, Open Exhibits, which provides a free software development kit that supports the creation of multi-touch and multi-user software applications for museums and educational settings.


I'm happy to put in a plug for Ideum's latest product, the MT55 Platform Multi-Touch Table. It incorporates a range of features that I'm sure will meet the needs of museum visitors.  In my opinion, this table would be a fantastic resource for all types of libraries, including those in K-12 settings.

The MT55 Platform Multi-touch Table, from Ideum

The MT55 Platform Multitouch Table from Ideum on Vimeo. (Note: This video features music by Moby, the track "Sevastopol" from his current album, Destroyed. The music was used with the artist's permission. Learn more at: moby.com) - Ideum

"The thinnest, largest, most powerful multitouch table available. The MT55 Platform multitouch table houses a powerful computer and a 55-inch interactive LCD display that responds to 32 touch-points, inside a rugged aluminum body."

"The bright 55″ 1920×1080 HD display has a 5,000,000:1 contrast ratio. A wide 178-degree viewing angle accommodates multiple users around the table. The optical multitouch system supports 32 simultaneous touch points for collaborative interaction. The system is multitouch-enabled from start-up, and runs Windows 7 64-bit professional edition."


"The integrated computer is packed with power. It contains an Intel® Hyper-Threaded DualCore i5® which runs at 2.66 GHz, 8GB of RAM, and a 128 GB solid-state drive (upgradeable to an i7®)."


"The table comes complete with WIFI, Bluetooth, and Ethernet connectivity. It also has multiple HDMI outputs that allow you to easily mirror the table's display, extend the desktop, or connect to and display from another computer or HDMI device."

"Convenient, but secure ports: CAT5, HDMI, and USB 2.0 are available on both the side and bottom of the table."

"The MT55 Platform includes blue LED under-lights to illuminate the area beneath the interactive surface (custom LED colors are available). Every MT table includes a sophisticated internal cooling cell to maintain operating conditions that exceed the optimum environmental specifications for the internal components."

"The interactive surface of the MT55 Platform is protected by a sheet of hardened, crystal clear, low-iron 5mm tempered glass surface...
As an option, we offer Sevasa HapticGlas®, produced exclusively for Ideum. Micro-etched HapticGlas® provides tactile feedback, reduces fingerprints, increases scratch resistance, and directs user focus." -Ideum
 

RELATED
High-res photos of the MT55 Platform

GestureWorks Software
Open Exhibits
Ideum
Open Exhibits Tuio Kinect

May 31, 2011

Top 10 All-Time Posts on the Interactive Multimedia Technology Blog

I'm finishing up the last couple weeks of the school year, so I'll have little time to post this week.  I hope you enjoy exploring the following links!


Revised Post 8/1/06: Interactive multimedia for social skills, understanding feelings, relaxation and coping strategies


Teliris Interact TouchTable and TouchWall: Immersive Collaboration & Telepresence; DVE's Holographic Tele-Immersion Room


Games to lift stress away: Flower, flOw, (and Cloud), from thatgamecompany


Power to the Pixel Cross-Media Forum Streaming Live from London Today #PttP


HACKED KINECT MULTITOUCH using libFreenect and libTISCH (via Florian Echtler)


Link to iTV Doctor Rick Howe's post about 2D to 3D, 3D TV data points, and 3D content distributers


Temple Grandin - A gifted visual thinker, who also has autism, featured in HBO movie starring Claire Danes.  Update: Video of Claire Danes' acceptance of a Golden Globe for her performance


Algodoo physic app. for the SMART Board 800 series, supports multi-user interaction!


Wii Just Dance 2 and Kinect Dance Central: UI and Usability Approaches; Challenges for Developing Accessible Games


Interactive Touch-Screen Technology, Participatory Design, and "Getting It" -Revisited


Apr 29, 2011

Musical Multitouch/Gestural Interfaces by Osmosis

I've been pleasantly surprised by the increase in interesting multi-touch and gesture-based applications developed for musical interaction on large displays. This topic is dear to my heart; I took a computer music technology class back in 2003, and think it would have been great if this sort of thing had been available back then. Who wants to point and click around a music app for hours on end?!


The video below shows what is new from Osmosis, a company based in NY that focuses on the development of multi-touch and gesture-based software for a range of uses, including music applications.


Performance Systems for Stage/Studio from Osmosis on Vimeo.

Transparent Stage System Specs
Design
• Floating, transparent HD displays from 32"
• Haptic surface with tempered glass backing
• High gain image with wide viewing angle
• Rugged aircraft-grade aluminum build
• Enclosed projector and computer
• Minimalist style, compact footprint
• Disassembles for easy transportation


Interactivity
• Projected capacitive foil or IR bezel options
• Up to 32 simultaneous touch points
• Precise, responsive touch tracking (3mm)
• Immune to external light conditions
• Use of fingers, gloves or stylus


Technical
Display:
• DLP portable projector
• 1280×800 HD resolution
• 2500 ANSI-lumen, 1800:1 contrast
Computer:
• Mini-ITX, Core 2 Quad, 4GB RAM
• ATI Radeon X1250 graphics card
• Wireless keyboard and mouse
• Windows 7 Pro

Stuart McClean, the founder of Osmosis, shared the following information about his company:

"Osmosis is a consulting firm based in the NYC area with deep experience in interactive technology. Although we cater to a range of markets, we’re especially passionate about music production and performance. Working closely with artists, we build customized interactive systems for stage and studio. HCI technology is integrated into a range of designs including stands, desktop rigs, tables, carts and vertical screens. Interfaces are tailored to specific artist needs and combine controls, generative audio and visuals, instruments, and gestural input. Our unique and flexible systems take full advantage of multi-touch interaction and offer seamless control of Ableton Live, Traktor, or other DAWs via midi and OSC..."
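Osmosis mentions controlling Ableton Live, Traktor, and other DAWs via MIDI and OSC. To make that concrete, here is a small sketch of the OSC wire format itself, encoded with nothing but Python's standard library. The address `/track/1/volume` is a hypothetical example, not an address any particular DAW is guaranteed to expose, and this is not Osmosis's code:

```python
import struct

def osc_message(address, *floats):
    """Encode a minimal OSC message (address + float32 arguments) as bytes.

    OSC strings are null-terminated and padded to 4-byte boundaries;
    float arguments are big-endian 32-bit. A type tag string like ",ff"
    declares the argument types.
    """
    def pad_string(s):
        b = s.encode("ascii") + b"\x00"
        return b + b"\x00" * (-len(b) % 4)  # pad to a multiple of 4 bytes

    type_tags = "," + "f" * len(floats)
    msg = pad_string(address) + pad_string(type_tags)
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

# e.g. a touch interface might send a fader value to a DAW listening for OSC
packet = osc_message("/track/1/volume", 0.8)
```

In practice the packet would be sent over UDP with `socket.sendto`, and a gestural interface like the ones Osmosis builds would emit a stream of such messages as fingers move.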




For more information about the applications developed by the Osmosis team, take a look at their showcase page.  

Apr 26, 2011

Multi-touch and Gesture Interaction News and Updates You Might Have Missed (Part I)

Over the past couple of months, I've come across many interesting links related to multi-touch and gesture interaction, but I haven't had time to devote a thoughtful post to each one.  "Part I" is a collection of experimental, commercial, and non-commercial efforts by a variety of creative technologists, with a smattering of industry news that might be of interest to IMT readers. 


Ideum's MT55 HD Multitouch Table 4/19/11

New MT55 HD Multitouch Table Now Shipping,  Jim Spadaccini, Ideum Blog 4/11/11

Smithsonian American Art Museum to Open Education Center  Sara Beladi, NBC Washington News, 4/4/11 (Rumor has it that the Smithsonian American Art Museum might include touch and multi-touch displays in its plans for a new education center.  The center was funded by an anonymous $8 million gift.)

Bill Buxton, Microsoft Research, 4/7/11 - Includes lots of pictures, links to videos, and more information about what might be the first touch-screen.  Also see Bill Buxton's companion website, Multi-Touch Systems that I have Known and Loved, updated on 3/21/11.  Bill Buxton knows all (almost!)


"The MTbiggie uses the "Front Diffused Illumination" multitouch technique, with ambient infrared light and a DIY infrared webcam. The MTbiggie is similar to the MTmini, but includes a projected image and infrared webcam (rather than a normal webcam)...The MTbiggie isn’t the most stable and robust setup, but it is the easiest to build. To see other methods of building more stable multitouch displays, view the full multitouch display list." -Seth Sandler

(Also check out NodeBeat, a multi-touch music/audio sequencer/generator app by Seth Sandler and Justin Windle)

IntuiLab, 4/13/11
"IntuiLab, a global leader in surface computing software applications, today announced support for the revolutionary Microsoft Kinect device across its full line of IntuiFace products and solutions including IntuiFace Presentation and IntuiFace Commerce...Microsoft Kinect brings distant gesture control to interactive solutions. These gesture controls allow users to interact with displayed digital assets from a distance at their own pace and path – for example, browsing through a large quantity of products in a store catalog or manipulating 3D models (such as a mobile phone) – all without having to actually touch the screen..."  -IntuiLab (Take a look at the IntuiLab team- an interactive page!)




Sparkon:  Videos and links related to multi-touch and gesture-based applications



Official Kinect SDK to be Open Source, Josh Blake, Deconstructing the NUI, 4/18/11  
(This bit of news excited me, but don't get your hopes up. If anyone knows what will happen with the Kinect SDK, please leave a comment.)
"Update 4/18 7:34pm: Mary Jo Foley picks up this story, but the Microsoft spokesperson she talked to denied that the Kinect SDK will be open source. As she notes, Microsoft has pulled 180’s before regarding Kinect. After spokespeople initially were hostile to the idea of Kinect hacking, Xbox executives later embraced the idea that people are using Kinect for non-gaming purposes on the PC. Let’s hope Microsoft stays open to this idea." -Josh Blake

Kenrick Kin, Tom Miller, Bjoern Bollensdorff, Tony DeRose, Bjoern Hartmann, Maneesh Agrawala (Pixar Online Library)

Flight Race Game on 3DFeel, lm3Labs, 4/18/11


JazzMutant Lemur Version 2: "The only multi-touch and modular controller for sequencers, synthesizers, virtual instruments, vjing and lights, now even better."


Harry van der Veen's Multitouch Blog (NUITEQ)


Stantum "Unlimited Multi-Touch" Latest News

At Immersive Labs, Ads Watch Who Looks at Them Amy Lee, Huffington Post, 4/26/11 

Immersive Labs

Hard Rock Cafe International Using NextWindow Touch Screens:  "Rock Wall Solo displays enhance music lovers' experience in Seattle, Dallas, Detroit and Berlin" 4/12/11 (Full press release pdf)
Music on Touch Screens (NextWindow)

Razorfish: Thoughts on MIX 11, James Ashley, Razorfish Blog, 4/20/11  Also see: Razorfish Lab's Prototypes




"The multitouch microscope brings new dimensions into teaching and research. Researchers at the Institute for Molecular Medicine Finland (FIMM) and Multitouch Ltd have created a hand and finger gesture controlled microscope. The method is a combination between two technologies: web-based virtual microscopy and a giant size multitouch display."
"The result is an entirely new way of performing microscopy: by touching a table- or wall-sized screen the user can navigate and zoom within a microscope sample in the same way as in a conventional microscope. Using the touch control it is possible to move from the natural size of the sample to a 1000-fold magnification, at which cells and even subcellular details can be seen."  -Multitouchfi  Also see the Multitouch website.



Big Size Multitouch Display Turned into a Microscope, Microscopy-News, 3/28/11
Mac OS X 10.7 Lion: new multi-touch gestures, Dock integration for Exposé, Launchpad, Mission Control, AppleInsider, 4/14/11


Vectorform App featured in Royal Caribbean's Video Promotion: James Brolin, Dean Cain get hands-on with Vectorform app Alison Weber, Vectorform Blog, 3/3/11


3M Touch Systems' YouTube Channel

Social Mirror 3D Gestural Display, Now Using Kinect:  Snibbe Interactive