Friday, March 8, 2013

WebWise - Multitouch Collaborative Computing & Other HCI Delights

While most Maryland schools and libraries closed for "snow" on Wednesday, March 6, 2013, I had the chance to attend Day 1 of the WebWise Conference in Baltimore City.  This was my first time at this IMLS-sponsored annual conference, which brings together people representing the perspectives of libraries, archives, and museums.  What is really fantastic is that it allows attendees to draw from the experiences of these related fields and think about new ways services and technologies might be implemented elsewhere.  I attended workshops and sessions on Wednesday that I wouldn't have had the time or the natural inclination to attend at a larger conference like ALA.
 
Here I'm sharing some thoughts about the first workshop I attended, Multitouch Collaborative Computing & Other HCI Delights, and I'll follow up later with posts about the other sessions.  The presenter was Jim Spadaccini from Ideum, a company that makes touch screens for public places.  Over the past five years or so these have most often been installed in museums, but I can see great potential for these kinds of interfaces in libraries in the future.
 
Jim first reflected on the opening night of a new exhibit at the New Mexico Museum of Natural History in 2007.  They had installed a 24-inch display that allowed visitors to zoom in and out on dinosaur fossils.  It quickly became apparent that visitor expectations were changing, driven by the recent release of the iPhone.  One visitor touched the screen, and when it didn't respond the way she expected, she left.  The way people interact with their personal touch screens had directly shaped their expectations for digital signage, which they now expect to respond in a similar way.
 

The nature of large-scale multitouch installations is very complicated.  There are only about four or five intuitive gestures; everything else has to be learned.  Once you have multitouch, you need to expect that it will be a multi-user experience as well.  You can't make the exhibit display just a "giant ol' iPhone"; people will fight over it if it's designed for one user.  Designers must recognize that people will approach it from all different directions.  All kinds of things happen in these community spaces, and exhibits should plan to allow for the seven characteristics of family-friendly exhibits identified in a 1998 PISEC study.
 

There have been two key technology trends in museums over the past twenty years, and unfortunately both are isolating experiences that focus on the single user.  One is the audio tour.  Many families and groups of friends would prefer not to participate.  The single-user kiosk is the other trend.  These don't naturally cater to the interests of groups.
 

Jim showed a photo of a 7-foot round multitouch wall created for the Monterey Bay Aquarium using HD projections.  These kinds of multitouch walls and tables encourage multiple users to interact simultaneously.  He showed another example of a display that brought together Flickr photos, Google Maps, and NOAA photos of the Gulf oil spill.  Each user was able to touch, move, rotate, and expand images at the same time and use an on-screen navigational tool called the orb.  Designing distributed media controls like this is very tricky.
 

Recently the largest multitouch table in the world was installed at the Museum of Science and Industry in Chicago.  It's a 100" table.  See photo to the right.  I'm definitely going to check this out this summer!
 
There have been 35 years of experience designing single-user interfaces, but multitouch and multi-user is totally new, with its own set of challenges.  The ideal is to allow for experiences that can be either open-ended or deeper for groups, depending on their situation (the time and patience they have available).  Developing software experiences that encourage collaboration and minimize conflict is the key.  This is a break from 35 years of the traditional graphical user interface.
 

Gesture recognition is the newest kind of interface, which eliminates the need for touch at all (and the fingerprint-marked screens that come with touch).  See gestureml.org.  You can add inertia and other gestures to an application using an XML markup language.  Multitouch and motion recognition in the same space is where the technology is going.  Using GestureWorks Core, you can build gesture-driven apps.  It allows for about 80 3D gestures: one finger, two fingers, three, four, or five fingers, single tap, double tap, and so on.  Developers need to rethink the pinch gesture to work in a 3D environment.  Perhaps in the near future devices will have both a lower and an upper front camera to better capture 3D gestures; after all, it didn't take long for the iPad to gain both front- and back-mounted cameras after the first generation was released.  Gesture-based software should be intelligent enough to discern a visitor's intent.
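To give a flavor of what "adding gestures via XML markup" means, here is a hypothetical sketch of a gesture definition with an inertia filter attached.  The element and attribute names below are illustrative assumptions on my part, not the actual GestureML schema; see gestureml.org for the real syntax:

```
<!-- Hypothetical sketch only: tag and attribute names are illustrative,
     not the actual GestureML schema -->
<Gesture id="two-finger-drag" type="drag">
  <!-- Match: the gesture begins when two touch points appear -->
  <match>
    <initial>
      <point count="2"/>
    </initial>
  </match>
  <!-- Analysis: track the drag displacement along x and y -->
  <analysis>
    <property id="drag_dx"/>
    <property id="drag_dy"/>
  </analysis>
  <!-- Processing: keep the object gliding after release, with friction -->
  <processing>
    <inertia_filter release_inertia="true" friction="0.9"/>
  </processing>
</Gesture>
```

The appeal of this approach is that exhibit designers can tune gestures (or swap the inertia behavior) by editing markup, without recompiling the application.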
 

Where is this all heading?  Augmented reality, motion recognition, RFID: embedded devices will allow people to interact in all kinds of physical spaces.  Motion recognition and other HCI (human-computer interaction) technologies present similar design challenges.  The expectation will be not only that these technologies work as expected but also that they work together.  This will lead to many interesting possibilities.
 

Jim described the NSF grant-funded Open Exhibits project, which provides free multitouch, multi-user software, and the Creating Museum Media for Everyone initiative, which is working to make museum exhibits accessible and inclusive for all people.  He also described Heist, an experimental project still in development.  Heist will let anyone transfer a museum's digital objects onto a personal device to take with them, and it will also allow visitors to use their personal devices with the multi-user interface.  The project is intended to get people thinking, and the team is still working out technical hurdles involving HTML5.
 
An audience member asked whether any libraries were yet using multitouch, multi-user displays.  Jim isn't aware of any, but this is likely because the technology is so new and still costly at this stage.  Costs should come down in the coming years as these interfaces become more common.
 
What do you think?  How could multitouch multiuser tables and walls be used in your library? ...in your community?
 
-Joe
