Creating the meSchcases

The meSchcases are a series of augmented, networked display cases created for use at Museon. They were developed as part of a collaboration between Museon and SHU for deployment during the Ecsite 2014 conference. This post covers the implementation and deployment of the meSchcases.

Creating the Cases

Details of the initial inspiration and brainstorming process were discussed in a previous blog post. From this brainstorming session we arrived at the idea of augmented showcases that would allow us to crowdsource a museum exhibition. Sensors mounted within each case would detect visitors approaching to look at the object inside. By monitoring how close visitors came and how long they stayed, we could determine their level of interest in the object. A number of cases could then be set up together to create a type of popularity contest: monitor levels of interest in each object over a set period of time, then replace the least popular object at the end of that period.

To do this we created a new case design, based around the standard showcase used at Museon. Inside the case we placed three infrared distance sensors, which faced out through a slot in the case. These sensors could monitor how close visitors came to the case. Data from the sensors was gathered by an Arduino Mini and sent via a USB connection to a Raspberry Pi, which processed the data and logged it to a centralised server.
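The post does not reproduce the firmware or the logging code, so the following is only a rough sketch of the Pi-side loop, written in Python with pyserial and requests. The serial line format, the server endpoint, the case ID and the "visitor near" threshold are all assumptions made for illustration.

```python
# Illustrative sketch only; the deployed firmware and logging code are not
# reproduced in the post. Assumes the Arduino Mini prints one comma-separated
# reading per line (e.g. "38,112,95", distances in cm from the three IR
# sensors) and that the central server exposes a hypothetical /log endpoint
# accepting JSON. Requires the pyserial and requests packages.
import time

import requests
import serial

SERIAL_PORT = "/dev/ttyUSB0"                          # Arduino Mini on the Pi's USB port
SERVER_URL = "http://example.local/meschcases/log"    # hypothetical logging endpoint
CASE_ID = "case-1"
NEAR_THRESHOLD_CM = 80                                # assumed cut-off for "someone is looking"


def parse_reading(line):
    """Turn a raw serial line such as '38,112,95' into a list of distances in cm."""
    try:
        return [int(v) for v in line.strip().split(",")]
    except ValueError:
        return None


def main():
    with serial.Serial(SERIAL_PORT, 9600, timeout=2) as port:
        while True:
            raw = port.readline().decode("ascii", errors="ignore")
            reading = parse_reading(raw)
            if not reading:
                continue
            payload = {
                "case": CASE_ID,
                "timestamp": time.time(),
                "distances_cm": reading,
                "visitor_near": min(reading) < NEAR_THRESHOLD_CM,
            }
            try:
                requests.post(SERVER_URL, json=payload, timeout=5)
            except requests.RequestException:
                pass  # network hiccup: drop the sample and keep reading


if __name__ == "__main__":
    main()
```

Dwell time, the other half of the interest measure, can then be derived on the server from runs of consecutive samples in which visitor_near is set.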

Model of augmented showcase.

meSchcase model depicting the projection from within and the slot for the distance sensors.

The Raspberry Pi also performed other useful functions for the cases. By placing a pico-projector within the case we could project content about the object onto the case itself. This allowed the museum to augment the object with content from its repository, including text, images and animations. The Raspberry Pi was used to read content for the current object from the central server and display this content through the projector onto the case. The content was stored in HTML format, making it easy for curators to add or update content in the system. The projector was embedded in a casing designed to stand in the showcase and ensure the correct alignment of content with the case. To allow curators to easily inform the system when the object in a case was changed, and to ensure the correct content was being displayed, we interfaced an NFC tag reader with the Raspberry Pi. A small pocket was created on the side of the projector casing into which the curator could place a laminated card depicting the object now in the showcase. This card contained an NFC tag holding an object ID, which allowed the Raspberry Pi to detect the new object and display the relevant content.
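The post does not say which NFC reader or library was used, so the sketch below hides the actual tag read behind a placeholder and only shows the switching logic: poll for a card, and when a new object ID appears, point a full-screen browser at that object's content page. The content URL, the file-based stand-in for the reader and the kiosk browser command are all assumptions.

```python
# Illustrative sketch only; the post does not name the NFC reader or library
# used, so the tag read is hidden behind read_tag_id(). Here it just reads an
# ID from a text file so the loop can be tried without hardware; in the real
# case it would call the reader's own API (nfcpy is one option for USB readers).
# The content URL scheme and the kiosk browser command are also assumptions.
import subprocess
import time

CONTENT_URL = "http://example.local/meschcases/content.php?object={object_id}"  # hypothetical
TAG_FILE = "/tmp/mesch_tag_id"     # stand-in for the NFC reader
POLL_SECONDS = 2


def read_tag_id():
    """Placeholder for the real NFC read: returns the current object ID, or None."""
    try:
        with open(TAG_FILE) as f:
            return f.read().strip() or None
    except OSError:
        return None


def main():
    current_id = None
    browser = None
    while True:
        tag_id = read_tag_id()
        if tag_id and tag_id != current_id:
            # The curator has swapped the laminated card: switch the projected content.
            if browser is not None:
                browser.terminate()
            url = CONTENT_URL.format(object_id=tag_id)
            browser = subprocess.Popen(["chromium-browser", "--kiosk", url])
            current_id = tag_id
        time.sleep(POLL_SECONDS)


if __name__ == "__main__":
    main()
```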

Content Engine

The content engine for the showcases was implemented in PHP, jQuery and HTML. The Raspberry Pi requests content from the central server for the current object, and this content is delivered as a dynamic HTML page. The page has three sections: content from the museum’s perspective, content from the object’s perspective and content from the visitors’ perspective. The museum content is static HTML that presents known information about the object itself. The object’s perspective presents content created by the curators, but written as though the object is speaking about itself; this includes text, images and animations. The visitors’ content is gathered from tweets about the object posted by visitors: each object has a unique hashtag that is displayed on the projection. This allows visitors to offer opinions on the object and provides an additional measure of visitor interest.
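The engine itself was server-side PHP and is not reproduced here. As an illustration of the visitors’-perspective section, the Python sketch below shows one way tweets carrying an object’s hashtag could be fetched using the Twitter search API as it existed in 2014; the bearer token and the object-to-hashtag mapping are placeholders.

```python
# Illustrative sketch only; the real engine was server-side PHP. This shows one
# way the visitors' section could be populated from tweets carrying an object's
# hashtag, using the Twitter v1.1 search endpoint that was current in 2014.
# BEARER_TOKEN and the object-to-hashtag mapping are placeholders.
import requests

SEARCH_URL = "https://api.twitter.com/1.1/search/tweets.json"
BEARER_TOKEN = "replace-with-an-application-only-token"
OBJECT_HASHTAGS = {"mammoth-tooth": "#meschmammoth"}   # hypothetical mapping


def recent_tweets(object_id, count=10):
    """Return (author, text) pairs for the most recent tweets about an object."""
    response = requests.get(
        SEARCH_URL,
        params={
            "q": OBJECT_HASHTAGS[object_id],
            "count": count,
            "result_type": "recent",
        },
        headers={"Authorization": "Bearer " + BEARER_TOKEN},
        timeout=10,
    )
    response.raise_for_status()
    return [(s["user"]["screen_name"], s["text"]) for s in response.json()["statuses"]]


if __name__ == "__main__":
    for author, text in recent_tweets("mammoth-tooth"):
        print("@{}: {}".format(author, text))
```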

Four different content types related to the object are projected from within the meSchcase. The last item is a live feed from Twitter, where interaction with and from the public becomes apparent.

Deployment

We built four showcases for use in Museon during Ecsite. These showcases were set up side-by-side in a section of Museon’s permanent exhibition before the beginning of the Ecsite pre-conference workshops and were left in place for the duration of the conference.

Interactive display cases

The four interactive display cases as part of the permanent exhibition of Museon in The Hague.

Every three hours the least popular object was replaced with a new object from Museon’s storage area. In total 15 objects were selected, had content created for them, and were rotated through the cases. Occasionally the objects were rotated between showcases, in case position had an effect on their popularity. Data gathered during the conference showed that some objects were much more popular than others, irrespective of which case they were placed in. One object (actually a set of objects) maintained its place in the showcases for the entire duration of the event, consistently scoring as the first or second most popular.
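As a concrete illustration of the rotation rule, the sketch below ranks objects by accumulated time with a visitor detected nearby and picks the one to swap out. The actual scoring used in the deployment, which combined proximity and dwell time, is not spelled out in the post, so the log format and weighting here are assumptions.

```python
# Illustrative sketch only; the post says popularity was judged from how close
# visitors came and how long they stayed, but the exact scoring is not given.
# Here each logged sample with a visitor detected nearby adds its duration to
# the object currently in that case, and the object with the lowest total over
# the three-hour window is the one to swap out.
from collections import defaultdict


def least_popular(log_entries):
    """log_entries: iterable of dicts like
    {"object": "fossil", "visitor_near": True, "sample_seconds": 1.0}."""
    dwell = defaultdict(float)
    for entry in log_entries:
        if entry["visitor_near"]:
            dwell[entry["object"]] += entry["sample_seconds"]
        else:
            dwell.setdefault(entry["object"], 0.0)
    # Rank objects by accumulated "visitor nearby" time; the lowest is replaced.
    return min(dwell, key=dwell.get)


if __name__ == "__main__":
    window = [
        {"object": "fossil", "visitor_near": True, "sample_seconds": 1.0},
        {"object": "mask", "visitor_near": False, "sample_seconds": 1.0},
        {"object": "mask", "visitor_near": True, "sample_seconds": 1.0},
        {"object": "fossil", "visitor_near": True, "sample_seconds": 1.0},
    ]
    print(least_popular(window))   # -> "mask", the least attended object
```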

What next?

In general the meSchcases were a success. Museon’s curators liked them, a number of museum professionals attending the conference expressed an interest in them, and we have had a request to build some more for an exhibition taking place next year. The deployment was not without issues, though. These cases were developed on a very short timeframe and as such there are some improvements to be made. In particular, the showcases need to be more fully integrated into Museon’s existing network than was possible in the available time. Other issues regarding data gathering, system startup and shutdown, and network reliability were also raised, and although it was possible to deal with many of these in situ, more thought should be put into them for future iterations. Overall, though, I think we were pretty happy with how this went, as indicated by this photo of some of the system’s creators, smiling alongside the cases in place at Museon:

The meSch team from SHU who created the cases (from left to right): Daniela Petrelli, Nick Dulake and Mark Marshall.