Bop! (Building Awareness for Enhanced Workplace Performance) was a DTI-funded project carried out by the Aware Building Group, whose members were Arup, BT, Imperial College, Central Saint Martins, Brunel University, ArtificialTourism, MaoWorks, AP Futures and Spy. Bop! used an autonomic wireless sensor network to supply environmental and space-usage data, and to give inhabitants the opportunity to register their subjective states in real time. The resulting data streams fed into a relational database, and lightweight AI was used to recognise patterns in the relational data and to search for anomalies. Its immediate purpose was to act as a tool for post-occupancy evaluation that is an automated 'movie' rather than a snapshot, and which feeds back immediately to inhabitants, but it also has the potential to act as a platform on which to build high-level interaction. One of my tasks in the project was to provide data sonification for feedback; here is an example.
The sound module for the prototype had three sounds running. In the example they're running on clocks with a fairly rapid cycle (under two minutes), so you can hear the whole range of each sound. In the real world they would change much more slowly and less predictably. The three sounds come in order, with gaps before sounds 2 and 3.
Sound 1 connects to light level: the sound 'warms up', getting richer, as the light level increases.
Sound 2 connects to temperature and works inversely, getting 'colder' as the temperature rises; the rationale is that this could help people feel cooler.
Sound 3 connects to people's mood as expressed through the physical interface: the sound becomes more 'in tune' with the other sounds as people are more 'in harmony' with the space and environmental conditions.
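The three mappings could be sketched as simple parameter functions. This is a hypothetical illustration rather than the project's actual code: the function names, the 0.0–1.0 normalisation of sensor readings, and the synthesis parameters (harmonic count, timbral warmth, detune in cents) are all my assumptions about how such mappings might look.

```python
def light_to_richness(light):
    """Sound 1: more light -> richer, 'warmer' timbre (illustrative harmonic count, 1..8)."""
    return 1 + round(light * 7)

def temperature_to_warmth(temp):
    """Sound 2: inverse mapping - higher temperature -> 'colder' sound (warmth 0.0..1.0)."""
    return 1.0 - temp

def mood_to_detune(mood):
    """Sound 3: more 'in harmony' -> less detuning against the other two sounds (cents)."""
    max_detune_cents = 50.0
    return (1.0 - mood) * max_detune_cents

# Example: bright room, warm day, middling mood
print(light_to_richness(1.0))      # richest timbre
print(temperature_to_warmth(0.75)) # noticeably 'cold' sound
print(mood_to_detune(0.5))         # partly out of tune
```

In a real deployment these parameters would drive the synthesis engine directly, with the slow, unpredictable changes described above rather than the rapid demo cycle.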