SenseCam, Collaborative Reflection and Passive Image Capture

This afternoon at COOP2006, I enjoyed a short paper, "Supporting Collaborative Reflection with Passive Image Capture" by Rowanne Fleck and Geraldine Fitzpatrick. Fleck's PhD research is about how a technology such as Microsoft's SenseCam can support reflective thought in different situations (teachers' practices, everyday reflection... learning from experience).

The SenseCam is a digital camera with a light sensor and a temperature sensor that trigger image capture... a passive image-capture tool. Afterwards, you can get a storyboard of the pictures taken.
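For the curious, here is a minimal sketch of how such a sensor-triggered capture loop might work. This is my own illustration with made-up thresholds and hypothetical SensorBoard/Camera stand-ins, not the actual SenseCam firmware:

```python
import random
import time

# Hypothetical stand-ins for the SenseCam hardware; the real firmware is not public.
class SensorBoard:
    def read_light(self) -> float:
        return random.uniform(0, 500)   # fake ambient light level

    def read_temperature(self) -> float:
        return random.uniform(18, 25)   # fake degrees Celsius

class Camera:
    def capture(self) -> bytes:
        return b"<vga frame>"           # placeholder for one wide-angle VGA frame

LIGHT_DELTA = 50.0  # made-up threshold: a jump this large suggests e.g. a new room
TEMP_DELTA = 2.0    # made-up threshold for a temperature transition

def capture_loop(sensors: SensorBoard, camera: Camera, max_frames: int = 10) -> list:
    """Passively capture a frame whenever a sensor reading changes sharply."""
    storyboard = []
    last_light = sensors.read_light()
    last_temp = sensors.read_temperature()
    while len(storyboard) < max_frames:
        light = sensors.read_light()
        temp = sensors.read_temperature()
        # A sudden light or temperature transition triggers a capture,
        # with no action needed from the wearer.
        if abs(light - last_light) > LIGHT_DELTA or abs(temp - last_temp) > TEMP_DELTA:
            storyboard.append(camera.capture())
        last_light, last_temp = light, temp
        time.sleep(1.0)  # the MS description below mentions one sensor sample per second
    return storyboard
```

The list of frames returned here simply stands in for the storyboard mentioned above; the real device presumably does all of this on-board and writes frames to flash memory.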

She ran an experiment in which students went to an arcade to play games while wearing the SenseCam. They played the game and then went back to their HCI class, where they had to discuss some HCI questions. Some groups had the images, others did not (the two experimental conditions). She looked at the "goodness" of the answers and the number of issues raised in discussion.

Results:

- Discussion-led use of images: to ground the conversation (referential communication), as an objective record, to talk about something missed by a partner, or "just in case".
- Image-led discussion: to trigger memory, confirm/disconfirm memory, or reveal something missed at the time ("it's quite useful for getting a look at what you're actually doing, because we did not use those buttons in the game").

Why do I blog this? I am interested in both the study and the tool. I would be super happy to have this sort of tool for my research projects about location-based applications and video games. It would be a nice way to collect traces of the activity that I could then use to go back to the users and discuss with them. Here is how it's described by MS:

SenseCam is a badge-sized wearable camera that captures up to 2000 VGA images per day into 128Mbyte FLASH memory. In addition, sensor data such as movement, light level and temperature is recorded every second.

Sensors trigger a new recording. For example, each time the person walks into a new room, this light change transition is detected and the room image is captured with an ultra wide angle or fish-eye lens. (...) The sensor data (motion, light, temperature, and near infrared images) is recorded for later correlation with other user data, for example in the MyLifeBits system. (...) MyLifeBits will allow the large number of images generated daily to be easily searched and accessed. Future SenseCams will also capture audio and possibly heart rate or other physiological data.
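A quick back-of-the-envelope check on those numbers: 2000 images in 128 MB leaves roughly 128 × 1024 / 2000 ≈ 65 KB per frame, which seems plausible for JPEG-compressed VGA (640×480) images.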