Can architecture react to your feelings?
I really enjoyed the way this project was demonstrated. It is amazing to see a physical structure react to your presence, and in a way that reflects how 'it' thinks you are feeling. This is yet another example of 'slow' technology that invites engagement and learning through use. Affective computing is a relatively new focus within ubiquitous computing that attends to users' feelings and attitudes while suggesting that computers have feelings too, or at least the capacity to understand ours. Sensibility Space is a large physical structure connected to a computer that assesses facial expressions for emotion. Once the computer matches a facial expression to an emotion in its database, it rearranges the architecture to respond directly to the user's affect, using colour and movement as visual indicators. The piece is still under development, but it is interesting to ask whether a truly affective machine would respond to us or simply reproduce or identify our feelings. What would such a response have the capacity to tell us about ourselves?
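To make the pipeline concrete, here is a minimal sketch of the kind of emotion-to-architecture mapping described above. All names and mappings are hypothetical illustrations, not the project's actual code; the real piece would receive emotion labels from a facial-expression classifier.

```python
# Hypothetical sketch of the emotion-to-architecture loop: a classifier
# produces an emotion label, which the structure translates into colour
# and movement. None of these names come from Sensibility Space itself.

ARCHITECTURE_RESPONSES = {
    # emotion label -> (colour, movement) the structure might display
    "happy":   ("warm yellow", "open panels outward"),
    "sad":     ("dim blue",    "fold panels inward"),
    "angry":   ("pulsing red", "retract panels"),
    "neutral": ("soft white",  "hold current position"),
}

def respond_to_emotion(detected_emotion: str) -> tuple[str, str]:
    """Map a detected facial emotion to a colour and movement response.

    Unrecognised labels fall back to the neutral response, so the piece
    still behaves gracefully when classification fails.
    """
    return ARCHITECTURE_RESPONSES.get(
        detected_emotion, ARCHITECTURE_RESPONSES["neutral"]
    )

colour, movement = respond_to_emotion("happy")
print(f"{colour} -> {movement}")
```

The interesting design question the post raises lives in this table: a fixed lookup only mirrors the detected feeling, whereas a genuinely affective response would presumably do something less predictable with it.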