
The Centre for Conscious Design

Breaking Free

How will architecture fare in the post-truth era?

2016 was the year post-truth bounced into the mainstream. For many, it marked a stark realisation of how echo chambers, created and perpetuated by algorithms personalising data flows, are leaving us ‘trapped in a filter bubble’1. Hearing only the opinions of those in our own circles, we find it all too easy to dismiss the facts and views of others. Filippo Menczer — in a recent feature in Scientific American — has pithily expressed this as an ‘algorithmic bias toward engagement over truth’2.

Architecture has not been spared — in the Dec16/Jan17 issue of The Architectural Review, Stephen Parnell commented on how architectural criticism is ‘on the verge of becoming post-truth’3, whilst Christine Murray took things into the realm of the post-human, weaving the algorithmic shortcomings of artificial intelligence into words of caution for the architectural industry:

While artificial intelligence may learn to design buildings for a site, it can’t make moral decisions, predict the future or perform jazz; it lacks empathy and complex understanding. While firmness, commodity and delight may be reduced to an algorithm, architecture that serves people and place for 100 years cannot.4

Contemplating the rise of the architecture machine some forty years earlier, Nicholas Negroponte even imagined what we might today recognise as echo chambers (albeit of an architectural sort). With the machine at the mercy of designers’ ‘likes’ — somewhat akin to today’s social media conventions — he judiciously cautioned that ‘bad architecture could escalate as easily as good design’5.

When thinking about high technology in the built environment, how might we enable truth to prevail over engagement, and complex understandings to develop? Algorithms will become more sophisticated; that is a certainty. A deeper, more critical understanding of our interactions with machines seems ever more important in this post-truth turn and looming post-human future. We might conceive of such symbiosis as ‘augmented intelligence’ rather than a direct surrendering to AI: part of our response, therefore, ought to lie in knowing how to work together.

The ‘design of the device’, to borrow Negroponte’s phrasing, is perhaps most crucial for thinking about our use of technology. Disrupting circularity in order to break free of filter bubbles is one such avenue. Algorithms could, quite imaginably, be charged with sourcing such serendipitous information encounters. In the words of Negroponte: ‘Someday machines will go to libraries to read and learn and laugh and will drive cities to experience and to observe the world. Such mechanical partners must badger us to respond to relevant information, as defined by evolution and by context, that would otherwise be overlooked.’6 The question is: how can this be achieved in a way that broadens horizons but is not completely left-field? Further still, if we conceive of AI as augmentation, whose responsibility is it? Does the duty of care lie in the hands of programmers, the minds behind the algorithm? Or should the responsibility rest on our shoulders — one half of the dialogue — to recognise these limits and reach out beyond the bubble edge as we engage in conversation with machines?
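
By way of a thought experiment only, the short Python sketch below illustrates one way such serendipity might be designed into a feed-building algorithm: rather than ranking purely by similarity to a user’s existing interests, a fixed share of the feed is reserved for items drawn from outside that profile. Everything in it, from the toy catalogue to the serendipity_ratio parameter, is an illustrative assumption rather than any real system’s method.

```python
# A hypothetical sketch of 'designing in serendipity': a feed that deliberately
# mixes in items far from a user's usual interests, rather than optimising for
# engagement alone. All names, data and weights are illustrative assumptions.
import random


def similarity(item_topics: set, user_interests: set) -> float:
    """Jaccard overlap between an item's topics and the user's interest profile."""
    if not item_topics or not user_interests:
        return 0.0
    return len(item_topics & user_interests) / len(item_topics | user_interests)


def build_feed(items, user_interests, size=10, serendipity_ratio=0.3, seed=None):
    """Fill most of the feed with familiar items, but reserve a fixed share for
    items that sit outside the user's bubble."""
    rng = random.Random(seed)
    ranked = sorted(items, key=lambda it: similarity(it["topics"], user_interests),
                    reverse=True)
    n_serendipity = int(size * serendipity_ratio)
    familiar = ranked[:size - n_serendipity]
    # Candidates for serendipitous encounters: the items *least* similar to the
    # user's profile, so the bubble's edge is actually crossed.
    outside = sorted((it for it in ranked if it not in familiar),
                     key=lambda it: similarity(it["topics"], user_interests))
    serendipitous = outside[:n_serendipity]
    feed = familiar + serendipitous
    rng.shuffle(feed)  # so the out-of-bubble items are not visibly quarantined
    return feed


if __name__ == "__main__":
    interests = {"architecture", "urbanism", "design"}
    catalogue = [
        {"title": "Parametric facades", "topics": {"architecture", "design"}},
        {"title": "Zoning reform debates", "topics": {"urbanism", "policy"}},
        {"title": "Jazz improvisation basics", "topics": {"music"}},
        {"title": "Soil science for allotments", "topics": {"gardening", "ecology"}},
        {"title": "Timber construction detailing", "topics": {"architecture"}},
        {"title": "Community energy co-ops", "topics": {"energy", "policy"}},
    ]
    for item in build_feed(catalogue, interests, size=4, serendipity_ratio=0.5, seed=1):
        print(item["title"])
```

Even in so small a sketch, the question of responsibility resurfaces: someone still decides what serendipity_ratio should be, and whether the reader of the feed ever gets to see or change it.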

With the boundaries set by a meshing of choice and algorithm, it is arguably then up to us to decide whether or not we wish to breach them. But technology, ever more ubiquitous, is shaping the built environment in ways that most of us do not even comprehend. The invisibility of algorithms, not least the design choices hidden within them, makes breaking free difficult. We might take inspiration from the work of Umbrellium: VoiceOver, a project developed with the explicit aim of overcoming the domination of the ‘voice from above’ in decision-making algorithms7. Both hyperlocal and hyperpublic, it provides an interactive chain of light and sound that anyone in the local village can see and listen in to. Designed and deployed in collaboration with local residents, it was conceived…

Not to make decisions better (whatever that means), but to make them collectively; not to remove inefficiency and complexity, or iron out wrinkles and seams, but to embrace that complexity and build value from the unpredictability, serendipity and creativity that is found in messy situations.8

The work of Umbrellium speaks to the design of participatory platforms to overcome feelings of citizen disenfranchisement, capturing the emergent need to ‘encode participatory, consciously interactive parameters into our algorithms, ones that also value our communality and connectivity’9. How can exposure to these alternative, collective circles of thought lift the lid on echo chambers? Can we creatively make visible the invisible edges of filter bubbles? In an age of information overload, can we find value and be empowered to keep pushing the boundaries of the bubble and the algorithm?

Designing in serendipity and creating algorithms that are more visible and comprehensible might start to break down the boundaries of the filter bubble. Understanding AI as a relational dialogue, and the importance of our own voices in these conversations, is the key to enabling truth to prevail over engagement and more complex understandings to unfold. As we enter this era of post-truth architecture, we must not forget: ‘Buildings may be constructed on the building site, but architecture is constructed in the discourse’10.


References

1  Hosanagar, K. Blame the echo chamber on Facebook. But blame yourself, too. Wired (25 November 2016). https://www.wired.com/2016/11/facebook-echo-chamber/
2  Menczer, F. Fake online news spreads through social echo chambers. Scientific American (28 November 2016). https://www.scientificamerican.com/article/fake-online-news-spreads-through-social-echo-chambers/
3  Parnell, S. Post-truth architecture. The Architectural Review, 240 (1437), 6-12 (2016).
4  Murray, C. Looking forward, we committed to doing things differently – which paradoxically, is what we’ve always done. The Architectural Review, 240 (1437), 169-170 (2016).
5  Negroponte, N. The Architecture Machine. MIT Press (1970).
6  Negroponte, N. The Architecture Machine. MIT Press (1970).
7  Haque, U. VoiceOver: Citizen empowerment through cultural infrastructure. In: L. Bullivant (ed.), 4D Hyperlocal: A Cultural Toolkit for the Open-Source City (pp. 86-91). Wiley (2017).
8  Haque, U. VoiceOver: Citizen empowerment through cultural infrastructure. In: L. Bullivant (ed.), 4D Hyperlocal: A Cultural Toolkit for the Open-Source City (pp. 86-91). Wiley (2017).
9  Haque, U. VoiceOver: Citizen empowerment through cultural infrastructure. In: L. Bullivant (ed.), 4D Hyperlocal: A Cultural Toolkit for the Open-Source City (pp. 86-91). Wiley (2017).
10  Parnell, S. Post-truth architecture. The Architectural Review, 240 (1437), 6-12 (2016).