The OS of Architecture

by Mirco Becker

The full version of this article was published in SAC Journal 1. It discusses how postgraduate programmes like SAC have contributed over the last decade to an inclusive environment of architectural production that spans different offices, schools and even disciplines. In its scale and use, this environment can be regarded as an operating system on which architectural design runs. It is on the verge of becoming the third pillar of architecture besides built work and theory. It does not promote any style or agenda. It is here to stay and evolve.


“Like air and drinking water, being digital will be noticed only by its absence, not its presence.” – Nicholas Negroponte, 1998

Undeniably, we no longer have to promote digital techniques. They have arrived in everyday life and are here to stay. The current discourse on the post-digital emphasises this point. The notion of the post-digital can be summed up as the state of living in a world where the digital is accepted and commonplace. Nicholas Negroponte was right when he claimed that the digital revolution had already happened before the turn of the millennium.

Of all the inventions in the digital realm, it is the internet, with its ability to network minds and hardware, that has had the biggest impact so far. Today we experience a shift in which singular hardware devices lose their importance to an ecosystem of synced and networked objects, exhibiting the tendency of the digital to link processes. The very same tendencies are also the basis for new forms of collective co-creation.

What technology wants

One could marvel at or be suspicious of digital technologies, but there is simply no going back. Their effects keep unfolding, driven by the accumulation of data, the processing of information and the consolidation of knowledge. It is a laborious and creative effort made by individuals and collectives. The discoveries and inventions made are often already embedded in the very nature of technology. This is Kevin Kelly's thesis, discussed in What Technology Wants, where he argues that technology develops along an inherent trajectory. Kelly points out that innovations like the internet are not chance discoveries or strokes of genius but were inevitable after the discovery and application of electricity by Bell, Edison and others in the late 19th century.

This goes for every technology. Our close relationship with technology is bi-directional: as much as we have an urge to invent things, technology offers itself up to be innovated on. The point is that anyone who wants to partake in technological innovation has to find a means of accessing one of these inherent trajectories. Now, since the act of designing can be defined as the state of being open to the possibilities inherent in the subject at hand, design and innovation are two sides of the same coin. Only by uncovering and understanding these processes can one unleash their generative potential.

Operating systems

Along with the digital revolution an entirely new layer of technology was introduced: the operating system, a software layer that binds and manages the underlying hardware and provides the interfaces for applications to run on top of it. These systems are mega technologies in themselves and probably the largest systems created by humans. The UNIX system alone, with all the sub-systems that evolved out of it, such as Linux, Mac OS X and Google Chromium, is vivid proof of this unprecedented scale in technology.

This was only made possible by the digital allowing collaboration at a large scale and providing the means to consolidate knowledge. The phenomenon is not exclusive to the traditional notion of operating systems managing low-level hardware and software processes; it includes new forms of collective digital creation. Wikipedia and Python illustrate this communal effort of knowledge consolidation, compression and abstraction. The development has also had an impact on the creative disciplines and art, first on digital audio and then on digital imagery, causing fundamental changes to how we create, distribute and consume these media.

The OS of Architecture

Since the early 1990s a parallel development to the one described above has taken place in architecture, resulting in the discipline's very own operating system. This operating system consists of methods, concepts, processes and technologies – a framework in which contemporary architectural design happens. It includes CAD systems, script libraries, mathematical and geometrical concepts, bidirectional interfaces to engineering analysis, and links to prototyping and fabrication technology. In contrast to the OS of computing devices, the OS of Architecture has not been masterminded or consciously led by a single individual or corporation but rather created through collective effort. In this ongoing development new “features” get prototyped, tested, integrated or rejected. For the first time in the history of architecture there is an entity that accumulates design knowledge outside the array of buildings, wisdom and theory. It does not even propagate a style. I would argue that, besides discourse and built work, the OS of Architecture has become a third pillar of the discipline where meaningful contributions to the larger architectural undertaking can be made.

The OS of Architecture is anything but a set of tools; it should not simply be confused with a traditional palette of pen, ruler, compass, French curves and spline weights. The main difference from a collection of tools is that all of its features and elements are hosted in the same medium, the digital. There they can evolve, hybridise and interlink – much like in an ecosystem.

Since the Pandora's box of digital design was opened in 1992 with the Paperless Studio at Columbia GSAPP, there have been some very successful academic programmes – be it the AADRL in London, the ICD at TU Stuttgart or the DFABARCH at ETH Zurich – that have built upon the work and accomplishments at GSAPP. What these programmes have in common is that they lack a fixed method and curriculum of design education. Instead they have been built around the notion of design research, where the analysis of a found phenomenon is not primarily used to argue for a single designed object but to construct a system with the generative capacity to provide a range of possible solutions. More importantly, and concomitant with the multiplicity of solutions, it became widely accepted to work with design iterations by which the design candidates' performance could be tested in specific environments. These feedback loops are a perfect example of a first-order cybernetic model. At a larger scale it is exactly the same model that laid the foundations for an architectural OS.
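
To make the point concrete, such a feedback loop can be sketched in a few lines of code: candidates are generated from a current design state, measured against a performance criterion, and the best result is fed back as the seed of the next iteration. The design parameter (a notional façade depth) and the performance measure below are illustrative assumptions, not taken from any of the programmes mentioned above.

```python
# A minimal sketch of a generate-evaluate feedback loop (first-order cybernetics):
# the measured performance of each candidate feeds back into the next iteration.
# The design parameter and the performance measure are hypothetical placeholders.
import random

def generate_variants(depth, n=10):
    # Perturb the current design state to obtain a population of candidates.
    return [depth + random.uniform(-0.5, 0.5) for _ in range(n)]

def performance(depth):
    # Stand-in environmental test: reward depths close to a notional optimum of 0.8 m.
    return -abs(depth - 0.8)

depth = 0.3                                    # initial design state
for _ in range(20):
    candidates = generate_variants(depth) + [depth]
    depth = max(candidates, key=performance)   # feedback: best candidate seeds the next loop

print(round(depth, 2))                         # typically settles close to 0.8
```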

The OS was created in a collective effort, initiated at academic programmes and soon after pursued in architectural practice, software development and new forms of publishing. This is still an ongoing process. Where previous periods in architecture were often defined by a vocabulary and repertoire of style, the current model works on the accumulation and consolidation of processes.

In the mid-1990s animation software was used experimentally in some graduate programmes, like the Paperless Studio at Columbia GSAPP, and by pioneers in practice. The reason it seemed more interesting than general-purpose 3D CAD packages was that it allowed designers to set things in motion through inter-dependencies, controlling a relatively complex outcome via a chain of cascading dependencies. Several mechanisms catered for this functionality, but at the core was a directed acyclic graph – a computational concept in which each object computes its state from object-specific input parameters received from other objects. It was clear that any further development could not do without improving on that computational concept. And so they came: parametric design applications like Generative Components, Grasshopper™ and Design Script, which utilise and expose the underlying graph structure as the main interface of design. As these applications evolved, they also accumulated design knowledge. Navigating a freeform surface and placing architectural elements onto it in a meaningful manner was once a technical challenge that could only be mastered by scripting or programming the solution. This expertise got consolidated and now sits in the graphical user interface (GUI) of many design applications. Along with the sophistication of the applications' core functionality came the ability to link internal processes to external applications, analytical methods and manufacturing devices.
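
The dependency-graph idea can be illustrated with a minimal sketch: each node derives its value from its upstream nodes, so editing a parameter propagates through the whole chain. The class and the circular-column example below are hypothetical illustrations, not the internals of any of the applications named above.

```python
# A minimal sketch of the directed acyclic graph behind parametric design tools:
# every node computes its state from the states of its input nodes.
import math

class Node:
    def __init__(self, name, func, inputs=None):
        self.name = name
        self.func = func            # how this node derives its value
        self.inputs = inputs or []  # upstream nodes it depends on

    def evaluate(self):
        # Because the graph is acyclic, recursion always terminates at leaf parameters.
        return self.func(*[n.evaluate() for n in self.inputs])

# Leaf parameters: radius of a circular plan and number of columns on it.
radius = Node("radius", lambda: 12.0)
count = Node("count", lambda: 8)

# Derived node: column positions, recomputed whenever an upstream parameter changes.
positions = Node(
    "positions",
    lambda r, n: [(r * math.cos(2 * math.pi * i / n),
                   r * math.sin(2 * math.pi * i / n)) for i in range(n)],
    inputs=[radius, count],
)

print(len(positions.evaluate()))  # 8 points on a circle of radius 12
radius.func = lambda: 20.0        # edit an upstream parameter ...
print(positions.evaluate()[0])    # ... and the dependent geometry updates
```

Production systems typically add caching and change notification on top of such an evaluation scheme; the sketch only shows the core idea of deriving state through the graph.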

The OS of Architecture is the sum of these developments: the techniques, methods, processes and most importantly the interconnection between all of them.
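
As an aside, the kind of consolidated expertise mentioned above – placing elements meaningfully on a freeform surface – can be hinted at with a short script that walks a surface by its UV parameterisation. The saddle-like surface and the grid density are illustrative assumptions, not taken from any particular project or application.

```python
# A hedged sketch of placing elements on a freeform surface via its UV parameters.
# The surface definition and grid density are hypothetical.
import math

def surface(u, v):
    # A simple doubly curved, saddle-like surface over the unit UV square.
    x, y = 10.0 * u, 10.0 * v
    z = 2.0 * math.sin(math.pi * u) * math.cos(math.pi * v)
    return (x, y, z)

def place_elements(nu=5, nv=5):
    # Walk the surface in UV space and return one anchor point per grid node.
    return [surface(i / (nu - 1), j / (nv - 1))
            for i in range(nu) for j in range(nv)]

for point in place_elements(3, 3):
    print(tuple(round(c, 2) for c in point))
```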

Style

A broad range of architectural agendas and paradigms run on the OS of Architecture. Its unifying nature encompasses positions ranging from ‘Parametricism’ (Patrik Schumacher) and ‘digital morphogenesis’ (Michael Hensel, Achim Menges) to biological paradigms (Alisa Andrasek, Francois Roche) and exuberant formalism (Hernan Diaz Alonso).

The fact that two such different practices as Foster & Partners (F&P) and Zaha Hadid Architects (ZHA) can run on the OS of Architecture proves that something bigger than style is at play. Both of these offices have altered their method of design dramatically over the last 10 years, embracing parametric techniques, programmatic problem solving and form generation as well as performance-driven design.

So, despite their historic differences, they have a lot more in common today than one might expect at first glance. This commonality goes beyond simply employing the same technology of production; it is well demonstrated by the fact that the shared technology is closely associated with an inclusive discourse on architectural geometry, design scripting culture, digital craft and robotic fabrication. Practices apparently as different from one another as F&P and ZHA might even draw on the same sediment of architectural ideas, such as the articulated single surface, obviously dressed up differently in the resulting buildings. Thus, the commonality notwithstanding, the actual artefacts of both practices remain distinct and in line with their respective agendas.

Patrik Schumacher has argued that we have entered a new era of architectural style, one that is not transitional but here to last. However, despite Schumacher advocating an emergent parametric style, the comparison of F&P and ZHA across their shared technology and expertise shows that style is not at the core of the new era. Rather, the new era comprises a new layer of technology and, following Kevin Kelly's argument, it does not require a manifesto since it is driven along its own inherent trajectory of development.

The third pillar

Much like the digital has become ubiquitous, the OS of Architecture is all-pervasive. Even those who oppose its most extreme and stylistic design results cannot withdraw from it. Unless one steps out of it completely in pursuit of a manual arts-and-crafts approach to design, it is very difficult to offer an alternative and relevant methodological model for contemporary architecture. As much as the OS of Architecture can be held responsible for a collapse of distinctions in practice, as illustrated above with F&P and ZHA, the same is not true for the discourse on architecture. The traditional architectural discourse and the discourse on the OS of Architecture are different and, until now, largely separate.

Insofar as the OS of Architecture is a pillar of the discipline, adding to the code base of architecture is as valid a contribution to the architectural endeavour and the development of the discipline as realising buildings or publishing theory. This allows new players to participate in very different ways than before. It holds true for single-handed efforts such as David Rutten's development of Grasshopper™, corporate ventures like Gehry Technologies' Digital Project™, and Gramazio & Kohler's systematic introduction of robots to architecture. Furthermore, a few graduate programmes have emerged as great contributors too, by pulling technology (animation software, subdivision surfaces, script libraries) into the design process, by developing new methods and consolidating proven ones (form-finding, agent systems, space syntax), and by creating project evidence of experimental methods.
