
Mortar over brick

Brick and mortar construction embodies two very different construction ideologies. Bricks define the domain of parts: they are discrete, industrially produced elements that compose an assembly of a larger whole. Bricks define a world that can be quantized into units and that, through the very geometry of the brick, is constrained by the possible arrangements of this prefabricated module. Mortar defines the domain of flow. Through its viscosity and malleability, mortar is able to adapt, change form and bridge small gaps. Different mixtures of mortar allow the changing of its properties, gradually transitioning between viscosities and allowing the material to be adapted to different specific contexts.

FIGURE 2.3 Bricktopia by Map13 utilizing RhinoVAULT by Philippe Block. The project displays how the discrete brick units conform to the simulated vault configuration.

Two mathematical paradigms are described by these two material assemblages: the discrete model and the continuous model. The discrete is characterized in mathematics by whole numbers, while the continuous model can be identified with real numbers, understood as numbers with a decimal representation. The discrete model has been a conventional form of production, where the manufacture of identical parts, such as bricks, provides greater efficiency at scale. This strategy was expanded to almost every building material, as the advent of industrialization further reduced the costs of mass production. As a result, the bespoke or customized became a form of luxury living, in opposition to serialized production seen as a “cookie cutter” solution.
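A minimal sketch can make the distinction concrete, assuming an arbitrary module size and an arbitrary profile curve; all names and numbers here are illustrative, not drawn from the source:

```python
# Illustrative sketch: the discrete model places identical units at
# integer multiples of a fixed module, while the continuous model
# evaluates a function at any real-valued parameter.

BRICK_LENGTH = 0.24  # metres; an assumed module size

def discrete_positions(count):
    """Positions available to a brick course: an integer lattice."""
    return [i * BRICK_LENGTH for i in range(count)]

def continuous_profile(t):
    """A smoothly varying profile, defined for every real t in [0, 1]."""
    return 0.5 * t * (1.0 - t)  # an arbitrary example curve

print(discrete_positions(4))     # e.g. [0.0, 0.24, 0.48, 0.72] -- whole multiples
print(continuous_profile(0.37))  # any decimal parameter yields a value
```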

As explained by Greg Lynn in Animate Form, the continuous paradigm was the central paradigm for innovation in the 1990s. Space was no longer understood as a neutral container but rather as populated with forces, or vectors, that are able to deform and contribute to the definition of a given form. Lynn used this to understand animation as a process in which form evolves through the influence of intrinsic and extrinsic forces. Traditional discrete, static objects are exchanged for a process that can yield multiple outcomes. Drawing from studies of natural systems such as embryogenesis, this novel process of defining form required a paradigm shift from a system of elements to one of flows, where intensities could transition into features, defining a continuous whole of gradually interpolated differentiation.

For me, it is calculus that was the subject of the issue and it is the discovery and implementation of calculus by architects that continues to drive the field in terms of formal and constructed complexity. The loss of the module in favor of the infinitesimal component and the displacement of the fragmentary collage by the intensive whole are the legacy of the introduction of calculus.

For Lynn, the introduction of calculus and infinitesimal mathematics was a way to develop a concept of space that includes time and duration. The adoption of software tools from the animation industry enabled architects to engage with time and force modelling, exploring the opportunities offered by the introduction of calculus.
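A toy sketch can suggest the kind of force modelling at stake; this is an assumption for illustration, not Lynn's actual tooling. An initial point configuration is deformed over time by a vector field, integrated with explicit Euler steps, so that the resulting form is the trace of forces acting through duration:

```python
# Toy sketch (illustrative, not Lynn's software): a row of control points
# deformed over time by a force field, integrated with explicit Euler steps.

def force(x, y):
    """An arbitrary example vector field pushing points upward near x = 0.5."""
    return (0.0, 0.2 * (1.0 - abs(x - 0.5) * 2.0))

points = [(i / 4.0, 0.0) for i in range(5)]  # initial flat configuration
dt, steps = 0.1, 10

for _ in range(steps):
    points = [(x + force(x, y)[0] * dt, y + force(x, y)[1] * dt)
              for (x, y) in points]

# The flat row has become a curved profile shaped by the field:
print([(round(x, 2), round(y, 2)) for (x, y) in points])
```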

Philosopher Manuel DeLanda, who strongly contributed to the adoption of a Deleuzian ontology in architecture, explains that matter can be understood through two different series of properties: extensive and intensive. Extensive properties are divisible attributes that can be notated in a Cartesian notion of space, like size, mass, area or volume. On the other hand, intensive properties are indivisible attributes that operate in a differential space, such as temperature, pressure or curvature. Intensities are considered prior to form, as they constitute attributes that can trigger form in the process of unfolding, which generates features. Architecture that is defined by intensive properties was pioneered by Bernard Cache, who coined the term “objectile.” This is an object that, framed in Deleuzian terms, is defined prior to its actualization, living in pure virtuality. As Cache explains,

Objectile is a generic object: an open-ended algorithm, and a generative, incomplete notation, which becomes a specific object only when each parameter is assigned a value. In the same way, a parametric function notates a family of curves, but none in particular.11

Design could be conceived as an embryonic process out of which a multiplicity of variations could be obtained, each one of them different from the next. This development also allowed design to engage explicitly with the concept of the virtual. As Lynn explains, the virtual is understood as the abstract space of possible actualization.12 The equations derived from calculus gave birth to parametric form, one in which populations of variables are able to define a solution space or domain of possible variations.
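A minimal sketch of the objectile idea, as described in Cache's terms above, can be written as a generic parametric function that notates a family of curves and becomes a specific object only when every parameter is assigned a value; the function names and formula are illustrative assumptions:

```python
# Minimal sketch of the "objectile" (illustrative, not Cache's own code):
# a generic notation becomes a specific curve only once parameters are bound.

import math

def objectile(amplitude, frequency, phase):
    """A family of curves; no curve in particular until parameters are assigned."""
    def curve(t):  # t in [0, 1]
        return amplitude * math.sin(2 * math.pi * frequency * t + phase)
    return curve

# Two actualizations drawn from the same virtual family:
variant_a = objectile(amplitude=1.0, frequency=2.0, phase=0.0)
variant_b = objectile(amplitude=0.5, frequency=3.0, phase=math.pi / 4)
print(variant_a(0.25), variant_b(0.25))

# A solution space: a population of variants, each different from the next.
family = [objectile(1.0 + 0.1 * k, 2.0, 0.0) for k in range(5)]
```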

The shift toward intensive over extensive properties reached a culminating exposition in December 2003, when Frédéric Migayrou curated the exhibition Architectures Non-Standard at the Centre Pompidou, Paris, France. In the exhibition, the influence of the mathematics of calculus originated by Henri Poincaré established a direct relation with the spline architectures enabled by the software platforms available at the time.13 Migayrou’s exhibition gathered the work of Greg Lynn, Kas Oosterhuis, Bernard Cache, UN Studio, Asymptote, dECOi, DR_D, Servo, R&Sie, Tom Kovac, KOL/MAC Studio and NOX. While all these architects had developed specific design research through the advancement of digital technologies, many of the core principles, such as mass customization through parametric differentiation, performative geometries or file-to-factory protocols, have transferred and become standard practice for architectural firms worldwide.

The design research in nonstandard architecture developed throughout the 1990s led to a set of practices that constituted a design paradigm of continuity. Developments in digital fabrication linked to this paradigm enabled the post-rationalization and fabrication of the complex geometries that emerged from animation software. Digital fabrication made the serialization of identical parts an issue of the past, as it allowed bespoke form to be as economically viable as its serialized counterpart. As framed by Mario Carpo, nonstandard seriality made economies of scale irrelevant, as digital production could, at least in theory, allow for the fabrication of unique components at no additional cost.14 The rationale behind this claim came from examples such as the CNC milling machine, which could produce bespoke units in the same time it would take to produce identical ones. According to Carpo:

In a digital production process, standardization is no longer a money-saver. Likewise, customization is no longer a money-waster.15
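Carpo's claim can be restated as a simple cost comparison; the figures and function names below are assumptions for illustration only. Moulded production amortizes a fixed tooling cost per distinct design, so identical copies get cheaper, while digital production carries no per-design tooling, so a run of unique parts costs the same as a run of identical ones:

```python
# Illustrative cost model (assumed numbers, not from Carpo's text).

MOULD_COST = 5000.0   # fixed tooling cost per distinct design (assumed)
UNIT_COST = 10.0      # per-part material/machine cost (assumed)

def moulded_cost(n_parts, n_designs):
    """Traditional serial production: each new design needs a new mould."""
    return n_designs * MOULD_COST + n_parts * UNIT_COST

def cnc_cost(n_parts, n_designs):
    """Digital production: the number of distinct designs is irrelevant."""
    return n_parts * UNIT_COST

print(moulded_cost(1000, 1), moulded_cost(1000, 1000))  # 15000.0 vs 5010000.0
print(cnc_cost(1000, 1), cnc_cost(1000, 1000))          # 10000.0 vs 10000.0
```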

The file-to-factory protocol allowed digital models to directly provide instructions for CNC manufacturing, circumventing the need for traditional architectural representation outside of the computer screen. The digital model became an apparatus for gathering architectural data and a protocol for collaboration and coordination. Moreover, the multiplicity embedded in the parametric variation of a digital model no longer describes one building but families of buildings. Software companies, together with a new generation of architects and tool makers, developed the digital infrastructure that has enabled the adoption of these tools by students as well as industry. Over time, the terminology describing this area of work became “parametric design.”
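The file-to-factory idea can be suggested with a toy sketch in which a parametric model is sampled directly into generic machine instructions; this is an illustration under assumed names and a generic G-code dialect, not a production post-processor:

```python
# Toy file-to-factory sketch (illustrative only): the digital model is
# sampled straight into G-code moves, with no intermediate drawing set.

import math

def profile(t):
    """The digital model: a parametric curve (an arbitrary example)."""
    return (100.0 * t, 20.0 * math.sin(math.pi * t))

def to_gcode(samples=10, feed=300):
    lines = ["G21 ; millimetres", "G90 ; absolute coordinates"]
    x0, y0 = profile(0.0)
    lines.append(f"G0 X{x0:.2f} Y{y0:.2f}")  # rapid move to the start point
    for i in range(1, samples + 1):
        x, y = profile(i / samples)
        lines.append(f"G1 X{x:.2f} Y{y:.2f} F{feed}")  # cutting moves
    return "\n".join(lines)

print(to_gcode())
```

Because the instructions are generated from the model itself, a change to any parameter of `profile` propagates directly to fabrication, which is the sense in which the digital model becomes the coordination protocol.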

 