For generative design, automation needs a new breed of data-savvy architects to set the rules
Automation is a broad term that applies to many processes in both the physical and the virtual realms. A key principle is that it reduces or removes the need for human participation. Provided it is properly verified and validated, and the return is worth the investment, it can add significant value: it dramatically speeds up processes, improves accuracy and has the potential to bring down costs.
A prominent example of this comes from the Rapid Engineering Model (REM) software, developed for Highways England by Bryden Wood. It automates the design and location of overhead signage gantries for emergency slip ways by drawing down geo-located data from third-party sources. Bryden Wood’s Jaimie Johnston explains the advantages: ‘What traditionally took a team of people six months, our software can do in a couple of days. That’s not just slightly better – it’s a revolution.’
Depending on your perspective, this kind of thing either threatens to put people out of work or frees them up to concentrate on other value-adding refinements that cannot yet be automated. Either way, the correlation between increasing industrialisation and improving human wellbeing documented by organisations such as Our World in Data and the SDG Tracker suggests that, on that front at least, it is a net good.
It is useful to distinguish between automation that allows the creation and elaboration of data (for example configuration, optimisation, generative design, BIM), and that which allows data to be translated from the virtual to the analogue or physical worlds (visualisation goggles, robots, 3D printing) or vice versa (sensors, reality-capture scanning and measurement technologies).
This article is about automation that allows the creation and elaboration of data for design efficiency and effectiveness. Beyond architectural and engineering design, this includes interoperability, project management, communication, reporting, and even regulatory and code compliance.
What links all these use cases is data: the golden thread that, in the proposed singularity, drives a virtuous spiral towards ever-improving quality, value and societal outcomes.
The world of Construction Technology (ConTech) demonstrates that processing power, while still a limitation, is large enough to admit some tantalising future prospects. The Boston Consulting Group predicts that by 2028, full-scale digitalisation could save the industry an estimated 12% to 20% in costs, equal to $1 trillion to $1.7 trillion annually.
Onwards to singularity
The variety of innovation is breath-taking, leading inexorably towards the singularity. In this future, design will have switched from computer-aided to human-aided. The role of architects in design and compliance will increasingly distil into governance: rather than producing and refining project-specific designs, they will continuously improve the generic rules that underpin them.
Some businesses – notably Speckle and Hypar – are already providing the enabling platforms. Anthony Hauck of Hypar explains, ‘We’re trying to capture the idea that expertise can be recorded and reapplied in multiple contexts.’ If this looks like nullifying your business model, it shouldn’t. It simply allows you to reach more clients. After all, he says, ‘Probably the vast majority of building worldwide currently goes up without the intervention of any licensed professionals at all.’
The point is to step away from blank slates at project inception, which, says Gavin Pike of Bennetts Associates and the Get It Right Initiative, is a perennial problem. ‘Everyone’s continually reinventing the wheel, which is clearly wasteful,’ he says. Bennetts tries to improve the situation using information management processes but recognises how limiting it is to be unable to cross-reference with others.
With a change of mindset as much as of contracts, the industry can overcome its jealous guarding of copyright: relinquishing it is a small price to pay for the overwhelming value gained when others do the same. While the professions are notoriously wedded to copyright, there are signs that commonality of purpose can break down those barriers. The Architects Declare movement shows a willingness to collaborate for the greater good.
There are encouraging signs elsewhere too. The construction industry-sponsored i3P group already pools effort to enable critical R&D, while Sir Robert McAlpine recently convinced 25 other companies to share data through the Construction Data Trust.
Don’t have to be a mechanic to drive a car
The growing libraries of nuggetised expertise will provide verified bases to start from before the algorithmic design engines take over. In just a few keystrokes, generative design configurators such as TestFit or Spacemaker, or the free PRISM application developed by Bryden Wood, already allow you to build early-stage, Revit-compatible models rapidly, churning out hundreds of viable options in seconds for instantaneous co-creation. David Miller of DMA says these apps are ‘fairly clunky’, but are likely to get better. Eventually, architectural, structural and MEP models will mostly be generated automatically, speeding up optioneering and consigning clash detection to the past.
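To make the principle concrete, here is a minimal sketch in Python of how a rule-driven configurator enumerates and ranks massing options. The site dimensions, bay sizes and efficiency scoring are invented for illustration and bear no relation to the actual engines behind TestFit, Spacemaker or PRISM.

```python
# Toy configurator: enumerate floor-plate options for a rectangular
# site and rank them by net-to-gross efficiency. All values assumed.
from itertools import product

SITE_W, SITE_D = 60.0, 40.0   # site width and depth in metres (assumed)
CORE_W, CORE_D = 8.0, 10.0    # service-core footprint in metres (assumed)

def generate_options():
    """Yield candidate massing options as simple dicts."""
    for bay, storeys in product([6.0, 7.5, 9.0], range(4, 13)):
        bays = int(SITE_W // bay)              # whole structural bays that fit
        gross = bays * bay * SITE_D * storeys  # gross internal area, m2
        net = gross - CORE_W * CORE_D * storeys
        yield {"bay_m": bay, "storeys": storeys,
               "gia_m2": round(gross), "efficiency": round(net / gross, 3)}

# Dozens of options appraised in milliseconds; the designer curates.
for option in sorted(generate_options(), key=lambda o: -o["efficiency"])[:5]:
    print(option)
```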
Although coding skills will have become a normal part of the practice mix during the transition, third-party visual programming user interfaces will catch up – probably a good thing given the breadth of knowledge architects are already expected to have. As Hauck says, ‘You don’t have to be a mechanic to drive a car. I don’t think coding should be the end destination for our profession.’
Information management expertise, on the other hand, will have become a core professional competency. By itself, data is insufficient. To become useful, it must be clearly defined with rules for how it can be used, and have a common generic structure.
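A toy illustration of what ‘clearly defined, commonly structured’ data means in practice might look like the following. The schema is invented for this sketch; real projects would lean on open standards such as IFC rather than hand-rolled rules.

```python
# Illustrative only: a hand-rolled schema showing data with defined
# fields, typed rules and a common structure. Field names are invented.
DOOR_SCHEMA = {
    "fire_rating_minutes": int,   # must be a whole number of minutes
    "width_mm": float,            # clear opening width, millimetres
    "is_accessible": bool,        # sits on an accessible route
}

def validate(element: dict, schema: dict) -> list[str]:
    """Return a list of problems; an empty list means the data is usable."""
    problems = [f"missing field: {k}" for k in schema if k not in element]
    problems += [f"wrong type for {k}" for k, t in schema.items()
                 if k in element and not isinstance(element[k], t)]
    return problems

door = {"fire_rating_minutes": 30, "width_mm": 926.0, "is_accessible": True}
print(validate(door, DOOR_SCHEMA))   # [] -> structured, rule-checked data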
In a data-centric world, architects must define as well as solve problems. This was the ‘big penny drop’ for David Miller. ‘Everything has to be set up so the baton can be picked up by others at different points and the information remain accessible and interoperable.’
Too often, the value of data is lost because, despite looking good in the geometric model, it is in fact eccentrically structured, producing what information evangelist Emma Hooper of Bond Bryan Digital calls ‘pretty-bad modelling’. Adhering to the new ISO 19650 series of standards governing BIM, and its guidance (which she is involved in writing), will fix a generally chaotic picture.
Because the rules governing information are generic, a glut of business tools has emerged for administering projects or offering ‘single source of truth’ platforms, such as Viewpoint, Procore, Kreo and Plannerly, to name a random selection. The Centre for Digital Built Britain (part of the Construction Innovation Hub) is even exploring the possibility of encoding regulations, the long game being to enable automatic compliance-checking.
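A hedged sketch of what encoded regulation could look like: each rule becomes data that a checker can apply to a model automatically. The rule ids, fields and limits below are placeholders, not real building-regulation values.

```python
# Minimal machine-readable rules applied to a model element. Invented
# thresholds for illustration only; not actual regulatory figures.
RULES = [
    {"id": "ESC-01", "field": "escape_distance_m", "max": 18.0,
     "text": "Travel distance to a protected stair must not exceed the limit."},
    {"id": "ACC-01", "field": "door_width_mm", "min": 850.0,
     "text": "Accessible door leaves must meet the minimum clear width."},
]

def check(space: dict) -> list[str]:
    """Return the ids of any rules the space fails."""
    failures = []
    for rule in RULES:
        value = space.get(rule["field"])
        if value is None:
            failures.append(rule["id"] + " (data missing)")
        elif ("max" in rule and value > rule["max"]) or \
             ("min" in rule and value < rule["min"]):
            failures.append(rule["id"])
    return failures

print(check({"escape_distance_m": 22.5, "door_width_mm": 900.0}))  # ['ESC-01']
```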
The ultimate expression of the singularity will come when reliable AI and machine learning mature. Fed by big data harvested from the digital twins of buildings and infrastructure in use, they will turbo-boost generative design engines. As well as baking in buildability and closing the performance gap, this feedback will enable optimal sustainability and other critical public goods. An early example of this emergent area is WeWork’s ‘neural network’, which predicts meeting-room utilisation and is estimated to be 40% more accurate than human designers.
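WeWork has not published its model, but the underlying idea, learning from in-use data to steer the next design, can be sketched in a few lines. The logs and features below are fabricated for illustration.

```python
# Not WeWork's model: a toy predictor trained on fabricated sensor
# logs, standing in for feedback harvested from a digital twin.
from statistics import mean

# (capacity, has_screen) -> observed share of seats used (fabricated)
LOGS = [
    ((4, True), 0.90), ((4, False), 0.70), ((8, True), 0.60),
    ((8, False), 0.45), ((12, True), 0.40), ((12, False), 0.30),
]

def predict_utilisation(capacity: int, has_screen: bool) -> float:
    """Nearest observed capacity, averaged over rooms with matching kit."""
    capacities = sorted({c for (c, _), _ in LOGS})
    nearest = min(capacities, key=lambda c: abs(c - capacity))
    return mean(u for (c, s), u in LOGS if c == nearest and s == has_screen)

# Feedback from buildings in use steers the next generative run.
print(predict_utilisation(6, True))
```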
Computers cannot yet mimic the intuitive leaps that humans are capable of. Gavin Pike does not think it will happen any time soon, but concedes that, if it does, there’s a risk that clients might forget the benefits of ‘the architect’s governing eye’. Autodesk’s Kyle Bernhardt puts a much more positive spin on it. He thinks AI is all about enhancing the intense creativity of architects, gracing them with a completely new ‘superpower’.
Get the principles right and there is almost no limit to how far automation can go. Without trivialising the ingenuity of the technological innovation involved, the only things holding it back, in the virtual realm at least, are processing power, permissions and syntax.
Implications for architects
The automated future means rethinking business as usual. If they aren’t already, the designs of the future will almost certainly be crowd-sourced. As David Miller says, ‘Your baby is actually a shared endeavour. That takes a mind shift, which clashes with how architects are trained.’
As generative design matures, architects’ involvement will be in partnership with the software provider, either directly or by taking advantage of their open-source platforms. ‘If there’s a glitch in the project model,’ says Jaimie Johnston, ‘you don’t fix the model; you fix the rule set so it only ever gets better.’
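In code terms, Johnston’s point might look like this minimal, invented sketch: outputs are regenerated from a shared rule set, so amending a rule improves every subsequent project rather than patching one model.

```python
# Sketch of 'fix the rule set, not the model'. Rule names and values
# are invented for illustration.
RULE_SET = {"min_corridor_mm": 1200, "max_travel_m": 18.0}

def generate_corridor(rules: dict) -> dict:
    """Derive a corridor element from the current rule set."""
    return {"width_mm": rules["min_corridor_mm"], "type": "corridor"}

print(generate_corridor(RULE_SET))   # {'width_mm': 1200, 'type': 'corridor'}

# A glitch is found on site: corridors are too tight. We don't patch
# the project model; we amend the shared rule...
RULE_SET["min_corridor_mm"] = 1500
print(generate_corridor(RULE_SET))   # ...and every regeneration improves.
```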
While it will relieve the drudgery of reinventing wheels, generative design will also reduce the number of hours you can charge. This time can be redirected into better early-stage optimisation with the help of configurators. Johnston again: ‘It makes lots of options appraisal easy so you will get more and better architectural variation.’
Emma Hooper identifies those ‘pretty-bad models’ and the BIM information exchange at Stage 4 as the big problem. The space between design and site is ‘a black hole that sucks down information,’ she says. Her advice is to adopt the ISO 19650 series, always think about the end use and work backwards from there to make your output optimally robust and useful.
Gavin Pike recommends focussing on standardising digital approaches right across the design supply chain. ‘Anything less than that makes a mockery of the whole process.’
Architectural drawings will cease to have any value. David Miller sees this as a major challenge for the profession as gratification from the ‘visual endorphin hit’ of a beautiful working drawing has to be delayed until the building is complete.