Supercomputing: Risk and Complexity in Planning

Can planning in Germany, with its traditional instruments and tools, still keep up in today's rapidly changing world?

There is a great need for action in view of ever faster innovation cycles, global dynamics, conflicts and needs, behind which planning practice lags. Potential dangers are amplified by extensive land consumption and the resulting soil sealing, by the conflict between settlement pressure on the one hand and open space and agriculture on the other, and by the infrastructure required to develop decentralised areas.

Due to anthropogenic climate change, we have been experiencing extreme weather events on a regular basis for years: droughts, forest fires, heavy rainfall, storms and hail are increasing in the wake of global warming, accompanied by rising sea levels. Munich Re calculated 9,200 fatalities and overall losses of $280 billion worldwide from natural disasters in 2021 (of which approximately $120 billion were insured) and calls for building codes and protective measures to be better adapted to extreme weather events (Munich Re, 2022). Environmental disasters thus pose a significant risk to settlement structures.

The planning instruments and tools applied in practice today were introduced decades ago and have remained largely untouched, and correspondingly inert, since then: land use plans in Germany, for example, are valid for around ten to fifteen years and are usually only updated at these intervals. Similarly, development plans are typically revised only at long intervals or on an ad hoc basis. Both instruments are the responsibility of the municipal level, while smaller municipalities and towns in particular lack the human, professional and infrastructural resources. Similar timeframes apply to regional plans (at the regional level) and state development plans (at the state level).

Overall, the competences of the planning levels, between municipalities, regions, Länder and the federal government, partly overlap and are accordingly prone to conflict. In risk perception and management in Germany, a sectoral perspective has dominated to date: risk management is treated primarily as a sectoral planning task aimed at averting individual hazards, while the role of spatial planning remains rather weak (Greiving et al., 2016; Sapountzaki et al., 2011).

Paradigm change for resilient settlements: Planning, forecasting and risk mitigation using digital tools

With digital tools based on comprehensively available data, and using different analysis and simulation methods with the help of high-performance computing (HPC), risks could be identified immediately and the corresponding countermeasures evaluated. These tools would be applicable at all scales and at all planning and management levels, allowing for more transparent, efficient and rapid planning and risk assessment.

Digital twins are one solution approach that could be implemented from the object (building) level up to the municipality, the region, the country or beyond. Comparable projects are already being realised: Estonia is currently establishing the first comprehensive digital twin of an entire state in Europe, and Singapore has already implemented one. Admittedly, such projects are easier to realise in countries that are smaller in area and have a different political structure. However, there are also efforts at the European level, within the European Space Agency, to develop digital twins as tools for addressing global challenges such as climate change, pandemics and emissions (European Space Agency, 2020).

HPC offers great opportunities to understand and assess risks amid the complexity of spatial planning. With the advent of new technologies, we are witnessing not only growing data volumes but also an increase in the speed at which data is generated and processed, along with the requirement to analyse data and run simulations effectively in near real time, in real time, and faster than real time.

The trend towards more data-intensive applications, such as the analysis of large amounts of urban data, weather and climate simulation, and flow simulation, leads to volumes of data that make manual analysis impossible. These challenges can be addressed by introducing a seamless workflow between computationally intensive simulations (HPC) and data-intensive analytics (Big Data). This would allow enormous amounts of, for example, urban, climate and traffic data to be analysed directly after their automatic generation.
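The idea of such a coupled workflow can be illustrated with a minimal sketch (all names and numbers here are hypothetical, not from any real planning system): a generator stands in for the simulation stage, yielding one grid of values per timestep, and the analytics stage flags risk directly after each chunk is produced, instead of post-processing files manually afterwards.

```python
# Minimal sketch of a simulate-then-analyse workflow (hypothetical example):
# simulation output is analysed as soon as it is generated, rather than
# written to disk first and inspected manually later.
import statistics

def simulate_rainfall(steps, cells):
    """Stand-in for an HPC simulation: yields one grid of rainfall
    intensities (mm/h) per timestep instead of writing output files."""
    value = 10.0
    for _ in range(steps):
        value *= 1.1  # toy intensification trend
        yield [value] * cells

def analyse(chunks, threshold=20.0):
    """Data-intensive stage: flags timesteps whose mean intensity
    exceeds a risk threshold, directly after generation."""
    flagged = []
    for t, grid in enumerate(chunks):
        mean = statistics.fmean(grid)
        if mean > threshold:
            flagged.append((t, round(mean, 1)))
    return flagged

alerts = analyse(simulate_rainfall(steps=10, cells=4))
print(alerts)  # timesteps whose mean intensity exceeded the threshold
```

In a real deployment the generator would be replaced by streaming output from a parallel simulation, and the analysis stage by a distributed analytics framework; the design point is the same, namely that no manual intermediate step separates generation from analysis.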

The majority of today's HPC systems are not up to the demands of Big Data analytics, although they are very well suited to demanding simulations with parallel computing. In this context, new technical developments and research show that HPC can be coupled with Big Data analytics to enable seamless workflows between computationally intensive (urban) simulations and data-intensive analytics (Excellerate Programme, HLRS, 2021). This would also hold potential for applications in "risk assessment computing".
High-performance computing holds great potential for urban data collection, urban analytics and simulation in near real time, real time and faster than real time, and forms the computational basis for addressing the complex challenges of today and tomorrow.

Cover image: “The Deluge”, by Francis Danby, 1840. Oil on canvas. Tate Gallery / presented at Triennale Milano, 2022 (detail)

Computers at High-Performance Computing Center Stuttgart (2023)
