How technology can help us assess a complex world
I am in an interesting place at the moment. I make a living trying to solve assessment and regulatory puzzles from a privileged and influential position inside the system, hence my anonymity. I work for an agency that's trying to preserve some beautiful catchments from the underground mining spreading beneath them, and my main job is to work out how much harm this is causing. I am constantly trying to work out how a currently ineffective monitoring, assessment and regulatory system can be improved so that the true extent of impacts can be understood and predicted, and I'm hoping that might interest others and start conversations. Hence the blog, and my great curiosity about how others think it can be done better too.
Just to be clear, I believe that coal mining or any other development which benefits mankind is OK (acknowledging that my logical ethos breaks down where greenhouse effects make coal irredeemably unsustainable), provided the pros and cons have been properly understood and weighed in accordance with appropriate laws of the land, and provided that impacts on the environment are honestly portrayed and understood – we're the only species that gets to vote, so we have to be fair to the others. I think it's inevitable and right (or not too wrong) that we do what we do to help ourselves prosper – sustainably, healthily and ever more comfortably.
So I’m truly not anti-development or a revolutionary, but I devoutly believe that development should proceed only when all the relevant information and best available knowledge and science are employed to know what will happen if that development does go ahead.
One thing I’ve discovered in this headspace is that technology is massively improving our ability to understand and predict development impacts, but the regulators and regulatory systems we use are not yet agile enough to harness the new know-how. We regulators need to catch up to do our jobs as well as our communities and environment deserve.
Let me open up the toolbox and give you some examples of the ways the new open-source, big data, numerical modelling and a raft of other tools are enabling us to understand the complexity of nature and how it will be affected by particular actions and interventions:
Open Source Environmental Data and Assessment – There are so many parts to this, and so many of them have huge potential to radically improve the way resources are shared. If the stars align and we the liberal government technocrats get to set this up on behalf of the communities we serve, all environmental monitoring data gathered by industry (proponents and monitored operations), government (all levels) and citizen scientists with smartphones will ultimately be made available and transparent. It will all go up on government servers and will be accessible to all.
This means, for example, that when a proposal is made, the full set of supporting information and data will be available for examination by the regulators and, if they are so inclined, the community. Modelled predictions made by the proponents can then be independently checked and, if sound, used to nominate specific criteria at specific monitoring locations which should not, and others which must not, be exceeded if the impacts are to stay within the agreed limits. Many new regulations in the financial sector (a field termed RegTech), addressing taxation avoidance and money-laundering for example, are using this approach to enable regulators to identify and monitor key transactions in real time, and through this transparency are keeping banks and players honest(ish). The opportunity to expand an open-source approach to environmental regulation now awaits. True, there have been partially successful attempts to use planning conditions and regulatory decision support systems in this way by requiring development impacts to be limited to predictions; the difference here will be in the quality of the analysis and the transparency of the policing systems if the development is approved. If specific criteria are exceeded, specific responses, up to and including stopping and rehabilitating the development, will be enforced.
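To make the two-tier idea concrete, here is a minimal sketch of what automated policing of approved criteria could look like. Every site name, parameter and number below is hypothetical; the point is only to show how a "should not exceed" trigger and a "must not exceed" limit translate into a check that anyone with access to the open data could run for themselves.

```python
"""Sketch of automated exceedance checking against approved criteria.

All site names, parameters and values here are invented for
illustration -- they are not real consent conditions.
"""

# Two-tier criteria per monitoring site: a trigger level ("should not
# exceed", prompting investigation) and a limit ("must not exceed",
# triggering enforcement up to stopping the development).
CRITERIA = {
    "bore_GW01": {"parameter": "drawdown_m", "trigger": 0.5, "limit": 2.0},
    "creek_SW03": {"parameter": "flow_loss_pct", "trigger": 5.0, "limit": 15.0},
}

def assess(site: str, value: float) -> str:
    """Classify a monitoring reading against the site's approved criteria."""
    c = CRITERIA[site]
    if value > c["limit"]:
        return "LIMIT EXCEEDED: enforcement response required"
    if value > c["trigger"]:
        return "trigger exceeded: investigate and report"
    return "within predictions"

# With open data on government servers, the community can run the very
# same check the regulator runs.
for site, reading in [("bore_GW01", 0.3), ("creek_SW03", 8.2)]:
    print(site, "->", assess(site, reading))
```

The transparency is the point: because the criteria and the readings are both public, a discrepancy between what the regulator reports and what the data shows is immediately visible.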
Agent-based modelling to try out the most effective regulatory strategies – set up a simulated world with agreed knowledge, assumptions and rules and then watch as the most efficient way to preserve resources whilst maximising benefits is revealed by the simulated agents. You can even turn these models into interactive games to encourage community participation and feedback.
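As a toy illustration of the idea, the sketch below sets up the simplest possible simulated world: extractor agents drawing on a shared, slowly regenerating resource, with a per-step extraction cap standing in for the regulatory strategy. Everything here is an invented assumption; real agent-based models (built in dedicated frameworks) would encode the agreed knowledge, assumptions and rules in far more detail.

```python
"""Minimal agent-based sketch for comparing regulatory strategies.

A toy world: extractor agents draw from a shared renewable resource,
and the "regulation" is a cap on how much each agent may take per step.
All numbers are illustrative assumptions.
"""
import random

def simulate(cap: float, n_agents: int = 10, steps: int = 50, seed: int = 1):
    """Run extractor agents against a renewable resource under a cap."""
    rng = random.Random(seed)
    resource = 100.0   # shared resource stock
    benefit = 0.0      # cumulative benefit extracted by all agents
    for _ in range(steps):
        for _ in range(n_agents):
            want = rng.uniform(0.0, 1.0)       # how much this agent wants
            take = min(want, cap, resource)    # regulation caps the take
            resource -= take
            benefit += take
        resource = min(100.0, resource * 1.05)  # resource slowly regrows
    return benefit, resource

# Try a few caps and watch the benefit/preservation trade-off emerge.
for cap in (1.0, 0.3, 0.1):
    b, r = simulate(cap)
    print(f"cap={cap:.1f}  total benefit={b:6.1f}  resource left={r:5.1f}")
```

Even at this crude level the trade-off the paragraph describes falls out of the simulation: loose caps maximise short-term benefit but exhaust the resource, tight caps preserve it. Turning such a model into an interactive game is then mostly a matter of letting participants set the cap themselves.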
Open-source analysis tools – Use the GitHub, MATLAB and other techy ecosystem approaches to finding the best analytical code and algorithms by developing and sharing them on the web. These big data, open-source technologies are advancing at the speed of thought, and methods for analysing environmental data are constantly progressing and evolving.
Simple-as-possible modelling – Although it may seem counter-intuitive, the best regulations are built on the simplest appropriate analysis. Technical advances keep driving us towards ever greater complexity, yet the best regulatory systems are simple enough that the proponent, regulators and communities can clearly understand the analysis and science underpinning them. In many cases, for example, it may be better to apply simple statistical analysis to discern patterns and trends than to use "black-box" models which give you an easy answer of uncheckable accuracy.
The power of these kinds of data-enabled smart regulatory system tools is immense due to their agility, simplicity and transparency. The real challenge now is for the regulators to catch up to the new technical capabilities and work out how to apply them in the most effective way, so much easier preached than done.