This blog is continued from Thinking Systems #8
At present, working either with a restricted set of values or claiming to be totally value free (impossible in practice), we plan remediation programmes as “predict-act” schemes and then fail to deliver the goods. It is very much the myth of the field of dreams: “build it and they will come…” – but often they don’t! [Remember the important roles of chance, necessity and 2nd order interactions.]
The situation has become so bad in the fields of conservation biology and ecological restoration that you can find papers in the literature expressing grief over the lack of outcomes from the scientific managerial enterprise. Why, the biologists ask, does physics get to play with huge particle smashers and find the famous Higgs boson, whereas we get equivocal results at best? What has gone wrong? As I have been arguing, it is all to do with physics envy and rationalist myths applied to “soft” systems. Lasch’s systems-thinking elites have led us astray. (I should know – I was a part of that group for years.)
For example, river restoration attempts succeed (by our present scientific definition of success) about 10% of the time; large-scale agri-environment schemes fail to deliver conservation outcomes despite enormous expenditure of money and effort; and even heroic alterations in land use in watersheds explain only about 30% of the variability in the water quality of their drainage waters. The success rates of current conservation biology projects are no better.
We are trying to manage, manipulate and restore ecosystems through the rationalist myth of “systems thinking” and an inadequate moral philosophy. The realisation that intrinsic environmental values and meaning are designed around evolved mechanisms that deliver distributed robustness and rapid access to the adjacent possible illustrates why we are merely achieving the expected response (see Andreas Wagner’s books).
Attempts to find evidence and universal laws
Nevertheless, and despite all this, we constantly seek rationalist universal explanations. It has frequently been suggested that power-law distributions and other statistical properties of complex and emergent networks are evidence of self-organised criticality, and much play has been made of this. Many such results have been presented as evidence of a universal organising principle, but it seems that they commonly arise merely from central-limit phenomena generated by random patchiness, advection/dispersion and non-linear interactions. Such distributions are common and are called Tweedie distributions.
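To make the central-limit point concrete, here is a minimal, hypothetical sketch of my own (not from any of the cited analyses): simulated “quadrat” totals built from purely random patchiness – a Poisson number of patches, each of gamma-distributed size, i.e. a compound Poisson–gamma model of the kind found in the Tweedie family – reproduce a power-law variance–mean relationship (Taylor’s law) with no self-organised criticality in sight. The parameter choices are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(42)

def patchy_counts(density, size, shape=2.0):
    """Compound Poisson-gamma 'quadrat' totals: a Poisson number of patches,
    each contributing a gamma-distributed amount of biomass."""
    n_patches = rng.poisson(density, size)   # patches per quadrat
    scale = density                          # assume denser sites also have bigger patches
    return np.array([rng.gamma(shape, scale, k).sum() for k in n_patches])

means, variances = [], []
for d in (0.5, 1.0, 2.0, 4.0, 8.0):          # a gradient of local density
    x = patchy_counts(d, size=4000)
    means.append(x.mean())
    variances.append(x.var())

# Taylor's law: variance = a * mean^b; fit the exponent b on log-log axes
b, log_a = np.polyfit(np.log(means), np.log(variances), 1)
print(f"Taylor exponent b = {b:.2f}")        # a power law with 1 < b < 2, from randomness alone
```

Nothing here is organised, critical or tuned: the power law falls out of the convergence behaviour of aggregated random patchiness, exactly the point the Tweedie literature makes.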
Ecological data are often said to be fractal and much has been made of this also. These data are not fractal; they are, more correctly, multi-fractal and this again can arise as a kind of central limit phenomenon. Well known examples include fire regimes on land and biological patchiness in water, both of which exhibit multi-fractal scaling. This is exactly what we would expect to see from an evolved distributed robustness mechanism. We have, once again, been fooled by randomness.
The lack of equilibrium is further shown by the widespread presence of 1/f scaling. (1/f scaling means that the power of the observed variability is inversely proportional to its frequency: the slowest fluctuations are the largest.) James Kirchner and Colin Neal have shown that many chemical measures in water quality monitoring show this kind of scaling, arising merely from random patchiness and advection in space/time. Worryingly, this 1/f power-law scaling has a slope which rules out the existence of stable statistical properties such as means and variances. More data and longer time series merely reveal more variability! This makes collecting evidence, and detecting change or trends before and after management interventions, highly problematic. Arendt was correct all those years ago.
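A quick illustrative sketch (my own construction, not Kirchner and Neal’s analysis) shows why 1/f scaling is so awkward for monitoring: compared with white noise of the same overall variance, a synthetic 1/f series has block means that never settle down – the “baseline” against which you would test a management intervention keeps drifting.

```python
import numpy as np

rng = np.random.default_rng(1)

def one_over_f(n, alpha=1.0):
    """Synthesise 1/f^alpha noise by shaping a random-phase spectrum:
    amplitude ~ f^(-alpha/2), so power ~ f^(-alpha)."""
    freqs = np.fft.rfftfreq(n)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (-alpha / 2)
    phases = rng.uniform(0, 2 * np.pi, freqs.size)
    x = np.fft.irfft(amp * np.exp(1j * phases), n)
    return x / x.std()                       # unit variance, for a fair comparison

n, block = 2 ** 16, 1024
pink = one_over_f(n)                         # 1/f ("pink") noise
white = rng.standard_normal(n)               # stationary white noise

# Spread of block means: for white noise this shrinks like 1/sqrt(block);
# for 1/f noise the low-frequency power keeps the "mean" wandering.
drift_pink = pink.reshape(-1, block).mean(axis=1).std()
drift_white = white.reshape(-1, block).mean(axis=1).std()
print(f"block-mean spread  white: {drift_white:.3f}   1/f: {drift_pink:.3f}")
```

The white-noise block means cluster tightly around zero, while the 1/f block means scatter an order of magnitude more widely – which is exactly why longer records “merely show up more variability” rather than converging on a stable mean.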
Distributed robustness therefore provides a highly evolved “anti-fragile” biological response mechanism but it plays merry hell with our rationalist assumptions of universality and stationarity, of predictability and of achieving evidence of outcomes from management actions.
As I have argued before, distributed robust networks in which structure and function interact through reflexive responses in real time are not computable. Nevertheless, much new work on control in genetic and metabolic networks (largely coming from the study of biochemistry and molecular biology in systems biology) shows that there do appear to be some general principles relating function to network structure; presumably because some organising principles beget greater persistence than others (and, of course, we only see the persistent solutions). Similarly, we know from the work of Robert Ulanowicz and others that ecological networks occupy only a subset of the possible state space. We do not know why.
So, what with distributed robustness, central-limit statistics, 1/f power laws and non-stationarity, it is little wonder that we have a problem with evidence (as with all “soft” systems). Evidence is hard to define; it is not always collected – and when it is, it is usually debatable, because environmental data impinge on our values and carry different meanings for different people. This should be the clue to a way forward: do not argue over rationalist data and models generated by universal, value-free science; instead, look to local relationships and information flows – focus on values and meanings, on Gregory Bateson’s “differences that make a difference” – both in the biology and in the human realm. Combine a new understanding of meaning and value in biological dynamics with the social, the economic and the political.
The importance of meaning
We know that local interactions between individuals carry meaning and that there is contextual information. Why should ecological contexts differ from the human? We know that living organisms of all kinds exchange signs and symbols, and that meaning is dependent on context – place, time, season. Whatever happens in response to the “difference that makes a difference” will vary in space and time depending on contingent history and context: on what that action “means” to the organisms affected at that particular time and place. This view transforms the science of ecological systems from a 1st order to a 2nd order cybernetic view.
There are different meanings for different species because each will have their own particular view of the system and its environment depending on size, scale, growth form, longevity and the evolved nature of their anticipatory models. This is the ecological Luhmannite view – there is no system without an observer – but it varies from Luhmannite sociology in one critical aspect: meaning implies values and anticipatory models generate “oughts”. There is value in Nature. Evolved survival strategies determine preferred responses.
New kinds of data arising from computing and communications technologies applied to ecological problems are revealing all kinds of subtleties in the relationships between individuals and species and their habitats that we never saw before.
Conservation success is now seen to depend on things like contingent and fleeting interactions between individuals, and on the interaction of individual movements with habitat heterogeneity. Such detail has never before been included in the usual averaged, universalist ecological methodology.
Tracking organisms by telemetry is revolutionising our knowledge of the aquatic and terrestrial worlds. New data are coming from activities as different as tracking Radio-Frequency ID tags attached to individual birds in UK woodlands and tracking GPS collars attached to Tasmanian Devils in Australia. New insights from the movement of individuals in heterogeneous environments have revolutionised our knowledge of the ecology of Gouldian finches and feral cats in the Australian tropics. Similarly, new insights on habitat sharing by cheetahs and lions are coming from a crowd-sourced analysis of a massive camera trap database from Serengeti National Park in Africa. While all this is happening, birds and bats are being monitored by smart acoustic listening devices that record calls and transmit information back over mobile phone networks.
These data are revealing hitherto unappreciated details of habitat use, of spatial patterns and of interactions between individuals. Slowly but surely we are beginning to understand what interactions and information carry meaning in ecological systems – transcending an old ecological paradigm based on states and stocks of biodiversity. It is hardly surprising that when we attempt to manage an “ecosystem” we rarely get the response we expect, if we get any response at all. Ecosystems – as we presently “know” them – do not exist.
Acting under epistemic uncertainty
We have two kinds of epistemic uncertainty. First, our science, driven by an inappropriate moral philosophy, has been looking in the wrong place. We have been using rationalist and instrumental 1st order cybernetics in a cost-benefit framework which neglects the value in Nature, rather than a 2nd order methodology concerned with a much broader set of values and purposes. Second, we have suffered from the problem of not being able to predict outcomes, because of the complex working out of cause and effect across scales driven by anticipatory models and contextual information.
Uncertainty rules, then. So what to do? Well, the worst thing to do is to invest in a major Environmental Directive or programme of works and measures, using standardised predictive methods, expecting evidence of effectiveness and specific outcomes defined by return on investment. It is a bit like the “build it and they will come” idea from Field of Dreams – but it does not always work. Huge sums have been spent on infrastructure investments, agri-environment schemes, river and ecosystem restoration and conservation biology; and, sadly, we have little to show for it.
Recent analysis of the process that established major environmental management programmes like the European Union Water Framework Directive has shown how the entire enterprise was based on a particular kind of directed cost-effectiveness approach. The process was set up in such a way that preference was given, by the symbolic-thinking rationalist elites, to the views of like-minded experts and their bureaucratic business models. As Christopher Lasch would say, the EU WFD is a child of the times. As we might now expect, the programme has not delivered.
A further risky thing to do (and perhaps also expensive) is to assume that “big data” – hyper-resolution data – will be the answer to the maiden’s prayer. As Keith Beven and his collaborators have pointed out, more data at finer scale does not solve the problems of epistemic uncertainty when applied to the usual rationalist paradigm. Yes, more data can reveal details of pattern and process at finer scales, but without knowing what the meaningful signals are to a range of organisms, such data often raise more questions than they answer.
It all comes down to values, mindsets and expectations. When confronted by the reality of Hannah Arendt’s plesionic world the elitist, rationalist view of scientific management has generated a number of myths of restoration. These myths include the carbon copy (the ability to reinstate the original state), the field of dreams (build it and they will come), fast-forwarding (rapidly restoring the original when, in reality, it took millennia), the cookbook (the existence of a menu of restoration methods guaranteed to work), command and control (successful prediction and action by management agencies), and the Sisyphus complex (we just have to keep on doing what we are presently doing over and over until it works).
As Richard Hobbs and others have commented, all this has left the restoration and conservation ecology communities grieving over a field of dreams. Our expectations and values concerning the biotic world are misaligned with what is really going on. Particle smashing physics this is not! All this was evident more than half a century ago but the warnings and insights were ignored – only now, maybe, are we beginning to understand what we have done. Maybe….
First understand the problem; then do something about it. There will be no short cuts and there will be no magic bullets – we will have to rethink how we live, how we conceive of the world around us, how we arrange our affairs and what we can expect. Perhaps most importantly we need to pay some attention to the development of methodologies to fuse the “soft” with the “hard” – as Pascal Perez has commented on one of my previous blogs – to the fusion of Peter Checkland’s Soft Systems Methodology with the General Systems Theory of Ludwig von Bertalanffy. We must begin by including the flows of information and meaning in both the human and the natural world. Values matter.
As Hannah Arendt pointed out, this is a fundamentally uncertain world we live in: one that bears many intrinsic values. (I am sure that some economist is going to tell me that the best thing to do is to monetise these values but, like Michael Sandel, I hold that intrinsic values linked to existential risks are beyond price. There are, or should be, moral limits to markets.) We are not gods, and there are no universal prescriptions, no magic bullets. Choosing the path we take is not going to be easy; essentially the process (as the Pope has pointed out) requires us to rethink our moral philosophy – what we ought to do. Nevertheless, the consequences of inaction do not bear thinking about.
This discussion will be continued in Thinking Systems #10.
Thank you Graham for this new delivery!
I can’t help but reinforce the significance of Tweedie distributions in this debate by quoting the following:
“Whereas conventional models for Taylor’s law have tended to involve ad hoc animal behavioral or population dynamic assumptions, the Tweedie convergence theorem would imply that Taylor’s law results from a general mathematical convergence effect much as how the central limit theorem governs the convergence behavior of certain types of random data. Indeed, any mathematical model, approximation or simulation that is designed to yield Taylor’s law (on the basis of this theorem) is required to converge to the form of the Tweedie models” (Wikipedia – Tweedie Distribution).