Discordianism, an entirely invented religion and spoof of all religions, contains a very profound bit of philosophy: imposing order causes disorder. The reasoning, in my engineer's interpretation of their gospel, is:
- Murphy: if it can go wrong, it will, and at the least convenient time and in the least convenient way.
- The number of ways a machine or system can go wrong is an exponential function of the number of its parts.
- Problems can also arise from outside the machine or system: Murphy multiplied.
- Protecting a machine or system against problems makes it more complex.
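The arithmetic behind those bullets can be made concrete. As a minimal sketch (the function names and the independence assumption are mine, not from the text): if each of n parts works with independent probability p, the whole machine works with probability p**n, and the number of distinct combinations of parts that could fail together is 2**n - 1, exponential in the part count.

```python
# Illustrative arithmetic for "ways to go wrong grows exponentially with parts".
# Assumption (mine): parts fail independently, each working with probability p.

def system_reliability(p_part: float, n_parts: int) -> float:
    """Probability that every one of n independent parts works."""
    return p_part ** n_parts

def failure_modes(n_parts: int) -> int:
    """Count of non-empty subsets of parts that could fail together:
    2**n - 1, exponential in the number of parts."""
    return 2 ** n_parts - 1

if __name__ == "__main__":
    for n in (10, 100, 1000):
        # Even with 99%-reliable parts, reliability collapses as n grows.
        print(n, round(system_reliability(0.99, n), 4), failure_modes(n))
```

With 99%-reliable parts, a 100-part machine works only about a third of the time, which is why protection (redundancy, checks) gets added, and why, per the last bullet, the part count and complexity grow again.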
Order is imposed on machines via a control system, ranging from simple mechanical devices to full computerization. Order is imposed on social systems via rules, normally automated as far as possible.
The complexity of a system for which a control system can be constructed is far below the complexity of the systems all around us. A driverless car must control a car on a highway amid traffic signals and signs, other traffic, and weather. Humans do that routinely while thinking about other things, talking on the phone, etc. Driverless cars are still not ready for sale, so that marks the approximate limit of engineering's capabilities. Airliners could be entirely automated if air traffic control were not needed, but that remains too complex to automate.
The number of parts in these systems is enormous: not only the components of the system being controlled, but the much larger number of elements in the automated control system. Program elements, named procedures, can be individually tested to a high degree of confidence, but the number of paths through a program grows exponentially with the number of procedures: A() calls B() calls C(), or A() calls C() calls B(). Bugs reside in the paths through a computer program.
The fraction of error-producing paths in a computer program can be very low: programs are written in computer languages that enable precise meanings, programmers have tools that check their code, and every component of a program can be tested very extensively on processors that have themselves been tested exhaustively and always execute instructions in exactly the same way. Nevertheless, the number of paths through a program is too large to test exhaustively, so all large programs have bugs.
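The path-explosion argument above can be sketched numerically. This is an assumed toy model, not from the original text: treat a program's n procedures as callable in any order, so there are n! distinct call sequences, and compute how long exhaustive testing of every ordering would take at an optimistic test rate.

```python
# Toy model (my assumption): n procedures callable in any order give n!
# distinct execution orderings, a lower bound on the paths to test.
import math

def call_orderings(n_procedures: int) -> int:
    """Distinct execution orders of n independent procedures: n factorial."""
    return math.factorial(n_procedures)

def years_to_test(n_procedures: int, tests_per_second: float = 1e6) -> float:
    """Wall-clock years to run one test per ordering at the given rate."""
    seconds = call_orderings(n_procedures) / tests_per_second
    return seconds / (3600 * 24 * 365)

if __name__ == "__main__":
    for n in (5, 10, 20):
        # Exhaustive testing becomes infeasible somewhere in the teens.
        print(n, call_orderings(n), years_to_test(n))
```

Even at a million tests per second, a mere 20 procedures already need tens of thousands of years to cover every ordering, which is the point: individual procedures can be solid while the path space stays untestable.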
Systems whose rules are written in human languages and interpreted by humans lack that advantage of being interpreted and executed identically every time. Rules written in a human language are malleable, and people do mold them to their own advantage.
Legal systems are rules in a specialized language that attempts to enable precise meanings. In writing laws, lawyers have no tools to check for consistency, neither in the use of terms within one bill nor in references to other sections of the law. Humans must do that, and even before there were long-term career advantages in building loopholes into the codes lawyers were writing, e.g. the recent replacement for Glass-Steagall, it was not possible to eliminate loopholes from even simple laws.
As with computer programs' procedures, legal systems interpret every law in the context of all the other laws, analogous to the paths through a program; thus the overall system becomes exponentially more complex with every new law. New laws are tested only after they are passed, when lawyers begin to use them. That is equivalent to first testing flight-automation software on a fully loaded airliner: you would not want to be a passenger on that airliner, no matter how careful the programmers were.
For any system of rules, the number of ways to game it, to evade the rules in an advantageous way, is some fraction of the number of paths through the system (the possible interpretations of a law); finding one is equivalent to a constraint satisfaction problem in mathematical logic. Loopholes are paths through the rules, just as the 'bugs' in a computer program are paths that allow an attacker to take control of the program or the system it executes upon.
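The loophole-as-constraint-satisfaction claim can be sketched directly. Everything here is invented for illustration (the attribute names, the single rule, and the goal are hypothetical, loosely echoing the Glass-Steagall example): each rule is a predicate over an action's attributes, and a loophole is an assignment that satisfies every rule while still reaching an outcome the rules were meant to forbid.

```python
# Hedged sketch: loophole-finding as brute-force constraint satisfaction.
# Attributes, rule, and goal below are hypothetical, invented for this example.
from itertools import product

# One rule: a bank may not trade securities.
RULES = [
    lambda a: not (a["is_bank"] and a["trades_securities"]),
]
# The gamer's goal: trade securities anyway.
GOAL = lambda a: a["trades_securities"]

def find_loophole(keys):
    """Try every boolean assignment of the attributes; return one that
    passes every rule yet achieves the goal, i.e. a path the rules
    failed to close. Return None if the rules are airtight."""
    for values in product([False, True], repeat=len(keys)):
        action = dict(zip(keys, values))
        if all(rule(action) for rule in RULES) and GOAL(action):
            return action
    return None

if __name__ == "__main__":
    # The loophole found: don't be the bank; be an affiliate that trades.
    print(find_loophole(["is_bank", "sells_insurance", "trades_securities"]))
```

The brute-force search stands in for what well-resourced lawyers do by hand: enumerate interpretations until one satisfies the letter of every rule while defeating its purpose. Real rule systems have vastly more attributes, which is exactly the exponential path count described above.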
Every system of control has the same problem: it can be gamed by people with greater resources than those who built it, more numerous, more intelligent and clever, with greater computational resources. The problem has grown faster than our means of handling it. All of the systems within which we live are becoming more fragile, more easily exploited by malevolent intelligence.
As we reform our society, we need to make it less exploitable, more resilient. The only way to do that is to minimize the rules and maximize the human intelligence applied to any control. Common sense applied by interested parties in local context does exactly that: the standard Conservative solution to all social situations.