Originally published 1996. The author, Dietrich Dörner, is a psychology professor at Bamberg.
Focuses on failure modes in complex systems.
Much of the book is based on two groups of experiments:
In the first group, subjects become dictators of fictional towns or countries. It's not clear how the outcomes of their actions are determined - there seems to be much room for the experimenters to influence which behaviors are successful. The analysis is post-hoc and mostly quantitative. Probably best to think of these as exploratory studies.
In the second group, subjects are asked to make predictions about, or attempt to control, simple dynamical systems. These seem better controlled, since the behavior of the system is clearly defined. Again, the analysis is post-hoc and mostly quantitative. Would be interesting to see whether these have been replicated at scale.
Still, many of the failure modes in the book seem worth considering even without knowing their prevalence.
Systems are prone to human failure when they are:
- Complex - many variables, a high degree of interdependence. Complexity is subjective, relative to skill/understanding, e.g. driving a car is complex to a beginner.
- Dynamic - evolving over time. Humans are much stronger with spatial relationships than with temporal relationships. The symptom is focusing on the current state rather than the trajectory. (Analogously, passing laws without considering the incentives they produce.)
- Intransparent - no direct access to or measurement of some variables.
- Misunderstood - humans make assumptions, hold incorrect knowledge, and are resistant to discovering that their model of reality is wrong.
Parts of decision-making:
- Formulate goals
- Formulate models and gather information
- Predict and extrapolate
- Plan and execute actions
- Review effects
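The 'predict and extrapolate' step is a common failure point: people tend to project the most recent trend forward linearly, and so badly underestimate exponential processes. A toy illustration (my own, not one of the book's actual tasks):

```python
# Toy illustration (not from the book): linearly extrapolating the last
# observed step badly underestimates an exponential process.

def linear_forecast(history, horizon):
    step = history[-1] - history[-2]   # assume the most recent change continues
    return history[-1] + step * horizon

observed = [1, 2, 4, 8, 16]                        # doubling each step
predicted = linear_forecast(observed, horizon=5)   # 16 + 8*5 = 56
actual = observed[-1] * 2 ** 5                     # 16 * 32 = 512
```

The forecast is off by nearly 10x after only five more steps, and the gap widens the further out you extrapolate.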
Ways to fail in goal-setting:
- Unclear goals, e.g. 'promote peace' is not precise enough to decide between abstaining from a conflict and entering it as a peacekeeping force.
- Missing implicit goals. 'Stop insects from eating crops' is an explicit goal; 'don't destroy the local ecosystem' is an implicit goal - the local ecosystem is currently fine, so it may be forgotten when weighing actions. The symptom is solutions that cause new problems.
- Poor prioritization of goals. We tend to focus on what we know how to solve or are comfortable working on, ignoring opportunity costs.
- Goal degeneration. We are immunized against admitting failure, so we escape by shifting goalposts or by goal inversion ("the famine is actually desirable, it will help correct the population imbalance").
- Not recognizing contradictory goals. There may be hard tradeoffs between two or more goals. Combined with a narrow focus on one goal at a time, this leads to seesawing.
Ways to fail in model building and review:
- Not building a model. In the second group of experiments some subjects don’t seem to actually hypothesize an internal model of the system at all, and instead have a crude behaviorist / correlational approach to mapping inputs to outputs.
- Overly reductive models. It is rare for everything to reduce to one central variable, e.g. employee happiness, but it's tempting to have a simple hammer to use for all decisions.
- Deconditionalizing. Fixing broken models by adding ever more conditions and escape clauses. Cf. David Deutsch on bad explanations.
Features of complex, dynamical systems: Feedback. Buffering. Critical vs indicator variables. Delayed response.
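A toy simulation of the delayed-response trap (my own sketch, not from the book): a controller that reacts only to the current state of a lagging system keeps overcorrecting and oscillates, while the identical controller with no lag settles smoothly.

```python
# Toy model (mine, not the book's): a heater setting takes effect `delay`
# steps later, but the controller reacts only to the CURRENT temperature.
from collections import deque

def simulate(delay, gain, steps=60, target=20.0):
    temp = 0.0
    # actions still "in the pipe"; an action takes effect once it reaches the front
    pending = deque([0.0] * delay, maxlen=delay)
    history = []
    for _ in range(steps):
        action = gain * (target - temp)  # react to current error only
        pending.append(action)           # maxlen deque drops the oldest action...
        temp += pending[0]               # ...which is the one applied this step
        history.append(temp)
    return history

instant = simulate(delay=1, gain=0.5)  # no lag: converges to the target
delayed = simulate(delay=5, gain=0.5)  # same gain with lag: growing oscillation
```

With no lag the temperature approaches 20 monotonically; with a five-step lag the same policy overshoots past 60, swings back below the target, and oscillates - the system's trajectory, not just its position, has to enter the model.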
Ways to fail in planning:
- Paralysis by analysis.
- Methodism. Over-reliance on simple algorithms that worked in past experience. Analogous to over-generalizing.
- Failing to generate options. Narrow framing.
Efficiency diversity - if planning all the way to the goal is infeasible, prefer plans that lead to states where many actions are available (diverse), each with a high chance of success (efficient). (Analogous to the multi-armed bandit approach to computer Go.)
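A minimal sketch of that heuristic (the states and probabilities are hypothetical, mine rather than the book's): score each candidate next state by how many sufficiently-promising actions it leaves open.

```python
# Hypothetical illustration of efficiency diversity: prefer the next state
# that keeps the most sufficiently-promising options open.

def efficiency_diversity_score(action_success_probs, threshold=0.5):
    # "Diverse": count distinct actions; "efficient": only count the ones
    # likely enough to work.
    return sum(1 for p in action_success_probs if p >= threshold)

# Candidate next states, mapped to the success probabilities of the
# actions available from each (made-up numbers).
candidates = {
    "state_a": [0.9, 0.8, 0.7],              # few but strong options
    "state_b": [0.6, 0.6, 0.55, 0.52, 0.1],  # many decent options
    "state_c": [0.95],                       # one great option, no diversity
}
best = max(candidates, key=lambda s: efficiency_diversity_score(candidates[s]))
# best == "state_b": four options clear the threshold, versus three and one
```

Plain greedy maximization of expected success would pick state_c; the efficiency-diversity score instead values keeping several live options against an uncertain future.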
I didn’t really like the structuring of the book. Can we categorize these problems more helpfully?
- Setting goals. Recognizing implicit goals, tradeoffs between goals, opportunity costs. Prioritization.
- Ability to build/understand models. Most people have no experience of modeling even simple dynamical systems. Concepts like rates of change, feedback loops, phase spaces etc. need to be learned and practiced. Control the trajectory, not the position.
- Tendency to build/update models. Failure to recognize complexity. Psychological obstacles to recognizing mistakes and reacting to them. Tendency towards bad explanations.