Policy-making is complicated
Things are more complicated than they first seem, especially when it comes to policy-making. Should Germany build nuclear power plants to reduce its carbon emissions? Should there be global standards for content moderation on social media platforms? How should we best mitigate the risk of future global pandemics?
I recently had the pleasure of getting to know more people working in policy and forecasting, and I quickly realised that I hadn’t internalised the complexity of the problems they face. Here, I’ll list some intuition pumps for the difficulty of policy-related questions. For concreteness, I’ll focus on the first of the three topics above, nuclear power in Germany.
- More than a wing flap: Recall the butterfly effect: a butterfly’s wing flap can cause a tornado elsewhere. How about the choice-of-energy-supply effect?
- Defining justice: Are there conflicting interests? Then we have to define justice first. Wonderful. This would involve solving a big part of political philosophy.
- Multilayer forecasting: Answering this question involves forecasting the effects of two hypothetical scenarios (the “yes” and “no” scenarios). But forecasting is basically a full-time job. Forecasters need to identify relevant parameters, estimate them, steelman opposing views, and so on.
- Lazy student: You could easily write a master’s thesis on this topic. A master’s thesis is worth 30 ECTS. Feel the credits!
- Headcount: There’s probably a good number of people working on exactly this issue. Sum over people in special commissions, in the government, in academia, in think tanks, and so on.
- Hard optimisation: The ideal energy source emits zero carbon, works all year round, is cheap, doesn’t produce toxic nuclear waste, etc. This leads to an infinite-dimensional optimisation problem. Worse, the loss functions need not be smooth or differentiable.
- Beyond EXPSPACE: If the travelling salesman problem is NP-hard, then what on earth is this optimisation problem even?
- Non-computable: If the halting problem is uncomputable, is there any hope of answering this question?
- Inference time: Imagine prompting a reasoning language model, say GPT-5, to write a nuanced 100-page report on the question. What would be the inference time?
- Going broke: Alternative formulation, so you really feel it: Imagine prompting a reasoning language model, say GPT-5, to write a nuanced 100-page report on the question. Suppose you were billed on a per-token basis. What would be the price?
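To make that last intuition pump concrete, here is a back-of-envelope sketch of the per-token bill. Every number below is an assumption picked for illustration (page length, tokenisation ratio, reasoning overhead, price), not a real API rate:

```python
# Back-of-envelope cost of a 100-page report from a reasoning model.
# All constants are illustrative assumptions, not real API prices.

PAGES = 100
WORDS_PER_PAGE = 500        # a dense report page (assumed)
TOKENS_PER_WORD = 1.3       # rough English tokenisation ratio (assumed)
REASONING_OVERHEAD = 20     # hidden reasoning tokens per visible output token (assumed)
PRICE_PER_MILLION = 60.0    # USD per million output tokens (assumed)

output_tokens = PAGES * WORDS_PER_PAGE * TOKENS_PER_WORD
total_tokens = output_tokens * (1 + REASONING_OVERHEAD)
cost = total_tokens / 1e6 * PRICE_PER_MILLION

print(f"{total_tokens:,.0f} tokens, roughly ${cost:,.2f}")
```

Under these made-up numbers the bill lands in the tens of dollars; the point is less the exact figure than that the reasoning overhead dominates, and it is exactly the part you cannot see.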
Well, policy is complicated. What does this mean in practice?
It seems important to distinguish between “thinking for fun” and “thinking for truth”. I enjoy thinking about hard problems – that’s one of the main reasons I do mathematics. But if I’m thinking for truth in a domain outside my expertise, I probably shouldn’t spend more than 1% of my time coming up with a prior. It will be very weak, anyway. Things are complicated, remember?
Similarly, I’ll try outsourcing as much as possible to experts, resisting the urge to “attempt every problem”. My job should reduce to aggregating expert opinions. Ideally, I could ask an LLM to fill in a table with expert opinions and their respective confidence levels, so I could just take a weighted average.
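The weighted average of that hypothetical table might look like the following sketch. The experts, estimates, and confidence levels are entirely made up; the only real content is the aggregation rule (weight each estimate by its confidence, then normalise):

```python
# Confidence-weighted aggregation of expert estimates — a minimal sketch.
# All names and numbers are invented for illustration.

estimates = [
    # (expert, estimate, confidence in [0, 1])
    ("energy economist", 0.30, 0.8),
    ("grid engineer",    0.55, 0.6),
    ("climate modeller", 0.40, 0.9),
]

total_weight = sum(conf for _, _, conf in estimates)
aggregate = sum(est * conf for _, est, conf in estimates) / total_weight

print(f"aggregated estimate: {aggregate:.3f}")
```

A simple weighted mean is of course its own can of worms (self-reported confidence is not calibration), but even this crude rule beats re-deriving everything from scratch.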
In a very backward way, internalising that things are complicated makes my life much simpler.
This post was inspired by conversations with Caroline Falkman Olsson and Nadja Flechner. Thanks for the food for thought.