Here are two statements for you: "This flow of refugees must mean that terrorists are flocking into Europe hidden among them." "Therefore, we must pull in even more telecoms data in order to find the terrorists." They are linked by an extremely important and subtle assumption. What is it?
Well, both statements concern a process of some sort. The first implies a sort of production process for terrorism – terrorists are recruited, they prepare, they infiltrate into the target country, and then they explode. The second implies a similar linear production process for counter-terrorism. Potential threats are detected. This creates suspects. Suspects are investigated. Some sort of action is taken against them.
Here’s something I learned by being a strategy consultant and programming computers for fun, in the spirit of Everything I Know I Learned At A Very Expensive University, parts 1 and 2.
If you’re trying to improve (or disrupt) a process, you need to understand which step in the process is the rate-limiting step. This is the slowest step, the one that sets the maximum flow rate through the whole process (in manufacturing terms, the bottleneck that governs the line's cycle time). In biology or chemistry it might be the availability of, say, nitrates: once the supply is fully utilised, adding more of some other resource won't increase the reaction rate or the population at all. In computer science the same idea is expressed by the notion of the von Neumann bottleneck – one of storage, memory, processing, or input/output is always the binding constraint. If your application is I/O bound, a faster chip or a more elegant implementation that reduces the computational work involved won't help.
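The idea can be put in a few lines of code. This is a sketch with invented numbers – the stage names and rates are hypothetical, chosen only to show that a serial process runs at the pace of its slowest step:

```python
def throughput(stage_rates):
    """A serial pipeline can sustain at most the rate of its slowest stage."""
    return min(stage_rates)

# Hypothetical stage rates (leads handled per day) for a linear
# process: detect -> investigate -> act.
rates = {"detect": 1000, "investigate": 20, "act": 50}

base = throughput(rates.values())             # 20: 'investigate' is the bottleneck

rates["detect"] *= 2                          # double the detection capacity...
after_more_data = throughput(rates.values())  # ...still 20: nothing changes

rates["investigate"] = 40                     # relieve the actual bottleneck...
after_more_staff = throughput(rates.values()) # ...and the whole process speeds up
```

Doubling a non-bottleneck stage leaves throughput untouched; only widening the rate-limiting step moves the number.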
In the first example, it only makes sense to worry about the refugees if you believe that the supply of terrorists, and their capability to prepare attacks, is not restricted, and that therefore the limiting step is infiltration. If you think the supply of them is restricted – that is to say, the limiting step is recruitment or preparation – it doesn’t matter much.
Seeing as recruiting them involves persuading them to blow themselves up, you’ve got to think that might be the tricky bit.
In the second, it only makes sense to demand more data if you think the limiting step is getting names into your suspect index, rather than investigating them once suspected, or prosecuting, rehabilitating, or otherwise eliminating them once positively identified. If the investigations team is standing idle, for lack of leads, maybe that would indicate a need for more source data. But nobody believes that or even bothers to claim it. Instead we are frequently told that the spooks need more headcount.
If, however, the limiting step is going from vague suspicion to concrete accusation, or doing anything about the accused, more data will just increase the size of the queue behind it. And the two processes interact. If terrorists are rare, which they are, increasing the volume of data will both reduce the percentage of real leads among the suspects, and increase the queueing time before a given lead is either upgraded to a real case, or cleared. It seems to be almost traditional that explosions occur during this queueing interval.
The fact that Hasna Ait Boulahcen was simultaneously on the French intelligence services' list of selectors as a security threat, and known to the ordinary investigative police as a suspected drug dealer, suggests that generating names for the index isn't the problem.
This is something nobody ever seems to discuss. One of the few people to raise it, while the French were passing their massively permissive surveillance law the other week, was the superb blogger Abou Djaffar (Jacques Raillane), also available on Twitter.