I saw a talk by Ed Catmull on the problems Pixar had in its early days. The talk is very interesting, and I recommend it. What caught my attention is that Pixar had a “brain trust”: a group of people who could express their new ideas to one another without fear of backlash. This is actually very difficult to achieve, since early implementations of ideas are always “childish” and immature, and are therefore rarely considered good enough to be shared with others. However, if you don’t share an idea with others early, you get feedback too late, after you have already invested a lot of time into it. Criticism at this late stage hurts more and, what is worse, it might make you so uncomfortable that you abandon the idea altogether.
The trick is to get a set of people to trust each other well enough to share not-yet-mature ideas with one another at an early stage, so that criticism can help everyone get their ideas right from the start. Establishing this trust is very difficult. I started to understand what Ed Catmull was talking about while working with the STP team. The earlier I shared my ideas, the better they turned out. For instance, CryptoMiniSat was originally never meant to be used as a library. However, once the team (mostly Vijay Ganesh) helped me debug the code, it was much easier to get things “right”. I also met Laurent Simon last year for half a day, and he gave me the idea of using Grid’5000 to test the performance of CryptoMiniSat. Without these early steering ideas, I would never have been able to get CryptoMiniSat to where it is today.
I just read this paper on cybersecurity by Daniel E. Geer, and I was very impressed. Unfortunately I hadn’t heard of the author before, but I used to read Bruce Schneier’s blog regularly, and this essay puts essentially the same ideas into perspective.
I found the essay very interesting and very thought-provoking. It shows well that security in the cyber (or cyber-connected) world is very difficult to attain; there are no simple solutions. I personally think that security in the real world (not only the cyber world) is also very difficult to achieve, and unfortunately politicians tend to take the easy way out, simply bowing to the will of the people by implementing “security measures” that in the end do little (if anything) for security but reassure the public. An example of this is the ban on liquids on airplanes while cockpits in Europe are still not reinforced: the former achieves little but is very visible (and so is mostly security theater), while the latter would be much less visible but far more effective. Note also that the former takes a lot of manpower to implement and inconveniences travellers (wasting their time, too), while the latter would be relatively cheap.
The mentality that leads us to believe that bombing is the greater threat is that most people expect planes to be blown up, while hijacking is mostly an afterthought. This serious mistake is probably a psychological effect: most people tend to remember visually striking incidents better, and Hollywood has overused the “blow-up” effect, etching it into the minds of most people, even decision-makers. However, it is important to remember that most serious incidents in airports and on airplanes were carried out with weapons other than bombs: to take a trivial example, none of the planes used on 9/11 were bombed. As a side note, reinforced cockpits would have prevented all of 9/11, yet European cockpits are still not reinforced while my toothpaste is always taken away. A serious defect, I say.