Cliff Stoll
Doubleday, 1989
There was a time before we defaulted to locking our doors. In the 1980s, the nascent internet was mostly used by research scientists and the military. The community was small, and—as tends to be the case in small communities—the level of trust was high. Administrators didn’t invest much effort in locking down their systems, because the possibility of bad actors hadn’t been seriously considered.
Cliff Stoll was a sysadmin at Lawrence Berkeley Lab during this period. After noticing sessions from unknown users, Stoll began logging these users’ activities. By examining these logs, he discovered that foreign agents were using his machines as a jumping-off point for exploring ARPANET (then the US Defense Department’s research network), where they would look for sensitive information. The Cuckoo’s Egg chronicles Stoll’s efforts to track the hackers in his system, and how he worked with various acronymized agencies to catch them.
The book works on a number of levels. In the first place, it’s exciting—hackers, particle accelerators, the NSA, cocaine—what’s not to like? Secondly, it’s politically interesting. Stoll describes his politics prior to working with government agencies as “a sort of fuzzy, mixed bag of the new-left.” It was an axiom that the NSA/FBI/CIA/whatever was categorically bad. Blanket statements of this variety have the advantage of being socially safe (particularly if you happen to live in Berkeley), but they have the disadvantage of being useless. There are legitimate criticisms one might level at each of these agencies, but to be effective they require more context and nuance than is found in the fuzzy, mixed bag of the new-left. As Stoll worked with these organizations, he found common ground, as well as new areas of disagreement. The point here is not that young idealists will eventually see reality for what it is and evolve into steely-eyed conservatives. The lesson is about the importance of actually understanding the thing you’re opposed to.
Finally, the book has a lot of lessons for people who make, use, and analyze software. As I read, I highlighted all the gems I came across. I wanted to compile them in a review with some notes. So here they are.
social engineering >>> software engineering
The hacker didn’t succeed through sophistication. Rather he poked at obvious places, trying to enter through unlocked doors. Persistence, not wizardry, let him through.
One of the book’s recurring themes is that taking advantage of obvious routes into a system is a lot more effective than trying to be clever. For example, instead of spending time and effort trying to break a specific user’s password, pick a few common passwords and try them on every user.
Here’s an example. If a user chooses a strong 6-digit PIN at random, the odds of guessing it on the first try are exactly one in a million. But if one in a thousand users is lazy and uses 123123, then the odds that any given user has this password are one in a thousand. By changing tack, you just became a thousand times more productive. The Unix systems that were popular in those days came with a few default accounts, each of which had a default password, and sometimes administrators forgot to change these. Instead of trying to crack a prudent admin’s secure password, simply knock on doors until you find a system where the defaults were never changed.
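To make that arithmetic concrete, here’s a minimal sketch of the expected-guesses math in Python. The one-in-a-thousand lazy-PIN rate is the made-up figure from above, not a number from the book:

    # Success probability per guess, and expected guesses until success,
    # for the two strategies described above.
    p_targeted = 1 / 10**6  # one guess at one user's random 6-digit PIN
    p_spraying = 1 / 1000   # one "123123" guess at a randomly chosen user

    for name, p in [("brute-forcing one user's PIN", p_targeted),
                    ("spraying the lazy PIN across users", p_spraying)]:
        # Independent guesses follow a geometric distribution,
        # so you expect 1/p attempts before the first success.
        print(f"{name}: p = {p:g}, expected guesses = {1 / p:,.0f}")

The thousand-fold gap between the two printed numbers is exactly the “changing tack” payoff described above.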
security through diversity
If everyone used the same version of the same operating system, a single security hole would let hackers into all the computers. Instead, there’s a multitude of operating systems: Berkeley Unix, AT&T Unix, DEC’s VMS, IBM’s TSO, VM, DOS, even Macintoshes and Ataris. This variety of software meant that no single attack could succeed against all systems. Just like genetic diversity, which prevents an epidemic from wiping out a whole species at once, diversity in software is a good thing.
Software is always designed for a specific system. No matter how devastating a virus is, it can only affect the systems it was written to run on. Historically, Windows has been the biggest target for virus authors—because it’s the most popular. For the same reason, malware has been less of a threat to Mac, Linux, and Android users.
security through obscurity
Perhaps because artificial intelligence research is so arcane, [the hacker] didn’t find much. Certainly, the antique operating system didn’t provide much protection—any user could read anyone else’s files. But the hacker didn’t realize this. The sheer impossibility of understanding this system protected their information.
This shouldn’t be your go-to for securing a system, but it’s interesting nonetheless.
focus on the problem
As [Mike Muuss] puts it, good programs aren’t written or built. They’re grown.
Software is not an end in itself (a point so obvious that only a programmer could fail to realize it). Truly good software improves with time, as we realize what the most important use-cases are and streamline them. Getting user feedback quickly and responding to it is a much better strategy than over-engineering something before you know precisely what problem you’re trying to solve.
The question isn’t “which computer is faster,” no, not even “which is better.” Instead ask “Which is more suitable?” Or “which will get your job done?”
Similarly, we often look at high-level, general metrics (clock speed, benchmark scores) when we should be looking at low-level, problem-specific ones: how long does this machine take to do your job?
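In that spirit, a toy sketch: rather than comparing machines on a generic benchmark, time the task your job actually consists of. The workload below is an invented stand-in; the point is the shape of the measurement:

    import timeit

    # Invented stand-in for the task your job actually depends on,
    # e.g. parsing one of your own log files.
    SAMPLE_LOG = ["alice,login,ok", "bob,login,fail"] * 10_000

    def my_workload():
        return sum(len(line.split(",")) for line in SAMPLE_LOG)

    # Time the real workload, not an abstract metric. This number answers
    # "which will get your job done?" for your problem, on this machine.
    seconds = timeit.timeit(my_workload, number=100)
    print(f"100 runs of the workload: {seconds:.3f}s")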
your actions betray your assumptions
“Cliff, the hacker’s not from Berkeley”
“How do you know?”
“You saw that guy typing in the ps -eafg command, right? … The f flag is the AT&T Unix way to list each process’s files. Berkeley Unix does this automatically, and doesn’t need the f flag. Our friend doesn’t know Berkeley Unix.”
Another recurring theme is that the hacker would routinely reveal personal information through subtle idiosyncrasies in his programming. I think this observation has import beyond security and forensics: it’s a powerful tool in user testing. Don’t ask your users what they think; they will probably just tell you what they think you want to hear, which is two steps removed from the truth. Instead, look at their behaviour. Suppose you’re concerned that a piece of text in your app looks too much like a button. Instead of asking “does this look like a button to you,” just monitor whether people are clicking it.
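As a toy sketch of watching behaviour instead of asking about it: suppose your app already logs one record per click (the log format and element names below are invented). Counting clicks on the suspicious text answers the button question directly:

    from collections import Counter

    # Invented click log: "element_id,event" records from your app.
    events = [
        "signup_button,click",
        "promo_text,click",  # a user treating plain text as a button
        "promo_text,click",
        "signup_button,click",
    ]

    clicks = Counter(line.split(",")[0]
                     for line in events
                     if line.endswith(",click"))

    # If "promo_text" draws clicks at all, users read it as a button.
    # No need to ask them.
    print(clicks.most_common())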
automate
I could sleep under my desk and rely on my terminal to wake me up. But at the cost of domestic tranquility: Martha wasn’t pleased at my office campouts. If only my computer would call me whenever the hacker appeared … Of course. A pocket pager.
As a programmer, your job mostly consists of automating things, so it’s endlessly fascinating that we still forget to do it. You shouldn’t automate for the sake of automation (see this and this), but you should be vigilant about noticing when you’re repeatedly performing a task, or waiting for something to happen.
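Stoll’s pager rig translates naturally into a few lines of code. Here’s a minimal sketch of the idea, not his actual setup: the log path and alert mechanism are placeholders, though sventek really was the stolen account the hacker favoured:

    import time

    WATCHED_ACCOUNT = "sventek"       # the stolen account from the book
    LOG_PATH = "/var/log/logins.log"  # placeholder: your login log

    def alert(line):
        # Stand-in for the pocket pager: swap in email, SMS, whatever.
        print(f"ALERT: watched account active: {line.strip()}")

    def watch(path):
        with open(path) as log:
            log.seek(0, 2)  # jump to the end: only new activity matters
            while True:
                line = log.readline()
                if not line:
                    time.sleep(1.0)  # nothing new; nap and poll again
                elif WATCHED_ACCOUNT in line:
                    alert(line)

    watch(LOG_PATH)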
write things down
The astronomer’s rule of thumb: if you don’t write it down, it didn’t happen.
Stoll was trained as an astronomer, and he brought those skills to bear on digital forensics. During the year he tracked the hacker, he kept records of all suspicious activity on his systems, and he took meticulous notes on all his observations, thoughts, and hypotheses. These notes were invaluable, both to him and to the government agencies he collaborated with. Note-taking is a powerful (and underrated) skill.