Mental Models
I really like lists! I also really like mental models. This post is about both.
I use “mental models” quite loosely here: any kind of model, law, idiom, aphorism, etc. that helps me take something about how the world works and capture, compress, and distill it down to something named. They’re like pointers in an index that help me retrieve and recall the right page (and the content that lives there).
Below is a list of models that I plucked from my notes. There are far too many sources to properly attribute them all. While I tried to preserve the original meanings, many descriptions are my own personal interpretation/compression of them, and they could very well be wrong! (See: Cunningham’s law) They’re quite software-centric, likely because most of them come from what I read. I’ll try my best to keep these updated, and if you see any errors (or have questions), please let me know!
- 1% rule: Only 1% of users create new content
- 12-factor app: A dozen best practices and standards for building and configuring cloud-native web applications
- Amdahl’s law: Formula quantifying the potential speedup of a task from adding resources, bounded by the fraction of the task that actually benefits (sketched in code after this list)
- Antifragile: Property where systems get stronger with randomness, stress, failures, etc.
- Base rate fallacy: Neglecting the underlying base rate, e.g. a positive result from a 99% accurate test for a disease that affects only 1% of people is still wrong about half the time (worked through after this list)
- Bitter lesson, the: General methods like search and learning that leverage computation will vastly outperform hand-engineered, bespoke models in the long run
- Black swan: Event with outsized impact that’s improbable but not impossible, and often eventual
- Box’s law: All models are wrong, but some are useful
- Broken windows theory: Visible disorder like broken windows encourages further crime and a lack of care and ownership
- Brooks’ law: Adding human resources to a late software development project makes it later
- Byzantine generals problem: A consensus problem posed as Byzantine generals coordinating strategy, where some generals may fail, be unreliable, or be traitors
- CAP theorem: In the presence of a network partition (P), a distributed system must choose between consistency (C) and availability (A)
- Chesterton’s fence: Do not remove a fence until you know why it was put up in the first place
- Clarke’s third law: Any sufficiently advanced technology is indistinguishable from magic
- Confirmation bias: Tendency to perceive things in a way that reinforces preexisting beliefs
- Convoy effect: Slow jobs block or slow down a whole line of other faster jobs
- Conway’s law: Organizations which design systems are constrained to produce designs which are copies of the communication structures of these organizations
- Cunningham’s law: The best way to get the right answer on the Internet is not to ask a question, it’s to post the wrong answer
- Curse of dimensionality: Weird (or difficult) things happen when the data points are far outnumbered by the dimensions or features (demonstrated after this list)
- DAMP: Descriptive and meaningful phrases (vs. DRY)
- DRY: Don’t repeat yourself
- Dead Sea effect: The more talented and effective IT engineers are the ones most likely to leave… [leaving] the least talented and effective IT engineers
- Delphi technique: Converging on an answer by collecting anonymized opinions from experts and iterating
- Dependency inversion principle: High-level modules should not be dependent on low-level implementations
- Diderot effect: Obtaining something new can lead to the purchase of additional, complementary things
- Dilbert principle: Companies tend to systematically promote incompetent employees to management to get them out of the workflow
- Dining philosophers problem: A concurrency problem posed as five philosophers sharing five forks, each needing two forks to eat, so no two neighbors can eat at once (sketched after this list)
- Dunbar’s number: About 150, the number of stable relationships humans can comfortably maintain
- Dunning-Kruger effect: If you’re incompetent, you can’t know you’re incompetent…
- Duration neglect: Memories of painful experiences are dominated by the peak and how quickly the pain diminishes, not the overall duration
- Eisenhower matrix: A 2x2 matrix of importance and urgency
- Gambler’s fallacy: Believing that a random event is more (or less) likely to occur based on past outcomes, e.g. “It landed on red 5 times, the next one is more likely to be black!”
- Gall’s law: A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work.
- Goldilocks rule: Humans experience peak motivation when working on tasks that are right at the edge of their current abilities
- Goodhart’s law: When a measure becomes a target, it ceases to be a good measure
- Green lumber fallacy: Mistake of conflating knowledge of irrelevant facts with knowledge to make effective decisions, named after a very successful trader who thought green lumber was painted green
- Hamster wheel of pain: Common security loop of “patch and pray” where vulnerabilities are patched ad nauseam
- Hanlon’s razor: We should not attribute to malice that which is more easily explained by stupidity
- Hemingway bridge: Stopping work while you still know the next steps so that you can start (bridge into) the next session with momentum
- Hick’s law: Decision time grows logarithmically with the number of options to choose from (sketched after this list)
- Hindsight bias: Feeling that prediction was possible after an event had occurred, i.e. “I knew it all along!”
- Hofstadter’s law: It always takes longer than you expect, even when you take into account Hofstadter’s Law
- Hyrum’s law: With a sufficient number of users of an API, it does not matter what you promise in the contract: all observable behaviors of your system will be depended on by somebody.
- Ideomotor effect: An idea or thought produces subtle, unconscious bodily movement (the classic explanation for Ouija boards)
- Interface segregation principle: No client should be forced to depend on methods it does not use
- Jevons paradox: Consumption of a resource may increase as a response to greater efficiency in its use
- Johari window: A 2x2 matrix of what is known/unknown to self vs. others
- KISS: Keep it simple (stupid)
- Kerckhoffs’s principle: A cryptographic algorithm can be public because all confidentiality relies on the secrecy of the key
- Kernighan’s law: Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.
- Knuth’s optimization principle: Premature optimization is the root of all evil
- Lampson’s law: Neither abstraction nor simplicity is a substitute for getting it right
- Law of Demeter: Each unit should have only limited knowledge about other units: only units “closely” related to the current unit…
- Law of leaky abstractions: All non-trivial abstractions, to some degree, are leaky
- Law of requisite variety: To ensure stability, a control system must be able to represent at least as many states as the system it controls
- Law of triviality: People within an organization commonly give disproportionate weight to trivial issues
- Lindy effect: Things that have been around longer are likely to stay around longer
- Linus’s law: Given enough eyeballs, all bugs are shallow
- Liskov substitution principle: It should be possible to replace a type with a subtype, without breaking the system
- Ludic fallacy: The misuse of games to model real-life situations
- Map vs. territory: Often conflating the map (a model) with the territory (reality)
- Maslow’s hammer: aka Law of the instrument, if all you have is a hammer, everything looks like a nail
- Metcalfe’s law: In network theory, the value of a system grows as approximately the square of the number of users of the system (sketched after this list)
- Moore’s law: The number of transistors in an integrated circuit doubles about every two years
- Murphy’s law: Anything that can go wrong will go wrong
- Occam’s razor: The simplest explanation is the most likely explanation
- Open/closed principle: Entities should be open for extension and closed for modification
- Ousterhout’s law: When possible, avoid “voodoo constants”, or values in systems that require “black magic” to set correctly
- P-PC balance: The trade-off between maximizing production and maintaining production capability
- Pareto principle: aka 80/20 rule, the majority of results come from a minority of causes
- Parkinson’s law: Work expands so as to fill the time available for its completion
- Peter principle: People in a hierarchy tend to rise to their “level of incompetence”
- Postel’s law: Be conservative in what you do, be liberal in what you accept from others
- Premack’s principle: More desirable behaviors can be used to reinforce less desirable behaviors
- Principle of least astonishment: A component of a system should behave in a way that most users will expect it to behave, and therefore not astonish or surprise users
- Prisoner’s dilemma: Game theory experiment where two prisoners have to decide to collude or betray
- Putt’s law: Technology is dominated by two types of people, those who understand what they do not manage and those who manage what they do not understand
- Pyramid of pain: Imposing costs on attackers is most effective when it forces them to change their tactics, behaviors, and tools
- Recency bias: Overemphasizing recent events over past ones
- Red Queen’s race: From Through the Looking-Glass, running as fast as you can but staying in the same place
- Reed’s law: The utility of large networks, particularly social networks, scales exponentially with the size of the network
- SOLID: acronym for single responsibility, open/closed, Liskov substitution, interface segregation, and dependency inversion principles
- Scout rule: Always leave the code better than you found it
- Shirky principle: Organizations (sometimes) preserve the problem to which they are the solution
- Shotgun vs. rifle strategy: Using a general, wide strategy (shotgun) or a more precise, specific strategy (rifle), sometimes as complements
- Simpson’s paradox: Phenomenon where trends appear inside groups of data but disappear or reverse when the groups are combined (worked example after this list)
- Single responsibility principle: Every module or class should have a single responsibility only
- Streetlight effect: aka the drunkard’s search, bias where people only look for things where it’s easiest to look
- Sunk cost fallacy: Tendency to continue a losing investment even when abandoning it would be more beneficial
- Survivorship bias: Focusing on things that survived vs. (usually many) things that didn’t
- Tom West’s law: Not everything worth doing is worth doing well
- Two pizza rule: If you can’t feed a team with two pizzas, it’s too large
- Unix philosophy: Make each program do one thing well…
- Via negativa: Concept where things might improve by subtraction or removal
- Watermelon project: Projects that are green on the outside, but red on the inside
- Wheaton’s law: Don’t be a dick
- Wirth’s law: Faster hardware can trigger the development of less-efficient software
- YAGNI: You ain’t gonna need it
- Zeigarnik effect: Tendency to remember unfinished tasks better than completed ones
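
A few of these are concrete enough to sketch in code. Starting with Amdahl’s law: the speedup from adding workers is capped by the serial fraction of the task. A minimal calculator (the function name and example numbers are mine, for illustration):

```python
def amdahl_speedup(parallel_fraction: float, n_workers: int) -> float:
    """Amdahl's law: speedup = 1 / ((1 - p) + p / s), where p is the
    fraction of the task that can be parallelized across s workers."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / n_workers)

# A task that is 95% parallelizable never exceeds a 20x speedup,
# no matter how many workers you add.
print(amdahl_speedup(0.95, 8))     # ~5.9x
print(amdahl_speedup(0.95, 1024))  # ~19.6x
```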
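The base rate fallacy example is just Bayes’ rule, assuming “99% accurate” means 99% sensitivity and 99% specificity:

```python
prevalence = 0.01   # 1% of people actually have the disease
sensitivity = 0.99  # P(positive | disease)
specificity = 0.99  # P(negative | healthy)

true_positives = prevalence * sensitivity               # 0.0099
false_positives = (1 - prevalence) * (1 - specificity)  # 0.0099
p_disease_given_positive = true_positives / (true_positives + false_positives)
print(p_disease_given_positive)  # 0.5 -- half of all positives are false alarms
```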
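The curse of dimensionality can be felt with a few lines of standard-library Python: as dimensions grow, distances between random points concentrate, so “near” and “far” stop meaning much. The helper name and sample sizes here are arbitrary:

```python
import math
import random

def distance_spread(dims: int, n_points: int = 200) -> float:
    """Relative spread of distances from the origin for random points
    in the unit hypercube; shrinks as dimensionality grows."""
    points = [[random.random() for _ in range(dims)] for _ in range(n_points)]
    dists = [math.sqrt(sum(x * x for x in p)) for p in points]
    return (max(dists) - min(dists)) / min(dists)

for d in (2, 10, 100, 1000):
    print(d, round(distance_spread(d), 3))
# The spread collapses as d grows: the nearest and farthest points
# become nearly indistinguishable by distance alone.
```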
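For the dining philosophers problem, one standard way out of the deadlock is to acquire forks in a fixed global order, which breaks the circular wait. A minimal threading sketch (philosopher and meal counts are arbitrary):

```python
import threading

N = 5  # five philosophers, five forks
forks = [threading.Lock() for _ in range(N)]

def philosopher(i: int, meals: int = 3) -> None:
    left, right = i, (i + 1) % N
    # Always pick up the lower-numbered fork first; a consistent
    # global ordering prevents the circular wait that causes deadlock.
    first, second = min(left, right), max(left, right)
    for _ in range(meals):
        with forks[first]:
            with forks[second]:
                print(f"philosopher {i} eats")

threads = [threading.Thread(target=philosopher, args=(i,)) for i in range(N)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```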
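Hick’s law is usually written T = b · log2(n + 1), where the +1 accounts for the decision of whether to respond at all. The coefficient b below is a made-up value just to show the shape of the curve:

```python
import math

def decision_time(n_choices: int, b: float = 0.2) -> float:
    """Hick's law: T = b * log2(n + 1); b is an empirical constant
    (0.2 here is an illustrative stand-in, not a measured value)."""
    return b * math.log2(n_choices + 1)

for n in (1, 3, 7, 15, 31):
    print(n, round(decision_time(n), 2))
# Doubling the options adds a roughly constant increment of decision
# time, rather than doubling it.
```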
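Metcalfe’s law falls out of counting pairwise connections: n users can form n(n − 1)/2 distinct links, which grows roughly as n²:

```python
def potential_links(n_users: int) -> int:
    """Distinct pairwise connections among n users: n * (n - 1) / 2,
    i.e. roughly n^2 -- the intuition behind Metcalfe's law."""
    return n_users * (n_users - 1) // 2

for n in (10, 100, 1000):
    print(n, potential_links(n))  # 45, 4950, 499500: ~100x links per 10x users
```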
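And Simpson’s paradox, using the numbers commonly quoted from the classic kidney-stone study: treatment A wins in both subgroups, yet B looks better when the groups are pooled, because A was disproportionately given the harder (large-stone) cases:

```python
# (successes, total) per treatment and stone size
data = {
    "A": {"small": (81, 87), "large": (192, 263)},
    "B": {"small": (234, 270), "large": (55, 80)},
}

for treatment, groups in data.items():
    for size, (ok, total) in groups.items():
        print(f"{treatment} {size}: {ok / total:.0%}")
    ok = sum(s for s, _ in groups.values())
    total = sum(t for _, t in groups.values())
    print(f"{treatment} overall: {ok / total:.0%}")

# A small: 93%, A large: 73%, A overall: 78%
# B small: 87%, B large: 69%, B overall: 83%  <- the reversal
```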