Matt Glassman

a little information often tells you a lot

  1. A lot of people demand way too much evidence before they will adjust their views. Some of this is related to a desire for certainty, rather than a probabilistic view of the world. But you can get caught up in frequentist statistics and take it way too far. In many cases, conditional probability and/or Bayesian logic are much more useful.

  2. Even conditional on that (lol), most people don’t update their beliefs nearly enough when they get new information. A small amount of information often moves conditional probabilities an extreme amount.

  3. Some conditional probability is easy and obvious. If we walked into a random room of 30 people, the probability any of them could do 500 pushups without stopping is very small; maybe I would lay you 200-1 or more on a bet. But if someone walked up to us randomly and out of nowhere wanted to bet he could do 500 pushups without stopping, we’d be crazy to even take the bet at even money. Conditional on wanting to bet, the probability he can do the pushups goes from very small to quite high. The single piece of data completely swings the probability.

  4. You see this all the time in card games. Sometimes we say a player has good card sense. Everyone knows what that means, but everyone also has trouble defining it. The only definition that has ever made sense to me is that they are very good at conditional probability. I think this is true in many other fields as well. People with good political instincts. Or good business sense. Or good matchmaking ability. It’s all just conditional probability.

  5. A classic example from card games is sizing people up in tournament poker. Poker tournaments move quickly, and you have to start making judgments about players from very limited information. People wait way too long. If you are trying to figure out if someone is loose or tight, seeing them open-raise twice in a row from under-the-gun is a massive signal that they are loose. If you see them make a single limp-call from out of position, that’s a huge sign they are a recreational player.

  6. Bayes’ theorem quantifies this neatly. If you think 85% of tournament players are tight and 15% are loose, and tight players open-raise 10% of hands UTG while loose players raise 50%, then raising twice in a row happens 25% of the time for a loose player (0.50 squared) and 1% of the time for a tight player (0.10 squared). Bayes’ theorem [P(Loose | RaiseTwice) = P(RaiseTwice | Loose) × P(Loose) / P(RaiseTwice)] tells us that someone raising twice in a row UTG is 82% likely to be a loose player.
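The arithmetic above can be checked in a few lines. This is a minimal sketch using the numbers assumed in the post (85/15 priors, 10% and 50% UTG raise rates); the variable names are mine:

```python
# Priors and raise frequencies assumed in the post
p_loose = 0.15               # prior: fraction of players who are loose
p_tight = 0.85               # prior: fraction of players who are tight
p_raise2_loose = 0.50 ** 2   # loose player raises 50% of hands -> 25% twice in a row
p_raise2_tight = 0.10 ** 2   # tight player raises 10% of hands -> 1% twice in a row

# Total probability of observing two UTG raises in a row
p_raise2 = p_raise2_loose * p_loose + p_raise2_tight * p_tight

# Posterior: P(loose | raised twice in a row UTG)
posterior = p_raise2_loose * p_loose / p_raise2
print(round(posterior, 2))  # -> 0.82
```

The denominator is just the base rate of the observation across both player types, which is why Bayes' theorem is, as the next point says, nothing more than a proportion.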

  7. As always, your priors matter. Garbage in, garbage out. But there’s nothing magical about Bayes’ theorem; it’s literally just a proportion that describes the percentage of time you’d see a conditional occur. If your priors are correctly specified, it’s no less valid than any other arithmetic. A loose player raises twice in a row UTG 25% of the time; a tight player just 1% of the time. Observing that behavior massively suggests a loose player.

  8. The same examples carry over to politics. I don’t give people second chances on the internet if they come to a discussion in bad faith. Everyone can have a bad day, but arguing in bad faith even once (especially the first time I interact with them) massively raises the conditional probability that they aren’t worth my time as a future interlocutor, and that I have little to gain or learn from talking to them.

  9. You can and should train yourself to act on less information. In card games and in life. Of course, if you need certainty, this doesn’t apply. But many decisions don’t require certainty or, worse, have time-limited features that turn the need for certainty into missed opportunity.