Bayesian statistics for beginners : a step-by-step approach

Bayesian Statistics for Beginners is an entry-level book on Bayesian statistics. It is like no other math book you’ve read. It is written for readers who do not have advanced degrees in mathematics and who may struggle with mathematical notation, yet need to understand the basics of Bayesian inference ...


Bibliographic Details
Main Authors: Donovan, Therese M. (Therese Marie); Mickey, Ruth M.
Format: eBook
Language: English
Published: Oxford: Oxford University Press, 2019
Edition: 1
ISBN: 0198841302; 9780198841296; 0198841299; 9780198841302
DOI: 10.1093/oso/9780198841296.001.0001


Table of Contents:
  • Cover -- Bayesian Statistics for Beginners: A Step-by-Step Approach -- Copyright -- Dedication -- Preface -- Contents -- SECTION 1. Basics of Probability -- CHAPTER 1. Introduction to Probability -- What is probability? -- Should you play? -- How can we get a good estimate of Pr(four) for this particular die? -- Is one roll good enough? -- What would we expect if the die were fair? -- How would you change the table and probability distribution if the die were loaded in favor of a four? -- What would the probability distribution be for the bet? -- Do Bayesians think of probability as long-run averages? -- What's next? -- CHAPTER 2. Joint, Marginal, and Conditional Probability -- What is an eyeball event? -- Why is it called a Venn diagram? -- What is the probability that a person in universe U is in group A? -- What about people who are not in group A? -- I'm sick of eyeballs. Can we consider another characteristic? -- Can we look at both characteristics simultaneously? -- Is it possible to have Morton's toe AND be a lefty? -- Is it possible NOT to have Morton's toe if you are a lefty? -- What if five lefties also have Morton's toe? -- Of the four events (A, ~A, B, and ~B), which are not mutually exclusive? -- Are any events mutually exclusive? -- If you were one of the lucky 100 people included in the universe, where would you fall in this diagram? -- What does this have to do with probability? -- What is the probability that a person selected at random is a righty and has Morton's toe? -- What does the word "marginal" mean? -- Can you fill in the empty cells in Table 2.7? -- Quickly: What is the marginal probability of having Morton's toe with this conjoint table? -- Can you express the marginal probability of having Morton's toe as the sum of joint probabilities? -- Can we look at this problem from the Venn diagram perspective again?
  • If you have Morton's toe, does that influence your probability of being a lefty? -- What is conditional probability? -- How exactly do you calculate the probability that a person is a lefty, given the person has Morton's toe? -- So if you have Morton's toe, does that influence your probability of being a lefty? -- Does Pr(A | B) = Pr(B | A)? -- Can you calculate the conditional probability of being a lefty, given you have Morton's toe, from our conjoint table instead of the raw numbers? -- Can you calculate the conditional probability of having Morton's toe, given you are a lefty, from our conjoint table? -- If we know the conditional and marginal probabilities, can we calculate the joint probabilities? -- Are Pr(A|B) and Pr(B|A) related in some way? -- SECTION 2. Bayes' Theorem and Bayesian Inference -- CHAPTER 3. Bayes' Theorem -- First, who is Bayes? -- Is that really a picture of Thomas Bayes in Figure 3.1? -- OK, what exactly is Bayes' Theorem? -- What does this have to do with Bayes' Theorem? -- What is so remarkable about this? -- If you have a member of B, what is the probability that he/she is also a member of A? -- So, when would we need to use Bayes' Theorem? -- Is that all there is to it? -- CHAPTER 4. Bayesian Inference -- What exactly is science? -- How do we go about actually conducting science? -- How on earth did Thomas Bayes make a connection between probability and scientific inference? -- What is Bayesian inference? -- How does Bayesian inference work? -- How can we turn this into a Bayesian inference problem? -- Is there a pattern in the denominator of this new version? -- Does anything else about this equation strike you as notable? -- So, why all the fuss? -- How does this relate to science? -- OK, what exactly is the difference between the two interpretations of Bayes' Theorem? -- What if there are more than two hypotheses?
  • One more time ... what is Bayesian inference again? -- What if I collect more data? -- What other sorts of questions have been tackled using Bayesian inference approaches? -- CHAPTER 5. The Author Problem: Bayesian Inference with Two Hypotheses -- What is step 1? -- What is step 2? -- What is step 3? -- What is step 4? -- How exactly do we compute the likelihood? -- Which of the two hypotheses more closely matches the observed rate? -- If likelihood is a probability, how do we quantify this "consistency" in terms of probability? -- What is step 5? -- Where are the priors in this equation? -- Where is the posterior probability of the Hamilton hypothesis in this equation? -- Where are the likelihoods of the observed data under each hypothesis in this equation? -- So, what is the posterior probability of the Hamilton hypothesis? -- How do we set the prior probabilities? -- What if we found more papers known to be authored by Hamilton and Madison? -- Do the likelihoods of the data have to add to 1.0? -- Did Mosteller and Wallace really use this approach? -- Can we summarize this problem? -- How does this problem differ from the Breast Cancer Problem in the last chapter? -- CHAPTER 6. The Birthday Problem: Bayesian Inference with Multiple Discrete Hypotheses -- Should Bobbie and Reggie use an informative prior? -- What data do we need then? -- Is the divisor of Bayes' Theorem always a constant? -- What if the non-informative prior were used instead of the "When You Were Born" prior? -- So, the choice of the prior really affects the results? -- Are there other times when the prior drives the results? -- What is so tricky about setting the prior? -- I've heard the terms "objective" and "subjective" with reference to Bayesian analysis. What do these mean? -- What really happened to Bobbie and Mary? -- Isn't that nice?
  • CHAPTER 7. The Portrait Problem: Bayesian Inference with Joint Likelihood -- Who won the bet? -- Why did men wear wigs in the 1700s? -- So, how can we determine the probability that the man in the photo is Thomas Bayes? -- Paint? -- And how can lead white help us with dating the Thomas Bayes portrait? -- Great! Can we get started? -- Step 1. What are the hypotheses? -- Step 2. What are the prior probabilities that each hypothesis is true? -- Step 3. What are the data? -- OK, then. What are the observed data with respect to wigs? -- And what are the observed data with respect to similarity? -- So what is our final dataset for step 3? -- Step 4. What is the likelihood of the observed data under each hypothesis? -- Should we start with wigs? -- What about similarity between "Thomas Bayes" and Joshua Bayes? -- How do we calculate the likelihood under each hypothesis? -- OK, then, what is the likelihood of observing a similarity score of 55 or greater under each hypothesis? -- So how do we combine both results into one likelihood for each hypothesis? -- Step 5. What is the posterior probability that the portrait is of Thomas Bayes? -- Are the two pieces of information really independent? -- Can we add on more independent pieces of information? -- What if our information is not independent? -- Are there any assumptions in this analysis? -- What is the main take-home point for this chapter? -- Looking back at the portrait, who was Barrett, developer of the columnar method? -- What's next? -- SECTION 3. Probability Functions -- CHAPTER 8. Probability Mass Functions -- What is a function? -- What is a random variable? -- Can you show an example? -- Is a random variable a function? -- Where do we go from here? -- What is the probability of observing y = 3 heads?
  • How do we move from the probability of a given value of Y to the probability distribution for all possible values of Y? -- Is this an example of a probability distribution? -- Is this also a probability mass function? -- What if we had flipped the coin 10 times? -- Really? -- OK, what does "binomial" mean? -- When do we use binomial probability? -- What does the binomial probability mass function look like? -- What notation should I use to describe a binomial process like coin flipping? -- What is a binomial distribution? -- How about the probability of observing 2.5 heads out of 3 coin flips, given that the coin is fair? -- What is a parameter? -- What are the assumptions of the binomial probability mass function? -- Are there other probability mass functions besides the binomial? -- What do all of these functions have in common? -- All right then ... what is the Bernoulli distribution? -- Likelihood -- OK, what exactly is likelihood? -- Why is this important? -- Are there any other key points to bear in mind regarding likelihood? -- Can we quickly confirm that the likelihoods do not need to sum to 1.0 here? -- How would this be used in a Bayesian inference problem? -- Can we depict this problem graphically? -- Can we compare this problem with the authorship problem? -- What if we considered all possible hypotheses for p between 0 and 1 instead of just two specific hypotheses? -- Can we summarize the main points of this chapter? -- OK, what's next? -- CHAPTER 9. Probability Density Functions -- What is a function? -- Can you give me an example of a continuous random variable? -- What is the probability that a bacterium lives exactly 5 hours? -- So, what can we do? -- Can we see an example of a probability density function? -- I see ... and what distribution would result from this pdf? -- Why is the density 0.5 in this example? -- Can we formally define a uniform pdf?
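The chapter questions above sketch a complete workflow: a binomial probability mass function (Chapter 8) supplies the likelihoods, and the five-step recipe of Chapter 5 turns priors and likelihoods into posterior probabilities for two hypotheses. A minimal Python sketch of that workflow follows; every number in it (the equal priors, the two hypothesized rates, and the observed counts) is hypothetical and chosen for illustration only, not taken from the book or from Mosteller and Wallace's analysis.

    from math import comb

    def binomial_pmf(y, n, p):
        """Pr(y successes in n trials, given success probability p)."""
        return comb(n, y) * p**y * (1 - p)**(n - y)

    # Steps 1-2: two hypotheses with a non-informative (equal) prior.
    priors = {"H1": 0.5, "H2": 0.5}
    rates = {"H1": 0.10, "H2": 0.30}  # hypothesized rates (illustrative)

    # Step 3: the observed data, e.g. y = 2 successes in n = 10 trials.
    y, n = 2, 10

    # Step 4: likelihood of the observed data under each hypothesis.
    likelihoods = {h: binomial_pmf(y, n, p) for h, p in rates.items()}

    # Step 5: Bayes' Theorem. The denominator (the "pattern in the
    # denominator" of Chapter 4) is the sum of prior * likelihood over
    # all hypotheses, which normalizes the posteriors to sum to 1.
    evidence = sum(priors[h] * likelihoods[h] for h in priors)
    posteriors = {h: priors[h] * likelihoods[h] / evidence for h in priors}

    print(posteriors)  # roughly {'H1': 0.45, 'H2': 0.55}

Note that the likelihoods themselves need not sum to 1.0 (one of the Chapter 8 questions); only the posteriors do, because of the normalizing denominator.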