Exercises

  1. This exercise is about regularization priors. In the code that generates the data, change order=2 to another value, such as order=5. Then, fit model_p and plot the resulting curve. Repeat this, but now using a prior for beta with sd=100 instead of sd=1, and plot the resulting curve. How do the two curves differ? Try this out with sd=np.array([10, 0.1, 0.1, 0.1, 0.1]), too.
  2. Repeat the previous exercise but increase the amount of data to 500 data points.
  3. Fit a cubic model (order 3), compute WAIC and LOO, plot the results, and compare them with the linear and quadratic models.
  4. Use pm.sample_posterior_predictive() to rerun the PPC example, but this time, plot the values of y instead of the values of the mean.
  5. Read and run the posterior predictive example from PyMC3's documentation at https://pymc-devs.github.io/pymc3/notebooks/posterior_predictive.html. Pay special attention to the use of the Theano shared variables.
  6. Go back to the code that generated Figure 5.5 and Figure 5.6, and modify it to get new sets of six data points. Visually evaluate how the different polynomials fit these new datasets. Relate the results to the discussions in this book.
  7. Read and run the model averaging example from PyMC3's documentation at https://docs.pymc.io/notebooks/model_averaging.html.
  8. Compute the Bayes factor for the coin problem using a uniform beta(1, 1) prior and priors such as beta(0.5, 0.5). Use 15 heads out of 30 tosses. Compare this result with the inference we got in the first chapter of this book.
  9. Repeat the last example where we compare Bayes factors and Information Criteria, but now reduce the sample size.
  10. For the entropy example, change the q distribution. Try this with distributions like stats.binom(10, 0.5) and stats.binom(10, 0.25).
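As a starting point for exercise 8, the Bayes factor for the beta-binomial coin model can be computed analytically, since the marginal likelihood has a closed form. This is only a sketch, not the book's PyMC3-based solution; the function name log_marginal is our own, and the binomial coefficient is dropped because it cancels in the ratio:

```python
import numpy as np
from scipy.special import betaln

def log_marginal(heads, tosses, a, b):
    # Log marginal likelihood of the beta-binomial model with a beta(a, b)
    # prior, omitting the binomial coefficient (it cancels in the ratio).
    return betaln(heads + a, tosses - heads + b) - betaln(a, b)

heads, tosses = 15, 30
bf = np.exp(log_marginal(heads, tosses, 1, 1)
            - log_marginal(heads, tosses, 0.5, 0.5))
print(f"BF, beta(1, 1) vs beta(0.5, 0.5): {bf:.3f}")
```

With 15 heads out of 30 tosses the ratio comes out close to 1, so neither prior is strongly favored by the data; comparing this against a sampling-based estimate is a useful sanity check.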
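For exercise 10, the KL divergence between two discrete distributions can be computed directly from their pmfs; scipy's stats.entropy returns the KL divergence when given two probability vectors. A minimal sketch, assuming the two binomials suggested in the exercise:

```python
import numpy as np
from scipy import stats

p_dist = stats.binom(10, 0.5)   # the reference distribution p
q_dist = stats.binom(10, 0.25)  # a candidate approximation q

x = np.arange(0, 11)            # full support of binom(10, .)
p = p_dist.pmf(x)
q = q_dist.pmf(x)

# stats.entropy(p, q) computes KL(p || q) = sum p * log(p / q)
kl = stats.entropy(p, q)
print(f"KL(p || q) = {kl:.3f}")
```

Swapping in other q distributions shows how the divergence grows as q moves away from p, and that it drops to zero when q equals p.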