# Chapter 03, single-parameter models
## prepare
```
install.packages("rmarkdown")
install.packages("LearnBayes")  # the examples below use LearnBayes datasets and functions
```
- Normal distribution with known mean, unknown variance
==> example: American football scores. Let d be the difference between the game outcome and the published point spread, with observations $d_1, \dots, d_n \sim N(0, \sigma^2)$.

Likelihood: $L(\sigma^2) \propto (\sigma^2)^{-n/2} \exp\!\left(-\sum_{i=1}^n d_i^2 / (2\sigma^2)\right)$

Prior: $p(\sigma^2) \propto 1/\sigma^2$
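Combining the likelihood and prior (standard conjugate algebra, written out here since the notes jump straight to simulation): with $v = \sum_{i=1}^n d_i^2$,

$$
p(\sigma^2 \mid d) \propto (\sigma^2)^{-(n/2 + 1)} \exp\!\left(-\frac{v}{2\sigma^2}\right),
\qquad \text{equivalently} \qquad
\frac{v}{\sigma^2} \,\Big|\, d \sim \chi^2_n .
$$

This is why the code below can simulate the precision $1/\sigma^2$ by drawing $\chi^2_n$ values and dividing by $v$.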
```
require(LearnBayes)
data(footballscores)
attach(footballscores)
d = favorite - underdog - spread  # game outcome minus point spread
n = length(d)
v = sum(d^2)                      # sufficient statistic: sum of squared differences
```
Step 1: simulate values of the precision $1/\sigma^2$ from the scaled chi-square(n) distribution.
Step 2: transform to get draws from the posterior distribution of the standard deviation $\sigma$.
Step 3: plot a histogram of the simulated samples.
```
P = rchisq(1000, n) / v                     # 1000 draws of the precision 1/sigma^2
s = sqrt(1 / P)                             # transform to the standard deviation sigma
hist(s, main = "")
quantile(s, probs = c(0.025, 0.5, 0.975))   # posterior median and 95% interval
```
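As a quick sanity check (my addition, not in the original notes): since $n$ is large here, the posterior median of $\sigma$ should land near the rough point estimate $\sqrt{v/n}$.

```
sqrt(v / n)  # rough point estimate of sigma; compare with the posterior median above
```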
==> example 2: heart transplant mortality rate
Model: y ~ Poisson(e * lambda), where e is the hospital's exposure and lambda is the mortality rate per unit exposure.
The maximum-likelihood estimate y/e is unreliable here because the observed death counts are so small, so instead we borrow strength from a small group of hospitals believed to have outcomes similar to the hospital we want to model.
The prior for lambda is gamma(alpha, beta), with alpha and beta estimated from that small-hospital sample. Given yobs ~ Poisson(ex * lambda), both the posterior density of lambda and the prior predictive density of y have closed forms.
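Spelled out (standard gamma-Poisson conjugacy; notation mine, not from the original notes):

$$
\lambda \mid y \sim \text{Gamma}(\alpha + y,\ \beta + e),
\qquad
p(y) = \frac{f(y \mid \lambda)\, g(\lambda)}{g(\lambda \mid y)} \quad \text{for any } \lambda,
$$

where $f$ is the Poisson($e\lambda$) likelihood and $g$ is the gamma density. The second identity is just Bayes' rule rearranged; the code below evaluates it at the prior mean $\lambda = \alpha/\beta$, which is exactly what the `dpois * dgamma / dgamma` line computes.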
```
alpha = 16; beta = 15174          # gamma prior fitted from the small-hospital sample
yobs = 1; ex = 66                 # hospital A: 1 death, exposure 66
y = 0:10
lam = alpha / beta                # prior mean; the identity below holds at any lambda
py = dpois(y, lam * ex) * dgamma(lam, shape = alpha, rate = beta) /
  dgamma(lam, shape = alpha + y, rate = beta + ex)   # prior predictive density of y
cbind(y, round(py, 3))
## posterior draws for hospital A
lambdaA = rgamma(1000, shape = alpha + yobs, rate = beta + ex)
ex = 1767; yobs = 4               # hospital B: 4 deaths, exposure 1767
y = 0:10
py = dpois(y, lam * ex) * dgamma(lam, shape = alpha, rate = beta) /
  dgamma(lam, shape = alpha + y, rate = beta + ex)
cbind(y, round(py, 3))
## posterior draws for hospital B
lambdaB = rgamma(1000, shape = alpha + yobs, rate = beta + ex)
### check the prior's influence on the posterior
par(mfrow = c(2, 1))
plot(density(lambdaA), main = "HOSPITAL A", xlab = "lambdaA", lwd = 3)
curve(dgamma(x, shape = alpha, rate = beta), add = TRUE)
legend("topright", legend = c("prior", "posterior"), lwd = c(1, 3))
plot(density(lambdaB), main = "HOSPITAL B", xlab = "lambdaB", lwd = 3)
curve(dgamma(x, shape = alpha, rate = beta), add = TRUE)
legend("topright", legend = c("prior", "posterior"), lwd = c(1, 3))
```
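To see the shrinkage numerically (a small addition, not in the book's code), compare posterior summaries for the two hospitals: hospital A has little exposure, so its posterior stays close to the prior, while hospital B's larger exposure lets its own data move the posterior more.

```
quantile(lambdaA, c(0.025, 0.5, 0.975))  # stays near the prior mean alpha/beta
quantile(lambdaB, c(0.025, 0.5, 0.975))  # pulled further toward yobs/ex
```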
==> robustness of Bayesian analysis to the choice of prior
Example: estimate the IQ of a person.
Test scores are $y_i \sim N(\theta, \sigma^2)$ with $\sigma$ known.
==> with a normal prior, we get a normal posterior.
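The conjugate update used in the next block is the standard precision-weighted formula, with $se = \sigma/\sqrt{n}$:

$$
\tau_1^2 = \left(\frac{1}{se^2} + \frac{1}{\tau^2}\right)^{-1},
\qquad
\mu_1 = \tau_1^2 \left(\frac{\bar y}{se^2} + \frac{\mu}{\tau^2}\right).
$$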
```
quantile1 = list(p = 0.5, x = 100)    # prior median 100
quantile2 = list(p = 0.95, x = 120)   # prior 95th percentile 120
prior.param = normal.select(quantile1, quantile2)
mu = prior.param$mu
tau = prior.param$sigma
sigma = 15                            # known sampling sd
n = 4                                 # number of test scores
se = sigma / sqrt(n)
ybar = c(110, 125, 140)               # three hypothetical observed means
tau1 = 1 / sqrt(1 / se^2 + 1 / tau^2)
mu1 = (ybar / se^2 + mu / tau^2) * tau1^2
summ1 = cbind(ybar, mu1, tau1)
summ1
```
==> another prior: a symmetric t density with location mu, 2 degrees of freedom, and scale chosen to match the same two quantiles.
The posterior does not have a convenient form, so we construct a grid of theta values that covers the posterior density, compute the product of likelihood and prior at each grid point, and normalize.
This discrete distribution over the grid is then used to compute the posterior mean and standard deviation.
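In symbols (just restating the grid recipe above): on a grid $\theta_1, \dots, \theta_m$,

$$
p_i = \frac{g(\theta_i)\, L(\theta_i)}{\sum_j g(\theta_j)\, L(\theta_j)},
\qquad
\hat\mu = \sum_i \theta_i\, p_i,
\qquad
\hat\sigma = \sqrt{\sum_i \theta_i^2\, p_i - \hat\mu^2},
$$

where $g$ is the t prior and $L$ the normal likelihood.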
```
t.df = 2
tscale = (quantile2[[2]] - quantile1[[2]]) / qt(0.95, t.df)  # match the prior's 95th-percentile spread
tscale
## compare the t and normal priors
par(mfrow = c(1, 1))
curve(1 / tscale * dt((x - mu) / tscale, 2), from = 60, to = 140,
      xlab = "theta", ylab = "Prior Density")
curve(dnorm(x, mean = mu, sd = tau), add = TRUE, lwd = 3)
legend("topright", legend = c("t density", "normal density"), lwd = c(1, 3))
norm.t.compute = function(ybar) {
  theta = seq(60, 180, length = 500)                # grid over theta
  like = dnorm(theta, mean = ybar, sd = sigma / sqrt(n))
  prior = dt((theta - mu) / tscale, 2)              # unnormalized t prior
  post = prior * like
  post = post / sum(post)                           # normalize on the grid
  m = sum(theta * post)                             # posterior mean
  s = sqrt(sum(theta^2 * post) - m^2)               # posterior sd
  c(ybar, m, s)
}
summ2 = t( sapply( ybar, norm.t.compute))
dimnames(summ2)[[2]] = c("ybar", "mu1 t", "tau1 t")
summ2
cbind( summ1, summ2)
### compare posteriors for the extreme observation ybar = 140
theta = seq(60, 180, length = 500)
normpost = dnorm(theta, mu1[3], tau1)
normpost = normpost / sum(normpost)
plot(theta, normpost, type = "l", lwd = 3, ylab = "Posterior Density")
like = dnorm(theta, mean = 140, sd = sigma / sqrt(n))
prior = dt((theta - mu) / tscale, 2)
tpost = prior * like / sum(prior * like)
lines(theta, tpost)
legend("topright", legend = c("t prior", "normal prior"), lwd = c(1, 3))
```
==> 3.5 mixture of conjugate priors
1. mixture of 2 beta distributions as the prior
```
probs = c(0.5, 0.5)                    # prior mixing weights
beta.par1 = c(6, 14)
beta.par2 = c(14, 6)
betapar = rbind(beta.par1, beta.par2)
data = c(7, 3)                         # 7 successes, 3 failures
post = binomial.beta.mix(probs, betapar, data)
post
## posterior: mixture of beta(6+7, 14+3) and beta(14+7, 6+3)
curve(post$probs[1] * dbeta(x, 13, 17) + post$probs[2] * dbeta(x, 21, 9),
      from = 0, to = 1, lwd = 3, xlab = "P", ylab = "DENSITY")
curve(0.5 * dbeta(x, 6, 14) + 0.5 * dbeta(x, 14, 6), 0, 1, add = TRUE)
legend( "topleft", legend = c("Prior", "Posterior"), lwd = c(1,3))
```
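Why the posterior mixing weights change (standard mixture-of-conjugates result, added here for reference): with prior $\sum_j p_j\, \text{Beta}(a_j, b_j)$ and $y$ successes in $n$ trials, each component updates conjugately and is reweighted by how well it predicted the data:

$$
g(p \mid y) = \sum_j \tilde p_j\, \text{Beta}(a_j + y,\ b_j + n - y),
\qquad
\tilde p_j \propto p_j\, m_j(y),
$$

where $m_j(y)$ is the beta-binomial prior predictive density of $y$ under component $j$. Here the data (7 of 10 successes) favor the Beta(14, 6) component, so its weight in `post$probs` grows.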
==> Bayesian test of the fairness of a coin
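The logic, spelled out (this is what the code computes; notation mine): put prior probability 1/2 on the coin being exactly fair ($p = 0.5$) and spread the other 1/2 over $p \sim \text{Beta}(a, a)$. The posterior probability of fairness is then

$$
\lambda = \frac{f(y \mid p = 0.5)}{f(y \mid p = 0.5) + m_1(y)},
\qquad
m_1(y) = \frac{f(y \mid p)\, g(p)}{g(p \mid y)},
$$

where $f$ is the binomial likelihood and $m_1(y)$ is the marginal density of $y$ under the Beta(a, a) prior, computed by the same rearranged-Bayes'-rule trick as in the hospital example (the two prior probabilities of 1/2 cancel).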
```
2 * min(pbinom(5, 20, 0.5), 1 - pbinom(5, 20, 0.5))  # two-sided frequentist p-value
n = 20; y = 5   # observed data: 5 heads in 20 tosses
a = 10          # beta(a, a) prior for p under the "not fair" hypothesis
p = 0.5         # fairness value being tested
m1 = dbinom(y, n, p) * dbeta(p, a, a) / dbeta(p, a + y, a + n - y)  # marginal density of y under the beta prior
lambda = dbinom(y, n, p) / (dbinom(y, n, p) + m1)  # posterior prob the coin is fair
lambda
require(LearnBayes)
pbetat(p, 0.5, c(a, a), c(y, n - y))  # same computation via LearnBayes
### sensitivity of the posterior probability of fairness to the prior parameter a
prob.fair = function(log.a) {
  a = exp(log.a)
  m2 = dbinom(y, n, p) * dbeta(p, a, a) / dbeta(p, a + y, a + n - y)
  dbinom(y, n, p) / (dbinom(y, n, p) + m2)
}
n = 20; y = 5; p = 0.5
curve(prob.fair(x), from = -4, to = 5, xlab = 'log a', ylab = "Prob(coin is fair)", lwd = 2)
### Bayesian test based on the tail event y <= 5
n = 20; y = 5
a = 10
p = 0.5
m2 = 0   # marginal probability of the event {Y <= y} under the beta(a, a) prior
for (k in 0:y) {
  m2 = m2 + dbinom(k, n, p) * dbeta(p, a, a) / dbeta(p, a + k, a + n - k)
}
lambda = pbinom(y, n, p) / (pbinom(y, n, p) + m2)  # posterior prob of fairness given Y <= 5
lambda
### END of chapter ----------
```