There are two possible causes of slow convergence or non-convergence:
1. The model itself is inappropriate.
2. The algorithm is inappropriate.
I guess your model may look like this:
<br />
Model 1:<br />
y[i] | s[i] ~ Gamma(df/2, s[i]/2)<br />
s[i] | b ~ \sum_{j=1}^K b[j] Inv-Gamma(somegivenparameter_j,somegivenparameter_j)<br />
If I had to fit this model, I would augment the data space by introducing "mixture indicators", i.e., treat the indicators as additional "parameters".
<br />
Let x = (x_1, ..., x_n), where n is the number of observations.<br />
<br />
Model 2:<br />
y[i] | s[i] ~ Gamma(df/2, s[i]/2)<br />
s[i] | x[i] = j ~ Inv-Gamma(para_j,para_j), x[i] denotes the mixture indicator, j = 1,...,K. Given x[i], s[i] has an inverse gamma distribution.<br />
Prob(x[i]=j)=b[j], j = 1,...,K <br />
<br />
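To make Model 2 concrete, here is a minimal sketch that simulates data from it (Python/numpy; the particular values of df, b, and para below are placeholders I picked, not anything from your setup):

```python
import numpy as np

rng = np.random.default_rng(0)

K = 3                              # number of mixture components
n = 1000                           # number of observations
df = 5.0                           # Gamma shape parameter is df/2
b = np.array([0.5, 0.3, 0.2])      # mixture weights, sum to 1
para = np.array([2.0, 4.0, 8.0])   # Inv-Gamma(para_j, para_j) parameters

# x[i] ~ Multinomial(1, b): latent mixture indicators
x = rng.choice(K, size=n, p=b)

# s[i] | x[i]=j ~ Inv-Gamma(para_j, para_j); if G ~ Gamma(a, rate=c)
# then 1/G ~ Inv-Gamma(a, c), and numpy's scale parameter is 1/rate
s = 1.0 / rng.gamma(shape=para[x], scale=1.0 / para[x])

# y[i] | s[i] ~ Gamma(df/2, rate=s[i]/2); again scale = 1/rate
y = rng.gamma(shape=df / 2, scale=2.0 / s)
```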
Sampling s[i]:<br />
the conditional posterior of s[i] is p(s[i] | ...) \propto p(y[i] | s[i], ...) * prior(s[i] | x[i]=j, ...).<br />
<br />
Since the s[i] are conditionally independent given the indicators, this augmentation should not hurt the convergence of the MCMC algorithm. However, a Gibbs sampler may not apply directly to s[i], since the Inv-Gamma prior may not be conjugate to the Gamma likelihood (not 100% sure). You don't have this concern if you are using WinBUGS.<br />
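If the s[i] update is indeed non-conjugate, a simple fallback is a Metropolis-within-Gibbs step, e.g. a random walk on log s[i]. A minimal sketch, assuming the Gamma(df/2, rate=s/2) likelihood and Inv-Gamma(a, a) prior above (the function names and step size are mine):

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(s, y, df, a):
    """Unnormalized log of p(y | s) * InvGamma(s; a, a), keeping terms in s."""
    if s <= 0:
        return -np.inf
    log_lik = (df / 2) * np.log(s / 2) - (s / 2) * y   # Gamma(df/2, rate=s/2)
    log_prior = -(a + 1) * np.log(s) - a / s           # Inv-Gamma(a, a) kernel
    return log_lik + log_prior

def mh_step_log_scale(s, y, df, a, step=0.5):
    """One random-walk Metropolis step on log(s); the Jacobian of the
    log transform contributes log(s') - log(s) to the acceptance ratio."""
    s_prop = s * np.exp(step * rng.normal())
    log_ratio = (log_target(s_prop, y, df, a) + np.log(s_prop)
                 - log_target(s, y, df, a) - np.log(s))
    return s_prop if np.log(rng.uniform()) < log_ratio else s
```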
<br />
Sampling x[i]:<br />
The conditional posterior of x[i], given the observations y and the mixture weights b, is a Multinomial(1, p) distribution with p[j] proportional to the weight below, so we can easily sample x.<br />
<br />
Prob(x[i]=j | y, ...) \propto b[j] * p(y[i] | ...)<br />
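In the augmented model, y[i] depends on x[i] only through s[i], so one convenient form of this update conditions on the current s[i]: Prob(x[i]=j | s[i], b) \propto b[j] * InvGamma(s[i]; para_j, para_j). A sketch with the usual log-scale stabilization (numpy; the function name is mine):

```python
import math
import numpy as np

rng = np.random.default_rng(2)

def sample_indicator(s_i, b, para):
    """Draw x[i] from p(x[i]=j | s[i], b) proportional to
    b[j] * InvGamma(s_i; para_j, para_j)."""
    log_w = np.array([
        math.log(bj)
        + a * math.log(a) - math.lgamma(a)       # Inv-Gamma normalizing constant
        - (a + 1) * math.log(s_i) - a / s_i      # Inv-Gamma(a, a) log kernel
        for bj, a in zip(b, para)
    ])
    w = np.exp(log_w - log_w.max())              # subtract max to avoid underflow
    return rng.choice(len(b), p=w / w.sum())
```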
<br />
Sampling b[j]?<br />
The next question is how to sample b, if that is necessary. Generally, we place a prior on b: assume b ~ Dirichlet(a_1, ..., a_K) and let a = (a_1, ..., a_K).<br />
We need to specify the hyperparameters a. However, to be honest, I have no idea about how to sample b :( Maybe we should discuss it later.<br />
Oh, you are indeed putting a prior on b:
b ~ Dirichlet(1)<br />
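As a hint for the b-sampling question above: with a Dirichlet prior, the full conditional of b given the indicators x is again Dirichlet by Dirichlet-multinomial conjugacy, b | x ~ Dirichlet(a_1 + n_1, ..., a_K + n_K), where n_j = #{i : x[i] = j}. A minimal sketch (numpy; names are mine):

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_b(x, K, a):
    """Draw b | x ~ Dirichlet(a + counts), using conjugacy of the
    Dirichlet prior with the multinomial indicator model."""
    counts = np.bincount(x, minlength=K)   # n_j = number of i with x[i] = j
    return rng.dirichlet(a + counts)
```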
Give data augmentation a try; it is a very basic and very effective technique for mixture distributions.
That should solve your problem.
If anything is unclear, we can discuss it further.
:)