Description
What is the statistical model that this should implement?
```julia
@model function f_indices()
    b ~ Bernoulli()
    x = Vector{Float64}(undef, b ? 1 : 2)
    if b
        x[1] ~ Normal(1.0)
    else
        x[1] ~ Normal(1.0)
        x[2] ~ Normal(2.0)
    end
end
```
Some options:

1. Nothing. The model should be invalid.

2. A model for which the domain of `x[2]` is `Union(Reals, {N/A})` and the joint is

   ```
   p(b, x1, x2) = p(b) p(x1) p(x2 | b)
   ```

   where

   ```
   p(b) = Bernoulli
   p(x1) = Normal(1.0)
   ```

   and the distribution `p(x2 | b)` is such that its density function is

   ```
   pdf(p(x2 | b == true))(y) = y == N/A ? 1.0 : 0.0
   pdf(p(x2 | b == false))(y) = y == N/A ? 0.0 : pdf(Normal(2.0))(y)
   ```

3. A model like the above, but where the domain of `x[2]` is Reals and

   ```
   pdf(p(x2 | b))(y) = b ? 1.0 : pdf(Normal(2.0))(y)
   ```

   This is what we currently do. I'm not convinced that this is a valid statistical model, though.
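The doubt about option 3 can be made concrete: a density must integrate to 1, but the constant 1.0 it assigns to `x[2]` when `b` is true integrates to the length of whatever interval you integrate over, so it diverges on all of Reals. A minimal numerical sketch (in plain Python rather than Julia, with an ad hoc midpoint-rule integrator, purely for illustration):

```python
import math

def pdf_x2_given_b(y, b):
    # Option 3: pdf(p(x2 | b))(y) = b ? 1.0 : pdf(Normal(2.0))(y)
    if b:
        return 1.0
    return math.exp(-0.5 * (y - 2.0) ** 2) / math.sqrt(2 * math.pi)

def integrate(f, lo, hi, n=100_000):
    # crude midpoint rule, good enough to show (non-)normalization
    h = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * h) for i in range(n)) * h

# b == false: a proper Normal(2, 1) density; integrates to ~1.
print(integrate(lambda y: pdf_x2_given_b(y, False), -10.0, 14.0))

# b == true: the "density" integrates to the interval length, not 1,
# and grows without bound as the interval grows.
print(integrate(lambda y: pdf_x2_given_b(y, True), -10.0, 10.0))   # 20
print(integrate(lambda y: pdf_x2_given_b(y, True), -100.0, 100.0)) # 200
```

So under option 3 the conditional for `x[2]` given `b == true` is not a probability density at all, which is the heart of the concern.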
Other similar models
```julia
@model function f_product()
    b ~ Bernoulli()
    if b
        dists = [Normal(1.0)]
    else
        dists = [Normal(1.0), Normal(2.0)]
    end
    x ~ product_distribution(dists)
end
```
This currently seems to implement option 2 from above.
```julia
@model function f_addinf()
    b ~ Bernoulli()
    x = Vector{Float64}(undef, b ? 1 : 2)
    if b
        x[1] ~ Normal(1.0)
    else
        if length(__varinfo__[@varname(x)]) == 1
            @addlogprob! -Inf
        end
        x[1] ~ Normal(1.0)
        x[2] ~ Normal(2.0)
    end
end
```
This, too, seems to implement something like option 2 from above, in a different manner. `f_product` errors if you evaluate it with a VarInfo where `b` is false and `x` only has one element, whereas `f_addinf` executes but rejects all such samples.
Sampling
Note that we don't know how to sample from `f_product` or `f_addinf`, or more generally from anything that implements option 2 from above. The only samplers I know of that can handle both discrete and continuous parameters are MH and Gibbs. Gibbs will never work for this: if `x` is in the acceptable range for some value of `b`, then you can't get a valid sample by changing just `b` or just the range of `x`; you'd have to change both at once, and that is exactly what Gibbs doesn't do. MH could in principle work, but we would need a proposal distribution that allows going from `x` being length 1 to `x` being length 2, and such a jump would have to be made at the same time as changing the value of `b`.
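As a sketch of what such a joint move could look like, here is a minimal trans-dimensional MH loop in plain Python (not Julia/Turing). The target matches the models above (`b ~ Bernoulli(0.5)`, `x[1] ~ Normal(1, 1)`, and `x[2] ~ Normal(2, 1)` only when `b` is false); the move structure, flipping `b` while simultaneously adding or dropping `x[2]` with the prior of `x[2]` as the birth proposal, is my own assumption and is essentially a special case of reversible-jump MCMC:

```python
import math
import random

def log_normal_pdf(y, mu):
    # log density of Normal(mu, 1)
    return -0.5 * (y - mu) ** 2 - 0.5 * math.log(2 * math.pi)

def log_joint(b, x):
    # b ~ Bernoulli(0.5); x[0] ~ Normal(1, 1); if not b, also x[1] ~ Normal(2, 1)
    lp = math.log(0.5) + log_normal_pdf(x[0], 1.0)
    if not b:
        lp += log_normal_pdf(x[1], 2.0)
    return lp

def mh_step(b, x, rng):
    if rng.random() < 0.5:
        # Dimension-changing move: flip b AND add/drop x[1] in one proposal.
        # The birth move (b: true -> false) draws x[1] from its prior
        # Normal(2, 1); with the prior as the birth proposal the Hastings
        # ratio is exactly 1, so this move is always accepted.
        if b:
            return False, [x[0], rng.gauss(2.0, 1.0)]
        return True, [x[0]]
    # Within-dimension random-walk update on the continuous coordinates.
    prop = [xi + rng.gauss(0.0, 0.5) for xi in x]
    log_alpha = log_joint(b, prop) - log_joint(b, x)
    if math.log(rng.random()) < log_alpha:
        return b, prop
    return b, x

rng = random.Random(0)
b, x = True, [0.0]
samples_b, samples_x1 = [], []
for i in range(20_000):
    b, x = mh_step(b, x, rng)
    if i >= 2_000:  # discard burn-in
        samples_b.append(b)
        samples_x1.append(x[0])

# In the target, the marginal of b is exactly 0.5 and E[x[1]] = 1.
print(sum(samples_b) / len(samples_b))
print(sum(samples_x1) / len(samples_x1))
```

The key point the sketch illustrates is the first branch of `mh_step`: the value of `b` and the dimensionality of `x` change in a single proposal, which is precisely the coupled move that a within-variable Gibbs sweep cannot make.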