Just a quick question before I start digging deeper into what looks like a wonderful library :) I have an AR(p) model and would like to transform the partial autocorrelations. The transformation is invertible (one-to-one) and its Jacobian is known. My question is: can RxInfer.jl support these transformations?
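(For concreteness, the kind of map I have in mind is along the lines of the standard recursion from partial autocorrelations to AR coefficients, roughly as in the plain-Julia sketch below; the exact form and sign convention are not important for the question.)
# Sketch only: map partial autocorrelations κ ∈ (-1, 1)^p to AR coefficients φ
# via a Durbin-Levinson-style recursion (sign conventions vary between texts)
function pacf_to_ar(κ::AbstractVector)
    p = length(κ)
    φ = zeros(eltype(κ), p)
    for k in 1:p
        φ_prev = copy(φ)
        φ[k] = κ[k]
        for j in 1:k-1
            φ[j] = φ_prev[j] - κ[k] * φ_prev[k-j]
        end
    end
    return φ
end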
Hi @mvsoom! Thanks for giving RxInfer.jl a try. I'll move this into our Discussions for more in-depth exploration. RxInfer.jl indeed supports nonlinear transformations of random variables, offering flexibility in specifying inverse functions for the approximation methods. Here's a breakdown:
Gaussian Random Variables: If you're dealing with Gaussian random variables, we recommend using either the Linearization or the Unscented transform. These methods allow you to provide inverse functions for accurate approximations.
Non-Gaussian Random Variables: For transformations involving non-Gaussian random variables, we currently employ CVI (Conjugate Variational Inference). However, please kee…
I don't know how you want to encode this transformation within the RxInfer DSL, but here's a basic example to get you started. While the model itself may not make complete sense, it demonstrates how to integrate transformations:
# Here's a rough sketch:
using RxInfer
# let's encapsulate the transformation in a multivariate output, assuming p = 3
transformation(r) = [r[3] + r[2]*r[3] + r[3]^2, r[2] + r[1]*r[2] + r[2]^2, r[1] + r[1]^2]
@model function ar_model(n)
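    # data placeholders: x_prev[i] holds the lagged values used to predict x[i]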
    x_prev = datavar(Vector{Float64}, n)
    x = datavar(Float64, n)
    # r should be coming from somewhere, let's assume it's a multivariate normal
    r ~ MvNormal(μ = zeros(3), Λ = diageye(3))
    # assuming y are the coefficients of the AR process
    y ~ transformation(r)
    for i in 1:n
        x[i] ~ Normal(μ = dot(y, x_prev[i]), τ = 1.0)
    end
end
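# the meta block below tells RxInfer how to approximate the deterministic transformation node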
@meta function ar_model_meta()
    # transformation() -> DeltaMeta(method = Linearization())
    transformation() -> DeltaMeta(method = Unscented())
    # transformation() -> DeltaMeta(method = Linearization(), inverse = my_inverse) # specify a known inverse
end
n_samples = 10
# dummy data
data_x_prev = [randn(3) for _ in 1:n_samples]
data_x = [rand() for _ in 1:n_samples]
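# (in a real AR(p) setting, x_prev[i] would hold the p previous values of x)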
result = infer(model = ar_model(n_samples), data = (x_prev = data_x_prev, x = data_x), meta = ar_model_meta(), free_energy = true)
# check posteriors
mean_r = mean(result.posteriors[:r])
cov_r = cov(result.posteriors[:r])
mean_y = mean(result.posteriors[:y])
cov_y = cov(result.posteriors[:y])
# negative log evidence:
result.free_energy
Feel free to adapt this example to suit your specific transformation requirements.
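Since you mention the inverse of your transformation is known, here's a minimal self-contained sketch of the inverse = ... option with a toy invertible scalar map (sinh/asinh, purely illustrative, not your actual transformation). It mirrors the datavar-style API used above (newer RxInfer versions use a different model syntax, so adapt as needed), and forward_map, backward_map and toy_model are made-up names for the example:
using RxInfer

# toy invertible map and its known inverse (illustrative only)
forward_map(r) = sinh(r)
backward_map(y) = asinh(y)

@model function toy_model()
    x = datavar(Float64)
    r ~ Normal(μ = 0.0, τ = 1.0)
    y ~ forward_map(r)
    x ~ Normal(μ = y, τ = 1.0)
end

@meta function toy_meta()
    # supplying the known inverse lets the backward message towards r be computed through backward_map
    forward_map() -> DeltaMeta(method = Linearization(), inverse = backward_map)
end

result = infer(model = toy_model(), data = (x = 0.5,), meta = toy_meta())
mean(result.posteriors[:r])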