The minimal example is a 2-mixture of linear regressions with known coefficients and a single real covariate:
func(x):
    U ~ bernoulli(.3)
    if U == 0:
        X ~ normal(2*x + 3, .1)
    else:
        X ~ normal(-7*x + 2, 2)
Families of models with such "regression structure" include generalized linear models (GLMs). One can, for example, first learn a mixture-of-experts GLM using an approximate inference system and then render the learned model as an SPPL program. This representation lets us leverage SPPL's exact inference engine to answer queries that most PPLs, which rely on approximate inference, cannot solve exactly.
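To make the "exact inference" point concrete: for this particular mixture, queries such as P(X > t) and the posterior P(U = 0 | X > t) reduce to weighted Gaussian tail probabilities, which SPPL computes symbolically. A minimal stdlib-only Python sketch of that computation (illustrative, not SPPL's API; the function names are my own):

```python
import math

def gauss_sf(t, mean, std):
    """Exact Gaussian survival function P(N(mean, std) > t) via erfc."""
    return 0.5 * math.erfc((t - mean) / (std * math.sqrt(2)))

def exact_tail(x, t):
    """Exact P(X > t) and P(U = 0 | X > t) for the 2-mixture at covariate x."""
    weights = [0.7, 0.3]           # P(U = 0) = .7 since U ~ bernoulli(.3)
    means = [2*x + 3, -7*x + 2]    # component means from the pseudocode above
    stds = [0.1, 2.0]              # component standard deviations
    tails = [w * gauss_sf(t, m, s)
             for w, m, s in zip(weights, means, stds)]
    p_event = sum(tails)
    # Bayes' rule: P(U = 0 | X > t) = P(U = 0, X > t) / P(X > t)
    return p_event, tails[0] / p_event
```

The point is that no sampling is involved: the event probability and the posterior over the mixture indicator are closed-form, which is exactly the kind of query an approximate-inference PPL can only estimate.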