
Classification example? #89

Open
j-adamczyk opened this issue Jun 23, 2023 · 6 comments

Comments

@j-adamczyk

As far as I understand from the paper, the only thing needed to perform classification with BART, i.e. to get a binary response, is adding a probit link.

However, this question uses a sigmoid (logistic link), but without pm.Deterministic:

with pm.Model() as model:
    x = pm.BART('x', X_shared.get_value(), Y_train)
    y = pm.Bernoulli('y', p=pm.math.sigmoid(x), observed=Y_train)
    trace = pm.sample()

The Bayesian Computation Book, in exercises 7M10 and 7M11, suggests modifying this code, which also uses a sigmoid, but with pm.Deterministic, like:

theta = pm.Deterministic("theta", pm.math.sigmoid(mu))

Another discussion uses the inverse probit link, also with pm.Deterministic:

with pm.Model() as model_bart:
    mu = pmb.BART("mu", df_train_features, df_train_labels, m=200)
    theta = pm.Deterministic("theta", pm.math.invprobit(mu))
    y = pm.Bernoulli("y", p=theta, observed=df_train_labels)
    idata = pm.sample(random_seed=0, tune=200)

Which option should be used for BART binary classification? Also, adding a BART classification example (even a very small code snippet) to the documentation would be really useful.

@aloctavodia
Member

Hi, we should definitely add more examples to the docs, including one with a binary response. In the meantime, you can check this notebook: https://github.com/Grupo-de-modelado-probabilista/BART/blob/master/experiments/space_influenza.ipynb

When you say "As far as I understand from the paper", which paper is that?

A pm.Deterministic is the way to tell PyMC to store a variable; other than that, a model with and without a Deterministic is equivalent. Let's see an example.

with pm.Model() as model:
    x = pm.BART('x', X, Y)
    y = pm.Bernoulli('y', p=pm.math.sigmoid(x), observed=Y)

After sampling, you will get the values of x, but not the values of pm.math.sigmoid(x). If instead you write

with pm.Model() as model:
    x = pm.BART('x', X, Y)
    p = pm.Deterministic('p', pm.math.sigmoid(x))
    y = pm.Bernoulli('y', p=p, observed=Y)

then you will get the values of both x and p.

Whether you should choose a logistic, probit, or even another inverse link function like cloglog is a modeling choice. Personally, I have not evaluated the impact of this choice on BART models. I assume that with PyMC-BART the choice has essentially the same pros and cons as with a typical generalized linear model.
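For intuition, the two inverse links discussed in this thread can be compared outside PyMC with plain NumPy and the standard library; the sketch below is purely illustrative (the helper names are made up, not PyMC-BART API):

```python
# Minimal sketch: both inverse links map the real-valued BART output mu
# to a probability in (0, 1), agreeing at mu = 0; the probit has
# slightly lighter tails than the logistic.
import numpy as np
from math import erf, sqrt


def invlogit(mu):
    """Inverse logistic link: 1 / (1 + exp(-mu))."""
    return 1.0 / (1.0 + np.exp(-np.asarray(mu, dtype=float)))


def invprobit(mu):
    """Inverse probit link: the standard normal CDF, via erf."""
    mu = np.asarray(mu, dtype=float)
    vals = [0.5 * (1.0 + erf(m / sqrt(2.0))) for m in np.ravel(mu)]
    return np.array(vals).reshape(mu.shape)


mu = np.linspace(-3.0, 3.0, 7)
p_logit = invlogit(mu)    # probabilities in (0, 1)
p_probit = invprobit(mu)  # also in (0, 1); both equal 0.5 at mu = 0
```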

@j-adamczyk
Author

@aloctavodia thanks for the fast response. I will check out the notebook.

I was referring to the paper BART: Bayesian additive regression trees, which is the original paper for BART, as far as I understand.

Thank you for the explanation of pm.Deterministic. I'm just starting with PyMC, and BART seemed like the most obvious place to start (apart from simple Bayesian regression), coming from regular tree-based models.

@aloctavodia
Member

The implementation proposed in the original BART paper you mention is different from the one in PyMC-BART; the original is not suitable for a probabilistic programming language like PyMC. For instance, PyMC-BART does not use conjugate priors as described in the original BART paper.

Welcome to PyMC (and PyMC-BART)!

@j-adamczyk
Author

@aloctavodia I tried running the code with your suggestion:

import pymc as pm
import pymc_bart as pmb


# model definition
with pm.Model() as model:
    # data containers
    X = pm.MutableData("X", X_train)
    y = pm.MutableData("y", y_train)
    
    # model definition
    mu = pmb.BART("mu", X, y_train, m=50)
    
    # link function
    # p = pm.Deterministic("p", pm.math.sigmoid(mu))
    p = pm.Deterministic("p", pm.math.invlogit(mu))
    
    # likelihood
    y = pm.Bernoulli("y_pred", p=p, observed=y_train)

# training
with model:
    # actual training via MCMC
    idata = pm.sample(random_seed=0)

# testing
with model:
    pm.set_data({"X": X_test, "y": y_test})
    idata.extend(pm.sample_posterior_predictive(idata))

Training went well. At test time, I'm getting this error:


---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/pytensor/compile/function/types.py:970, in Function.__call__(self, *args, **kwargs)
    968 try:
    969     outputs = (
--> 970         self.vm()
    971         if output_subset is None
    972         else self.vm(output_subset=output_subset)
    973     )
    974 except Exception:

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/pytensor/graph/op.py:543, in Op.make_py_thunk.<locals>.rval(p, i, o, n, params)
    539 @is_thunk_type
    540 def rval(
    541     p=p, i=node_input_storage, o=node_output_storage, n=node, params=None
    542 ):
--> 543     r = p(n, [x[0] for x in i], o)
    544     for o in node.outputs:

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/pytensor/tensor/random/op.py:378, in RandomVariable.perform(self, node, inputs, outputs)
    376 rng_var_out[0] = rng
--> 378 smpl_val = self.rng_fn(rng, *(args + [size]))
    380 if (
    381     not isinstance(smpl_val, np.ndarray)
    382     or str(smpl_val.dtype) != out_var.type.dtype
    383 ):

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/pytensor/tensor/random/basic.py:55, in ScipyRandomVariable.rng_fn(cls, *args, **kwargs)
     54 size = args[-1]
---> 55 res = cls.rng_fn_scipy(*args, **kwargs)
     57 if np.ndim(res) == 0:
     58     # The sample is an `np.number`, and is not writeable, or non-NumPy
     59     # type, so we need to clone/create a usable NumPy result

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/pytensor/tensor/random/basic.py:1473, in BernoulliRV.rng_fn_scipy(cls, rng, p, size)
   1471 @classmethod
   1472 def rng_fn_scipy(cls, rng, p, size):
-> 1473     return stats.bernoulli.rvs(p, size=size, random_state=rng)

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/scipy/stats/_distn_infrastructure.py:3357, in rv_discrete.rvs(self, *args, **kwargs)
   3356 kwargs['discrete'] = True
-> 3357 return super().rvs(*args, **kwargs)

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/scipy/stats/_distn_infrastructure.py:1028, in rv_generic.rvs(self, *args, **kwds)
   1027 rndm = kwds.pop('random_state', None)
-> 1028 args, loc, scale, size = self._parse_args_rvs(*args, **kwds)
   1029 cond = logical_and(self._argcheck(*args), (scale >= 0))

File <string>:6, in _parse_args_rvs(self, p, loc, size)

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/scipy/stats/_distn_infrastructure.py:909, in rv_generic._argcheck_rvs(self, *args, **kwargs)
    908 if not ok:
--> 909     raise ValueError("size does not match the broadcast shape of "
    910                      "the parameters. %s, %s, %s" % (size, size_,
    911                                                      bcast_shape))
    913 param_bcast = all_bcast[:-2]

ValueError: size does not match the broadcast shape of the parameters. (7877,), (7877,), (2626,)

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
Cell In[62], line 3
      1 with model:
      2     pm.set_data({"X": X_test, "y": y_test})
----> 3     idata.extend(pm.sample_posterior_predictive(idata))

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/pymc/sampling/forward.py:644, in sample_posterior_predictive(trace, model, var_names, sample_dims, random_seed, progressbar, return_inferencedata, extend_inferencedata, predictions, idata_kwargs, compile_kwargs)
    639 # there's only a single chain, but the index might hit it multiple times if
    640 # the number of indices is greater than the length of the trace.
    641 else:
    642     param = _trace[idx % len_trace]
--> 644 values = sampler_fn(**param)
    646 for k, v in zip(vars_, values):
    647     ppc_trace_t.insert(k.name, v, idx)

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/pymc/util.py:389, in point_wrapper.<locals>.wrapped(**kwargs)
    387 def wrapped(**kwargs):
    388     input_point = {k: v for k, v in kwargs.items() if k in ins}
--> 389     return core_function(**input_point)

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/pytensor/compile/function/types.py:983, in Function.__call__(self, *args, **kwargs)
    981     if hasattr(self.vm, "thunks"):
    982         thunk = self.vm.thunks[self.vm.position_of_error]
--> 983     raise_with_op(
    984         self.maker.fgraph,
    985         node=self.vm.nodes[self.vm.position_of_error],
    986         thunk=thunk,
    987         storage_map=getattr(self.vm, "storage_map", None),
    988     )
    989 else:
    990     # old-style linkers raise their own exceptions
    991     raise

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/pytensor/link/utils.py:535, in raise_with_op(fgraph, node, thunk, exc_info, storage_map)
    530     warnings.warn(
    531         f"{exc_type} error does not allow us to add an extra error message"
    532     )
    533     # Some exception need extra parameter in inputs. So forget the
    534     # extra long error message in that case.
--> 535 raise exc_value.with_traceback(exc_trace)

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/pytensor/compile/function/types.py:970, in Function.__call__(self, *args, **kwargs)
    967 t0_fn = time.perf_counter()
    968 try:
    969     outputs = (
--> 970         self.vm()
    971         if output_subset is None
    972         else self.vm(output_subset=output_subset)
    973     )
    974 except Exception:
    975     restore_defaults()

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/pytensor/graph/op.py:543, in Op.make_py_thunk.<locals>.rval(p, i, o, n, params)
    539 @is_thunk_type
    540 def rval(
    541     p=p, i=node_input_storage, o=node_output_storage, n=node, params=None
    542 ):
--> 543     r = p(n, [x[0] for x in i], o)
    544     for o in node.outputs:
    545         compute_map[o][0] = True

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/pytensor/tensor/random/op.py:378, in RandomVariable.perform(self, node, inputs, outputs)
    374     rng = copy(rng)
    376 rng_var_out[0] = rng
--> 378 smpl_val = self.rng_fn(rng, *(args + [size]))
    380 if (
    381     not isinstance(smpl_val, np.ndarray)
    382     or str(smpl_val.dtype) != out_var.type.dtype
    383 ):
    384     smpl_val = _asarray(smpl_val, dtype=out_var.type.dtype)

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/pytensor/tensor/random/basic.py:55, in ScipyRandomVariable.rng_fn(cls, *args, **kwargs)
     52 @classmethod
     53 def rng_fn(cls, *args, **kwargs):
     54     size = args[-1]
---> 55     res = cls.rng_fn_scipy(*args, **kwargs)
     57     if np.ndim(res) == 0:
     58         # The sample is an `np.number`, and is not writeable, or non-NumPy
     59         # type, so we need to clone/create a usable NumPy result
     60         res = np.asarray(res)

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/pytensor/tensor/random/basic.py:1473, in BernoulliRV.rng_fn_scipy(cls, rng, p, size)
   1471 @classmethod
   1472 def rng_fn_scipy(cls, rng, p, size):
-> 1473     return stats.bernoulli.rvs(p, size=size, random_state=rng)

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/scipy/stats/_distn_infrastructure.py:3357, in rv_discrete.rvs(self, *args, **kwargs)
   3328 """Random variates of given type.
   3329 
   3330 Parameters
   (...)
   3354 
   3355 """
   3356 kwargs['discrete'] = True
-> 3357 return super().rvs(*args, **kwargs)

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/scipy/stats/_distn_infrastructure.py:1028, in rv_generic.rvs(self, *args, **kwds)
   1026 discrete = kwds.pop('discrete', None)
   1027 rndm = kwds.pop('random_state', None)
-> 1028 args, loc, scale, size = self._parse_args_rvs(*args, **kwds)
   1029 cond = logical_and(self._argcheck(*args), (scale >= 0))
   1030 if not np.all(cond):

File <string>:6, in _parse_args_rvs(self, p, loc, size)

File ~/PycharmProjects/monte_carlo_BART/venv/lib/python3.11/site-packages/scipy/stats/_distn_infrastructure.py:909, in rv_generic._argcheck_rvs(self, *args, **kwargs)
    906 ok = all([bcdim == 1 or bcdim == szdim
    907           for (bcdim, szdim) in zip(bcast_shape, size_)])
    908 if not ok:
--> 909     raise ValueError("size does not match the broadcast shape of "
    910                      "the parameters. %s, %s, %s" % (size, size_,
    911                                                      bcast_shape))
    913 param_bcast = all_bcast[:-2]
    914 loc_bcast = all_bcast[-2]

ValueError: size does not match the broadcast shape of the parameters. (7877,), (7877,), (2626,)
Apply node that caused the error: bernoulli_rv{0, (0,), int64, True}(RandomGeneratorSharedVariable(<Generator(PCG64) at 0x7F2F51F8C580>), [7877], 4, p)
Toposort index: 2
Inputs types: [RandomGeneratorType, TensorType(int64, shape=(1,)), TensorType(int64, shape=()), TensorType(float64, shape=(None,))]
Inputs shapes: ['No shapes', (1,), (), (2626,)]
Inputs strides: ['No strides', (8,), (), (8,)]
Inputs values: [Generator(PCG64) at 0x7F2F51F8C580, array([7877]), array(4), 'not shown']
Outputs clients: [['output'], ['output']]

HINT: Re-running with most PyTensor optimizations disabled could provide a back-trace showing when this node was created. This can be done by setting the PyTensor flag 'optimizer=fast_compile'. If that does not work, PyTensor optimizations can be disabled with 'optimizer=None'.
HINT: Use the PyTensor flag `exception_verbosity=high` for a debug print-out and storage map footprint of this Apply node.

7877 is the length of my training data, 2626 is the length of my test data.

I'm not sure if this is the right place to discuss this, or should I open a separate issue for it?

@aloctavodia
Member

You need to specify that the shape of the likelihood is the shape of the mutable variable. This is related to PyMC, not directly to PyMC-BART.

Something like this:

with pm.Model() as model:
    ...
    X_s = pm.MutableData("X_s", X)
    ...
    _ = pm.Normal("y_obs", mu=μ, sigma=ϵ, observed=Y, shape=X_s.shape)

with model:
    pm.set_data({"X_s": new_values})
    pm.sample_posterior_predictive(idata, extend_inferencedata=True)
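The shape error itself can be reproduced with plain NumPy, independent of PyMC: a draw size frozen at the training length (7877) no longer broadcasts against a parameter array of the test length (2626). An illustrative sketch:

```python
# Sketch of the broadcast failure behind the traceback above: a fixed
# "size" that disagrees with the parameter's shape raises ValueError.
import numpy as np

rng = np.random.default_rng(0)
p_test = np.full(2626, 0.5)  # p now has the test-set length

ok = rng.binomial(1, p_test)  # size inferred from p: works fine
try:
    rng.binomial(1, p_test, size=7877)  # size frozen to the training length
    mismatch = False
except ValueError:
    mismatch = True  # (7877,) cannot broadcast with (2626,)
```

Tying the likelihood's shape to the mutable data, as in the snippet above, lets PyMC re-infer the size after set_data instead of keeping the training length.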

@j-adamczyk
Author

j-adamczyk commented Jun 25, 2023

@aloctavodia thank you, my model is finally working for binary classification! However, I still have trouble with the multiclass case. Is it even possible with the default BART?

I tried defining my model as:

with pm.Model() as self.model:
    # data containers
    X_var = pm.MutableData("X", X)

    # model definition
    mu = pmb.BART("mu", X_var, y)

    # link function
    p = pm.Deterministic("p", pm.math.invlogit(mu))

    # likelihood
    pm.Categorical("y_pred", p=p, observed=y, shape=p.shape)

I get an error:

pymc.exceptions.SamplingError: Initial evaluation of model at starting point failed!
Starting values:
{'mu': array([2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5,
       2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5,
       2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5,
       ... # I truncated this bit
       2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5,
       2.5, 2.5, 2.5, 2.5, 2.5])}

Logp initial evaluation results:
{'mu': 0.0, 'y_pred': -inf}
You can call `model.debug()` for more details.

Using model.debug():

point={'mu': array([2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5,
       2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5,
       2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5,
       ... # I truncated this bit
       2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5,
       2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5, 2.5,
       2.5, 2.5, 2.5, 2.5, 2.5])}

The variable y_pred has the following parameters:
0: Sigmoid [id A] <Vector(float64, shape=(?,))> 'p'
 └─ mu [id B] <Vector(float64, shape=(?,))>
The parameters evaluate to:
0: [0.92414182 0.92414182 0.92414182 0.92414182 0.92414182 0.92414182
 0.92414182 0.92414182 0.92414182 0.92414182 0.92414182 0.92414182
 0.92414182 0.92414182 0.92414182 0.92414182 0.92414182 0.92414182
 ... # I truncated this bit
 0.92414182 0.92414182 0.92414182 0.92414182 0.92414182 0.92414182
 0.92414182 0.92414182 0.92414182 0.92414182 0.92414182 0.92414182]
This does not respect one of the following constraints: 0 <= p <=1, sum(p) = 1

I guess that makes sense; the probabilities do not sum to 1. I have seen your response here, and I:

  • one-hot encoded the y vector
  • added shape, but with the order reversed from what you wrote - shape=(X_len, n_classes) instead of shape=(n_classes, X_len) - since with the original one I was getting a shape error, and indeed the shape was transposed (compared to what I expected)

So my code is:

with pm.Model() as self.model:
    # data containers
    X_var = pm.MutableData("X", X)
    X_len = pm.MutableData("X_len", len(X))

    # model definition
    mu = pmb.BART("mu", X_var, y, shape=(X_len, n_classes))

    # link function
    p = pm.Deterministic("p", pm.math.invlogit(mu))

    # likelihood
    pm.Categorical("y_pred", p=p, observed=y, shape=p.shape)

And I get an error:

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/jakub/PycharmProjects/monte_carlo_BART/main.py", line 20, in <module>
    acc_bart, auroc_bart, time_bart = train_and_test(X_train, X_test, y_train, y_test, clf_bart)
  File "/home/jakub/PycharmProjects/monte_carlo_BART/train_and_test.py", line 8, in train_and_test
    clf.fit(X_train, y_train)
  File "/home/jakub/PycharmProjects/monte_carlo_BART/BART.py", line 80, in fit
    mu = pmb.BART(
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pymc_bart/bart.py", line 124, in __new__
    return super().__new__(cls, name, *params, **kwargs)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pymc/distributions/distribution.py", line 305, in __new__
    rv_out = cls.dist(*args, **kwargs)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pymc_bart/bart.py", line 128, in dist
    return super().dist(params, **kwargs)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pymc/distributions/distribution.py", line 384, in dist
    rv_out = cls.rv_op(*dist_params, size=create_size, **kwargs)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/tensor/random/op.py", line 289, in __call__
    res = super().__call__(rng, size, dtype, *args, **kwargs)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/op.py", line 295, in __call__
    node = self.make_node(*inputs, **kwargs)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/tensor/random/op.py", line 320, in make_node
    size = normalize_size_param(size)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/tensor/random/utils.py", line 136, in normalize_size_param
    size = cast(as_tensor_variable(size, ndim=1, dtype="int64"), "int64")
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/tensor/__init__.py", line 49, in as_tensor_variable
    return _as_tensor_variable(x, name, ndim, **kwargs)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/functools.py", line 889, in wrapper
    return dispatch(args[0].__class__)(*args, **kw)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/tensor/basic.py", line 157, in _as_tensor_Sequence
    return MakeVector(dtype)(*x)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/op.py", line 295, in __call__
    node = self.make_node(*inputs, **kwargs)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/tensor/basic.py", line 1665, in make_node
    raise TypeError(
TypeError: Expected inputs to be upcastable to int64; got ['float64']

Which is quite weird, since y is an int. I also tried converting y to a float, but got the same result.

Lastly, I tried your code here:

with pm.Model() as self.model:
    # data containers
    X_var = pm.MutableData("X", X)
    X_len = pm.MutableData("X_len", len(X))

    # model definition
    mu = pmb.BART("mu", X_var, y, shape=(n_classes, X_len))

    # link function
    p = pm.Deterministic("p", pm.math.softmax(mu, axis=0))

    # likelihood
    pm.Categorical("y_pred", p=p.T, observed=y, shape=p.T.shape)

But this way, I get yet another error:

Traceback (most recent call last):
  File "/home/jakub/PycharmProjects/monte_carlo_BART/main.py", line 20, in <module>
    acc_bart, auroc_bart, time_bart = train_and_test(X_train, X_test, y_train, y_test, clf_bart)
  File "/home/jakub/PycharmProjects/monte_carlo_BART/train_and_test.py", line 8, in train_and_test
    clf.fit(X_train, y_train)
  File "/home/jakub/PycharmProjects/monte_carlo_BART/BART.py", line 95, in fit
    self.training_trace_ = pm.sample(
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pymc/sampling/mcmc.py", line 653, in sample
    step = assign_step_methods(model, step, methods=pm.STEP_METHODS, step_kwargs=kwargs)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pymc/sampling/mcmc.py", line 233, in assign_step_methods
    return instantiate_steppers(model, steps, selected_steps, step_kwargs)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pymc/sampling/mcmc.py", line 134, in instantiate_steppers
    step = step_class(vars=vars, model=model, **args)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pymc_bart/pgbart.py", line 131, in __init__
    self.likelihood_logp = logp(initial_values, [model.datalogp], vars, shared)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pymc_bart/pgbart.py", line 607, in logp
    function = pytensor_function([inarray0], out_list[0])
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/compile/function/__init__.py", line 315, in function
    fn = pfunc(
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/compile/function/pfunc.py", line 367, in pfunc
    return orig_function(
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/compile/function/types.py", line 1744, in orig_function
    m = Maker(
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/compile/function/types.py", line 1518, in __init__
    self.prepare_fgraph(inputs, outputs, found_updates, fgraph, mode, profile)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/compile/function/types.py", line 1411, in prepare_fgraph
    rewriter_profile = rewriter(fgraph)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 125, in __call__
    return self.rewrite(fgraph)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 121, in rewrite
    return self.apply(fgraph, *args, **kwargs)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 292, in apply
    sub_prof = rewriter.apply(fgraph)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 2420, in apply
    node_rewriter_change = self.process_node(
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 1917, in process_node
    self.failure_callback(
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 1770, in warn_inplace
    return cls.warn(exc, nav, repl_pairs, node_rewriter, node)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 1758, in warn
    raise exc
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 1914, in process_node
    replacements = node_rewriter.transform(fgraph, node)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 1074, in transform
    return self.fn(fgraph, node)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/tensor/rewriting/basic.py", line 1029, in local_useless_switch
    out = alloc(out, *out_shape)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/op.py", line 295, in __call__
    node = self.make_node(*inputs, **kwargs)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/tensor/basic.py", line 1425, in make_node
    sh, static_shape = infer_static_shape(shape)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/tensor/basic.py", line 1391, in infer_static_shape
    folded_shape = rewrite_graph(shape_fg, custom_rewrite=topo_constant_folding).outputs
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/utils.py", line 61, in rewrite_graph
    _ = query_rewrites.rewrite(fgraph)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 121, in rewrite
    return self.apply(fgraph, *args, **kwargs)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 292, in apply
    sub_prof = rewriter.apply(fgraph)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 2450, in apply
    sub_prof = grewrite.apply(fgraph)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 2032, in apply
    nb += self.process_node(fgraph, node)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 1917, in process_node
    self.failure_callback(
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 1770, in warn_inplace
    return cls.warn(exc, nav, repl_pairs, node_rewriter, node)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 1758, in warn
    raise exc
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 1914, in process_node
    replacements = node_rewriter.transform(fgraph, node)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/graph/rewriting/basic.py", line 1074, in transform
    return self.fn(fgraph, node)
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/tensor/rewriting/basic.py", line 1139, in constant_folding
    required = thunk()
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/link/c/op.py", line 103, in rval
    thunk()
  File "/home/jakub/anaconda3/envs/monte_carlo_BART/lib/python3.10/site-packages/pytensor/link/c/basic.py", line 1783, in __call__
    raise exc_value.with_traceback(exc_trace)
AssertionError: Could not broadcast dimensions
log_thunk_trace: There was a problem executing an Op.

Process finished with exit code 1
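As a plain-NumPy aside (no PyMC involved; the shapes and names below are illustrative), the (n_classes, X_len) orientation with a softmax over axis=0 can be sanity-checked directly: each column sums to 1, and transposing gives the row-wise layout where each observation carries one probability vector.

```python
# Sketch of the shape convention discussed above: softmax over the
# class axis of a (n_classes, n_obs) matrix yields valid per-column
# probability vectors; the transpose is the (n_obs, n_classes) layout.
import numpy as np


def softmax(z, axis):
    z = z - z.max(axis=axis, keepdims=True)  # subtract max for stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)


n_classes, n_obs = 3, 5
mu = np.arange(n_classes * n_obs, dtype=float).reshape(n_classes, n_obs)

p = softmax(mu, axis=0)  # shape (n_classes, n_obs); each column sums to 1
p_rowwise = p.T          # shape (n_obs, n_classes); one prob. vector per row
```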
