ParetoNBDRV
class pymc_marketing.clv.distributions.ParetoNBDRV(name=None, ndim_supp=None, ndims_params=None, dtype=None, inplace=None, signature=None)
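This Op is the low-level pytensor RandomVariable that generates draws for the Pareto/NBD model. It is normally reached through the higher-level ParetoNBD distribution in the same module rather than used directly. A minimal sketch, assuming ParetoNBD.dist accepts the purchase-process parameters r and alpha, the dropout-process parameters s and beta, and an observation window T (all numeric values below are illustrative, not from the source):

    import pymc as pm
    from pymc_marketing.clv.distributions import ParetoNBD

    # Build the symbolic random variable; ParetoNBDRV performs the actual sampling.
    rv = ParetoNBD.dist(r=0.5, alpha=10.0, s=0.6, beta=12.0, T=40.0)

    # pm.draw evaluates the variable, which triggers ParetoNBDRV.rng_fn.
    samples = pm.draw(rv, draws=5, random_seed=1)
    print(samples.shape)  # expected (5, 2) under these assumptions

Under these assumptions each draw is a recency/frequency pair, so the Op's support is a length-2 vector per customer.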
Methods
ParetoNBDRV.L_op(inputs, outputs, output_grads)
    Construct a graph for the L-operator.
ParetoNBDRV.R_op(inputs, eval_points)
    Construct a graph for the R-operator.
ParetoNBDRV.__init__([name, ndim_supp, ...])
    Create a random variable Op.
ParetoNBDRV.add_tag_trace([user_line])
    Add tag.trace to a node or variable.
ParetoNBDRV.batch_ndim(node)
ParetoNBDRV.dist_params(node)
    Return the node inputs corresponding to the dist params.
ParetoNBDRV.do_constant_folding(fgraph, node)
    Determine whether or not constant folding should be performed for the given node.
ParetoNBDRV.grad(inputs, outputs)
    Construct a graph for the gradient with respect to each input variable.
ParetoNBDRV.infer_shape(fgraph, node, ...)
ParetoNBDRV.inplace_on_inputs(allowed_inplace_inputs)
    Try to return a version of self that tries to inplace in as many as allowed_inplace_inputs.
ParetoNBDRV.make_node(rng, size, *dist_params)
    Create a random variable node.
ParetoNBDRV.make_py_thunk(node, storage_map, ...)
    Make a Python thunk.
ParetoNBDRV.make_thunk(node, storage_map, ...)
    Create a thunk.
ParetoNBDRV.perform(node, inputs, outputs)
    Calculate the function on the inputs and put the variables in the output storage.
ParetoNBDRV.prepare_node(node, storage_map, ...)
    Make any special modifications that the Op needs before doing Op.make_thunk().
ParetoNBDRV.rng_fn(rng, r, alpha, s, beta, ...)
    Sample a numeric random variate.
ParetoNBDRV.rng_param(node)
    Return the node input corresponding to the rng.
ParetoNBDRV.size_param(node)
    Return the node input corresponding to the size.
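The methods above follow the usual RandomVariable call path: calling the Op instance invokes make_node to build a symbolic node, and evaluating that node runs perform, which delegates to rng_fn for the numeric draws. A rough sketch at the Op level, assuming the distribution parameters are passed in the order (r, alpha, s, beta, T) and that ParetoNBDRV() can be instantiated with its defaults:

    import numpy as np
    from pymc_marketing.clv.distributions import ParetoNBDRV

    pareto_nbd_op = ParetoNBDRV()  # assumed to work with default arguments

    # __call__ -> make_node: builds a symbolic random-variable node.
    rv = pareto_nbd_op(0.5, 10.0, 0.6, 12.0, 40.0, size=(5,))

    # eval -> perform -> rng_fn: produces numeric variates.
    draws = rv.eval()
    print(np.asarray(draws).shape)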
Attributes
default_output
    An int that specifies which output Op.__call__() should return.
destroy_map
    A dict that maps output indices to the input indices upon which they operate in-place.
dtype
itypes
name
otypes
signature
view_map
    A dict that maps output indices to the input indices of which they are a view.
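For orientation, these attributes can simply be inspected on an instance. The sketch below only reads them at runtime and does not assume specific values:

    from pymc_marketing.clv.distributions import ParetoNBDRV

    op = ParetoNBDRV()
    print("name:          ", op.name)
    print("dtype:         ", op.dtype)
    print("signature:     ", op.signature)
    print("default_output:", op.default_output)  # which output __call__ returns
    print("view_map:      ", op.view_map)        # outputs that are views of inputs
    print("destroy_map:   ", op.destroy_map)     # outputs that overwrite inputs in-place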