snowdrop.src.preprocessor package

Submodules

snowdrop.src.preprocessor.codegen module

Extension to ast that allows ast -> python code generation.

copyright:

Copyright 2008 by Armin Ronacher.

license:

BSD.

class snowdrop.src.preprocessor.codegen.SourceGenerator(indent_with, add_line_information=False)[source]

Bases: NodeVisitor

This visitor is able to transform a well formed syntax tree into python source code.

For more details have a look at the docstring of the node_to_source function.

body(statements)[source]
body_or_else(node)[source]
decorators(node)[source]
newline(node=None, extra=0)[source]
signature(node)[source]
visit_Assert(node)[source]
visit_Assign(node)[source]
visit_Attribute(node)[source]
visit_AugAssign(node)[source]
visit_BinOp(node)[source]
visit_BoolOp(node)[source]
visit_Break(node)[source]
visit_Bytes(node)[source]
visit_Call(node)[source]
visit_ClassDef(node)[source]
visit_Compare(node)[source]
visit_Continue(node)[source]
visit_Delete(node)[source]
visit_Dict(node)[source]
visit_DictComp(node)[source]
visit_Ellipsis(node)[source]
visit_Expr(node)[source]
visit_ExtSlice(node)[source]
visit_For(node)[source]
visit_FunctionDef(node)[source]
visit_GeneratorExp(node)
visit_Global(node)[source]
visit_If(node)[source]
visit_IfExp(node)[source]
visit_Import(node)[source]
visit_ImportFrom(node)[source]
visit_Lambda(node)[source]
visit_List(node)
visit_ListComp(node)
visit_Name(node)[source]
visit_Nonlocal(node)[source]
visit_Num(node)[source]
visit_Pass(node)[source]
visit_Print(node)[source]
visit_Raise(node)[source]
visit_Repr(node)[source]
visit_Return(node)[source]
visit_Set(node)
visit_SetComp(node)
visit_Slice(node)[source]
visit_Starred(node)[source]
visit_Str(node)[source]
visit_Subscript(node)[source]
visit_TryExcept(node)[source]
visit_TryFinally(node)[source]
visit_Tuple(node)[source]
visit_UnaryOp(node)[source]
visit_While(node)[source]
visit_With(node)[source]
visit_Yield(node)[source]
visit_alias(node)[source]
visit_arg(node)[source]
visit_arguments(node)[source]
visit_comprehension(node)[source]
visit_excepthandler(node)[source]
write(x)[source]
snowdrop.src.preprocessor.codegen.test_generation()[source]
snowdrop.src.preprocessor.codegen.to_source(node, indent_with='    ', add_line_information=False)[source]

Convert a node tree back into python source code.

This function is useful for debugging purposes, especially if you’re dealing with custom asts not generated by python itself.

It could be that the source code is evaluable when the AST itself is not compilable / evaluable. The reason for this is that the AST contains some more data than regular source code does, which is dropped during conversion.

Each level of indentation is replaced with indent_with. By default this parameter is equal to four spaces as suggested by PEP 8, but it might be adjusted to match the application's style guide.

If add_line_information is set to True comments for the line numbers of the nodes are added to the output. This can be used to spot wrong line number information of statement nodes.
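
A minimal round-trip sketch (assuming the package is importable; the parsed statement is arbitrary):

>>> import ast
>>> from snowdrop.src.preprocessor.codegen import to_source
>>> tree = ast.parse("y = a * x + b")
>>> src = to_source(tree, indent_with='    ')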

snowdrop.src.preprocessor.condition module

Created on Fri Mar 12 08:47:07 2021

@author: alexei

snowdrop.src.preprocessor.condition.Derivative(f, x)[source]

Returns derivative of a constant, i.e. zero

Parameters:
f : function

Function.

x : float

Variable.

Returns:

Zero (the derivative of a constant).

snowdrop.src.preprocessor.condition.IfThen(condition, x)[source]

Checks condition and returns x or 0.

Parameters:
condition : bool

Condition.

x : float

Variable value.

Returns:

Variable x value if condition is satisfied and 0 if not.

snowdrop.src.preprocessor.condition.IfThenElse(condition, a, b)[source]

Checks condition and returns a or b.

Parameters:
condition : bool

Condition.

a : float

Variable value.

b : float

Variable value.

Returns:

Variable ‘a’ value if condition is satisfied and variable ‘b’ value if not.
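
A small usage sketch based on the documented behaviour (the numeric values are arbitrary):

>>> from snowdrop.src.preprocessor.condition import IfThen, IfThenElse
>>> x = 2.5
>>> IfThen(x > 0, x)
2.5
>>> IfThenElse(x > 3, x, -x)
-2.5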

snowdrop.src.preprocessor.condition.Negative(x)[source]

Returns True if the argument is negative.

Parameters:
x : float

Variable.

Returns:

True if x is negative, False otherwise.

snowdrop.src.preprocessor.condition.Positive(x)[source]

Returns True if the argument is non-negative.

Parameters:
x : float

Variable.

Returns:

True if x is non-negative, False otherwise.

snowdrop.src.preprocessor.condition.Subs(x, *args)[source]

Nothing to substitute.

snowdrop.src.preprocessor.condition.log(x)[source]

Returns a negative number if the argument is negative.

snowdrop.src.preprocessor.condition.myzif(x)[source]

Return the user-defined function value.

Parameters:
x : float

Variable value.

snowdrop.src.preprocessor.eval_formula module

snowdrop.src.preprocessor.eval_formula.eval_formula(expr: str, dataframe=None, context=None)[source]

Evaluate expression.

Args:
expr: string

Symbolic expression to evaluate. Example: k(1)-delta*k(0)-i

dataframe: (optional) pandas DataFrame

Each column is a time series, which can be indexed.

context: dict or CalibrationDict

Context
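
A hypothetical call, assuming the dataframe columns are named after the model variables and delta is supplied through the context:

>>> import pandas as pd
>>> from snowdrop.src.preprocessor.eval_formula import eval_formula
>>> df = pd.DataFrame({'k': [1.0, 1.1, 1.2], 'i': [0.2, 0.2, 0.2]})
>>> res = eval_formula('k(1)-delta*k(0)-i', dataframe=df, context={'delta': 0.1})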

snowdrop.src.preprocessor.eval_solver module

class snowdrop.src.preprocessor.eval_solver.FindNames[source]

Bases: NodeVisitor

visit_Name(node)[source]
snowdrop.src.preprocessor.eval_solver.eval_solver(incidence)[source]
snowdrop.src.preprocessor.eval_solver.evaluate(system, values=None, context=None)[source]
snowdrop.src.preprocessor.eval_solver.get_atoms(string)[source]
snowdrop.src.preprocessor.eval_solver.get_incidence(sdict)[source]

snowdrop.src.preprocessor.f_dynamic module

snowdrop.src.preprocessor.f_dynamic.f_dynamic(x, p, exog=[0], order=1, ind=None)[source]

snowdrop.src.preprocessor.f_jacob module

snowdrop.src.preprocessor.f_jacob.f_jacob(x, p, exog=[0])[source]

snowdrop.src.preprocessor.f_measurement module

snowdrop.src.preprocessor.f_measurement.f_measurement(x, p, exog=[0], order=1, ind=None)[source]

snowdrop.src.preprocessor.f_rhs module

snowdrop.src.preprocessor.f_rhs.f_rhs(x, p, exog=[0], order=0, ind=None)[source]

snowdrop.src.preprocessor.f_sparse module

snowdrop.src.preprocessor.f_sparse.f_sparse(x, p, exog=[0], order=1, ind=None)[source]

snowdrop.src.preprocessor.f_steady module

snowdrop.src.preprocessor.f_steady.f_steady(x, p, exog=[0])[source]

snowdrop.src.preprocessor.function module

snowdrop.src.preprocessor.function.correct(n, max_lead_shock, min_lag_shock, indexEndogVariables, derivative)[source]

Set elements of the non-swapped array to zero.

Parameters

n : int

Number of endogenous variables.

max_lead_shock : int

Maximum lead shock number.

min_lag_shock : int

Minimum lag shock number.

indexEndogVariables : numpy ndarray

Indices of endogenous variables.

derivative : numpy ndarray

Partial derivatives array.

Returns

Corrected derivatives array.

snowdrop.src.preprocessor.function.flip(n, order, indEndog, indExog, max_lead_shock, min_lag_shock, derivatives)[source]

Swap indices of endogenous and exogenous variables.

Parameters:
param order:

Order of partial derivatives of the system of equations.

type order:

int

param endogVariables:

Endogenous variables to be flipped.

type endogVariables:

list

param exogVariables:

Exogenous variables to be flipped.

type exogVariables:

list

returns:

Function, Jacobian, Hessian, and third order derivatives matrices.

snowdrop.src.preprocessor.function.get_function(model, y, func=None, params=None, shock=None, exog=None, t=0, debug=False)[source]

Returns a function array.

Parameters:
param model:

Model object.

type model:

Model.

param y:

Array of values of endogenous variables.

type y:

numpy.array

param func:

Model equations function.

type func:

Function

param params:

Values of parameters.

type params:

numpy.array

param shock:

The values of shocks.

type shock:

numpy.array

param exog:

The exogenous process.

type exog:

numpy ndarray

param t:

Time index.

type t:

int

param debug:

If this flag is raised, the function is compiled from files; otherwise the cached version is used.

type debug:

bool

returns:

Function values.

snowdrop.src.preprocessor.function.get_function_and_jacobian(model, params=None, y=None, shock=None, t=0, order=1, bSparse=False, exog=None, debug=False)[source]

Returns function and jacobian.

Parameters:
param model:

Model object.

type model:

Model.

param params:

array of parameters.

type params:

numpy.array.

param y:

Array of values of endogenous variables.

type y:

numpy.array

param shock:

The values of shocks.

type shock:

numpy.array

param t:

Time index.

type t:

int

param order:

Order of partial derivatives of the system of equations.

type order:

int

param bSparse:

If this flag is raised, the f_sparse function is called; otherwise f_dynamic is used.

type bSparse:

bool

param exog:

Exogenous process.

type exog:

numpy ndarray

param debug:

If this flag is raised, the function is compiled from files; otherwise the cached version is used.

type debug:

bool

returns:

Function, Jacobian, Hessian, and third order derivatives matrices.

snowdrop.src.preprocessor.function_compiler module

class snowdrop.src.preprocessor.function_compiler.CountNames(known_variables, known_functions, known_constants)[source]

Bases: NodeVisitor

visit_Call(call)[source]
visit_Name(cname)[source]
snowdrop.src.preprocessor.function_compiler.compile_function_ast(equations, symbols, arg_names, output_names=None, function_name='anonymous', rhs_only=False, return_ast=False, print_code=False, definitions=None, vectorize=True, use_file=False)[source]
snowdrop.src.preprocessor.function_compiler.eval_ast(mod)[source]
snowdrop.src.preprocessor.function_compiler.eval_with_diff(f, args, add_args, epsilon=1e-08)[source]

f is a guvectorized function f(x1, x2, …, xn, y1, …, yp). args is a list of vectors [x1, …, xn] and add_args is a list of vectors [y1, …, yp]. The function returns a list [r, dx1, …, dxn], where r is the vector value of f at (x1, …, xn, y1, …, yp) and dxi is the Jacobian with respect to xi.

TODO: generalize when x1, …, xn have non-core dimensions
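
A plain-Python sketch of the same idea for scalar arguments (not the library routine; forward differences only):

>>> def eval_with_fd(f, args, add_args, epsilon=1e-8):
...     base = f(*args, *add_args)
...     out = [base]
...     for i, x in enumerate(args):
...         shifted = list(args)
...         shifted[i] = x + epsilon
...         out.append((f(*shifted, *add_args) - base) / epsilon)
...     return out
>>> r, dx1, dx2 = eval_with_fd(lambda x1, x2, y: x1 * x2 + y, [2.0, 3.0], [1.0])
>>> round(r), round(dx1), round(dx2)
(7, 3, 2)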

snowdrop.src.preprocessor.function_compiler.get_deps(incidence, var, visited=None)[source]
snowdrop.src.preprocessor.function_compiler.make_method(equations, arguments, constants, targets=None, rhs_only=False, definitions={}, function_name='anonymous')[source]
snowdrop.src.preprocessor.function_compiler.parse(s)[source]
class snowdrop.src.preprocessor.function_compiler.standard_function(fun, n_output)[source]

Bases: object

epsilon = 1e-08
snowdrop.src.preprocessor.function_compiler.tshift(t, n)[source]
snowdrop.src.preprocessor.function_compiler.unique(seq)[source]

snowdrop.src.preprocessor.function_compiler_sympy module

snowdrop.src.preprocessor.function_compiler_sympy.ast_to_sympy(expr)[source]

Convert an AST to a sympy expression.

Parameters:
param expr:

AST expression.

type expr:

ast.

returns:

Sympy expression.
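
A sketch, assuming the function accepts the body of an expression parsed with ast.parse(..., mode='eval'):

>>> import ast
>>> from snowdrop.src.preprocessor.function_compiler_sympy import ast_to_sympy
>>> node = ast.parse('a*x + b', mode='eval').body
>>> expr = ast_to_sympy(node)  # a sympy expression equivalent to a*x + b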

snowdrop.src.preprocessor.function_compiler_sympy.compile_function(equations, syms, params, syms_exog=[], eq_vars=[], function_name='func', out='f_func', b=False, model_name='', log_variables=[])[source]

From a list of equations and variables, define a multivariate function with higher-order derivatives.

Parameters:
param equations:

List of equations.

type equations:

list.

param syms:

Symbols.

type syms:

list.

param params:

Parameters.

type params:

list.

param syms_exog:

Exogenous variables symbols.

type syms_exog:

list

param eq_vars:

List of variables in each equation.

type eq_vars:

list.

param function_name:

Function name.

type function_name:

str.

param out:

Name of file that contains this function source code.

type out:

str.

param b:

If True, use the JAX package; otherwise use numpy.

type b:

bool.

param model_name:

Model name.

type model_name:

str.

returns:

Compiled function.

snowdrop.src.preprocessor.function_compiler_sympy.compile_higher_order_function(equations, syms, params, syms_exog=[], eq_vars=[], order=1, function_name='anonymous', out='f_dynamic', b=False, bSparse=False, model_name='', log_variables=[])[source]

From a list of equations and variables, define a multivariate function with higher-order derivatives.

Parameters:
param equations:

List of equations.

type equations:

list.

param syms:

Symbols.

type syms:

list.

param params:

Parameters.

type params:

list.

param syms_exog:

Exogenous variables symbols.

type syms_exog:

list

param eq_vars:

List of variables in each equation.

type eq_vars:

list.

param order:

Order of partial derivatives of Jacobian.

type order:

int.

param function_name:

Function name.

type function_name:

str.

param out:

Name of file that contains this function source code.

type out:

str.

param b:

If True, use the cupy package; otherwise use numpy.

type b:

bool.

param bSparse:

True if sparse algebra is used.

type bSparse:

bool.

param model_name:

Model name.

type model_name:

str.

returns:

Function and matrices of partial derivatives.

snowdrop.src.preprocessor.function_compiler_sympy.compile_jacobian(equations, syms, params, syms_exog=[], eq_vars=[], function_name='jacob', out='f_jacob', bSparse=False, b=False, model_name='', log_variables=[])[source]

From a list of equations and variables, define a multivariate function with higher-order derivatives.

Parameters:
param equations:

List of equations.

type equations:

list.

param syms:

Symbols.

type syms:

list.

param params:

Parameters.

type params:

list.

param syms_exog:

Exogenous variables symbols.

type syms_exog:

list

param eq_vars:

List of variables in each equation.

type eq_vars:

list.

param order:

Order of partial derivatives of Jacobian.

type order:

int.

param function_name:

Function name.

type function_name:

str.

param out:

Name of file that contains this function source code.

type out:

str.

param b:

If True, use the cupy package; otherwise use numpy.

type b:

bool.

param bSparse:

True if sparse algebra is used.

type bSparse:

bool.

param model_name:

Model name.

type model_name:

str.

returns:

Compiled jacobian.

snowdrop.src.preprocessor.function_compiler_sympy.get_indices(equations, syms, eq_vars=[])[source]

From a list of equations and variables, compute row and column indices.

Parameters:
param equations:

List of equations.

type equations:

list.

param syms:

Symbols.

type syms:

list.

returns:

row and column indices.

snowdrop.src.preprocessor.function_compiler_sympy.higher_order_diff(eqs, symbols, eq_vars=[], order=1)[source]

Take higher order derivatives of a list of equations w.r.t a list of symbols.

Parameters:
param eqs:

List of equations.

type eqs:

list.

param symbols:

List of symbols.

type symbols:

list.

param eq_vars:

List of variables in each equation.

type eq_vars:

list.

param order:

Order of partial derivatives of Jacobian.

type order:

int.

returns:

Matrix of partial derivatives.
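
The underlying operation can be illustrated with plain sympy (not the library call itself): first- and second-order derivatives of two equations with respect to two symbols:

>>> import sympy as sp
>>> x, y = sp.symbols('x y')
>>> eqs = [x**2 * y, sp.log(x) + y]
>>> jac = [[sp.diff(eq, s) for s in (x, y)] for eq in eqs]
>>> jac[0]
[2*x*y, x**2]
>>> [sp.diff(jac[0][0], s) for s in (x, y)]
[2*y, 2*x]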

snowdrop.src.preprocessor.function_compiler_sympy.log_normalize_equations(equations, variables, log_variables=[], bAST=True, debug=False)[source]

Modify equations by replacing lead/lag variables with the logarithms of dated variables.

For example, it replaces:
  1. var(-1) -> log(var__m1_)

  2. var(1) -> log(var__p1_)

  3. var -> log(var__)

Parameters:
param equations:

List of equations.

type equations:

list.

param variables:

List of variables in each equation.

type variables:

list.

param bAST:

If True, use the AST tree to parse equations, stringify variables and convert them to sympy expressions. Otherwise parse equations with regular expressions (the re module).

type bAST:

bool.
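
The renaming can be sketched with plain regular expressions (illustrative only; the library also offers an AST-based path). The same renaming without the log() wrapper is what normalize_equations below performs:

>>> import re
>>> def log_normalize(eq, variables):
...     for v in variables:
...         eq = re.sub(r'\b%s\(\s*-(\d+)\s*\)' % v, r'log(%s__m\1_)' % v, eq)
...         eq = re.sub(r'\b%s\(\s*(\d+)\s*\)' % v, r'log(%s__p\1_)' % v, eq)
...         eq = re.sub(r'\b%s\b(?![_(])' % v, r'log(%s__)' % v, eq)
...     return eq
>>> log_normalize('k(1) - (1-delta)*k(-1) - i', ['k', 'i'])
'log(k__p1_) - (1-delta)*log(k__m1_) - log(i__)'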

snowdrop.src.preprocessor.function_compiler_sympy.non_decreasing_series(n, size)[source]

List all combinations of 0,…,n-1 in non-decreasing order.

Parameters:
param n:

Defines maximum number of integers in all combinations.

type n:

int.

param size:

Size of all combinations.

type size:

int.

returns:

List of combinations of 0,…,n-1 of the given size.
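
An equivalent enumeration with itertools (not the library code):

>>> from itertools import combinations_with_replacement
>>> list(combinations_with_replacement(range(3), 2))
[(0, 0), (0, 1), (0, 2), (1, 1), (1, 2), (2, 2)]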

snowdrop.src.preprocessor.function_compiler_sympy.normalize_equations(equations, variables, bAST=True, debug=True)[source]

Modify equations by replacing lead/lag variables with the dated variables.

For example, it replaces:
  1. var(-1) -> var__m1_

  2. var(1) -> var__p1_

  3. var -> var__

Parameters:
param equations:

List of equations.

type equations:

list.

param variables:

List of variables in each equation.

type variables:

list.

param bAST:

If True, use the AST tree to parse equations, stringify variables and convert them to sympy expressions. Otherwise parse equations with regular expressions (the re module).

type bAST:

bool.

snowdrop.src.preprocessor.function_compiler_sympy.test_deriv()[source]

Test derivatives of equations.

snowdrop.src.preprocessor.function_compiler_sympy.test_deriv2()[source]

Test derivatives of equations.

snowdrop.src.preprocessor.functions module

Created on Tue Apr 20 17:47:40 2021

@author: A.Goumilevski

snowdrop.src.preprocessor.functions.Abs(x)[source]
snowdrop.src.preprocessor.functions.DiracDelta(x)[source]
snowdrop.src.preprocessor.functions.Heaviside(x)[source]
snowdrop.src.preprocessor.functions.Max(*args)[source]
snowdrop.src.preprocessor.functions.Min(*args)[source]
snowdrop.src.preprocessor.functions.PNORM(x, mean=0.0, std=1.0)[source]

Troll ‘PNORM’ normal distribution function.
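
Assuming PNORM is the cumulative normal distribution function (as in TROLL), an equivalent scipy computation is:

>>> from scipy.stats import norm
>>> round(norm.cdf(1.0, loc=0.0, scale=1.0), 4)
0.8413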

snowdrop.src.preprocessor.functions.log(x)[source]

snowdrop.src.preprocessor.language module

Defines structure of LanguageElement class.

class snowdrop.src.preprocessor.language.Beta[source]

Bases: LanguageElement

Beta distribution class.

baseclass

alias of Beta

class snowdrop.src.preprocessor.language.Binomial[source]

Bases: LanguageElement

Binomial distribution class.

baseclass

alias of Binomial

class snowdrop.src.preprocessor.language.Cartesian[source]

Bases: LanguageElement

Cartesian grid class.

baseclass

alias of Cartesian

class snowdrop.src.preprocessor.language.Domain[source]

Bases: LanguageElement

Domain class.

baseclass

alias of Domain

class snowdrop.src.preprocessor.language.Gamma[source]

Bases: LanguageElement

Gamma distribution class.

baseclass

alias of Gamma

class snowdrop.src.preprocessor.language.LanguageElement[source]

Bases: dict

Class LanguageElement.

check()[source]

Inspect call signature of a callable object and its return annotation.

classmethod constructor(loader, node)[source]

Constructor.

eval(d={})[source]

Evaluate Numeric node.

class snowdrop.src.preprocessor.language.LogNormal[source]

Bases: LanguageElement

Lognormal distribution class.

baseclass

alias of LogNormal

class snowdrop.src.preprocessor.language.Logistic[source]

Bases: LanguageElement

Logistic distribution class.

baseclass

alias of Logistic

class snowdrop.src.preprocessor.language.MvNormal[source]

Bases: LanguageElement

Multivariate normal distribution class.

baseclass

alias of MvNormal

class snowdrop.src.preprocessor.language.Normal[source]

Bases: LanguageElement

Normal distribution class.

baseclass

alias of Normal

class snowdrop.src.preprocessor.language.Uniform[source]

Bases: LanguageElement

Uniform distribution class.

baseclass

alias of Uniform

snowdrop.src.preprocessor.language.c

alias of Cartesian

snowdrop.src.preprocessor.misc module

Miscellaneous code.

class snowdrop.src.preprocessor.misc.CalibrationDict(symbols=None, calib=None)[source]

Bases: OrderedDict

Dictionary that holds model calibration names and values.

Parameters:
OrderedDict:

Ordered dictionary

Usage examples:

cb = CalibrationDict(symbols, calib)

snowdrop.src.preprocessor.misc.allocating_function(inplace_function, size_output)[source]
snowdrop.src.preprocessor.misc.calibration_to_dict(symbols, calib)[source]

Build OrderedDict from dict.

Parameters:
symbols : list

Symbols.

calib : dict

Mapping of names and values.

Returns:
calibration : OrderedDict

Ordered dictionary.

snowdrop.src.preprocessor.misc.calibration_to_vector(symbols, calibration_dict)[source]

Build list of dictionary values.

Parameters:
symbols : list

Symbols.

calibration_dict : dict

Mapping of names and values.

Returns:
calibration : list

Values list.

snowdrop.src.preprocessor.misc.numdiff(fun, args)[source]

Vectorized numerical differentiation.
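
A plain-numpy forward-difference sketch of the same idea (illustrative only, not the library implementation):

>>> import numpy as np
>>> def numdiff_sketch(fun, x, eps=1e-7):
...     x = np.asarray(x, dtype=float)
...     f0 = np.asarray(fun(x))
...     jac = np.empty((f0.size, x.size))
...     for j in range(x.size):
...         xp = x.copy()
...         xp[j] += eps
...         jac[:, j] = (np.asarray(fun(xp)) - f0) / eps
...     return jac
>>> np.round(numdiff_sketch(lambda v: np.array([v[0]**2, v[0]*v[1]]), [1.0, 2.0]), 4)
array([[2., 0.],
       [2., 1.]])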

snowdrop.src.preprocessor.objects module

Wrapper for probability distribution objects.

class snowdrop.src.preprocessor.objects.Domain(**kwargs)[source]

Bases: dict

Domain class.

property max
property min

snowdrop.src.preprocessor.pattern module

class snowdrop.src.preprocessor.pattern.Compare[source]

Bases: object

compare(A, B)[source]
class snowdrop.src.preprocessor.pattern.ReplaceExpectation[source]

Bases: NodeTransformer

ast = <module 'ast' from '/home/alexei/anaconda3/lib/python3.12/ast.py'>
visit_Subscript(node)[source]
snowdrop.src.preprocessor.pattern.compare_strings(a, b)[source]
snowdrop.src.preprocessor.pattern.fix_equation(eq)[source]
snowdrop.src.preprocessor.pattern.match(m, s)[source]
snowdrop.src.preprocessor.pattern.test_strings()[source]

snowdrop.src.preprocessor.processes module

class snowdrop.src.preprocessor.processes.Beta(a, b)[source]

Bases: object

Univariate Beta distribution class.

simulate(T)[source]
class snowdrop.src.preprocessor.processes.Binomial(n, p)[source]

Bases: object

Univariate Binomial distribution class.

simulate(T)[source]
class snowdrop.src.preprocessor.processes.Cartesian(x, y, sig_z=0.001)[source]

Bases: object

Cartesian grid class.

class snowdrop.src.preprocessor.processes.Gamma(shape, scale=1.0)[source]

Bases: object

Univariate Gamma distribution class.

simulate(T)[source]
class snowdrop.src.preprocessor.processes.LogNormal(Mu=0.0, Sigma=1.0)[source]

Bases: object

Univariate LogNormal distribution class.

simulate(T)[source]
class snowdrop.src.preprocessor.processes.Logistic(loc=0.0, scale=1.0)[source]

Bases: object

Univariate Logistic distribution class.

simulate(T)[source]
class snowdrop.src.preprocessor.processes.MvNormal(mean=[0], cov=[[1.0]])[source]

Bases: object

Multivariate normal distribution class.

simulate(N, T)[source]
class snowdrop.src.preprocessor.processes.Normal(loc=0, scale=1.0)[source]

Bases: object

Univariate Normal distribution class.

simulate(T)[source]
class snowdrop.src.preprocessor.processes.Uniform(low=0, high=1)[source]

Bases: object

Univariate Uniform distribution class.

simulate(T)[source]
snowdrop.src.preprocessor.processes.beta(a, b, size=None)

Draw samples from a Beta distribution.

The Beta distribution is a special case of the Dirichlet distribution, and is related to the Gamma distribution. It has the probability distribution function

\[f(x; a,b) = \frac{1}{B(\alpha, \beta)} x^{\alpha - 1} (1 - x)^{\beta - 1},\]

where the normalization, B, is the beta function,

\[B(\alpha, \beta) = \int_0^1 t^{\alpha - 1} (1 - t)^{\beta - 1} dt.\]

It is often seen in Bayesian inference and order statistics.

Note

New code should use the ~numpy.random.Generator.beta method of a ~numpy.random.Generator instance instead; please see the random-quick-start.

Parameters

a : float or array_like of floats

Alpha, positive (>0).

b : float or array_like of floats

Beta, positive (>0).

size : int or tuple of ints, optional

Output shape. If the given shape is, e.g., (m, n, k), then m * n * k samples are drawn. If size is None (default), a single value is returned if a and b are both scalars. Otherwise, np.broadcast(a, b).size samples are drawn.

Returns

out : ndarray or scalar

Drawn samples from the parameterized beta distribution.

See Also

random.Generator.beta: which should be used for new code.

snowdrop.src.preprocessor.processes.binomial(n, p, size=None)

Draw samples from a binomial distribution.

Samples are drawn from a binomial distribution with specified parameters, n trials and p probability of success where n an integer >= 0 and p is in the interval [0,1]. (n may be input as a float, but it is truncated to an integer in use)

Note

New code should use the ~numpy.random.Generator.binomial method of a ~numpy.random.Generator instance instead; please see the random-quick-start.

Parameters

n : int or array_like of ints

Parameter of the distribution, >= 0. Floats are also accepted, but they will be truncated to integers.

p : float or array_like of floats

Parameter of the distribution, >= 0 and <=1.

size : int or tuple of ints, optional

Output shape. If the given shape is, e.g., (m, n, k), then m * n * k samples are drawn. If size is None (default), a single value is returned if n and p are both scalars. Otherwise, np.broadcast(n, p).size samples are drawn.

Returns

out : ndarray or scalar

Drawn samples from the parameterized binomial distribution, where each sample is equal to the number of successes over the n trials.

See Also

scipy.stats.binom : probability density function, distribution or cumulative density function, etc.

random.Generator.binomial: which should be used for new code.

Notes

The probability density for the binomial distribution is

\[P(N) = \binom{n}{N}p^N(1-p)^{n-N},\]

where \(n\) is the number of trials, \(p\) is the probability of success, and \(N\) is the number of successes.

When estimating the standard error of a proportion in a population by using a random sample, the normal distribution works well unless the product p*n <=5, where p = population proportion estimate, and n = number of samples, in which case the binomial distribution is used instead. For example, a sample of 15 people shows 4 who are left handed, and 11 who are right handed. Then p = 4/15 = 27%. 0.27*15 = 4, so the binomial distribution should be used in this case.

References

Examples

Draw samples from the distribution:

>>> n, p = 10, .5  # number of trials, probability of each trial
>>> s = np.random.binomial(n, p, 1000)
# result of flipping a coin 10 times, tested 1000 times.

A real world example. A company drills 9 wild-cat oil exploration wells, each with an estimated probability of success of 0.1. All nine wells fail. What is the probability of that happening?

Let’s do 20,000 trials of the model, and count the number that generate zero positive results.

>>> sum(np.random.binomial(9, 0.1, 20000) == 0)/20000.
# answer = 0.38885, or 38%.
snowdrop.src.preprocessor.processes.gamma(shape, scale=1.0, size=None)

Draw samples from a Gamma distribution.

Samples are drawn from a Gamma distribution with specified parameters, shape (sometimes designated “k”) and scale (sometimes designated “theta”), where both parameters are > 0.

Note

New code should use the ~numpy.random.Generator.gamma method of a ~numpy.random.Generator instance instead; please see the random-quick-start.

Parameters

shape : float or array_like of floats

The shape of the gamma distribution. Must be non-negative.

scale : float or array_like of floats, optional

The scale of the gamma distribution. Must be non-negative. Default is equal to 1.

size : int or tuple of ints, optional

Output shape. If the given shape is, e.g., (m, n, k), then m * n * k samples are drawn. If size is None (default), a single value is returned if shape and scale are both scalars. Otherwise, np.broadcast(shape, scale).size samples are drawn.

Returns

out : ndarray or scalar

Drawn samples from the parameterized gamma distribution.

See Also

scipy.stats.gamma : probability density function, distribution or cumulative density function, etc.

random.Generator.gamma: which should be used for new code.

Notes

The probability density for the Gamma distribution is

\[p(x) = x^{k-1}\frac{e^{-x/\theta}}{\theta^k\Gamma(k)},\]

where \(k\) is the shape and \(\theta\) the scale, and \(\Gamma\) is the Gamma function.

The Gamma distribution is often used to model the times to failure of electronic components, and arises naturally in processes for which the waiting times between Poisson distributed events are relevant.

References

Examples

Draw samples from the distribution:

>>> shape, scale = 2., 2.  # mean=4, std=2*sqrt(2)
>>> s = np.random.gamma(shape, scale, 1000)

Display the histogram of the samples, along with the probability density function:

>>> import matplotlib.pyplot as plt
>>> import scipy.special as sps  
>>> count, bins, ignored = plt.hist(s, 50, density=True)
>>> y = bins**(shape-1)*(np.exp(-bins/scale) /  
...                      (sps.gamma(shape)*scale**shape))
>>> plt.plot(bins, y, linewidth=2, color='r')  
>>> plt.show()
snowdrop.src.preprocessor.processes.logistic(loc=0.0, scale=1.0, size=None)

Draw samples from a logistic distribution.

Samples are drawn from a logistic distribution with specified parameters, loc (location or mean, also median), and scale (>0).

Note

New code should use the ~numpy.random.Generator.logistic method of a ~numpy.random.Generator instance instead; please see the random-quick-start.

Parameters

loc : float or array_like of floats, optional

Parameter of the distribution. Default is 0.

scale : float or array_like of floats, optional

Parameter of the distribution. Must be non-negative. Default is 1.

size : int or tuple of ints, optional

Output shape. If the given shape is, e.g., (m, n, k), then m * n * k samples are drawn. If size is None (default), a single value is returned if loc and scale are both scalars. Otherwise, np.broadcast(loc, scale).size samples are drawn.

Returns

out : ndarray or scalar

Drawn samples from the parameterized logistic distribution.

See Also

scipy.stats.logistic : probability density function, distribution or cumulative density function, etc.

random.Generator.logistic: which should be used for new code.

Notes

The probability density for the Logistic distribution is

\[P(x) = \frac{e^{-(x-\mu)/s}}{s(1+e^{-(x-\mu)/s})^2},\]

where \(\mu\) = location and \(s\) = scale.

The Logistic distribution is used in Extreme Value problems where it can act as a mixture of Gumbel distributions, in Epidemiology, and by the World Chess Federation (FIDE) where it is used in the Elo ranking system, assuming the performance of each player is a logistically distributed random variable.

References

Examples

Draw samples from the distribution:

>>> loc, scale = 10, 1
>>> s = np.random.logistic(loc, scale, 10000)
>>> import matplotlib.pyplot as plt
>>> count, bins, ignored = plt.hist(s, bins=50)

# plot against distribution

>>> def logist(x, loc, scale):
...     return np.exp((loc-x)/scale)/(scale*(1+np.exp((loc-x)/scale))**2)
>>> lgst_val = logist(bins, loc, scale)
>>> plt.plot(bins, lgst_val * count.max() / lgst_val.max())
>>> plt.show()
snowdrop.src.preprocessor.processes.lognormal(mean=0.0, sigma=1.0, size=None)

Draw samples from a log-normal distribution.

Draw samples from a log-normal distribution with specified mean, standard deviation, and array shape. Note that the mean and standard deviation are not the values for the distribution itself, but of the underlying normal distribution it is derived from.

Note

New code should use the ~numpy.random.Generator.lognormal method of a ~numpy.random.Generator instance instead; please see the random-quick-start.

Parameters

mean : float or array_like of floats, optional

Mean value of the underlying normal distribution. Default is 0.

sigma : float or array_like of floats, optional

Standard deviation of the underlying normal distribution. Must be non-negative. Default is 1.

size : int or tuple of ints, optional

Output shape. If the given shape is, e.g., (m, n, k), then m * n * k samples are drawn. If size is None (default), a single value is returned if mean and sigma are both scalars. Otherwise, np.broadcast(mean, sigma).size samples are drawn.

Returns

out : ndarray or scalar

Drawn samples from the parameterized log-normal distribution.

See Also

scipy.stats.lognorm : probability density function, distribution, cumulative density function, etc.

random.Generator.lognormal: which should be used for new code.

Notes

A variable x has a log-normal distribution if log(x) is normally distributed. The probability density function for the log-normal distribution is:

\[p(x) = \frac{1}{\sigma x \sqrt{2\pi}} e^{(-\frac{(ln(x)-\mu)^2}{2\sigma^2})}\]

where \(\mu\) is the mean and \(\sigma\) is the standard deviation of the normally distributed logarithm of the variable. A log-normal distribution results if a random variable is the product of a large number of independent, identically-distributed variables in the same way that a normal distribution results if the variable is the sum of a large number of independent, identically-distributed variables.

References

Examples

Draw samples from the distribution:

>>> mu, sigma = 3., 1. # mean and standard deviation
>>> s = np.random.lognormal(mu, sigma, 1000)

Display the histogram of the samples, along with the probability density function:

>>> import matplotlib.pyplot as plt
>>> count, bins, ignored = plt.hist(s, 100, density=True, align='mid')
>>> x = np.linspace(min(bins), max(bins), 10000)
>>> pdf = (np.exp(-(np.log(x) - mu)**2 / (2 * sigma**2))
...        / (x * sigma * np.sqrt(2 * np.pi)))
>>> plt.plot(x, pdf, linewidth=2, color='r')
>>> plt.axis('tight')
>>> plt.show()

Demonstrate that taking the products of random samples from a uniform distribution can be fit well by a log-normal probability density function.

>>> # Generate a thousand samples: each is the product of 100 random
>>> # values, drawn from a normal distribution.
>>> b = []
>>> for i in range(1000):
...    a = 10. + np.random.standard_normal(100)
...    b.append(np.prod(a))
>>> b = np.array(b) / np.min(b) # scale values to be positive
>>> count, bins, ignored = plt.hist(b, 100, density=True, align='mid')
>>> sigma = np.std(np.log(b))
>>> mu = np.mean(np.log(b))
>>> x = np.linspace(min(bins), max(bins), 10000)
>>> pdf = (np.exp(-(np.log(x) - mu)**2 / (2 * sigma**2))
...        / (x * sigma * np.sqrt(2 * np.pi)))
>>> plt.plot(x, pdf, color='r', linewidth=2)
>>> plt.show()
snowdrop.src.preprocessor.processes.multivariate_normal(mean, cov, size=None, check_valid='warn', tol=1e-8)

Draw random samples from a multivariate normal distribution.

The multivariate normal, multinormal or Gaussian distribution is a generalization of the one-dimensional normal distribution to higher dimensions. Such a distribution is specified by its mean and covariance matrix. These parameters are analogous to the mean (average or “center”) and variance (standard deviation, or “width,” squared) of the one-dimensional normal distribution.

Note

New code should use the ~numpy.random.Generator.multivariate_normal method of a ~numpy.random.Generator instance instead; please see the random-quick-start.

Parameters

mean : 1-D array_like, of length N

Mean of the N-dimensional distribution.

cov : 2-D array_like, of shape (N, N)

Covariance matrix of the distribution. It must be symmetric and positive-semidefinite for proper sampling.

size : int or tuple of ints, optional

Given a shape of, for example, (m,n,k), m*n*k samples are generated, and packed in an m-by-n-by-k arrangement. Because each sample is N-dimensional, the output shape is (m,n,k,N). If no shape is specified, a single (N-D) sample is returned.

check_valid : { ‘warn’, ‘raise’, ‘ignore’ }, optional

Behavior when the covariance matrix is not positive semidefinite.

tol : float, optional

Tolerance when checking the singular values in covariance matrix. cov is cast to double before the check.

Returns

out : ndarray

The drawn samples, of shape size, if that was provided. If not, the shape is (N,).

In other words, each entry out[i,j,...,:] is an N-dimensional value drawn from the distribution.

See Also

random.Generator.multivariate_normal: which should be used for new code.

Notes

The mean is a coordinate in N-dimensional space, which represents the location where samples are most likely to be generated. This is analogous to the peak of the bell curve for the one-dimensional or univariate normal distribution.

Covariance indicates the level to which two variables vary together. From the multivariate normal distribution, we draw N-dimensional samples, \(X = [x_1, x_2, ... x_N]\). The covariance matrix element \(C_{ij}\) is the covariance of \(x_i\) and \(x_j\). The element \(C_{ii}\) is the variance of \(x_i\) (i.e. its “spread”).

Instead of specifying the full covariance matrix, popular approximations include:

  • Spherical covariance (cov is a multiple of the identity matrix)

  • Diagonal covariance (cov has non-negative elements, and only on the diagonal)

This geometrical property can be seen in two dimensions by plotting generated data-points:

>>> mean = [0, 0]
>>> cov = [[1, 0], [0, 100]]  # diagonal covariance

Diagonal covariance means that points are oriented along x or y-axis:

>>> import matplotlib.pyplot as plt
>>> x, y = np.random.multivariate_normal(mean, cov, 5000).T
>>> plt.plot(x, y, 'x')
>>> plt.axis('equal')
>>> plt.show()

Note that the covariance matrix must be positive semidefinite (a.k.a. nonnegative-definite). Otherwise, the behavior of this method is undefined and backwards compatibility is not guaranteed.

References

Examples

>>> mean = (1, 2)
>>> cov = [[1, 0], [0, 1]]
>>> x = np.random.multivariate_normal(mean, cov, (3, 3))
>>> x.shape
(3, 3, 2)

Here we generate 800 samples from the bivariate normal distribution with mean [0, 0] and covariance matrix [[6, -3], [-3, 3.5]]. The expected variances of the first and second components of the sample are 6 and 3.5, respectively, and the expected correlation coefficient is -3/sqrt(6*3.5) ≈ -0.65465.

>>> cov = np.array([[6, -3], [-3, 3.5]])
>>> pts = np.random.multivariate_normal([0, 0], cov, size=800)

Check that the mean, covariance, and correlation coefficient of the sample are close to the expected values:

>>> pts.mean(axis=0)
array([ 0.0326911 , -0.01280782])  # may vary
>>> np.cov(pts.T)
array([[ 5.96202397, -2.85602287],
       [-2.85602287,  3.47613949]])  # may vary
>>> np.corrcoef(pts.T)[0, 1]
-0.6273591314603949  # may vary

We can visualize this data with a scatter plot. The orientation of the point cloud illustrates the negative correlation of the components of this sample.

>>> import matplotlib.pyplot as plt
>>> plt.plot(pts[:, 0], pts[:, 1], '.', alpha=0.5)
>>> plt.axis('equal')
>>> plt.grid()
>>> plt.show()
snowdrop.src.preprocessor.processes.normal(loc=0.0, scale=1.0, size=None)

Draw random samples from a normal (Gaussian) distribution.

The probability density function of the normal distribution, first derived by De Moivre and 200 years later by both Gauss and Laplace independently [2]_, is often called the bell curve because of its characteristic shape (see the example below).

The normal distributions occurs often in nature. For example, it describes the commonly occurring distribution of samples influenced by a large number of tiny, random disturbances, each with its own unique distribution [2]_.

Note

New code should use the ~numpy.random.Generator.normal method of a ~numpy.random.Generator instance instead; please see the random-quick-start.

Parameters

loc : float or array_like of floats

Mean (“centre”) of the distribution.

scale : float or array_like of floats

Standard deviation (spread or “width”) of the distribution. Must be non-negative.

size : int or tuple of ints, optional

Output shape. If the given shape is, e.g., (m, n, k), then m * n * k samples are drawn. If size is None (default), a single value is returned if loc and scale are both scalars. Otherwise, np.broadcast(loc, scale).size samples are drawn.

Returns

out : ndarray or scalar

Drawn samples from the parameterized normal distribution.

See Also

scipy.stats.norm : probability density function, distribution or cumulative density function, etc.

random.Generator.normal: which should be used for new code.

Notes

The probability density for the Gaussian distribution is

\[p(x) = \frac{1}{\sqrt{ 2 \pi \sigma^2 }} e^{ - \frac{ (x - \mu)^2 } {2 \sigma^2} },\]

where \(\mu\) is the mean and \(\sigma\) the standard deviation. The square of the standard deviation, \(\sigma^2\), is called the variance.

The function has its peak at the mean, and its “spread” increases with the standard deviation (the function reaches 0.607 times its maximum at \(\mu + \sigma\) and \(\mu - \sigma\) [2]_). This implies that normal is more likely to return samples lying close to the mean, rather than those far away.

References

Examples

Draw samples from the distribution:

>>> mu, sigma = 0, 0.1 # mean and standard deviation
>>> s = np.random.normal(mu, sigma, 1000)

Verify the mean and the variance:

>>> abs(mu - np.mean(s))
0.0  # may vary
>>> abs(sigma - np.std(s, ddof=1))
0.1  # may vary

Display the histogram of the samples, along with the probability density function:

>>> import matplotlib.pyplot as plt
>>> count, bins, ignored = plt.hist(s, 30, density=True)
>>> plt.plot(bins, 1/(sigma * np.sqrt(2 * np.pi)) *
...                np.exp( - (bins - mu)**2 / (2 * sigma**2) ),
...          linewidth=2, color='r')
>>> plt.show()

Two-by-four array of samples from the normal distribution with mean 3 and standard deviation 2.5:

>>> np.random.normal(3, 2.5, size=(2, 4))
array([[-4.49401501,  4.00950034, -1.81814867,  7.29718677],   # random
       [ 0.39924804,  4.68456316,  4.99394529,  4.84057254]])  # random
snowdrop.src.preprocessor.processes.uniform(low=0.0, high=1.0, size=None)

Draw samples from a uniform distribution.

Samples are uniformly distributed over the half-open interval [low, high) (includes low, but excludes high). In other words, any value within the given interval is equally likely to be drawn by uniform.

Note

New code should use the ~numpy.random.Generator.uniform method of a ~numpy.random.Generator instance instead; please see the random-quick-start.

Parameters

low : float or array_like of floats, optional

Lower boundary of the output interval. All values generated will be greater than or equal to low. The default value is 0.

high : float or array_like of floats

Upper boundary of the output interval. All values generated will be less than or equal to high. The high limit may be included in the returned array of floats due to floating-point rounding in the equation low + (high-low) * random_sample(). The default value is 1.0.

size : int or tuple of ints, optional

Output shape. If the given shape is, e.g., (m, n, k), then m * n * k samples are drawn. If size is None (default), a single value is returned if low and high are both scalars. Otherwise, np.broadcast(low, high).size samples are drawn.

Returns

out : ndarray or scalar

Drawn samples from the parameterized uniform distribution.

See Also

randint : Discrete uniform distribution, yielding integers.

random_integers : Discrete uniform distribution over the closed interval [low, high].

random_sample : Floats uniformly distributed over [0, 1).

random : Alias for random_sample.

rand : Convenience function that accepts dimensions as input, e.g., rand(2,2) would generate a 2-by-2 array of floats, uniformly distributed over [0, 1).

random.Generator.uniform: which should be used for new code.

Notes

The probability density function of the uniform distribution is

\[p(x) = \frac{1}{b - a}\]

anywhere within the interval [a, b), and zero elsewhere.

When high == low, values of low will be returned. If high < low, the results are officially undefined and may eventually raise an error, i.e. do not rely on this function to behave when passed arguments satisfying that inequality condition. The high limit may be included in the returned array of floats due to floating-point rounding in the equation low + (high-low) * random_sample(). For example:

>>> x = np.float32(5*0.99999999)
>>> x
5.0

Examples

Draw samples from the distribution:

>>> s = np.random.uniform(-1,0,1000)

All values are within the given interval:

>>> np.all(s >= -1)
True
>>> np.all(s < 0)
True

Display the histogram of the samples, along with the probability density function:

>>> import matplotlib.pyplot as plt
>>> count, bins, ignored = plt.hist(s, 15, density=True)
>>> plt.plot(bins, np.ones_like(bins), linewidth=2, color='r')
>>> plt.show()

snowdrop.src.preprocessor.processes_new module

class snowdrop.src.preprocessor.processes_new.Beta(a, b)[source]

Bases: object

Univariate Beta distribution class.

simulate(T)[source]

Return a sample draw from beta distribution.

Parameters:
T : int

Time.

Returns:
result : ndarray or scalar

Drawn samples from the parameterized beta distribution.

class snowdrop.src.preprocessor.processes_new.Binomial(n, p)[source]

Bases: object

Binomial distribution class.

simulate(T)[source]

Return a sample draw from binomial distribution.

Parameters:
T : int

Time.

Returns:
result : ndarray or scalar

Drawn samples from the parameterized binomial distribution.

class snowdrop.src.preprocessor.processes_new.Gamma(shape, scale=1.0)[source]

Bases: object

Univariate Gamma distribution class.

simulate(T)[source]

Return a sample draw from gamma distribution.

Parameters:
T : int

Time.

Returns:
result : ndarray or scalar

Drawn samples from the parameterized gamma distribution.

class snowdrop.src.preprocessor.processes_new.LogNormal(Mu=0.0, Sigma=1.0)[source]

Bases: object

Univariate LogNormal distribution class.

simulate(T)[source]

Return a sample draw from log-normal distribution.

Parameters:
T : int

Time.

Returns:
result : ndarray or scalar

Drawn samples from the parameterized log-normal distribution.

class snowdrop.src.preprocessor.processes_new.Logistic(loc=0.0, scale=1.0)[source]

Bases: object

Univariate Logistic distribution class.

simulate(T)[source]

Return a sample draw from logistic distribution.

Parameters:
T : int

Time.

Returns:
result : ndarray or scalar

Drawn samples from the parameterized logistic distribution.

class snowdrop.src.preprocessor.processes_new.MvNormal(mean=[0], cov=[[1.0]])[source]

Bases: object

Multivariate normal distribution class.

simulate(N, T)[source]

Return a sample draw from multivariate normal distribution.

Parameters:
N : int

Sample size.

T : int

Time.

Returns:
ndarray or scalar

Drawn samples from the parameterized multivariate normal distribution.

class snowdrop.src.preprocessor.processes_new.Normal(loc=0, scale=1.0)[source]

Bases: object

Normal distribution class.

simulate(T)[source]

Return a sample draw from normal distribution.

Parameters:
T : int

Time.

Returns:
result : ndarray or scalar

Drawn samples from the parameterized normal distribution.

class snowdrop.src.preprocessor.processes_new.Uniform(low=0, high=1)[source]

Bases: object

Univariate Uniform distribution class.

simulate(T)[source]

Return a sample draw from uniform distribution.

Parameters:
T : int

Time.

Returns:
result : ndarray or scalar

Drawn samples from the parameterized uniform distribution.

snowdrop.src.preprocessor.processes_old module

class snowdrop.src.preprocessor.processes_old.Beta(a, b)[source]

Bases: object

Univariate Beta distribution class.

simulate(T)[source]
class snowdrop.src.preprocessor.processes_old.Binomial(n, p)[source]

Bases: object

Univariate Binomial distribution class.

simulate(T)[source]
class snowdrop.src.preprocessor.processes_old.Gamma(shape, scale=1.0)[source]

Bases: object

Univariate Gamma distribution class.

simulate(T)[source]
class snowdrop.src.preprocessor.processes_old.LogNormal(Mu=0.0, Sigma=1.0)[source]

Bases: object

Univariate LogNormal distribution class.

simulate(T)[source]
class snowdrop.src.preprocessor.processes_old.Logistic(loc=0.0, scale=1.0)[source]

Bases: object

Univariate Logistic distribution class.

simulate(T)[source]
class snowdrop.src.preprocessor.processes_old.MvNormal(mean=[0], cov=[[1.0]])[source]

Bases: object

Multivariate normal distribution class.

simulate(N, T)[source]
class snowdrop.src.preprocessor.processes_old.Normal(loc=0, scale=1.0)[source]

Bases: object

Univariate Normal distribution class.

simulate(T)[source]
class snowdrop.src.preprocessor.processes_old.Uniform(low=0, high=1)[source]

Bases: object

Univariate Uniform distribution class.

simulate(T)[source]
snowdrop.src.preprocessor.processes_old.beta(a, b, size=None)

Draw samples from a Beta distribution.

The Beta distribution is a special case of the Dirichlet distribution, and is related to the Gamma distribution. It has the probability distribution function

\[f(x; a,b) = \frac{1}{B(\alpha, \beta)} x^{\alpha - 1} (1 - x)^{\beta - 1},\]

where the normalization, B, is the beta function,

\[B(\alpha, \beta) = \int_0^1 t^{\alpha - 1} (1 - t)^{\beta - 1} dt.\]

It is often seen in Bayesian inference and order statistics.

Note

New code should use the ~numpy.random.Generator.beta method of a ~numpy.random.Generator instance instead; please see the random-quick-start.

Parameters

a : float or array_like of floats

Alpha, positive (>0).

b : float or array_like of floats

Beta, positive (>0).

size : int or tuple of ints, optional

Output shape. If the given shape is, e.g., (m, n, k), then m * n * k samples are drawn. If size is None (default), a single value is returned if a and b are both scalars. Otherwise, np.broadcast(a, b).size samples are drawn.

Returns

out : ndarray or scalar

Drawn samples from the parameterized beta distribution.

See Also

random.Generator.beta: which should be used for new code.

snowdrop.src.preprocessor.processes_old.binomial(n, p, size=None)

Draw samples from a binomial distribution.

Samples are drawn from a binomial distribution with specified parameters, n trials and p probability of success where n an integer >= 0 and p is in the interval [0,1]. (n may be input as a float, but it is truncated to an integer in use)

Note

New code should use the ~numpy.random.Generator.binomial method of a ~numpy.random.Generator instance instead; please see the random-quick-start.

Parameters

n : int or array_like of ints

Parameter of the distribution, >= 0. Floats are also accepted, but they will be truncated to integers.

p : float or array_like of floats

Parameter of the distribution, >= 0 and <=1.

size : int or tuple of ints, optional

Output shape. If the given shape is, e.g., (m, n, k), then m * n * k samples are drawn. If size is None (default), a single value is returned if n and p are both scalars. Otherwise, np.broadcast(n, p).size samples are drawn.

Returns

out : ndarray or scalar

Drawn samples from the parameterized binomial distribution, where each sample is equal to the number of successes over the n trials.

See Also

scipy.stats.binom : probability density function, distribution or cumulative density function, etc.

random.Generator.binomial: which should be used for new code.

Notes

The probability density for the binomial distribution is

\[P(N) = \binom{n}{N}p^N(1-p)^{n-N},\]

where \(n\) is the number of trials, \(p\) is the probability of success, and \(N\) is the number of successes.

When estimating the standard error of a proportion in a population by using a random sample, the normal distribution works well unless the product p*n <=5, where p = population proportion estimate, and n = number of samples, in which case the binomial distribution is used instead. For example, a sample of 15 people shows 4 who are left handed, and 11 who are right handed. Then p = 4/15 = 27%. 0.27*15 = 4, so the binomial distribution should be used in this case.

References

Examples

Draw samples from the distribution:

>>> n, p = 10, .5  # number of trials, probability of each trial
>>> s = np.random.binomial(n, p, 1000)
# result of flipping a coin 10 times, tested 1000 times.

A real world example. A company drills 9 wild-cat oil exploration wells, each with an estimated probability of success of 0.1. All nine wells fail. What is the probability of that happening?

Let’s do 20,000 trials of the model, and count the number that generate zero positive results.

>>> sum(np.random.binomial(9, 0.1, 20000) == 0)/20000.
# answer = 0.38885, or 38%.
snowdrop.src.preprocessor.processes_old.gamma(shape, scale=1.0, size=None)

Draw samples from a Gamma distribution.

Samples are drawn from a Gamma distribution with specified parameters, shape (sometimes designated “k”) and scale (sometimes designated “theta”), where both parameters are > 0.

Note

New code should use the ~numpy.random.Generator.gamma method of a ~numpy.random.Generator instance instead; please see the random-quick-start.

Parameters

shape : float or array_like of floats

The shape of the gamma distribution. Must be non-negative.

scale : float or array_like of floats, optional

The scale of the gamma distribution. Must be non-negative. Default is equal to 1.

size : int or tuple of ints, optional

Output shape. If the given shape is, e.g., (m, n, k), then m * n * k samples are drawn. If size is None (default), a single value is returned if shape and scale are both scalars. Otherwise, np.broadcast(shape, scale).size samples are drawn.

Returns

out : ndarray or scalar

Drawn samples from the parameterized gamma distribution.

See Also

scipy.stats.gamma : probability density function, distribution or cumulative density function, etc.

random.Generator.gamma: which should be used for new code.

Notes

The probability density for the Gamma distribution is

\[p(x) = x^{k-1}\frac{e^{-x/\theta}}{\theta^k\Gamma(k)},\]

where \(k\) is the shape and \(\theta\) the scale, and \(\Gamma\) is the Gamma function.

The Gamma distribution is often used to model the times to failure of electronic components, and arises naturally in processes for which the waiting times between Poisson distributed events are relevant.

References

Examples

Draw samples from the distribution:

>>> shape, scale = 2., 2.  # mean=4, std=2*sqrt(2)
>>> s = np.random.gamma(shape, scale, 1000)

Display the histogram of the samples, along with the probability density function:

>>> import matplotlib.pyplot as plt
>>> import scipy.special as sps  
>>> count, bins, ignored = plt.hist(s, 50, density=True)
>>> y = bins**(shape-1)*(np.exp(-bins/scale) /  
...                      (sps.gamma(shape)*scale**shape))
>>> plt.plot(bins, y, linewidth=2, color='r')  
>>> plt.show()
snowdrop.src.preprocessor.processes_old.logistic(loc=0.0, scale=1.0, size=None)

Draw samples from a logistic distribution.

Samples are drawn from a logistic distribution with specified parameters, loc (location or mean, also median), and scale (>0).

Note

New code should use the ~numpy.random.Generator.logistic method of a ~numpy.random.Generator instance instead; please see the random-quick-start.

Parameters

loc : float or array_like of floats, optional

Parameter of the distribution. Default is 0.

scale : float or array_like of floats, optional

Parameter of the distribution. Must be non-negative. Default is 1.

size : int or tuple of ints, optional

Output shape. If the given shape is, e.g., (m, n, k), then m * n * k samples are drawn. If size is None (default), a single value is returned if loc and scale are both scalars. Otherwise, np.broadcast(loc, scale).size samples are drawn.

Returns

out : ndarray or scalar

Drawn samples from the parameterized logistic distribution.

See Also

scipy.stats.logistic : probability density function, distribution or cumulative density function, etc.

random.Generator.logistic: which should be used for new code.

Notes

The probability density for the Logistic distribution is

\[P(x) = \frac{e^{-(x-\mu)/s}}{s(1+e^{-(x-\mu)/s})^2},\]

where \(\mu\) = location and \(s\) = scale.

The Logistic distribution is used in Extreme Value problems where it can act as a mixture of Gumbel distributions, in Epidemiology, and by the World Chess Federation (FIDE) where it is used in the Elo ranking system, assuming the performance of each player is a logistically distributed random variable.

References

Examples

Draw samples from the distribution:

>>> loc, scale = 10, 1
>>> s = np.random.logistic(loc, scale, 10000)
>>> import matplotlib.pyplot as plt
>>> count, bins, ignored = plt.hist(s, bins=50)

# plot against distribution

>>> def logist(x, loc, scale):
...     return np.exp((loc-x)/scale)/(scale*(1+np.exp((loc-x)/scale))**2)
>>> lgst_val = logist(bins, loc, scale)
>>> plt.plot(bins, lgst_val * count.max() / lgst_val.max())
>>> plt.show()
snowdrop.src.preprocessor.processes_old.lognormal(mean=0.0, sigma=1.0, size=None)

Draw samples from a log-normal distribution.

Draw samples from a log-normal distribution with specified mean, standard deviation, and array shape. Note that the mean and standard deviation are not the values for the distribution itself, but of the underlying normal distribution it is derived from.

Note

New code should use the ~numpy.random.Generator.lognormal method of a ~numpy.random.Generator instance instead; please see the random-quick-start.

Parameters

mean : float or array_like of floats, optional

Mean value of the underlying normal distribution. Default is 0.

sigma : float or array_like of floats, optional

Standard deviation of the underlying normal distribution. Must be non-negative. Default is 1.

size : int or tuple of ints, optional

Output shape. If the given shape is, e.g., (m, n, k), then m * n * k samples are drawn. If size is None (default), a single value is returned if mean and sigma are both scalars. Otherwise, np.broadcast(mean, sigma).size samples are drawn.

Returns

out : ndarray or scalar

Drawn samples from the parameterized log-normal distribution.

See Also

scipy.stats.lognorm : probability density function, distribution, cumulative density function, etc.

random.Generator.lognormal: which should be used for new code.

Notes

A variable x has a log-normal distribution if log(x) is normally distributed. The probability density function for the log-normal distribution is:

\[p(x) = \frac{1}{\sigma x \sqrt{2\pi}} e^{-\frac{(\ln(x)-\mu)^2}{2\sigma^2}}\]

where \(\mu\) is the mean and \(\sigma\) is the standard deviation of the normally distributed logarithm of the variable. A log-normal distribution results if a random variable is the product of a large number of independent, identically-distributed variables in the same way that a normal distribution results if the variable is the sum of a large number of independent, identically-distributed variables.

References

Examples

Draw samples from the distribution:

>>> mu, sigma = 3., 1. # mean and standard deviation
>>> s = np.random.lognormal(mu, sigma, 1000)

Display the histogram of the samples, along with the probability density function:

>>> import matplotlib.pyplot as plt
>>> count, bins, ignored = plt.hist(s, 100, density=True, align='mid')
>>> x = np.linspace(min(bins), max(bins), 10000)
>>> pdf = (np.exp(-(np.log(x) - mu)**2 / (2 * sigma**2))
...        / (x * sigma * np.sqrt(2 * np.pi)))
>>> plt.plot(x, pdf, linewidth=2, color='r')
>>> plt.axis('tight')
>>> plt.show()

Demonstrate that taking the products of random samples from a normal distribution can be fit well by a log-normal probability density function.

>>> # Generate a thousand samples: each is the product of 100 random
>>> # values, drawn from a normal distribution.
>>> b = []
>>> for i in range(1000):
...    a = 10. + np.random.standard_normal(100)
...    b.append(np.prod(a))
>>> b = np.array(b) / np.min(b) # scale values to be positive
>>> count, bins, ignored = plt.hist(b, 100, density=True, align='mid')
>>> sigma = np.std(np.log(b))
>>> mu = np.mean(np.log(b))
>>> x = np.linspace(min(bins), max(bins), 10000)
>>> pdf = (np.exp(-(np.log(x) - mu)**2 / (2 * sigma**2))
...        / (x * sigma * np.sqrt(2 * np.pi)))
>>> plt.plot(x, pdf, color='r', linewidth=2)
>>> plt.show()
snowdrop.src.preprocessor.processes_old.multivariate_normal(mean, cov, size=None, check_valid='warn', tol=1e-8)

Draw random samples from a multivariate normal distribution.

The multivariate normal, multinormal or Gaussian distribution is a generalization of the one-dimensional normal distribution to higher dimensions. Such a distribution is specified by its mean and covariance matrix. These parameters are analogous to the mean (average or “center”) and variance (standard deviation, or “width,” squared) of the one-dimensional normal distribution.

Note

New code should use the ~numpy.random.Generator.multivariate_normal method of a ~numpy.random.Generator instance instead; please see the random-quick-start.

Parameters

mean1-D array_like, of length N

Mean of the N-dimensional distribution.

cov2-D array_like, of shape (N, N)

Covariance matrix of the distribution. It must be symmetric and positive-semidefinite for proper sampling.

sizeint or tuple of ints, optional

Given a shape of, for example, (m,n,k), m*n*k samples are generated, and packed in an m-by-n-by-k arrangement. Because each sample is N-dimensional, the output shape is (m,n,k,N). If no shape is specified, a single (N-D) sample is returned.

check_valid{ ‘warn’, ‘raise’, ‘ignore’ }, optional

Behavior when the covariance matrix is not positive semidefinite.

tolfloat, optional

Tolerance when checking the singular values in covariance matrix. cov is cast to double before the check.

Returns

outndarray

The drawn samples, of shape size, if that was provided. If not, the shape is (N,).

In other words, each entry out[i,j,...,:] is an N-dimensional value drawn from the distribution.

See Also

random.Generator.multivariate_normal: which should be used for new code.

Notes

The mean is a coordinate in N-dimensional space, which represents the location where samples are most likely to be generated. This is analogous to the peak of the bell curve for the one-dimensional or univariate normal distribution.

Covariance indicates the level to which two variables vary together. From the multivariate normal distribution, we draw N-dimensional samples, \(X = [x_1, x_2, ... x_N]\). The covariance matrix element \(C_{ij}\) is the covariance of \(x_i\) and \(x_j\). The element \(C_{ii}\) is the variance of \(x_i\) (i.e. its “spread”).

Instead of specifying the full covariance matrix, popular approximations include:

  • Spherical covariance (cov is a multiple of the identity matrix)

  • Diagonal covariance (cov has non-negative elements, and only on the diagonal)

This geometrical property can be seen in two dimensions by plotting generated data-points:

>>> mean = [0, 0]
>>> cov = [[1, 0], [0, 100]]  # diagonal covariance

Diagonal covariance means that points are oriented along x or y-axis:

>>> import matplotlib.pyplot as plt
>>> x, y = np.random.multivariate_normal(mean, cov, 5000).T
>>> plt.plot(x, y, 'x')
>>> plt.axis('equal')
>>> plt.show()

Note that the covariance matrix must be positive semidefinite (a.k.a. nonnegative-definite). Otherwise, the behavior of this method is undefined and backwards compatibility is not guaranteed.

References

Examples

>>> mean = (1, 2)
>>> cov = [[1, 0], [0, 1]]
>>> x = np.random.multivariate_normal(mean, cov, (3, 3))
>>> x.shape
(3, 3, 2)

Here we generate 800 samples from the bivariate normal distribution with mean [0, 0] and covariance matrix [[6, -3], [-3, 3.5]]. The expected variances of the first and second components of the sample are 6 and 3.5, respectively, and the expected correlation coefficient is -3/sqrt(6*3.5) ≈ -0.65465.

>>> cov = np.array([[6, -3], [-3, 3.5]])
>>> pts = np.random.multivariate_normal([0, 0], cov, size=800)

Check that the mean, covariance, and correlation coefficient of the sample are close to the expected values:

>>> pts.mean(axis=0)
array([ 0.0326911 , -0.01280782])  # may vary
>>> np.cov(pts.T)
array([[ 5.96202397, -2.85602287],
       [-2.85602287,  3.47613949]])  # may vary
>>> np.corrcoef(pts.T)[0, 1]
-0.6273591314603949  # may vary

We can visualize this data with a scatter plot. The orientation of the point cloud illustrates the negative correlation of the components of this sample.

>>> import matplotlib.pyplot as plt
>>> plt.plot(pts[:, 0], pts[:, 1], '.', alpha=0.5)
>>> plt.axis('equal')
>>> plt.grid()
>>> plt.show()
snowdrop.src.preprocessor.processes_old.normal(loc=0.0, scale=1.0, size=None)

Draw random samples from a normal (Gaussian) distribution.

The probability density function of the normal distribution, first derived by De Moivre and 200 years later by both Gauss and Laplace independently [2]_, is often called the bell curve because of its characteristic shape (see the example below).

The normal distribution occurs often in nature. For example, it describes the commonly occurring distribution of samples influenced by a large number of tiny, random disturbances, each with its own unique distribution [2]_.

Note

New code should use the ~numpy.random.Generator.normal method of a ~numpy.random.Generator instance instead; please see the random-quick-start.

Parameters

locfloat or array_like of floats

Mean (“centre”) of the distribution.

scalefloat or array_like of floats

Standard deviation (spread or “width”) of the distribution. Must be non-negative.

sizeint or tuple of ints, optional

Output shape. If the given shape is, e.g., (m, n, k), then m * n * k samples are drawn. If size is None (default), a single value is returned if loc and scale are both scalars. Otherwise, np.broadcast(loc, scale).size samples are drawn.

Returns

outndarray or scalar

Drawn samples from the parameterized normal distribution.

See Also

scipy.stats.norm : probability density function, distribution or cumulative density function, etc.

random.Generator.normal: which should be used for new code.

Notes

The probability density for the Gaussian distribution is

\[p(x) = \frac{1}{\sqrt{ 2 \pi \sigma^2 }} e^{ - \frac{ (x - \mu)^2 } {2 \sigma^2} },\]

where \(\mu\) is the mean and \(\sigma\) the standard deviation. The square of the standard deviation, \(\sigma^2\), is called the variance.

The function has its peak at the mean, and its “spread” increases with the standard deviation (the function reaches 0.607 times its maximum at \(\mu + \sigma\) and \(\mu - \sigma\) [2]_). This implies that normal is more likely to return samples lying close to the mean, rather than those far away.

References

Examples

Draw samples from the distribution:

>>> mu, sigma = 0, 0.1 # mean and standard deviation
>>> s = np.random.normal(mu, sigma, 1000)

Verify the mean and the variance:

>>> abs(mu - np.mean(s))
0.0  # may vary
>>> abs(sigma - np.std(s, ddof=1))
0.1  # may vary

Display the histogram of the samples, along with the probability density function:

>>> import matplotlib.pyplot as plt
>>> count, bins, ignored = plt.hist(s, 30, density=True)
>>> plt.plot(bins, 1/(sigma * np.sqrt(2 * np.pi)) *
...                np.exp( - (bins - mu)**2 / (2 * sigma**2) ),
...          linewidth=2, color='r')
>>> plt.show()

Two-by-four array of samples from the normal distribution with mean 3 and standard deviation 2.5:

>>> np.random.normal(3, 2.5, size=(2, 4))
array([[-4.49401501,  4.00950034, -1.81814867,  7.29718677],   # random
       [ 0.39924804,  4.68456316,  4.99394529,  4.84057254]])  # random
snowdrop.src.preprocessor.processes_old.uniform(low=0.0, high=1.0, size=None)

Draw samples from a uniform distribution.

Samples are uniformly distributed over the half-open interval [low, high) (includes low, but excludes high). In other words, any value within the given interval is equally likely to be drawn by uniform.

Note

New code should use the ~numpy.random.Generator.uniform method of a ~numpy.random.Generator instance instead; please see the random-quick-start.

Parameters

lowfloat or array_like of floats, optional

Lower boundary of the output interval. All values generated will be greater than or equal to low. The default value is 0.

highfloat or array_like of floats, optional

Upper boundary of the output interval. All values generated will be less than or equal to high. The high limit may be included in the returned array of floats due to floating-point rounding in the equation low + (high-low) * random_sample(). The default value is 1.0.

sizeint or tuple of ints, optional

Output shape. If the given shape is, e.g., (m, n, k), then m * n * k samples are drawn. If size is None (default), a single value is returned if low and high are both scalars. Otherwise, np.broadcast(low, high).size samples are drawn.

Returns

outndarray or scalar

Drawn samples from the parameterized uniform distribution.

See Also

randint : Discrete uniform distribution, yielding integers.

random_integers : Discrete uniform distribution over the closed interval [low, high].

random_sample : Floats uniformly distributed over [0, 1).

random : Alias for random_sample.

rand : Convenience function that accepts dimensions as input, e.g., rand(2,2) would generate a 2-by-2 array of floats, uniformly distributed over [0, 1).

random.Generator.uniform: which should be used for new code.

Notes

The probability density function of the uniform distribution is

\[p(x) = \frac{1}{b - a}\]

anywhere within the interval [a, b), and zero elsewhere.

When high == low, values of low will be returned. If high < low, the results are officially undefined and may eventually raise an error, i.e. do not rely on this function to behave when passed arguments satisfying that inequality condition. The high limit may be included in the returned array of floats due to floating-point rounding in the equation low + (high-low) * random_sample(). For example:

>>> x = np.float32(5*0.99999999)
>>> x
5.0

Examples

Draw samples from the distribution:

>>> s = np.random.uniform(-1,0,1000)

All values are within the given interval:

>>> np.all(s >= -1)
True
>>> np.all(s < 0)
True

Display the histogram of the samples, along with the probability density function:

>>> import matplotlib.pyplot as plt
>>> count, bins, ignored = plt.hist(s, 15, density=True)
>>> plt.plot(bins, np.ones_like(bins), linewidth=2, color='r')
>>> plt.show()

snowdrop.src.preprocessor.recipes module

Ideas:

  • recursive blocks [by default]

  • (order left hand side ?) [by default]

  • dependency across blocks

  • dummy blocks that are basically substituted everywhere else

snowdrop.src.preprocessor.steady module

Created on Thu Feb 11 09:21:50 2021

@author: alexei

snowdrop.src.preprocessor.steady.Derivative(f, x)[source]

By definition, derivative of a steady state function is zero.

Parameters:
fstr.

Function name, f = f(x).

xstr.

Variable name.

Returns:

Zero value.

snowdrop.src.preprocessor.steady.STEADY_STATE(x, dict_steady_states=None)[source]

Wrapper for a “steady_state” function of a “steady_states” class.

Parameters:
xstr.

Variable name.

dict_steady_statesdict

Map with variables names and steady state values.

Returns:

Steady state value.

class snowdrop.src.preprocessor.steady.SteadyStates(dict_steady_states=None)[source]

Bases: object

ss = None
steady_state(x)[source]

Returns steady state value of a variable.

Parameters:
xstr.

Variable name.

Returns:

Steady state value.
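A minimal usage sketch, assuming steady_state simply looks up the variable name in the dictionary passed to the constructor; the variable names and values below are hypothetical:

>>> from snowdrop.src.preprocessor.steady import STEADY_STATE, SteadyStates
>>> ss_values = {"y": 1.0, "pi": 2.0}   # hypothetical steady-state values keyed by variable name
>>> states = SteadyStates(dict_steady_states=ss_values)
>>> states.steady_state("pi")           # expected to return 2.0, per the documented behavior
>>> STEADY_STATE("y", ss_values)        # same lookup through the module-level wrapper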

snowdrop.src.preprocessor.steady_state module

snowdrop.src.preprocessor.steady_state.residuals(model)[source]

Returns residuals of model equations for a static solution.

snowdrop.src.preprocessor.symbolic module

All symbolic functions take ast expression trees (not expressions) as input. Such a tree can be constructed as ast.parse(s).body[0].value.
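For example, using the standard ast module on an illustrative equation fragment:

>>> import ast
>>> tree = ast.parse("alpha*y(-1) + e_y").body[0].value
>>> isinstance(tree, ast.expr)
True
>>> print(ast.dump(tree))   # BinOp/Call/Name nodes rather than a plain string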

class snowdrop.src.preprocessor.symbolic.Compare[source]

Bases: object

Compares two ast tree instances.

compare(A, B)[source]

Compare two nodes.

class snowdrop.src.preprocessor.symbolic.ExpressionChecker(spec_variables, known_functions, known_constants)[source]

Bases: NodeVisitor

Checks AST expressions.

visit_Call(call)[source]
visit_Name(name)[source]
class snowdrop.src.preprocessor.symbolic.ExpressionLogNormalizer(variables=[], log_variables=[], functions=None)[source]

Bases: NodeTransformer

Replaces calls to variables by log function of their time subscripts.

visit_Call(node)[source]
visit_Name(node)[source]
class snowdrop.src.preprocessor.symbolic.ExpressionNormalizer(variables=None, functions=None)[source]

Bases: NodeTransformer

Replaces calls to variables by time subscripts.

visit_Call(node)[source]
visit_Name(node)[source]
class snowdrop.src.preprocessor.symbolic.ListNames[source]

Bases: NodeVisitor

visit_Name(name)[source]
class snowdrop.src.preprocessor.symbolic.ListSymbols(known_functions=[], known_variables=[])[source]

Bases: NodeVisitor

Creates a list of symbols by visiting each Call node in the ast expression tree.

visit_Call(call)[source]
visit_Name(name)[source]
class snowdrop.src.preprocessor.symbolic.StandardizeDatesSimple(variables)[source]

Bases: NodeTransformer

Replaces calls to variables by time subscripts.

visit_Call(node)[source]

Visitor for Call node.

visit_Name(node)[source]

Visitor for Name node.

class snowdrop.src.preprocessor.symbolic.TimeShiftTransformer(variables, shift=0)[source]

Bases: NodeTransformer

visit_Call(node)[source]

Visitor for Call node.

visit_Name(node)[source]

Visitor for Name node.

snowdrop.src.preprocessor.symbolic.check_expression(expr, spec_variables, known_functions=[])[source]
snowdrop.src.preprocessor.symbolic.compare(a, b)[source]

Compare two nodes.

snowdrop.src.preprocessor.symbolic.destringify(s: str, variables: List[str] = []) → Tuple[int, int][source]

Find leads and lags of a variable from its name.

snowdrop.src.preprocessor.symbolic.eval_scalar(tree)[source]
snowdrop.src.preprocessor.symbolic.get_names(expr)[source]
snowdrop.src.preprocessor.symbolic.list_variables(expr: Expr, funs: List[str] = None, vars: List[str] = None) → List[Tuple[str, int]][source]
snowdrop.src.preprocessor.symbolic.log_normalize(expr: Expr, variables: List[str] = [], log_variables: List[str] = []) → Expr[source]

Replace calls to variables by their time subscripts.

snowdrop.src.preprocessor.symbolic.log_stringify_variable(arg: Tuple[str, int]) → str[source]

Return the variable wrapped in a log function of its time-shifted name.

This method encodes the variable name with its lead or lag.

snowdrop.src.preprocessor.symbolic.match(m, s)[source]
snowdrop.src.preprocessor.symbolic.normalize(expr: Expr, variables: List[str] = []) → Expr[source]

Replace calls to variables by their time subscripts.

snowdrop.src.preprocessor.symbolic.parse_string(text, start=None)[source]
snowdrop.src.preprocessor.symbolic.std_tsymbol(tsymbol)[source]

Return string encoded with leads/lags.

snowdrop.src.preprocessor.symbolic.stringify(arg) → str[source]

Stringify a variable or a parameter.

snowdrop.src.preprocessor.symbolic.stringify_parameter(p: str) → str[source]

Stringify a parameter.

snowdrop.src.preprocessor.symbolic.stringify_symbol(arg) → str[source]

Stringify a symbol.

snowdrop.src.preprocessor.symbolic.stringify_variable(arg: Tuple[str, int]) → str[source]

Stringify a variable.

This method encodes the variable name with its lead or lag.

snowdrop.src.preprocessor.symbolic.time_shift(expr: Expr, n, vars: List[str] = []) → Expr[source]

Shifts the timing of variables in an equation.

Example: time_shift(:(a+b(1)+c),1,[:b,:c]) == :(a+b(2)+c(1))
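A hedged sketch of the same example in Python, assuming the expression is first parsed into an ast tree as described above and that to_source from the codegen module renders the result back to text:

>>> import ast
>>> from snowdrop.src.preprocessor.symbolic import time_shift
>>> from snowdrop.src.preprocessor.codegen import to_source
>>> expr = ast.parse("a + b(1) + c").body[0].value
>>> shifted = time_shift(expr, 1, ["b", "c"])
>>> print(to_source(shifted))   # expected, per the example above: a + b(2) + c(1)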

snowdrop.src.preprocessor.symbolic.timeshift(expr, variables, shift)[source]

snowdrop.src.preprocessor.symbolic_eval module

class snowdrop.src.preprocessor.symbolic_eval.NumericEval(d, minilang=[<class 'snowdrop.src.preprocessor.language.MvNormal'>, <class 'snowdrop.src.preprocessor.language.Normal'>, <class 'snowdrop.src.preprocessor.language.LogNormal'>, <class 'snowdrop.src.preprocessor.language.Beta'>, <class 'snowdrop.src.preprocessor.language.Binomial'>, <class 'snowdrop.src.preprocessor.language.Gamma'>, <class 'snowdrop.src.preprocessor.language.Logistic'>, <class 'snowdrop.src.preprocessor.language.Uniform'>, <class 'snowdrop.src.preprocessor.language.Domain'>, <class 'snowdrop.src.preprocessor.language.Cartesian'>])[source]

Bases: object

This class defines several evaluation methods.

eval(struct)[source]

Evaluate structure.

eval_commentedmap(d)[source]

Evaluate commented map.

eval_commentedseq(s)[source]

Evaluate comments.

eval_dict(d)[source]

Evaluate dictionary.

eval_float(s)[source]

Evaluate float.

eval_float64(s)[source]

Evaluate float.

eval_int(s)[source]

Evaluate integer.

eval_list(l)[source]

Evaluate list.

eval_ndarray(array_in)[source]

Evaluate ndarray.

eval_nonetype(none)[source]
eval_ordereddict(s)[source]

Evaluate ordered dictionary.

eval_scalarfloat(s)[source]

Evaluate scalar.

eval_str(s)[source]

Evaluate string.
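A hypothetical usage sketch, assuming eval dispatches on the type of its argument (as the eval_* methods above suggest) and that the dictionary d supplies the names referenced in string entries; the calibration values below are invented for illustration:

>>> from snowdrop.src.preprocessor.symbolic_eval import NumericEval
>>> context = {"alpha": 0.36, "beta": 0.99}               # hypothetical name/value map
>>> evaluator = NumericEval(context)
>>> evaluator.eval({"k": "alpha*beta", "w": [0.5, 0.5]})  # recurses into dict, list and str entries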

snowdrop.src.preprocessor.util module

Classes define user functions.

Created on Thu Mar 11 22:08:13 2021

@author: A.Goumilevski

class snowdrop.src.preprocessor.util.IfThen(condition, x)[source]

Bases: Function

Evaluate condition and return a value based on this condition.

default_assumptions = {}
classmethod eval(condition, x)[source]

Return value based on condition evaluation.

Parameters:
clsFunction object

Instance of Function class.

conditionfloat

Expression value.

xfloat

Expression value.

Returns:
ffloat

Returns x if condition is non-negative and zero otherwise.

classmethod fdiff(argindex)[source]

Compute derivative of the x expression.

Parameters:
clsFunction object

Instance of Function class.

argindexint

Indexes of the args, starting at 1

Returns:
diffExpression

Derivative w.r.t x if condition is non-negative and zero otherwise.
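A minimal sketch of how a conditional function of this kind can be written as a sympy Function subclass. This is illustrative only and is not the snowdrop implementation; in particular, fdiff is written here as an ordinary instance method, which is the standard sympy convention:

>>> import sympy as sp
>>> from sympy import Function
>>> class IfThenDemo(Function):
...     """Illustrative: return x when condition >= 0, else 0."""
...     @classmethod
...     def eval(cls, condition, x):
...         if condition.is_number:            # simplify only for concrete numbers
...             return x if condition >= 0 else sp.S.Zero
...     def fdiff(self, argindex=1):
...         if argindex == 2:                  # derivative taken w.r.t. x
...             return IfThenDemo(self.args[0], sp.S.One)
...         return sp.S.Zero                   # derivative w.r.t. the condition is zero
...
>>> c, x = sp.symbols("c x")
>>> IfThenDemo(3, x)                  # condition is a known number: evaluates to x
x
>>> sp.diff(IfThenDemo(c, x**2), x)   # chain rule picks up fdiff
2*x*IfThenDemo(c, 1)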

class snowdrop.src.preprocessor.util.IfThenElse(condition, a, b)[source]

Bases: Function

Evaluate condition and return a value based on this condition.

default_assumptions = {}
classmethod eval(condition, a, b)[source]

Return value based on condition evaluation.

Parameters:
clsFunction object

Instance of Function class.

conditionfloat

Expression value.

afloat

Expression value.

bfloat

Expression value.

Returns:
ffloat

Returns a if condition is non-negative and b otherwise.

classmethod fdiff(argindex)[source]

Compute derivative of the a or b expression.

Parameters:
clsFunction object

Instance of Function class.

argindexint

Indexes of the args, starting at 1

Returns:
diffExpression

Derivative w.r.t a if condition is non-negative and derivative w.r.t b otherwise.

class snowdrop.src.preprocessor.util.Max1(a, b)[source]

Bases: Function

default_assumptions = {}
classmethod eval(a, b)[source]

Return maximum of two values.

Parameters:
clsFunction object

Instance of Function class.

afloat

Expression value.

bfloat

Expression value.

class snowdrop.src.preprocessor.util.Min1(a, b)[source]

Bases: Function

default_assumptions = {}
classmethod eval(a, b)[source]

Return minimum of two values.

Parameters:
clsFunction object

Instance of Function class.

afloat

Expression value.

bfloat

Expression value.

class snowdrop.src.preprocessor.util.Negative(x)[source]

Bases: Function

Evaluate if expression is negative.

default_assumptions = {}
classmethod eval(x)[source]

Evaluate expression x.

Parameters:
clsFunction object

Instance of Function class.

xfloat

Expression value.

Returns:
ffloat

Returns one if x is negative and zero otherwise.

classmethod fdiff(argindex)[source]

Compute derivative of the x expression.

Parameters:
clsFunction object

Instance of Function class.

argindexint

Indexes of the args, starting at 1

Returns:
diffExpression

Returns zero.

class snowdrop.src.preprocessor.util.PNORM(x)[source]

Bases: Function

Evaluate Troll’s normal function.

default_assumptions = {}
classmethod eval(x)[source]

Evaluate Troll's normal function at x.

Parameters:
clsFunction object

Instance of Function class.

xfloat

Expression value.

Returns:
ffloat

Value of the normal function evaluated at x.

classmethod fdiff(z)[source]

Compute derivative of the z expression.

Parameters:
clsFunction object

Instance of Function class.

zsymbol

Take derivative w.r.t. symbol z

Returns:
diffExpression

Derivative w.r.t z.

class snowdrop.src.preprocessor.util.Positive(x)[source]

Bases: Function

Evaluate if expression is positive.

default_assumptions = {}
classmethod eval(x)[source]

Evaluate expression x.

Parameters:
clsFunction object

Instance of Function class.

xfloat

Expression value.

Returns:
ffloat

Returns one if x is non-negative and zero otherwise.

classmethod fdiff(argindex)[source]

Compute derivative of the x expression.

Parameters:
clsFunction object

Instance of Function class.

argindexint

Indexes of the args, starting at 1

Returns:
diffExpression

Returns zero.

class snowdrop.src.preprocessor.util.myzif(x)[source]

Bases: Function

default_assumptions = {}
classmethod eval(x)[source]

Return value based on user’s function evaluation.

Parameters:
clsFunction object

Instance of Function class.

xfloat

Expression value.

snowdrop.src.preprocessor.util.updateFiles(model, path)[source]

Update generated Python files.

Parameters:
modelModel

Model object.

pathstr

Path to folder.

Module contents