Univariate helper functions#
The module xyz.univariate contains lightweight function-based helpers.
These are useful for quick exploratory work, teaching, and sanity checks before
moving to the estimator classes.
Available functions#
entropy_linear(A)
entropy_kernel(Y, r, metric="chebyshev")
entropy_binning(Y, c, quantize, log_base="nat")
What they do#
entropy_linear#
Computes Gaussian entropy from the sample covariance matrix, i.e. the entropy of a Gaussian fit to the data, H = 1/2 log((2 pi e)^d det(Sigma_hat)), where Sigma_hat is the sample covariance and d the dimension.
Use it when a fast Gaussian baseline is enough.
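The internals of entropy_linear are not shown here; as a rough sketch, a Gaussian entropy baseline built from the sample covariance could look like the following (the function name and signature are illustrative, not the xyz API):

```python
import numpy as np

def gaussian_entropy(y, log_base="nat"):
    """Entropy of a Gaussian fit: 0.5 * log((2*pi*e)^d * det(cov))."""
    y = np.asarray(y, dtype=float)
    n, d = y.shape  # rows are samples, columns are variables
    # np.cov returns a 0-d array when d == 1, so promote it to 2-D
    cov = np.atleast_2d(np.cov(y, rowvar=False))
    _, logdet = np.linalg.slogdet(cov)
    h = 0.5 * (d * np.log(2 * np.pi * np.e) + logdet)
    return h / np.log(2) if log_base == "bit" else h
```

For a standard normal sample this should land near 0.5 * log(2 pi e), about 1.42 nats.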
entropy_kernel#
Approximates entropy from fixed-radius counts. It is useful when you want a
local geometric notion of uncertainty and are prepared to inspect sensitivity
with respect to r.
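One common way to realize fixed-radius counts is a box-kernel density estimate under the Chebyshev metric; the sketch below illustrates the idea (this is an assumption about the approach, not necessarily the exact formula entropy_kernel uses, and the function name is local):

```python
import numpy as np

def box_kernel_entropy(y, r=0.3):
    """Entropy from a box-kernel density with Chebyshev radius r (sketch)."""
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n, d = y.shape
    # pairwise Chebyshev (max-norm) distances
    dist = np.max(np.abs(y[:, None, :] - y[None, :, :]), axis=-1)
    counts = (dist <= r).sum(axis=1) - 1  # neighbors within r, excluding self
    counts = np.maximum(counts, 1)        # guard isolated points against log(0)
    # density estimate: neighbor fraction over the box volume (2r)^d
    p_hat = counts / ((n - 1) * (2.0 * r) ** d)
    return -np.mean(np.log(p_hat))
```

Rescaling the data while holding r fixed changes the estimate, which is exactly why the sensitivity check with respect to r matters.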
entropy_binning#
Approximates entropy by counting discrete states after optional quantization, via the plug-in estimate H = -sum_i p_i log p_i over the observed state frequencies p_i.
This is especially useful for symbolic or deliberately discretized processes.
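A minimal plug-in sketch of this idea, with equal-width quantization into c bins (the name and signature mirror entropy_binning but are illustrative only):

```python
import numpy as np

def binned_entropy(y, c=8, quantize=True, log_base="nat"):
    """Plug-in entropy of discrete states, optionally after quantization."""
    y = np.asarray(y).ravel()
    if quantize:
        # map samples onto c equal-width bins spanning the data range
        edges = np.linspace(y.min(), y.max(), c + 1)
        y = np.clip(np.digitize(y, edges[1:-1]), 0, c - 1)
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    h = -np.sum(p * np.log(p))
    return h / np.log(2) if log_base == "bit" else h
```

A sequence that is uniform over 8 symbols should give log(8) nats (3 bits) with quantize=False.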
How to use them#
import numpy as np
from xyz.univariate import entropy_binning, entropy_kernel, entropy_linear
y = np.random.randn(2000, 1)
print("Gaussian entropy:", entropy_linear(y))
print("Kernel entropy:", entropy_kernel(y, r=0.3))
print("Binned entropy:", entropy_binning(y.copy(), c=8, quantize=False))
When to use helper functions instead of estimators#
when you want a one-line numerical check,
when model selection or fitted attributes are unnecessary,
when teaching the differences between Gaussian, kernel, and binned views of entropy.
Interactive example#
The figure compares three univariate entropy estimates as the variance of the underlying Gaussian signal changes. (The rendered figure is omitted here.)
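A comparison of this kind can be reproduced offline; the sketch below varies the Gaussian scale and checks a simple binned estimate against the closed-form Gaussian entropy 0.5 * log(2 pi e sigma^2) (the helper names are local, not the xyz API):

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_entropy_closed_form(sigma):
    """Exact differential entropy of N(0, sigma^2), in nats."""
    return 0.5 * np.log(2 * np.pi * np.e * sigma**2)

def binned_entropy(y, c=16):
    """Discrete plug-in entropy plus log bin width ~ differential entropy."""
    counts, _ = np.histogram(y, bins=c)
    p = counts[counts > 0] / counts.sum()
    width = (y.max() - y.min()) / c
    return -np.sum(p * np.log(p)) + np.log(width)

for sigma in (0.5, 1.0, 2.0):
    y = sigma * rng.standard_normal(5000)
    print(sigma, gaussian_entropy_closed_form(sigma), binned_entropy(y))
```

Both estimates grow with sigma by roughly log(sigma), which is the behavior the figure is meant to show.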