Tutorials

Normal distribution

Sampling from or evaluating the normal distribution is a frequent task. Existing libraries offer many ways to do this, like np.random.randn() or scipy.stats.norm. However, sometimes you deal with multivariate data of arbitrary shapes. What if you wanted to sample from N distinct multivariate Gaussians, each with its own mean and covariance? Or perhaps you want to use the same covariance for all N distributions and then evaluate the pdf at M completely different locations?

The code snippet here shows how to generalize normal sampling and pdf evaluation with a vectorized implementation.

Vectorized sample/pdf
import numpy as np
import uqtils as uq

ndim = 3
shape = (5, ndim)

means = np.random.randint(0, 10, size=shape).astype(np.float64)  # 5 distinct 3d means
cov = np.eye(ndim) * 0.1                                         # one covariance shared by all 5

samples = uq.normal_sample(means, cov, size=1000)     # (1000, 5, 3)
pdfs = uq.normal_pdf(samples, means, cov)             # (1000, 5)

fig, ax = uq.ndscatter(samples[:, 0, :])              # scatter matrix of the first Gaussian's samples
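
The pdf call broadcasts in the same way, so the second use case above (one covariance shared by all distributions, with M separate query locations) needs no loop either. A minimal sketch continuing the snippet above; the (M, 1, ndim) query shape is an assumption based on the broadcast behavior shown in the shape comments:

M = 50
x = np.random.rand(M, 1, ndim)            # M query locations, broadcast against all 5 means
pdfs = uq.normal_pdf(x, means, cov)       # (50, 5): pdf of each location under each Gaussian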

Gradients

It is another frequent task to obtain first and second derivatives of objective functions, say for maximum likelihood estimation. The generalizations of the first and second derivatives to multivariate, vector-valued functions (i.e. multiple inputs and multiple outputs) are the Jacobian and Hessian matrices. It is convenient to have functions that evaluate the Jacobian/Hessian for arbitrary numbers of inputs and outputs, including the limiting case of a single input and single output (which gives the normal first and second derivatives that we all know and love).

The code snippet here shows generalized finite-difference approximations of the Jacobian/Hessian for arbitrary input/output dimensions. The implementation is also vectorized over any number of extra axes, allowing you to evaluate multiple Jacobians/Hessians in one go.

Vectorized Jacobians/Hessians
import numpy as np
import uqtils as uq

# 1d example
def f(x):
    return np.sin(x)

x0 = 1.0
df_dx = uq.approx_jac(f, x0)       # ≈ cos(1.0)
d2f_dx2 = uq.approx_hess(f, x0)    # ≈ -sin(1.0)

# Multivariate example
n_in, n_out = 3, 2
def f(x):
    x0, x1, x2 = [x[..., i] for i in range(n_in)]   # unpack inputs along the last axis
    f0 = x0 * x1 + x2
    f1 = np.sin(x0)**2 + x2**3
    return np.concatenate((f0[..., np.newaxis], f1[..., np.newaxis]), axis=-1)  # (..., n_out)

shape = (100, 5, n_in)
x0 = np.random.rand(*shape)
jac  = uq.approx_jac(f, x0)      # (100, 5, n_out, n_in)
hess = uq.approx_hess(f, x0)     # (100, 5, n_out, n_in, n_in)
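
As a sanity check, the finite-difference Jacobian can be compared to the analytic one for the function above. A minimal sketch continuing the snippet; the 1e-4 tolerance is a loose guess at the finite-difference error, not a documented guarantee:

x = x0                                   # evaluation points, shape (100, 5, 3)
jac_true = np.empty(x.shape[:-1] + (n_out, n_in))
# df0/dx = [x1, x0, 1]
jac_true[..., 0, :] = np.stack((x[..., 1], x[..., 0], np.ones(x.shape[:-1])), axis=-1)
# df1/dx = [sin(2*x0), 0, 3*x2**2]
jac_true[..., 1, :] = np.stack((np.sin(2 * x[..., 0]), np.zeros(x.shape[:-1]), 3 * x[..., 2]**2), axis=-1)

assert np.allclose(jac, jac_true, atol=1e-4)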

Markov-Chain Monte Carlo

Here is an example of the delayed rejection adaptive Metropolis (DRAM) algorithm for MCMC sampling. The implementation is vectorized over nwalkers, allowing multiple sampling chains to proceed in parallel.

MCMC sampling
import numpy as np
import uqtils as uq

# Target distribution: log-pdf of a 2d Gaussian
def fun(x):
    mu = [1, 1]
    cov = [[0.5, -0.1], [-0.1, 0.5]]
    return uq.normal_pdf(x, mu, cov, logpdf=True)

nsamples, nwalkers, ndim = 1000, 4, 2
x0 = np.random.randn(nwalkers, ndim)
cov0 = np.eye(ndim)

samples, log_pdf, accepted = uq.dram(fun, x0, nsamples, cov0=cov0)

burn_in = int(0.1 * nsamples)                          # discard the first 10% of each chain
samples = samples[burn_in:, ...].reshape((-1, ndim))   # then pool all walkers together
fig, ax = uq.ndscatter(samples, plot2d='hist')
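
The third return value gives a quick health check on the chains. A minimal sketch continuing the example, assuming accepted holds one boolean per proposal per walker (its exact shape is not documented here):

acceptance_rate = np.mean(accepted)       # fraction of proposals accepted across steps/walkers
print(f'Mean acceptance rate: {acceptance_rate:.2f}')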

Sobol' sensitivity analysis

Here is an example of computing the Sobol' indices for the Ishigami test function.

Sobol' sensitivity analysis
import numpy as np
import uqtils as uq

model = lambda x: uq.ishigami(x)['y']                                    # Ishigami model output
sampler = lambda shape: np.random.rand(*shape, 3) * (2 * np.pi) - np.pi  # uniform on [-pi, pi]^3
n_samples = 1000

S1, ST = uq.sobol_sa(model, sampler, n_samples)   # first-order and total-order indices
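
To interpret the result, print one first-order and one total-order index per input. A minimal sketch, assuming S1 and ST each contain three entries (one per input dimension); the Ishigami function's third input enters only through an interaction term, so its first-order index should be near zero while its total index is not:

for i, (s1, st) in enumerate(zip(np.ravel(S1), np.ravel(ST))):
    print(f'x{i}: S1 = {s1:.3f}, ST = {st:.3f}')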