API Reference

Subpackages

Module contents

EPSIE is a toolkit for running MCMCs using embarrassingly parallel Markov chains.

epsie.array2dict(array)[source]

Converts a structured array into a dictionary.
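A minimal usage sketch, assuming a NumPy structured array as input; the field names and values below are illustrative and not taken from the EPSIE documentation:

    import numpy as np
    import epsie

    # A structured array with two named fields (illustrative data).
    samples = np.array([(0.1, 1.0), (0.2, 2.0)],
                       dtype=[('x', float), ('y', float)])

    # Convert to a dictionary keyed by field name.
    d = epsie.array2dict(samples)
    print(d['x'])  # the 'x' column as an array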

epsie.create_bit_generator(seed=None, stream=0)[source]

Creates an instance of a BIT_GENERATOR.

Parameters
  • seed (int, optional) – The seed to use. If seed is None (the default), will create a seed using create_seed().

  • stream (int, optional) – The stream to create the bit generator for. This allows multiple generators to exist with the same seed, but that produce different sets of random numbers. Default is 0.

Returns

The bit generator initialized with the given seed and stream.

Return type

BIT_GENERATOR
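A minimal sketch of creating two generators that share a seed but use different streams; wrapping the result in numpy.random.Generator assumes the returned BIT_GENERATOR is a standard NumPy bit generator:

    import numpy as np
    import epsie

    # Two generators with the same seed but different streams produce
    # different, independent sets of random numbers.
    bg0 = epsie.create_bit_generator(seed=12, stream=0)
    bg1 = epsie.create_bit_generator(seed=12, stream=1)

    # Wrap in a NumPy Generator to draw numbers
    # (assumes a NumPy-compatible bit generator).
    rng0 = np.random.Generator(bg0)
    print(rng0.random(3))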

epsie.create_bit_generators(ngenerators, seed=None)[source]

Creates a collection of random bit generators.

The bit generators are different streams with the same seed. They are all statistically independent of each other, while still being reproducible.

Parameters
  • ngenerators (int) – The number of generators to create. Must be ≥ 1.

  • seed (int, optional) – The seed to use. If none is provided, one will be generated using system entropy.

Returns

List of BIT_GENERATOR.

Return type

list
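For example, one generator per Markov chain could be created as sketched below (the seed value is arbitrary):

    import epsie

    # Four independent streams sharing a single seed, e.g. one per chain.
    gens = epsie.create_bit_generators(4, seed=56)
    print(len(gens))  # 4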

epsie.create_seed(seed=None)[source]

Creates a seed for a numpy.random.SeedSequence.

Parameters

seed (int, optional) – If a seed is given, will just return it. Default is None, in which case a seed is created.

Returns

A seed to use.

Return type

int
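A short sketch of both behaviours described above:

    import epsie

    # With no argument, a seed is created from system entropy.
    seed = epsie.create_seed()
    print(seed)

    # An explicitly given seed is returned unchanged.
    assert epsie.create_seed(42) == 42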

epsie.dump_pickle_to_hdf(memfp, fp, path=None, dsetname='sampler_state')[source]

Dumps pickled data to an hdf5 file object.

Parameters
  • memfp (file object) – Bytes stream of pickled data.

  • fp (h5py.File) – An open hdf5 file handler. Must have write capability enabled.

  • path (str, optional) – The path (group name) to store the state dataset to. Default (None) will result in the array being stored to the top level.

  • dsetname (str, optional) – The name of the dataset to store the binary array to. Default is sampler_state.
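A minimal sketch, assuming the bytes stream is an io.BytesIO produced by pickle; the file name, group name, and the rewind of the stream are illustrative assumptions:

    import io
    import pickle
    import h5py
    import epsie

    # Pickle an arbitrary object into an in-memory bytes stream.
    memfp = io.BytesIO()
    pickle.dump({'niterations': 100}, memfp)
    memfp.seek(0)  # rewind; assumes the function reads from the current position

    # Store the pickled bytes under 'checkpoint/sampler_state'.
    with h5py.File('checkpoint.hdf', 'w') as fp:
        epsie.dump_pickle_to_hdf(memfp, fp, path='checkpoint')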

epsie.dump_state(state, fp, path=None, dsetname='sampler_state', protocol=None)[source]

Dumps the given state to an hdf5 file handler.

The state is stored as a raw binary array to {path}/{dsetname} in the given hdf5 file handler. If a dataset with the same name and path is already in the file, the dataset will be resized and overwritten with the new state data.

Parameters
  • state (any picklable object) – The sampler state to dump to file. Can be the object returned by any of the samplers’ .state attribute (a dictionary of dictionaries), or any picklable object.

  • fp (h5py.File) – An open hdf5 file handler. Must have write capability enabled.

  • path (str, optional) – The path (group name) to store the state dataset to. Default (None) will result in the array being stored to the top level.

  • dsetname (str, optional) – The name of the dataset to store the binary array to. Default is sampler_state.

  • protocol (int, optional) – The protocol version to use for pickling. See the pickle module for more details.
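A minimal sketch using a plain dictionary as the state; the file and group names are illustrative:

    import h5py
    import epsie

    # Any picklable object works as the state; a nested dict is used here.
    state = {'chain0': {'position': {'x': 0.5, 'y': 1.2}}}

    # Write (or overwrite) the state at 'sampler/sampler_state'.
    with h5py.File('checkpoint.hdf', 'w') as fp:
        epsie.dump_state(state, fp, path='sampler')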

epsie.load_state(fp, path=None, dsetname='sampler_state')[source]

Loads a sampler state from the given hdf5 file object.

The sampler state is expected to be stored as a raw bytes array which can be loaded by pickle.

Parameters
  • fp (h5py.File) – An open hdf5 file handler.

  • path (str, optional) – The path (group name) that the state data is stored to. Default (None) is to read from the top level.

  • dsetname (str, optional) – The name of the dataset that the state data is stored to. Default is sampler_state.
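A sketch of reading back the state written in the dump_state example above (same illustrative file and group names):

    import h5py
    import epsie

    # Unpickle the state stored at 'sampler/sampler_state'.
    with h5py.File('checkpoint.hdf', 'r') as fp:
        state = epsie.load_state(fp, path='sampler')
    print(state)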

epsie.make_betas_ladder(ntemps, maxtemp)[source]

Makes a log-spaced ladder of betas (inverse temperatures).
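A sketch of calling the documented signature; the exact spacing convention (for instance, whether the betas run from 1 down to 1/maxtemp) is not stated above and should be checked against the output:

    import epsie

    # Four log-spaced inverse temperatures for a ladder with maximum
    # temperature 10.
    betas = epsie.make_betas_ladder(4, 10.0)
    print(betas)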