Crate potpourri


Package for models with discrete, unobservable (latent) variables that can be learned with the Expectation-Maximization (EM) algorithm. The package aims for high modularity to allow easy experimentation in research, such as adding parallelization on clusters or exploring new models.

Conventions:

  • Traits: CamelCase with a leading capital letter; adjectives used as nouns that indicate a capability.
  • Structs: CamelCase with a leading capital letter; nouns describing things and concepts.
  • Methods/functions: snake_case; imperatives or short, descriptive imperative clauses.

Re-exports

Modules

Structs

  • Average log-likelihood, used to measure convergence.
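As a rough illustration of how an average log-likelihood can drive a stopping criterion, here is a minimal sketch. The names (`AvgLLH`, `converged`) and the tolerance-based test are assumptions for illustration, not the crate's actual API:

```rust
// Hypothetical sketch: average log-likelihood as an EM convergence criterion.
// `AvgLLH` and `converged` are illustrative names, not the crate's real API.

#[derive(Clone, Copy, Debug)]
struct AvgLLH(f64);

/// EM is commonly stopped when the improvement in average
/// log-likelihood between two iterations falls below a tolerance.
fn converged(previous: AvgLLH, current: AvgLLH, tol: f64) -> bool {
    (current.0 - previous.0).abs() < tol
}

fn main() {
    // A tiny change between iterations counts as converged...
    assert!(converged(AvgLLH(-4.21), AvgLLH(-4.20999), 1e-4));
    // ...while a large jump does not.
    assert!(!converged(AvgLLH(-5.0), AvgLLH(-4.0), 1e-4));
    println!("convergence check ok");
}
```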

Traits

  • Probabilistic mixables should implement this trait. A mixture model has a discrete, unobservable (latent) variable associated with each data point; it can be interpreted as a pointer to the mixture component that generated the sample. This trait computes the weights of the components in the mixture, that is, for each component, the probability that the next sample will be drawn from it. For non-probabilistic models (k-mm and SOM) this is irrelevant.
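To make the latent-variable interpretation above concrete, here is a minimal sketch of an E-step: computing, for one data point, the posterior probability (responsibility) that each component generated it. The trait name `Mixable`, its methods, and the `TwoGaussians` example are illustrative assumptions, not the crate's actual API:

```rust
// Hypothetical sketch of E-step responsibilities for a mixture model.
// All names below (`Mixable`, `TwoGaussians`, `responsibilities`) are
// illustrative assumptions, not the crate's real API.

/// A model whose components can be combined into a mixture by EM.
trait Mixable {
    /// Log-density of component `k` at data point `x`.
    fn log_density(&self, k: usize, x: f64) -> f64;
    /// Number of mixture components.
    fn n_components(&self) -> usize;
}

/// Two univariate Gaussians with fixed parameters, as a toy model.
struct TwoGaussians {
    means: [f64; 2],
    vars: [f64; 2],
}

impl Mixable for TwoGaussians {
    fn log_density(&self, k: usize, x: f64) -> f64 {
        let (m, v) = (self.means[k], self.vars[k]);
        -0.5 * ((x - m).powi(2) / v + v.ln() + (2.0 * std::f64::consts::PI).ln())
    }
    fn n_components(&self) -> usize {
        2
    }
}

/// E-step for a single point: posterior probability that each
/// component generated `x`, given mixture weights `pi`.
fn responsibilities<M: Mixable>(model: &M, pi: &[f64], x: f64) -> Vec<f64> {
    let unnorm: Vec<f64> = (0..model.n_components())
        .map(|k| pi[k].ln() + model.log_density(k, x))
        .collect();
    // Normalize in log-space for numerical stability.
    let max = unnorm.iter().cloned().fold(f64::NEG_INFINITY, f64::max);
    let exps: Vec<f64> = unnorm.iter().map(|l| (l - max).exp()).collect();
    let sum: f64 = exps.iter().sum();
    exps.into_iter().map(|e| e / sum).collect()
}

fn main() {
    let model = TwoGaussians { means: [0.0, 5.0], vars: [1.0, 1.0] };
    let r = responsibilities(&model, &[0.5, 0.5], 0.2);
    // Responsibilities sum to one and favor the nearer component.
    assert!((r.iter().sum::<f64>() - 1.0).abs() < 1e-12);
    assert!(r[0] > r[1]);
    println!("responsibilities: {:?}", r);
}
```

In the M-step these responsibilities would act as the per-sample component weights when re-estimating parameters; the log-space normalization avoids underflow when densities are very small.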