An Infinite Hidden Markov Model With Similarity-Biased Transitions
Abstract
We describe a generalization of the Hierarchical Dirichlet Process Hidden Markov Model (HDP-HMM) which is able to encode prior information that state transitions are more likely between "nearby" states. This is accomplished by defining a similarity function on the state space and scaling transition probabilities by pairwise similarities, thereby inducing correlations among the transition distributions. We present an augmented data representation of the model as a Markov Jump Process in which: (1) some jump attempts fail, and (2) the probability of success is proportional to the similarity between the source and destination states. This augmentation restores conditional conjugacy and admits a simple Gibbs sampler. We evaluate the model and inference method on a speaker diarization task and a "harmonic parsing" task using four-part chorale data, as well as on several synthetic datasets, achieving favorable comparisons to existing models.
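The core construction described above, rescaling each row of a base transition matrix by pairwise state similarities and renormalizing, can be sketched as follows. This is an illustrative sketch only: the state embeddings, the squared-exponential similarity kernel, and the function names are assumptions for demonstration, not the paper's exact parameterization.

```python
import numpy as np

def similarity_biased_transitions(pi, states, kernel):
    """Rescale each row of transition matrix `pi` by pairwise
    similarities between state locations, then renormalize rows.
    Hypothetical helper; `states` and `kernel` are illustrative."""
    K = pi.shape[0]
    # Pairwise similarity matrix S[i, j] = kernel(state_i, state_j)
    S = np.array([[kernel(states[i], states[j]) for j in range(K)]
                  for i in range(K)])
    scaled = pi * S
    # Renormalize so each row is again a probability distribution
    return scaled / scaled.sum(axis=1, keepdims=True)

# Example: three states embedded on a line; transitions to nearby
# states become more probable than under the uniform base matrix.
pi = np.full((3, 3), 1.0 / 3.0)            # uniform base transitions
locs = np.array([0.0, 1.0, 5.0])           # hypothetical state embeddings
rbf = lambda x, y: np.exp(-(x - y) ** 2)   # squared-exponential similarity
pi_tilde = similarity_biased_transitions(pi, locs, rbf)
```

Under this kernel, state 0 transitions to its neighbor at location 1.0 far more often than to the distant state at 5.0, which is the "nearby states are more likely" bias the abstract describes.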
 Publication:

arXiv e-prints
 Pub Date:
 July 2017
 arXiv:
 arXiv:1707.06756
 Bibcode:
 2017arXiv170706756R
 Keywords:

 Statistics - Machine Learning;
 Computer Science - Artificial Intelligence;
 Computer Science - Machine Learning;
 Statistics - Methodology;
 G.3;
 I.2.6
 E-Print:
 16 pages, 4 figures, accepted to ICML 2017, includes supplemental appendix