Here I have taken a random LogHMM with 2 states and 2 observables and trained it repeatedly on 3 random observation sequences. The model is trained on the first sequence for a fixed number of iterations (the training length), then the resulting model is trained on the second sequence, then on the third, and then the cycle returns to the first.

The idea was to show that by repeatedly training on the same sequences, the model would eventually converge to a mutual local maximum, but no such thing takes place: the parameters tend to become periodic after the first one or two repetitions of the cycle. I will have to find a better way to train a model on multiple observation sequences.
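The round-robin schedule described above can be sketched with a minimal log-space Baum-Welch implementation. This is my own illustrative code, not the original LogHMM: the update function, the random initialisation, and the sequence length of 30 are all assumptions made for the sketch, and the training length is just a parameter of the loop.

```python
import numpy as np

def log_forward(log_A, log_B, log_pi, obs):
    """Forward pass in log space; returns alpha and the total log-likelihood."""
    T, N = len(obs), log_A.shape[0]
    alpha = np.empty((T, N))
    alpha[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        # alpha[t, i] = log B[i, o_t] + logsum_j (alpha[t-1, j] + log A[j, i])
        alpha[t] = log_B[:, obs[t]] + np.logaddexp.reduce(
            alpha[t - 1][:, None] + log_A, axis=0)
    return alpha, np.logaddexp.reduce(alpha[-1])

def log_backward(log_A, log_B, obs):
    """Backward pass in log space."""
    T, N = len(obs), log_A.shape[0]
    beta = np.zeros((T, N))
    for t in range(T - 2, -1, -1):
        beta[t] = np.logaddexp.reduce(
            log_A + log_B[:, obs[t + 1]] + beta[t + 1], axis=1)
    return beta

def baum_welch_step(log_A, log_B, log_pi, obs):
    """One EM update on a single observation sequence; returns new params and
    the log-likelihood of the *input* parameters on this sequence."""
    alpha, ll = log_forward(log_A, log_B, log_pi, obs)
    beta = log_backward(log_A, log_B, obs)
    log_gamma = alpha + beta - ll  # per-step state posteriors
    # pairwise posteriors xi[t, i, j] for t = 0 .. T-2
    log_xi = (alpha[:-1, :, None] + log_A[None]
              + log_B[:, obs[1:]].T[:, None, :] + beta[1:, None, :] - ll)
    new_log_pi = log_gamma[0]
    new_log_A = (np.logaddexp.reduce(log_xi, axis=0)
                 - np.logaddexp.reduce(log_gamma[:-1], axis=0)[:, None])
    obs_arr = np.asarray(obs)
    new_log_B = np.full_like(log_B, -np.inf)
    for k in range(log_B.shape[1]):
        mask = obs_arr == k
        if mask.any():
            new_log_B[:, k] = np.logaddexp.reduce(log_gamma[mask], axis=0)
    new_log_B -= np.logaddexp.reduce(log_gamma, axis=0)[:, None]
    return new_log_A, new_log_B, new_log_pi, ll

rng = np.random.default_rng(0)
N, K = 2, 2  # 2 states, 2 observables

def rand_log_dist(shape):
    p = rng.random(shape)
    return np.log(p / p.sum(axis=-1, keepdims=True))

log_A, log_B, log_pi = rand_log_dist((N, N)), rand_log_dist((N, K)), rand_log_dist(N)
seqs = [rng.integers(0, K, size=30) for _ in range(3)]

for cycle in range(4):              # repeat the whole schedule
    for s, obs in enumerate(seqs):  # round-robin over the sequences
        for _ in range(20):         # "training length" iterations on each
            log_A, log_B, log_pi, ll = baum_welch_step(log_A, log_B, log_pi, obs)
        print(f"cycle {cycle}, seq {s}: log-likelihood {ll:.4f}")
```

Watching the printed log-likelihoods per sequence across cycles is how the periodic behaviour shows up: each stint improves the likelihood of its own sequence, partly undoing the previous stint's fit.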

Training Length 10

[Plots: Examples 1-10]

Training Length 20

[Plots: Examples 1-10]