Num weight bits = 18
learning rate = 10
initial_t = 1
power_t = 0.5
decay_learning_rate = 1
creating cache_file = train-sets/seq_small.cache
Reading from train-sets/seq_small
num sources = 1
average    since      sequence   example    current label     current predicted    current  cur  cur  predic.  examples
loss       last       counter    weight     sequence prefix   sequence prefix      features  pass pol  made    gener.
0.666667   0.666667   1          6.000000   [ 1 3 2 1 4 3 ]   [ 1 1 1 1 1 1 ]      12        0    0    6       0
0.333333   0.000000   2          12.000000  [ 1 3 2 1 4 3 ]   [ 1 3 2 1 4 3 ]      12        1    0    12      6
0.222222   0.000000   3          18.000000  [ 1 3 2 1 4 3 ]   [ 1 3 2 1 4 3 ]      12        2    0    18      12
0.166667   0.000000   4          24.000000  [ 1 3 2 1 4 3 ]   [ 1 3 2 1 4 3 ]      12        3    0    24      18
0.083333   0.000000   8          48.000000  [ 1 3 2 1 4 3 ]   [ 1 3 2 1 4 3 ]      12        7    1    51      42

finished run
number of examples = 12
weighted example sum = 72
weighted label sum = 0
average loss = 0.05556
best constant = -0.01408
total feature number = 144