Num weight bits = 18
learning rate = 0.5
initial_t = 0
power_t = 0.5
using no cache
Reading datafile = train-sets/0001.dat
num sources = 1
average    since         example     example  current  current  current
loss       last          counter      weight    label  predict features
0.626139   0.626139            1         1.0   1.0000   0.2087       51
0.593221   0.560302            2         2.0   0.0000   0.7485      104
0.415928   0.238635            4         4.0   0.0000   0.4452      135
0.315393   0.214858            8         8.0   0.0000   0.3680      146
0.277266   0.239140           16        16.0   1.0000   0.3655       24
0.257687   0.238107           32        32.0   0.0000   0.3811       32
0.247343   0.237000           64        64.0   0.0000   0.3351       61
0.246230   0.245117          128       128.0   1.0000   0.5023      106

finished run
number of examples per pass = 200
passes used = 1
weighted example sum = 200
weighted label sum = 91
average loss = 0.238693
best constant = 0.455
best constant's loss = 0.247975
total feature number = 15482
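
As a sanity check, the "best constant" figures in the summary block can be reproduced from the totals it reports. The short Python sketch below assumes the run used squared loss on 0/1 labels (an assumption, not stated in this log); the variable names are illustrative only.

    # Reproduce the "best constant" summary values from the totals above.
    # Assumes squared loss on 0/1 labels; names are illustrative, not from VW.
    weighted_label_sum = 91.0
    weighted_example_sum = 200.0

    best_constant = weighted_label_sum / weighted_example_sum      # 0.455
    # Squared loss of always predicting the label mean on 0/1 labels:
    best_constant_loss = best_constant * (1.0 - best_constant)     # 0.247975

    print(best_constant, best_constant_loss)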