5.txt · 23684 lines (22669 loc) · 845 KB
Using Theano backend.
WARNING (theano.sandbox.cuda): The cuda backend is deprecated and will be removed in the next release (v0.10). Please switch to the gpuarray backend. You can get more information about how to switch at this URL:
https://github.com/Theano/Theano/wiki/Converting-to-the-new-gpu-back-end%28gpuarray%29
Using gpu device 3: GeForce GTX 1080 Ti (CNMeM is disabled, cuDNN Mixed dnn version. The header is from one version, but we link with a different version (5005, 7102))
52
10
31
3107
3016
(3016, 300)
(1284, 20, 300)
(1284, 20, 5)
(1284, 20, 20)
(229, 20, 300)
(229, 20, 5)
(229, 20, 20)
(686, 20, 300)
(686, 20, 5)
(686, 20, 20)
[{'input_dims': [300, 5, 20], 'batchsize': 128, 'memsize': 400, 'windowsize': 2, 'lr': 0.01, 'num_epochs': 50, 'h_dims': [128, 80, 16], 'momentum': 0.8}, {'shapes': 256, 'drop': 0.7}, {'shapes': 256, 'drop': 0.2}, {'shapes': 32, 'drop': 0.7}, {'shapes': 256, 'drop': 0.7}, {'shapes': 64, 'drop': 0.2}]
0 1.25001141429 1.1815636158 saving model
1 0.981080555916 1.08055782318 saving model
2 0.875533723831 1.00793361664 saving model
3 0.797413694859 1.01143610477
4 0.663908427954 0.927207350731 saving model
5 0.590053504705 0.915789186954 saving model
6 0.556165334582 0.986885547638
7 0.52845762074 0.908334314823 saving model
8 0.484459596872 0.882100462914 saving model
9 0.490174385905 0.924098849297
10 0.529712080956 1.03430652618
11 0.560218316317 0.973799407482
12 0.555620360374 0.956020057201
13 0.49897788167 0.940641701221
14 0.470193067193 0.948909580708
15 0.482360112667 0.960877239704
16 0.491836881638 0.949177026749
17 0.421313926578 0.897920608521
18 0.358544147015 0.922874093056
19 0.37776247561 0.908108890057
20 0.365319958329 0.984310209751
21 0.392173588276 0.904241681099
22 0.349086335301 0.939124584198
23 0.34341455102 0.925287902355
24 0.33810532093 0.921541392803
25 0.334470441937 0.926017820835
26 0.308081197739 0.90407204628
27 0.276602081954 0.90489923954
28 0.277040100098 0.917842149734
29 0.283753812313 0.915242135525
30 0.269514860213 0.916318595409
31 0.289224103093 0.907495081425
32 0.290980258584 0.921942830086
33 0.267672276497 0.951681613922
34 0.262762266397 0.909435451031
35 0.257708507776 0.88266992569
36 0.257957613468 0.872039914131 saving model
37 0.260146340728 0.928619623184
38 0.248771050572 0.925473570824
39 0.250855331123 0.888370096684
40 0.224962422252 0.909427046776
41 0.238081285357 0.906105339527
42 0.22538061291 0.920670032501
43 0.208777153492 0.917890131474
44 0.216868250072 0.907926738262
45 0.203382831812 0.914423942566
46 0.213256537914 0.898414194584
47 0.207984633744 0.909126758575
48 0.216211912036 0.897545933723
49 0.234456312656 0.895347833633
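The epoch lines above read "epoch, train loss, val loss", and the "saving model" tag appears exactly when the validation loss (third column) improves on the best value seen so far. A minimal sketch of that checkpoint rule, assuming this interpretation of the log format (the training script itself is not shown here):

```python
# Sketch (assumption): "saving model" fires whenever the epoch's validation
# loss beats the best validation loss seen so far in the run.
def checkpoint_epochs(val_losses):
    """Return the epoch indices at which a checkpoint would be written."""
    best, saved = float("inf"), []
    for epoch, loss in enumerate(val_losses):
        if loss < best:        # strict improvement on the running best
            best = loss
            saved.append(epoch)
    return saved

# First eight validation losses from the run above (rounded):
vals = [1.1816, 1.0806, 1.0079, 1.0114, 0.9272, 0.9158, 0.9869, 0.9083]
print(checkpoint_epochs(vals))  # [0, 1, 2, 4, 5, 7]
```

The printed indices match the epochs tagged "saving model" in the log, which supports reading the tag as a keep-best-on-validation checkpoint rather than a periodic save.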
model number is: 24997
mae: 1.0452287185
corr: 0.590473583704
mult_acc: 0.31633
mult f_score: 0.32414
Confusion Matrix :
[[284 95]
[ 86 221]]
Classification Report :
precision recall f1-score support
False 0.76757 0.74934 0.75834 379
True 0.69937 0.71987 0.70947 307
micro avg 0.73615 0.73615 0.73615 686
macro avg 0.73347 0.73461 0.73391 686
weighted avg 0.73705 0.73615 0.73647 686
Accuracy 0.736151603499
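The report figures above follow directly from the confusion matrix, assuming the usual sklearn convention that rows are true labels and columns are predictions. A small sketch recomputing accuracy, precision, recall, and F1 for this run from the logged matrix:

```python
# Sketch (assumption): rows = true labels (False, True), cols = predictions.
cm = [[284, 95],   # actual False: 284 predicted False, 95 predicted True
      [86, 221]]   # actual True:   86 predicted False, 221 predicted True

total = sum(sum(row) for row in cm)
accuracy = (cm[0][0] + cm[1][1]) / float(total)   # diagonal over everything

def prf(cm, cls):
    """Precision, recall, F1 for class index cls (0 = False, 1 = True)."""
    tp = cm[cls][cls]
    precision = tp / float(cm[0][cls] + cm[1][cls])  # tp over column sum
    recall = tp / float(sum(cm[cls]))                # tp over row sum
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

print(round(accuracy, 6))                 # 0.736152
print([round(x, 5) for x in prf(cm, 0)])  # [0.76757, 0.74934, 0.75834]
```

These reproduce the logged "Accuracy 0.736151603499" and the False row of the classification report, confirming how each table entry is derived.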
[{'input_dims': [300, 5, 20], 'batchsize': 32, 'memsize': 300, 'windowsize': 2, 'lr': 0.002, 'num_epochs': 50, 'h_dims': [64, 64, 32], 'momentum': 0.6}, {'shapes': 32, 'drop': 0.7}, {'shapes': 128, 'drop': 0.7}, {'shapes': 32, 'drop': 0.7}, {'shapes': 128, 'drop': 0.2}, {'shapes': 32, 'drop': 0.0}]
0 1.30768988729 1.32813930511 saving model
1 1.0487702772 1.1079647541 saving model
2 0.876673679054 1.04134261608 saving model
3 0.779019124061 0.987764298916 saving model
4 0.694241388887 0.987543344498 saving model
5 0.646560006589 0.97952067852 saving model
6 0.605894200504 0.95003592968 saving model
7 0.550390403718 0.957932770252
8 0.524436338991 0.920567035675 saving model
9 0.479024907202 0.939472019672
10 0.482111854106 0.997963428497
11 0.48315558508 0.95294123888
12 0.42375465408 0.924069881439
13 0.396397808939 0.939365148544
14 0.373697366193 0.993176281452
15 0.360032499209 0.967750370502
16 0.346289873868 0.984195172787
17 0.394651720673 1.01731276512
18 0.404284193367 0.979457676411
19 0.341228505597 0.955283880234
20 0.300776009634 0.942800998688
21 0.261419230327 0.956486225128
22 0.242553240061 0.938595712185
23 0.277220678329 0.952788233757
24 0.293043853715 1.0273706913
25 0.259680123627 1.00008940697
26 0.233356601 0.983517706394
27 0.203129076958 0.971563756466
28 0.204739720002 0.988723218441
29 0.234063583612 0.981164455414
30 0.263397706673 0.99816775322
31 0.227550782263 0.998910427094
32 0.199973164871 0.998895823956
33 0.212420818582 0.960309147835
34 0.186524748057 0.930965721607
35 0.210408709012 0.997936666012
36 0.199857991561 1.01797056198
37 0.192520235851 0.967353403568
38 0.181295597926 0.942654967308
39 0.165701475739 0.956309497356
40 0.171040118858 0.983714818954
41 0.193121903203 1.00490427017
42 0.169259593636 0.972331762314
43 0.171502716094 0.950822412968
44 0.212423925474 0.924476563931
45 0.227063301206 0.973221898079
46 0.231609024107 0.996343016624
47 0.217283548787 0.961578845978
48 0.221233126894 0.959631085396
49 0.211520653218 0.97636204958
model number is: 64944
mae: 0.990673117869
corr: 0.632151341174
mult_acc: 0.35423
mult f_score: 0.36473
Confusion Matrix :
[[301 78]
[ 84 223]]
Classification Report :
precision recall f1-score support
False 0.78182 0.79420 0.78796 379
True 0.74086 0.72638 0.73355 307
micro avg 0.76385 0.76385 0.76385 686
macro avg 0.76134 0.76029 0.76076 686
weighted avg 0.76349 0.76385 0.76361 686
Accuracy 0.763848396501
[{'input_dims': [300, 5, 20], 'batchsize': 256, 'memsize': 64, 'windowsize': 2, 'lr': 0.001, 'num_epochs': 50, 'h_dims': [128, 64, 64], 'momentum': 0.3}, {'shapes': 256, 'drop': 0.0}, {'shapes': 32, 'drop': 0.2}, {'shapes': 128, 'drop': 0.7}, {'shapes': 256, 'drop': 0.0}, {'shapes': 256, 'drop': 0.0}]
0 1.3195476532 1.3925011158 saving model
1 1.2715814352 1.33452868462 saving model
2 1.17065896988 1.19367897511 saving model
3 1.02047168016 1.14483261108 saving model
4 0.956348192692 1.11061894894 saving model
5 0.893869304657 1.09229803085 saving model
6 0.854640042782 1.06240689754 saving model
7 0.818321883678 1.04256618023 saving model
8 0.785671782494 1.02864956856 saving model
9 0.759244406223 1.02177548409 saving model
10 0.73049955368 1.02512359619
11 0.71533613205 1.02867043018
12 0.706657958031 1.04472470284
13 0.721572971344 1.02626621723
14 0.707422423363 1.04961979389
15 0.655809414387 1.02911758423
16 0.615452742577 1.03293931484
17 0.611201930046 1.02479219437
18 0.598788022995 1.02319180965
19 0.569539964199 1.03576052189
20 0.550089585781 1.04573071003
21 0.575121974945 1.03114879131
22 0.567225778103 1.04224276543
23 0.541265541315 1.0308008194
24 0.479878616333 1.04169845581
25 0.460327762365 1.03768324852
26 0.46397010684 1.03406488895
27 0.462498760223 1.04369592667
28 0.485468417406 1.03924238682
29 0.449419116974 1.05195987225
30 0.422486060858 1.0157289505 saving model
31 0.373106783628 1.03485429287
32 0.372446489334 1.04075300694
33 0.34168548584 1.01286125183 saving model
34 0.333442902565 1.04081583023
35 0.332548850775 1.0569409132
36 0.359360742569 1.01875424385
37 0.353250437975 1.05025029182
38 0.364730387926 1.0247066021
39 0.309413996339 1.02570021152
40 0.302817583084 1.01270163059 saving model
41 0.290793222189 1.02002871037
42 0.27721080482 1.02151310444
43 0.289848440886 1.00968587399 saving model
44 0.271795618534 1.03012859821
45 0.259143650532 1.01189517975
46 0.253567910194 1.01117777824
47 0.236824166775 1.01338684559
48 0.241079139709 1.01738762856
49 0.280778700113 1.01757144928
model number is: 51113
mae: 1.04069653387
corr: 0.608310307482
mult_acc: 0.31924
mult f_score: 0.32005
Confusion Matrix :
[[282 97]
[ 94 213]]
Classification Report :
precision recall f1-score support
False 0.75000 0.74406 0.74702 379
True 0.68710 0.69381 0.69044 307
micro avg 0.72157 0.72157 0.72157 686
macro avg 0.71855 0.71894 0.71873 686
weighted avg 0.72185 0.72157 0.72170 686
Accuracy 0.721574344023
[{'input_dims': [300, 5, 20], 'batchsize': 32, 'memsize': 64, 'windowsize': 2, 'lr': 0.008, 'num_epochs': 50, 'h_dims': [256, 8, 16], 'momentum': 0.3}, {'shapes': 64, 'drop': 0.2}, {'shapes': 32, 'drop': 0.2}, {'shapes': 128, 'drop': 0.5}, {'shapes': 32, 'drop': 0.2}, {'shapes': 64, 'drop': 0.2}]
0 1.18765443116 1.11168038845 saving model
1 0.937712582946 1.04245507717 saving model
2 0.814570222795 1.02373623848 saving model
3 0.724710731208 0.958516657352 saving model
4 0.639929158241 0.937026381493 saving model
5 0.564054016024 0.963490962982
6 0.539381270111 0.941008031368
7 0.500705775619 1.01057136059
8 0.507912301272 0.944897532463
9 0.516626821458 1.01605987549
10 0.517554729432 0.98688441515
11 0.521182475984 1.0022790432
12 0.476589672267 0.966814160347
13 0.444892945141 0.982326567173
14 0.451589201391 0.949109315872
15 0.411509362236 0.985093295574
16 0.393419319391 1.12048923969
17 0.466404984891 1.08424127102
18 0.495797618479 0.951840877533
19 0.412613075972 0.944432914257
20 0.404733662307 0.964715063572
21 0.386356840283 0.972290694714
22 0.360622625053 0.986365497112
23 0.385389594734 0.972783386707
24 0.352777932957 1.03610885143
25 0.359138645232 0.949586808681
26 0.334421185404 0.923638761044 saving model
27 0.302122130617 0.944239377975
28 0.302422622219 0.940558969975
29 0.311755022034 0.970401406288
30 0.319569024071 0.977670490742
31 0.31491898261 0.97163015604
32 0.327279949188 0.979028820992
33 0.369176822156 0.935754954815
34 0.343433396146 0.99231249094
35 0.339053057879 0.993698358536
36 0.35696621649 0.977135539055
37 0.31401508823 1.01989126205
38 0.330183626339 0.983710169792
39 0.342473169789 0.976063132286
40 0.380719681084 0.984969735146
41 0.339053703845 0.995283484459
42 0.321015901864 0.983294010162
43 0.342414674163 0.989170372486
44 0.334835469723 0.967579603195
45 0.309264838323 0.977527618408
46 0.291782534868 0.962030887604
47 0.304518140852 0.97584092617
48 0.282928675413 0.951184332371
49 0.325260804594 0.968567371368
model number is: 62599
mae: 1.05443589712
corr: 0.588100328431
mult_acc: 0.31341
mult f_score: 0.32178
Confusion Matrix :
[[269 110]
[ 94 213]]
Classification Report :
precision recall f1-score support
False 0.74105 0.70976 0.72507 379
True 0.65944 0.69381 0.67619 307
micro avg 0.70262 0.70262 0.70262 686
macro avg 0.70024 0.70179 0.70063 686
weighted avg 0.70453 0.70262 0.70319 686
Accuracy 0.702623906706
[{'input_dims': [300, 5, 20], 'batchsize': 32, 'memsize': 128, 'windowsize': 2, 'lr': 0.005, 'num_epochs': 50, 'h_dims': [128, 8, 80], 'momentum': 0.8}, {'shapes': 32, 'drop': 0.2}, {'shapes': 64, 'drop': 0.2}, {'shapes': 64, 'drop': 0.5}, {'shapes': 128, 'drop': 0.5}, {'shapes': 32, 'drop': 0.7}]
0 1.27100487947 1.15393793583 saving model
1 1.10234043598 1.04641866684 saving model
2 0.989410825074 1.09150779247
3 0.913098697364 0.996479392052 saving model
4 0.836624903977 0.965931773186 saving model
5 0.796350947022 0.962861061096 saving model
6 0.760359039903 0.926107883453 saving model
7 0.714901064336 0.94787055254
8 0.70454595983 0.939433276653
9 0.65503225252 0.952751219273
10 0.632762221247 0.982768774033
11 0.60211089626 0.984152853489
12 0.607657513022 0.975792944431
13 0.589355698228 0.983610212803
14 0.627283420414 0.973725378513
15 0.618117712438 0.976402163506
16 0.595266877115 0.987099707127
17 0.575790932029 0.970465362072
18 0.552141501755 0.983153343201
19 0.55514844656 0.97984546423
20 0.534663172811 0.974554955959
21 0.536453671008 0.966592311859
22 0.546506907046 0.93357694149
23 0.539057855308 0.950061619282
24 0.522280078381 0.964344143867
25 0.508704908937 0.95944994688
26 0.509511800855 0.955329716206
27 0.484789340943 0.95965641737
28 0.493760217726 0.922565698624 saving model
29 0.488151393086 0.937715888023
30 0.487053194642 0.936301827431
31 0.482223503292 0.935941457748
32 0.485018160939 0.945272505283
33 0.48801747635 0.94589060545
34 0.495609501004 0.963962197304
35 0.502864551544 0.917982876301 saving model
36 0.489291853458 0.9437520504
37 0.477250619233 0.94070482254
38 0.505660180002 0.928785860538
39 0.49191326499 0.935328722
40 0.470447666943 0.924922168255
41 0.48493681848 0.951813697815
42 0.476141947508 0.965186715126
43 0.470131225884 0.95864123106
44 0.471278175712 0.940729022026
45 0.459723939747 0.981064379215
46 0.480832191557 0.995678246021
47 0.468370340019 0.956069767475
48 0.465189407766 0.951274633408
49 0.472032907605 0.96987760067
model number is: 24824
mae: 1.0102341833
corr: 0.613178891033
mult_acc: 0.33236
mult f_score: 0.34306
Confusion Matrix :
[[285 94]
[ 94 213]]
Classification Report :
precision recall f1-score support
False 0.75198 0.75198 0.75198 379
True 0.69381 0.69381 0.69381 307
micro avg 0.72595 0.72595 0.72595 686
macro avg 0.72289 0.72289 0.72289 686
weighted avg 0.72595 0.72595 0.72595 686
Accuracy 0.725947521866
[{'input_dims': [300, 5, 20], 'batchsize': 256, 'memsize': 256, 'windowsize': 2, 'lr': 0.008, 'num_epochs': 50, 'h_dims': [156, 64, 16], 'momentum': 0.5}, {'shapes': 64, 'drop': 0.5}, {'shapes': 256, 'drop': 0.0}, {'shapes': 128, 'drop': 0.7}, {'shapes': 32, 'drop': 0.2}, {'shapes': 32, 'drop': 0.7}]
0 1.29051225185 1.27234899998 saving model
1 1.15296103954 1.2180634737 saving model
2 1.10743079185 1.23333621025
3 1.04705102444 1.14573121071 saving model
4 1.01548085213 1.16850304604
5 0.957000637054 1.01022076607 saving model
6 0.857898712158 1.00611555576 saving model
7 0.800144588947 0.978390753269 saving model
8 0.77759141922 1.02594041824
9 0.72902173996 1.00667309761
10 0.688861095905 0.971097826958 saving model
11 0.693989264965 1.00629353523
12 0.66445505619 1.00895726681
13 0.659655869007 1.02480399609
14 0.615817809105 0.975501179695
15 0.602621626854 0.990851581097
16 0.599755525589 1.00310099125
17 0.562838935852 0.989993035793
18 0.566470098495 0.971386134624
19 0.525996029377 0.979843497276
20 0.517606806755 0.958706915379 saving model
21 0.513923674822 0.956650435925 saving model
22 0.490429711342 0.951092422009 saving model
23 0.514896452427 0.959121704102
24 0.520912957191 0.985336184502
25 0.502485835552 0.951930701733
26 0.494182085991 0.939700603485 saving model
27 0.500946342945 0.970577836037
28 0.501612818241 0.961349844933
29 0.50111746788 0.959498524666
30 0.485204750299 0.950196266174
31 0.489804804325 0.958089530468
32 0.473830097914 0.959234774113
33 0.493422061205 0.95361739397
34 0.482315284014 0.942515552044
35 0.470590484142 0.949960768223
36 0.470788472891 0.968174159527
37 0.484881544113 0.98075991869
38 0.453681784868 0.942503750324
39 0.477064758539 0.959133088589
40 0.463630884886 0.940487980843
41 0.452823185921 0.975121021271
42 0.462593096495 0.960475921631
43 0.448688358068 1.00509595871
44 0.463132023811 0.977611303329
45 0.450976234674 0.971876740456
46 0.460336059332 0.971968472004
47 0.434373503923 0.971446335316
48 0.439561134577 0.987038850784
49 0.456748342514 0.956203877926
model number is: 10906
mae: 1.05016816071
corr: 0.581317181771
mult_acc: 0.31195
mult f_score: 0.33098
Confusion Matrix :
[[279 100]
[ 97 210]]
Classification Report :
precision recall f1-score support
False 0.74202 0.73615 0.73907 379
True 0.67742 0.68404 0.68071 307
micro avg 0.71283 0.71283 0.71283 686
macro avg 0.70972 0.71009 0.70989 686
weighted avg 0.71311 0.71283 0.71296 686
Accuracy 0.712827988338
[{'input_dims': [300, 5, 20], 'batchsize': 256, 'memsize': 400, 'windowsize': 2, 'lr': 0.002, 'num_epochs': 50, 'h_dims': [128, 8, 64], 'momentum': 0.3}, {'shapes': 128, 'drop': 0.0}, {'shapes': 256, 'drop': 0.2}, {'shapes': 32, 'drop': 0.5}, {'shapes': 128, 'drop': 0.5}, {'shapes': 256, 'drop': 0.5}]
0 1.31172456741 1.38277518749 saving model
1 1.24073476791 1.23637318611 saving model
2 1.07002658844 1.14768242836 saving model
3 1.06043044329 1.26612794399
4 1.03633333445 1.17790007591
5 0.987748527527 1.12792658806 saving model
6 0.903712248802 1.1061013937 saving model
7 0.871011602879 1.05971384048 saving model
8 0.82012193203 1.03458333015 saving model
9 0.806233727932 1.03390097618 saving model
10 0.761703109741 0.997638165951 saving model
11 0.72004121542 0.999933302402
12 0.704626369476 1.01205420494
13 0.682437241077 1.00220727921
14 0.629996705055 0.986156880856 saving model
15 0.598939561844 1.006965518
16 0.564076113701 1.00968754292
17 0.543253910542 1.01728355885
18 0.524460780621 1.01902520657
19 0.583154177666 1.02182102203
20 0.597071182728 0.995537161827
21 0.478062075377 0.98925024271
22 0.444548910856 0.993008673191
23 0.422773140669 1.0143686533
24 0.419237548113 1.02005887032
25 0.393191409111 1.0445420742
26 0.398138773441 1.01119589806
27 0.375626945496 1.01747322083
28 0.363803607225 1.02741372585
29 0.334088110924 1.04792273045
30 0.325079119205 1.02252197266
31 0.318542546034 1.01600670815
32 0.300577247143 1.02566349506
33 0.285706716776 1.01246523857
34 0.265138056874 1.02091550827
35 0.267516672611 1.01990091801
36 0.262726736069 1.01696419716
37 0.30310767889 1.01954734325
38 0.306868797541 1.01600003242
39 0.299991112947 1.04124593735
40 0.278588074446 1.02621436119
41 0.263121688366 1.02089333534
42 0.237858536839 1.01840138435
43 0.229881677032 1.02463841438
44 0.21601639986 1.0114620924
45 0.220336014032 1.01508378983
46 0.203920409083 1.01666009426
47 0.200571003556 1.01345682144
48 0.210556516051 1.02737545967
49 0.209190708399 1.01601362228
model number is: 69808
mae: 1.03791463539
corr: 0.615697893129
mult_acc: 0.30758
mult f_score: 0.32362
Confusion Matrix :
[[286 93]
[ 89 218]]
Classification Report :
precision recall f1-score support
False 0.76267 0.75462 0.75862 379
True 0.70096 0.71010 0.70550 307
micro avg 0.73469 0.73469 0.73469 686
macro avg 0.73182 0.73236 0.73206 686
weighted avg 0.73505 0.73469 0.73485 686
Accuracy 0.734693877551
[{'input_dims': [300, 5, 20], 'batchsize': 32, 'memsize': 128, 'windowsize': 2, 'lr': 0.002, 'num_epochs': 50, 'h_dims': [88, 16, 32], 'momentum': 0.9}, {'shapes': 64, 'drop': 0.5}, {'shapes': 64, 'drop': 0.7}, {'shapes': 128, 'drop': 0.5}, {'shapes': 64, 'drop': 0.5}, {'shapes': 128, 'drop': 0.0}]
0 1.17966791093 1.17416656017 saving model
1 0.933487276733 1.14785003662 saving model
2 0.838008153439 1.15136182308
3 0.788555711508 1.01736176014 saving model
4 0.712746103853 1.00132870674 saving model
5 0.683567675203 1.0290607214
6 0.602910218388 1.03513503075
7 0.563948085159 1.08858668804
8 0.545454163849 1.02882099152
9 0.483529401571 0.980244278908 saving model
10 0.47657821551 1.02845776081
11 0.442559383065 1.02496767044
12 0.399736988544 1.00677895546
13 0.343974143267 1.03708970547
14 0.364397021756 1.02418029308
15 0.33065280728 1.01443600655
16 0.331630922109 1.01713633537
17 0.339470651001 1.03943681717
18 0.401771552116 1.06361472607
19 0.360640660301 1.10150814056
20 0.3621665176 1.04393088818
21 0.353552385047 0.983015298843
22 0.320911093429 0.963678181171 saving model
23 0.249942932278 0.981160283089
24 0.2471968472 0.971205592155
25 0.244349526614 1.02359485626
26 0.295575486496 1.03334569931
27 0.288353201747 0.965964257717
28 0.253846465796 0.994215786457
29 0.213635395467 0.982635259628
30 0.197504989803 0.986733138561
31 0.241127854958 0.996337652206
32 0.251100753993 0.998084068298
33 0.24867301248 0.99860394001
34 0.206846784428 0.997927606106
35 0.188551032543 0.983975887299
36 0.197385953553 0.993113279343
37 0.223804350384 0.974225997925
38 0.205905834585 1.0304197073
39 0.18038609121 1.01315498352
40 0.168661520444 1.00907289982
41 0.226251668483 1.00469613075
42 0.214669035375 1.02641391754
43 0.181748437136 1.04331088066
44 0.166459431313 1.03526377678
45 0.162917255424 1.02969944477
46 0.190145429038 1.02581214905
47 0.272082640231 1.0324857235
48 0.263036204129 1.06365132332
49 0.277160989121 1.07068645954
model number is: 63939
mae: 1.03149374351
corr: 0.583756131819
mult_acc: 0.32653
mult f_score: 0.33326
Confusion Matrix :
[[299 80]
[117 190]]
Classification Report :
precision recall f1-score support
False 0.71875 0.78892 0.75220 379
True 0.70370 0.61889 0.65858 307
micro avg 0.71283 0.71283 0.71283 686
macro avg 0.71123 0.70391 0.70539 686
weighted avg 0.71202 0.71283 0.71030 686
Accuracy 0.712827988338
[{'input_dims': [300, 5, 20], 'batchsize': 32, 'memsize': 128, 'windowsize': 2, 'lr': 0.01, 'num_epochs': 50, 'h_dims': [256, 32, 80], 'momentum': 0.8}, {'shapes': 64, 'drop': 0.0}, {'shapes': 64, 'drop': 0.7}, {'shapes': 64, 'drop': 0.5}, {'shapes': 32, 'drop': 0.0}, {'shapes': 128, 'drop': 0.7}]
0 1.28125556707 1.14139473438 saving model
1 1.0680153504 1.0703356266 saving model
2 0.933975279331 1.08236384392
3 0.869293229282 0.978283107281 saving model
4 0.844571565092 0.994686067104
5 0.766746370494 1.00604045391
6 0.716481995583 0.981161057949
7 0.672271065414 0.931509315968 saving model
8 0.658995944262 0.950249791145
9 0.629280348122 0.97506582737
10 0.600637063384 0.933869719505
11 0.615580929816 0.996474504471
12 0.601017653197 0.979766964912
13 0.56215140298 1.00299143791
14 0.57901731357 0.967438697815
15 0.555424139649 0.941493809223
16 0.5459441863 0.996901988983
17 0.558590207994 1.03474330902
18 0.565889149159 0.996370434761
19 0.548791985959 1.03586220741
20 0.633462116122 0.997202038765
21 0.644559521228 1.02635467052
22 0.608092411608 1.00846147537
23 0.530993077904 0.998211324215
24 0.507049576938 0.98446136713
25 0.512037947774 0.973496735096
26 0.497200126201 0.984866082668
27 0.497567818314 0.984368383884
28 0.487782128155 0.962101578712
29 0.481925690919 0.996374964714
30 0.472101493925 0.985032916069
31 0.496768132597 1.01189553738
32 0.462441653758 1.00182402134
33 0.470106741041 1.00676560402
34 0.458491519094 0.986630380154
35 0.465104772151 0.974037110806
36 0.469884984195 0.98366856575
37 0.49659069553 0.995353102684
38 0.440979039669 0.986201107502
39 0.449142076075 1.00567412376
40 0.437861122936 1.01284980774
41 0.462953584641 1.02703642845
42 0.453002097458 1.01058411598
43 0.427660766989 1.02096939087
44 0.429580371827 1.00369691849
45 0.452681862563 1.00930893421
46 0.422265515476 0.974228918552
47 0.468727052957 1.00208854675
48 0.446139442176 1.02675783634
49 0.447491047531 1.03136873245
model number is: 32352
mae: 1.01272497879
corr: 0.61748726242
mult_acc: 0.32216
mult f_score: 0.34233
Confusion Matrix :
[[300 79]
[106 201]]
Classification Report :
precision recall f1-score support
False 0.73892 0.79156 0.76433 379
True 0.71786 0.65472 0.68484 307
micro avg 0.73032 0.73032 0.73032 686
macro avg 0.72839 0.72314 0.72458 686
weighted avg 0.72949 0.73032 0.72876 686
Accuracy 0.730320699708
[{'input_dims': [300, 5, 20], 'batchsize': 128, 'memsize': 400, 'windowsize': 2, 'lr': 0.001, 'num_epochs': 50, 'h_dims': [256, 80, 16], 'momentum': 0.8}, {'shapes': 32, 'drop': 0.5}, {'shapes': 256, 'drop': 0.5}, {'shapes': 32, 'drop': 0.2}, {'shapes': 128, 'drop': 0.7}, {'shapes': 64, 'drop': 0.0}]
0 1.30883706808 1.37106323242 saving model
1 1.17780979872 1.20094263554 saving model
2 1.01957657337 1.14543700218 saving model
3 0.935217857361 1.12451815605 saving model
4 0.906687569618 1.13299179077
5 0.921946400404 1.0860850811 saving model
6 0.905892318487 1.10647928715
7 0.827986103296 1.06457138062 saving model
8 0.766497182846 1.01734530926 saving model
9 0.709401887655 0.994723916054 saving model
10 0.66894775629 0.984024643898 saving model
11 0.630987411737 1.00475299358
12 0.585687738657 1.00700640678
13 0.55386813879 1.02887320518
14 0.525533342361 1.04629445076
15 0.514896532893 1.00091338158
16 0.505560806394 1.05984592438
17 0.491264143586 1.02310025692
18 0.469701349735 1.0427005291
19 0.485442665219 1.02055907249
20 0.434486997128 1.04377257824
21 0.411867707968 1.03652346134
22 0.358114972711 1.04260110855
23 0.345463687181 1.05737543106
24 0.314491128922 1.0499266386
25 0.367529594898 1.07919490337
26 0.432735395432 1.05294919014
27 0.479768207669 1.02958381176
28 0.456685176492 1.01746916771
29 0.493335571885 1.004098773
30 0.356167349219 0.970006942749 saving model
31 0.291618257761 0.97216385603
32 0.255771929026 0.983643472195
33 0.224186696112 0.977397680283
34 0.229306605458 0.982817590237
35 0.239581412077 1.03494012356
36 0.268956975639 1.05178916454
37 0.300939550996 0.99077385664
38 0.274238663912 1.04345357418
39 0.274277837574 1.0309535265
40 0.29495446682 1.05368375778
41 0.270642490685 1.01193261147
42 0.22166313827 1.0302836895
43 0.192358824611 0.992030262947
44 0.20426672101 1.0139169693
45 0.272589582205 1.00648772717
46 0.269734352827 1.0301502943
47 0.204548275471 1.01356983185
48 0.178853304684 0.988499581814
49 0.180544362962 0.996508896351
model number is: 35237
mae: 1.04099788525
corr: 0.600639993979
mult_acc: 0.30029
mult f_score: 0.30883
Confusion Matrix :
[[274 105]
[ 86 221]]
Classification Report :
precision recall f1-score support
False 0.76111 0.72296 0.74154 379
True 0.67791 0.71987 0.69826 307
micro avg 0.72157 0.72157 0.72157 686
macro avg 0.71951 0.72141 0.71990 686
weighted avg 0.72388 0.72157 0.72217 686
Accuracy 0.721574344023
/usr/local/lib/python2.7/dist-packages/sklearn/metrics/classification.py:1143: UndefinedMetricWarning: F-score is ill-defined and being set to 0.0 in labels with no predicted samples.
'precision', 'predicted', average, warn_for)
[{'input_dims': [300, 5, 20], 'batchsize': 32, 'memsize': 300, 'windowsize': 2, 'lr': 0.01, 'num_epochs': 50, 'h_dims': [256, 48, 32], 'momentum': 0.8}, {'shapes': 128, 'drop': 0.5}, {'shapes': 128, 'drop': 0.5}, {'shapes': 32, 'drop': 0.5}, {'shapes': 64, 'drop': 0.5}, {'shapes': 256, 'drop': 0.2}]
0 1.22249298245 1.20755910873 saving model
1 0.974904733896 1.13770127296 saving model
2 0.876865439117 1.02915430069 saving model
3 0.765383684635 1.05748093128
4 0.707404544204 1.00807058811 saving model
5 0.670175192505 1.01284205914
6 0.63158473298 1.00379526615 saving model
7 0.583025544882 0.962561309338 saving model
8 0.55714135617 0.917912781239 saving model
9 0.528695438057 0.957450389862
10 0.509799337387 0.991874456406
11 0.466183237731 1.02882432938
12 0.445253012329 0.991804122925
13 0.434794456512 0.964269816875
14 0.40999976024 1.006752491
15 0.371687108278 0.961398601532
16 0.42401778996 0.925559639931
17 0.396000831574 1.02426934242
18 0.426597514749 0.989769697189
19 0.513327948749 0.960933923721
20 0.419659718126 0.9506483078
21 0.388202723116 0.90856975317 saving model
22 0.381997342408 0.916200339794
23 0.384352151304 0.923399269581
24 0.387912187725 0.922598898411
25 0.361469059438 0.978991925716
26 0.33368325159 0.93335211277
27 0.339858773723 0.901228189468 saving model
28 0.324146094173 0.89837872982 saving model
29 0.313460652903 0.8843075037 saving model
30 0.296346765384 0.905044555664
31 0.313547138497 0.888565480709
32 0.302243196219 0.883297741413 saving model
33 0.351811213791 0.939002871513
34 0.335071337968 0.943829715252
35 0.378758510575 0.982328057289
36 0.342880720645 0.917846679688
37 0.413258440793 0.991606116295
38 0.37035904713 1.04256892204
39 0.356683964655 0.95444047451
40 0.299027251825 0.926144480705
41 0.303166126087 0.932577729225
42 0.296430433542 0.911957383156
43 0.322312304005 0.912788510323
44 0.338184631988 0.909014105797
45 0.307948505133 0.913823962212
46 0.291026894003 0.918176233768
47 0.304255343974 0.939275085926
48 0.306311724335 0.92064166069
49 0.281904216111 0.902369141579
model number is: 3906
mae: 1.0664300608
corr: 0.577065069356
mult_acc: 0.3105
mult f_score: 0.31406
Confusion Matrix :
[[278 101]
[ 92 215]]
Classification Report :
precision recall f1-score support
False 0.75135 0.73351 0.74232 379
True 0.68038 0.70033 0.69021 307
micro avg 0.71866 0.71866 0.71866 686
macro avg 0.71587 0.71692 0.71627 686
weighted avg 0.71959 0.71866 0.71900 686
Accuracy 0.718658892128
/usr/local/lib/python2.7/dist-packages/sklearn/metrics/classification.py:1145: UndefinedMetricWarning: F-score is ill-defined and being set to 0.0 in labels with no true samples.
'recall', 'true', average, warn_for)
[{'input_dims': [300, 5, 20], 'batchsize': 32, 'memsize': 128, 'windowsize': 2, 'lr': 0.001, 'num_epochs': 50, 'h_dims': [256, 32, 16], 'momentum': 0.5}, {'shapes': 256, 'drop': 0.2}, {'shapes': 64, 'drop': 0.7}, {'shapes': 32, 'drop': 0.0}, {'shapes': 64, 'drop': 0.7}, {'shapes': 256, 'drop': 0.2}]
0 1.24993257821 1.15963625908 saving model
1 0.985471318662 1.0578994751 saving model
2 0.85176371187 0.991401076317 saving model
3 0.766741212457 0.996259570122
4 0.713701346517 1.03075301647
5 0.668201853335 0.968281626701 saving model
6 0.614632793516 0.982253491879
7 0.594344989955 1.00899636745
8 0.581522742659 0.976875007153
9 0.536564036459 1.00038421154
10 0.542058861256 1.01174771786
11 0.494192077965 1.07104790211
12 0.511037055403 1.10341310501
13 0.54930883795 1.05969810486
14 0.460196167231 1.07445597649
15 0.401893181354 1.03754448891
16 0.407674774528 0.993642151356
17 0.375916469842 1.02695250511
18 0.351186555624 0.996803462505
19 0.342326721549 0.978021204472
20 0.335166727751 1.01974916458
21 0.315777127817 1.02013790607
22 0.313430521637 1.01606965065
23 0.324396162108 1.01272845268
24 0.314007882774 1.00756812096
25 0.267983844131 1.00787389278
26 0.248547996581 1.03044080734
27 0.268066467717 1.0031015873
28 0.269471711665 0.997968494892
29 0.25869715102 0.990842461586
30 0.256171777844 0.996243417263
31 0.245162516087 1.05332052708
32 0.256051248312 1.02660357952
33 0.258334624767 0.996321380138
34 0.241208542511 0.992003321648
35 0.253656174988 1.04190099239
36 0.256468141451 1.03855609894
37 0.236688097194 1.02675008774
38 0.215820450708 0.989781498909
39 0.230335506797 1.00485825539
40 0.234263657406 0.995730996132
41 0.300580814108 1.02484536171
42 0.273987896368 1.01968586445
43 0.278462914005 1.031701684
44 0.296699841693 1.01268863678
45 0.251339229196 1.04671931267
46 0.268883948028 1.06960642338
47 0.250582752377 1.05386924744
48 0.250231994689 1.03440284729
49 0.237463410571 1.03085911274
model number is: 64562
mae: 1.07456645405
corr: 0.619354510968
mult_acc: 0.31633
mult f_score: 0.33522
Confusion Matrix :
[[270 109]
[ 75 232]]
Classification Report :
             precision    recall  f1-score   support
       False   0.78261   0.71240   0.74586       379
        True   0.68035   0.75570   0.71605       307
   micro avg   0.73178   0.73178   0.73178       686
   macro avg   0.73148   0.73405   0.73095       686
weighted avg   0.73685   0.73178   0.73252       686
Accuracy 0.731778425656
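The per-epoch rows are `epoch train_loss val_loss`, and `saving model` is appended whenever the validation loss reaches a new minimum: every `saving model` in this log coincides with a strict improvement over the best value seen so far. A sketch of that checkpoint rule, using the first six validation losses of the run above:

```python
# Reproduce the "saving model" checkpoint rule seen in this log: a
# checkpoint is written whenever the validation loss (third column)
# strictly improves on the best value so far.
# Values below are epochs 0-5 of the run above.
val_losses = [1.15963625908, 1.0578994751, 0.991401076317,
              0.996259570122, 1.03075301647, 0.968281626701]

best = float('inf')
saved_epochs = []
for epoch, val in enumerate(val_losses):
    if val < best:              # strict improvement only
        best = val
        saved_epochs.append(epoch)

print(saved_epochs)  # [0, 1, 2, 5]
```

Epochs 0, 1, 2 and 5 are exactly the epochs tagged `saving model` in that run; epochs 3 and 4 regress and are skipped.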
[{'input_dims': [300, 5, 20], 'batchsize': 64, 'memsize': 64, 'windowsize': 2, 'lr': 0.008, 'num_epochs': 50, 'h_dims': [88, 8, 8], 'momentum': 0.3}, {'shapes': 256, 'drop': 0.7}, {'shapes': 32, 'drop': 0.0}, {'shapes': 128, 'drop': 0.0}, {'shapes': 64, 'drop': 0.2}, {'shapes': 64, 'drop': 0.5}]
0 1.22668139338 1.21552288532 saving model
1 1.03471276462 1.09380960464 saving model
2 0.903931540251 1.04720258713 saving model
3 0.798592934012 1.06329607964
4 0.758952867985 0.974589645863 saving model
5 0.656365200877 0.917575955391 saving model
6 0.641765059531 0.937527060509
7 0.579449723661 0.982351362705
8 0.561158929765 0.917849481106
9 0.56864476651 0.903386414051 saving model
10 0.508814145625 0.901554465294 saving model
11 0.484098084271 0.934728085995
12 0.482385236025 0.955872356892
13 0.554918801785 1.0001128912
14 0.55779902488 0.974173486233
15 0.551387122273 0.948176681995
16 0.482891497016 0.943856060505
17 0.450077337027 0.974274933338
18 0.420922905207 0.973399460316
19 0.39124044776 0.941447019577
20 0.379640096426 0.946981966496
21 0.397897656262 0.930577993393
22 0.387937183678 0.93506705761
23 0.401897747815 0.93339651823
24 0.38028575927 0.925377309322
25 0.413825069368 0.967388033867
26 0.390968093276 0.978320538998
27 0.403095762432 0.989640176296
28 0.389323683083 0.96126139164
29 0.364619767666 0.934763491154
30 0.376892632246 0.9386318326
31 0.376217490435 0.9581848979
32 0.41461443305 0.982243180275
33 0.41798401475 1.00710320473
34 0.429571411014 1.03250944614
35 0.416264638305 1.06372606754
36 0.454884788394 1.05662560463
37 0.505362994969 1.00369787216
38 0.470461255312 0.960874199867
39 0.431647180021 0.974041998386
40 0.418183071911 0.97690320015
41 0.391702069342 1.00900173187
42 0.386250965297 0.980181336403
43 0.36298199594 0.950149595737
44 0.369142763317 0.953932583332
45 0.367165002227 0.937518239021
46 0.361475767195 0.937397778034
47 0.340669183433 0.975588202477
48 0.370266003907 1.01691782475
49 0.385131123662 1.0264172554
model number is: 36526
mae: 1.08133578967
corr: 0.579848764819
mult_acc: 0.32362
mult f_score: 0.33693
Confusion Matrix :
[[265 114]
[ 82 225]]
Classification Report :
             precision    recall  f1-score   support
       False   0.76369   0.69921   0.73003       379
        True   0.66372   0.73290   0.69659       307
   micro avg   0.71429   0.71429   0.71429       686
   macro avg   0.71370   0.71605   0.71331       686
weighted avg   0.71895   0.71429   0.71507       686
Accuracy 0.714285714286
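The three averaging rows in these reports differ only in how the per-class figures are combined: micro averaging pools all samples (so in this binary, single-label setting every micro entry equals the overall accuracy), macro takes the unweighted mean over classes, and weighted uses the class supports. A quick check against the precision column of the report above:

```python
# Macro vs. weighted averaging, checked on the precision column above
# (False: 0.76369 over 379 samples, True: 0.66372 over 307 samples).
prec    = {'False': 0.76369, 'True': 0.66372}
support = {'False': 379, 'True': 307}
n = sum(support.values())  # 686

macro    = sum(prec.values()) / len(prec)               # unweighted class mean
weighted = sum(prec[c] * support[c] for c in prec) / n  # support-weighted mean

print(macro, weighted)
```

Up to display rounding these reproduce the macro avg (0.71370) and weighted avg (0.71895) precision entries above.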
[{'input_dims': [300, 5, 20], 'batchsize': 256, 'memsize': 400, 'windowsize': 2, 'lr': 0.01, 'num_epochs': 50, 'h_dims': [88, 48, 8], 'momentum': 0.6}, {'shapes': 32, 'drop': 0.5}, {'shapes': 256, 'drop': 0.0}, {'shapes': 256, 'drop': 0.7}, {'shapes': 128, 'drop': 0.7}, {'shapes': 32, 'drop': 0.0}]
0 1.32146382332 1.3353126049 saving model
1 1.22771332264 1.23421859741 saving model
2 1.04402723312 1.13664615154 saving model
3 0.886824846268 1.09551167488 saving model
4 0.804821491241 1.02665567398 saving model
5 0.764425086975 1.07556259632
6 0.801451134682 0.986496865749 saving model
7 0.632470107079 0.983466148376 saving model
8 0.5749314785 0.950269520283 saving model
9 0.516041982174 0.958903610706
10 0.492898362875 0.923401474953 saving model
11 0.476447826624 0.950195550919
12 0.469448530674 0.932460546494
13 0.408927363157 0.946363449097
14 0.370329660177 0.945915997028
15 0.337876719236 0.930290460587
16 0.317324590683 0.939445137978
17 0.29162247777 0.937457680702
18 0.262438055873 0.910726845264 saving model
19 0.2611102283 0.909459769726 saving model
20 0.22495906651 0.921080231667
21 0.232289132476 0.904887378216 saving model
22 0.235516998172 0.933023631573
23 0.22295845747 0.921361505985
24 0.187702006102 0.909528553486
25 0.184517025948 0.912598609924
26 0.17900723815 0.937612831593
27 0.191566359997 0.922341406345
28 0.16923879981 0.905984699726
29 0.153091832995 0.92389780283
30 0.149756005406 0.924592673779
31 0.148594510555 0.91895544529
32 0.155157232285 0.921135962009
33 0.168172442913 0.916193962097
34 0.143680460751 0.930437922478
35 0.123268419504 0.916506707668
36 0.120629449189 0.919531524181
37 0.124405971169 0.927171349525
38 0.142850703001 0.944027483463
39 0.134862528741 0.923605620861
40 0.124659980834 0.912611663342
41 0.115564714372 0.924898922443
42 0.109423291683 0.925328671932
43 0.108148214221 0.924560427666
44 0.100713944435 0.92524933815
45 0.125985087454 0.928552210331
46 0.123266498744 0.932896494865
47 0.10929069221 0.913948953152
48 0.0938871979713 0.917682886124
49 0.100973063707 0.929018199444
model number is: 55897
mae: 1.06103604225
corr: 0.573113763292
mult_acc: 0.31195
mult f_score: 0.3217
Confusion Matrix :
[[292 87]
[ 99 208]]
Classification Report :
             precision    recall  f1-score   support
       False   0.74680   0.77045   0.75844       379
        True   0.70508   0.67752   0.69103       307
   micro avg   0.72886   0.72886   0.72886       686