3.txt · 23684 lines (22669 loc) · 846 KB
Using Theano backend.
WARNING (theano.sandbox.cuda): The cuda backend is deprecated and will be removed in the next release (v0.10). Please switch to the gpuarray backend. You can get more information about how to switch at this URL:
https://github.com/Theano/Theano/wiki/Converting-to-the-new-gpu-back-end%28gpuarray%29
Using gpu device 1: GeForce GTX 1080 Ti (CNMeM is disabled, cuDNN Mixed dnn version. The header is from one version, but we link with a different version (5005, 7102))
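The deprecation warning above asks for a switch from the old `sandbox.cuda` backend to the gpuarray backend. In practice that is a device-flag change, e.g. in `.theanorc` (a sketch; the exact device name depends on the install, `cuda1` here only mirrors the "gpu device 1" line above):

```ini
# .theanorc -- select the new gpuarray backend (device=cuda*)
# instead of the deprecated sandbox.cuda one (device=gpu*).
[global]
device = cuda1
floatX = float32
```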
52
10
31
3107
3016
(3016, 300)
(1284, 20, 300)
(1284, 20, 5)
(1284, 20, 20)
(229, 20, 300)
(229, 20, 5)
(229, 20, 20)
(686, 20, 300)
(686, 20, 5)
(686, 20, 20)
[{'input_dims': [300, 5, 20], 'batchsize': 256, 'memsize': 128, 'windowsize': 2, 'lr': 0.01, 'num_epochs': 50, 'h_dims': [128, 64, 80], 'momentum': 0.1}, {'shapes': 128, 'drop': 0.7}, {'shapes': 128, 'drop': 0.0}, {'shapes': 32, 'drop': 0.7}, {'shapes': 32, 'drop': 0.0}, {'shapes': 32, 'drop': 0.7}]
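Config lines like the one above look like draws from a random hyperparameter search: one dict of training settings followed by five dense-layer dicts. A hedged reconstruction of such a sampler (the grids below are inferred from values seen in this log and the function name is hypothetical; the original search script is not shown):

```python
import random

# Candidate values inferred from configs appearing in this log;
# the true grids in the original script may differ.
GRID = {
    'batchsize': [32, 64, 128, 256],
    'memsize': [64, 128, 256, 300, 400],
    'lr': [0.001, 0.002, 0.005, 0.008, 0.01],
    'momentum': [0.1, 0.3, 0.6, 0.8, 0.9],
}
DENSE = {'shapes': [32, 64, 128, 256], 'drop': [0.0, 0.2, 0.5, 0.7]}

def sample_config(n_dense=5, seed=None):
    """Return [training_settings, layer_1, ..., layer_n_dense]."""
    rng = random.Random(seed)
    base = {
        'input_dims': [300, 5, 20],   # fixed across all runs in this log
        'windowsize': 2,
        'num_epochs': 50,
        'h_dims': [rng.choice([32, 64, 88, 128, 156, 256]),
                   rng.choice([16, 32, 48, 64, 80]),
                   rng.choice([32, 48, 64, 80])],
    }
    for key, choices in GRID.items():
        base[key] = rng.choice(choices)
    layers = [{'shapes': rng.choice(DENSE['shapes']),
               'drop': rng.choice(DENSE['drop'])}
              for _ in range(n_dense)]
    return [base] + layers
```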
0 1.293531847 1.2159756422 saving model
1 1.08064309359 1.10467457771 saving model
2 1.02292493582 1.11291801929
3 0.939090752602 1.0505926609 saving model
4 0.935581862926 1.09703993797
5 0.875301980972 0.95872348547 saving model
6 0.795811331272 0.987953662872
7 0.762345898151 0.96333694458
8 0.720205819607 0.997708916664
9 0.706245470047 0.983618438244
10 0.663969027996 0.950282931328 saving model
11 0.670211410522 0.961319684982
12 0.64778803587 0.94674551487 saving model
13 0.594671738148 0.926246881485 saving model
14 0.594743108749 0.933847367764
15 0.560138404369 0.953705906868
16 0.550632345676 0.942867875099
17 0.52513871789 0.926109969616 saving model
18 0.52678129077 0.934467673302
19 0.527994537354 0.932425439358
20 0.537560760975 0.937466800213
21 0.506332731247 0.927522599697
22 0.521729576588 0.963396787643
23 0.525565373898 0.898387014866 saving model
24 0.509677857161 0.940208077431
25 0.502279835939 0.932919979095
26 0.510614132881 0.924518406391
27 0.507844281197 0.939171016216
28 0.462592422962 0.947459816933
29 0.461731773615 0.967113494873
30 0.476988065243 0.9200989604
31 0.472156071663 0.974092841148
32 0.472020220757 0.938948333263
33 0.492165112495 1.00572609901
34 0.482648491859 0.929272532463
35 0.48086066246 0.955540239811
36 0.467816472054 0.917089760303
37 0.4578363657 0.934033632278
38 0.450193160772 0.94237279892
39 0.470565223694 0.94458425045
40 0.462973719835 0.935756087303
41 0.462576216459 0.953740000725
42 0.47501924634 0.93727260828
43 0.447610121965 0.960941374302
44 0.4455732584 0.951284885406
45 0.436916780472 0.943760752678
46 0.436753863096 0.926632165909
47 0.425842779875 0.964660763741
48 0.46153883934 0.941421985626
49 0.44361743331 0.969520807266
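Each epoch line above prints the epoch index, training loss, validation loss, and a "saving model" marker whenever validation loss improves on the best seen so far. A minimal sketch of that checkpoint-on-best pattern (`train_epoch`, `evaluate`, and `save_weights` are hypothetical stand-ins; the actual training code is not part of this log):

```python
def fit(model, num_epochs=50, path='best_model.npz'):
    """Train, keeping only the weights with the best validation loss."""
    best_val = float('inf')
    for epoch in range(num_epochs):
        train_loss = model.train_epoch()
        val_loss = model.evaluate()
        marker = ''
        if val_loss < best_val:        # strictly better -> checkpoint
            best_val = val_loss
            model.save_weights(path)
            marker = ' saving model'
        print('%d %s %s%s' % (epoch, train_loss, val_loss, marker))
```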
model number is: 79666
mae: 1.05661809758
corr: 0.604032602081
mult_acc: 0.30321
mult f_score: 0.33431
Confusion Matrix :
[[274 105]
[ 79 228]]
Classification Report :
precision recall f1-score support
False 0.77620 0.72296 0.74863 379
True 0.68468 0.74267 0.71250 307
micro avg 0.73178 0.73178 0.73178 686
macro avg 0.73044 0.73281 0.73057 686
weighted avg 0.73525 0.73178 0.73246 686
Accuracy 0.731778425656
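The per-class report and accuracy above can be recomputed from the confusion matrix alone (rows = actual class, columns = predicted class, class order assumed to be [False, True] as in the report). A small pure-Python sketch:

```python
def report_from_confusion(cm):
    """Per-class precision/recall/F1 plus accuracy from a square
    confusion matrix with rows = actual, columns = predicted."""
    n = len(cm)
    total = sum(sum(row) for row in cm)
    out = {}
    for k in range(n):
        tp = cm[k][k]
        support = sum(cm[k])                        # actual count of class k
        predicted = sum(cm[i][k] for i in range(n)) # predicted count of class k
        precision = tp / float(predicted) if predicted else 0.0
        recall = tp / float(support) if support else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        out[k] = {'precision': precision, 'recall': recall,
                  'f1': f1, 'support': support}
    out['accuracy'] = sum(cm[k][k] for k in range(n)) / float(total)
    return out

cm = [[274, 105],   # actual False: 274 predicted False, 105 predicted True
      [79, 228]]    # actual True:   79 predicted False, 228 predicted True
r = report_from_confusion(cm)
```

Run against the matrix printed above, this reproduces the report's numbers, e.g. precision 0.77620 and recall 0.72296 for the False class and overall accuracy 0.73178.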
[{'input_dims': [300, 5, 20], 'batchsize': 32, 'memsize': 128, 'windowsize': 2, 'lr': 0.001, 'num_epochs': 50, 'h_dims': [128, 64, 64], 'momentum': 0.8}, {'shapes': 64, 'drop': 0.2}, {'shapes': 64, 'drop': 0.2}, {'shapes': 32, 'drop': 0.7}, {'shapes': 256, 'drop': 0.7}, {'shapes': 32, 'drop': 0.5}]
0 1.30875994563 1.34452152252 saving model
1 1.09701587409 1.13336586952 saving model
2 0.96336362958 1.10789835453 saving model
3 0.907641409338 1.09080708027 saving model
4 0.857720398903 1.01121091843 saving model
5 0.794773347676 1.00907683372 saving model
6 0.750316031277 0.975692212582 saving model
7 0.698030729592 0.985982298851
8 0.673501311243 1.00744378567
9 0.645394934714 0.995186626911
10 0.65199778825 1.02460479736
11 0.61898688525 1.01418876648
12 0.609898671508 0.987272679806
13 0.562342794985 0.992069423199
14 0.55199065432 0.993048012257
15 0.509773125499 1.00211763382
16 0.483158572763 0.974431157112 saving model
17 0.485117812455 0.968001723289 saving model
18 0.47068348825 0.987053394318
19 0.462631452084 1.01207315922
20 0.482963553071 1.01150131226
21 0.452910387516 1.02699458599
22 0.46100307256 0.986517131329
23 0.415341224521 1.00690150261
24 0.406629517674 0.986917674541
25 0.394323404133 0.982565462589
26 0.383660572022 0.974270284176
27 0.383090592176 0.98778706789
28 0.377207827196 1.0091599226
29 0.386788061261 0.973020553589
30 0.376180484146 0.998263955116
31 0.376787886769 0.995400667191
32 0.370308246464 0.986927568913
33 0.363257677853 0.978755712509
34 0.356378483772 1.00817036629
35 0.3559192691 0.95413172245 saving model
36 0.35576434806 0.988498926163
37 0.352201276273 1.01298451424
38 0.366817545891 0.959967494011
39 0.367604134232 0.977706372738
40 0.356821011752 0.999153077602
41 0.350813508779 1.01428318024
42 0.338011109456 0.983201026917
43 0.344609028846 0.972855746746
44 0.347155642509 0.961441099644
45 0.337128537521 0.989633202553
46 0.330222949386 0.95996594429
47 0.334361349046 0.963209688663
48 0.325151142105 0.968442440033
49 0.318335439265 0.971174240112
model number is: 16469
mae: 1.05063528761
corr: 0.59469800481
mult_acc: 0.30175
mult f_score: 0.31455
Confusion Matrix :
[[280 99]
[ 90 217]]
Classification Report :
precision recall f1-score support
False 0.75676 0.73879 0.74766 379
True 0.68671 0.70684 0.69663 307
micro avg 0.72449 0.72449 0.72449 686
macro avg 0.72173 0.72281 0.72215 686
weighted avg 0.72541 0.72449 0.72482 686
Accuracy 0.724489795918
[{'input_dims': [300, 5, 20], 'batchsize': 128, 'memsize': 128, 'windowsize': 2, 'lr': 0.008, 'num_epochs': 50, 'h_dims': [64, 32, 48], 'momentum': 0.6}, {'shapes': 32, 'drop': 0.2}, {'shapes': 256, 'drop': 0.5}, {'shapes': 256, 'drop': 0.5}, {'shapes': 256, 'drop': 0.0}, {'shapes': 256, 'drop': 0.7}]
0 1.29937428236 1.22945606709 saving model
1 1.04318573475 1.15382516384 saving model
2 0.940403687954 1.02754175663 saving model
3 0.820582360029 0.962307274342 saving model
4 0.77513781786 1.0028141737
5 0.713963115215 0.999525845051
6 0.656205302477 0.96804690361
7 0.602342927456 0.988277316093
8 0.553634822369 0.964510440826
9 0.533712154627 0.999946296215
10 0.509157305956 0.963586747646
11 0.506782925129 1.03757190704
12 0.46960708797 0.962627410889
13 0.467574128509 0.992213904858
14 0.444238859415 0.970498323441
15 0.469853425026 0.977526962757
16 0.536692655087 0.990583658218
17 0.644097194076 1.25329113007
18 0.699694234133 1.02505338192
19 0.490863996744 1.02151417732
20 0.447174084187 1.00042903423
21 0.436766561866 0.994074702263
22 0.436068865657 0.978174030781
23 0.435726907849 0.995161354542
24 0.412993058562 0.96821308136
25 0.417812022567 0.976856172085
26 0.373945832253 0.939818263054 saving model
27 0.368053814769 0.994064807892
28 0.382614925504 1.01150155067
29 0.376770988107 0.9655123353
30 0.398614346981 0.963021874428
31 0.423473897576 1.00057637691
32 0.469765600562 0.94497948885
33 0.404391658306 1.00423252583
34 0.415084281564 0.947913706303
35 0.391131451726 0.961998045444
36 0.354521653056 0.963798344135
37 0.331836980581 0.96606862545
38 0.331542205811 0.97974139452
39 0.343195071816 0.940284073353
40 0.317180019617 0.945425629616
41 0.317069283128 0.958027422428
42 0.332536014915 0.948087573051
43 0.314099526405 0.95529282093
44 0.325275138021 0.972223579884
45 0.341331771016 0.948814094067
46 0.324643528461 0.954046964645
47 0.332363969088 0.950395464897
48 0.331674411893 0.936078846455 saving model
49 0.305243775249 0.972489953041
model number is: 87636
mae: 1.02311394298
corr: 0.591001202443
mult_acc: 0.33819
mult f_score: 0.34151
Confusion Matrix :
[[283 96]
[101 206]]
Classification Report :
precision recall f1-score support
False 0.73698 0.74670 0.74181 379
True 0.68212 0.67101 0.67652 307
micro avg 0.71283 0.71283 0.71283 686
macro avg 0.70955 0.70886 0.70916 686
weighted avg 0.71243 0.71283 0.71259 686
Accuracy 0.712827988338
[{'input_dims': [300, 5, 20], 'batchsize': 32, 'memsize': 256, 'windowsize': 2, 'lr': 0.002, 'num_epochs': 50, 'h_dims': [88, 48, 32], 'momentum': 0.8}, {'shapes': 256, 'drop': 0.5}, {'shapes': 32, 'drop': 0.5}, {'shapes': 32, 'drop': 0.0}, {'shapes': 64, 'drop': 0.0}, {'shapes': 64, 'drop': 0.5}]
0 1.28196635991 1.19841384888 saving model
1 1.03144022524 1.11048650742 saving model
2 0.897053228319 1.0518655777 saving model
3 0.796924553812 1.00336778164 saving model
4 0.751535464823 1.02207672596
5 0.703404641151 1.03127348423
6 0.648121550679 0.953482806683 saving model
7 0.65139914155 0.981941342354
8 0.633319564909 0.981835782528
9 0.586965334415 1.00166869164
10 0.669585734606 1.0348739624
11 0.626705625653 1.03269565105
12 0.567504294962 0.997885048389
13 0.493714514375 0.978436470032
14 0.436050711572 0.951424062252 saving model
15 0.421789184958 0.970957875252
16 0.43392271921 1.00195002556
17 0.40212123245 0.961722016335
18 0.385148139298 0.970955550671
19 0.359170077741 0.970508337021
20 0.366513786465 0.958610594273
21 0.378461608291 0.942933619022 saving model
22 0.353247963637 0.965143322945
23 0.359460446239 0.963479280472
24 0.359756688774 0.961337864399
25 0.336364842206 0.959712386131
26 0.352283871174 0.965493023396
27 0.340914520621 0.979080379009
28 0.333195589855 0.963599443436
29 0.350460026786 0.970707774162
30 0.311457204074 0.99368417263
31 0.312110357732 1.00405550003
32 0.320687206835 0.932520449162 saving model
33 0.318285892159 0.930450856686 saving model
34 0.302740643919 0.940436899662
35 0.298554717004 0.947636187077
36 0.288524575531 0.916186094284 saving model
37 0.298734341189 0.949454605579
38 0.298846380785 0.954531311989
39 0.303624003008 0.964928090572
40 0.323699083179 0.999274194241
41 0.289503368363 0.99708032608
42 0.306162371859 0.940391898155
43 0.322111165896 0.946433663368
44 0.288364803791 0.93995475769
45 0.291292775795 0.936091780663
46 0.274404438585 0.964311242104
47 0.280489456281 0.948107123375
48 0.294734369218 0.96572726965
49 0.275446489081 0.984754383564
model number is: 30702
mae: 1.03235842634
corr: 0.597680070775
mult_acc: 0.31195
mult f_score: 0.31556
Confusion Matrix :
[[291 88]
[ 94 213]]
Classification Report :
precision recall f1-score support
False 0.75584 0.76781 0.76178 379
True 0.70764 0.69381 0.70066 307
micro avg 0.73469 0.73469 0.73469 686
macro avg 0.73174 0.73081 0.73122 686
weighted avg 0.73427 0.73469 0.73443 686
Accuracy 0.734693877551
[{'input_dims': [300, 5, 20], 'batchsize': 256, 'memsize': 300, 'windowsize': 2, 'lr': 0.01, 'num_epochs': 50, 'h_dims': [156, 16, 80], 'momentum': 0.6}, {'shapes': 64, 'drop': 0.2}, {'shapes': 256, 'drop': 0.0}, {'shapes': 32, 'drop': 0.7}, {'shapes': 256, 'drop': 0.5}, {'shapes': 32, 'drop': 0.5}]
0 1.30834608078 1.29546165466 saving model
1 1.25942893028 1.26989901066 saving model
2 1.12992472649 1.22879171371 saving model
3 1.08099794388 1.16053521633 saving model
4 0.970155215263 1.07281327248 saving model
5 0.905527806282 1.05679857731 saving model
6 0.84272172451 1.02432978153 saving model
7 0.799630784988 0.999671518803 saving model
8 0.74456923008 0.985985457897 saving model
9 0.700516998768 0.98275411129 saving model
10 0.646260404587 0.940052866936 saving model
11 0.592183768749 0.955489218235
12 0.579792809486 0.970515549183
13 0.544608592987 0.946506679058
14 0.531240987778 0.968661367893
15 0.52410889864 0.990860402584
16 0.542827361822 0.963394701481
17 0.522728049755 0.965950608253
18 0.488202512264 0.928620934486 saving model
19 0.443873023987 0.936328589916
20 0.423342376947 0.920839905739 saving model
21 0.41352071166 0.961275875568
22 0.424519789219 0.927833735943
23 0.43658195138 0.964092850685
24 0.408295756578 0.938682079315
25 0.396126931906 0.954010784626
26 0.387146538496 0.94213616848
27 0.39694994688 0.941154062748
28 0.390150523186 0.925131976604
29 0.372905731201 0.929291844368
30 0.370107209682 0.973874926567
31 0.358444952965 0.916546523571 saving model
32 0.360579156876 0.943872690201
33 0.346356117725 0.932070016861
34 0.359852290154 0.932663679123
35 0.34906129241 0.939832150936
36 0.339012426138 0.937743484974
37 0.33701364398 0.922438204288
38 0.325554901361 0.913492679596 saving model
39 0.344227778912 0.963684797287
40 0.326415616274 0.937889933586
41 0.338098329306 0.94634437561
42 0.321825611591 0.931649982929
43 0.326291388273 0.977101325989
44 0.334284883738 0.945838212967
45 0.314556831121 0.950385034084
46 0.331794637442 0.929444491863
47 0.314252066612 0.955842375755
48 0.312550067902 0.950932264328
49 0.312040251493 0.944995343685
model number is: 25984
mae: 1.04897363651
corr: 0.604441926463
mult_acc: 0.32507
mult f_score: 0.3425
Confusion Matrix :
[[271 108]
[ 82 225]]
Classification Report :
precision recall f1-score support
False 0.76771 0.71504 0.74044 379
True 0.67568 0.73290 0.70312 307
micro avg 0.72303 0.72303 0.72303 686
macro avg 0.72169 0.72397 0.72178 686
weighted avg 0.72652 0.72303 0.72374 686
Accuracy 0.723032069971
/usr/local/lib/python2.7/dist-packages/sklearn/metrics/classification.py:1145: UndefinedMetricWarning: F-score is ill-defined and being set to 0.0 in labels with no true samples.
'recall', 'true', average, warn_for)
[{'input_dims': [300, 5, 20], 'batchsize': 256, 'memsize': 400, 'windowsize': 2, 'lr': 0.005, 'num_epochs': 50, 'h_dims': [156, 32, 64], 'momentum': 0.3}, {'shapes': 32, 'drop': 0.0}, {'shapes': 64, 'drop': 0.5}, {'shapes': 64, 'drop': 0.5}, {'shapes': 128, 'drop': 0.7}, {'shapes': 64, 'drop': 0.7}]
0 1.3093234539 1.35067868233 saving model
1 1.16414895058 1.21208202839 saving model
2 1.05404495001 1.18024480343 saving model
3 1.01261539459 1.09287643433 saving model
4 0.952659153938 1.06990170479 saving model
5 0.901660251617 1.07552468777
6 0.873966515064 1.01709747314 saving model
7 0.847261762619 1.01298999786 saving model
8 0.796055805683 0.981549501419 saving model
9 0.784921586514 0.993173062801
10 0.723877370358 0.974009275436 saving model
11 0.709298694134 1.00067937374
12 0.700318646431 0.992850601673
13 0.696157062054 1.03566420078
14 0.666175091267 0.994560658932
15 0.621507608891 0.995929896832
16 0.577560710907 1.00016570091
17 0.549828195572 1.00642597675
18 0.540499663353 0.971049368382 saving model
19 0.524137866497 0.959271848202 saving model
20 0.509931194782 0.977424323559
21 0.505261719227 0.963867068291
22 0.483962005377 1.00621318817
23 0.4754152596 0.967064499855
24 0.469319564104 1.00683641434
25 0.468938851357 0.973881363869
26 0.440354925394 0.989182889462
27 0.446318203211 0.986796498299
28 0.430079519749 0.989633858204
29 0.41447006464 0.973306596279
30 0.420886605978 0.972964346409
31 0.404169178009 0.968035817146
32 0.398546284437 0.934862434864 saving model
33 0.382495462894 0.971442997456
34 0.367666053772 0.958334684372
35 0.363355058432 0.958805978298
36 0.362034994364 0.982151269913
37 0.376992857456 0.969713389874
38 0.354149091244 0.979403018951
39 0.361888772249 0.98475331068
40 0.359692811966 0.961834907532
41 0.373777157068 0.959996521473
42 0.398939394951 0.967283308506
43 0.361303877831 0.97603982687
44 0.37870387435 0.96108943224
45 0.371181708574 0.986518800259
46 0.353161811829 0.960599899292
47 0.359993082285 0.982558012009
48 0.358262741566 0.960492551327
49 0.350036656857 0.947540521622
model number is: 49779
mae: 1.07763674964
corr: 0.567925215839
mult_acc: 0.27697
mult f_score: 0.29963
Confusion Matrix :
[[288 91]
[ 97 210]]
Classification Report :
precision recall f1-score support
False 0.74805 0.75989 0.75393 379
True 0.69767 0.68404 0.69079 307
micro avg 0.72595 0.72595 0.72595 686
macro avg 0.72286 0.72197 0.72236 686
weighted avg 0.72551 0.72595 0.72567 686
Accuracy 0.725947521866
[{'input_dims': [300, 5, 20], 'batchsize': 32, 'memsize': 256, 'windowsize': 2, 'lr': 0.01, 'num_epochs': 50, 'h_dims': [256, 16, 48], 'momentum': 0.3}, {'shapes': 32, 'drop': 0.5}, {'shapes': 32, 'drop': 0.0}, {'shapes': 256, 'drop': 0.5}, {'shapes': 128, 'drop': 0.5}, {'shapes': 32, 'drop': 0.5}]
0 1.29131424725 1.20541870594 saving model
1 1.10096101612 1.1246407032 saving model
2 0.926051403582 1.03841114044 saving model
3 0.82706720531 1.04633319378
4 0.780466014147 1.02088391781 saving model
5 0.718743462861 0.99675565958 saving model
6 0.707809343934 0.993093311787 saving model
7 0.660998111963 1.0016579628
8 0.678273547441 0.987783849239 saving model
9 0.642053765804 1.01141774654
10 0.576701910794 0.966816306114 saving model
11 0.558769582957 0.944684267044 saving model
12 0.582311582565 1.0172175169
13 0.566005631536 0.987033367157
14 0.547849030793 0.996692419052
15 0.520696244389 0.937914967537 saving model
16 0.505523592234 0.957861244678
17 0.501184841245 0.999245584011
18 0.490283019096 0.966197431087
19 0.487531197071 0.97860199213
20 0.454530242085 0.983277440071
21 0.473645292222 0.996549785137
22 0.483296753466 0.969478785992
23 0.46750633046 0.963282167912
24 0.46604341343 1.00593209267
25 0.47636590302 1.00568437576
26 0.462956907228 0.983153879642
27 0.473045677692 0.967484712601
28 0.489353780448 0.988903820515
29 0.480688558519 1.01070141792
30 0.459996665269 0.964982330799
31 0.462305928767 0.947107970715
32 0.444660595059 0.972941160202
33 0.429367634654 0.986534416676
34 0.476033788174 1.04243957996
35 0.443640314788 0.982167601585
36 0.468122769892 1.01486444473
37 0.452446354181 0.93486136198 saving model
38 0.471068006754 0.936853408813
39 0.427042324096 0.954900145531
40 0.457493836433 0.988539099693
41 0.448677925766 1.01821219921
42 0.425310911238 0.990345478058
43 0.432943240553 0.951631188393
44 0.41072980687 0.987186908722
45 0.4182012178 0.962373673916
46 0.433309978992 0.983765244484
47 0.429056625068 0.993976056576
48 0.42586819008 0.994675278664
49 0.426206588745 0.984202027321
model number is: 80794
mae: 1.01706846503
corr: 0.606261865718
mult_acc: 0.32216
mult f_score: 0.32271
Confusion Matrix :
[[274 105]
[ 82 225]]
Classification Report :
precision recall f1-score support
False 0.76966 0.72296 0.74558 379
True 0.68182 0.73290 0.70644 307
micro avg 0.72741 0.72741 0.72741 686
macro avg 0.72574 0.72793 0.72601 686
weighted avg 0.73035 0.72741 0.72806 686
Accuracy 0.727405247813
[{'input_dims': [300, 5, 20], 'batchsize': 64, 'memsize': 128, 'windowsize': 2, 'lr': 0.001, 'num_epochs': 50, 'h_dims': [128, 32, 64], 'momentum': 0.3}, {'shapes': 128, 'drop': 0.0}, {'shapes': 128, 'drop': 0.5}, {'shapes': 32, 'drop': 0.5}, {'shapes': 256, 'drop': 0.5}, {'shapes': 64, 'drop': 0.2}]
0 1.29273434281 1.32354974747 saving model
1 1.05936661363 1.10020315647 saving model
2 0.938393616676 1.11372673512
3 0.898332646489 1.02673518658 saving model
4 0.808515101671 1.02486407757 saving model
5 0.762548229098 1.04524374008
6 0.719731265306 1.01750421524 saving model
7 0.675085151196 1.01561141014 saving model
8 0.636593359709 1.00171339512 saving model
9 0.636086991429 1.11339759827
10 0.637828421593 1.03922379017
11 0.666973406076 0.991088747978 saving model
12 0.571428398788 0.994287908077
13 0.523541788757 1.00749456882
14 0.48741825372 1.05248188972
15 0.48955232054 0.998635530472
16 0.472752191126 0.991861402988
17 0.436336815357 1.01284444332
18 0.436120201647 1.06181061268
19 0.401338270307 1.03395521641
20 0.390854012966 1.02072262764
21 0.37470715493 1.02108836174
22 0.386496014893 1.04610395432
23 0.434107735753 1.01927268505
24 0.375405961275 1.00434577465
25 0.39791559726 1.11010575294
26 0.384768280387 1.03718543053
27 0.399035950005 1.07459664345
28 0.389160320163 1.06857764721
29 0.398018717766 1.01539969444
30 0.34796276316 0.994714915752
31 0.320711369812 1.08520996571
32 0.323206323385 1.08750283718
33 0.338194349408 1.17439877987
34 0.404455137253 1.2478634119
35 0.484873966873 1.08470702171
36 0.453982207179 1.0662882328
37 0.407595032454 1.02142834663
38 0.317190609872 1.02173233032
39 0.278296338022 1.01697945595
40 0.257629480213 1.02081847191
41 0.243052633107 1.02872109413
42 0.226755533367 1.01368439198
43 0.222951186448 1.02485251427
44 0.225849194825 1.02925038338
45 0.208435282111 1.02127432823
46 0.211782423407 1.02003693581
47 0.215518333763 1.03627133369
48 0.233309916407 1.0187009573
49 0.228736686707 1.05586445332
model number is: 79470
mae: 0.997579342486
corr: 0.628229663983
mult_acc: 0.32507
mult f_score: 0.34371
Confusion Matrix :
[[315 64]
[106 201]]
Classification Report :
precision recall f1-score support
False 0.74822 0.83113 0.78750 379
True 0.75849 0.65472 0.70280 307
micro avg 0.75219 0.75219 0.75219 686
macro avg 0.75335 0.74293 0.74515 686
weighted avg 0.75282 0.75219 0.74959 686
Accuracy 0.752186588921
[{'input_dims': [300, 5, 20], 'batchsize': 256, 'memsize': 300, 'windowsize': 2, 'lr': 0.002, 'num_epochs': 50, 'h_dims': [32, 16, 32], 'momentum': 0.1}, {'shapes': 256, 'drop': 0.2}, {'shapes': 128, 'drop': 0.7}, {'shapes': 64, 'drop': 0.7}, {'shapes': 64, 'drop': 0.5}, {'shapes': 64, 'drop': 0.2}]
0 1.32216057777 1.40694534779 saving model
1 1.30464336872 1.38996648788 saving model
2 1.27333538532 1.34530627728 saving model
3 1.19409191608 1.23371577263 saving model
4 1.05237586498 1.12470018864 saving model
5 0.968768751621 1.08782529831 saving model
6 0.919361257553 1.09715247154
7 0.884920966625 1.07496345043 saving model
8 0.834439754486 1.04238128662 saving model
9 0.800069105625 1.02167129517 saving model
10 0.782655060291 1.01407444477 saving model
11 0.77158318758 1.04235315323
12 0.761876702309 1.02443349361
13 0.707494866848 1.01904964447
14 0.687209510803 1.00482475758 saving model
15 0.660813236237 1.02858293056
16 0.671609508991 1.01362347603
17 0.634866476059 1.01342332363
18 0.609552764893 1.01025605202
19 0.576118433475 1.01470541954
20 0.550227630138 1.02174782753
21 0.546050226688 1.02020800114
22 0.520787465572 1.02703404427
23 0.504217493534 1.03225588799
24 0.487834048271 1.03887104988
25 0.470937114954 1.05836367607
26 0.498819869757 1.06778156757
27 0.556183838844 1.02377295494
28 0.577080893517 1.04274129868
29 0.523206073046 1.03443825245
30 0.450943082571 1.02929186821
31 0.470637202263 1.03000092506
32 0.491406875849 1.0256165266
33 0.429110127687 1.03959584236
34 0.402076560259 1.03896343708
35 0.395725494623 1.03487181664
36 0.397781705856 1.0313334465
37 0.40452876687 1.03381729126
38 0.39697778821 1.0292942524
39 0.375476872921 1.04116153717
40 0.385163003206 1.04251623154
41 0.383533275127 1.04321932793
42 0.402869480848 1.04204893112
43 0.369142347574 1.05129742622
44 0.341933882236 1.04290914536
45 0.337292045355 1.03249371052
46 0.331912446022 1.03648900986
47 0.335145872831 1.04280352592
48 0.315351438522 1.03997898102
49 0.313902437687 1.02440285683
model number is: 57591
mae: 1.05073683206
corr: 0.60717062148
mult_acc: 0.30904
mult f_score: 0.34945
Confusion Matrix :
[[285 94]
[ 88 219]]
Classification Report :
precision recall f1-score support
False 0.76408 0.75198 0.75798 379
True 0.69968 0.71336 0.70645 307
micro avg 0.73469 0.73469 0.73469 686
macro avg 0.73188 0.73267 0.73222 686
weighted avg 0.73526 0.73469 0.73492 686
Accuracy 0.734693877551
[{'input_dims': [300, 5, 20], 'batchsize': 64, 'memsize': 300, 'windowsize': 2, 'lr': 0.005, 'num_epochs': 50, 'h_dims': [32, 64, 48], 'momentum': 0.9}, {'shapes': 256, 'drop': 0.7}, {'shapes': 256, 'drop': 0.2}, {'shapes': 256, 'drop': 0.0}, {'shapes': 256, 'drop': 0.7}, {'shapes': 256, 'drop': 0.0}]
0 1.22460617423 1.14124166965 saving model
1 0.927369201183 1.08820033073 saving model
2 0.846695548296 1.025578022 saving model
3 0.757558467984 1.10963952541
4 0.749009221792 0.983224391937 saving model
5 0.62870683521 0.933750450611 saving model
6 0.599974042177 0.95463603735
7 0.564697852731 0.981899619102
8 0.498288218677 0.992578208447
9 0.473900164664 0.973732769489
10 0.448495040834 0.943853497505
11 0.425385816395 0.98043435812
12 0.39534497261 1.02084267139
13 0.424123355746 0.966306328773
14 0.457363142073 1.02159333229
15 0.44433657676 1.02533626556
16 0.449885848165 1.02305006981
17 0.528835172951 1.07214260101
18 0.478556407988 1.03251922131
19 0.420472821593 1.03684675694
20 0.372563917935 0.963238954544
21 0.358176656812 0.955497443676
22 0.363036559522 0.985028326511
23 0.445583252609 0.962744832039
24 0.455928310752 1.08753359318
25 0.416172382236 1.05081915855
26 0.37832800895 0.998453676701
27 0.354268653691 0.986963152885
28 0.306042724103 0.968118786812
29 0.305567560345 1.05970728397
30 0.337134245038 1.03215992451
31 0.338599683344 0.979814827442
32 0.270839555562 0.992795586586
33 0.253199348599 0.983998060226
34 0.278818560392 1.01853787899
35 0.338468214869 1.07594180107
36 0.316424334794 1.00948381424
37 0.232749527693 1.00259590149
38 0.202972974628 0.973881542683
39 0.174210679531 0.994996964931
40 0.200982249528 1.032802701
41 0.261048240215 1.03151905537
42 0.286907170713 0.959962427616
43 0.225340475887 0.961612462997
44 0.175311348587 0.972891807556
45 0.152608346194 0.986933410168
46 0.165972696245 1.00512623787
47 0.2057392627 1.04801428318
48 0.232566271722 1.02196371555
49 0.213893442601 0.967395603657
model number is: 25598
mae: 1.01328047855
corr: 0.614304381589
mult_acc: 0.33382
mult f_score: 0.34971
Confusion Matrix :
[[299 80]
[ 91 216]]
Classification Report :
precision recall f1-score support
False 0.76667 0.78892 0.77763 379
True 0.72973 0.70358 0.71642 307
micro avg 0.75073 0.75073 0.75073 686
macro avg 0.74820 0.74625 0.74703 686
weighted avg 0.75014 0.75073 0.75024 686
Accuracy 0.750728862974
[{'input_dims': [300, 5, 20], 'batchsize': 128, 'memsize': 64, 'windowsize': 2, 'lr': 0.005, 'num_epochs': 50, 'h_dims': [88, 80, 80], 'momentum': 0.6}, {'shapes': 32, 'drop': 0.7}, {'shapes': 64, 'drop': 0.7}, {'shapes': 64, 'drop': 0.2}, {'shapes': 64, 'drop': 0.7}, {'shapes': 128, 'drop': 0.7}]
0 1.2807267189 1.19333899021 saving model
1 1.05475064516 1.1196590662 saving model
2 0.936995917559 1.05825543404 saving model
3 0.843654733896 1.02754056454 saving model
4 0.803505522013 1.005012393 saving model
5 0.757863432169 0.969483792782 saving model
6 0.715275335312 0.972550928593
7 0.659426128864 1.01358437538
8 0.623308444023 1.02988040447
9 0.607740849257 0.996052324772
10 0.567657992244 1.06466472149
11 0.565843316913 1.02892625332
12 0.53807605207 1.03966224194
13 0.577088287473 1.09575819969
14 0.635368514061 1.02690637112
15 0.652341908216 1.08557772636
16 0.648724138737 0.965734899044 saving model
17 0.519931548834 0.983175933361
18 0.424372127652 0.992496669292
19 0.42076074183 0.978351831436
20 0.430689549446 0.96852427721
21 0.411444163322 1.02219259739
22 0.449587979913 1.01152515411
23 0.416285517812 0.993945658207
24 0.415795642138 0.987369179726
25 0.405812245607 0.994135141373
26 0.382726383209 0.98836183548
27 0.395287391543 1.00552558899
28 0.409267291427 0.984129548073
29 0.359757933021 1.00896918774
30 0.349185684323 0.98704880476
31 0.319165283442 0.974693715572
32 0.329526284337 0.99141895771
33 0.338640958071 0.961127996445 saving model
34 0.350139009953 0.962502539158
35 0.329174599051 0.982896447182
36 0.342587286234 0.972533643246
37 0.382254648209 0.969799995422
38 0.347594669461 0.968197464943
39 0.35402084589 0.965107500553
40 0.335114195943 0.981969952583
41 0.357780462503 0.977514624596
42 0.316843593121 0.964986383915
43 0.321141451597 0.958160877228 saving model
44 0.328346425295 0.970755875111
45 0.321553090215 0.956878900528 saving model
46 0.314669969678 0.968571305275
47 0.312653547525 0.974496543407
48 0.289487403631 0.959570765495
49 0.285619521141 0.971359908581
model number is: 77455
mae: 1.03822410529
corr: 0.585502507503
mult_acc: 0.28863
mult f_score: 0.29476
Confusion Matrix :
[[292 87]
[105 202]]
Classification Report :
precision recall f1-score support
False 0.73552 0.77045 0.75258 379
True 0.69896 0.65798 0.67785 307
micro avg 0.72012 0.72012 0.72012 686
macro avg 0.71724 0.71421 0.71521 686
weighted avg 0.71916 0.72012 0.71914 686
Accuracy 0.720116618076
[{'input_dims': [300, 5, 20], 'batchsize': 32, 'memsize': 128, 'windowsize': 2, 'lr': 0.002, 'num_epochs': 50, 'h_dims': [64, 64, 64], 'momentum': 0.6}, {'shapes': 64, 'drop': 0.7}, {'shapes': 256, 'drop': 0.0}, {'shapes': 32, 'drop': 0.5}, {'shapes': 256, 'drop': 0.7}, {'shapes': 32, 'drop': 0.5}]
0 1.30528526306 1.35661435127 saving model
1 1.09736158848 1.18212866783 saving model
2 0.964912231266 1.0641272068 saving model
3 0.867004577816 1.0433460474 saving model
4 0.81272174865 0.993264257908 saving model
5 0.751267623901 0.975081503391 saving model
6 0.70446324423 1.024284482
7 0.670565244555 0.990805029869
8 0.638771039993 0.951385259628 saving model
9 0.620575831831 0.97442060709
10 0.590128141642 1.00444328785
11 0.618568243086 1.05250906944
12 0.607334679365 0.946198999882 saving model
13 0.54489890039 0.932319760323 saving model
14 0.514079128951 0.934067249298
15 0.489362117648 0.93948841095
16 0.456774245203 0.966464221478
17 0.44500599429 0.96560305357
18 0.467995204777 0.954566121101
19 0.43077718243 0.996150195599
20 0.413585100323 0.941171824932
21 0.417983846366 0.962643802166
22 0.405093925446 0.948395013809
23 0.40788994655 0.95298486948
24 0.383956772089 0.9615675807
25 0.377090305835 0.9682970047
26 0.387014556676 0.935282766819
27 0.374846290797 0.96232432127
28 0.359996184707 0.965380012989
29 0.368627528846 1.00176167488
30 0.363520473242 0.983727216721
31 0.36901376918 0.967957317829
32 0.372943415493 0.956427931786
33 0.363371903822 0.95380705595
34 0.363739638031 0.965019822121
35 0.363605153188 0.956120014191
36 0.372498907149 0.962764978409
37 0.351981629431 0.952625572681
38 0.385077441484 0.954496979713
39 0.361978683621 0.926951169968 saving model
40 0.364228960127 0.929142951965
41 0.371439372748 0.954643070698
42 0.372905727103 0.957263529301
43 0.368656228483 0.954585254192
44 0.346184672043 0.898487210274 saving model
45 0.346906692535 0.916549026966
46 0.339888100326 0.939862251282
47 0.329582549632 0.912624180317
48 0.33152673915 0.927028119564
49 0.330986251682 0.917435050011
model number is: 12533
mae: 1.02368158294
corr: 0.583607882596
mult_acc: 0.33236
mult f_score: 0.34148
Confusion Matrix :
[[304  75]
 [106 201]]
Classification Report :
              precision    recall  f1-score   support

       False    0.74146   0.80211   0.77060       379
        True    0.72826   0.65472   0.68954       307

   micro avg    0.73615   0.73615   0.73615       686
   macro avg    0.73486   0.72842   0.73007       686
weighted avg    0.73555   0.73615   0.73432       686

Accuracy 0.736151603499
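Each run logs one line per epoch — index, training loss, validation loss — and appends "saving model" whenever the validation loss beats the best value seen so far. A framework-agnostic sketch of that checkpointing pattern (all callables are stand-ins, not the original training code):

```python
import math

def train_with_checkpointing(num_epochs, train_epoch, validate, save_model):
    # Print "<epoch> <train_loss> <val_loss>[ saving model]" per epoch,
    # checkpointing only when validation loss sets a new best.
    best_val = math.inf
    for epoch in range(num_epochs):
        train_loss = train_epoch()
        val_loss = validate()
        line = f"{epoch} {train_loss} {val_loss}"
        if val_loss < best_val:
            best_val = val_loss
            save_model()
            line += " saving model"
        print(line)

# Toy stand-ins echoing the first epochs of the run above:
train_losses = iter([1.305, 1.097, 0.965, 0.867, 0.813])
val_losses = iter([1.357, 1.182, 1.064, 1.043, 0.993])
train_with_checkpointing(5, lambda: next(train_losses),
                         lambda: next(val_losses), lambda: None)
```

Note that only the checkpoint improves monotonically; the raw validation loss can (and in these logs does) bounce around after the best epoch.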
[{'input_dims': [300, 5, 20], 'batchsize': 32, 'memsize': 256, 'windowsize': 2, 'lr': 0.01, 'num_epochs': 50, 'h_dims': [88, 48, 8], 'momentum': 0.3}, {'shapes': 64, 'drop': 0.5}, {'shapes': 128, 'drop': 0.0}, {'shapes': 32, 'drop': 0.5}, {'shapes': 64, 'drop': 0.2}, {'shapes': 256, 'drop': 0.0}]
0 1.15372007489 1.1093262434 saving model
1 0.878500127792 1.04449188709 saving model
2 0.734449157119 0.953779876232 saving model
3 0.690536361933 0.978826105595
4 0.609177104384 0.964240074158
5 0.567343952507 1.00213909149
6 0.592020208389 1.05257368088
7 0.556402145326 1.02995574474
8 0.486210630834 0.931168973446 saving model
9 0.476640831679 0.952557682991
10 0.422777741402 0.917669892311 saving model
11 0.427552551031 0.970193505287
12 0.370733515173 0.934316039085
13 0.347167679667 0.911393463612 saving model
14 0.33708104454 0.916469871998
15 0.318966430426 0.93387144804
16 0.335811078176 0.912075519562
17 0.337686568126 0.922506868839
18 0.301330160722 0.957137584686
19 0.34839890711 1.00669538975
20 0.40964416191 0.943315804005
21 0.402816987783 0.987601578236
22 0.346135897934 0.922018766403
23 0.265606807545 0.896471560001 saving model
24 0.251869171113 0.927938401699
25 0.255485020205 0.92230784893
26 0.289422014728 0.956148505211
27 0.23572957553 0.929834365845
28 0.198201682046 0.94172167778
29 0.211320913956 1.0041372776
30 0.215385807492 0.926240205765
31 0.252678003162 0.93217343092
32 0.252995826304 0.921528816223
33 0.336492915079 0.965805530548
34 0.255896919221 0.948213934898
35 0.246557623148 0.950480163097
36 0.246839248016 0.927297353745
37 0.230733636767 0.899312257767
38 0.265279337391 0.937791705132
39 0.265070659295 0.965108573437
40 0.228686385602 0.957369208336
41 0.233420318365 0.938922822475
42 0.243757121265 0.960972189903
43 0.266108799353 0.958786964417
44 0.24214406535 0.975419282913
45 0.217152747512 1.00181388855
46 0.23024459295 0.983658432961
47 0.254374688677 0.996871471405
48 0.247983846813 0.958216786385
49 0.257991049811 1.01108264923
model number is: 48557
mae: 1.05948548016
corr: 0.586536233846
mult_acc: 0.31341
mult f_score: 0.32189
Confusion Matrix :
[[262 117]
 [ 79 228]]
Classification Report :
              precision    recall  f1-score   support

       False    0.76833   0.69129   0.72778       379
        True    0.66087   0.74267   0.69939       307

   micro avg    0.71429   0.71429   0.71429       686
   macro avg    0.71460   0.71698   0.71358       686
weighted avg    0.72024   0.71429   0.71507       686

Accuracy 0.714285714286
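Besides the classification scores, each evaluation block reports regression-style metrics (mae, corr) over real-valued predictions. A hedged sketch of how those two numbers are conventionally computed — mean absolute error and Pearson correlation; the function name is illustrative, not the original code:

```python
import math

def regression_metrics(preds, labels):
    # Mean absolute error and Pearson correlation between two sequences.
    n = len(preds)
    mae = sum(abs(p - l) for p, l in zip(preds, labels)) / n
    mean_p = sum(preds) / n
    mean_l = sum(labels) / n
    cov = sum((p - mean_p) * (l - mean_l) for p, l in zip(preds, labels))
    std_p = math.sqrt(sum((p - mean_p) ** 2 for p in preds))
    std_l = math.sqrt(sum((l - mean_l) ** 2 for l in labels))
    corr = cov / (std_p * std_l)
    return mae, corr

# Identical sequences give the degenerate best case: mae 0.0, corr 1.0.
mae, corr = regression_metrics([0.0, 1.0, 2.0], [0.0, 1.0, 2.0])
print(mae, corr)
```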
[{'input_dims': [300, 5, 20], 'batchsize': 128, 'memsize': 256, 'windowsize': 2, 'lr': 0.01, 'num_epochs': 50, 'h_dims': [256, 16, 48], 'momentum': 0.3}, {'shapes': 64, 'drop': 0.7}, {'shapes': 128, 'drop': 0.5}, {'shapes': 64, 'drop': 0.7}, {'shapes': 128, 'drop': 0.5}, {'shapes': 32, 'drop': 0.5}]
0 1.33397223949 1.33398854733 saving model
1 1.14889857769 1.14614975452 saving model
2 1.04641467333 1.19441831112
3 0.961173981428 1.15943312645
4 0.938265585899 1.04266619682 saving model
5 0.85112683773 1.04396522045
6 0.775751954317 1.08911550045
7 0.768182629347 1.0384105444 saving model
8 0.73183863163 1.08350861073
9 0.693971771002 0.985091984272 saving model
10 0.678410243988 1.07318437099
11 0.632065683603 0.969934165478 saving model
12 0.603920161724 0.96910905838 saving model
13 0.552716913819 1.03270506859
14 0.529972809553 0.989215910435
15 0.504046201706 0.994191348553
16 0.448705068231 0.982539415359
17 0.458793643117 1.00592315197
18 0.45956735909 1.05279922485
19 0.509605458379 1.05901288986
20 0.537258264422 1.01151609421
21 0.478829142451 0.9690990448 saving model
22 0.429185321927 0.996144175529
23 0.459869742393 0.999540328979
24 0.458896544576 0.988655865192
25 0.451561015844 1.00249707699
26 0.450234872103 0.942046165466 saving model
27 0.443444126844 0.972127974033
28 0.423837864399 1.03487026691
29 0.418613377213 0.997856378555
30 0.406501209736 0.97724634409
31 0.395152819157 0.968478322029
32 0.385170531273 0.979501783848
33 0.368376910686 0.999839544296
34 0.370676368475 0.977396070957
35 0.363667652011 0.953391253948
36 0.369622066617 0.964338421822
37 0.356539869308 0.979718744755
38 0.344340467453 0.97460603714
39 0.361122071743 0.966361343861
40 0.365310895443 0.95636677742
41 0.360359412432 0.971079111099
42 0.333662751317 0.982778906822
43 0.355677768588 0.967790782452
44 0.360948315263 0.97087687254
45 0.367154029012 0.985957920551
46 0.369377592206 0.996697187424
47 0.371431314945 0.951278626919
48 0.359156775475 0.955990970135
49 0.364282459021 0.969574391842
model number is: 82605
mae: 1.03376588749
corr: 0.607441954289
mult_acc: 0.34985
mult f_score: 0.35806
Confusion Matrix :
[[268 111]
 [ 76 231]]
Classification Report :
              precision    recall  f1-score   support

       False    0.77907   0.70712   0.74136       379
        True    0.67544   0.75244   0.71186       307

   micro avg    0.72741   0.72741   0.72741       686
   macro avg    0.72725   0.72978   0.72661       686
weighted avg    0.73269   0.72741   0.72816       686
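The bracketed line opening each run is a sampled hyperparameter configuration: one dict of network-level settings followed by five {'shapes', 'drop'} dicts, plus a random model number used to tag the checkpoint. A sketch of how such configs could be drawn in a random search; the candidate value lists below are assumptions inferred only from the values that appear in this log, not the original search space:

```python
import random

def sample_config():
    # Random model id, e.g. "model number is: 12533" in the log above.
    model_number = random.randint(0, 99999)
    # Network-level settings; candidate lists are assumptions from the log.
    base = {
        'input_dims': [300, 5, 20],          # fixed across all runs shown
        'batchsize': random.choice([32, 64, 128]),
        'memsize': random.choice([64, 128, 256]),
        'windowsize': 2,
        'lr': random.choice([0.002, 0.005, 0.01]),
        'num_epochs': 50,
        'h_dims': [random.choice([8, 16, 32, 48, 64, 88, 256])
                   for _ in range(3)],
        'momentum': random.choice([0.3, 0.6, 0.9]),
    }
    # One {'shapes', 'drop'} dict per sub-network, as printed in the log.
    layers = [{'shapes': random.choice([32, 64, 128, 256]),
               'drop': random.choice([0.0, 0.2, 0.5, 0.7])}
              for _ in range(5)]
    return model_number, [base] + layers

model_number, config = sample_config()
print('model number is:', model_number)
print(config)
```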