Commit a82238d7 authored by Florian RICHOUX

Results of LSTM 64x2 with embedding


Former-commit-id: 1e773cec
parent f86ecc43
File lstm64x2_embed2_10dense_shared_2019-01-04_01:51_gpu-3-1_nadam_0.002_1024_300_mirror-double.txt
lstm64x2_embed2_10dense_shared, epochs=300, batch=1024, optimizer=nadam, learning rate=0.002, patience=30
Number of training samples: 91036
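
The run name and header above encode the setup. Below is a minimal Keras sketch consistent with them, not the author's confirmed code: the stacked 64-unit LSTMs ("lstm64x2"), the 2-dimensional embedding ("embed2"), and the 10-unit dense layer ("10dense_shared") are read off the name, while the siamese weight sharing, vocabulary size, and sigmoid output head are assumptions.

    # Hypothetical reconstruction from the run name; layer sizes come from the
    # name, but the shared wiring, vocabulary size, and output head are guesses.
    from tensorflow.keras import layers, models, optimizers, callbacks

    encoder = models.Sequential([                     # shared between both inputs
        layers.Embedding(input_dim=21, output_dim=2,  # "embed2"; 21 = placeholder vocab
                         mask_zero=True),
        layers.LSTM(64, return_sequences=True),       # "lstm64x2": two stacked
        layers.LSTM(64),                              # 64-unit LSTMs
    ])

    seq_a = layers.Input(shape=(None,), name="seq_a")
    seq_b = layers.Input(shape=(None,), name="seq_b")
    merged = layers.concatenate([encoder(seq_a), encoder(seq_b)])
    hidden = layers.Dense(10, activation="relu")(merged)   # "10dense_shared"
    output = layers.Dense(1, activation="sigmoid")(hidden) # binary interaction label
    model = models.Model([seq_a, seq_b], output)

    # Settings from the header: nadam, learning rate 0.002, batch 1024,
    # up to 300 epochs, early stopping with patience 30.
    model.compile(optimizer=optimizers.Nadam(learning_rate=0.002),
                  loss="binary_crossentropy", metrics=["accuracy"])
    stopper = callbacks.EarlyStopping(monitor="val_loss", patience=30)
    # model.fit([x_a, x_b], y, validation_data=val_data,
    #           batch_size=1024, epochs=300, callbacks=[stopper])

The patience setting matches the log below: the best val_loss (0.5533) occurs at epoch 45, and training stops exactly 30 epochs later, at epoch 75.
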
Loss
0: train_loss=0.678187608263175, val_loss=0.7100322531468007
1: train_loss=0.6420174912092917, val_loss=8.165605250385576
2: train_loss=0.6350501866095294, val_loss=2.5848071967127533
3: train_loss=0.6303801848057772, val_loss=2.2299520731086675
4: train_loss=0.6244979725842987, val_loss=0.8120992195695834
5: train_loss=0.6281464353555367, val_loss=1.480194720881816
6: train_loss=0.6217974651346433, val_loss=1.1136083887726445
7: train_loss=0.6133177559946379, val_loss=1.0791225719123998
8: train_loss=0.6217137932976445, val_loss=2.1856977115263847
9: train_loss=0.6213251265046823, val_loss=3.3113820350362686
10: train_loss=0.5966634378737444, val_loss=3.0074136339910846
11: train_loss=0.5696919267869718, val_loss=1.6896886386509877
12: train_loss=0.5524838493763812, val_loss=0.7809302459025448
13: train_loss=0.5432884744398571, val_loss=0.6496019182768742
14: train_loss=0.5384781511582402, val_loss=0.8777159351302055
15: train_loss=0.5344697514204261, val_loss=0.6263262430096024
16: train_loss=0.530622021854644, val_loss=0.6914912923019485
17: train_loss=0.5269392009223302, val_loss=0.8179908771048198
18: train_loss=0.5246826755706673, val_loss=0.9964464884194759
19: train_loss=0.5209795673009072, val_loss=0.9800144475344827
20: train_loss=0.5169759778954651, val_loss=1.3981814163886088
21: train_loss=0.513856523306918, val_loss=0.8617835231214719
22: train_loss=0.5125634521737009, val_loss=1.2136717205826957
23: train_loss=0.511940888855892, val_loss=0.7197906134148778
24: train_loss=0.5108859913803486, val_loss=0.8335295929329959
25: train_loss=0.5095963495425817, val_loss=0.6847784764744197
26: train_loss=0.5083522535024112, val_loss=0.5866336463242364
27: train_loss=0.5083396786063473, val_loss=0.6088206571196625
28: train_loss=0.5078497148444276, val_loss=0.5880978014609078
29: train_loss=0.507601980514259, val_loss=0.5658003224881195
30: train_loss=0.5075878656844409, val_loss=0.5635741106819081
31: train_loss=0.5072516388531503, val_loss=0.5651155376594467
32: train_loss=0.5073124656144448, val_loss=0.5655948351654609
33: train_loss=0.5061343099290612, val_loss=0.5611604180085112
34: train_loss=0.5062420528002807, val_loss=0.5614594996842922
35: train_loss=0.5063789211843118, val_loss=0.5707733638187837
36: train_loss=0.5058875566399094, val_loss=0.5734536708421217
37: train_loss=0.5053890106406581, val_loss=0.5657328923625906
38: train_loss=0.5053332818324968, val_loss=0.5572325489567925
39: train_loss=0.5052067318612774, val_loss=0.5639569687305709
40: train_loss=0.5045486779648535, val_loss=0.5581000296665004
41: train_loss=0.5045645405851876, val_loss=0.5758668158543543
42: train_loss=0.5042919543590962, val_loss=0.5620660869556294
43: train_loss=0.5042147089197417, val_loss=0.5632405928205761
44: train_loss=0.503868755247614, val_loss=0.5540575373655088
45: train_loss=0.5034024556820074, val_loss=0.5532843840898142
46: train_loss=0.5034768473646847, val_loss=0.5554801587312674
47: train_loss=0.5033288764704289, val_loss=0.5533286467106384
48: train_loss=0.5036262254172529, val_loss=0.5581589747329302
49: train_loss=0.5037137646788039, val_loss=0.5580914606385359
50: train_loss=0.5033506504892439, val_loss=0.5548973332762279
51: train_loss=0.5031532858015075, val_loss=0.5580137422918683
52: train_loss=0.5033127332703762, val_loss=0.5592585078968003
53: train_loss=0.5029460429998602, val_loss=0.5620977821854158
54: train_loss=0.5029174580000066, val_loss=0.5636187539603182
55: train_loss=0.5026502826678019, val_loss=0.5602635574134925
56: train_loss=0.5031228275831556, val_loss=0.5631044133175054
57: train_loss=0.502952619553597, val_loss=0.5640812184302383
58: train_loss=0.5029578145243612, val_loss=0.5647758897277768
59: train_loss=0.5026421771353716, val_loss=0.5640241136917701
60: train_loss=0.5028209519124921, val_loss=0.5643548339476647
61: train_loss=0.5026618510745281, val_loss=0.565044940661835
62: train_loss=0.5027826916991295, val_loss=0.5646103805643872
63: train_loss=0.5029131499570422, val_loss=0.5652002546571072
64: train_loss=0.5027409062071704, val_loss=0.5648030925231725
65: train_loss=0.5027550844417273, val_loss=0.5652537041714758
66: train_loss=0.502816259473388, val_loss=0.5654108834499247
67: train_loss=0.5027192001251843, val_loss=0.5653814453135371
68: train_loss=0.5026934410727052, val_loss=0.5651236342350502
69: train_loss=0.5031534058645742, val_loss=0.5652830414479968
70: train_loss=0.5027812176541865, val_loss=0.5653365751981316
71: train_loss=0.5027708851540517, val_loss=0.5651599029035888
72: train_loss=0.5026811817101536, val_loss=0.5654528099652121
73: train_loss=0.5024345408891184, val_loss=0.5653692043515829
74: train_loss=0.5025166860232899, val_loss=0.5654589356412206
75: train_loss=0.5027799930182392, val_loss=0.5652324038373171
///////////////////////////////////////////
Accuracy
0: train_acc=0.604903554724151, val_acc=0.5457380455474024
1: train_acc=0.6429434511467622, val_acc=0.4872861025801625
2: train_acc=0.6515005054517456, val_acc=0.5174316326923301
3: train_acc=0.6536864538651379, val_acc=0.5127138969003344
4: train_acc=0.6591128784060096, val_acc=0.5238285621661826
5: train_acc=0.6566303442937585, val_acc=0.5127138969003344
6: train_acc=0.6611011027713626, val_acc=0.5060770825712124
7: train_acc=0.6678786411625445, val_acc=0.5251079480609531
8: train_acc=0.6608484556608017, val_acc=0.5127138969003344
9: train_acc=0.6599477131917023, val_acc=0.5127138969003344
10: train_acc=0.6799617732727201, val_acc=0.5127138969003344
11: train_acc=0.6988883517665202, val_acc=0.5127138969003344
12: train_acc=0.7100048329673138, val_acc=0.5136734363214123
13: train_acc=0.7187376424268865, val_acc=0.6249800100720025
14: train_acc=0.7220769805142213, val_acc=0.5127938585187576
15: train_acc=0.7253504106588735, val_acc=0.635534942722046
16: train_acc=0.7293158748210736, val_acc=0.5415800413893982
17: train_acc=0.7316116700613874, val_acc=0.5126339352819113
18: train_acc=0.7333362626010125, val_acc=0.5127138969003344
19: train_acc=0.7364449226205161, val_acc=0.5127138969003344
20: train_acc=0.73901533476314, val_acc=0.49176395321185923
21: train_acc=0.7412452216495963, val_acc=0.5069566610172884
22: train_acc=0.7424205809660078, val_acc=0.4935231088171687
23: train_acc=0.7421239949528753, val_acc=0.5528546297109815
24: train_acc=0.7431785224941118, val_acc=0.5352630737818052
25: train_acc=0.7442220658753982, val_acc=0.5810011198820733
26: train_acc=0.7448152379802316, val_acc=0.6933471935092402
27: train_acc=0.7444307744945623, val_acc=0.665040779920191
28: train_acc=0.7452326551164211, val_acc=0.6883895725569458
29: train_acc=0.7463750606094619, val_acc=0.7125379822358272
30: train_acc=0.7459137044214208, val_acc=0.7134975210468464
31: train_acc=0.7460345356937578, val_acc=0.7124580197023159
32: train_acc=0.7460674895668885, val_acc=0.7081400923074654
33: train_acc=0.7466496772682097, val_acc=0.7193347196778773
34: train_acc=0.746814447155034, val_acc=0.7143770988495011
35: train_acc=0.746495892025841, val_acc=0.7039820882066543
36: train_acc=0.7469792173011341, val_acc=0.7048616664382567
37: train_acc=0.7470341404134158, val_acc=0.7086998236364276
38: train_acc=0.7462322598151275, val_acc=0.7198944503967807
39: train_acc=0.7480117758048785, val_acc=0.7115784423858018
40: train_acc=0.7485610088290521, val_acc=0.7184551416941115
41: train_acc=0.748011775584887, val_acc=0.7002238923886023
42: train_acc=0.7484621466753945, val_acc=0.7103790182333726
43: train_acc=0.7485170699029099, val_acc=0.7102190943864676
44: train_acc=0.7486598707967643, val_acc=0.7198144885972464
45: train_acc=0.7479458674484022, val_acc=0.7225331841097743
46: train_acc=0.748901533752613, val_acc=0.7193347189439004
47: train_acc=0.7488685795128296, val_acc=0.7211738361676331
48: train_acc=0.7480996530548047, val_acc=0.7138173680066795
49: train_acc=0.7488905487441238, val_acc=0.7154165996411658
50: train_acc=0.7490882725040786, val_acc=0.7193347196778773
51: train_acc=0.7492640271977332, val_acc=0.715016792283027
52: train_acc=0.749231073397933, val_acc=0.7138173680066795
53: train_acc=0.7493848587136324, val_acc=0.7124580204934858
54: train_acc=0.7494837206263466, val_acc=0.7122980973805577
55: train_acc=0.7495386438355295, val_acc=0.7146169838858818
56: train_acc=0.7482424533882043, val_acc=0.7124580203123747
57: train_acc=0.7485170702669437, val_acc=0.7116584035180844
58: train_acc=0.7494727358430869, val_acc=0.7114984802812381
59: train_acc=0.7501208314923286, val_acc=0.7120582125252882
60: train_acc=0.7495606131663436, val_acc=0.7114984802812381
61: train_acc=0.7498681839653549, val_acc=0.7121381739626003
62: train_acc=0.7494397822344699, val_acc=0.7122980964654696
63: train_acc=0.7494397823549416, val_acc=0.7114984802812381
64: train_acc=0.7502965861126526, val_acc=0.7117383658704844
65: train_acc=0.7491981194017118, val_acc=0.711498481015215
66: train_acc=0.7496594752047675, val_acc=0.7117383651365076
67: train_acc=0.7494068281884887, val_acc=0.7113385577783686
68: train_acc=0.749099257363288, val_acc=0.7123780580838928
69: train_acc=0.7490113799693195, val_acc=0.7114984802812381
70: train_acc=0.7490443338660209, val_acc=0.7120582116102002
71: train_acc=0.7496704600142167, val_acc=0.7118982883733539
72: train_acc=0.7495056903211946, val_acc=0.7109387492573054
73: train_acc=0.7497912915274969, val_acc=0.7114984802812381
74: train_acc=0.749439782231851, val_acc=0.711498481015215
75: train_acc=0.7494507669941592, val_acc=0.7116584042520613
///////////////////////////////////////////
Validation metrics
Number of 0 predicted: 7718
Number of 1 predicted: 4788
Validation precision: 0.596980636691828
Validation recall: 0.7598162071846283
Validation F1-score: 0.668627090608344
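
These figures are internally consistent. Reading the two "Number of ... predicted" lines as ground-truth class counts in the validation set (the interpretation under which everything matches), a quick sketch recovers the rest; the confusion-matrix counts are inferred, not printed in the log:

    n0, n1 = 7718, 4788                 # class counts in the validation set
    precision, recall = 0.596980636691828, 0.7598162071846283

    tp = round(recall * n1)             # 3638 true positives
    fp = round(tp / precision) - tp     # 2456 false positives
    fn = n1 - tp                        # 1150 false negatives
    tn = n0 - fp                        # 5262 true negatives

    f1 = 2 * precision * recall / (precision + recall)
    print(f1)                           # ~0.66863, matches the reported F1-score
    print((tp + tn) / (n0 + n1))        # ~0.71166, matches the final val_acc above

The recovered accuracy, 8900/12506 ≈ 0.7117, agrees with the last val_acc entry, which supports reading those two counts as label totals rather than model predictions.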