Commit 587f7cfb authored by Florian RICHOUX

LSTM 64x2 Conv results


Former-commit-id: 49945fcf
parent 69fa85b0
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
protein1 (InputLayer)           (None, None, 20)     0
__________________________________________________________________________________________________
protein2 (InputLayer)           (None, None, 20)     0
__________________________________________________________________________________________________
conv1d_1 (Conv1D)               (None, None, 5)      2005        protein1[0][0]
                                                                 protein2[0][0]
__________________________________________________________________________________________________
max_pooling1d_1 (MaxPooling1D)  (None, None, 5)      0           conv1d_1[0][0]
                                                                 conv1d_1[1][0]
__________________________________________________________________________________________________
batch_normalization_1 (BatchNor (None, None, 5)      20          max_pooling1d_1[0][0]
                                                                 max_pooling1d_1[1][0]
__________________________________________________________________________________________________
conv1d_2 (Conv1D)               (None, None, 5)      505         batch_normalization_1[0][0]
                                                                 batch_normalization_1[1][0]
__________________________________________________________________________________________________
max_pooling1d_2 (MaxPooling1D)  (None, None, 5)      0           conv1d_2[0][0]
                                                                 conv1d_2[1][0]
__________________________________________________________________________________________________
batch_normalization_2 (BatchNor (None, None, 5)      20          max_pooling1d_2[0][0]
                                                                 max_pooling1d_2[1][0]
__________________________________________________________________________________________________
conv1d_3 (Conv1D)               (None, None, 5)      505         batch_normalization_2[0][0]
                                                                 batch_normalization_2[1][0]
__________________________________________________________________________________________________
max_pooling1d_3 (MaxPooling1D)  (None, None, 5)      0           conv1d_3[0][0]
                                                                 conv1d_3[1][0]
__________________________________________________________________________________________________
batch_normalization_3 (BatchNor (None, None, 5)      20          max_pooling1d_3[0][0]
                                                                 max_pooling1d_3[1][0]
__________________________________________________________________________________________________
lstm_1 (LSTM)                   (None, None, 64)     17920       batch_normalization_3[0][0]
                                                                 batch_normalization_3[1][0]
__________________________________________________________________________________________________
lstm_2 (LSTM)                   (None, 64)           33024       lstm_1[0][0]
                                                                 lstm_1[1][0]
__________________________________________________________________________________________________
concatenate_1 (Concatenate)     (None, 128)          0           lstm_2[0][0]
                                                                 lstm_2[1][0]
__________________________________________________________________________________________________
dense_1 (Dense)                 (None, 100)          12900       concatenate_1[0][0]
__________________________________________________________________________________________________
batch_normalization_4 (BatchNor (None, 100)          400         dense_1[0][0]
__________________________________________________________________________________________________
dense_2 (Dense)                 (None, 100)          10100       batch_normalization_4[0][0]
__________________________________________________________________________________________________
batch_normalization_5 (BatchNor (None, 100)          400         dense_2[0][0]
__________________________________________________________________________________________________
dense_3 (Dense)                 (None, 50)           5050        batch_normalization_5[0][0]
__________________________________________________________________________________________________
batch_normalization_6 (BatchNor (None, 50)           200         dense_3[0][0]
__________________________________________________________________________________________________
dense_4 (Dense)                 (None, 50)           2550        batch_normalization_6[0][0]
__________________________________________________________________________________________________
batch_normalization_7 (BatchNor (None, 50)           200         dense_4[0][0]
__________________________________________________________________________________________________
dense_5 (Dense)                 (None, 50)           2550        batch_normalization_7[0][0]
__________________________________________________________________________________________________
batch_normalization_8 (BatchNor (None, 50)           200         dense_5[0][0]
__________________________________________________________________________________________________
dense_6 (Dense)                 (None, 25)           1275        batch_normalization_8[0][0]
__________________________________________________________________________________________________
batch_normalization_9 (BatchNor (None, 25)           100         dense_6[0][0]
__________________________________________________________________________________________________
dense_7 (Dense)                 (None, 25)           650         batch_normalization_9[0][0]
__________________________________________________________________________________________________
batch_normalization_10 (BatchNo (None, 25)           100         dense_7[0][0]
__________________________________________________________________________________________________
dense_8 (Dense)                 (None, 25)           650         batch_normalization_10[0][0]
__________________________________________________________________________________________________
batch_normalization_11 (BatchNo (None, 25)           100         dense_8[0][0]
__________________________________________________________________________________________________
dense_9 (Dense)                 (None, 25)           650         batch_normalization_11[0][0]
__________________________________________________________________________________________________
batch_normalization_12 (BatchNo (None, 25)           100         dense_9[0][0]
__________________________________________________________________________________________________
dense_10 (Dense)                (None, 1)            26          batch_normalization_12[0][0]
__________________________________________________________________________________________________
activation_1 (Activation)       (None, 1)            0           dense_10[0][0]
==================================================================================================
Total params: 92,220
Trainable params: 91,290
Non-trainable params: 930
__________________________________________________________________________________________________
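
The first summary is the convolutional variant: both proteins pass through the same three Conv1D/MaxPooling1D/BatchNormalization blocks and a two-layer LSTM (64 units each, hence "LSTM 64x2"), and the two 64-dimensional codes are concatenated and fed to ten dense layers ending in a sigmoid. A minimal Keras sketch consistent with that summary follows; the kernel size of 20 is inferred from the 2005 and 505 parameter counts (20 x 20 x 5 + 5 and 5 x 20 x 5 + 5), while the pool size, padding and ReLU activations are assumptions the log does not pin down.

from keras.layers import (Input, Conv1D, MaxPooling1D, BatchNormalization,
                          LSTM, Dense, Concatenate, Activation)
from keras.models import Model
from keras.optimizers import Nadam

protein1 = Input(shape=(None, 20), name='protein1')  # one-hot residues
protein2 = Input(shape=(None, 20), name='protein2')

# Shared layers: created once and applied to both inputs, which is why each
# layer in the summary lists two inbound nodes ([0][0] and [1][0]).
# Pool size 3 is an assumption; it does not affect parameter counts.
blocks = [(Conv1D(5, 20, padding='same', activation='relu'),
           MaxPooling1D(3),
           BatchNormalization()) for _ in range(3)]
lstm_1 = LSTM(64, return_sequences=True)
lstm_2 = LSTM(64)

def encode(x):
    for conv, pool, bn in blocks:
        x = bn(pool(conv(x)))
    return lstm_2(lstm_1(x))

x = Concatenate()([encode(protein1), encode(protein2)])
for units in (100, 100, 50, 50, 50, 25, 25, 25, 25):
    x = BatchNormalization()(Dense(units, activation='relu')(x))
output = Activation('sigmoid')(Dense(1)(x))

model = Model(inputs=[protein1, protein2], outputs=output)
model.compile(optimizer=Nadam(lr=0.002),  # learning_rate= in newer Keras
              loss='binary_crossentropy', metrics=['accuracy'])

Because the shared weights are counted once, this reconstruction reproduces the reported counts, e.g. 4 x ((5 + 64) x 64 + 64) = 17,920 for lstm_1 and 92,220 parameters in total.
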
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to
==================================================================================================
protein1 (InputLayer)           (None, None)         0
__________________________________________________________________________________________________
protein2 (InputLayer)           (None, None)         0
__________________________________________________________________________________________________
embedding_1 (Embedding)         (None, None, 2)      42          protein1[0][0]
                                                                 protein2[0][0]
__________________________________________________________________________________________________
lstm_1 (LSTM)                   (None, None, 64)     17152       embedding_1[0][0]
                                                                 embedding_1[1][0]
__________________________________________________________________________________________________
lstm_2 (LSTM)                   (None, 64)           33024       lstm_1[0][0]
                                                                 lstm_1[1][0]
__________________________________________________________________________________________________
concatenate_1 (Concatenate)     (None, 128)          0           lstm_2[0][0]
                                                                 lstm_2[1][0]
__________________________________________________________________________________________________
dense_1 (Dense)                 (None, 100)          12900       concatenate_1[0][0]
__________________________________________________________________________________________________
batch_normalization_1 (BatchNor (None, 100)          400         dense_1[0][0]
__________________________________________________________________________________________________
dense_2 (Dense)                 (None, 100)          10100       batch_normalization_1[0][0]
__________________________________________________________________________________________________
batch_normalization_2 (BatchNor (None, 100)          400         dense_2[0][0]
__________________________________________________________________________________________________
dense_3 (Dense)                 (None, 50)           5050        batch_normalization_2[0][0]
__________________________________________________________________________________________________
batch_normalization_3 (BatchNor (None, 50)           200         dense_3[0][0]
__________________________________________________________________________________________________
dense_4 (Dense)                 (None, 50)           2550        batch_normalization_3[0][0]
__________________________________________________________________________________________________
batch_normalization_4 (BatchNor (None, 50)           200         dense_4[0][0]
__________________________________________________________________________________________________
dense_5 (Dense)                 (None, 50)           2550        batch_normalization_4[0][0]
__________________________________________________________________________________________________
batch_normalization_5 (BatchNor (None, 50)           200         dense_5[0][0]
__________________________________________________________________________________________________
dense_6 (Dense)                 (None, 25)           1275        batch_normalization_5[0][0]
__________________________________________________________________________________________________
batch_normalization_6 (BatchNor (None, 25)           100         dense_6[0][0]
__________________________________________________________________________________________________
dense_7 (Dense)                 (None, 25)           650         batch_normalization_6[0][0]
__________________________________________________________________________________________________
batch_normalization_7 (BatchNor (None, 25)           100         dense_7[0][0]
__________________________________________________________________________________________________
dense_8 (Dense)                 (None, 25)           650         batch_normalization_7[0][0]
__________________________________________________________________________________________________
batch_normalization_8 (BatchNor (None, 25)           100         dense_8[0][0]
__________________________________________________________________________________________________
dense_9 (Dense)                 (None, 25)           650         batch_normalization_8[0][0]
__________________________________________________________________________________________________
batch_normalization_9 (BatchNor (None, 25)           100         dense_9[0][0]
__________________________________________________________________________________________________
dense_10 (Dense)                (None, 1)            26          batch_normalization_9[0][0]
__________________________________________________________________________________________________
activation_1 (Activation)       (None, 1)            0           dense_10[0][0]
==================================================================================================
Total params: 88,419
Trainable params: 87,519
Non-trainable params: 900
__________________________________________________________________________________________________
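
The second summary replaces the convolutional front end with a learned embedding over integer-encoded sequences: 42 parameters is consistent with a 21-token vocabulary (20 amino acids plus a padding index) embedded in 2 dimensions, and lstm_1 shrinks from 17,920 to 17,152 parameters because its input dimension drops from 5 to 2 (4 x ((2 + 64) x 64 + 64) = 17,152). A sketch of the changed front end only, under the same assumptions as above; the shared LSTM stack and dense head are unchanged.

from keras.layers import Input, Embedding

protein1 = Input(shape=(None,), name='protein1')  # integer-encoded residues
protein2 = Input(shape=(None,), name='protein2')

# One shared embedding applied to both inputs; vocabulary size 21 is an
# assumption consistent with the reported 21 x 2 = 42 parameters.
embedding = Embedding(input_dim=21, output_dim=2)
encoded1 = embedding(protein1)  # (None, None, 2), feeds lstm_1 as before
encoded2 = embedding(protein2)
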
File lstm64x2_3conv3_10dense_shared_2019-01-03_15:14_gpu-0-1_nadam_0.002_1024_300_mirror-double.txt
lstm64x2_3conv3_10dense_shared, epochs=300, batch=1024, optimizer=nadam, learning rate=0.002, patience=10
Number of training samples: 91036
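
The header line above translates into a training call along these lines. This is a hedged sketch: the data array names are hypothetical placeholders, and "patience=10" is read as Keras early stopping on the validation loss, which would explain the log below stopping at epoch 70 of a possible 300.

from keras.callbacks import EarlyStopping

# Configuration from the header: nadam, learning rate 0.002, batch 1024,
# up to 300 epochs, patience 10. X1_train, X2_train, y_train and the
# validation arrays are placeholder names; the data pipeline is not part
# of this log.
history = model.fit(
    [X1_train, X2_train], y_train,
    validation_data=([X1_val, X2_val], y_val),
    epochs=300,
    batch_size=1024,
    callbacks=[EarlyStopping(monitor='val_loss', patience=10)])
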
Loss
0: train_loss=0.6689475175628208, val_loss=0.7756835646425726
1: train_loss=0.5957034344496954, val_loss=0.9367693016553105
2: train_loss=0.5393137184506828, val_loss=0.9807872740951287
3: train_loss=0.5012083864100992, val_loss=0.9248648918706399
4: train_loss=0.4564182418182291, val_loss=1.4801183996554395
5: train_loss=0.42150143880726576, val_loss=1.1046808100730423
6: train_loss=0.35621877298596466, val_loss=1.1923347382893967
7: train_loss=0.33624198689832224, val_loss=0.8194037386556253
8: train_loss=0.33025051925690313, val_loss=0.3843219676781097
9: train_loss=0.3212847441108801, val_loss=0.48838195197585677
10: train_loss=0.31664398434087865, val_loss=0.5721379851181565
11: train_loss=0.3133092903363861, val_loss=0.43419309588369326
12: train_loss=0.3093715993930336, val_loss=0.5032465697602617
13: train_loss=0.3040760250186673, val_loss=0.4293847175935507
14: train_loss=0.28424571967773393, val_loss=0.3597193283261098
15: train_loss=0.277450177157416, val_loss=0.3570289863513295
16: train_loss=0.27556648712684295, val_loss=0.3918584239458096
17: train_loss=0.274342565557922, val_loss=0.3494594827730446
18: train_loss=0.27368622160035405, val_loss=0.3901224383235683
19: train_loss=0.27217139809730523, val_loss=0.3744661969822463
20: train_loss=0.2717643859746916, val_loss=0.36257585088157773
21: train_loss=0.26920915602605827, val_loss=0.3440333956025342
22: train_loss=0.26741530128902685, val_loss=0.34256795940447976
23: train_loss=0.2674633769265346, val_loss=0.367845374937727
24: train_loss=0.2649322823609916, val_loss=0.3628018487029775
25: train_loss=0.26508038037114184, val_loss=0.3531056069410764
26: train_loss=0.2633697093816396, val_loss=0.3525942731967415
27: train_loss=0.263152288411863, val_loss=0.35677218294688917
28: train_loss=0.25435758404768627, val_loss=0.3290056514107054
29: train_loss=0.25401442336739666, val_loss=0.3270651659411887
30: train_loss=0.2529447380813617, val_loss=0.33093891647764534
31: train_loss=0.25242801627567424, val_loss=0.3377196845408918
32: train_loss=0.25180625531654993, val_loss=0.3245943950716446
33: train_loss=0.25191185207027317, val_loss=0.3529798477559247
34: train_loss=0.2506901400582883, val_loss=0.32633887471322265
35: train_loss=0.24857037170940358, val_loss=0.32315102643763066
36: train_loss=0.24775793912346938, val_loss=0.3296353614379859
37: train_loss=0.25077970059403343, val_loss=0.3250975280537827
38: train_loss=0.24939528719167559, val_loss=0.32967238659365317
39: train_loss=0.24878521321805522, val_loss=0.32623647104048564
40: train_loss=0.24879967969995667, val_loss=0.32710295317460114
41: train_loss=0.2435367187229997, val_loss=0.3221253576029706
42: train_loss=0.24538698995360603, val_loss=0.32930147226535855
43: train_loss=0.24493630447930445, val_loss=0.32177136836768
44: train_loss=0.24497797776897018, val_loss=0.32069067401293466
45: train_loss=0.24288998595809438, val_loss=0.3227350735938894
46: train_loss=0.24536697963607285, val_loss=0.3242062871148829
47: train_loss=0.24646701110577907, val_loss=0.3231785449717839
48: train_loss=0.24527328405723225, val_loss=0.3243113466020624
49: train_loss=0.24312083906922893, val_loss=0.3220833022795009
50: train_loss=0.24174044648191675, val_loss=0.3227835233814332
51: train_loss=0.24450768068058953, val_loss=0.32021480872119007
52: train_loss=0.24472782794282863, val_loss=0.3241213680686292
53: train_loss=0.24197755018658773, val_loss=0.3201329769295882
54: train_loss=0.24306968494701733, val_loss=0.3199564954928736
55: train_loss=0.2431587136876285, val_loss=0.3202206507008228
56: train_loss=0.24221369205368043, val_loss=0.31973569120156675
57: train_loss=0.2420895690736528, val_loss=0.3205823675786974
58: train_loss=0.24272306903845145, val_loss=0.32032068818450143
59: train_loss=0.24142948058568256, val_loss=0.3204219462193624
60: train_loss=0.2413646611237527, val_loss=0.31993452612445883
61: train_loss=0.24051718368456523, val_loss=0.32036418779819426
62: train_loss=0.2414915245791574, val_loss=0.32016228671057134
63: train_loss=0.24187523534765268, val_loss=0.31997374028048475
64: train_loss=0.24153596344681982, val_loss=0.31985995667220574
65: train_loss=0.24240075728586236, val_loss=0.3199698868301913
66: train_loss=0.240088781058, val_loss=0.32025575848954896
67: train_loss=0.2413887285621573, val_loss=0.32012908574144266
68: train_loss=0.24108092167412146, val_loss=0.32017740632045294
69: train_loss=0.2421819413280261, val_loss=0.32013226646853127
70: train_loss=0.24138725977868145, val_loss=0.3201333970694128
///////////////////////////////////////////
Accuracy
0: train_acc=0.604749769599635, val_acc=0.5280665285526687
1: train_acc=0.6827958173065457, val_acc=0.5530945141373035
2: train_acc=0.73236961226518, val_acc=0.4858467935391012
3: train_acc=0.7582824379343198, val_acc=0.5730849189813969
4: train_acc=0.7886001140414801, val_acc=0.5258276032368202
5: train_acc=0.8118656357115447, val_acc=0.5750039978902778
6: train_acc=0.8497297770293265, val_acc=0.577642731298242
7: train_acc=0.8596159760607028, val_acc=0.6410522945171621
8: train_acc=0.8627246363237687, val_acc=0.8353590276190591
9: train_acc=0.868436662406168, val_acc=0.782904205637971
10: train_acc=0.8690847573771021, val_acc=0.741084279088274
11: train_acc=0.8713585836715871, val_acc=0.8086518475613974
12: train_acc=0.8734676391626159, val_acc=0.7709899250457863
13: train_acc=0.8762467596572713, val_acc=0.8219254760957233
14: train_acc=0.8868909004603137, val_acc=0.8514313134749794
15: train_acc=0.890219254055092, val_acc=0.8511914283146805
16: train_acc=0.8900215302218064, val_acc=0.8335199101570223
17: train_acc=0.890724548517157, val_acc=0.8543099313092655
18: train_acc=0.891229843293496, val_acc=0.8325603713460031
19: train_acc=0.8924711104635457, val_acc=0.850711658909171
20: train_acc=0.8921305854221321, val_acc=0.8483927717937882
21: train_acc=0.894525242530638, val_acc=0.8605469378513011
22: train_acc=0.8942506259923618, val_acc=0.8601471298831035
23: train_acc=0.893997978928942, val_acc=0.8449544220776743
24: train_acc=0.8953710619345968, val_acc=0.8484727331071821
25: train_acc=0.8955687858150234, val_acc=0.8537502001042215
26: train_acc=0.8968869459545776, val_acc=0.8558292020593055
27: train_acc=0.8962388506719887, val_acc=0.8538301612365042
28: train_acc=0.9008524102162978, val_acc=0.8672637131887876
29: train_acc=0.9005887778327337, val_acc=0.8707820242754882
30: train_acc=0.9015444439169529, val_acc=0.8683032149622655
31: train_acc=0.9017751216993186, val_acc=0.862386054464975
32: train_acc=0.9021156465888333, val_acc=0.8737406047672038
33: train_acc=0.9022914014789088, val_acc=0.8555093550994723
34: train_acc=0.9025110943287348, val_acc=0.8716616026882017
35: train_acc=0.9042576565867784, val_acc=0.8744602590279829
36: train_acc=0.9038402389163235, val_acc=0.8689429074807032
37: train_acc=0.9023902633968611, val_acc=0.8727810647360672
38: train_acc=0.9028296499398142, val_acc=0.8684631377701643
39: train_acc=0.9031042661638167, val_acc=0.8736606432726989
40: train_acc=0.9037303923801052, val_acc=0.8714217176518211
41: train_acc=0.9059932334775366, val_acc=0.875339837135667
42: train_acc=0.9054110464047629, val_acc=0.8682232529148948
43: train_acc=0.9054440004743147, val_acc=0.8761394530148692
44: train_acc=0.9058943712007884, val_acc=0.8764592997935912
45: train_acc=0.9060920952750172, val_acc=0.8746201818358816
46: train_acc=0.9054989237070681, val_acc=0.8735007200358526
47: train_acc=0.9053890771472793, val_acc=0.8738205657755682
48: train_acc=0.9055318772423544, val_acc=0.8726211422331978
49: train_acc=0.9067511753670554, val_acc=0.8746201822648292
50: train_acc=0.9067731448445306, val_acc=0.873100911333678
51: train_acc=0.905597785284557, val_acc=0.87685910715173
52: train_acc=0.9053561228132136, val_acc=0.8743802978385072
53: train_acc=0.9073773010307459, val_acc=0.8762194149383217
54: train_acc=0.9065095124871562, val_acc=0.8767791462672838
55: train_acc=0.9063996663647317, val_acc=0.876859107885707
56: train_acc=0.9068500370676349, val_acc=0.8767791462672838
57: train_acc=0.9073113935463792, val_acc=0.876059491396446
58: train_acc=0.9070477614325669, val_acc=0.8771789543593995
59: train_acc=0.9075420710118522, val_acc=0.876299375946686
60: train_acc=0.9070917001151468, val_acc=0.8772589156727933
61: train_acc=0.9078386574125891, val_acc=0.87693906950413
62: train_acc=0.9074761633834435, val_acc=0.8766991846488607
63: train_acc=0.9069818531965622, val_acc=0.8771789543593995
64: train_acc=0.9074981325675966, val_acc=0.8761394525859216
65: train_acc=0.9068170836266307, val_acc=0.8763793378701387
66: train_acc=0.9081901665118138, val_acc=0.8765392606780373
67: train_acc=0.9075201020450718, val_acc=0.8765392606780373
68: train_acc=0.9080363812196851, val_acc=0.87685910715173
69: train_acc=0.90683905259603, val_acc=0.8764592990596142
70: train_acc=0.9063776967929743, val_acc=0.8767791455333068
///////////////////////////////////////////
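
The two per-epoch blocks above are easiest to read as curves. A small sketch that re-plots them, assuming this results file is saved under the hypothetical name training_log.txt:

import re
import matplotlib.pyplot as plt

# The 'Loss' and 'Accuracy' sections each run from their header line to
# the '///' separator; rows look like 'N: train_x=..., val_x=...'.
text = open('training_log.txt').read()
fig, axes = plt.subplots(1, 2, figsize=(10, 4))
for ax, section in zip(axes, ('Loss', 'Accuracy')):
    block = text.split(section + '\n', 1)[1].split('///', 1)[0]
    rows = re.findall(r'(\d+): train_\w+=([\d.]+), val_\w+=([\d.]+)', block)
    epochs = [int(e) for e, _, _ in rows]
    ax.plot(epochs, [float(t) for _, t, _ in rows], label='train')
    ax.plot(epochs, [float(v) for _, _, v in rows], label='val')
    ax.set_xlabel('epoch')
    ax.set_title(section.lower())
    ax.legend()
fig.savefig('learning_curves.png')
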
Validation metrics
Number of 0 predicted: 6679
Number of 1 predicted: 5827
Validation precision: 0.8516573679028553
Validation recall: 0.8906813111378068
Validation F1-score: 0.8707323211140006
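
As a consistency check, the reported F1-score is the harmonic mean of the reported precision and recall:

\[ F_1 = \frac{2PR}{P + R} = \frac{2 \times 0.85166 \times 0.89068}{0.85166 + 0.89068} \approx 0.87073 \]

which matches the value above; the 6679 "0" and 5827 "1" counts correspond to a roughly balanced validation set of 12,506 pairs.
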