solo: train_final_3000, dev_final_2800 (modified the loss function)

train results ------------
Epoch: 0, ave loss: 3.9705981184689696, acc_wn: 0.736, acc_wc: 0.286, acc_wo: 0.342, acc_wvi: 0.241, acc_wv: 0.241,acc_lx: 0.077
dev results ------------
Epoch: 0, ave loss: 3.2918514533725025, acc_wn: 0.833, acc_wc: 0.364, acc_wo: 0.566, acc_wvi: 0.327, acc_wv: 0.268,acc_lx: 0.173
Best Dev lx acc: 0.22047908232118757 at epoch: 0
train results ------------
Epoch: 1, ave loss: 2.798174273873232, acc_wn: 0.834, acc_wc: 0.380, acc_wo: 0.376, acc_wvi: 0.377, acc_wv: 0.377,acc_lx: 0.120
dev results ------------
Epoch: 1, ave loss: 3.1038529826843466, acc_wn: 0.861, acc_wc: 0.382, acc_wo: 0.553, acc_wvi: 0.331, acc_wv: 0.337,acc_lx: 0.185
Best Dev lx acc: 0.2607962213225371 at epoch: 1
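For the comparisons later in these notes it helps to turn result lines like the ones above into dicts. A minimal sketch, assuming only the `Epoch: ..., ave loss: ..., acc_*: ...` format used throughout this log; `parse_result_line` is a hypothetical analysis helper, not part of the training code:

```python
import re

def parse_result_line(line):
    """Parse one 'results' line of this log into a dict of floats."""
    metrics = {}
    epoch = re.search(r"Epoch:\s*(\d+)", line)
    if epoch:
        metrics["epoch"] = int(epoch.group(1))
    loss = re.search(r"ave loss:\s*([\d.]+)", line)
    if loss:
        metrics["ave_loss"] = float(loss.group(1))
    # Metric fields sometimes lack the space after the colon (acc_sn:0.959),
    # so \s* matches zero or more spaces.
    for name, value in re.findall(r"(acc_\w+):\s*([\d.]+)", line):
        metrics[name] = float(value)
    return metrics

line = ("Epoch: 1, ave loss: 3.1038529826843466, acc_wn: 0.861, "
        "acc_wc: 0.382, acc_wo: 0.553, acc_wvi: 0.331, acc_wv: 0.337,"
        "acc_lx: 0.185")
print(parse_result_line(line))
```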

 

ori model: train_noor_right, dev_noor_right

train results ------------
Epoch: 1, ave loss: 1.939453357894915, acc_sn: 0.943, acc_sc: 0.454, acc_sa: 0.894, acc_wn: 0.812, acc_wc: 0.368, acc_wo: 0.770, acc_wvi: 0.413, acc_wv: 0.413, acc_lx: 0.080, acc_x: 0.106
dev results ------------
Epoch: 1, ave loss: 2.474249188055439, acc_sn: 0.959, acc_sc: 0.525, acc_sa: 0.899, acc_wn: 0.862, acc_wc: 0.422, acc_wo: 0.787, acc_wvi: 0.422, acc_wv: 0.422, acc_lx: 0.113, acc_x: 0.132
dev_after_fine_tune results ------------
Epoch: 1, ave loss: 2.474249188055439, acc_sn: 0.959, acc_sc: 0.525, acc_sa: 0.899, acc_wn: 0.862, acc_wc: 0.422, acc_wo: 0.773, acc_wvi: 0.422, acc_wv: 0.489, acc_lx: 0.142, acc_x: 0.248

train results ------------
Epoch: 2, ave loss: 1.5645157771599438, acc_sn: 0.961, acc_sc: 0.527, acc_sa: 0.925, acc_wn: 0.845, acc_wc: 0.437, acc_wo: 0.815, acc_wvi: 0.487, acc_wv: 0.487, acc_lx: 0.133, acc_x: 0.157
dev results ------------
Epoch: 2, ave loss: 2.3968291460159805, acc_sn: 0.964, acc_sc: 0.585, acc_sa: 0.915, acc_wn: 0.848, acc_wc: 0.457, acc_wo: 0.792, acc_wvi: 0.466, acc_wv: 0.466, acc_lx: 0.165, acc_x: 0.184
dev_after_fine_tune results ------------
Epoch: 2, ave loss: 2.3968291460159805, acc_sn: 0.964, acc_sc: 0.585, acc_sa: 0.915, acc_wn: 0.848, acc_wc: 0.457, acc_wo: 0.772, acc_wvi: 0.466, acc_wv: 0.528, acc_lx: 0.195, acc_x: 0.302

ori model, changed num

train results ------------
Epoch: 2, ave loss: 1.5868457356598151, acc_sn:0.952,acc_sc: 0.505, acc_sa: 0.911, acc_wn: 0.846, acc_wc: 0.434, acc_wo: 0.817, acc_wvi: 0.480, acc_wv: 0.480, acc_lx: 0.123, acc_x: 0.147
dev results ------------
Epoch: 2, ave loss: 2.363940207163493, acc_sn:0.956,acc_sc: 0.563, acc_sa: 0.902, acc_wn: 0.864, acc_wc: 0.456, acc_wo: 0.812, acc_wvi: 0.464, acc_wv: 0.464, acc_lx: 0.153, acc_x: 0.173
dev_after_fine_tune results ------------
Epoch: 2, ave loss: 2.363940207163493, acc_sn:0.956,acc_sc: 0.563, acc_sa: 0.902, acc_wn: 0.864, acc_wc: 0.456, acc_wo: 0.793, acc_wvi: 0.464, acc_wv: 0.533, acc_lx: 0.179, acc_x: 0.288

ori model + WCP + WCP loss: train_final_28000, dev_final_3000 + dev_noor_all

train results ------------
Epoch: 0, ave loss: 4.863365432684659, acc_sn:0.897,acc_sc: 0.318, acc_sa: 0.806, acc_wn: 0.713, acc_wc: 0.268, acc_wo: 0.635, acc_wvi: 0.205, acc_wv: 0.205, acc_lx: 0.027, acc_x: 0.052
dev_3000 results ------------
Epoch: 0, ave loss: 3.9460013359986177, acc_sn:0.911,acc_sc: 0.413, acc_sa: 0.843, acc_wn: 0.809, acc_wc: 0.324, acc_wo: 0.707, acc_wvi: 0.270, acc_wv: 0.270, acc_lx: 0.053, acc_x: 0.112
dev_noor_right(old dev) results ------------
Epoch: 0, ave loss: 4.059925231440314, acc_sn:0.900,acc_sc: 0.427, acc_sa: 0.839, acc_wn: 0.783, acc_wc: 0.239, acc_wo: 0.669, acc_wvi: 0.313, acc_wv: 0.313, acc_lx: 0.062, acc_x: 0.117

train results ------------
Epoch: 3, ave loss: 2.370398828709476, acc_sn:0.968,acc_sc: 0.621, acc_sa: 0.944, acc_wn: 0.874, acc_wc: 0.422, acc_wo: 0.851, acc_wvi: 0.462, acc_wv: 0.462, acc_lx: 0.148, acc_x: 0.223
************************************************************
dev_3000 results ------------
Epoch: 3, ave loss: 3.453819248196889, acc_sn:0.973,acc_sc: 0.677, acc_sa: 0.943, acc_wn: 0.890, acc_wc: 0.410, acc_wo: 0.827, acc_wvi: 0.362, acc_wv: 0.362, acc_lx: 0.160, acc_x: 0.254
************************************************************
dev_noor_right(old dev) results ------------
Epoch: 3, ave loss: 3.6268525209546465, acc_sn:0.971,acc_sc: 0.681, acc_sa: 0.942, acc_wn: 0.881, acc_wc: 0.335, acc_wo: 0.807, acc_wvi: 0.420, acc_wv: 0.420, acc_lx: 0.185, acc_x: 0.272
************************************************************
train results ------------
Epoch: 4, ave loss: 2.1656264911189393, acc_sn:0.974,acc_sc: 0.677, acc_sa: 0.955, acc_wn: 0.889, acc_wc: 0.440, acc_wo: 0.868, acc_wvi: 0.492, acc_wv: 0.492, acc_lx: 0.176, acc_x: 0.257
************************************************************
dev_3000 results ------------
Epoch: 4, ave loss: 3.3725050522087394, acc_sn:0.976,acc_sc: 0.724, acc_sa: 0.950, acc_wn: 0.908, acc_wc: 0.430, acc_wo: 0.840, acc_wvi: 0.376, acc_wv: 0.376, acc_lx: 0.183, acc_x: 0.280
************************************************************
dev_noor_right(old dev) results ------------
Epoch: 4, ave loss: 3.5449231119365154, acc_sn:0.977,acc_sc: 0.725, acc_sa: 0.952, acc_wn: 0.899, acc_wc: 0.356, acc_wo: 0.820, acc_wvi: 0.436, acc_wv: 0.436, acc_lx: 0.212, acc_x: 0.299
************************************************************

 

ori model + WCP + WCP loss, wmm-bert: train_final_28000, dev_final_3000 + dev_noor_all (worse than plain BERT)

train results ------------
Epoch: 2, ave loss: 2.7139437514288853, acc_sn:0.954,acc_sc: 0.548, acc_sa: 0.905, acc_wn: 0.860, acc_wc: 0.401, acc_wo: 0.831, acc_wvi: 0.427, acc_wv: 0.427, acc_lx: 0.115, acc_x: 0.177
************************************************************
dev_3000 results ------------
Epoch: 2, ave loss: 3.7048803411836406, acc_sn:0.962,acc_sc: 0.621, acc_sa: 0.908, acc_wn: 0.879, acc_wc: 0.399, acc_wo: 0.805, acc_wvi: 0.338, acc_wv: 0.338, acc_lx: 0.135, acc_x: 0.212
************************************************************
dev_noor_right(old dev) results ------------
Epoch: 2, ave loss: 3.8923735562536783, acc_sn:0.964,acc_sc: 0.629, acc_sa: 0.911, acc_wn: 0.865, acc_wc: 0.325, acc_wo: 0.779, acc_wvi: 0.392, acc_wv: 0.392, acc_lx: 0.157, acc_x: 0.227
************************************************************
train results ------------
Epoch: 3, ave loss: 2.4213486992417073, acc_sn:0.964,acc_sc: 0.632, acc_sa: 0.925, acc_wn: 0.879, acc_wc: 0.421, acc_wo: 0.855, acc_wvi: 0.470, acc_wv: 0.470, acc_lx: 0.152, acc_x: 0.225
************************************************************
dev_3000 results ------------
Epoch: 3, ave loss: 3.5529488654915298, acc_sn:0.965,acc_sc: 0.663, acc_sa: 0.913, acc_wn: 0.891, acc_wc: 0.395, acc_wo: 0.821, acc_wvi: 0.352, acc_wv: 0.352, acc_lx: 0.144, acc_x: 0.242
************************************************************
dev_noor_right(old dev) results ------------
Epoch: 3, ave loss: 3.7379865078343117, acc_sn:0.965,acc_sc: 0.664, acc_sa: 0.915, acc_wn: 0.881, acc_wc: 0.320, acc_wo: 0.800, acc_wvi: 0.408, acc_wv: 0.408, acc_lx: 0.167, acc_x: 0.259
************************************************************

 

train with WCP/WOP changed to same-col

acc_sc is 0.04 lower than the 4-column WCP model and acc_sa 0.01 lower; all other metrics are higher
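A quick way to check a claim like this is to diff the two runs' metric dicts. A sketch; `metric_delta` is a hypothetical helper, and the dicts below hold illustrative placeholder values, not numbers copied from a specific epoch in this log:

```python
def metric_delta(run_a, run_b):
    """Return run_a minus run_b for every metric present in both runs."""
    return {k: round(run_a[k] - run_b[k], 3) for k in run_a if k in run_b}

# Placeholder values for illustration only.
same_col = {"acc_sc": 0.50, "acc_sa": 0.88, "acc_wc": 0.40, "acc_wo": 0.70}
four_col = {"acc_sc": 0.54, "acc_sa": 0.89, "acc_wc": 0.37, "acc_wo": 0.68}
delta = metric_delta(same_col, four_col)  # negative -> same-col is lower
print(delta)
```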

train results ------------
Epoch: 0, ave loss: 2.7898552958564053, acc_sn:0.925,acc_sc: 0.354, acc_sa: 0.862,acc_same:0.955, acc_wn: 0.778, acc_wc: 0.382, acc_wo: 0.726, acc_wvi: 0.298, acc_wv: 0.231, acc_lx: 0.044, acc_x: 0.063
dev_noor_right(old dev) results ------------
Epoch: 0, ave loss: 2.7573255713456852, acc_sn:0.947,acc_sc: 0.502, acc_sa: 0.881,acc_same:0.996, acc_wn: 0.751, acc_wc: 0.365, acc_wo: 0.676, acc_wvi: 0.331, acc_wv: 0.331, acc_lx: 0.105, acc_x: 0.131
dev_final results ------------
Epoch: 0, ave loss: 3.5106736698389827, acc_sn:0.949,acc_sc: 0.509, acc_sa: 0.872,acc_same:0.973, acc_wn: 0.783, acc_wc: 0.400, acc_wo: 0.703, acc_wvi: 0.243, acc_wv: 0.228, acc_lx: 0.072, acc_x: 0.105

 


dev_final after fine tune ...... results ------------
Epoch: 0, ave loss: 3.5106736698389827, acc_sn:0.949,acc_sc: 0.509, acc_sa: 0.872,acc_same:0.973, acc_wn: 0.783, acc_wc: 0.400, acc_wo: 0.696, acc_wvi: 0.243, acc_wv: 0.298, acc_lx: 0.099, acc_x: 0.241
Best Dev lx acc: 0.07200647249190939 at epoch: 0
train results ------------
Epoch: 1, ave loss: 1.6535311767962602, acc_sn:0.964,acc_sc: 0.563, acc_sa: 0.922,acc_same:0.983, acc_wn: 0.858, acc_wc: 0.547, acc_wo: 0.834, acc_wvi: 0.463, acc_wv: 0.357, acc_lx: 0.134, acc_x: 0.159
dev_noor_right(old dev) results ------------
Epoch: 1, ave loss: 2.661629102785479, acc_sn:0.963,acc_sc: 0.643, acc_sa: 0.915,acc_same:0.994, acc_wn: 0.864, acc_wc: 0.511, acc_wo: 0.806, acc_wvi: 0.491, acc_wv: 0.491, acc_lx: 0.225, acc_x: 0.233
dev_final results ------------
Epoch: 1, ave loss: 3.485209422284474, acc_sn:0.958,acc_sc: 0.644, acc_sa: 0.904,acc_same:0.982, acc_wn: 0.866, acc_wc: 0.526, acc_wo: 0.802, acc_wvi: 0.365, acc_wv: 0.339, acc_lx: 0.155, acc_x: 0.175

dev_final after fine tune ...... results ------------
Epoch: 1, ave loss: 3.485209422284474, acc_sn:0.958,acc_sc: 0.644, acc_sa: 0.904, acc_same:0.982,acc_wn: 0.866, acc_wc: 0.526, acc_wo: 0.802, acc_wvi: 0.365, acc_wv: 0.339, acc_lx: 0.155, acc_x: 0.175

train results ------------
Epoch: 2, ave loss: 1.3385876678911044, acc_sn:0.975,acc_sc: 0.670, acc_sa: 0.939,acc_same:0.987, acc_wn: 0.884, acc_wc: 0.606, acc_wo: 0.865, acc_wvi: 0.525, acc_wv: 0.405, acc_lx: 0.195, acc_x: 0.220
dev_noor_right(old dev) results ------------
Epoch: 2, ave loss: 2.700655585581139, acc_sn:0.975,acc_sc: 0.709, acc_sa: 0.933,acc_same:0.995, acc_wn: 0.893, acc_wc: 0.548, acc_wo: 0.842, acc_wvi: 0.548, acc_wv: 0.548, acc_lx: 0.295, acc_x: 0.308
dev_final results ------------
Epoch: 2, ave loss: 3.5511866594306087, acc_sn:0.969,acc_sc: 0.711, acc_sa: 0.913,acc_same:0.988, acc_wn: 0.887, acc_wc: 0.559, acc_wo: 0.828, acc_wvi: 0.410, acc_wv: 0.377, acc_lx: 0.203, acc_x: 0.238
dev_final after fine tune ...... results ------------
Epoch: 2, ave loss: 3.5511866594306087, acc_sn:0.969,acc_sc: 0.711, acc_sa: 0.913,acc_same:0.988, acc_wn: 0.887, acc_wc: 0.559, acc_wo: 0.800, acc_wvi: 0.410, acc_wv: 0.465, acc_lx: 0.258, acc_x: 0.381
Best Dev lx acc: 0.20334412081984898 at epoch: 2

 

500 server

Epoch: 0, ave loss: 2.760298604639313, acc_sn:0.926,acc_sc: 0.350, acc_sa: 0.868, acc_same:0.955,acc_wn: 0.768, acc_wc: 0.375, acc_wo: 0.713, acc_wvi: 0.280, acc_wv: 0.218, acc_lx: 0.040, acc_x: 0.037
************************************************************
dev_final after fine tune ...... results ------------
Epoch: 0, ave loss: 3.095265126665562, acc_sn:0.945,acc_sc: 0.505, acc_sa: 0.882, acc_same:0.976,acc_wn: 0.830, acc_wc: 0.456, acc_wo: 0.767, acc_wvi: 0.304, acc_wv: 0.279, acc_lx: 0.083, acc_x: 0.067
************************************************************
train results ------------
Epoch: 1, ave loss: 1.4912335560955743, acc_sn:0.966,acc_sc: 0.561, acc_sa: 0.934, acc_same:0.984,acc_wn: 0.866, acc_wc: 0.546, acc_wo: 0.843, acc_wvi: 0.464, acc_wv: 0.359, acc_lx: 0.130, acc_x: 0.120
************************************************************
dev_final after fine tune ...... results ------------
Epoch: 1, ave loss: 2.9784555481475534, acc_sn:0.966,acc_sc: 0.660, acc_sa: 0.928, acc_same:0.984,acc_wn: 0.867, acc_wc: 0.518, acc_wo: 0.814, acc_wvi: 0.388, acc_wv: 0.357, acc_lx: 0.166, acc_x: 0.134
************************************************************
train results ------------
Epoch: 2, ave loss: 1.1748086275468923, acc_sn:0.978,acc_sc: 0.685, acc_sa: 0.957, acc_same:0.988,acc_wn: 0.890, acc_wc: 0.599, acc_wo: 0.873, acc_wvi: 0.531, acc_wv: 0.410, acc_lx: 0.198, acc_x: 0.180
************************************************************
dev_final after fine tune ...... results ------------
Epoch: 2, ave loss: 3.3657749689384056, acc_sn:0.970,acc_sc: 0.744, acc_sa: 0.936, acc_same:0.983,acc_wn: 0.885, acc_wc: 0.540, acc_wo: 0.833, acc_wvi: 0.398, acc_wv: 0.367, acc_lx: 0.194, acc_x: 0.157
************************************************************
train results ------------
Epoch: 3, ave loss: 1.0197581509095657, acc_sn:0.982,acc_sc: 0.747, acc_sa: 0.966, acc_same:0.990,acc_wn: 0.904, acc_wc: 0.639, acc_wo: 0.889, acc_wvi: 0.568, acc_wv: 0.438, acc_lx: 0.239, acc_x: 0.217
************************************************************
dev_final after fine tune ...... results ------------
Epoch: 3, ave loss: 3.5901366324357875, acc_sn:0.976,acc_sc: 0.781, acc_sa: 0.943, acc_same:0.986,acc_wn: 0.870, acc_wc: 0.549, acc_wo: 0.813, acc_wvi: 0.393, acc_wv: 0.364, acc_lx: 0.215, acc_x: 0.169

 

500 server, batch size 10

Epoch: 0, ave loss: 2.7240601507498363, acc_sn:0.927,acc_sc: 0.355, acc_sa: 0.872,acc_same:0.955, acc_wn: 0.779, acc_wc: 0.377, acc_wo: 0.723, acc_wvi: 0.290, acc_wv: 0.225, acc_lx: 0.041, acc_x: 0.039
dev_noor_right(old dev) results ------------
Epoch: 0, ave loss: 2.4285969724849474, acc_sn:0.953,acc_sc: 0.532, acc_sa: 0.904,acc_same:0.991, acc_wn: 0.822, acc_wc: 0.421, acc_wo: 0.754, acc_wvi: 0.399, acc_wv: 0.399, acc_lx: 0.127, acc_x: 0.101
dev_final results ------------
Epoch: 0, ave loss: 3.1937271584199083, acc_sn:0.948,acc_sc: 0.515, acc_sa: 0.889,acc_same:0.978, acc_wn: 0.835, acc_wc: 0.454, acc_wo: 0.763, acc_wvi: 0.295, acc_wv: 0.275, acc_lx: 0.087, acc_x: 0.070
dev_final after fine tune ...... results ------------
Epoch: 0, ave loss: 3.1937271584199083, acc_sn:0.948,acc_sc: 0.515, acc_sa: 0.889,acc_same:0.978, acc_wn: 0.835, acc_wc: 0.454, acc_wo: 0.763, acc_wvi: 0.295, acc_wv: 0.275, acc_lx: 0.087, acc_x: 0.070
Best Dev lx acc: 0.0784789644012945 at epoch: 0

 

train the model with WCP/WOP changed to 4 columns; augmented noor + augmented same-column data

train results ------------
Epoch: 3, ave loss: 1.7784717851402951, acc_sn:0.982,acc_sc: 0.769, acc_sa: 0.966, acc_wn: 0.914, acc_wc: 0.472, acc_wo: 0.898, acc_wvi: 0.581, acc_wv: 0.447, acc_lx: 0.189, acc_x: 0.255
************************************************************
dev_noor_right(old dev) results ------------
Epoch: 3, ave loss: 4.032047040970721, acc_sn:0.976,acc_sc: 0.775, acc_sa: 0.947, acc_wn: 0.900, acc_wc: 0.347, acc_wo: 0.813, acc_wvi: 0.416, acc_wv: 0.416, acc_lx: 0.229, acc_x: 0.365
************************************************************
dev_final results ------------
Epoch: 3, ave loss: 4.910492429361715, acc_sn:0.976,acc_sc: 0.791, acc_sa: 0.943, acc_wn: 0.892, acc_wc: 0.375, acc_wo: 0.809, acc_wvi: 0.297, acc_wv: 0.276, acc_lx: 0.152, acc_x: 0.286
************************************************************
dev_final after fine tune ...... results ------------
Epoch: 3, ave loss: 4.910492429361715, acc_sn:0.976,acc_sc: 0.791, acc_sa: 0.943, acc_wn: 0.892, acc_wc: 0.375, acc_wo: 0.788, acc_wvi: 0.297, acc_wv: 0.320, acc_lx: 0.182, acc_x: 0.401
************************************************************

train results ------------
Epoch: 4, ave loss: 1.6247895048715542, acc_sn:0.985,acc_sc: 0.802, acc_sa: 0.971, acc_wn: 0.923, acc_wc: 0.484, acc_wo: 0.908, acc_wvi: 0.608, acc_wv: 0.469, acc_lx: 0.206, acc_x: 0.274
************************************************************
dev_noor_right(old dev) results ------------
Epoch: 4, ave loss: 4.1395345384424385, acc_sn:0.980,acc_sc: 0.802, acc_sa: 0.953, acc_wn: 0.908, acc_wc: 0.373, acc_wo: 0.825, acc_wvi: 0.434, acc_wv: 0.434, acc_lx: 0.253, acc_x: 0.381
************************************************************
dev_final results ------------
Epoch: 4, ave loss: 5.041357705376365, acc_sn:0.972,acc_sc: 0.807, acc_sa: 0.940, acc_wn: 0.900, acc_wc: 0.399, acc_wo: 0.825, acc_wvi: 0.310, acc_wv: 0.288, acc_lx: 0.168, acc_x: 0.292
************************************************************
dev_final after fine tune ...... results ------------
Epoch: 4, ave loss: 5.041357705376365, acc_sn:0.972,acc_sc: 0.807, acc_sa: 0.940, acc_wn: 0.900, acc_wc: 0.399, acc_wo: 0.801, acc_wvi: 0.310, acc_wv: 0.335, acc_lx: 0.201, acc_x: 0.412

************************************************************
dev_final results ------------
Epoch: 6, ave loss: 5.151140937062053, acc_sn:0.979,acc_sc: 0.843, acc_sa: 0.955, acc_wn: 0.896, acc_wc: 0.393, acc_wo: 0.821, acc_wvi: 0.312, acc_wv: 0.289, acc_lx: 0.181, acc_x: 0.337
************************************************************
dev_final after fine tune ...... results ------------
Epoch: 6, ave loss: 5.151140937062053, acc_sn:0.979,acc_sc: 0.843, acc_sa: 0.955, acc_wn: 0.896, acc_wc: 0.393, acc_wo: 0.800, acc_wvi: 0.312, acc_wv: 0.328, acc_lx: 0.209, acc_x: 0.455
************************************************************
train results ------------
Epoch: 7, ave loss: 1.358646693560106, acc_sn:0.989,acc_sc: 0.856, acc_sa: 0.979, acc_wn: 0.941, acc_wc: 0.510, acc_wo: 0.930, acc_wvi: 0.661, acc_wv: 0.511, acc_lx: 0.243, acc_x: 0.315
************************************************************
dev_noor_right(old dev) results ------------
Epoch: 7, ave loss: 4.4229940004109585, acc_sn:0.984,acc_sc: 0.841, acc_sa: 0.961, acc_wn: 0.920, acc_wc: 0.379, acc_wo: 0.839, acc_wvi: 0.453, acc_wv: 0.453, acc_lx: 0.279, acc_x: 0.418
************************************************************
dev_final results ------------
Epoch: 7, ave loss: 5.429586288650315, acc_sn:0.983,acc_sc: 0.848, acc_sa: 0.954, acc_wn: 0.905, acc_wc: 0.405, acc_wo: 0.830, acc_wvi: 0.324, acc_wv: 0.301, acc_lx: 0.185, acc_x: 0.334
************************************************************
dev_final after fine tune ...... results ------------
Epoch: 7, ave loss: 5.429586288650315, acc_sn:0.983,acc_sc: 0.848, acc_sa: 0.954, acc_wn: 0.905, acc_wc: 0.405, acc_wo: 0.814, acc_wvi: 0.324, acc_wv: 0.340, acc_lx: 0.215, acc_x: 0.449
************************************************************

 

 

Final model showdown

Conclusions:

Without correcting dev_final:

bs=16 vs bs=6: the W part wins decisively, the S part is slightly worse

bs=10 vs bs=6: wins across the board

bs=10 vs bs=16: only wvi and wv are slightly lower for the former, by 0.01 and 0.004

In short: with bs=16 the W part learns well; with bs=6 the S part learns slightly better; bs=10 balances the two, learning both W and S well, with only the W part slightly below bs=16.
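These pairwise comparisons can be made mechanical. A minimal sketch: the `runs` dict holds the epoch-0 dev_final metrics recorded for each batch size elsewhere in this log, and `winners` names the batch size that scores highest on each metric.

```python
# Epoch-0 dev_final metrics for each batch size, copied from this log.
runs = {
    6:  {"acc_sn": 0.949, "acc_sc": 0.509, "acc_sa": 0.872,
         "acc_wn": 0.783, "acc_wc": 0.400, "acc_wvi": 0.243},
    10: {"acc_sn": 0.948, "acc_sc": 0.515, "acc_sa": 0.889,
         "acc_wn": 0.835, "acc_wc": 0.454, "acc_wvi": 0.295},
    16: {"acc_sn": 0.945, "acc_sc": 0.505, "acc_sa": 0.882,
         "acc_wn": 0.830, "acc_wc": 0.456, "acc_wvi": 0.304},
}
metrics = next(iter(runs.values())).keys()
# For each metric, pick the batch size with the highest score.
winners = {m: max(runs, key=lambda bs: runs[bs][m]) for m in metrics}
print(winners)
```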

 

 

WCP/WOP changed to same-col:

 

Batch size = 6, run on my own machine on the afternoon of 7/29

dev_noor_right (old dev): acc_sn: 0.947, acc_sc: 0.502, acc_sa: 0.881, acc_same: 0.996, acc_wn: 0.751, acc_wc: 0.365, acc_wo: 0.676, acc_wvi: 0.331, acc_wv: 0.331, acc_lx: 0.105, acc_x: 0.131

 

 

dev_final: acc_sn: 0.949, acc_sc: 0.509, acc_sa: 0.872, acc_same: 0.973, acc_wn: 0.783, acc_wc: 0.400, acc_wo: 0.703, acc_wvi: 0.243, acc_wv: 0.228, acc_lx: 0.072, acc_x: 0.105

 

dev_final after fine tune: acc_sn: 0.949, acc_sc: 0.509, acc_sa: 0.872, acc_same: 0.973, acc_wn: 0.783, acc_wc: 0.400, acc_wo: 0.696, acc_wvi: 0.243, acc_wv: 0.298, acc_lx: 0.099, acc_x: 0.241

500 server, bs=10

dev_noor_right (old dev): acc_sn: 0.953, acc_sc: 0.532, acc_sa: 0.904, acc_same: 0.991, acc_wn: 0.822, acc_wc: 0.421, acc_wo: 0.754, acc_wvi: 0.399, acc_wv: 0.399, acc_lx: 0.127, acc_x: 0.101

 

dev_final: acc_sn: 0.948, acc_sc: 0.515, acc_sa: 0.889, acc_same: 0.978, acc_wn: 0.835, acc_wc: 0.454, acc_wo: 0.763, acc_wvi: 0.295, acc_wv: 0.275, acc_lx: 0.087, acc_x: 0.070
(acc_wvi makes up for being too low at bs=6, almost matching bs=16)

 

 

500 server, last night, bs=16

 

dev_final after fine tune: acc_sn: 0.945, acc_sc: 0.505, acc_sa: 0.882, acc_same: 0.976, acc_wn: 0.830, acc_wc: 0.456, acc_wo: 0.767, acc_wvi: 0.304, acc_wv: 0.279, acc_lx: 0.083, acc_x: 0.067