Thesis/icews14_l2_1e-5.out

nohup: ignoring input
2023-05-21 09:52:52,862 - [INFO] - {'dataset': 'icews14', 'name': 'ice00001', 'gpu': '3', 'train_strategy': 'one_to_n', 'opt': 'adam', 'neg_num': 1000, 'batch_size': 128, 'l2': 1e-05, 'lr': 0.0001, 'max_epochs': 500, 'num_workers': 0, 'seed': 42, 'restore': False, 'lbl_smooth': 0.1, 'embed_dim': 400, 'ent_vec_dim': 400, 'rel_vec_dim': 400, 'bias': False, 'form': 'plain', 'k_w': 10, 'k_h': 20, 'num_filt': 96, 'ker_sz': 9, 'perm': 1, 'hid_drop': 0.5, 'feat_drop': 0.2, 'inp_drop': 0.2, 'drop_path': 0.0, 'drop': 0.0, 'in_channels': 1, 'out_channels': 32, 'filt_h': 1, 'filt_w': 9, 'image_h': 128, 'image_w': 128, 'patch_size': 8, 'mixer_dim': 256, 'expansion_factor': 4, 'expansion_factor_token': 0.5, 'mixer_depth': 16, 'mixer_dropout': 0.2, 'log_dir': './log/', 'config_dir': './config/', 'test_only': False, 'grid_search': False}
{'batch_size': 128,
'bias': False,
'config_dir': './config/',
'dataset': 'icews14',
'drop': 0.0,
'drop_path': 0.0,
'embed_dim': 400,
'ent_vec_dim': 400,
'expansion_factor': 4,
'expansion_factor_token': 0.5,
'feat_drop': 0.2,
'filt_h': 1,
'filt_w': 9,
'form': 'plain',
'gpu': '3',
'grid_search': False,
'hid_drop': 0.5,
'image_h': 128,
'image_w': 128,
'in_channels': 1,
'inp_drop': 0.2,
'k_h': 20,
'k_w': 10,
'ker_sz': 9,
'l2': 1e-05,
'lbl_smooth': 0.1,
'log_dir': './log/',
'lr': 0.0001,
'max_epochs': 500,
'mixer_depth': 16,
'mixer_dim': 256,
'mixer_dropout': 0.2,
'name': 'ice00001',
'neg_num': 1000,
'num_filt': 96,
'num_workers': 0,
'opt': 'adam',
'out_channels': 32,
'patch_size': 8,
'perm': 1,
'rel_vec_dim': 400,
'restore': False,
'seed': 42,
'test_only': False,
'train_strategy': 'one_to_n'}
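
[Editor's note] The two dumps above are the same run configuration, logged once as a raw dict and once pretty-printed. For reference, a minimal sketch of how such a configuration could be declared is given below; the argument names and defaults are copied from the log, but the parser itself is an assumption, not the actual training script.

    import argparse
    import pprint

    # Hypothetical sketch: flag names and default values mirror the config
    # logged above; the real training script may organize its flags differently.
    parser = argparse.ArgumentParser(description='ICEWS14 link-prediction run')
    parser.add_argument('--dataset',        default='icews14')
    parser.add_argument('--name',           default='ice00001', help='run id shown in every log line')
    parser.add_argument('--train_strategy', default='one_to_n')
    parser.add_argument('--opt',            default='adam')
    parser.add_argument('--batch_size',     type=int,   default=128)
    parser.add_argument('--lr',             type=float, default=1e-4)
    parser.add_argument('--l2',             type=float, default=1e-5)
    parser.add_argument('--lbl_smooth',     type=float, default=0.1)
    parser.add_argument('--embed_dim',      type=int,   default=400)
    parser.add_argument('--max_epochs',     type=int,   default=500)

    args = parser.parse_args([])      # empty list -> use the defaults above
    pprint.pprint(vars(args))         # pprint sorts keys, as in the second dump
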
2023-05-21 09:53:01,872 - [INFO] - [E:0| 0]: Train Loss:0.70005, Val MRR:0.0, ice00001
2023-05-21 09:54:44,438 - [INFO] - [E:0| 100]: Train Loss:0.37088, Val MRR:0.0, ice00001
2023-05-21 09:56:27,551 - [INFO] - [E:0| 200]: Train Loss:0.24322, Val MRR:0.0, ice00001
2023-05-21 09:58:10,978 - [INFO] - [E:0| 300]: Train Loss:0.18085, Val MRR:0.0, ice00001
2023-05-21 09:59:54,579 - [INFO] - [E:0| 400]: Train Loss:0.1442, Val MRR:0.0, ice00001
2023-05-21 10:01:28,424 - [INFO] - [E:0| 500]: Train Loss:0.12011, Val MRR:0.0, ice00001
2023-05-21 10:03:12,519 - [INFO] - [E:0| 600]: Train Loss:0.10307, Val MRR:0.0, ice00001
2023-05-21 10:04:56,298 - [INFO] - [E:0| 700]: Train Loss:0.090379, Val MRR:0.0, ice00001
2023-05-21 10:06:40,735 - [INFO] - [E:0| 800]: Train Loss:0.080559, Val MRR:0.0, ice00001
2023-05-21 10:08:23,985 - [INFO] - [E:0| 900]: Train Loss:0.072734, Val MRR:0.0, ice00001
2023-05-21 10:09:34,481 - [INFO] - [Epoch:0]: Training Loss:0.06808
2023-05-21 10:09:34,703 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 10:09:48,578 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 10:10:11,618 - [INFO] - [Evaluating Epoch 0 valid]:
MRR: Tail : 0.09185, Head : 0.06626, Avg : 0.07906
2023-05-21 10:10:13,843 - [INFO] - [Epoch 0]: Training Loss: 0.068079, Valid MRR: 0.07906,
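
[Editor's note] The very first logged loss (0.70005) is close to ln 2 ≈ 0.693, which is what binary cross-entropy returns when an untrained model scores every candidate near 0.5; this is consistent with the 'one_to_n' strategy and 'lbl_smooth': 0.1 in the config, assuming a ConvE-style BCE objective (the log does not show the loss function itself). A self-contained sketch under that assumption:

    import torch

    # Sketch assuming a ConvE-style 1-to-N objective: BCE over all entities with
    # label smoothing. With untrained ~0.5 scores the loss is ln 2, matching the
    # first value in the log. The entity count below is a placeholder.
    num_ent = 7000                                   # hypothetical vocabulary size
    smooth  = 0.1                                    # 'lbl_smooth' from the config
    labels  = torch.zeros(num_ent)
    labels[42] = 1.0                                 # hypothetical correct tail entity
    labels  = (1.0 - smooth) * labels + smooth / num_ent
    scores  = torch.full((num_ent,), 0.5)            # untrained, near-uniform scores
    loss    = torch.nn.functional.binary_cross_entropy(scores, labels)
    print(round(loss.item(), 5))                     # ~0.69315, i.e. ln 2
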
2023-05-21 10:10:15,044 - [INFO] - [E:1| 0]: Train Loss:0.0086623, Val MRR:0.07906, ice00001
2023-05-21 10:11:59,461 - [INFO] - [E:1| 100]: Train Loss:0.0081907, Val MRR:0.07906, ice00001
2023-05-21 10:13:46,290 - [INFO] - [E:1| 200]: Train Loss:0.0078262, Val MRR:0.07906, ice00001
2023-05-21 10:15:30,406 - [INFO] - [E:1| 300]: Train Loss:0.007518, Val MRR:0.07906, ice00001
2023-05-21 10:17:14,171 - [INFO] - [E:1| 400]: Train Loss:0.0072515, Val MRR:0.07906, ice00001
2023-05-21 10:18:46,586 - [INFO] - [E:1| 500]: Train Loss:0.0070207, Val MRR:0.07906, ice00001
2023-05-21 10:20:30,503 - [INFO] - [E:1| 600]: Train Loss:0.0068193, Val MRR:0.07906, ice00001
2023-05-21 10:22:14,353 - [INFO] - [E:1| 700]: Train Loss:0.0066427, Val MRR:0.07906, ice00001
2023-05-21 10:23:57,937 - [INFO] - [E:1| 800]: Train Loss:0.0064861, Val MRR:0.07906, ice00001
2023-05-21 10:25:41,480 - [INFO] - [E:1| 900]: Train Loss:0.0063474, Val MRR:0.07906, ice00001
2023-05-21 10:26:54,276 - [INFO] - [Epoch:1]: Training Loss:0.006258
2023-05-21 10:26:54,798 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 10:27:13,318 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 10:27:29,096 - [INFO] - [Evaluating Epoch 1 valid]:
MRR: Tail : 0.09533, Head : 0.06944, Avg : 0.08238
2023-05-21 10:27:31,659 - [INFO] - [Epoch 1]: Training Loss: 0.0062582, Valid MRR: 0.08238,
2023-05-21 10:27:32,839 - [INFO] - [E:2| 0]: Train Loss:0.005127, Val MRR:0.08238, ice00001
2023-05-21 10:29:16,754 - [INFO] - [E:2| 100]: Train Loss:0.005043, Val MRR:0.08238, ice00001
2023-05-21 10:31:01,109 - [INFO] - [E:2| 200]: Train Loss:0.0049924, Val MRR:0.08238, ice00001
2023-05-21 10:32:43,956 - [INFO] - [E:2| 300]: Train Loss:0.0049512, Val MRR:0.08238, ice00001
2023-05-21 10:34:28,320 - [INFO] - [E:2| 400]: Train Loss:0.0049148, Val MRR:0.08238, ice00001
2023-05-21 10:36:02,300 - [INFO] - [E:2| 500]: Train Loss:0.0048828, Val MRR:0.08238, ice00001
2023-05-21 10:37:46,510 - [INFO] - [E:2| 600]: Train Loss:0.0048541, Val MRR:0.08238, ice00001
2023-05-21 10:39:30,314 - [INFO] - [E:2| 700]: Train Loss:0.004827, Val MRR:0.08238, ice00001
2023-05-21 10:41:12,934 - [INFO] - [E:2| 800]: Train Loss:0.0048034, Val MRR:0.08238, ice00001
2023-05-21 10:42:56,849 - [INFO] - [E:2| 900]: Train Loss:0.0047815, Val MRR:0.08238, ice00001
2023-05-21 10:44:09,559 - [INFO] - [Epoch:2]: Training Loss:0.004768
2023-05-21 10:44:10,050 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 10:44:30,836 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 10:44:44,508 - [INFO] - [Evaluating Epoch 2 valid]:
MRR: Tail : 0.09538, Head : 0.0697, Avg : 0.08254
2023-05-21 10:44:47,231 - [INFO] - [Epoch 2]: Training Loss: 0.0047677, Valid MRR: 0.08254,
2023-05-21 10:44:48,435 - [INFO] - [E:3| 0]: Train Loss:0.0046149, Val MRR:0.08254, ice00001
2023-05-21 10:46:32,114 - [INFO] - [E:3| 100]: Train Loss:0.0045548, Val MRR:0.08254, ice00001
2023-05-21 10:48:15,563 - [INFO] - [E:3| 200]: Train Loss:0.0045465, Val MRR:0.08254, ice00001
2023-05-21 10:49:59,902 - [INFO] - [E:3| 300]: Train Loss:0.0045373, Val MRR:0.08254, ice00001
2023-05-21 10:51:44,401 - [INFO] - [E:3| 400]: Train Loss:0.004526, Val MRR:0.08254, ice00001
2023-05-21 10:53:21,290 - [INFO] - [E:3| 500]: Train Loss:0.0045164, Val MRR:0.08254, ice00001
2023-05-21 10:55:03,171 - [INFO] - [E:3| 600]: Train Loss:0.0045051, Val MRR:0.08254, ice00001
2023-05-21 10:56:47,139 - [INFO] - [E:3| 700]: Train Loss:0.0044952, Val MRR:0.08254, ice00001
2023-05-21 10:58:30,656 - [INFO] - [E:3| 800]: Train Loss:0.0044854, Val MRR:0.08254, ice00001
2023-05-21 11:00:14,606 - [INFO] - [E:3| 900]: Train Loss:0.0044741, Val MRR:0.08254, ice00001
2023-05-21 11:01:29,379 - [INFO] - [Epoch:3]: Training Loss:0.004467
2023-05-21 11:01:29,775 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 11:01:53,167 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 11:02:07,478 - [INFO] - [Evaluating Epoch 3 valid]:
MRR: Tail : 0.09471, Head : 0.07009, Avg : 0.0824
2023-05-21 11:02:07,479 - [INFO] - [Epoch 3]: Training Loss: 0.0044671, Valid MRR: 0.08254,
2023-05-21 11:02:08,131 - [INFO] - [E:4| 0]: Train Loss:0.004339, Val MRR:0.08254, ice00001
2023-05-21 11:03:49,078 - [INFO] - [E:4| 100]: Train Loss:0.0043432, Val MRR:0.08254, ice00001
2023-05-21 11:05:31,973 - [INFO] - [E:4| 200]: Train Loss:0.0043363, Val MRR:0.08254, ice00001
2023-05-21 11:07:15,063 - [INFO] - [E:4| 300]: Train Loss:0.0043271, Val MRR:0.08254, ice00001
2023-05-21 11:08:59,148 - [INFO] - [E:4| 400]: Train Loss:0.0043174, Val MRR:0.08254, ice00001
2023-05-21 11:10:40,820 - [INFO] - [E:4| 500]: Train Loss:0.0043081, Val MRR:0.08254, ice00001
2023-05-21 11:12:14,049 - [INFO] - [E:4| 600]: Train Loss:0.004297, Val MRR:0.08254, ice00001
2023-05-21 11:13:55,869 - [INFO] - [E:4| 700]: Train Loss:0.0042864, Val MRR:0.08254, ice00001
2023-05-21 11:15:36,927 - [INFO] - [E:4| 800]: Train Loss:0.0042748, Val MRR:0.08254, ice00001
2023-05-21 11:17:17,752 - [INFO] - [E:4| 900]: Train Loss:0.0042641, Val MRR:0.08254, ice00001
2023-05-21 11:18:29,450 - [INFO] - [Epoch:4]: Training Loss:0.004257
2023-05-21 11:18:29,738 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 11:18:53,236 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 11:19:16,183 - [INFO] - [Evaluating Epoch 4 valid]:
MRR: Tail : 0.09531, Head : 0.06964, Avg : 0.08248
2023-05-21 11:19:16,184 - [INFO] - [Epoch 4]: Training Loss: 0.0042571, Valid MRR: 0.08254,
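
[Editor's note] The 'Val MRR' field in the step lines and the epoch summaries reports the best validation MRR so far, not the latest one: epochs 3 and 4 evaluate to 0.0824 and 0.08248, yet the summaries still print 0.08254 from epoch 2. A small sketch of that bookkeeping, using the MRR values logged above (the checkpointing comment is an assumption):

    # Validation MRRs of epochs 0-4, taken from the evaluation lines above.
    epoch_val_mrr = [0.07906, 0.08238, 0.08254, 0.0824, 0.08248]
    best_val_mrr = 0.0
    for epoch, val_mrr in enumerate(epoch_val_mrr):
        if val_mrr > best_val_mrr:
            best_val_mrr = val_mrr    # a real run would presumably checkpoint here
        print(f'[Epoch {epoch}]: Valid MRR: {best_val_mrr:.5f},')
    # prints 0.07906, 0.08238, then 0.08254 three times, matching the summaries
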
2023-05-21 11:19:17,347 - [INFO] - [E:5| 0]: Train Loss:0.0041452, Val MRR:0.08254, ice00001
2023-05-21 11:20:48,345 - [INFO] - [E:5| 100]: Train Loss:0.0041406, Val MRR:0.08254, ice00001
2023-05-21 11:22:29,161 - [INFO] - [E:5| 200]: Train Loss:0.0041292, Val MRR:0.08254, ice00001
2023-05-21 11:24:09,842 - [INFO] - [E:5| 300]: Train Loss:0.0041189, Val MRR:0.08254, ice00001
2023-05-21 11:25:50,519 - [INFO] - [E:5| 400]: Train Loss:0.004107, Val MRR:0.08254, ice00001
2023-05-21 11:27:30,891 - [INFO] - [E:5| 500]: Train Loss:0.0040962, Val MRR:0.08254, ice00001
2023-05-21 11:28:59,921 - [INFO] - [E:5| 600]: Train Loss:0.0040854, Val MRR:0.08254, ice00001
2023-05-21 11:30:38,373 - [INFO] - [E:5| 700]: Train Loss:0.0040746, Val MRR:0.08254, ice00001
2023-05-21 11:32:16,776 - [INFO] - [E:5| 800]: Train Loss:0.0040639, Val MRR:0.08254, ice00001
2023-05-21 11:33:55,332 - [INFO] - [E:5| 900]: Train Loss:0.0040538, Val MRR:0.08254, ice00001
2023-05-21 11:35:04,425 - [INFO] - [Epoch:5]: Training Loss:0.004046
2023-05-21 11:35:04,795 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 11:35:27,193 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 11:35:49,621 - [INFO] - [Evaluating Epoch 5 valid]:
MRR: Tail : 0.0947, Head : 0.06993, Avg : 0.08232
2023-05-21 11:35:49,621 - [INFO] - [Epoch 5]: Training Loss: 0.0040463, Valid MRR: 0.08254,
2023-05-21 11:35:50,708 - [INFO] - [E:6| 0]: Train Loss:0.0039617, Val MRR:0.08254, ice00001
2023-05-21 11:37:19,419 - [INFO] - [E:6| 100]: Train Loss:0.0039326, Val MRR:0.08254, ice00001
2023-05-21 11:38:58,296 - [INFO] - [E:6| 200]: Train Loss:0.0039224, Val MRR:0.08254, ice00001
2023-05-21 11:40:37,813 - [INFO] - [E:6| 300]: Train Loss:0.0039121, Val MRR:0.08254, ice00001
2023-05-21 11:42:16,823 - [INFO] - [E:6| 400]: Train Loss:0.0039036, Val MRR:0.08254, ice00001
2023-05-21 11:43:55,666 - [INFO] - [E:6| 500]: Train Loss:0.0038937, Val MRR:0.08254, ice00001
2023-05-21 11:45:27,875 - [INFO] - [E:6| 600]: Train Loss:0.0038836, Val MRR:0.08254, ice00001
2023-05-21 11:47:03,007 - [INFO] - [E:6| 700]: Train Loss:0.003874, Val MRR:0.08254, ice00001
2023-05-21 11:48:42,374 - [INFO] - [E:6| 800]: Train Loss:0.0038642, Val MRR:0.08254, ice00001
2023-05-21 11:50:23,883 - [INFO] - [E:6| 900]: Train Loss:0.0038548, Val MRR:0.08254, ice00001
2023-05-21 11:51:32,688 - [INFO] - [Epoch:6]: Training Loss:0.003848
2023-05-21 11:51:33,179 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 11:51:55,826 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 11:52:18,347 - [INFO] - [Evaluating Epoch 6 valid]:
MRR: Tail : 0.095, Head : 0.06919, Avg : 0.0821
2023-05-21 11:52:18,347 - [INFO] - [Epoch 6]: Training Loss: 0.0038482, Valid MRR: 0.08254,
2023-05-21 11:52:19,541 - [INFO] - [E:7| 0]: Train Loss:0.003808, Val MRR:0.08254, ice00001
2023-05-21 11:53:58,392 - [INFO] - [E:7| 100]: Train Loss:0.0037455, Val MRR:0.08254, ice00001
2023-05-21 11:55:26,930 - [INFO] - [E:7| 200]: Train Loss:0.0037388, Val MRR:0.08254, ice00001
2023-05-21 11:57:05,191 - [INFO] - [E:7| 300]: Train Loss:0.0037298, Val MRR:0.08254, ice00001
2023-05-21 11:58:42,642 - [INFO] - [E:7| 400]: Train Loss:0.0037213, Val MRR:0.08254, ice00001
2023-05-21 12:00:19,349 - [INFO] - [E:7| 500]: Train Loss:0.0037134, Val MRR:0.08254, ice00001
2023-05-21 12:01:57,308 - [INFO] - [E:7| 600]: Train Loss:0.0037056, Val MRR:0.08254, ice00001
2023-05-21 12:03:25,507 - [INFO] - [E:7| 700]: Train Loss:0.0036984, Val MRR:0.08254, ice00001
2023-05-21 12:05:04,485 - [INFO] - [E:7| 800]: Train Loss:0.0036902, Val MRR:0.08254, ice00001
2023-05-21 12:06:43,144 - [INFO] - [E:7| 900]: Train Loss:0.0036828, Val MRR:0.08254, ice00001
2023-05-21 12:07:52,561 - [INFO] - [Epoch:7]: Training Loss:0.003677
2023-05-21 12:07:52,893 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 12:08:15,679 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 12:08:37,763 - [INFO] - [Evaluating Epoch 7 valid]:
MRR: Tail : 0.09535, Head : 0.0697, Avg : 0.08253
2023-05-21 12:08:37,763 - [INFO] - [Epoch 7]: Training Loss: 0.003677, Valid MRR: 0.08254,
2023-05-21 12:08:38,858 - [INFO] - [E:8| 0]: Train Loss:0.0035613, Val MRR:0.08254, ice00001
2023-05-21 12:10:17,514 - [INFO] - [E:8| 100]: Train Loss:0.0035953, Val MRR:0.08254, ice00001
2023-05-21 12:11:48,568 - [INFO] - [E:8| 200]: Train Loss:0.0035904, Val MRR:0.08254, ice00001
2023-05-21 12:13:23,658 - [INFO] - [E:8| 300]: Train Loss:0.0035823, Val MRR:0.08254, ice00001
2023-05-21 12:15:02,437 - [INFO] - [E:8| 400]: Train Loss:0.0035763, Val MRR:0.08254, ice00001
2023-05-21 12:16:41,155 - [INFO] - [E:8| 500]: Train Loss:0.0035697, Val MRR:0.08254, ice00001
2023-05-21 12:18:20,787 - [INFO] - [E:8| 600]: Train Loss:0.0035635, Val MRR:0.08254, ice00001
2023-05-21 12:19:59,626 - [INFO] - [E:8| 700]: Train Loss:0.0035568, Val MRR:0.08254, ice00001
2023-05-21 12:21:28,328 - [INFO] - [E:8| 800]: Train Loss:0.0035503, Val MRR:0.08254, ice00001
2023-05-21 12:23:07,150 - [INFO] - [E:8| 900]: Train Loss:0.0035439, Val MRR:0.08254, ice00001
2023-05-21 12:24:16,719 - [INFO] - [Epoch:8]: Training Loss:0.00354
2023-05-21 12:24:17,218 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 12:24:39,757 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 12:25:02,054 - [INFO] - [Evaluating Epoch 8 valid]:
MRR: Tail : 0.09419, Head : 0.07002, Avg : 0.0821
2023-05-21 12:25:02,054 - [INFO] - [Epoch 8]: Training Loss: 0.0035399, Valid MRR: 0.08254,
2023-05-21 12:25:02,823 - [INFO] - [E:9| 0]: Train Loss:0.0034489, Val MRR:0.08254, ice00001
2023-05-21 12:26:42,251 - [INFO] - [E:9| 100]: Train Loss:0.003475, Val MRR:0.08254, ice00001
2023-05-21 12:28:19,934 - [INFO] - [E:9| 200]: Train Loss:0.003473, Val MRR:0.08254, ice00001
2023-05-21 12:29:48,480 - [INFO] - [E:9| 300]: Train Loss:0.0034693, Val MRR:0.08254, ice00001
2023-05-21 12:31:26,915 - [INFO] - [E:9| 400]: Train Loss:0.0034656, Val MRR:0.08254, ice00001
2023-05-21 12:33:05,316 - [INFO] - [E:9| 500]: Train Loss:0.0034602, Val MRR:0.08254, ice00001
2023-05-21 12:34:43,710 - [INFO] - [E:9| 600]: Train Loss:0.0034562, Val MRR:0.08254, ice00001
2023-05-21 12:36:22,151 - [INFO] - [E:9| 700]: Train Loss:0.003451, Val MRR:0.08254, ice00001
2023-05-21 12:37:55,684 - [INFO] - [E:9| 800]: Train Loss:0.003447, Val MRR:0.08254, ice00001
2023-05-21 12:39:32,168 - [INFO] - [E:9| 900]: Train Loss:0.0034423, Val MRR:0.08254, ice00001
2023-05-21 12:40:41,104 - [INFO] - [Epoch:9]: Training Loss:0.003439
2023-05-21 12:40:41,611 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 12:41:04,276 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 12:41:26,123 - [INFO] - [Evaluating Epoch 9 valid]:
MRR: Tail : 0.09494, Head : 0.06954, Avg : 0.08224
MR: Tail : 782.83, Head : 966.23, Avg : 874.53
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.09868, Head : 0.06202, Avg : 0.08035
Hit-10: Tail : 0.21083, Head : 0.15754, Avg : 0.18418
2023-05-21 12:41:26,123 - [INFO] - [Epoch 9]: Training Loss: 0.0034393, Valid MRR: 0.08254,
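
[Editor's note] Every tenth epoch the evaluation additionally reports MR and Hits@{1,3,10} next to MRR, for tail prediction, head prediction, and their average. As a reference for how these ranking metrics are defined, here is a self-contained example computed from invented ranks (the rank lists are illustrative only, not taken from this run):

    # Ranking metrics as reported in the block above. Ranks are 1-based positions
    # of the correct entity among all candidates.
    def ranking_metrics(ranks):
        n = len(ranks)
        return {
            'MRR':    sum(1.0 / r for r in ranks) / n,
            'MR':     sum(ranks) / n,
            'Hit-1':  sum(r <= 1  for r in ranks) / n,
            'Hit-3':  sum(r <= 3  for r in ranks) / n,
            'Hit-10': sum(r <= 10 for r in ranks) / n,
        }

    tail_ranks = [1, 3, 7, 120, 950]      # hypothetical tail-prediction ranks
    head_ranks = [2, 5, 15, 400, 1300]    # hypothetical head-prediction ranks
    tail, head = ranking_metrics(tail_ranks), ranking_metrics(head_ranks)
    avg = {k: (tail[k] + head[k]) / 2 for k in tail}   # 'Avg' is the head/tail mean
    for name, vals in (('Tail', tail), ('Head', head), ('Avg', avg)):
        print(name, {k: round(v, 5) for k, v in vals.items()})
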
2023-05-21 12:41:27,266 - [INFO] - [E:10| 0]: Train Loss:0.0034043, Val MRR:0.08254, ice00001
2023-05-21 12:43:04,474 - [INFO] - [E:10| 100]: Train Loss:0.0033925, Val MRR:0.08254, ice00001
2023-05-21 12:44:42,878 - [INFO] - [E:10| 200]: Train Loss:0.0033955, Val MRR:0.08254, ice00001
2023-05-21 12:46:20,500 - [INFO] - [E:10| 300]: Train Loss:0.0033925, Val MRR:0.08254, ice00001
2023-05-21 12:47:49,303 - [INFO] - [E:10| 400]: Train Loss:0.0033878, Val MRR:0.08254, ice00001
2023-05-21 12:49:28,222 - [INFO] - [E:10| 500]: Train Loss:0.0033864, Val MRR:0.08254, ice00001
2023-05-21 12:51:06,916 - [INFO] - [E:10| 600]: Train Loss:0.0033843, Val MRR:0.08254, ice00001
2023-05-21 12:52:45,372 - [INFO] - [E:10| 700]: Train Loss:0.0033821, Val MRR:0.08254, ice00001
2023-05-21 12:54:24,217 - [INFO] - [E:10| 800]: Train Loss:0.0033796, Val MRR:0.08254, ice00001
2023-05-21 12:55:52,217 - [INFO] - [E:10| 900]: Train Loss:0.0033773, Val MRR:0.08254, ice00001
2023-05-21 12:57:00,352 - [INFO] - [Epoch:10]: Training Loss:0.003376
2023-05-21 12:57:00,604 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 12:57:23,091 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 12:57:44,929 - [INFO] - [Evaluating Epoch 10 valid]:
MRR: Tail : 0.0953, Head : 0.06934, Avg : 0.08232
2023-05-21 12:57:44,930 - [INFO] - [Epoch 10]: Training Loss: 0.0033761, Valid MRR: 0.08254,
2023-05-21 12:57:46,071 - [INFO] - [E:11| 0]: Train Loss:0.0034035, Val MRR:0.08254, ice00001
2023-05-21 12:59:23,741 - [INFO] - [E:11| 100]: Train Loss:0.0033487, Val MRR:0.08254, ice00001
2023-05-21 13:01:01,833 - [INFO] - [E:11| 200]: Train Loss:0.0033478, Val MRR:0.08254, ice00001
2023-05-21 13:02:40,157 - [INFO] - [E:11| 300]: Train Loss:0.0033482, Val MRR:0.08254, ice00001
2023-05-21 13:04:07,911 - [INFO] - [E:11| 400]: Train Loss:0.0033474, Val MRR:0.08254, ice00001
2023-05-21 13:05:46,145 - [INFO] - [E:11| 500]: Train Loss:0.0033455, Val MRR:0.08254, ice00001
2023-05-21 13:07:24,390 - [INFO] - [E:11| 600]: Train Loss:0.0033436, Val MRR:0.08254, ice00001
2023-05-21 13:09:01,678 - [INFO] - [E:11| 700]: Train Loss:0.0033418, Val MRR:0.08254, ice00001
2023-05-21 13:10:39,655 - [INFO] - [E:11| 800]: Train Loss:0.003341, Val MRR:0.08254, ice00001
2023-05-21 13:12:16,177 - [INFO] - [E:11| 900]: Train Loss:0.0033398, Val MRR:0.08254, ice00001
2023-05-21 13:13:15,009 - [INFO] - [Epoch:11]: Training Loss:0.003339
2023-05-21 13:13:15,397 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 13:13:37,675 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 13:13:59,912 - [INFO] - [Evaluating Epoch 11 valid]:
MRR: Tail : 0.09543, Head : 0.06931, Avg : 0.08237
2023-05-21 13:13:59,912 - [INFO] - [Epoch 11]: Training Loss: 0.0033394, Valid MRR: 0.08254,
2023-05-21 13:14:00,856 - [INFO] - [E:12| 0]: Train Loss:0.0033223, Val MRR:0.08254, ice00001
2023-05-21 13:15:39,420 - [INFO] - [E:12| 100]: Train Loss:0.0033282, Val MRR:0.08254, ice00001
2023-05-21 13:17:17,386 - [INFO] - [E:12| 200]: Train Loss:0.0033261, Val MRR:0.08254, ice00001
2023-05-21 13:18:55,626 - [INFO] - [E:12| 300]: Train Loss:0.0033229, Val MRR:0.08254, ice00001
2023-05-21 13:20:33,919 - [INFO] - [E:12| 400]: Train Loss:0.0033222, Val MRR:0.08254, ice00001
2023-05-21 13:22:02,122 - [INFO] - [E:12| 500]: Train Loss:0.0033217, Val MRR:0.08254, ice00001
2023-05-21 13:23:40,409 - [INFO] - [E:12| 600]: Train Loss:0.0033209, Val MRR:0.08254, ice00001
2023-05-21 13:25:22,010 - [INFO] - [E:12| 700]: Train Loss:0.00332, Val MRR:0.08254, ice00001
2023-05-21 13:26:59,198 - [INFO] - [E:12| 800]: Train Loss:0.0033187, Val MRR:0.08254, ice00001
2023-05-21 13:28:37,246 - [INFO] - [E:12| 900]: Train Loss:0.0033174, Val MRR:0.08254, ice00001
2023-05-21 13:29:45,812 - [INFO] - [Epoch:12]: Training Loss:0.003317
2023-05-21 13:29:46,131 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 13:30:03,856 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 13:30:19,212 - [INFO] - [Evaluating Epoch 12 valid]:
MRR: Tail : 0.09463, Head : 0.06961, Avg : 0.08212
2023-05-21 13:30:19,212 - [INFO] - [Epoch 12]: Training Loss: 0.003317, Valid MRR: 0.08254,
2023-05-21 13:30:20,012 - [INFO] - [E:13| 0]: Train Loss:0.0032687, Val MRR:0.08254, ice00001
2023-05-21 13:31:57,064 - [INFO] - [E:13| 100]: Train Loss:0.0033006, Val MRR:0.08254, ice00001
2023-05-21 13:33:34,394 - [INFO] - [E:13| 200]: Train Loss:0.0033028, Val MRR:0.08254, ice00001
2023-05-21 13:35:11,753 - [INFO] - [E:13| 300]: Train Loss:0.0033027, Val MRR:0.08254, ice00001
2023-05-21 13:36:49,282 - [INFO] - [E:13| 400]: Train Loss:0.0033032, Val MRR:0.08254, ice00001
2023-05-21 13:38:27,089 - [INFO] - [E:13| 500]: Train Loss:0.0033019, Val MRR:0.08254, ice00001
2023-05-21 13:39:53,345 - [INFO] - [E:13| 600]: Train Loss:0.0033001, Val MRR:0.08254, ice00001
2023-05-21 13:41:30,035 - [INFO] - [E:13| 700]: Train Loss:0.0032986, Val MRR:0.08254, ice00001
2023-05-21 13:43:07,873 - [INFO] - [E:13| 800]: Train Loss:0.0032979, Val MRR:0.08254, ice00001
2023-05-21 13:44:45,695 - [INFO] - [E:13| 900]: Train Loss:0.0032974, Val MRR:0.08254, ice00001
2023-05-21 13:45:53,682 - [INFO] - [Epoch:13]: Training Loss:0.003297
2023-05-21 13:45:53,930 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 13:46:16,527 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 13:46:38,926 - [INFO] - [Evaluating Epoch 13 valid]:
MRR: Tail : 0.09549, Head : 0.06896, Avg : 0.08223
2023-05-21 13:46:38,926 - [INFO] - [Epoch 13]: Training Loss: 0.0032971, Valid MRR: 0.08254,
2023-05-21 13:46:40,061 - [INFO] - [E:14| 0]: Train Loss:0.0032637, Val MRR:0.08254, ice00001
2023-05-21 13:48:07,301 - [INFO] - [E:14| 100]: Train Loss:0.0032856, Val MRR:0.08254, ice00001
2023-05-21 13:49:44,929 - [INFO] - [E:14| 200]: Train Loss:0.0032864, Val MRR:0.08254, ice00001
2023-05-21 13:51:22,722 - [INFO] - [E:14| 300]: Train Loss:0.0032856, Val MRR:0.08254, ice00001
2023-05-21 13:53:00,317 - [INFO] - [E:14| 400]: Train Loss:0.0032844, Val MRR:0.08254, ice00001
2023-05-21 13:54:36,502 - [INFO] - [E:14| 500]: Train Loss:0.0032841, Val MRR:0.08254, ice00001
2023-05-21 13:56:06,964 - [INFO] - [E:14| 600]: Train Loss:0.0032819, Val MRR:0.08254, ice00001
2023-05-21 13:57:39,842 - [INFO] - [E:14| 700]: Train Loss:0.0032802, Val MRR:0.08254, ice00001
2023-05-21 13:59:16,916 - [INFO] - [E:14| 800]: Train Loss:0.003279, Val MRR:0.08254, ice00001
2023-05-21 14:00:54,760 - [INFO] - [E:14| 900]: Train Loss:0.0032777, Val MRR:0.08254, ice00001
2023-05-21 14:02:03,440 - [INFO] - [Epoch:14]: Training Loss:0.003278
2023-05-21 14:02:03,764 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 14:02:26,415 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 14:02:48,384 - [INFO] - [Evaluating Epoch 14 valid]:
MRR: Tail : 0.09339, Head : 0.07015, Avg : 0.08177
2023-05-21 14:02:48,384 - [INFO] - [Epoch 14]: Training Loss: 0.0032776, Valid MRR: 0.08254,
2023-05-21 14:02:49,460 - [INFO] - [E:15| 0]: Train Loss:0.0032531, Val MRR:0.08254, ice00001
2023-05-21 14:04:27,108 - [INFO] - [E:15| 100]: Train Loss:0.0032686, Val MRR:0.08254, ice00001
2023-05-21 14:05:54,219 - [INFO] - [E:15| 200]: Train Loss:0.0032687, Val MRR:0.08254, ice00001
2023-05-21 14:07:32,258 - [INFO] - [E:15| 300]: Train Loss:0.0032686, Val MRR:0.08254, ice00001
2023-05-21 14:09:09,197 - [INFO] - [E:15| 400]: Train Loss:0.0032658, Val MRR:0.08254, ice00001
2023-05-21 14:10:45,766 - [INFO] - [E:15| 500]: Train Loss:0.0032636, Val MRR:0.08254, ice00001
2023-05-21 14:12:22,795 - [INFO] - [E:15| 600]: Train Loss:0.0032624, Val MRR:0.08254, ice00001
2023-05-21 14:13:53,133 - [INFO] - [E:15| 700]: Train Loss:0.0032614, Val MRR:0.08254, ice00001
2023-05-21 14:15:30,256 - [INFO] - [E:15| 800]: Train Loss:0.0032596, Val MRR:0.08254, ice00001
2023-05-21 14:17:06,892 - [INFO] - [E:15| 900]: Train Loss:0.0032587, Val MRR:0.08254, ice00001
2023-05-21 14:18:15,410 - [INFO] - [Epoch:15]: Training Loss:0.003258
2023-05-21 14:18:15,900 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 14:18:38,460 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 14:19:00,182 - [INFO] - [Evaluating Epoch 15 valid]:
MRR: Tail : 0.09201, Head : 0.06502, Avg : 0.07852
2023-05-21 14:19:00,183 - [INFO] - [Epoch 15]: Training Loss: 0.003258, Valid MRR: 0.08254,
2023-05-21 14:19:01,195 - [INFO] - [E:16| 0]: Train Loss:0.0032393, Val MRR:0.08254, ice00001
2023-05-21 14:20:38,549 - [INFO] - [E:16| 100]: Train Loss:0.0032428, Val MRR:0.08254, ice00001
2023-05-21 14:22:15,499 - [INFO] - [E:16| 200]: Train Loss:0.0032447, Val MRR:0.08254, ice00001
2023-05-21 14:23:41,717 - [INFO] - [E:16| 300]: Train Loss:0.0032435, Val MRR:0.08254, ice00001
2023-05-21 14:25:18,118 - [INFO] - [E:16| 400]: Train Loss:0.00324, Val MRR:0.08254, ice00001
2023-05-21 14:26:54,929 - [INFO] - [E:16| 500]: Train Loss:0.0032372, Val MRR:0.08254, ice00001
2023-05-21 14:28:32,536 - [INFO] - [E:16| 600]: Train Loss:0.0032335, Val MRR:0.08254, ice00001
2023-05-21 14:30:09,704 - [INFO] - [E:16| 700]: Train Loss:0.0032298, Val MRR:0.08254, ice00001
2023-05-21 14:31:37,070 - [INFO] - [E:16| 800]: Train Loss:0.0032249, Val MRR:0.08254, ice00001
2023-05-21 14:33:14,520 - [INFO] - [E:16| 900]: Train Loss:0.0032189, Val MRR:0.08254, ice00001
2023-05-21 14:34:23,163 - [INFO] - [Epoch:16]: Training Loss:0.003214
2023-05-21 14:34:23,475 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 14:34:45,871 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 14:35:07,768 - [INFO] - [Evaluating Epoch 16 valid]:
MRR: Tail : 0.0953, Head : 0.06896, Avg : 0.08213
2023-05-21 14:35:07,768 - [INFO] - [Epoch 16]: Training Loss: 0.0032141, Valid MRR: 0.08254,
2023-05-21 14:35:08,541 - [INFO] - [E:17| 0]: Train Loss:0.0031859, Val MRR:0.08254, ice00001
2023-05-21 14:36:46,415 - [INFO] - [E:17| 100]: Train Loss:0.0031358, Val MRR:0.08254, ice00001
2023-05-21 14:38:22,746 - [INFO] - [E:17| 200]: Train Loss:0.0031304, Val MRR:0.08254, ice00001
2023-05-21 14:39:51,751 - [INFO] - [E:17| 300]: Train Loss:0.0031197, Val MRR:0.08254, ice00001
2023-05-21 14:41:27,210 - [INFO] - [E:17| 400]: Train Loss:0.003106, Val MRR:0.08254, ice00001
2023-05-21 14:43:05,126 - [INFO] - [E:17| 500]: Train Loss:0.003095, Val MRR:0.08254, ice00001
2023-05-21 14:44:42,121 - [INFO] - [E:17| 600]: Train Loss:0.0030859, Val MRR:0.08254, ice00001
2023-05-21 14:46:20,110 - [INFO] - [E:17| 700]: Train Loss:0.0030778, Val MRR:0.08254, ice00001
2023-05-21 14:47:58,328 - [INFO] - [E:17| 800]: Train Loss:0.0030697, Val MRR:0.08254, ice00001
2023-05-21 14:49:25,730 - [INFO] - [E:17| 900]: Train Loss:0.0030638, Val MRR:0.08254, ice00001
2023-05-21 14:50:34,189 - [INFO] - [Epoch:17]: Training Loss:0.003059
2023-05-21 14:50:34,633 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 14:50:57,145 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 14:51:18,886 - [INFO] - [Evaluating Epoch 17 valid]:
MRR: Tail : 0.09432, Head : 0.06985, Avg : 0.08209
2023-05-21 14:51:18,886 - [INFO] - [Epoch 17]: Training Loss: 0.0030591, Valid MRR: 0.08254,
2023-05-21 14:51:19,902 - [INFO] - [E:18| 0]: Train Loss:0.0029766, Val MRR:0.08254, ice00001
2023-05-21 14:52:56,407 - [INFO] - [E:18| 100]: Train Loss:0.0029949, Val MRR:0.08254, ice00001
2023-05-21 14:54:33,715 - [INFO] - [E:18| 200]: Train Loss:0.0029904, Val MRR:0.08254, ice00001
2023-05-21 14:56:10,725 - [INFO] - [E:18| 300]: Train Loss:0.0029864, Val MRR:0.08254, ice00001
2023-05-21 14:57:38,073 - [INFO] - [E:18| 400]: Train Loss:0.0029837, Val MRR:0.08254, ice00001
2023-05-21 14:59:16,083 - [INFO] - [E:18| 500]: Train Loss:0.0029796, Val MRR:0.08254, ice00001
2023-05-21 15:00:53,464 - [INFO] - [E:18| 600]: Train Loss:0.0029768, Val MRR:0.08254, ice00001
2023-05-21 15:02:34,235 - [INFO] - [E:18| 700]: Train Loss:0.0029743, Val MRR:0.08254, ice00001
2023-05-21 15:04:11,774 - [INFO] - [E:18| 800]: Train Loss:0.0029723, Val MRR:0.08254, ice00001
2023-05-21 15:05:50,082 - [INFO] - [E:18| 900]: Train Loss:0.0029697, Val MRR:0.08254, ice00001
2023-05-21 15:06:47,837 - [INFO] - [Epoch:18]: Training Loss:0.002968
2023-05-21 15:06:48,198 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 15:07:10,213 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 15:07:32,242 - [INFO] - [Evaluating Epoch 18 valid]:
MRR: Tail : 0.09511, Head : 0.06947, Avg : 0.08229
2023-05-21 15:07:32,242 - [INFO] - [Epoch 18]: Training Loss: 0.0029676, Valid MRR: 0.08254,
2023-05-21 15:07:33,142 - [INFO] - [E:19| 0]: Train Loss:0.0029467, Val MRR:0.08254, ice00001
2023-05-21 15:09:10,026 - [INFO] - [E:19| 100]: Train Loss:0.0029418, Val MRR:0.08254, ice00001
2023-05-21 15:10:47,338 - [INFO] - [E:19| 200]: Train Loss:0.0029384, Val MRR:0.08254, ice00001
2023-05-21 15:12:25,506 - [INFO] - [E:19| 300]: Train Loss:0.0029368, Val MRR:0.08254, ice00001
2023-05-21 15:14:03,535 - [INFO] - [E:19| 400]: Train Loss:0.0029352, Val MRR:0.08254, ice00001
2023-05-21 15:15:30,683 - [INFO] - [E:19| 500]: Train Loss:0.0029333, Val MRR:0.08254, ice00001
2023-05-21 15:17:08,264 - [INFO] - [E:19| 600]: Train Loss:0.0029324, Val MRR:0.08254, ice00001
2023-05-21 15:18:45,762 - [INFO] - [E:19| 700]: Train Loss:0.0029306, Val MRR:0.08254, ice00001
2023-05-21 15:20:23,314 - [INFO] - [E:19| 800]: Train Loss:0.002929, Val MRR:0.08254, ice00001
2023-05-21 15:22:00,170 - [INFO] - [E:19| 900]: Train Loss:0.0029268, Val MRR:0.08254, ice00001
2023-05-21 15:23:08,462 - [INFO] - [Epoch:19]: Training Loss:0.002926
2023-05-21 15:23:08,720 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 15:23:25,971 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 15:23:41,517 - [INFO] - [Evaluating Epoch 19 valid]:
MRR: Tail : 0.09501, Head : 0.06922, Avg : 0.08211
MR: Tail : 876.26, Head : 1038.9, Avg : 957.57
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.10001, Head : 0.06263, Avg : 0.08132
Hit-10: Tail : 0.21544, Head : 0.15463, Avg : 0.18503
2023-05-21 15:23:41,517 - [INFO] - [Epoch 19]: Training Loss: 0.0029255, Valid MRR: 0.08254,
2023-05-21 15:23:42,572 - [INFO] - [E:20| 0]: Train Loss:0.0028917, Val MRR:0.08254, ice00001
2023-05-21 15:25:20,166 - [INFO] - [E:20| 100]: Train Loss:0.0029105, Val MRR:0.08254, ice00001
2023-05-21 15:26:57,884 - [INFO] - [E:20| 200]: Train Loss:0.0029121, Val MRR:0.08254, ice00001
2023-05-21 15:28:35,602 - [INFO] - [E:20| 300]: Train Loss:0.0029114, Val MRR:0.08254, ice00001
2023-05-21 15:30:12,819 - [INFO] - [E:20| 400]: Train Loss:0.0029099, Val MRR:0.08254, ice00001
2023-05-21 15:31:50,186 - [INFO] - [E:20| 500]: Train Loss:0.0029097, Val MRR:0.08254, ice00001
2023-05-21 15:33:17,384 - [INFO] - [E:20| 600]: Train Loss:0.0029086, Val MRR:0.08254, ice00001
2023-05-21 15:34:54,823 - [INFO] - [E:20| 700]: Train Loss:0.0029082, Val MRR:0.08254, ice00001
2023-05-21 15:36:31,037 - [INFO] - [E:20| 800]: Train Loss:0.002908, Val MRR:0.08254, ice00001
2023-05-21 15:38:08,352 - [INFO] - [E:20| 900]: Train Loss:0.0029076, Val MRR:0.08254, ice00001
2023-05-21 15:39:17,153 - [INFO] - [Epoch:20]: Training Loss:0.002907
2023-05-21 15:39:17,532 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 15:39:40,230 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 15:40:02,535 - [INFO] - [Evaluating Epoch 20 valid]:
MRR: Tail : 0.09516, Head : 0.06952, Avg : 0.08234
2023-05-21 15:40:02,536 - [INFO] - [Epoch 20]: Training Loss: 0.0029072, Valid MRR: 0.08254,
2023-05-21 15:40:03,676 - [INFO] - [E:21| 0]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-21 15:41:31,198 - [INFO] - [E:21| 100]: Train Loss:0.0028943, Val MRR:0.08254, ice00001
2023-05-21 15:43:08,014 - [INFO] - [E:21| 200]: Train Loss:0.0028966, Val MRR:0.08254, ice00001
2023-05-21 15:44:46,317 - [INFO] - [E:21| 300]: Train Loss:0.0028958, Val MRR:0.08254, ice00001
2023-05-21 15:46:23,562 - [INFO] - [E:21| 400]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-21 15:48:01,193 - [INFO] - [E:21| 500]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-21 15:49:33,969 - [INFO] - [E:21| 600]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-21 15:51:06,906 - [INFO] - [E:21| 700]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-21 15:52:43,921 - [INFO] - [E:21| 800]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-21 15:54:21,757 - [INFO] - [E:21| 900]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-21 15:55:30,435 - [INFO] - [Epoch:21]: Training Loss:0.002902
2023-05-21 15:55:30,930 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 15:55:53,482 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 15:56:15,431 - [INFO] - [Evaluating Epoch 21 valid]:
MRR: Tail : 0.0955, Head : 0.06894, Avg : 0.08222
2023-05-21 15:56:15,431 - [INFO] - [Epoch 21]: Training Loss: 0.0029022, Valid MRR: 0.08254,
2023-05-21 15:56:16,441 - [INFO] - [E:22| 0]: Train Loss:0.0028676, Val MRR:0.08254, ice00001
2023-05-21 15:57:54,408 - [INFO] - [E:22| 100]: Train Loss:0.002906, Val MRR:0.08254, ice00001
2023-05-21 15:59:21,360 - [INFO] - [E:22| 200]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-21 16:00:59,149 - [INFO] - [E:22| 300]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-21 16:02:36,140 - [INFO] - [E:22| 400]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-21 16:04:13,763 - [INFO] - [E:22| 500]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-21 16:05:49,991 - [INFO] - [E:22| 600]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-21 16:07:17,036 - [INFO] - [E:22| 700]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-21 16:08:54,526 - [INFO] - [E:22| 800]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-21 16:10:32,547 - [INFO] - [E:22| 900]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-21 16:11:41,294 - [INFO] - [Epoch:22]: Training Loss:0.002902
2023-05-21 16:11:41,672 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 16:12:04,483 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 16:12:26,652 - [INFO] - [Evaluating Epoch 22 valid]:
MRR: Tail : 0.09475, Head : 0.06939, Avg : 0.08207
2023-05-21 16:12:26,652 - [INFO] - [Epoch 22]: Training Loss: 0.0029017, Valid MRR: 0.08254,
2023-05-21 16:12:27,793 - [INFO] - [E:23| 0]: Train Loss:0.0029466, Val MRR:0.08254, ice00001
2023-05-21 16:14:05,276 - [INFO] - [E:23| 100]: Train Loss:0.002905, Val MRR:0.08254, ice00001
2023-05-21 16:15:42,723 - [INFO] - [E:23| 200]: Train Loss:0.0029032, Val MRR:0.08254, ice00001
2023-05-21 16:17:10,001 - [INFO] - [E:23| 300]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-21 16:18:47,010 - [INFO] - [E:23| 400]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-21 16:20:23,142 - [INFO] - [E:23| 500]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-21 16:22:00,733 - [INFO] - [E:23| 600]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-21 16:23:37,705 - [INFO] - [E:23| 700]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-21 16:25:04,521 - [INFO] - [E:23| 800]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-21 16:26:42,152 - [INFO] - [E:23| 900]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-21 16:27:50,728 - [INFO] - [Epoch:23]: Training Loss:0.002901
2023-05-21 16:27:51,106 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 16:28:14,016 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 16:28:36,181 - [INFO] - [Evaluating Epoch 23 valid]:
MRR: Tail : 0.09357, Head : 0.06961, Avg : 0.08159
2023-05-21 16:28:36,181 - [INFO] - [Epoch 23]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-21 16:28:37,305 - [INFO] - [E:24| 0]: Train Loss:0.0030078, Val MRR:0.08254, ice00001
2023-05-21 16:30:14,967 - [INFO] - [E:24| 100]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-21 16:31:52,717 - [INFO] - [E:24| 200]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-21 16:33:22,121 - [INFO] - [E:24| 300]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-21 16:34:55,675 - [INFO] - [E:24| 400]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-21 16:36:33,477 - [INFO] - [E:24| 500]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-21 16:38:13,459 - [INFO] - [E:24| 600]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-21 16:39:51,536 - [INFO] - [E:24| 700]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-21 16:41:29,489 - [INFO] - [E:24| 800]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-21 16:42:56,229 - [INFO] - [E:24| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-21 16:44:04,489 - [INFO] - [Epoch:24]: Training Loss:0.002902
2023-05-21 16:44:04,942 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 16:44:27,883 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 16:44:50,068 - [INFO] - [Evaluating Epoch 24 valid]:
MRR: Tail : 0.09564, Head : 0.0691, Avg : 0.08237
2023-05-21 16:44:50,068 - [INFO] - [Epoch 24]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-21 16:44:50,990 - [INFO] - [E:25| 0]: Train Loss:0.0028768, Val MRR:0.08254, ice00001
2023-05-21 16:46:28,955 - [INFO] - [E:25| 100]: Train Loss:0.0029036, Val MRR:0.08254, ice00001
2023-05-21 16:48:06,048 - [INFO] - [E:25| 200]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-21 16:49:42,705 - [INFO] - [E:25| 300]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-21 16:51:09,552 - [INFO] - [E:25| 400]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-21 16:52:47,068 - [INFO] - [E:25| 500]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-21 16:54:24,714 - [INFO] - [E:25| 600]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-21 16:56:02,266 - [INFO] - [E:25| 700]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-21 16:57:40,237 - [INFO] - [E:25| 800]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-21 16:59:17,706 - [INFO] - [E:25| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-21 17:00:16,094 - [INFO] - [Epoch:25]: Training Loss:0.002901
2023-05-21 17:00:16,596 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 17:00:39,166 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 17:01:01,374 - [INFO] - [Evaluating Epoch 25 valid]:
MRR: Tail : 0.09421, Head : 0.06878, Avg : 0.0815
2023-05-21 17:01:01,374 - [INFO] - [Epoch 25]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-21 17:01:02,148 - [INFO] - [E:26| 0]: Train Loss:0.0028624, Val MRR:0.08254, ice00001
2023-05-21 17:02:39,222 - [INFO] - [E:26| 100]: Train Loss:0.0028994, Val MRR:0.08254, ice00001
2023-05-21 17:04:17,459 - [INFO] - [E:26| 200]: Train Loss:0.0028995, Val MRR:0.08254, ice00001
2023-05-21 17:05:55,499 - [INFO] - [E:26| 300]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-21 17:07:34,123 - [INFO] - [E:26| 400]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-21 17:09:01,203 - [INFO] - [E:26| 500]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-21 17:10:39,083 - [INFO] - [E:26| 600]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-21 17:12:17,118 - [INFO] - [E:26| 700]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-21 17:13:54,787 - [INFO] - [E:26| 800]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-21 17:15:32,304 - [INFO] - [E:26| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-21 17:16:40,568 - [INFO] - [Epoch:26]: Training Loss:0.002901
2023-05-21 17:16:40,902 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 17:16:56,221 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 17:17:13,262 - [INFO] - [Evaluating Epoch 26 valid]:
MRR: Tail : 0.09302, Head : 0.06981, Avg : 0.08141
2023-05-21 17:17:13,262 - [INFO] - [Epoch 26]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-21 17:17:14,094 - [INFO] - [E:27| 0]: Train Loss:0.002913, Val MRR:0.08254, ice00001
2023-05-21 17:18:51,712 - [INFO] - [E:27| 100]: Train Loss:0.0029035, Val MRR:0.08254, ice00001
2023-05-21 17:20:30,112 - [INFO] - [E:27| 200]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-21 17:22:08,170 - [INFO] - [E:27| 300]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-21 17:23:45,764 - [INFO] - [E:27| 400]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-21 17:25:22,984 - [INFO] - [E:27| 500]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-21 17:26:53,463 - [INFO] - [E:27| 600]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-21 17:28:31,227 - [INFO] - [E:27| 700]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-21 17:30:08,814 - [INFO] - [E:27| 800]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-21 17:31:45,161 - [INFO] - [E:27| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-21 17:32:52,804 - [INFO] - [Epoch:27]: Training Loss:0.002902
2023-05-21 17:32:53,159 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 17:33:15,750 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 17:33:37,748 - [INFO] - [Evaluating Epoch 27 valid]:
MRR: Tail : 0.09524, Head : 0.06918, Avg : 0.08221
2023-05-21 17:33:37,748 - [INFO] - [Epoch 27]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-21 17:33:38,886 - [INFO] - [E:28| 0]: Train Loss:0.0028744, Val MRR:0.08254, ice00001
2023-05-21 17:35:05,625 - [INFO] - [E:28| 100]: Train Loss:0.0028962, Val MRR:0.08254, ice00001
2023-05-21 17:36:43,530 - [INFO] - [E:28| 200]: Train Loss:0.0028957, Val MRR:0.08254, ice00001
2023-05-21 17:38:21,744 - [INFO] - [E:28| 300]: Train Loss:0.0028967, Val MRR:0.08254, ice00001
2023-05-21 17:39:58,900 - [INFO] - [E:28| 400]: Train Loss:0.0028982, Val MRR:0.08254, ice00001
2023-05-21 17:41:36,048 - [INFO] - [E:28| 500]: Train Loss:0.0028992, Val MRR:0.08254, ice00001
2023-05-21 17:43:08,699 - [INFO] - [E:28| 600]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-21 17:44:41,247 - [INFO] - [E:28| 700]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-21 17:46:17,854 - [INFO] - [E:28| 800]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-21 17:47:54,606 - [INFO] - [E:28| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-21 17:49:03,571 - [INFO] - [Epoch:28]: Training Loss:0.002901
2023-05-21 17:49:03,839 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 17:49:26,284 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 17:49:48,806 - [INFO] - [Evaluating Epoch 28 valid]:
MRR: Tail : 0.09487, Head : 0.06903, Avg : 0.08195
2023-05-21 17:49:48,806 - [INFO] - [Epoch 28]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-21 17:49:49,932 - [INFO] - [E:29| 0]: Train Loss:0.0029188, Val MRR:0.08254, ice00001
2023-05-21 17:51:27,630 - [INFO] - [E:29| 100]: Train Loss:0.0028995, Val MRR:0.08254, ice00001
2023-05-21 17:52:55,349 - [INFO] - [E:29| 200]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-21 17:54:32,755 - [INFO] - [E:29| 300]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-21 17:56:10,163 - [INFO] - [E:29| 400]: Train Loss:0.0029039, Val MRR:0.08254, ice00001
2023-05-21 17:57:47,884 - [INFO] - [E:29| 500]: Train Loss:0.0029031, Val MRR:0.08254, ice00001
2023-05-21 17:59:25,272 - [INFO] - [E:29| 600]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-21 18:00:51,501 - [INFO] - [E:29| 700]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-21 18:02:28,732 - [INFO] - [E:29| 800]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-21 18:04:06,575 - [INFO] - [E:29| 900]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-21 18:05:15,259 - [INFO] - [Epoch:29]: Training Loss:0.002902
2023-05-21 18:05:15,595 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 18:05:38,412 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 18:06:00,335 - [INFO] - [Evaluating Epoch 29 valid]:
MRR: Tail : 0.09311, Head : 0.06965, Avg : 0.08138
MR: Tail : 834.87, Head : 983.42, Avg : 909.15
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.08181, Head : 0.06506, Avg : 0.07343
Hit-10: Tail : 0.21022, Head : 0.15706, Avg : 0.18364
2023-05-21 18:06:00,335 - [INFO] - [Epoch 29]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-21 18:06:01,377 - [INFO] - [E:30| 0]: Train Loss:0.0029383, Val MRR:0.08254, ice00001
2023-05-21 18:07:39,416 - [INFO] - [E:30| 100]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-21 18:09:12,402 - [INFO] - [E:30| 200]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-21 18:10:44,050 - [INFO] - [E:30| 300]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-21 18:12:21,406 - [INFO] - [E:30| 400]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-21 18:13:58,764 - [INFO] - [E:30| 500]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-21 18:15:37,262 - [INFO] - [E:30| 600]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-21 18:17:15,141 - [INFO] - [E:30| 700]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-21 18:18:42,184 - [INFO] - [E:30| 800]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-21 18:20:20,118 - [INFO] - [E:30| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-21 18:21:28,434 - [INFO] - [Epoch:30]: Training Loss:0.002901
2023-05-21 18:21:28,812 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 18:21:51,392 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 18:22:13,261 - [INFO] - [Evaluating Epoch 30 valid]:
MRR: Tail : 0.09253, Head : 0.07025, Avg : 0.08139
2023-05-21 18:22:13,261 - [INFO] - [Epoch 30]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-21 18:22:14,403 - [INFO] - [E:31| 0]: Train Loss:0.0029163, Val MRR:0.08254, ice00001
2023-05-21 18:23:51,171 - [INFO] - [E:31| 100]: Train Loss:0.0028996, Val MRR:0.08254, ice00001
2023-05-21 18:25:29,438 - [INFO] - [E:31| 200]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-21 18:26:56,223 - [INFO] - [E:31| 300]: Train Loss:0.0028991, Val MRR:0.08254, ice00001
2023-05-21 18:28:33,188 - [INFO] - [E:31| 400]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-21 18:30:09,953 - [INFO] - [E:31| 500]: Train Loss:0.0028995, Val MRR:0.08254, ice00001
2023-05-21 18:31:47,033 - [INFO] - [E:31| 600]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-21 18:33:24,313 - [INFO] - [E:31| 700]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-21 18:35:02,455 - [INFO] - [E:31| 800]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-21 18:36:30,394 - [INFO] - [E:31| 900]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-21 18:37:39,058 - [INFO] - [Epoch:31]: Training Loss:0.002901
2023-05-21 18:37:39,553 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 18:38:02,098 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 18:38:24,019 - [INFO] - [Evaluating Epoch 31 valid]:
MRR: Tail : 0.09474, Head : 0.06924, Avg : 0.08199
2023-05-21 18:38:24,020 - [INFO] - [Epoch 31]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-21 18:38:25,079 - [INFO] - [E:32| 0]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-21 18:40:02,812 - [INFO] - [E:32| 100]: Train Loss:0.002903, Val MRR:0.08254, ice00001
2023-05-21 18:41:40,029 - [INFO] - [E:32| 200]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-21 18:43:17,110 - [INFO] - [E:32| 300]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-21 18:44:43,532 - [INFO] - [E:32| 400]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-21 18:46:21,260 - [INFO] - [E:32| 500]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-21 18:47:58,764 - [INFO] - [E:32| 600]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-21 18:49:36,749 - [INFO] - [E:32| 700]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-21 18:51:14,479 - [INFO] - [E:32| 800]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-21 18:52:48,436 - [INFO] - [E:32| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-21 18:53:51,067 - [INFO] - [Epoch:32]: Training Loss:0.002901
2023-05-21 18:53:51,571 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 18:54:13,946 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 18:54:36,292 - [INFO] - [Evaluating Epoch 32 valid]:
MRR: Tail : 0.0949, Head : 0.06939, Avg : 0.08215
2023-05-21 18:54:36,292 - [INFO] - [Epoch 32]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-21 18:54:37,099 - [INFO] - [E:33| 0]: Train Loss:0.0029128, Val MRR:0.08254, ice00001
2023-05-21 18:56:14,830 - [INFO] - [E:33| 100]: Train Loss:0.0029051, Val MRR:0.08254, ice00001
2023-05-21 18:57:52,327 - [INFO] - [E:33| 200]: Train Loss:0.0029029, Val MRR:0.08254, ice00001
2023-05-21 18:59:28,977 - [INFO] - [E:33| 300]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-21 19:01:06,781 - [INFO] - [E:33| 400]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-21 19:02:35,963 - [INFO] - [E:33| 500]: Train Loss:0.0029033, Val MRR:0.08254, ice00001
2023-05-21 19:04:13,501 - [INFO] - [E:33| 600]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-21 19:05:51,047 - [INFO] - [E:33| 700]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-21 19:07:29,147 - [INFO] - [E:33| 800]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-21 19:09:06,273 - [INFO] - [E:33| 900]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-21 19:10:14,437 - [INFO] - [Epoch:33]: Training Loss:0.002902
2023-05-21 19:10:14,646 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 19:10:27,957 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 19:10:49,238 - [INFO] - [Evaluating Epoch 33 valid]:
MRR: Tail : 0.09495, Head : 0.06898, Avg : 0.08197
2023-05-21 19:10:49,238 - [INFO] - [Epoch 33]: Training Loss: 0.0029017, Valid MRR: 0.08254,
2023-05-21 19:10:50,033 - [INFO] - [E:34| 0]: Train Loss:0.0029805, Val MRR:0.08254, ice00001
2023-05-21 19:12:27,633 - [INFO] - [E:34| 100]: Train Loss:0.0028954, Val MRR:0.08254, ice00001
2023-05-21 19:14:04,678 - [INFO] - [E:34| 200]: Train Loss:0.0028986, Val MRR:0.08254, ice00001
2023-05-21 19:15:42,655 - [INFO] - [E:34| 300]: Train Loss:0.0028997, Val MRR:0.08254, ice00001
2023-05-21 19:17:20,033 - [INFO] - [E:34| 400]: Train Loss:0.0028996, Val MRR:0.08254, ice00001
2023-05-21 19:18:54,804 - [INFO] - [E:34| 500]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-21 19:20:25,630 - [INFO] - [E:34| 600]: Train Loss:0.0029003, Val MRR:0.08254, ice00001
2023-05-21 19:22:03,876 - [INFO] - [E:34| 700]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-21 19:23:41,523 - [INFO] - [E:34| 800]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-21 19:25:19,608 - [INFO] - [E:34| 900]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-21 19:26:28,524 - [INFO] - [Epoch:34]: Training Loss:0.002902
2023-05-21 19:26:28,860 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 19:26:50,711 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 19:27:12,781 - [INFO] - [Evaluating Epoch 34 valid]:
MRR: Tail : 0.09317, Head : 0.06954, Avg : 0.08135
2023-05-21 19:27:12,782 - [INFO] - [Epoch 34]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-21 19:27:13,906 - [INFO] - [E:35| 0]: Train Loss:0.0028627, Val MRR:0.08254, ice00001
2023-05-21 19:28:40,592 - [INFO] - [E:35| 100]: Train Loss:0.0028976, Val MRR:0.08254, ice00001
2023-05-21 19:30:17,993 - [INFO] - [E:35| 200]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-21 19:31:55,872 - [INFO] - [E:35| 300]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-21 19:33:33,198 - [INFO] - [E:35| 400]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-21 19:35:10,852 - [INFO] - [E:35| 500]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-21 19:36:38,132 - [INFO] - [E:35| 600]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-21 19:38:15,357 - [INFO] - [E:35| 700]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-21 19:39:52,871 - [INFO] - [E:35| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-21 19:41:29,885 - [INFO] - [E:35| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-21 19:42:38,062 - [INFO] - [Epoch:35]: Training Loss:0.002901
2023-05-21 19:42:38,517 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 19:43:00,786 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 19:43:23,340 - [INFO] - [Evaluating Epoch 35 valid]:
MRR: Tail : 0.09582, Head : 0.06892, Avg : 0.08237
2023-05-21 19:43:23,340 - [INFO] - [Epoch 35]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-21 19:43:24,443 - [INFO] - [E:36| 0]: Train Loss:0.0030218, Val MRR:0.08254, ice00001
2023-05-21 19:44:57,852 - [INFO] - [E:36| 100]: Train Loss:0.0029062, Val MRR:0.08254, ice00001
2023-05-21 19:46:29,497 - [INFO] - [E:36| 200]: Train Loss:0.0029057, Val MRR:0.08254, ice00001
2023-05-21 19:48:06,714 - [INFO] - [E:36| 300]: Train Loss:0.0029046, Val MRR:0.08254, ice00001
2023-05-21 19:49:44,411 - [INFO] - [E:36| 400]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-21 19:51:24,435 - [INFO] - [E:36| 500]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-21 19:53:01,869 - [INFO] - [E:36| 600]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-21 19:54:29,284 - [INFO] - [E:36| 700]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-21 19:56:06,153 - [INFO] - [E:36| 800]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-21 19:57:43,226 - [INFO] - [E:36| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-21 19:58:52,024 - [INFO] - [Epoch:36]: Training Loss:0.002902
2023-05-21 19:58:52,525 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 19:59:15,174 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 19:59:37,240 - [INFO] - [Evaluating Epoch 36 valid]:
MRR: Tail : 0.09537, Head : 0.06908, Avg : 0.08223
2023-05-21 19:59:37,240 - [INFO] - [Epoch 36]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-21 19:59:38,260 - [INFO] - [E:37| 0]: Train Loss:0.0028665, Val MRR:0.08254, ice00001
2023-05-21 20:01:15,936 - [INFO] - [E:37| 100]: Train Loss:0.0028992, Val MRR:0.08254, ice00001
2023-05-21 20:02:43,193 - [INFO] - [E:37| 200]: Train Loss:0.0028998, Val MRR:0.08254, ice00001
2023-05-21 20:04:20,440 - [INFO] - [E:37| 300]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-21 20:05:58,441 - [INFO] - [E:37| 400]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-21 20:07:36,385 - [INFO] - [E:37| 500]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-21 20:09:13,456 - [INFO] - [E:37| 600]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-21 20:10:49,902 - [INFO] - [E:37| 700]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-21 20:12:16,694 - [INFO] - [E:37| 800]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-21 20:13:54,672 - [INFO] - [E:37| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-21 20:15:03,353 - [INFO] - [Epoch:37]: Training Loss:0.002902
2023-05-21 20:15:03,686 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 20:15:26,516 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 20:15:48,641 - [INFO] - [Evaluating Epoch 37 valid]:
MRR: Tail : 0.09435, Head : 0.06937, Avg : 0.08186
2023-05-21 20:15:48,641 - [INFO] - [Epoch 37]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-21 20:15:49,696 - [INFO] - [E:38| 0]: Train Loss:0.0029496, Val MRR:0.08254, ice00001
2023-05-21 20:17:26,382 - [INFO] - [E:38| 100]: Train Loss:0.0029049, Val MRR:0.08254, ice00001
2023-05-21 20:19:03,781 - [INFO] - [E:38| 200]: Train Loss:0.0029035, Val MRR:0.08254, ice00001
2023-05-21 20:20:31,142 - [INFO] - [E:38| 300]: Train Loss:0.0029046, Val MRR:0.08254, ice00001
2023-05-21 20:22:08,753 - [INFO] - [E:38| 400]: Train Loss:0.0029039, Val MRR:0.08254, ice00001
2023-05-21 20:23:46,036 - [INFO] - [E:38| 500]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-21 20:25:22,237 - [INFO] - [E:38| 600]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-21 20:26:59,719 - [INFO] - [E:38| 700]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-21 20:28:35,137 - [INFO] - [E:38| 800]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-21 20:30:04,872 - [INFO] - [E:38| 900]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-21 20:31:13,715 - [INFO] - [Epoch:38]: Training Loss:0.002902
2023-05-21 20:31:14,064 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 20:31:36,889 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 20:31:58,883 - [INFO] - [Evaluating Epoch 38 valid]:
MRR: Tail : 0.09472, Head : 0.06953, Avg : 0.08212
2023-05-21 20:31:58,883 - [INFO] - [Epoch 38]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-21 20:31:59,919 - [INFO] - [E:39| 0]: Train Loss:0.002941, Val MRR:0.08254, ice00001
2023-05-21 20:33:37,142 - [INFO] - [E:39| 100]: Train Loss:0.0028984, Val MRR:0.08254, ice00001
2023-05-21 20:35:14,717 - [INFO] - [E:39| 200]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-21 20:36:52,565 - [INFO] - [E:39| 300]: Train Loss:0.0029037, Val MRR:0.08254, ice00001
2023-05-21 20:38:19,649 - [INFO] - [E:39| 400]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-21 20:39:58,421 - [INFO] - [E:39| 500]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-21 20:41:36,862 - [INFO] - [E:39| 600]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-21 20:43:14,635 - [INFO] - [E:39| 700]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-21 20:44:52,362 - [INFO] - [E:39| 800]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-21 20:46:20,152 - [INFO] - [E:39| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-21 20:47:27,964 - [INFO] - [Epoch:39]: Training Loss:0.002901
2023-05-21 20:47:28,215 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 20:47:50,565 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 20:48:12,833 - [INFO] - [Evaluating Epoch 39 valid]:
MRR: Tail : 0.09427, Head : 0.0695, Avg : 0.08188
MR: Tail : 828.91, Head : 1009.2, Avg : 919.04
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.09868, Head : 0.06202, Avg : 0.08035
Hit-10: Tail : 0.20002, Head : 0.15949, Avg : 0.17975
2023-05-21 20:48:12,833 - [INFO] - [Epoch 39]: Training Loss: 0.0029014, Valid MRR: 0.08254,
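# Note on the evaluation blocks above: MRR, MR and Hit@k are reported separately for
# tail and head prediction, with "Avg" as the mean of the two directions. A minimal
# sketch, assuming the usual rank-based link-prediction protocol; the function name
# rank_metrics and the sample rank arrays below are illustrative only, not taken from
# this run or its codebase.
import numpy as np

def rank_metrics(ranks):
    # ranks[i] is the rank of the true entity for query i (1 = best)
    ranks = np.asarray(ranks, dtype=float)
    return {
        "MRR": float(np.mean(1.0 / ranks)),
        "MR": float(np.mean(ranks)),
        "Hit-1": float(np.mean(ranks <= 1)),
        "Hit-3": float(np.mean(ranks <= 3)),
        "Hit-10": float(np.mean(ranks <= 10)),
    }

# hypothetical rank arrays standing in for the tail- and head-prediction queries
tail_ranks = [1, 4, 12, 700, 2500]
head_ranks = [2, 9, 30, 1500, 3000]

tail, head = rank_metrics(tail_ranks), rank_metrics(head_ranks)
for key in tail:
    avg = (tail[key] + head[key]) / 2.0  # the "Avg" column in the log
    print(f"{key}: Tail : {tail[key]:.5g}, Head : {head[key]:.5g}, Avg : {avg:.5g}")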
2023-05-21 20:48:13,952 - [INFO] - [E:40| 0]: Train Loss:0.0028476, Val MRR:0.08254, ice00001
2023-05-21 20:49:52,238 - [INFO] - [E:40| 100]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-21 20:51:30,047 - [INFO] - [E:40| 200]: Train Loss:0.0029059, Val MRR:0.08254, ice00001
2023-05-21 20:53:07,382 - [INFO] - [E:40| 300]: Train Loss:0.002904, Val MRR:0.08254, ice00001
2023-05-21 20:54:42,970 - [INFO] - [E:40| 400]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-21 20:56:11,108 - [INFO] - [E:40| 500]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-21 20:57:49,075 - [INFO] - [E:40| 600]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-21 20:59:26,990 - [INFO] - [E:40| 700]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-21 21:01:04,770 - [INFO] - [E:40| 800]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-21 21:02:41,922 - [INFO] - [E:40| 900]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-21 21:03:42,707 - [INFO] - [Epoch:40]: Training Loss:0.002902
2023-05-21 21:03:42,916 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 21:04:03,135 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 21:04:25,439 - [INFO] - [Evaluating Epoch 40 valid]:
MRR: Tail : 0.09494, Head : 0.06884, Avg : 0.08189
2023-05-21 21:04:25,439 - [INFO] - [Epoch 40]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-21 21:04:26,340 - [INFO] - [E:41| 0]: Train Loss:0.0029511, Val MRR:0.08254, ice00001
2023-05-21 21:06:04,033 - [INFO] - [E:41| 100]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-21 21:07:41,223 - [INFO] - [E:41| 200]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-21 21:09:17,882 - [INFO] - [E:41| 300]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-21 21:10:55,382 - [INFO] - [E:41| 400]: Train Loss:0.0028998, Val MRR:0.08254, ice00001
2023-05-21 21:12:22,059 - [INFO] - [E:41| 500]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-21 21:13:59,471 - [INFO] - [E:41| 600]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-21 21:15:37,018 - [INFO] - [E:41| 700]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-21 21:17:14,943 - [INFO] - [E:41| 800]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-21 21:18:52,614 - [INFO] - [E:41| 900]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-21 21:20:01,052 - [INFO] - [Epoch:41]: Training Loss:0.002901
2023-05-21 21:20:01,433 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 21:20:24,179 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 21:20:46,576 - [INFO] - [Evaluating Epoch 41 valid]:
MRR: Tail : 0.09463, Head : 0.06896, Avg : 0.08179
2023-05-21 21:20:46,576 - [INFO] - [Epoch 41]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-21 21:20:47,784 - [INFO] - [E:42| 0]: Train Loss:0.0029098, Val MRR:0.08254, ice00001
2023-05-21 21:22:14,252 - [INFO] - [E:42| 100]: Train Loss:0.0029038, Val MRR:0.08254, ice00001
2023-05-21 21:23:50,782 - [INFO] - [E:42| 200]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-21 21:25:28,413 - [INFO] - [E:42| 300]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-21 21:27:08,079 - [INFO] - [E:42| 400]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-21 21:28:45,813 - [INFO] - [E:42| 500]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-21 21:30:12,898 - [INFO] - [E:42| 600]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-21 21:31:50,625 - [INFO] - [E:42| 700]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-21 21:33:28,457 - [INFO] - [E:42| 800]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-21 21:35:06,301 - [INFO] - [E:42| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-21 21:36:15,205 - [INFO] - [Epoch:42]: Training Loss:0.002901
2023-05-21 21:36:15,457 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 21:36:37,748 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 21:36:59,858 - [INFO] - [Evaluating Epoch 42 valid]:
MRR: Tail : 0.09179, Head : 0.06491, Avg : 0.07835
2023-05-21 21:36:59,858 - [INFO] - [Epoch 42]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-21 21:37:00,798 - [INFO] - [E:43| 0]: Train Loss:0.0029293, Val MRR:0.08254, ice00001
2023-05-21 21:38:28,770 - [INFO] - [E:43| 100]: Train Loss:0.002899, Val MRR:0.08254, ice00001
2023-05-21 21:40:04,856 - [INFO] - [E:43| 200]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-21 21:41:42,993 - [INFO] - [E:43| 300]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-21 21:43:20,804 - [INFO] - [E:43| 400]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-21 21:44:58,685 - [INFO] - [E:43| 500]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-21 21:46:35,527 - [INFO] - [E:43| 600]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-21 21:48:03,069 - [INFO] - [E:43| 700]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-21 21:49:41,014 - [INFO] - [E:43| 800]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-21 21:51:18,088 - [INFO] - [E:43| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-21 21:52:25,963 - [INFO] - [Epoch:43]: Training Loss:0.002901
2023-05-21 21:52:26,297 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 21:52:48,666 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 21:53:10,890 - [INFO] - [Evaluating Epoch 43 valid]:
MRR: Tail : 0.09512, Head : 0.06938, Avg : 0.08225
2023-05-21 21:53:10,890 - [INFO] - [Epoch 43]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-21 21:53:12,001 - [INFO] - [E:44| 0]: Train Loss:0.0029032, Val MRR:0.08254, ice00001
2023-05-21 21:54:49,643 - [INFO] - [E:44| 100]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-21 21:56:17,344 - [INFO] - [E:44| 200]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-21 21:57:54,681 - [INFO] - [E:44| 300]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-21 21:59:32,675 - [INFO] - [E:44| 400]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-21 22:01:10,197 - [INFO] - [E:44| 500]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-21 22:02:47,599 - [INFO] - [E:44| 600]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-21 22:04:25,015 - [INFO] - [E:44| 700]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-21 22:05:51,698 - [INFO] - [E:44| 800]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-21 22:07:28,050 - [INFO] - [E:44| 900]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-21 22:08:36,470 - [INFO] - [Epoch:44]: Training Loss:0.002902
2023-05-21 22:08:36,848 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 22:08:59,366 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 22:09:21,270 - [INFO] - [Evaluating Epoch 44 valid]:
MRR: Tail : 0.09188, Head : 0.07011, Avg : 0.08099
2023-05-21 22:09:21,270 - [INFO] - [Epoch 44]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-21 22:09:22,392 - [INFO] - [E:45| 0]: Train Loss:0.0028829, Val MRR:0.08254, ice00001
2023-05-21 22:11:00,141 - [INFO] - [E:45| 100]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-21 22:12:37,609 - [INFO] - [E:45| 200]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-21 22:14:05,151 - [INFO] - [E:45| 300]: Train Loss:0.0028997, Val MRR:0.08254, ice00001
2023-05-21 22:15:44,872 - [INFO] - [E:45| 400]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-21 22:17:23,261 - [INFO] - [E:45| 500]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-21 22:19:00,959 - [INFO] - [E:45| 600]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-21 22:20:37,398 - [INFO] - [E:45| 700]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-21 22:22:04,289 - [INFO] - [E:45| 800]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-21 22:23:41,431 - [INFO] - [E:45| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-21 22:24:50,446 - [INFO] - [Epoch:45]: Training Loss:0.002902
2023-05-21 22:24:50,697 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 22:25:13,446 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 22:25:35,742 - [INFO] - [Evaluating Epoch 45 valid]:
MRR: Tail : 0.09496, Head : 0.06927, Avg : 0.08211
2023-05-21 22:25:35,742 - [INFO] - [Epoch 45]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-21 22:25:36,858 - [INFO] - [E:46| 0]: Train Loss:0.0028528, Val MRR:0.08254, ice00001
2023-05-21 22:27:14,947 - [INFO] - [E:46| 100]: Train Loss:0.0029029, Val MRR:0.08254, ice00001
2023-05-21 22:28:53,074 - [INFO] - [E:46| 200]: Train Loss:0.0029032, Val MRR:0.08254, ice00001
2023-05-21 22:30:31,541 - [INFO] - [E:46| 300]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-21 22:31:58,825 - [INFO] - [E:46| 400]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-21 22:33:36,560 - [INFO] - [E:46| 500]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-21 22:35:13,183 - [INFO] - [E:46| 600]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-21 22:36:50,442 - [INFO] - [E:46| 700]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-21 22:38:27,601 - [INFO] - [E:46| 800]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-21 22:39:54,987 - [INFO] - [E:46| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-21 22:41:03,746 - [INFO] - [Epoch:46]: Training Loss:0.002901
2023-05-21 22:41:04,208 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 22:41:26,817 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 22:41:49,299 - [INFO] - [Evaluating Epoch 46 valid]:
MRR: Tail : 0.09498, Head : 0.06893, Avg : 0.08196
2023-05-21 22:41:49,299 - [INFO] - [Epoch 46]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-21 22:41:50,207 - [INFO] - [E:47| 0]: Train Loss:0.0028679, Val MRR:0.08254, ice00001
2023-05-21 22:43:27,978 - [INFO] - [E:47| 100]: Train Loss:0.0029037, Val MRR:0.08254, ice00001
2023-05-21 22:45:05,880 - [INFO] - [E:47| 200]: Train Loss:0.0028999, Val MRR:0.08254, ice00001
2023-05-21 22:46:43,348 - [INFO] - [E:47| 300]: Train Loss:0.0028998, Val MRR:0.08254, ice00001
2023-05-21 22:48:13,379 - [INFO] - [E:47| 400]: Train Loss:0.0028997, Val MRR:0.08254, ice00001
2023-05-21 22:49:47,596 - [INFO] - [E:47| 500]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-21 22:51:24,755 - [INFO] - [E:47| 600]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-21 22:53:02,384 - [INFO] - [E:47| 700]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-21 22:54:39,857 - [INFO] - [E:47| 800]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-21 22:56:17,638 - [INFO] - [E:47| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-21 22:57:15,629 - [INFO] - [Epoch:47]: Training Loss:0.002901
2023-05-21 22:57:16,121 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 22:57:38,576 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 22:58:00,364 - [INFO] - [Evaluating Epoch 47 valid]:
MRR: Tail : 0.09369, Head : 0.0688, Avg : 0.08125
2023-05-21 22:58:00,364 - [INFO] - [Epoch 47]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-21 22:58:01,508 - [INFO] - [E:48| 0]: Train Loss:0.002916, Val MRR:0.08254, ice00001
2023-05-21 22:59:38,543 - [INFO] - [E:48| 100]: Train Loss:0.0029056, Val MRR:0.08254, ice00001
2023-05-21 23:01:16,405 - [INFO] - [E:48| 200]: Train Loss:0.0029045, Val MRR:0.08254, ice00001
2023-05-21 23:02:53,715 - [INFO] - [E:48| 300]: Train Loss:0.0029049, Val MRR:0.08254, ice00001
2023-05-21 23:04:32,521 - [INFO] - [E:48| 400]: Train Loss:0.0029036, Val MRR:0.08254, ice00001
2023-05-21 23:05:59,311 - [INFO] - [E:48| 500]: Train Loss:0.0029029, Val MRR:0.08254, ice00001
2023-05-21 23:07:36,977 - [INFO] - [E:48| 600]: Train Loss:0.0029037, Val MRR:0.08254, ice00001
2023-05-21 23:09:14,913 - [INFO] - [E:48| 700]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-21 23:10:52,619 - [INFO] - [E:48| 800]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-21 23:12:30,558 - [INFO] - [E:48| 900]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-21 23:13:39,487 - [INFO] - [Epoch:48]: Training Loss:0.002901
2023-05-21 23:13:39,800 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 23:14:02,075 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 23:14:18,727 - [INFO] - [Evaluating Epoch 48 valid]:
MRR: Tail : 0.09298, Head : 0.0643, Avg : 0.07864
2023-05-21 23:14:18,727 - [INFO] - [Epoch 48]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-21 23:14:19,370 - [INFO] - [E:49| 0]: Train Loss:0.0029104, Val MRR:0.08254, ice00001
2023-05-21 23:15:51,547 - [INFO] - [E:49| 100]: Train Loss:0.0028999, Val MRR:0.08254, ice00001
2023-05-21 23:17:28,764 - [INFO] - [E:49| 200]: Train Loss:0.0028988, Val MRR:0.08254, ice00001
2023-05-21 23:19:05,375 - [INFO] - [E:49| 300]: Train Loss:0.0028984, Val MRR:0.08254, ice00001
2023-05-21 23:20:42,561 - [INFO] - [E:49| 400]: Train Loss:0.0028994, Val MRR:0.08254, ice00001
2023-05-21 23:22:20,530 - [INFO] - [E:49| 500]: Train Loss:0.0028996, Val MRR:0.08254, ice00001
2023-05-21 23:23:47,849 - [INFO] - [E:49| 600]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-21 23:25:25,141 - [INFO] - [E:49| 700]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-21 23:27:02,355 - [INFO] - [E:49| 800]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-21 23:28:40,500 - [INFO] - [E:49| 900]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-21 23:29:48,723 - [INFO] - [Epoch:49]: Training Loss:0.002902
2023-05-21 23:29:49,103 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 23:30:11,763 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 23:30:33,968 - [INFO] - [Evaluating Epoch 49 valid]:
MRR: Tail : 0.0941, Head : 0.06977, Avg : 0.08193
MR: Tail : 767.31, Head : 922.11, Avg : 844.71
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.09868, Head : 0.06202, Avg : 0.08035
Hit-10: Tail : 0.20706, Head : 0.16021, Avg : 0.18364
2023-05-21 23:30:33,968 - [INFO] - [Epoch 49]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-21 23:30:35,169 - [INFO] - [E:50| 0]: Train Loss:0.0029099, Val MRR:0.08254, ice00001
2023-05-21 23:32:02,269 - [INFO] - [E:50| 100]: Train Loss:0.0029032, Val MRR:0.08254, ice00001
2023-05-21 23:33:38,420 - [INFO] - [E:50| 200]: Train Loss:0.0028995, Val MRR:0.08254, ice00001
2023-05-21 23:35:15,769 - [INFO] - [E:50| 300]: Train Loss:0.0028995, Val MRR:0.08254, ice00001
2023-05-21 23:36:53,353 - [INFO] - [E:50| 400]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-21 23:38:31,416 - [INFO] - [E:50| 500]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-21 23:40:09,163 - [INFO] - [E:50| 600]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-21 23:41:36,334 - [INFO] - [E:50| 700]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-21 23:43:14,551 - [INFO] - [E:50| 800]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-21 23:44:52,414 - [INFO] - [E:50| 900]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-21 23:46:01,098 - [INFO] - [Epoch:50]: Training Loss:0.002901
2023-05-21 23:46:01,477 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-21 23:46:23,802 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-21 23:46:45,702 - [INFO] - [Evaluating Epoch 50 valid]:
MRR: Tail : 0.0955, Head : 0.06875, Avg : 0.08212
2023-05-21 23:46:45,702 - [INFO] - [Epoch 50]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-21 23:46:46,808 - [INFO] - [E:51| 0]: Train Loss:0.0029667, Val MRR:0.08254, ice00001
2023-05-21 23:48:23,437 - [INFO] - [E:51| 100]: Train Loss:0.002907, Val MRR:0.08254, ice00001
2023-05-21 23:49:50,458 - [INFO] - [E:51| 200]: Train Loss:0.0029037, Val MRR:0.08254, ice00001
2023-05-21 23:51:30,589 - [INFO] - [E:51| 300]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-21 23:53:08,045 - [INFO] - [E:51| 400]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-21 23:54:45,749 - [INFO] - [E:51| 500]: Train Loss:0.0029033, Val MRR:0.08254, ice00001
2023-05-21 23:56:23,816 - [INFO] - [E:51| 600]: Train Loss:0.0029033, Val MRR:0.08254, ice00001
2023-05-21 23:57:55,683 - [INFO] - [E:51| 700]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-21 23:59:28,935 - [INFO] - [E:51| 800]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 00:01:06,592 - [INFO] - [E:51| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 00:02:14,219 - [INFO] - [Epoch:51]: Training Loss:0.002901
2023-05-22 00:02:14,598 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 00:02:36,888 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 00:02:58,773 - [INFO] - [Evaluating Epoch 51 valid]:
MRR: Tail : 0.09388, Head : 0.06987, Avg : 0.08188
2023-05-22 00:02:58,773 - [INFO] - [Epoch 51]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 00:02:59,907 - [INFO] - [E:52| 0]: Train Loss:0.002908, Val MRR:0.08254, ice00001
2023-05-22 00:04:37,236 - [INFO] - [E:52| 100]: Train Loss:0.0028993, Val MRR:0.08254, ice00001
2023-05-22 00:06:14,463 - [INFO] - [E:52| 200]: Train Loss:0.0028985, Val MRR:0.08254, ice00001
2023-05-22 00:07:41,182 - [INFO] - [E:52| 300]: Train Loss:0.0028997, Val MRR:0.08254, ice00001
2023-05-22 00:09:18,943 - [INFO] - [E:52| 400]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-22 00:10:56,892 - [INFO] - [E:52| 500]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 00:12:34,820 - [INFO] - [E:52| 600]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 00:14:12,628 - [INFO] - [E:52| 700]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 00:15:39,839 - [INFO] - [E:52| 800]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 00:17:16,778 - [INFO] - [E:52| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 00:18:25,716 - [INFO] - [Epoch:52]: Training Loss:0.002901
2023-05-22 00:18:26,047 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 00:18:48,896 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 00:19:11,118 - [INFO] - [Evaluating Epoch 52 valid]:
MRR: Tail : 0.09493, Head : 0.06935, Avg : 0.08214
2023-05-22 00:19:11,118 - [INFO] - [Epoch 52]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 00:19:12,179 - [INFO] - [E:53| 0]: Train Loss:0.0028624, Val MRR:0.08254, ice00001
2023-05-22 00:20:49,875 - [INFO] - [E:53| 100]: Train Loss:0.0028987, Val MRR:0.08254, ice00001
2023-05-22 00:22:27,567 - [INFO] - [E:53| 200]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 00:23:59,881 - [INFO] - [E:53| 300]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 00:25:32,668 - [INFO] - [E:53| 400]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 00:27:10,678 - [INFO] - [E:53| 500]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-22 00:28:47,756 - [INFO] - [E:53| 600]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 00:30:24,597 - [INFO] - [E:53| 700]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-22 00:32:00,796 - [INFO] - [E:53| 800]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 00:33:27,969 - [INFO] - [E:53| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 00:34:36,976 - [INFO] - [Epoch:53]: Training Loss:0.002901
2023-05-22 00:34:37,230 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 00:34:59,666 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 00:35:22,307 - [INFO] - [Evaluating Epoch 53 valid]:
MRR: Tail : 0.09509, Head : 0.06937, Avg : 0.08223
2023-05-22 00:35:22,307 - [INFO] - [Epoch 53]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 00:35:23,427 - [INFO] - [E:54| 0]: Train Loss:0.0029331, Val MRR:0.08254, ice00001
2023-05-22 00:37:01,197 - [INFO] - [E:54| 100]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 00:38:38,964 - [INFO] - [E:54| 200]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 00:40:18,723 - [INFO] - [E:54| 300]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 00:41:46,275 - [INFO] - [E:54| 400]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-22 00:43:24,226 - [INFO] - [E:54| 500]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-22 00:45:02,234 - [INFO] - [E:54| 600]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 00:46:38,835 - [INFO] - [E:54| 700]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 00:48:16,325 - [INFO] - [E:54| 800]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 00:49:54,363 - [INFO] - [E:54| 900]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 00:50:52,333 - [INFO] - [Epoch:54]: Training Loss:0.002902
2023-05-22 00:50:52,791 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 00:51:15,345 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 00:51:37,790 - [INFO] - [Evaluating Epoch 54 valid]:
MRR: Tail : 0.09424, Head : 0.069, Avg : 0.08162
2023-05-22 00:51:37,790 - [INFO] - [Epoch 54]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 00:51:38,845 - [INFO] - [E:55| 0]: Train Loss:0.0029055, Val MRR:0.08254, ice00001
2023-05-22 00:53:16,452 - [INFO] - [E:55| 100]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 00:54:53,838 - [INFO] - [E:55| 200]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-22 00:56:31,437 - [INFO] - [E:55| 300]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 00:58:09,157 - [INFO] - [E:55| 400]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 00:59:35,604 - [INFO] - [E:55| 500]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 01:01:12,118 - [INFO] - [E:55| 600]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-22 01:02:49,972 - [INFO] - [E:55| 700]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-22 01:04:27,498 - [INFO] - [E:55| 800]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-22 01:06:04,926 - [INFO] - [E:55| 900]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 01:07:13,603 - [INFO] - [Epoch:55]: Training Loss:0.002901
2023-05-22 01:07:14,129 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 01:07:32,282 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 01:07:46,711 - [INFO] - [Evaluating Epoch 55 valid]:
MRR: Tail : 0.09588, Head : 0.06875, Avg : 0.08231
2023-05-22 01:07:46,711 - [INFO] - [Epoch 55]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 01:07:47,797 - [INFO] - [E:56| 0]: Train Loss:0.0028953, Val MRR:0.08254, ice00001
2023-05-22 01:09:25,609 - [INFO] - [E:56| 100]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-22 01:11:03,388 - [INFO] - [E:56| 200]: Train Loss:0.0029038, Val MRR:0.08254, ice00001
2023-05-22 01:12:41,227 - [INFO] - [E:56| 300]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 01:14:18,006 - [INFO] - [E:56| 400]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 01:15:54,723 - [INFO] - [E:56| 500]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 01:17:21,889 - [INFO] - [E:56| 600]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 01:18:59,563 - [INFO] - [E:56| 700]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 01:20:37,247 - [INFO] - [E:56| 800]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 01:22:14,685 - [INFO] - [E:56| 900]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 01:23:23,174 - [INFO] - [Epoch:56]: Training Loss:0.002902
2023-05-22 01:23:23,510 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 01:23:46,304 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 01:24:08,496 - [INFO] - [Evaluating Epoch 56 valid]:
MRR: Tail : 0.09503, Head : 0.06881, Avg : 0.08192
2023-05-22 01:24:08,496 - [INFO] - [Epoch 56]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 01:24:09,620 - [INFO] - [E:57| 0]: Train Loss:0.002919, Val MRR:0.08254, ice00001
2023-05-22 01:25:36,785 - [INFO] - [E:57| 100]: Train Loss:0.0029047, Val MRR:0.08254, ice00001
2023-05-22 01:27:14,363 - [INFO] - [E:57| 200]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 01:28:52,838 - [INFO] - [E:57| 300]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-22 01:30:29,228 - [INFO] - [E:57| 400]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 01:32:06,652 - [INFO] - [E:57| 500]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 01:33:41,649 - [INFO] - [E:57| 600]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 01:35:11,845 - [INFO] - [E:57| 700]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-22 01:36:49,404 - [INFO] - [E:57| 800]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 01:38:27,347 - [INFO] - [E:57| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 01:39:36,021 - [INFO] - [Epoch:57]: Training Loss:0.002901
2023-05-22 01:39:36,356 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 01:39:59,032 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 01:40:21,351 - [INFO] - [Evaluating Epoch 57 valid]:
MRR: Tail : 0.09573, Head : 0.06903, Avg : 0.08238
2023-05-22 01:40:21,351 - [INFO] - [Epoch 57]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 01:40:22,475 - [INFO] - [E:58| 0]: Train Loss:0.0028566, Val MRR:0.08254, ice00001
2023-05-22 01:42:00,409 - [INFO] - [E:58| 100]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 01:43:26,672 - [INFO] - [E:58| 200]: Train Loss:0.0029031, Val MRR:0.08254, ice00001
2023-05-22 01:45:03,747 - [INFO] - [E:58| 300]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 01:46:42,195 - [INFO] - [E:58| 400]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 01:48:20,464 - [INFO] - [E:58| 500]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 01:49:57,647 - [INFO] - [E:58| 600]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 01:51:24,853 - [INFO] - [E:58| 700]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 01:53:02,522 - [INFO] - [E:58| 800]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 01:54:39,538 - [INFO] - [E:58| 900]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 01:55:48,408 - [INFO] - [Epoch:58]: Training Loss:0.002901
2023-05-22 01:55:48,659 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 01:56:11,299 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 01:56:33,311 - [INFO] - [Evaluating Epoch 58 valid]:
MRR: Tail : 0.09439, Head : 0.06883, Avg : 0.08161
2023-05-22 01:56:33,311 - [INFO] - [Epoch 58]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 01:56:34,432 - [INFO] - [E:59| 0]: Train Loss:0.0029333, Val MRR:0.08254, ice00001
2023-05-22 01:58:10,869 - [INFO] - [E:59| 100]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 01:59:46,871 - [INFO] - [E:59| 200]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 02:01:15,269 - [INFO] - [E:59| 300]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-22 02:02:53,188 - [INFO] - [E:59| 400]: Train Loss:0.0028999, Val MRR:0.08254, ice00001
2023-05-22 02:04:31,246 - [INFO] - [E:59| 500]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-22 02:06:08,787 - [INFO] - [E:59| 600]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 02:07:46,332 - [INFO] - [E:59| 700]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 02:09:13,809 - [INFO] - [E:59| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 02:10:51,719 - [INFO] - [E:59| 900]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 02:11:59,391 - [INFO] - [Epoch:59]: Training Loss:0.002901
2023-05-22 02:11:59,693 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 02:12:21,585 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 02:12:43,627 - [INFO] - [Evaluating Epoch 59 valid]:
MRR: Tail : 0.09539, Head : 0.0684, Avg : 0.08189
MR: Tail : 850.62, Head : 1041.6, Avg : 946.09
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.09892, Head : 0.06178, Avg : 0.08035
Hit-10: Tail : 0.21556, Head : 0.15463, Avg : 0.1851
2023-05-22 02:12:43,627 - [INFO] - [Epoch 59]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 02:12:44,518 - [INFO] - [E:60| 0]: Train Loss:0.0029434, Val MRR:0.08254, ice00001
2023-05-22 02:14:21,498 - [INFO] - [E:60| 100]: Train Loss:0.002903, Val MRR:0.08254, ice00001
2023-05-22 02:16:00,670 - [INFO] - [E:60| 200]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 02:17:28,742 - [INFO] - [E:60| 300]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-22 02:19:05,102 - [INFO] - [E:60| 400]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-22 02:20:42,432 - [INFO] - [E:60| 500]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-22 02:22:20,216 - [INFO] - [E:60| 600]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-22 02:23:57,659 - [INFO] - [E:60| 700]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 02:25:35,652 - [INFO] - [E:60| 800]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 02:27:02,183 - [INFO] - [E:60| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 02:28:10,440 - [INFO] - [Epoch:60]: Training Loss:0.002901
2023-05-22 02:28:10,691 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 02:28:33,042 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 02:28:55,025 - [INFO] - [Evaluating Epoch 60 valid]:
MRR: Tail : 0.09537, Head : 0.06902, Avg : 0.0822
2023-05-22 02:28:55,025 - [INFO] - [Epoch 60]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-22 02:28:56,158 - [INFO] - [E:61| 0]: Train Loss:0.0029474, Val MRR:0.08254, ice00001
2023-05-22 02:30:33,086 - [INFO] - [E:61| 100]: Train Loss:0.0029042, Val MRR:0.08254, ice00001
2023-05-22 02:32:10,090 - [INFO] - [E:61| 200]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 02:33:46,944 - [INFO] - [E:61| 300]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 02:35:14,944 - [INFO] - [E:61| 400]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 02:36:52,186 - [INFO] - [E:61| 500]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-22 02:38:29,326 - [INFO] - [E:61| 600]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 02:40:06,084 - [INFO] - [E:61| 700]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 02:41:42,529 - [INFO] - [E:61| 800]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 02:43:19,055 - [INFO] - [E:61| 900]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 02:44:18,726 - [INFO] - [Epoch:61]: Training Loss:0.002902
2023-05-22 02:44:18,976 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 02:44:41,440 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 02:45:03,819 - [INFO] - [Evaluating Epoch 61 valid]:
MRR: Tail : 0.09421, Head : 0.06944, Avg : 0.08182
2023-05-22 02:45:03,819 - [INFO] - [Epoch 61]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 02:45:04,931 - [INFO] - [E:62| 0]: Train Loss:0.0028606, Val MRR:0.08254, ice00001
2023-05-22 02:46:42,213 - [INFO] - [E:62| 100]: Train Loss:0.002903, Val MRR:0.08254, ice00001
2023-05-22 02:48:19,510 - [INFO] - [E:62| 200]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 02:49:56,912 - [INFO] - [E:62| 300]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 02:51:34,815 - [INFO] - [E:62| 400]: Train Loss:0.0029031, Val MRR:0.08254, ice00001
2023-05-22 02:53:01,499 - [INFO] - [E:62| 500]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-22 02:54:38,233 - [INFO] - [E:62| 600]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 02:56:14,271 - [INFO] - [E:62| 700]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 02:57:51,604 - [INFO] - [E:62| 800]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-22 02:59:29,111 - [INFO] - [E:62| 900]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 03:00:37,461 - [INFO] - [Epoch:62]: Training Loss:0.002902
2023-05-22 03:00:37,669 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 03:00:50,976 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 03:01:11,274 - [INFO] - [Evaluating Epoch 62 valid]:
MRR: Tail : 0.09536, Head : 0.06883, Avg : 0.08209
2023-05-22 03:01:11,274 - [INFO] - [Epoch 62]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 03:01:12,419 - [INFO] - [E:63| 0]: Train Loss:0.0029855, Val MRR:0.08254, ice00001
2023-05-22 03:02:49,325 - [INFO] - [E:63| 100]: Train Loss:0.0029055, Val MRR:0.08254, ice00001
2023-05-22 03:04:28,774 - [INFO] - [E:63| 200]: Train Loss:0.0029049, Val MRR:0.08254, ice00001
2023-05-22 03:06:06,197 - [INFO] - [E:63| 300]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 03:07:43,257 - [INFO] - [E:63| 400]: Train Loss:0.0029029, Val MRR:0.08254, ice00001
2023-05-22 03:09:21,275 - [INFO] - [E:63| 500]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 03:10:47,773 - [INFO] - [E:63| 600]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 03:12:24,542 - [INFO] - [E:63| 700]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 03:14:01,909 - [INFO] - [E:63| 800]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 03:15:38,608 - [INFO] - [E:63| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 03:16:46,437 - [INFO] - [Epoch:63]: Training Loss:0.002902
2023-05-22 03:16:46,938 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 03:17:09,502 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 03:17:31,301 - [INFO] - [Evaluating Epoch 63 valid]:
MRR: Tail : 0.0946, Head : 0.06909, Avg : 0.08184
2023-05-22 03:17:31,301 - [INFO] - [Epoch 63]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 03:17:32,255 - [INFO] - [E:64| 0]: Train Loss:0.0029395, Val MRR:0.08254, ice00001
2023-05-22 03:18:59,638 - [INFO] - [E:64| 100]: Train Loss:0.0029039, Val MRR:0.08254, ice00001
2023-05-22 03:20:37,711 - [INFO] - [E:64| 200]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-22 03:22:15,632 - [INFO] - [E:64| 300]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 03:23:53,043 - [INFO] - [E:64| 400]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-22 03:25:29,713 - [INFO] - [E:64| 500]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-22 03:27:01,700 - [INFO] - [E:64| 600]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 03:28:33,835 - [INFO] - [E:64| 700]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 03:30:11,141 - [INFO] - [E:64| 800]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 03:31:47,796 - [INFO] - [E:64| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 03:32:56,506 - [INFO] - [Epoch:64]: Training Loss:0.002902
2023-05-22 03:32:57,003 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 03:33:19,689 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 03:33:41,784 - [INFO] - [Evaluating Epoch 64 valid]:
MRR: Tail : 0.09488, Head : 0.06956, Avg : 0.08222
2023-05-22 03:33:41,784 - [INFO] - [Epoch 64]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 03:33:42,571 - [INFO] - [E:65| 0]: Train Loss:0.002908, Val MRR:0.08254, ice00001
2023-05-22 03:35:20,264 - [INFO] - [E:65| 100]: Train Loss:0.0028967, Val MRR:0.08254, ice00001
2023-05-22 03:36:47,932 - [INFO] - [E:65| 200]: Train Loss:0.0028995, Val MRR:0.08254, ice00001
2023-05-22 03:38:25,001 - [INFO] - [E:65| 300]: Train Loss:0.0029003, Val MRR:0.08254, ice00001
2023-05-22 03:40:01,195 - [INFO] - [E:65| 400]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 03:41:38,651 - [INFO] - [E:65| 500]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 03:43:17,022 - [INFO] - [E:65| 600]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 03:44:44,830 - [INFO] - [E:65| 700]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 03:46:22,950 - [INFO] - [E:65| 800]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 03:48:00,793 - [INFO] - [E:65| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 03:49:09,064 - [INFO] - [Epoch:65]: Training Loss:0.002902
2023-05-22 03:49:09,401 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 03:49:32,192 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 03:49:54,699 - [INFO] - [Evaluating Epoch 65 valid]:
MRR: Tail : 0.09546, Head : 0.06933, Avg : 0.08239
2023-05-22 03:49:54,699 - [INFO] - [Epoch 65]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 03:49:55,800 - [INFO] - [E:66| 0]: Train Loss:0.0029381, Val MRR:0.08254, ice00001
2023-05-22 03:51:33,318 - [INFO] - [E:66| 100]: Train Loss:0.0029061, Val MRR:0.08254, ice00001
2023-05-22 03:53:10,459 - [INFO] - [E:66| 200]: Train Loss:0.0029036, Val MRR:0.08254, ice00001
2023-05-22 03:54:38,820 - [INFO] - [E:66| 300]: Train Loss:0.0029045, Val MRR:0.08254, ice00001
2023-05-22 03:56:16,482 - [INFO] - [E:66| 400]: Train Loss:0.002903, Val MRR:0.08254, ice00001
2023-05-22 03:57:54,208 - [INFO] - [E:66| 500]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-22 03:59:32,065 - [INFO] - [E:66| 600]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 04:01:09,683 - [INFO] - [E:66| 700]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 04:02:37,447 - [INFO] - [E:66| 800]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 04:04:15,472 - [INFO] - [E:66| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 04:05:24,301 - [INFO] - [Epoch:66]: Training Loss:0.002901
2023-05-22 04:05:24,553 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 04:05:47,357 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 04:06:09,688 - [INFO] - [Evaluating Epoch 66 valid]:
MRR: Tail : 0.09528, Head : 0.06922, Avg : 0.08225
2023-05-22 04:06:09,688 - [INFO] - [Epoch 66]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-22 04:06:10,805 - [INFO] - [E:67| 0]: Train Loss:0.0028498, Val MRR:0.08254, ice00001
2023-05-22 04:07:47,827 - [INFO] - [E:67| 100]: Train Loss:0.0029053, Val MRR:0.08254, ice00001
2023-05-22 04:09:25,449 - [INFO] - [E:67| 200]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 04:10:52,484 - [INFO] - [E:67| 300]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 04:12:29,385 - [INFO] - [E:67| 400]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-22 04:14:06,712 - [INFO] - [E:67| 500]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 04:15:43,440 - [INFO] - [E:67| 600]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 04:17:21,047 - [INFO] - [E:67| 700]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 04:18:58,649 - [INFO] - [E:67| 800]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 04:20:26,583 - [INFO] - [E:67| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 04:21:35,405 - [INFO] - [Epoch:67]: Training Loss:0.002901
2023-05-22 04:21:35,656 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 04:21:57,693 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 04:22:19,832 - [INFO] - [Evaluating Epoch 67 valid]:
MRR: Tail : 0.09508, Head : 0.06945, Avg : 0.08227
2023-05-22 04:22:19,832 - [INFO] - [Epoch 67]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-22 04:22:20,737 - [INFO] - [E:68| 0]: Train Loss:0.0028235, Val MRR:0.08254, ice00001
2023-05-22 04:23:57,505 - [INFO] - [E:68| 100]: Train Loss:0.0029055, Val MRR:0.08254, ice00001
2023-05-22 04:25:34,535 - [INFO] - [E:68| 200]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 04:27:12,243 - [INFO] - [E:68| 300]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 04:28:38,934 - [INFO] - [E:68| 400]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 04:30:16,381 - [INFO] - [E:68| 500]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 04:31:54,126 - [INFO] - [E:68| 600]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 04:33:31,690 - [INFO] - [E:68| 700]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 04:35:10,283 - [INFO] - [E:68| 800]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 04:36:44,507 - [INFO] - [E:68| 900]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 04:37:45,652 - [INFO] - [Epoch:68]: Training Loss:0.002902
2023-05-22 04:37:45,902 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 04:38:08,553 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 04:38:30,852 - [INFO] - [Evaluating Epoch 68 valid]:
MRR: Tail : 0.09521, Head : 0.06885, Avg : 0.08203
2023-05-22 04:38:30,852 - [INFO] - [Epoch 68]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 04:38:31,961 - [INFO] - [E:69| 0]: Train Loss:0.0028792, Val MRR:0.08254, ice00001
2023-05-22 04:40:10,510 - [INFO] - [E:69| 100]: Train Loss:0.0029036, Val MRR:0.08254, ice00001
2023-05-22 04:41:49,910 - [INFO] - [E:69| 200]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 04:43:26,950 - [INFO] - [E:69| 300]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 04:45:04,803 - [INFO] - [E:69| 400]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 04:46:32,408 - [INFO] - [E:69| 500]: Train Loss:0.0029029, Val MRR:0.08254, ice00001
2023-05-22 04:48:10,307 - [INFO] - [E:69| 600]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-22 04:49:47,519 - [INFO] - [E:69| 700]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-22 04:51:24,262 - [INFO] - [E:69| 800]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 04:53:01,740 - [INFO] - [E:69| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 04:54:08,502 - [INFO] - [Epoch:69]: Training Loss:0.002902
2023-05-22 04:54:08,711 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 04:54:22,045 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 04:54:44,386 - [INFO] - [Evaluating Epoch 69 valid]:
MRR: Tail : 0.0942, Head : 0.06943, Avg : 0.08181
MR: Tail : 829.65, Head : 990.71, Avg : 910.18
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.0903, Head : 0.06044, Avg : 0.07537
Hit-10: Tail : 0.20803, Head : 0.15572, Avg : 0.18188
2023-05-22 04:54:44,386 - [INFO] - [Epoch 69]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 04:54:45,511 - [INFO] - [E:70| 0]: Train Loss:0.0029215, Val MRR:0.08254, ice00001
2023-05-22 04:56:23,036 - [INFO] - [E:70| 100]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 04:58:00,782 - [INFO] - [E:70| 200]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-22 04:59:38,973 - [INFO] - [E:70| 300]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-22 05:01:17,322 - [INFO] - [E:70| 400]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 05:02:53,937 - [INFO] - [E:70| 500]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 05:04:23,020 - [INFO] - [E:70| 600]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 05:06:00,236 - [INFO] - [E:70| 700]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 05:07:37,602 - [INFO] - [E:70| 800]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 05:09:15,887 - [INFO] - [E:70| 900]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 05:10:25,077 - [INFO] - [Epoch:70]: Training Loss:0.002902
2023-05-22 05:10:25,499 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 05:10:47,860 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 05:11:10,163 - [INFO] - [Evaluating Epoch 70 valid]:
MRR: Tail : 0.09172, Head : 0.07038, Avg : 0.08105
2023-05-22 05:11:10,163 - [INFO] - [Epoch 70]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 05:11:11,103 - [INFO] - [E:71| 0]: Train Loss:0.0028736, Val MRR:0.08254, ice00001
2023-05-22 05:12:38,960 - [INFO] - [E:71| 100]: Train Loss:0.0028957, Val MRR:0.08254, ice00001
2023-05-22 05:14:16,481 - [INFO] - [E:71| 200]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-22 05:15:54,225 - [INFO] - [E:71| 300]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-22 05:17:30,994 - [INFO] - [E:71| 400]: Train Loss:0.0028996, Val MRR:0.08254, ice00001
2023-05-22 05:19:08,772 - [INFO] - [E:71| 500]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-22 05:20:35,109 - [INFO] - [E:71| 600]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-22 05:22:11,991 - [INFO] - [E:71| 700]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 05:23:49,537 - [INFO] - [E:71| 800]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 05:25:27,061 - [INFO] - [E:71| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 05:26:35,769 - [INFO] - [Epoch:71]: Training Loss:0.002902
2023-05-22 05:26:36,266 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 05:26:58,834 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 05:27:20,501 - [INFO] - [Evaluating Epoch 71 valid]:
MRR: Tail : 0.09474, Head : 0.06926, Avg : 0.082
2023-05-22 05:27:20,501 - [INFO] - [Epoch 71]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 05:27:21,492 - [INFO] - [E:72| 0]: Train Loss:0.0029166, Val MRR:0.08254, ice00001
2023-05-22 05:29:00,640 - [INFO] - [E:72| 100]: Train Loss:0.0029029, Val MRR:0.08254, ice00001
2023-05-22 05:30:28,459 - [INFO] - [E:72| 200]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-22 05:32:06,128 - [INFO] - [E:72| 300]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 05:33:43,033 - [INFO] - [E:72| 400]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 05:35:19,378 - [INFO] - [E:72| 500]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 05:36:57,306 - [INFO] - [E:72| 600]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 05:38:24,767 - [INFO] - [E:72| 700]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 05:40:02,612 - [INFO] - [E:72| 800]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 05:41:40,173 - [INFO] - [E:72| 900]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 05:42:49,034 - [INFO] - [Epoch:72]: Training Loss:0.002902
2023-05-22 05:42:49,371 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 05:43:12,016 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 05:43:34,332 - [INFO] - [Evaluating Epoch 72 valid]:
MRR: Tail : 0.09447, Head : 0.06951, Avg : 0.08199
2023-05-22 05:43:34,332 - [INFO] - [Epoch 72]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 05:43:35,384 - [INFO] - [E:73| 0]: Train Loss:0.0028697, Val MRR:0.08254, ice00001
2023-05-22 05:45:14,039 - [INFO] - [E:73| 100]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-22 05:46:41,955 - [INFO] - [E:73| 200]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 05:48:19,364 - [INFO] - [E:73| 300]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 05:49:55,690 - [INFO] - [E:73| 400]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 05:51:32,820 - [INFO] - [E:73| 500]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 05:53:10,410 - [INFO] - [E:73| 600]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 05:54:47,996 - [INFO] - [E:73| 700]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 05:56:15,319 - [INFO] - [E:73| 800]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 05:57:52,975 - [INFO] - [E:73| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 05:59:01,414 - [INFO] - [Epoch:73]: Training Loss:0.002901
2023-05-22 05:59:01,745 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 05:59:24,599 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 05:59:47,011 - [INFO] - [Evaluating Epoch 73 valid]:
MRR: Tail : 0.09486, Head : 0.06848, Avg : 0.08167
2023-05-22 05:59:47,011 - [INFO] - [Epoch 73]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-22 05:59:48,146 - [INFO] - [E:74| 0]: Train Loss:0.0028683, Val MRR:0.08254, ice00001
2023-05-22 06:01:26,457 - [INFO] - [E:74| 100]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 06:03:05,072 - [INFO] - [E:74| 200]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 06:04:31,841 - [INFO] - [E:74| 300]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-22 06:06:09,193 - [INFO] - [E:74| 400]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 06:07:46,404 - [INFO] - [E:74| 500]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 06:09:23,975 - [INFO] - [E:74| 600]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-22 06:11:01,635 - [INFO] - [E:74| 700]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 06:12:38,096 - [INFO] - [E:74| 800]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 06:14:06,412 - [INFO] - [E:74| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 06:15:14,642 - [INFO] - [Epoch:74]: Training Loss:0.002901
2023-05-22 06:15:15,141 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 06:15:37,978 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 06:16:00,346 - [INFO] - [Evaluating Epoch 74 valid]:
MRR: Tail : 0.0948, Head : 0.0695, Avg : 0.08215
2023-05-22 06:16:00,346 - [INFO] - [Epoch 74]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 06:16:01,401 - [INFO] - [E:75| 0]: Train Loss:0.0029101, Val MRR:0.08254, ice00001
2023-05-22 06:17:40,390 - [INFO] - [E:75| 100]: Train Loss:0.0029038, Val MRR:0.08254, ice00001
2023-05-22 06:19:16,472 - [INFO] - [E:75| 200]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 06:20:54,304 - [INFO] - [E:75| 300]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 06:22:21,039 - [INFO] - [E:75| 400]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 06:23:58,734 - [INFO] - [E:75| 500]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 06:25:37,012 - [INFO] - [E:75| 600]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-22 06:27:15,189 - [INFO] - [E:75| 700]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 06:28:52,518 - [INFO] - [E:75| 800]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 06:30:23,166 - [INFO] - [E:75| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 06:31:28,233 - [INFO] - [Epoch:75]: Training Loss:0.002901
2023-05-22 06:31:28,516 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 06:31:50,439 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 06:32:12,501 - [INFO] - [Evaluating Epoch 75 valid]:
MRR: Tail : 0.09552, Head : 0.06911, Avg : 0.08232
2023-05-22 06:32:12,502 - [INFO] - [Epoch 75]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-22 06:32:13,396 - [INFO] - [E:76| 0]: Train Loss:0.0028787, Val MRR:0.08254, ice00001
2023-05-22 06:33:49,463 - [INFO] - [E:76| 100]: Train Loss:0.0028964, Val MRR:0.08254, ice00001
2023-05-22 06:35:27,789 - [INFO] - [E:76| 200]: Train Loss:0.0028958, Val MRR:0.08254, ice00001
2023-05-22 06:37:06,140 - [INFO] - [E:76| 300]: Train Loss:0.002899, Val MRR:0.08254, ice00001
2023-05-22 06:38:44,498 - [INFO] - [E:76| 400]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-22 06:40:12,504 - [INFO] - [E:76| 500]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-22 06:41:50,374 - [INFO] - [E:76| 600]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 06:43:27,876 - [INFO] - [E:76| 700]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 06:45:05,207 - [INFO] - [E:76| 800]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 06:46:42,468 - [INFO] - [E:76| 900]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 06:47:45,452 - [INFO] - [Epoch:76]: Training Loss:0.002901
2023-05-22 06:47:45,661 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 06:48:01,870 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 06:48:23,944 - [INFO] - [Evaluating Epoch 76 valid]:
MRR: Tail : 0.09494, Head : 0.06947, Avg : 0.0822
2023-05-22 06:48:23,944 - [INFO] - [Epoch 76]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-22 06:48:24,728 - [INFO] - [E:77| 0]: Train Loss:0.0028417, Val MRR:0.08254, ice00001
2023-05-22 06:50:02,401 - [INFO] - [E:77| 100]: Train Loss:0.002899, Val MRR:0.08254, ice00001
2023-05-22 06:51:39,807 - [INFO] - [E:77| 200]: Train Loss:0.0028989, Val MRR:0.08254, ice00001
2023-05-22 06:53:17,332 - [INFO] - [E:77| 300]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-22 06:54:55,373 - [INFO] - [E:77| 400]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-22 06:56:29,185 - [INFO] - [E:77| 500]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-22 06:58:01,003 - [INFO] - [E:77| 600]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 06:59:38,950 - [INFO] - [E:77| 700]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 07:01:16,530 - [INFO] - [E:77| 800]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 07:02:52,755 - [INFO] - [E:77| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 07:04:01,116 - [INFO] - [Epoch:77]: Training Loss:0.002902
2023-05-22 07:04:01,613 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 07:04:24,200 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 07:04:46,223 - [INFO] - [Evaluating Epoch 77 valid]:
MRR: Tail : 0.09398, Head : 0.06916, Avg : 0.08157
2023-05-22 07:04:46,223 - [INFO] - [Epoch 77]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 07:04:47,364 - [INFO] - [E:78| 0]: Train Loss:0.0028966, Val MRR:0.08254, ice00001
2023-05-22 07:06:17,655 - [INFO] - [E:78| 100]: Train Loss:0.0028983, Val MRR:0.08254, ice00001
2023-05-22 07:07:56,037 - [INFO] - [E:78| 200]: Train Loss:0.0028974, Val MRR:0.08254, ice00001
2023-05-22 07:09:33,990 - [INFO] - [E:78| 300]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 07:11:11,740 - [INFO] - [E:78| 400]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 07:12:49,654 - [INFO] - [E:78| 500]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-22 07:14:17,337 - [INFO] - [E:78| 600]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 07:15:54,339 - [INFO] - [E:78| 700]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 07:17:30,898 - [INFO] - [E:78| 800]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 07:19:08,619 - [INFO] - [E:78| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 07:20:17,682 - [INFO] - [Epoch:78]: Training Loss:0.002902
2023-05-22 07:20:17,933 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 07:20:40,813 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 07:21:03,130 - [INFO] - [Evaluating Epoch 78 valid]:
MRR: Tail : 0.09558, Head : 0.06865, Avg : 0.08211
2023-05-22 07:21:03,130 - [INFO] - [Epoch 78]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 07:21:04,257 - [INFO] - [E:79| 0]: Train Loss:0.0028986, Val MRR:0.08254, ice00001
2023-05-22 07:22:38,782 - [INFO] - [E:79| 100]: Train Loss:0.0028998, Val MRR:0.08254, ice00001
2023-05-22 07:24:09,304 - [INFO] - [E:79| 200]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-22 07:25:47,119 - [INFO] - [E:79| 300]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 07:27:24,536 - [INFO] - [E:79| 400]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 07:29:01,757 - [INFO] - [E:79| 500]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-22 07:30:38,222 - [INFO] - [E:79| 600]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-22 07:32:04,894 - [INFO] - [E:79| 700]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 07:33:42,599 - [INFO] - [E:79| 800]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 07:35:20,490 - [INFO] - [E:79| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 07:36:28,647 - [INFO] - [Epoch:79]: Training Loss:0.002901
2023-05-22 07:36:29,030 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 07:36:51,682 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 07:37:14,070 - [INFO] - [Evaluating Epoch 79 valid]:
MRR: Tail : 0.09554, Head : 0.06888, Avg : 0.08221
MR: Tail : 846.06, Head : 1014.1, Avg : 930.07
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.09831, Head : 0.06142, Avg : 0.07986
Hit-10: Tail : 0.21544, Head : 0.15487, Avg : 0.18516
2023-05-22 07:37:14,070 - [INFO] - [Epoch 79]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 07:37:15,253 - [INFO] - [E:80| 0]: Train Loss:0.0028767, Val MRR:0.08254, ice00001
2023-05-22 07:38:52,856 - [INFO] - [E:80| 100]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-22 07:40:20,327 - [INFO] - [E:80| 200]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 07:41:57,474 - [INFO] - [E:80| 300]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 07:43:34,722 - [INFO] - [E:80| 400]: Train Loss:0.0028999, Val MRR:0.08254, ice00001
2023-05-22 07:45:11,610 - [INFO] - [E:80| 500]: Train Loss:0.0028991, Val MRR:0.08254, ice00001
2023-05-22 07:46:47,961 - [INFO] - [E:80| 600]: Train Loss:0.0028994, Val MRR:0.08254, ice00001
2023-05-22 07:48:25,212 - [INFO] - [E:80| 700]: Train Loss:0.0028998, Val MRR:0.08254, ice00001
2023-05-22 07:49:52,698 - [INFO] - [E:80| 800]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-22 07:51:31,248 - [INFO] - [E:80| 900]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 07:52:39,977 - [INFO] - [Epoch:80]: Training Loss:0.002901
2023-05-22 07:52:40,358 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 07:53:02,874 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 07:53:27,789 - [INFO] - [Evaluating Epoch 80 valid]:
MRR: Tail : 0.09496, Head : 0.06958, Avg : 0.08227
2023-05-22 07:53:27,789 - [INFO] - [Epoch 80]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 07:53:28,904 - [INFO] - [E:81| 0]: Train Loss:0.0029108, Val MRR:0.08254, ice00001
2023-05-22 07:55:06,756 - [INFO] - [E:81| 100]: Train Loss:0.0028954, Val MRR:0.08254, ice00001
2023-05-22 07:56:44,677 - [INFO] - [E:81| 200]: Train Loss:0.0028979, Val MRR:0.08254, ice00001
2023-05-22 07:58:12,821 - [INFO] - [E:81| 300]: Train Loss:0.0028985, Val MRR:0.08254, ice00001
2023-05-22 07:59:49,025 - [INFO] - [E:81| 400]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 08:01:25,846 - [INFO] - [E:81| 500]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 08:03:03,041 - [INFO] - [E:81| 600]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-22 08:04:40,326 - [INFO] - [E:81| 700]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 08:06:15,902 - [INFO] - [E:81| 800]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 08:07:45,574 - [INFO] - [E:81| 900]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 08:08:54,455 - [INFO] - [Epoch:81]: Training Loss:0.002902
2023-05-22 08:08:54,837 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 08:09:17,389 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 08:09:39,580 - [INFO] - [Evaluating Epoch 81 valid]:
MRR: Tail : 0.09462, Head : 0.06942, Avg : 0.08202
2023-05-22 08:09:39,580 - [INFO] - [Epoch 81]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 08:09:40,725 - [INFO] - [E:82| 0]: Train Loss:0.0028476, Val MRR:0.08254, ice00001
2023-05-22 08:11:18,108 - [INFO] - [E:82| 100]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-22 08:12:55,362 - [INFO] - [E:82| 200]: Train Loss:0.0028971, Val MRR:0.08254, ice00001
2023-05-22 08:14:32,309 - [INFO] - [E:82| 300]: Train Loss:0.002899, Val MRR:0.08254, ice00001
2023-05-22 08:15:58,911 - [INFO] - [E:82| 400]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-22 08:17:36,429 - [INFO] - [E:82| 500]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 08:19:13,984 - [INFO] - [E:82| 600]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 08:20:50,717 - [INFO] - [E:82| 700]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 08:22:28,124 - [INFO] - [E:82| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 08:23:55,156 - [INFO] - [E:82| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 08:25:03,226 - [INFO] - [Epoch:82]: Training Loss:0.002901
2023-05-22 08:25:03,604 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 08:25:26,119 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 08:25:47,943 - [INFO] - [Evaluating Epoch 82 valid]:
MRR: Tail : 0.09452, Head : 0.06991, Avg : 0.08222
2023-05-22 08:25:47,944 - [INFO] - [Epoch 82]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 08:25:49,097 - [INFO] - [E:83| 0]: Train Loss:0.0028863, Val MRR:0.08254, ice00001
2023-05-22 08:27:26,790 - [INFO] - [E:83| 100]: Train Loss:0.0029058, Val MRR:0.08254, ice00001
2023-05-22 08:29:02,864 - [INFO] - [E:83| 200]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 08:30:39,801 - [INFO] - [E:83| 300]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 08:32:17,007 - [INFO] - [E:83| 400]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 08:33:46,017 - [INFO] - [E:83| 500]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 08:35:23,520 - [INFO] - [E:83| 600]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 08:37:01,390 - [INFO] - [E:83| 700]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-22 08:38:38,569 - [INFO] - [E:83| 800]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 08:40:15,697 - [INFO] - [E:83| 900]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 08:41:15,304 - [INFO] - [Epoch:83]: Training Loss:0.002901
2023-05-22 08:41:15,513 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 08:41:39,047 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 08:42:01,372 - [INFO] - [Evaluating Epoch 83 valid]:
MRR: Tail : 0.09308, Head : 0.06971, Avg : 0.0814
2023-05-22 08:42:01,372 - [INFO] - [Epoch 83]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-22 08:42:02,191 - [INFO] - [E:84| 0]: Train Loss:0.0028778, Val MRR:0.08254, ice00001
2023-05-22 08:43:38,897 - [INFO] - [E:84| 100]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 08:45:16,276 - [INFO] - [E:84| 200]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-22 08:46:54,461 - [INFO] - [E:84| 300]: Train Loss:0.0028993, Val MRR:0.08254, ice00001
2023-05-22 08:48:32,251 - [INFO] - [E:84| 400]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-22 08:50:00,989 - [INFO] - [E:84| 500]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-22 08:51:38,214 - [INFO] - [E:84| 600]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 08:53:15,292 - [INFO] - [E:84| 700]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 08:54:53,210 - [INFO] - [E:84| 800]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 08:56:30,761 - [INFO] - [E:84| 900]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 08:57:38,399 - [INFO] - [Epoch:84]: Training Loss:0.002901
2023-05-22 08:57:38,734 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 08:58:01,008 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 08:58:23,021 - [INFO] - [Evaluating Epoch 84 valid]:
MRR: Tail : 0.095, Head : 0.06941, Avg : 0.08221
2023-05-22 08:58:23,021 - [INFO] - [Epoch 84]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 08:58:23,691 - [INFO] - [E:85| 0]: Train Loss:0.0029048, Val MRR:0.08254, ice00001
2023-05-22 08:59:51,102 - [INFO] - [E:85| 100]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-22 09:01:28,624 - [INFO] - [E:85| 200]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 09:03:06,221 - [INFO] - [E:85| 300]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-22 09:04:43,982 - [INFO] - [E:85| 400]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 09:06:22,288 - [INFO] - [E:85| 500]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 09:07:49,320 - [INFO] - [E:85| 600]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 09:09:27,531 - [INFO] - [E:85| 700]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 09:11:05,679 - [INFO] - [E:85| 800]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 09:12:42,118 - [INFO] - [E:85| 900]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 09:13:50,465 - [INFO] - [Epoch:85]: Training Loss:0.002901
2023-05-22 09:13:50,970 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 09:14:13,417 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 09:14:35,592 - [INFO] - [Evaluating Epoch 85 valid]:
MRR: Tail : 0.0942, Head : 0.06863, Avg : 0.08142
2023-05-22 09:14:35,592 - [INFO] - [Epoch 85]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-22 09:14:36,392 - [INFO] - [E:86| 0]: Train Loss:0.0029098, Val MRR:0.08254, ice00001
2023-05-22 09:16:04,160 - [INFO] - [E:86| 100]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 09:17:41,112 - [INFO] - [E:86| 200]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 09:19:18,160 - [INFO] - [E:86| 300]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 09:20:55,660 - [INFO] - [E:86| 400]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 09:22:33,148 - [INFO] - [E:86| 500]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 09:24:10,280 - [INFO] - [E:86| 600]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 09:25:36,876 - [INFO] - [E:86| 700]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 09:27:12,908 - [INFO] - [E:86| 800]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 09:28:49,944 - [INFO] - [E:86| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 09:30:01,320 - [INFO] - [Epoch:86]: Training Loss:0.002901
2023-05-22 09:30:01,683 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 09:30:24,220 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 09:30:46,354 - [INFO] - [Evaluating Epoch 86 valid]:
MRR: Tail : 0.09475, Head : 0.06924, Avg : 0.082
2023-05-22 09:30:46,354 - [INFO] - [Epoch 86]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 09:30:47,135 - [INFO] - [E:87| 0]: Train Loss:0.0029186, Val MRR:0.08254, ice00001
2023-05-22 09:32:24,511 - [INFO] - [E:87| 100]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-22 09:33:51,610 - [INFO] - [E:87| 200]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-22 09:35:29,136 - [INFO] - [E:87| 300]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-22 09:37:06,690 - [INFO] - [E:87| 400]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-22 09:38:43,965 - [INFO] - [E:87| 500]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 09:40:21,431 - [INFO] - [E:87| 600]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 09:41:57,537 - [INFO] - [E:87| 700]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 09:43:25,040 - [INFO] - [E:87| 800]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 09:45:02,570 - [INFO] - [E:87| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 09:46:11,732 - [INFO] - [Epoch:87]: Training Loss:0.002902
2023-05-22 09:46:11,984 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 09:46:34,422 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 09:46:56,652 - [INFO] - [Evaluating Epoch 87 valid]:
MRR: Tail : 0.09559, Head : 0.06892, Avg : 0.08226
2023-05-22 09:46:56,652 - [INFO] - [Epoch 87]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 09:46:57,775 - [INFO] - [E:88| 0]: Train Loss:0.0029367, Val MRR:0.08254, ice00001
2023-05-22 09:48:35,337 - [INFO] - [E:88| 100]: Train Loss:0.0028998, Val MRR:0.08254, ice00001
2023-05-22 09:50:13,128 - [INFO] - [E:88| 200]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 09:51:40,775 - [INFO] - [E:88| 300]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 09:53:17,518 - [INFO] - [E:88| 400]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 09:54:54,655 - [INFO] - [E:88| 500]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-22 09:56:30,881 - [INFO] - [E:88| 600]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 09:58:08,530 - [INFO] - [E:88| 700]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-22 09:59:36,490 - [INFO] - [E:88| 800]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-22 10:01:13,361 - [INFO] - [E:88| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 10:02:21,782 - [INFO] - [Epoch:88]: Training Loss:0.002901
2023-05-22 10:02:22,141 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 10:02:44,869 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 10:03:06,949 - [INFO] - [Evaluating Epoch 88 valid]:
MRR: Tail : 0.0945, Head : 0.06921, Avg : 0.08186
2023-05-22 10:03:06,949 - [INFO] - [Epoch 88]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 10:03:07,969 - [INFO] - [E:89| 0]: Train Loss:0.0028683, Val MRR:0.08254, ice00001
2023-05-22 10:04:45,154 - [INFO] - [E:89| 100]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-22 10:06:23,092 - [INFO] - [E:89| 200]: Train Loss:0.002904, Val MRR:0.08254, ice00001
2023-05-22 10:08:01,909 - [INFO] - [E:89| 300]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 10:09:28,775 - [INFO] - [E:89| 400]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 10:11:04,983 - [INFO] - [E:89| 500]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 10:12:42,394 - [INFO] - [E:89| 600]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 10:14:21,025 - [INFO] - [E:89| 700]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 10:15:58,467 - [INFO] - [E:89| 800]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 10:17:26,330 - [INFO] - [E:89| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 10:18:37,691 - [INFO] - [Epoch:89]: Training Loss:0.002902
2023-05-22 10:18:38,189 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 10:19:00,829 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 10:19:23,228 - [INFO] - [Evaluating Epoch 89 valid]:
MRR: Tail : 0.09248, Head : 0.07016, Avg : 0.08132
MR: Tail : 842.91, Head : 1005.7, Avg : 924.29
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.08181, Head : 0.06506, Avg : 0.07343
Hit-10: Tail : 0.20379, Head : 0.16021, Avg : 0.182
2023-05-22 10:19:23,228 - [INFO] - [Epoch 89]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 10:19:24,194 - [INFO] - [E:90| 0]: Train Loss:0.0029032, Val MRR:0.08254, ice00001
2023-05-22 10:21:02,367 - [INFO] - [E:90| 100]: Train Loss:0.0029071, Val MRR:0.08254, ice00001
2023-05-22 10:22:40,371 - [INFO] - [E:90| 200]: Train Loss:0.0029055, Val MRR:0.08254, ice00001
2023-05-22 10:24:17,597 - [INFO] - [E:90| 300]: Train Loss:0.0029058, Val MRR:0.08254, ice00001
2023-05-22 10:25:48,349 - [INFO] - [E:90| 400]: Train Loss:0.0029048, Val MRR:0.08254, ice00001
2023-05-22 10:27:21,874 - [INFO] - [E:90| 500]: Train Loss:0.0029047, Val MRR:0.08254, ice00001
2023-05-22 10:28:59,236 - [INFO] - [E:90| 600]: Train Loss:0.0029036, Val MRR:0.08254, ice00001
2023-05-22 10:30:36,684 - [INFO] - [E:90| 700]: Train Loss:0.002903, Val MRR:0.08254, ice00001
2023-05-22 10:32:14,560 - [INFO] - [E:90| 800]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 10:33:51,866 - [INFO] - [E:90| 900]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 10:34:50,342 - [INFO] - [Epoch:90]: Training Loss:0.002902
2023-05-22 10:34:50,840 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 10:35:13,229 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 10:35:35,001 - [INFO] - [Evaluating Epoch 90 valid]:
MRR: Tail : 0.09459, Head : 0.06939, Avg : 0.08199
2023-05-22 10:35:35,002 - [INFO] - [Epoch 90]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 10:35:36,037 - [INFO] - [E:91| 0]: Train Loss:0.0029225, Val MRR:0.08254, ice00001
2023-05-22 10:37:13,117 - [INFO] - [E:91| 100]: Train Loss:0.0028989, Val MRR:0.08254, ice00001
2023-05-22 10:38:49,551 - [INFO] - [E:91| 200]: Train Loss:0.0028995, Val MRR:0.08254, ice00001
2023-05-22 10:40:25,972 - [INFO] - [E:91| 300]: Train Loss:0.0028994, Val MRR:0.08254, ice00001
2023-05-22 10:42:03,315 - [INFO] - [E:91| 400]: Train Loss:0.0028988, Val MRR:0.08254, ice00001
2023-05-22 10:43:30,481 - [INFO] - [E:91| 500]: Train Loss:0.0028996, Val MRR:0.08254, ice00001
2023-05-22 10:45:08,903 - [INFO] - [E:91| 600]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-22 10:46:46,821 - [INFO] - [E:91| 700]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-22 10:48:24,111 - [INFO] - [E:91| 800]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 10:50:01,729 - [INFO] - [E:91| 900]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 10:51:10,403 - [INFO] - [Epoch:91]: Training Loss:0.002902
2023-05-22 10:51:10,654 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 10:51:33,186 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 10:51:51,997 - [INFO] - [Evaluating Epoch 91 valid]:
MRR: Tail : 0.09551, Head : 0.06914, Avg : 0.08232
2023-05-22 10:51:51,997 - [INFO] - [Epoch 91]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 10:51:52,628 - [INFO] - [E:92| 0]: Train Loss:0.0028792, Val MRR:0.08254, ice00001
2023-05-22 10:53:22,202 - [INFO] - [E:92| 100]: Train Loss:0.0028912, Val MRR:0.08254, ice00001
2023-05-22 10:54:58,748 - [INFO] - [E:92| 200]: Train Loss:0.0028997, Val MRR:0.08254, ice00001
2023-05-22 10:56:36,611 - [INFO] - [E:92| 300]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 10:58:14,603 - [INFO] - [E:92| 400]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 10:59:51,772 - [INFO] - [E:92| 500]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-22 11:01:18,860 - [INFO] - [E:92| 600]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 11:02:56,342 - [INFO] - [E:92| 700]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 11:04:34,058 - [INFO] - [E:92| 800]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 11:06:14,032 - [INFO] - [E:92| 900]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-22 11:07:22,344 - [INFO] - [Epoch:92]: Training Loss:0.002902
2023-05-22 11:07:22,842 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 11:07:44,863 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 11:08:06,722 - [INFO] - [Evaluating Epoch 92 valid]:
MRR: Tail : 0.09476, Head : 0.06905, Avg : 0.0819
2023-05-22 11:08:06,723 - [INFO] - [Epoch 92]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 11:08:07,844 - [INFO] - [E:93| 0]: Train Loss:0.0029509, Val MRR:0.08254, ice00001
2023-05-22 11:09:34,826 - [INFO] - [E:93| 100]: Train Loss:0.002903, Val MRR:0.08254, ice00001
2023-05-22 11:11:12,265 - [INFO] - [E:93| 200]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-22 11:12:50,243 - [INFO] - [E:93| 300]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-22 11:14:28,439 - [INFO] - [E:93| 400]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 11:16:05,892 - [INFO] - [E:93| 500]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 11:17:43,394 - [INFO] - [E:93| 600]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 11:19:10,758 - [INFO] - [E:93| 700]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 11:20:48,753 - [INFO] - [E:93| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 11:22:25,935 - [INFO] - [E:93| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 11:23:34,124 - [INFO] - [Epoch:93]: Training Loss:0.002901
2023-05-22 11:23:34,508 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 11:23:57,233 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 11:24:19,265 - [INFO] - [Evaluating Epoch 93 valid]:
MRR: Tail : 0.09448, Head : 0.06874, Avg : 0.08161
2023-05-22 11:24:19,265 - [INFO] - [Epoch 93]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-22 11:24:20,473 - [INFO] - [E:94| 0]: Train Loss:0.0029823, Val MRR:0.08254, ice00001
2023-05-22 11:25:58,471 - [INFO] - [E:94| 100]: Train Loss:0.0029056, Val MRR:0.08254, ice00001
2023-05-22 11:27:25,910 - [INFO] - [E:94| 200]: Train Loss:0.002903, Val MRR:0.08254, ice00001
2023-05-22 11:29:03,353 - [INFO] - [E:94| 300]: Train Loss:0.0029029, Val MRR:0.08254, ice00001
2023-05-22 11:30:40,900 - [INFO] - [E:94| 400]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 11:32:18,246 - [INFO] - [E:94| 500]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-22 11:33:55,788 - [INFO] - [E:94| 600]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-22 11:35:29,439 - [INFO] - [E:94| 700]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 11:37:00,119 - [INFO] - [E:94| 800]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 11:38:37,296 - [INFO] - [E:94| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 11:39:46,083 - [INFO] - [Epoch:94]: Training Loss:0.002902
2023-05-22 11:39:46,334 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 11:40:09,106 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 11:40:31,383 - [INFO] - [Evaluating Epoch 94 valid]:
MRR: Tail : 0.09529, Head : 0.06928, Avg : 0.08229
2023-05-22 11:40:31,383 - [INFO] - [Epoch 94]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 11:40:32,489 - [INFO] - [E:95| 0]: Train Loss:0.0029417, Val MRR:0.08254, ice00001
2023-05-22 11:42:09,891 - [INFO] - [E:95| 100]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 11:43:47,330 - [INFO] - [E:95| 200]: Train Loss:0.0029039, Val MRR:0.08254, ice00001
2023-05-22 11:45:15,037 - [INFO] - [E:95| 300]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 11:46:52,371 - [INFO] - [E:95| 400]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 11:48:30,007 - [INFO] - [E:95| 500]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 11:50:07,663 - [INFO] - [E:95| 600]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 11:51:43,758 - [INFO] - [E:95| 700]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 11:53:10,791 - [INFO] - [E:95| 800]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 11:54:50,745 - [INFO] - [E:95| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 11:55:59,676 - [INFO] - [Epoch:95]: Training Loss:0.002901
2023-05-22 11:56:00,034 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 11:56:22,704 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 11:56:45,033 - [INFO] - [Evaluating Epoch 95 valid]:
MRR: Tail : 0.09447, Head : 0.06963, Avg : 0.08205
2023-05-22 11:56:45,033 - [INFO] - [Epoch 95]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 11:56:45,835 - [INFO] - [E:96| 0]: Train Loss:0.002907, Val MRR:0.08254, ice00001
2023-05-22 11:58:24,050 - [INFO] - [E:96| 100]: Train Loss:0.0028996, Val MRR:0.08254, ice00001
2023-05-22 12:00:01,797 - [INFO] - [E:96| 200]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 12:01:38,171 - [INFO] - [E:96| 300]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 12:03:07,081 - [INFO] - [E:96| 400]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 12:04:44,633 - [INFO] - [E:96| 500]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-22 12:06:20,733 - [INFO] - [E:96| 600]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 12:07:57,529 - [INFO] - [E:96| 700]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 12:09:34,457 - [INFO] - [E:96| 800]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 12:11:01,359 - [INFO] - [E:96| 900]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 12:12:10,119 - [INFO] - [Epoch:96]: Training Loss:0.002902
2023-05-22 12:12:10,480 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 12:12:32,853 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 12:12:54,819 - [INFO] - [Evaluating Epoch 96 valid]:
MRR: Tail : 0.09411, Head : 0.06935, Avg : 0.08173
2023-05-22 12:12:54,819 - [INFO] - [Epoch 96]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 12:12:55,859 - [INFO] - [E:97| 0]: Train Loss:0.0029107, Val MRR:0.08254, ice00001
2023-05-22 12:14:33,784 - [INFO] - [E:97| 100]: Train Loss:0.0028998, Val MRR:0.08254, ice00001
2023-05-22 12:16:11,518 - [INFO] - [E:97| 200]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-22 12:17:49,807 - [INFO] - [E:97| 300]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-22 12:19:17,558 - [INFO] - [E:97| 400]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 12:20:53,519 - [INFO] - [E:97| 500]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 12:22:30,502 - [INFO] - [E:97| 600]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 12:24:08,292 - [INFO] - [E:97| 700]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 12:25:45,345 - [INFO] - [E:97| 800]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 12:27:22,186 - [INFO] - [E:97| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 12:28:20,266 - [INFO] - [Epoch:97]: Training Loss:0.002902
2023-05-22 12:28:20,517 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 12:28:42,744 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 12:29:04,742 - [INFO] - [Evaluating Epoch 97 valid]:
MRR: Tail : 0.09408, Head : 0.06969, Avg : 0.08189
2023-05-22 12:29:04,742 - [INFO] - [Epoch 97]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 12:29:05,847 - [INFO] - [E:98| 0]: Train Loss:0.0029322, Val MRR:0.08254, ice00001
2023-05-22 12:30:43,964 - [INFO] - [E:98| 100]: Train Loss:0.0029003, Val MRR:0.08254, ice00001
2023-05-22 12:32:21,224 - [INFO] - [E:98| 200]: Train Loss:0.0029039, Val MRR:0.08254, ice00001
2023-05-22 12:33:58,768 - [INFO] - [E:98| 300]: Train Loss:0.0029038, Val MRR:0.08254, ice00001
2023-05-22 12:35:35,359 - [INFO] - [E:98| 400]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-22 12:37:02,764 - [INFO] - [E:98| 500]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 12:38:40,371 - [INFO] - [E:98| 600]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 12:40:17,469 - [INFO] - [E:98| 700]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 12:41:54,880 - [INFO] - [E:98| 800]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 12:43:35,196 - [INFO] - [E:98| 900]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 12:44:43,749 - [INFO] - [Epoch:98]: Training Loss:0.002902
2023-05-22 12:44:44,227 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 12:45:06,826 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 12:45:24,656 - [INFO] - [Evaluating Epoch 98 valid]:
MRR: Tail : 0.09453, Head : 0.06963, Avg : 0.08208
2023-05-22 12:45:24,656 - [INFO] - [Epoch 98]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 12:45:25,273 - [INFO] - [E:99| 0]: Train Loss:0.0028766, Val MRR:0.08254, ice00001
2023-05-22 12:46:57,071 - [INFO] - [E:99| 100]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-22 12:48:34,823 - [INFO] - [E:99| 200]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 12:50:11,067 - [INFO] - [E:99| 300]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-22 12:51:48,360 - [INFO] - [E:99| 400]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 12:53:26,473 - [INFO] - [E:99| 500]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 12:54:54,070 - [INFO] - [E:99| 600]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 12:56:32,405 - [INFO] - [E:99| 700]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-22 12:58:10,634 - [INFO] - [E:99| 800]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 12:59:48,607 - [INFO] - [E:99| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 13:00:57,748 - [INFO] - [Epoch:99]: Training Loss:0.002902
2023-05-22 13:00:58,122 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 13:01:21,077 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 13:01:43,639 - [INFO] - [Evaluating Epoch 99 valid]:
MRR: Tail : 0.09507, Head : 0.06927, Avg : 0.08217
MR: Tail : 752.95, Head : 916.68, Avg : 834.82
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.09831, Head : 0.06142, Avg : 0.07986
Hit-10: Tail : 0.21435, Head : 0.15378, Avg : 0.18406
2023-05-22 13:01:43,639 - [INFO] - [Epoch 99]: Training Loss: 0.0029016, Valid MRR: 0.08254,
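[Note, not part of the original log] The MR, MRR and Hit@k summaries printed every ten epochs above are the standard rank-based link-prediction metrics. As a minimal illustrative sketch (assuming per-query filtered ranks are available; function name and example ranks below are hypothetical, not taken from this codebase):

    # Illustrative only: how MR, MRR and Hit@k are conventionally derived
    # from a list of integer ranks (1 = correct entity ranked first).
    def summarize_ranks(ranks, ks=(1, 3, 10)):
        n = len(ranks)
        mr = sum(ranks) / n                          # mean rank
        mrr = sum(1.0 / r for r in ranks) / n        # mean reciprocal rank
        hits = {k: sum(1 for r in ranks if r <= k) / n for k in ks}
        return mr, mrr, hits

    if __name__ == "__main__":
        tail_ranks = [3, 1, 57, 820, 12]             # hypothetical example ranks
        mr, mrr, hits = summarize_ranks(tail_ranks)
        print(f"MR: {mr:.2f}  MRR: {mrr:.5f}  "
              + "  ".join(f"Hit-{k}: {v:.5f}" for k, v in hits.items()))

The logged "Avg" values are simply the mean of the Tail and Head numbers for each metric.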
2023-05-22 13:01:44,777 - [INFO] - [E:100| 0]: Train Loss:0.0028675, Val MRR:0.08254, ice00001
2023-05-22 13:03:11,916 - [INFO] - [E:100| 100]: Train Loss:0.0029063, Val MRR:0.08254, ice00001
2023-05-22 13:04:47,051 - [INFO] - [E:100| 200]: Train Loss:0.0029034, Val MRR:0.08254, ice00001
2023-05-22 13:06:24,673 - [INFO] - [E:100| 300]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 13:08:02,107 - [INFO] - [E:100| 400]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 13:09:39,902 - [INFO] - [E:100| 500]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-22 13:11:17,751 - [INFO] - [E:100| 600]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 13:12:45,255 - [INFO] - [E:100| 700]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 13:14:23,414 - [INFO] - [E:100| 800]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 13:16:00,773 - [INFO] - [E:100| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 13:17:09,237 - [INFO] - [Epoch:100]: Training Loss:0.002902
2023-05-22 13:17:09,578 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 13:17:32,065 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 13:17:53,649 - [INFO] - [Evaluating Epoch 100 valid]:
MRR: Tail : 0.09536, Head : 0.0693, Avg : 0.08233
2023-05-22 13:17:53,649 - [INFO] - [Epoch 100]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 13:17:54,679 - [INFO] - [E:101| 0]: Train Loss:0.0029455, Val MRR:0.08254, ice00001
2023-05-22 13:19:30,904 - [INFO] - [E:101| 100]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-22 13:20:58,073 - [INFO] - [E:101| 200]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-22 13:22:35,753 - [INFO] - [E:101| 300]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-22 13:24:13,724 - [INFO] - [E:101| 400]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 13:25:51,584 - [INFO] - [E:101| 500]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 13:27:28,595 - [INFO] - [E:101| 600]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 13:29:01,670 - [INFO] - [E:101| 700]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 13:30:32,894 - [INFO] - [E:101| 800]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 13:32:12,558 - [INFO] - [E:101| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 13:33:20,363 - [INFO] - [Epoch:101]: Training Loss:0.002902
2023-05-22 13:33:20,703 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 13:33:42,759 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 13:34:05,046 - [INFO] - [Evaluating Epoch 101 valid]:
MRR: Tail : 0.09555, Head : 0.06884, Avg : 0.0822
2023-05-22 13:34:05,046 - [INFO] - [Epoch 101]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 13:34:06,145 - [INFO] - [E:102| 0]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 13:35:43,456 - [INFO] - [E:102| 100]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-22 13:37:20,858 - [INFO] - [E:102| 200]: Train Loss:0.0028997, Val MRR:0.08254, ice00001
2023-05-22 13:38:48,241 - [INFO] - [E:102| 300]: Train Loss:0.0028992, Val MRR:0.08254, ice00001
2023-05-22 13:40:25,836 - [INFO] - [E:102| 400]: Train Loss:0.0028998, Val MRR:0.08254, ice00001
2023-05-22 13:42:02,480 - [INFO] - [E:102| 500]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 13:43:40,287 - [INFO] - [E:102| 600]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 13:45:17,374 - [INFO] - [E:102| 700]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 13:46:44,215 - [INFO] - [E:102| 800]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 13:48:21,520 - [INFO] - [E:102| 900]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 13:49:30,249 - [INFO] - [Epoch:102]: Training Loss:0.002902
2023-05-22 13:49:30,743 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 13:49:53,517 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 13:50:15,975 - [INFO] - [Evaluating Epoch 102 valid]:
MRR: Tail : 0.09555, Head : 0.0688, Avg : 0.08217
2023-05-22 13:50:15,975 - [INFO] - [Epoch 102]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 13:50:17,166 - [INFO] - [E:103| 0]: Train Loss:0.0029058, Val MRR:0.08254, ice00001
2023-05-22 13:51:54,398 - [INFO] - [E:103| 100]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-22 13:53:32,636 - [INFO] - [E:103| 200]: Train Loss:0.0029029, Val MRR:0.08254, ice00001
2023-05-22 13:55:09,942 - [INFO] - [E:103| 300]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-22 13:56:38,444 - [INFO] - [E:103| 400]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 13:58:15,470 - [INFO] - [E:103| 500]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 13:59:52,822 - [INFO] - [E:103| 600]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 14:01:29,283 - [INFO] - [E:103| 700]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 14:03:05,628 - [INFO] - [E:103| 800]: Train Loss:0.0029029, Val MRR:0.08254, ice00001
2023-05-22 14:04:32,686 - [INFO] - [E:103| 900]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 14:05:41,172 - [INFO] - [Epoch:103]: Training Loss:0.002902
2023-05-22 14:05:41,489 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 14:06:03,935 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 14:06:25,902 - [INFO] - [Evaluating Epoch 103 valid]:
MRR: Tail : 0.09343, Head : 0.07, Avg : 0.08171
2023-05-22 14:06:25,902 - [INFO] - [Epoch 103]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 14:06:26,969 - [INFO] - [E:104| 0]: Train Loss:0.0028772, Val MRR:0.08254, ice00001
2023-05-22 14:08:04,917 - [INFO] - [E:104| 100]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-22 14:09:43,158 - [INFO] - [E:104| 200]: Train Loss:0.0028996, Val MRR:0.08254, ice00001
2023-05-22 14:11:21,276 - [INFO] - [E:104| 300]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-22 14:12:50,384 - [INFO] - [E:104| 400]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 14:14:26,661 - [INFO] - [E:104| 500]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-22 14:16:04,736 - [INFO] - [E:104| 600]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 14:17:46,042 - [INFO] - [E:104| 700]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 14:19:32,184 - [INFO] - [E:104| 800]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 14:21:10,400 - [INFO] - [E:104| 900]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-22 14:22:09,330 - [INFO] - [Epoch:104]: Training Loss:0.002901
2023-05-22 14:22:09,661 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 14:22:32,598 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 14:22:54,555 - [INFO] - [Evaluating Epoch 104 valid]:
MRR: Tail : 0.0954, Head : 0.06896, Avg : 0.08218
2023-05-22 14:22:54,556 - [INFO] - [Epoch 104]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 14:22:55,666 - [INFO] - [E:105| 0]: Train Loss:0.0029104, Val MRR:0.08254, ice00001
2023-05-22 14:24:34,603 - [INFO] - [E:105| 100]: Train Loss:0.0028993, Val MRR:0.08254, ice00001
2023-05-22 14:26:12,459 - [INFO] - [E:105| 200]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 14:27:49,976 - [INFO] - [E:105| 300]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 14:29:28,009 - [INFO] - [E:105| 400]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 14:30:54,675 - [INFO] - [E:105| 500]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 14:32:32,186 - [INFO] - [E:105| 600]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 14:34:09,339 - [INFO] - [E:105| 700]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 14:35:47,036 - [INFO] - [E:105| 800]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 14:37:24,222 - [INFO] - [E:105| 900]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 14:38:32,691 - [INFO] - [Epoch:105]: Training Loss:0.002902
2023-05-22 14:38:32,993 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 14:38:55,508 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 14:39:15,554 - [INFO] - [Evaluating Epoch 105 valid]:
MRR: Tail : 0.09483, Head : 0.06901, Avg : 0.08192
2023-05-22 14:39:15,554 - [INFO] - [Epoch 105]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 14:39:16,169 - [INFO] - [E:106| 0]: Train Loss:0.0028902, Val MRR:0.08254, ice00001
2023-05-22 14:40:45,874 - [INFO] - [E:106| 100]: Train Loss:0.0028989, Val MRR:0.08254, ice00001
2023-05-22 14:42:23,049 - [INFO] - [E:106| 200]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-22 14:44:00,607 - [INFO] - [E:106| 300]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 14:45:37,257 - [INFO] - [E:106| 400]: Train Loss:0.0029036, Val MRR:0.08254, ice00001
2023-05-22 14:47:14,294 - [INFO] - [E:106| 500]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 14:48:41,068 - [INFO] - [E:106| 600]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 14:50:18,693 - [INFO] - [E:106| 700]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 14:51:56,522 - [INFO] - [E:106| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 14:53:34,468 - [INFO] - [E:106| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 14:54:43,439 - [INFO] - [Epoch:106]: Training Loss:0.002901
2023-05-22 14:54:43,690 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 14:55:06,305 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 14:55:28,348 - [INFO] - [Evaluating Epoch 106 valid]:
MRR: Tail : 0.09495, Head : 0.06866, Avg : 0.0818
2023-05-22 14:55:28,349 - [INFO] - [Epoch 106]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 14:55:29,466 - [INFO] - [E:107| 0]: Train Loss:0.0029619, Val MRR:0.08254, ice00001
2023-05-22 14:56:57,108 - [INFO] - [E:107| 100]: Train Loss:0.0028945, Val MRR:0.08254, ice00001
2023-05-22 14:58:34,575 - [INFO] - [E:107| 200]: Train Loss:0.0028984, Val MRR:0.08254, ice00001
2023-05-22 15:00:10,567 - [INFO] - [E:107| 300]: Train Loss:0.0028998, Val MRR:0.08254, ice00001
2023-05-22 15:01:47,452 - [INFO] - [E:107| 400]: Train Loss:0.0028997, Val MRR:0.08254, ice00001
2023-05-22 15:03:24,938 - [INFO] - [E:107| 500]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 15:05:01,990 - [INFO] - [E:107| 600]: Train Loss:0.0028997, Val MRR:0.08254, ice00001
2023-05-22 15:06:29,469 - [INFO] - [E:107| 700]: Train Loss:0.0028998, Val MRR:0.08254, ice00001
2023-05-22 15:08:09,163 - [INFO] - [E:107| 800]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-22 15:09:46,393 - [INFO] - [E:107| 900]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 15:10:54,506 - [INFO] - [Epoch:107]: Training Loss:0.002902
2023-05-22 15:10:54,848 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 15:11:17,524 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 15:11:39,614 - [INFO] - [Evaluating Epoch 107 valid]:
MRR: Tail : 0.0949, Head : 0.06908, Avg : 0.08199
2023-05-22 15:11:39,614 - [INFO] - [Epoch 107]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 15:11:40,705 - [INFO] - [E:108| 0]: Train Loss:0.0029456, Val MRR:0.08254, ice00001
2023-05-22 15:13:17,540 - [INFO] - [E:108| 100]: Train Loss:0.0029046, Val MRR:0.08254, ice00001
2023-05-22 15:14:43,944 - [INFO] - [E:108| 200]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 15:16:21,539 - [INFO] - [E:108| 300]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 15:17:58,895 - [INFO] - [E:108| 400]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-22 15:19:35,801 - [INFO] - [E:108| 500]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-22 15:21:12,844 - [INFO] - [E:108| 600]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 15:22:48,261 - [INFO] - [E:108| 700]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 15:24:17,690 - [INFO] - [E:108| 800]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 15:25:55,171 - [INFO] - [E:108| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 15:27:03,986 - [INFO] - [Epoch:108]: Training Loss:0.002902
2023-05-22 15:27:04,276 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 15:27:26,493 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 15:27:48,742 - [INFO] - [Evaluating Epoch 108 valid]:
MRR: Tail : 0.09474, Head : 0.06832, Avg : 0.08153
2023-05-22 15:27:48,743 - [INFO] - [Epoch 108]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 15:27:49,746 - [INFO] - [E:109| 0]: Train Loss:0.0028602, Val MRR:0.08254, ice00001
2023-05-22 15:29:26,349 - [INFO] - [E:109| 100]: Train Loss:0.0028973, Val MRR:0.08254, ice00001
2023-05-22 15:31:04,491 - [INFO] - [E:109| 200]: Train Loss:0.0028951, Val MRR:0.08254, ice00001
2023-05-22 15:32:31,628 - [INFO] - [E:109| 300]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-22 15:34:09,405 - [INFO] - [E:109| 400]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-22 15:35:46,712 - [INFO] - [E:109| 500]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 15:37:24,191 - [INFO] - [E:109| 600]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-22 15:39:02,109 - [INFO] - [E:109| 700]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-22 15:40:29,711 - [INFO] - [E:109| 800]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 15:42:06,746 - [INFO] - [E:109| 900]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 15:43:15,033 - [INFO] - [Epoch:109]: Training Loss:0.002901
2023-05-22 15:43:15,285 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 15:43:37,602 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 15:43:59,586 - [INFO] - [Evaluating Epoch 109 valid]:
MRR: Tail : 0.09462, Head : 0.06911, Avg : 0.08186
MR: Tail : 813.15, Head : 978.01, Avg : 895.58
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.09637, Head : 0.05947, Avg : 0.07792
Hit-10: Tail : 0.19942, Head : 0.16106, Avg : 0.18024
2023-05-22 15:43:59,586 - [INFO] - [Epoch 109]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 15:44:00,717 - [INFO] - [E:110| 0]: Train Loss:0.0028691, Val MRR:0.08254, ice00001
2023-05-22 15:45:38,280 - [INFO] - [E:110| 100]: Train Loss:0.0028985, Val MRR:0.08254, ice00001
2023-05-22 15:47:16,120 - [INFO] - [E:110| 200]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-22 15:48:50,283 - [INFO] - [E:110| 300]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-22 15:50:20,650 - [INFO] - [E:110| 400]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 15:51:58,203 - [INFO] - [E:110| 500]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 15:53:35,665 - [INFO] - [E:110| 600]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 15:55:12,599 - [INFO] - [E:110| 700]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 15:56:52,034 - [INFO] - [E:110| 800]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 15:58:19,156 - [INFO] - [E:110| 900]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 15:59:27,818 - [INFO] - [Epoch:110]: Training Loss:0.002902
2023-05-22 15:59:28,297 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 15:59:50,942 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 16:00:13,436 - [INFO] - [Evaluating Epoch 110 valid]:
MRR: Tail : 0.09543, Head : 0.06924, Avg : 0.08234
2023-05-22 16:00:13,436 - [INFO] - [Epoch 110]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 16:00:14,337 - [INFO] - [E:111| 0]: Train Loss:0.0028596, Val MRR:0.08254, ice00001
2023-05-22 16:01:52,258 - [INFO] - [E:111| 100]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-22 16:03:29,802 - [INFO] - [E:111| 200]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-22 16:05:07,407 - [INFO] - [E:111| 300]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 16:06:35,353 - [INFO] - [E:111| 400]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 16:08:12,546 - [INFO] - [E:111| 500]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 16:09:50,672 - [INFO] - [E:111| 600]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 16:11:27,453 - [INFO] - [E:111| 700]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 16:13:04,252 - [INFO] - [E:111| 800]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 16:14:42,628 - [INFO] - [E:111| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 16:15:41,314 - [INFO] - [Epoch:111]: Training Loss:0.002902
2023-05-22 16:15:41,586 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 16:16:03,951 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 16:16:26,129 - [INFO] - [Evaluating Epoch 111 valid]:
MRR: Tail : 0.0953, Head : 0.06919, Avg : 0.08224
2023-05-22 16:16:26,129 - [INFO] - [Epoch 111]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 16:16:27,249 - [INFO] - [E:112| 0]: Train Loss:0.0028696, Val MRR:0.08254, ice00001
2023-05-22 16:18:05,351 - [INFO] - [E:112| 100]: Train Loss:0.0029038, Val MRR:0.08254, ice00001
2023-05-22 16:19:43,118 - [INFO] - [E:112| 200]: Train Loss:0.0029029, Val MRR:0.08254, ice00001
2023-05-22 16:21:20,286 - [INFO] - [E:112| 300]: Train Loss:0.0029035, Val MRR:0.08254, ice00001
2023-05-22 16:22:57,359 - [INFO] - [E:112| 400]: Train Loss:0.0029045, Val MRR:0.08254, ice00001
2023-05-22 16:24:24,352 - [INFO] - [E:112| 500]: Train Loss:0.0029035, Val MRR:0.08254, ice00001
2023-05-22 16:26:00,794 - [INFO] - [E:112| 600]: Train Loss:0.002903, Val MRR:0.08254, ice00001
2023-05-22 16:27:37,836 - [INFO] - [E:112| 700]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 16:29:15,692 - [INFO] - [E:112| 800]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 16:30:53,466 - [INFO] - [E:112| 900]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 16:32:02,041 - [INFO] - [Epoch:112]: Training Loss:0.002901
2023-05-22 16:32:02,419 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 16:32:22,256 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 16:32:35,353 - [INFO] - [Evaluating Epoch 112 valid]:
MRR: Tail : 0.09432, Head : 0.06996, Avg : 0.08214
2023-05-22 16:32:35,353 - [INFO] - [Epoch 112]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-22 16:32:36,078 - [INFO] - [E:113| 0]: Train Loss:0.0028611, Val MRR:0.08254, ice00001
2023-05-22 16:34:13,329 - [INFO] - [E:113| 100]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 16:35:49,971 - [INFO] - [E:113| 200]: Train Loss:0.0028985, Val MRR:0.08254, ice00001
2023-05-22 16:37:26,919 - [INFO] - [E:113| 300]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-22 16:39:04,241 - [INFO] - [E:113| 400]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-22 16:40:40,664 - [INFO] - [E:113| 500]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 16:42:07,444 - [INFO] - [E:113| 600]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-22 16:43:47,157 - [INFO] - [E:113| 700]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 16:45:25,162 - [INFO] - [E:113| 800]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 16:47:02,253 - [INFO] - [E:113| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 16:48:10,769 - [INFO] - [Epoch:113]: Training Loss:0.002901
2023-05-22 16:48:11,032 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 16:48:33,421 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 16:48:55,658 - [INFO] - [Evaluating Epoch 113 valid]:
MRR: Tail : 0.09481, Head : 0.06917, Avg : 0.08199
2023-05-22 16:48:55,658 - [INFO] - [Epoch 113]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-22 16:48:56,773 - [INFO] - [E:114| 0]: Train Loss:0.0029601, Val MRR:0.08254, ice00001
2023-05-22 16:50:24,126 - [INFO] - [E:114| 100]: Train Loss:0.0029054, Val MRR:0.08254, ice00001
2023-05-22 16:52:01,102 - [INFO] - [E:114| 200]: Train Loss:0.0029033, Val MRR:0.08254, ice00001
2023-05-22 16:53:38,239 - [INFO] - [E:114| 300]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-22 16:55:14,573 - [INFO] - [E:114| 400]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 16:56:52,959 - [INFO] - [E:114| 500]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-22 16:58:28,696 - [INFO] - [E:114| 600]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 16:59:58,237 - [INFO] - [E:114| 700]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 17:01:35,968 - [INFO] - [E:114| 800]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 17:03:13,202 - [INFO] - [E:114| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 17:04:21,539 - [INFO] - [Epoch:114]: Training Loss:0.002902
2023-05-22 17:04:21,950 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 17:04:44,278 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 17:05:06,024 - [INFO] - [Evaluating Epoch 114 valid]:
MRR: Tail : 0.09498, Head : 0.06954, Avg : 0.08226
2023-05-22 17:05:06,024 - [INFO] - [Epoch 114]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 17:05:06,916 - [INFO] - [E:115| 0]: Train Loss:0.0028731, Val MRR:0.08254, ice00001
2023-05-22 17:06:44,431 - [INFO] - [E:115| 100]: Train Loss:0.0028986, Val MRR:0.08254, ice00001
2023-05-22 17:08:11,112 - [INFO] - [E:115| 200]: Train Loss:0.002903, Val MRR:0.08254, ice00001
2023-05-22 17:09:47,667 - [INFO] - [E:115| 300]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 17:11:24,603 - [INFO] - [E:115| 400]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-22 17:13:01,891 - [INFO] - [E:115| 500]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 17:14:38,961 - [INFO] - [E:115| 600]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 17:16:05,210 - [INFO] - [E:115| 700]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 17:17:42,839 - [INFO] - [E:115| 800]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 17:19:20,477 - [INFO] - [E:115| 900]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 17:20:28,744 - [INFO] - [Epoch:115]: Training Loss:0.002902
2023-05-22 17:20:29,063 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 17:20:51,645 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 17:21:13,804 - [INFO] - [Evaluating Epoch 115 valid]:
MRR: Tail : 0.09307, Head : 0.06987, Avg : 0.08147
2023-05-22 17:21:13,804 - [INFO] - [Epoch 115]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 17:21:14,870 - [INFO] - [E:116| 0]: Train Loss:0.0028612, Val MRR:0.08254, ice00001
2023-05-22 17:22:52,148 - [INFO] - [E:116| 100]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-22 17:24:26,743 - [INFO] - [E:116| 200]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 17:25:56,100 - [INFO] - [E:116| 300]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 17:27:33,975 - [INFO] - [E:116| 400]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 17:29:11,904 - [INFO] - [E:116| 500]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 17:30:49,667 - [INFO] - [E:116| 600]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 17:32:28,988 - [INFO] - [E:116| 700]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 17:33:56,258 - [INFO] - [E:116| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 17:35:34,101 - [INFO] - [E:116| 900]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 17:36:42,929 - [INFO] - [Epoch:116]: Training Loss:0.002902
2023-05-22 17:36:43,285 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 17:37:05,947 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 17:37:27,669 - [INFO] - [Evaluating Epoch 116 valid]:
MRR: Tail : 0.09573, Head : 0.06881, Avg : 0.08227
2023-05-22 17:37:27,669 - [INFO] - [Epoch 116]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 17:37:28,704 - [INFO] - [E:117| 0]: Train Loss:0.0028868, Val MRR:0.08254, ice00001
2023-05-22 17:39:06,183 - [INFO] - [E:117| 100]: Train Loss:0.0029055, Val MRR:0.08254, ice00001
2023-05-22 17:40:43,929 - [INFO] - [E:117| 200]: Train Loss:0.002905, Val MRR:0.08254, ice00001
2023-05-22 17:42:11,248 - [INFO] - [E:117| 300]: Train Loss:0.002904, Val MRR:0.08254, ice00001
2023-05-22 17:43:49,230 - [INFO] - [E:117| 400]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-22 17:45:27,307 - [INFO] - [E:117| 500]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 17:47:05,418 - [INFO] - [E:117| 600]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-22 17:48:43,604 - [INFO] - [E:117| 700]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 17:50:21,930 - [INFO] - [E:117| 800]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 17:51:49,055 - [INFO] - [E:117| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 17:52:56,972 - [INFO] - [Epoch:117]: Training Loss:0.002902
2023-05-22 17:52:57,223 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 17:53:19,322 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 17:53:41,280 - [INFO] - [Evaluating Epoch 117 valid]:
MRR: Tail : 0.09563, Head : 0.06864, Avg : 0.08213
2023-05-22 17:53:41,280 - [INFO] - [Epoch 117]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 17:53:42,331 - [INFO] - [E:118| 0]: Train Loss:0.0029078, Val MRR:0.08254, ice00001
2023-05-22 17:55:19,671 - [INFO] - [E:118| 100]: Train Loss:0.0029038, Val MRR:0.08254, ice00001
2023-05-22 17:56:57,530 - [INFO] - [E:118| 200]: Train Loss:0.0029029, Val MRR:0.08254, ice00001
2023-05-22 17:58:35,359 - [INFO] - [E:118| 300]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 18:00:02,839 - [INFO] - [E:118| 400]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-22 18:01:40,798 - [INFO] - [E:118| 500]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 18:03:18,112 - [INFO] - [E:118| 600]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 18:04:55,552 - [INFO] - [E:118| 700]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 18:06:32,292 - [INFO] - [E:118| 800]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 18:08:04,218 - [INFO] - [E:118| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 18:09:06,474 - [INFO] - [Epoch:118]: Training Loss:0.002902
2023-05-22 18:09:06,810 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 18:09:28,982 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 18:09:50,856 - [INFO] - [Evaluating Epoch 118 valid]:
MRR: Tail : 0.09485, Head : 0.06919, Avg : 0.08202
2023-05-22 18:09:50,856 - [INFO] - [Epoch 118]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 18:09:51,642 - [INFO] - [E:119| 0]: Train Loss:0.0028946, Val MRR:0.08254, ice00001
2023-05-22 18:11:29,007 - [INFO] - [E:119| 100]: Train Loss:0.0028988, Val MRR:0.08254, ice00001
2023-05-22 18:13:06,620 - [INFO] - [E:119| 200]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 18:14:44,962 - [INFO] - [E:119| 300]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-22 18:16:23,216 - [INFO] - [E:119| 400]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 18:17:50,715 - [INFO] - [E:119| 500]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 18:19:28,879 - [INFO] - [E:119| 600]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 18:21:08,201 - [INFO] - [E:119| 700]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 18:22:44,532 - [INFO] - [E:119| 800]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 18:24:22,338 - [INFO] - [E:119| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 18:25:30,843 - [INFO] - [Epoch:119]: Training Loss:0.002901
2023-05-22 18:25:31,053 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 18:25:44,334 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 18:26:04,439 - [INFO] - [Evaluating Epoch 119 valid]:
MRR: Tail : 0.09564, Head : 0.06883, Avg : 0.08223
MR: Tail : 813.93, Head : 1009.0, Avg : 911.46
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.09831, Head : 0.06142, Avg : 0.07986
Hit-10: Tail : 0.21532, Head : 0.15463, Avg : 0.18497
2023-05-22 18:26:04,440 - [INFO] - [Epoch 119]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 18:26:05,504 - [INFO] - [E:120| 0]: Train Loss:0.002944, Val MRR:0.08254, ice00001
2023-05-22 18:27:42,353 - [INFO] - [E:120| 100]: Train Loss:0.0029041, Val MRR:0.08254, ice00001
2023-05-22 18:29:19,591 - [INFO] - [E:120| 200]: Train Loss:0.0029042, Val MRR:0.08254, ice00001
2023-05-22 18:30:56,997 - [INFO] - [E:120| 300]: Train Loss:0.002903, Val MRR:0.08254, ice00001
2023-05-22 18:32:33,991 - [INFO] - [E:120| 400]: Train Loss:0.0029033, Val MRR:0.08254, ice00001
2023-05-22 18:34:11,981 - [INFO] - [E:120| 500]: Train Loss:0.0029032, Val MRR:0.08254, ice00001
2023-05-22 18:35:38,579 - [INFO] - [E:120| 600]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 18:37:15,663 - [INFO] - [E:120| 700]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-22 18:38:53,752 - [INFO] - [E:120| 800]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-22 18:40:32,014 - [INFO] - [E:120| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 18:41:40,398 - [INFO] - [Epoch:120]: Training Loss:0.002901
2023-05-22 18:41:40,649 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 18:42:03,326 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 18:42:25,678 - [INFO] - [Evaluating Epoch 120 valid]:
MRR: Tail : 0.09319, Head : 0.06999, Avg : 0.08159
2023-05-22 18:42:25,678 - [INFO] - [Epoch 120]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 18:42:26,790 - [INFO] - [E:121| 0]: Train Loss:0.0028711, Val MRR:0.08254, ice00001
2023-05-22 18:43:53,529 - [INFO] - [E:121| 100]: Train Loss:0.0029077, Val MRR:0.08254, ice00001
2023-05-22 18:45:31,238 - [INFO] - [E:121| 200]: Train Loss:0.0029042, Val MRR:0.08254, ice00001
2023-05-22 18:47:09,509 - [INFO] - [E:121| 300]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 18:48:47,466 - [INFO] - [E:121| 400]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 18:50:23,550 - [INFO] - [E:121| 500]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 18:51:50,166 - [INFO] - [E:121| 600]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-22 18:53:27,486 - [INFO] - [E:121| 700]: Train Loss:0.0029003, Val MRR:0.08254, ice00001
2023-05-22 18:55:05,384 - [INFO] - [E:121| 800]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 18:56:43,366 - [INFO] - [E:121| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 18:57:52,242 - [INFO] - [Epoch:121]: Training Loss:0.002902
2023-05-22 18:57:52,607 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 18:58:15,496 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 18:58:38,011 - [INFO] - [Evaluating Epoch 121 valid]:
MRR: Tail : 0.09505, Head : 0.06853, Avg : 0.08179
2023-05-22 18:58:38,011 - [INFO] - [Epoch 121]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 18:58:39,141 - [INFO] - [E:122| 0]: Train Loss:0.0028968, Val MRR:0.08254, ice00001
2023-05-22 19:00:16,879 - [INFO] - [E:122| 100]: Train Loss:0.0029035, Val MRR:0.08254, ice00001
2023-05-22 19:01:43,782 - [INFO] - [E:122| 200]: Train Loss:0.0029035, Val MRR:0.08254, ice00001
2023-05-22 19:03:21,013 - [INFO] - [E:122| 300]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 19:04:57,422 - [INFO] - [E:122| 400]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 19:06:34,333 - [INFO] - [E:122| 500]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 19:08:13,314 - [INFO] - [E:122| 600]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 19:09:40,763 - [INFO] - [E:122| 700]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 19:11:18,370 - [INFO] - [E:122| 800]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 19:12:56,703 - [INFO] - [E:122| 900]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 19:14:05,603 - [INFO] - [Epoch:122]: Training Loss:0.002902
2023-05-22 19:14:05,854 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 19:14:28,540 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 19:14:50,929 - [INFO] - [Evaluating Epoch 122 valid]:
MRR: Tail : 0.09512, Head : 0.06927, Avg : 0.0822
2023-05-22 19:14:50,929 - [INFO] - [Epoch 122]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 19:14:52,058 - [INFO] - [E:123| 0]: Train Loss:0.0028764, Val MRR:0.08254, ice00001
2023-05-22 19:16:29,408 - [INFO] - [E:123| 100]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-22 19:18:02,651 - [INFO] - [E:123| 200]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-22 19:19:33,092 - [INFO] - [E:123| 300]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 19:21:10,920 - [INFO] - [E:123| 400]: Train Loss:0.0029036, Val MRR:0.08254, ice00001
2023-05-22 19:22:49,292 - [INFO] - [E:123| 500]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 19:24:26,939 - [INFO] - [E:123| 600]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 19:26:04,331 - [INFO] - [E:123| 700]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 19:27:32,025 - [INFO] - [E:123| 800]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 19:29:09,124 - [INFO] - [E:123| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 19:30:17,758 - [INFO] - [Epoch:123]: Training Loss:0.002902
2023-05-22 19:30:18,167 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 19:30:40,580 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 19:31:02,624 - [INFO] - [Evaluating Epoch 123 valid]:
MRR: Tail : 0.09223, Head : 0.06872, Avg : 0.08047
2023-05-22 19:31:02,624 - [INFO] - [Epoch 123]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 19:31:03,563 - [INFO] - [E:124| 0]: Train Loss:0.0028778, Val MRR:0.08254, ice00001
2023-05-22 19:32:40,868 - [INFO] - [E:124| 100]: Train Loss:0.0028964, Val MRR:0.08254, ice00001
2023-05-22 19:34:17,516 - [INFO] - [E:124| 200]: Train Loss:0.0028988, Val MRR:0.08254, ice00001
2023-05-22 19:35:45,255 - [INFO] - [E:124| 300]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 19:37:22,225 - [INFO] - [E:124| 400]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 19:38:59,017 - [INFO] - [E:124| 500]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 19:40:36,009 - [INFO] - [E:124| 600]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 19:42:13,087 - [INFO] - [E:124| 700]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 19:43:50,411 - [INFO] - [E:124| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 19:45:17,071 - [INFO] - [E:124| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 19:46:25,439 - [INFO] - [Epoch:124]: Training Loss:0.002901
2023-05-22 19:46:25,812 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 19:46:48,172 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 19:47:10,183 - [INFO] - [Evaluating Epoch 124 valid]:
MRR: Tail : 0.09542, Head : 0.06903, Avg : 0.08222
2023-05-22 19:47:10,183 - [INFO] - [Epoch 124]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 19:47:11,358 - [INFO] - [E:125| 0]: Train Loss:0.0028904, Val MRR:0.08254, ice00001
2023-05-22 19:48:47,155 - [INFO] - [E:125| 100]: Train Loss:0.0029074, Val MRR:0.08254, ice00001
2023-05-22 19:50:23,902 - [INFO] - [E:125| 200]: Train Loss:0.002905, Val MRR:0.08254, ice00001
2023-05-22 19:52:01,308 - [INFO] - [E:125| 300]: Train Loss:0.0029051, Val MRR:0.08254, ice00001
2023-05-22 19:53:28,503 - [INFO] - [E:125| 400]: Train Loss:0.002904, Val MRR:0.08254, ice00001
2023-05-22 19:55:06,020 - [INFO] - [E:125| 500]: Train Loss:0.0029037, Val MRR:0.08254, ice00001
2023-05-22 19:56:45,453 - [INFO] - [E:125| 600]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 19:58:23,377 - [INFO] - [E:125| 700]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-22 20:00:00,859 - [INFO] - [E:125| 800]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-22 20:01:34,358 - [INFO] - [E:125| 900]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 20:02:35,651 - [INFO] - [Epoch:125]: Training Loss:0.002901
2023-05-22 20:02:35,982 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 20:02:58,112 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 20:03:20,244 - [INFO] - [Evaluating Epoch 125 valid]:
MRR: Tail : 0.09404, Head : 0.06903, Avg : 0.08153
2023-05-22 20:03:20,244 - [INFO] - [Epoch 125]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-22 20:03:21,359 - [INFO] - [E:126| 0]: Train Loss:0.0028769, Val MRR:0.08254, ice00001
2023-05-22 20:04:58,363 - [INFO] - [E:126| 100]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 20:06:35,714 - [INFO] - [E:126| 200]: Train Loss:0.0028994, Val MRR:0.08254, ice00001
2023-05-22 20:08:13,302 - [INFO] - [E:126| 300]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-22 20:09:51,020 - [INFO] - [E:126| 400]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-22 20:11:18,921 - [INFO] - [E:126| 500]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-22 20:12:56,567 - [INFO] - [E:126| 600]: Train Loss:0.0029003, Val MRR:0.08254, ice00001
2023-05-22 20:14:34,128 - [INFO] - [E:126| 700]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-22 20:16:11,370 - [INFO] - [E:126| 800]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 20:17:48,065 - [INFO] - [E:126| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 20:18:53,738 - [INFO] - [Epoch:126]: Training Loss:0.002901
2023-05-22 20:18:53,955 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 20:19:08,411 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 20:19:30,437 - [INFO] - [Evaluating Epoch 126 valid]:
MRR: Tail : 0.0951, Head : 0.06876, Avg : 0.08193
2023-05-22 20:19:30,437 - [INFO] - [Epoch 126]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 20:19:31,542 - [INFO] - [E:127| 0]: Train Loss:0.0029062, Val MRR:0.08254, ice00001
2023-05-22 20:21:09,053 - [INFO] - [E:127| 100]: Train Loss:0.0028953, Val MRR:0.08254, ice00001
2023-05-22 20:22:46,450 - [INFO] - [E:127| 200]: Train Loss:0.0028998, Val MRR:0.08254, ice00001
2023-05-22 20:24:23,939 - [INFO] - [E:127| 300]: Train Loss:0.0028995, Val MRR:0.08254, ice00001
2023-05-22 20:26:01,171 - [INFO] - [E:127| 400]: Train Loss:0.0028993, Val MRR:0.08254, ice00001
2023-05-22 20:27:36,562 - [INFO] - [E:127| 500]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-22 20:29:06,177 - [INFO] - [E:127| 600]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 20:30:43,055 - [INFO] - [E:127| 700]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 20:32:19,494 - [INFO] - [E:127| 800]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 20:33:57,139 - [INFO] - [E:127| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 20:35:06,146 - [INFO] - [Epoch:127]: Training Loss:0.002902
2023-05-22 20:35:06,503 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 20:35:29,088 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 20:35:51,194 - [INFO] - [Evaluating Epoch 127 valid]:
MRR: Tail : 0.09451, Head : 0.06968, Avg : 0.08209
2023-05-22 20:35:51,194 - [INFO] - [Epoch 127]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 20:35:52,091 - [INFO] - [E:128| 0]: Train Loss:0.0029299, Val MRR:0.08254, ice00001
2023-05-22 20:37:19,094 - [INFO] - [E:128| 100]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 20:38:56,216 - [INFO] - [E:128| 200]: Train Loss:0.0029051, Val MRR:0.08254, ice00001
2023-05-22 20:40:33,202 - [INFO] - [E:128| 300]: Train Loss:0.0029043, Val MRR:0.08254, ice00001
2023-05-22 20:42:10,455 - [INFO] - [E:128| 400]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 20:43:47,777 - [INFO] - [E:128| 500]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 20:45:18,352 - [INFO] - [E:128| 600]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 20:46:52,641 - [INFO] - [E:128| 700]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 20:48:30,238 - [INFO] - [E:128| 800]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 20:50:07,480 - [INFO] - [E:128| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 20:51:15,568 - [INFO] - [Epoch:128]: Training Loss:0.002901
2023-05-22 20:51:16,029 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 20:51:38,701 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 20:52:00,800 - [INFO] - [Evaluating Epoch 128 valid]:
MRR: Tail : 0.09305, Head : 0.06394, Avg : 0.07849
2023-05-22 20:52:00,800 - [INFO] - [Epoch 128]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-22 20:52:01,754 - [INFO] - [E:129| 0]: Train Loss:0.0028908, Val MRR:0.08254, ice00001
2023-05-22 20:53:38,909 - [INFO] - [E:129| 100]: Train Loss:0.0029031, Val MRR:0.08254, ice00001
2023-05-22 20:55:06,820 - [INFO] - [E:129| 200]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 20:56:44,885 - [INFO] - [E:129| 300]: Train Loss:0.0029039, Val MRR:0.08254, ice00001
2023-05-22 20:58:22,740 - [INFO] - [E:129| 400]: Train Loss:0.0029033, Val MRR:0.08254, ice00001
2023-05-22 20:59:59,359 - [INFO] - [E:129| 500]: Train Loss:0.002903, Val MRR:0.08254, ice00001
2023-05-22 21:01:36,507 - [INFO] - [E:129| 600]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 21:03:03,356 - [INFO] - [E:129| 700]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 21:04:40,479 - [INFO] - [E:129| 800]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 21:06:17,116 - [INFO] - [E:129| 900]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 21:07:25,077 - [INFO] - [Epoch:129]: Training Loss:0.002902
2023-05-22 21:07:25,412 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 21:07:47,998 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 21:08:09,923 - [INFO] - [Evaluating Epoch 129 valid]:
MRR: Tail : 0.09274, Head : 0.06949, Avg : 0.08112
MR: Tail : 756.82, Head : 920.39, Avg : 838.61
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.08181, Head : 0.06506, Avg : 0.07343
Hit-10: Tail : 0.2101, Head : 0.15305, Avg : 0.18158
2023-05-22 21:08:09,923 - [INFO] - [Epoch 129]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 21:08:10,991 - [INFO] - [E:130| 0]: Train Loss:0.0028711, Val MRR:0.08254, ice00001
2023-05-22 21:09:47,674 - [INFO] - [E:130| 100]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 21:11:17,606 - [INFO] - [E:130| 200]: Train Loss:0.0029032, Val MRR:0.08254, ice00001
2023-05-22 21:12:51,512 - [INFO] - [E:130| 300]: Train Loss:0.0028999, Val MRR:0.08254, ice00001
2023-05-22 21:14:28,038 - [INFO] - [E:130| 400]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-22 21:16:04,052 - [INFO] - [E:130| 500]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 21:17:41,912 - [INFO] - [E:130| 600]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 21:19:19,154 - [INFO] - [E:130| 700]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 21:20:46,419 - [INFO] - [E:130| 800]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-22 21:22:23,449 - [INFO] - [E:130| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 21:23:32,065 - [INFO] - [Epoch:130]: Training Loss:0.002901
2023-05-22 21:23:32,392 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 21:23:55,214 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 21:24:17,355 - [INFO] - [Evaluating Epoch 130 valid]:
MRR: Tail : 0.09421, Head : 0.06901, Avg : 0.08161
2023-05-22 21:24:17,356 - [INFO] - [Epoch 130]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 21:24:18,561 - [INFO] - [E:131| 0]: Train Loss:0.0028511, Val MRR:0.08254, ice00001
2023-05-22 21:25:55,690 - [INFO] - [E:131| 100]: Train Loss:0.0029033, Val MRR:0.08254, ice00001
2023-05-22 21:27:33,151 - [INFO] - [E:131| 200]: Train Loss:0.0028996, Val MRR:0.08254, ice00001
2023-05-22 21:28:59,357 - [INFO] - [E:131| 300]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 21:30:36,217 - [INFO] - [E:131| 400]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 21:32:14,169 - [INFO] - [E:131| 500]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 21:33:53,725 - [INFO] - [E:131| 600]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 21:35:31,039 - [INFO] - [E:131| 700]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-22 21:37:08,118 - [INFO] - [E:131| 800]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 21:38:35,784 - [INFO] - [E:131| 900]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 21:39:44,020 - [INFO] - [Epoch:131]: Training Loss:0.002901
2023-05-22 21:39:44,468 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 21:40:06,908 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 21:40:29,031 - [INFO] - [Evaluating Epoch 131 valid]:
MRR: Tail : 0.09425, Head : 0.06868, Avg : 0.08146
2023-05-22 21:40:29,031 - [INFO] - [Epoch 131]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 21:40:29,793 - [INFO] - [E:132| 0]: Train Loss:0.0028842, Val MRR:0.08254, ice00001
2023-05-22 21:42:08,116 - [INFO] - [E:132| 100]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 21:43:44,929 - [INFO] - [E:132| 200]: Train Loss:0.0029042, Val MRR:0.08254, ice00001
2023-05-22 21:45:22,049 - [INFO] - [E:132| 300]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-22 21:46:49,179 - [INFO] - [E:132| 400]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 21:48:26,620 - [INFO] - [E:132| 500]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 21:50:03,872 - [INFO] - [E:132| 600]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 21:51:41,077 - [INFO] - [E:132| 700]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 21:53:18,390 - [INFO] - [E:132| 800]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 21:54:50,126 - [INFO] - [E:132| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 21:55:54,144 - [INFO] - [Epoch:132]: Training Loss:0.002901
2023-05-22 21:55:54,522 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 21:56:16,985 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 21:56:39,166 - [INFO] - [Evaluating Epoch 132 valid]:
MRR: Tail : 0.09554, Head : 0.06916, Avg : 0.08235
2023-05-22 21:56:39,166 - [INFO] - [Epoch 132]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 21:56:39,961 - [INFO] - [E:133| 0]: Train Loss:0.0028536, Val MRR:0.08254, ice00001
2023-05-22 21:58:16,147 - [INFO] - [E:133| 100]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 21:59:53,100 - [INFO] - [E:133| 200]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 22:01:30,433 - [INFO] - [E:133| 300]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-22 22:03:08,192 - [INFO] - [E:133| 400]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-22 22:04:35,657 - [INFO] - [E:133| 500]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 22:06:13,760 - [INFO] - [E:133| 600]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 22:07:51,402 - [INFO] - [E:133| 700]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 22:09:29,000 - [INFO] - [E:133| 800]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 22:11:06,803 - [INFO] - [E:133| 900]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 22:12:08,983 - [INFO] - [Epoch:133]: Training Loss:0.002901
2023-05-22 22:12:09,192 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 22:12:26,017 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 22:12:47,828 - [INFO] - [Evaluating Epoch 133 valid]:
MRR: Tail : 0.09406, Head : 0.06995, Avg : 0.08201
2023-05-22 22:12:47,828 - [INFO] - [Epoch 133]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-22 22:12:48,764 - [INFO] - [E:134| 0]: Train Loss:0.0029431, Val MRR:0.08254, ice00001
2023-05-22 22:14:25,913 - [INFO] - [E:134| 100]: Train Loss:0.0029048, Val MRR:0.08254, ice00001
2023-05-22 22:16:03,593 - [INFO] - [E:134| 200]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 22:17:41,468 - [INFO] - [E:134| 300]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 22:19:19,036 - [INFO] - [E:134| 400]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 22:20:55,963 - [INFO] - [E:134| 500]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 22:22:26,056 - [INFO] - [E:134| 600]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 22:24:03,671 - [INFO] - [E:134| 700]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 22:25:41,351 - [INFO] - [E:134| 800]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 22:27:17,180 - [INFO] - [E:134| 900]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 22:28:25,827 - [INFO] - [Epoch:134]: Training Loss:0.002902
2023-05-22 22:28:26,206 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 22:28:48,756 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 22:29:10,924 - [INFO] - [Evaluating Epoch 134 valid]:
MRR: Tail : 0.09439, Head : 0.06856, Avg : 0.08147
2023-05-22 22:29:10,925 - [INFO] - [Epoch 134]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 22:29:12,072 - [INFO] - [E:135| 0]: Train Loss:0.0029326, Val MRR:0.08254, ice00001
2023-05-22 22:30:39,799 - [INFO] - [E:135| 100]: Train Loss:0.0028984, Val MRR:0.08254, ice00001
2023-05-22 22:32:17,319 - [INFO] - [E:135| 200]: Train Loss:0.0028978, Val MRR:0.08254, ice00001
2023-05-22 22:33:54,892 - [INFO] - [E:135| 300]: Train Loss:0.0028991, Val MRR:0.08254, ice00001
2023-05-22 22:35:33,039 - [INFO] - [E:135| 400]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-22 22:37:11,594 - [INFO] - [E:135| 500]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-22 22:38:38,916 - [INFO] - [E:135| 600]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 22:40:16,953 - [INFO] - [E:135| 700]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 22:41:53,150 - [INFO] - [E:135| 800]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-22 22:43:30,414 - [INFO] - [E:135| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 22:44:39,002 - [INFO] - [Epoch:135]: Training Loss:0.002902
2023-05-22 22:44:39,252 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 22:45:01,661 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 22:45:23,663 - [INFO] - [Evaluating Epoch 135 valid]:
MRR: Tail : 0.09567, Head : 0.06873, Avg : 0.0822
2023-05-22 22:45:23,663 - [INFO] - [Epoch 135]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 22:45:24,773 - [INFO] - [E:136| 0]: Train Loss:0.002892, Val MRR:0.08254, ice00001
2023-05-22 22:46:56,443 - [INFO] - [E:136| 100]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-22 22:48:29,790 - [INFO] - [E:136| 200]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 22:50:06,563 - [INFO] - [E:136| 300]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 22:51:43,435 - [INFO] - [E:136| 400]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 22:53:20,827 - [INFO] - [E:136| 500]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 22:54:57,507 - [INFO] - [E:136| 600]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 22:56:24,172 - [INFO] - [E:136| 700]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 22:58:01,927 - [INFO] - [E:136| 800]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-22 22:59:40,360 - [INFO] - [E:136| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 23:00:49,556 - [INFO] - [Epoch:136]: Training Loss:0.002902
2023-05-22 23:00:49,970 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 23:01:12,245 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 23:01:34,618 - [INFO] - [Evaluating Epoch 136 valid]:
MRR: Tail : 0.09518, Head : 0.06913, Avg : 0.08215
2023-05-22 23:01:34,618 - [INFO] - [Epoch 136]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 23:01:35,505 - [INFO] - [E:137| 0]: Train Loss:0.0029063, Val MRR:0.08254, ice00001
2023-05-22 23:03:13,295 - [INFO] - [E:137| 100]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-22 23:04:40,799 - [INFO] - [E:137| 200]: Train Loss:0.0028985, Val MRR:0.08254, ice00001
2023-05-22 23:06:17,545 - [INFO] - [E:137| 300]: Train Loss:0.0028991, Val MRR:0.08254, ice00001
2023-05-22 23:07:54,627 - [INFO] - [E:137| 400]: Train Loss:0.0029003, Val MRR:0.08254, ice00001
2023-05-22 23:09:33,424 - [INFO] - [E:137| 500]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-22 23:11:10,611 - [INFO] - [E:137| 600]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-22 23:12:49,252 - [INFO] - [E:137| 700]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 23:14:16,911 - [INFO] - [E:137| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 23:15:54,910 - [INFO] - [E:137| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 23:17:03,980 - [INFO] - [Epoch:137]: Training Loss:0.002902
2023-05-22 23:17:04,232 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 23:17:26,591 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 23:17:48,835 - [INFO] - [Evaluating Epoch 137 valid]:
MRR: Tail : 0.09541, Head : 0.06883, Avg : 0.08212
2023-05-22 23:17:48,835 - [INFO] - [Epoch 137]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 23:17:49,944 - [INFO] - [E:138| 0]: Train Loss:0.0029325, Val MRR:0.08254, ice00001
2023-05-22 23:19:27,407 - [INFO] - [E:138| 100]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 23:21:04,051 - [INFO] - [E:138| 200]: Train Loss:0.0029048, Val MRR:0.08254, ice00001
2023-05-22 23:22:30,846 - [INFO] - [E:138| 300]: Train Loss:0.0029044, Val MRR:0.08254, ice00001
2023-05-22 23:24:06,221 - [INFO] - [E:138| 400]: Train Loss:0.0029038, Val MRR:0.08254, ice00001
2023-05-22 23:25:42,099 - [INFO] - [E:138| 500]: Train Loss:0.0029031, Val MRR:0.08254, ice00001
2023-05-22 23:27:19,274 - [INFO] - [E:138| 600]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-22 23:28:55,908 - [INFO] - [E:138| 700]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-22 23:30:30,692 - [INFO] - [E:138| 800]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-22 23:32:01,286 - [INFO] - [E:138| 900]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-22 23:33:10,145 - [INFO] - [Epoch:138]: Training Loss:0.002902
2023-05-22 23:33:10,557 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 23:33:32,768 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 23:33:55,062 - [INFO] - [Evaluating Epoch 138 valid]:
MRR: Tail : 0.09428, Head : 0.06876, Avg : 0.08152
2023-05-22 23:33:55,062 - [INFO] - [Epoch 138]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-22 23:33:56,062 - [INFO] - [E:139| 0]: Train Loss:0.0029208, Val MRR:0.08254, ice00001
2023-05-22 23:35:34,336 - [INFO] - [E:139| 100]: Train Loss:0.0029034, Val MRR:0.08254, ice00001
2023-05-22 23:37:12,027 - [INFO] - [E:139| 200]: Train Loss:0.0029035, Val MRR:0.08254, ice00001
2023-05-22 23:38:49,160 - [INFO] - [E:139| 300]: Train Loss:0.0029038, Val MRR:0.08254, ice00001
2023-05-22 23:40:16,396 - [INFO] - [E:139| 400]: Train Loss:0.0029035, Val MRR:0.08254, ice00001
2023-05-22 23:41:54,480 - [INFO] - [E:139| 500]: Train Loss:0.0029031, Val MRR:0.08254, ice00001
2023-05-22 23:43:32,614 - [INFO] - [E:139| 600]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-22 23:45:10,973 - [INFO] - [E:139| 700]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-22 23:46:49,244 - [INFO] - [E:139| 800]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-22 23:48:17,076 - [INFO] - [E:139| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-22 23:49:25,482 - [INFO] - [Epoch:139]: Training Loss:0.002901
2023-05-22 23:49:25,755 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-22 23:49:48,325 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-22 23:50:10,721 - [INFO] - [Evaluating Epoch 139 valid]:
MRR: Tail : 0.09505, Head : 0.0694, Avg : 0.08223
MR: Tail : 849.57, Head : 989.43, Avg : 919.5
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.09977, Head : 0.06251, Avg : 0.08114
Hit-10: Tail : 0.21083, Head : 0.15754, Avg : 0.18418
2023-05-22 23:50:10,722 - [INFO] - [Epoch 139]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-22 23:50:11,571 - [INFO] - [E:140| 0]: Train Loss:0.0028737, Val MRR:0.08254, ice00001
2023-05-22 23:51:48,660 - [INFO] - [E:140| 100]: Train Loss:0.0029057, Val MRR:0.08254, ice00001
2023-05-22 23:53:24,543 - [INFO] - [E:140| 200]: Train Loss:0.0029039, Val MRR:0.08254, ice00001
2023-05-22 23:55:01,554 - [INFO] - [E:140| 300]: Train Loss:0.0029036, Val MRR:0.08254, ice00001
2023-05-22 23:56:38,306 - [INFO] - [E:140| 400]: Train Loss:0.0029031, Val MRR:0.08254, ice00001
2023-05-22 23:58:09,030 - [INFO] - [E:140| 500]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-22 23:59:47,190 - [INFO] - [E:140| 600]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-23 00:01:24,653 - [INFO] - [E:140| 700]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 00:03:02,578 - [INFO] - [E:140| 800]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 00:04:40,568 - [INFO] - [E:140| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 00:05:39,286 - [INFO] - [Epoch:140]: Training Loss:0.002902
2023-05-23 00:05:39,604 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 00:06:02,714 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 00:06:24,939 - [INFO] - [Evaluating Epoch 140 valid]:
MRR: Tail : 0.09508, Head : 0.06951, Avg : 0.0823
2023-05-23 00:06:24,939 - [INFO] - [Epoch 140]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-23 00:06:26,088 - [INFO] - [E:141| 0]: Train Loss:0.0028834, Val MRR:0.08254, ice00001
2023-05-23 00:08:02,131 - [INFO] - [E:141| 100]: Train Loss:0.0029029, Val MRR:0.08254, ice00001
2023-05-23 00:09:39,438 - [INFO] - [E:141| 200]: Train Loss:0.0029041, Val MRR:0.08254, ice00001
2023-05-23 00:11:16,938 - [INFO] - [E:141| 300]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 00:12:55,004 - [INFO] - [E:141| 400]: Train Loss:0.0028986, Val MRR:0.08254, ice00001
2023-05-23 00:14:22,087 - [INFO] - [E:141| 500]: Train Loss:0.0028994, Val MRR:0.08254, ice00001
2023-05-23 00:15:59,670 - [INFO] - [E:141| 600]: Train Loss:0.0028997, Val MRR:0.08254, ice00001
2023-05-23 00:17:37,881 - [INFO] - [E:141| 700]: Train Loss:0.0028994, Val MRR:0.08254, ice00001
2023-05-23 00:19:15,602 - [INFO] - [E:141| 800]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-23 00:20:53,132 - [INFO] - [E:141| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 00:22:00,788 - [INFO] - [Epoch:141]: Training Loss:0.002901
2023-05-23 00:22:01,290 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 00:22:23,214 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 00:22:41,421 - [INFO] - [Evaluating Epoch 141 valid]:
MRR: Tail : 0.0953, Head : 0.06936, Avg : 0.08233
2023-05-23 00:22:41,421 - [INFO] - [Epoch 141]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 00:22:42,038 - [INFO] - [E:142| 0]: Train Loss:0.0029307, Val MRR:0.08254, ice00001
2023-05-23 00:24:12,434 - [INFO] - [E:142| 100]: Train Loss:0.0029113, Val MRR:0.08254, ice00001
2023-05-23 00:25:49,504 - [INFO] - [E:142| 200]: Train Loss:0.0029089, Val MRR:0.08254, ice00001
2023-05-23 00:27:26,249 - [INFO] - [E:142| 300]: Train Loss:0.002905, Val MRR:0.08254, ice00001
2023-05-23 00:29:03,236 - [INFO] - [E:142| 400]: Train Loss:0.0029047, Val MRR:0.08254, ice00001
2023-05-23 00:30:40,111 - [INFO] - [E:142| 500]: Train Loss:0.0029032, Val MRR:0.08254, ice00001
2023-05-23 00:32:07,182 - [INFO] - [E:142| 600]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-23 00:33:44,739 - [INFO] - [E:142| 700]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 00:35:21,910 - [INFO] - [E:142| 800]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 00:36:57,912 - [INFO] - [E:142| 900]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-23 00:38:05,631 - [INFO] - [Epoch:142]: Training Loss:0.002901
2023-05-23 00:38:06,122 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 00:38:28,625 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 00:38:51,103 - [INFO] - [Evaluating Epoch 142 valid]:
MRR: Tail : 0.09512, Head : 0.06951, Avg : 0.08231
2023-05-23 00:38:51,104 - [INFO] - [Epoch 142]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-23 00:38:52,189 - [INFO] - [E:143| 0]: Train Loss:0.0028851, Val MRR:0.08254, ice00001
2023-05-23 00:40:18,868 - [INFO] - [E:143| 100]: Train Loss:0.0028989, Val MRR:0.08254, ice00001
2023-05-23 00:41:56,018 - [INFO] - [E:143| 200]: Train Loss:0.0029029, Val MRR:0.08254, ice00001
2023-05-23 00:43:33,342 - [INFO] - [E:143| 300]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-23 00:45:13,219 - [INFO] - [E:143| 400]: Train Loss:0.0028999, Val MRR:0.08254, ice00001
2023-05-23 00:46:51,038 - [INFO] - [E:143| 500]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-23 00:48:28,740 - [INFO] - [E:143| 600]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-23 00:49:55,382 - [INFO] - [E:143| 700]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 00:51:31,429 - [INFO] - [E:143| 800]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 00:53:08,003 - [INFO] - [E:143| 900]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 00:54:16,202 - [INFO] - [Epoch:143]: Training Loss:0.002902
2023-05-23 00:54:16,701 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 00:54:39,447 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 00:55:01,631 - [INFO] - [Evaluating Epoch 143 valid]:
MRR: Tail : 0.09351, Head : 0.06934, Avg : 0.08142
2023-05-23 00:55:01,631 - [INFO] - [Epoch 143]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 00:55:02,835 - [INFO] - [E:144| 0]: Train Loss:0.0028981, Val MRR:0.08254, ice00001
2023-05-23 00:56:40,145 - [INFO] - [E:144| 100]: Train Loss:0.0028988, Val MRR:0.08254, ice00001
2023-05-23 00:58:06,978 - [INFO] - [E:144| 200]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-23 00:59:44,325 - [INFO] - [E:144| 300]: Train Loss:0.0028997, Val MRR:0.08254, ice00001
2023-05-23 01:01:21,610 - [INFO] - [E:144| 400]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 01:02:58,336 - [INFO] - [E:144| 500]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-23 01:04:35,141 - [INFO] - [E:144| 600]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-23 01:06:10,307 - [INFO] - [E:144| 700]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-23 01:07:38,454 - [INFO] - [E:144| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 01:09:15,341 - [INFO] - [E:144| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 01:10:23,327 - [INFO] - [Epoch:144]: Training Loss:0.002902
2023-05-23 01:10:23,578 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 01:10:46,077 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 01:11:08,374 - [INFO] - [Evaluating Epoch 144 valid]:
MRR: Tail : 0.09518, Head : 0.06896, Avg : 0.08207
2023-05-23 01:11:08,374 - [INFO] - [Epoch 144]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-23 01:11:09,485 - [INFO] - [E:145| 0]: Train Loss:0.0028371, Val MRR:0.08254, ice00001
2023-05-23 01:12:46,693 - [INFO] - [E:145| 100]: Train Loss:0.0029044, Val MRR:0.08254, ice00001
2023-05-23 01:14:24,001 - [INFO] - [E:145| 200]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-23 01:15:51,336 - [INFO] - [E:145| 300]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-23 01:17:28,643 - [INFO] - [E:145| 400]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-23 01:19:06,761 - [INFO] - [E:145| 500]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-23 01:20:43,443 - [INFO] - [E:145| 600]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-23 01:22:21,044 - [INFO] - [E:145| 700]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-23 01:23:48,153 - [INFO] - [E:145| 800]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-23 01:25:25,840 - [INFO] - [E:145| 900]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-23 01:26:34,763 - [INFO] - [Epoch:145]: Training Loss:0.002902
2023-05-23 01:26:35,015 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 01:26:57,788 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 01:27:20,220 - [INFO] - [Evaluating Epoch 145 valid]:
MRR: Tail : 0.09505, Head : 0.06942, Avg : 0.08224
2023-05-23 01:27:20,220 - [INFO] - [Epoch 145]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-23 01:27:21,367 - [INFO] - [E:146| 0]: Train Loss:0.0029161, Val MRR:0.08254, ice00001
2023-05-23 01:28:58,657 - [INFO] - [E:146| 100]: Train Loss:0.0028948, Val MRR:0.08254, ice00001
2023-05-23 01:30:36,591 - [INFO] - [E:146| 200]: Train Loss:0.0028969, Val MRR:0.08254, ice00001
2023-05-23 01:32:13,726 - [INFO] - [E:146| 300]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-23 01:33:42,824 - [INFO] - [E:146| 400]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-23 01:35:19,564 - [INFO] - [E:146| 500]: Train Loss:0.0028996, Val MRR:0.08254, ice00001
2023-05-23 01:36:57,604 - [INFO] - [E:146| 600]: Train Loss:0.0028996, Val MRR:0.08254, ice00001
2023-05-23 01:38:35,138 - [INFO] - [E:146| 700]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-23 01:40:13,003 - [INFO] - [E:146| 800]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-23 01:41:40,398 - [INFO] - [E:146| 900]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 01:42:48,715 - [INFO] - [Epoch:146]: Training Loss:0.002901
2023-05-23 01:42:48,966 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 01:43:11,646 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 01:43:33,831 - [INFO] - [Evaluating Epoch 146 valid]:
MRR: Tail : 0.0927, Head : 0.06967, Avg : 0.08119
2023-05-23 01:43:33,831 - [INFO] - [Epoch 146]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-23 01:43:34,957 - [INFO] - [E:147| 0]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-23 01:45:12,091 - [INFO] - [E:147| 100]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-23 01:46:48,745 - [INFO] - [E:147| 200]: Train Loss:0.0029042, Val MRR:0.08254, ice00001
2023-05-23 01:48:25,424 - [INFO] - [E:147| 300]: Train Loss:0.0029035, Val MRR:0.08254, ice00001
2023-05-23 01:49:53,710 - [INFO] - [E:147| 400]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-23 01:51:29,435 - [INFO] - [E:147| 500]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 01:53:06,917 - [INFO] - [E:147| 600]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 01:54:44,051 - [INFO] - [E:147| 700]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 01:56:21,435 - [INFO] - [E:147| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 01:57:58,703 - [INFO] - [E:147| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 01:58:57,014 - [INFO] - [Epoch:147]: Training Loss:0.002902
2023-05-23 01:58:57,507 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 01:59:20,379 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 01:59:42,650 - [INFO] - [Evaluating Epoch 147 valid]:
MRR: Tail : 0.09529, Head : 0.06912, Avg : 0.0822
2023-05-23 01:59:42,651 - [INFO] - [Epoch 147]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-23 01:59:43,668 - [INFO] - [E:148| 0]: Train Loss:0.002882, Val MRR:0.08254, ice00001
2023-05-23 02:01:20,580 - [INFO] - [E:148| 100]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 02:02:57,305 - [INFO] - [E:148| 200]: Train Loss:0.002904, Val MRR:0.08254, ice00001
2023-05-23 02:04:33,969 - [INFO] - [E:148| 300]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 02:06:12,070 - [INFO] - [E:148| 400]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 02:07:39,383 - [INFO] - [E:148| 500]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-23 02:09:16,893 - [INFO] - [E:148| 600]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 02:10:54,204 - [INFO] - [E:148| 700]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-23 02:12:31,989 - [INFO] - [E:148| 800]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-23 02:14:09,362 - [INFO] - [E:148| 900]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-23 02:15:17,887 - [INFO] - [Epoch:148]: Training Loss:0.002902
2023-05-23 02:15:18,385 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 02:15:40,937 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 02:15:58,916 - [INFO] - [Evaluating Epoch 148 valid]:
MRR: Tail : 0.09439, Head : 0.06942, Avg : 0.08191
2023-05-23 02:15:58,916 - [INFO] - [Epoch 148]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 02:15:59,532 - [INFO] - [E:149| 0]: Train Loss:0.002916, Val MRR:0.08254, ice00001
2023-05-23 02:17:29,395 - [INFO] - [E:149| 100]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-23 02:19:06,222 - [INFO] - [E:149| 200]: Train Loss:0.0028983, Val MRR:0.08254, ice00001
2023-05-23 02:20:43,344 - [INFO] - [E:149| 300]: Train Loss:0.0028993, Val MRR:0.08254, ice00001
2023-05-23 02:22:22,780 - [INFO] - [E:149| 400]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 02:24:00,624 - [INFO] - [E:149| 500]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-23 02:25:28,608 - [INFO] - [E:149| 600]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-23 02:27:06,776 - [INFO] - [E:149| 700]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 02:28:45,101 - [INFO] - [E:149| 800]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 02:30:23,295 - [INFO] - [E:149| 900]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 02:31:31,928 - [INFO] - [Epoch:149]: Training Loss:0.002902
2023-05-23 02:31:32,377 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 02:31:54,339 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 02:32:16,667 - [INFO] - [Evaluating Epoch 149 valid]:
MRR: Tail : 0.09463, Head : 0.06983, Avg : 0.08223
MR: Tail : 844.54, Head : 1022.8, Avg : 933.69
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.09977, Head : 0.06251, Avg : 0.08114
Hit-10: Tail : 0.21119, Head : 0.159, Avg : 0.1851
2023-05-23 02:32:16,667 - [INFO] - [Epoch 149]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-23 02:32:17,553 - [INFO] - [E:150| 0]: Train Loss:0.002895, Val MRR:0.08254, ice00001
2023-05-23 02:33:44,505 - [INFO] - [E:150| 100]: Train Loss:0.0028972, Val MRR:0.08254, ice00001
2023-05-23 02:35:22,371 - [INFO] - [E:150| 200]: Train Loss:0.0028975, Val MRR:0.08254, ice00001
2023-05-23 02:37:00,132 - [INFO] - [E:150| 300]: Train Loss:0.0028986, Val MRR:0.08254, ice00001
2023-05-23 02:38:37,144 - [INFO] - [E:150| 400]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-23 02:40:14,589 - [INFO] - [E:150| 500]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 02:41:52,560 - [INFO] - [E:150| 600]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 02:43:19,782 - [INFO] - [E:150| 700]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 02:44:57,156 - [INFO] - [E:150| 800]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 02:46:33,727 - [INFO] - [E:150| 900]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 02:47:41,898 - [INFO] - [Epoch:150]: Training Loss:0.002902
2023-05-23 02:47:42,280 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 02:48:04,849 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 02:48:26,773 - [INFO] - [Evaluating Epoch 150 valid]:
MRR: Tail : 0.09418, Head : 0.0697, Avg : 0.08194
2023-05-23 02:48:26,773 - [INFO] - [Epoch 150]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 02:48:27,831 - [INFO] - [E:151| 0]: Train Loss:0.0029403, Val MRR:0.08254, ice00001
2023-05-23 02:50:04,742 - [INFO] - [E:151| 100]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-23 02:51:32,158 - [INFO] - [E:151| 200]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 02:53:10,087 - [INFO] - [E:151| 300]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-23 02:54:47,934 - [INFO] - [E:151| 400]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-23 02:56:25,698 - [INFO] - [E:151| 500]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-23 02:58:03,562 - [INFO] - [E:151| 600]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 02:59:33,137 - [INFO] - [E:151| 700]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 03:01:07,288 - [INFO] - [E:151| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 03:02:44,127 - [INFO] - [E:151| 900]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-23 03:03:53,004 - [INFO] - [Epoch:151]: Training Loss:0.002901
2023-05-23 03:03:53,413 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 03:04:16,076 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 03:04:38,369 - [INFO] - [Evaluating Epoch 151 valid]:
MRR: Tail : 0.09349, Head : 0.06975, Avg : 0.08162
2023-05-23 03:04:38,369 - [INFO] - [Epoch 151]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 03:04:39,368 - [INFO] - [E:152| 0]: Train Loss:0.0029166, Val MRR:0.08254, ice00001
2023-05-23 03:06:17,156 - [INFO] - [E:152| 100]: Train Loss:0.0029063, Val MRR:0.08254, ice00001
2023-05-23 03:07:55,529 - [INFO] - [E:152| 200]: Train Loss:0.0028996, Val MRR:0.08254, ice00001
2023-05-23 03:09:24,845 - [INFO] - [E:152| 300]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-23 03:11:01,890 - [INFO] - [E:152| 400]: Train Loss:0.0029003, Val MRR:0.08254, ice00001
2023-05-23 03:12:39,191 - [INFO] - [E:152| 500]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-23 03:14:16,719 - [INFO] - [E:152| 600]: Train Loss:0.0029003, Val MRR:0.08254, ice00001
2023-05-23 03:15:53,711 - [INFO] - [E:152| 700]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 03:17:20,802 - [INFO] - [E:152| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 03:18:59,000 - [INFO] - [E:152| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 03:20:08,050 - [INFO] - [Epoch:152]: Training Loss:0.002901
2023-05-23 03:20:08,387 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 03:20:31,235 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 03:20:53,864 - [INFO] - [Evaluating Epoch 152 valid]:
MRR: Tail : 0.09295, Head : 0.06961, Avg : 0.08128
2023-05-23 03:20:53,864 - [INFO] - [Epoch 152]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 03:20:54,929 - [INFO] - [E:153| 0]: Train Loss:0.0029073, Val MRR:0.08254, ice00001
2023-05-23 03:22:33,317 - [INFO] - [E:153| 100]: Train Loss:0.002895, Val MRR:0.08254, ice00001
2023-05-23 03:24:11,199 - [INFO] - [E:153| 200]: Train Loss:0.0028956, Val MRR:0.08254, ice00001
2023-05-23 03:25:40,388 - [INFO] - [E:153| 300]: Train Loss:0.0028968, Val MRR:0.08254, ice00001
2023-05-23 03:27:16,561 - [INFO] - [E:153| 400]: Train Loss:0.0028978, Val MRR:0.08254, ice00001
2023-05-23 03:28:54,507 - [INFO] - [E:153| 500]: Train Loss:0.0028996, Val MRR:0.08254, ice00001
2023-05-23 03:30:31,074 - [INFO] - [E:153| 600]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-23 03:32:07,989 - [INFO] - [E:153| 700]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-23 03:33:44,670 - [INFO] - [E:153| 800]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 03:35:11,760 - [INFO] - [E:153| 900]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-23 03:36:19,992 - [INFO] - [Epoch:153]: Training Loss:0.002902
2023-05-23 03:36:20,490 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 03:36:42,862 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 03:37:05,136 - [INFO] - [Evaluating Epoch 153 valid]:
MRR: Tail : 0.09306, Head : 0.06996, Avg : 0.08151
2023-05-23 03:37:05,136 - [INFO] - [Epoch 153]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 03:37:05,921 - [INFO] - [E:154| 0]: Train Loss:0.0029136, Val MRR:0.08254, ice00001
2023-05-23 03:38:43,952 - [INFO] - [E:154| 100]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-23 03:40:21,315 - [INFO] - [E:154| 200]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-23 03:41:59,464 - [INFO] - [E:154| 300]: Train Loss:0.0028998, Val MRR:0.08254, ice00001
2023-05-23 03:43:26,937 - [INFO] - [E:154| 400]: Train Loss:0.0028999, Val MRR:0.08254, ice00001
2023-05-23 03:45:03,222 - [INFO] - [E:154| 500]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-23 03:46:39,748 - [INFO] - [E:154| 600]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 03:48:17,715 - [INFO] - [E:154| 700]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 03:49:55,466 - [INFO] - [E:154| 800]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 03:51:32,553 - [INFO] - [E:154| 900]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 03:52:30,910 - [INFO] - [Epoch:154]: Training Loss:0.002901
2023-05-23 03:52:31,201 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 03:52:53,634 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 03:53:15,972 - [INFO] - [Evaluating Epoch 154 valid]:
MRR: Tail : 0.09571, Head : 0.06816, Avg : 0.08193
2023-05-23 03:53:15,972 - [INFO] - [Epoch 154]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 03:53:17,083 - [INFO] - [E:155| 0]: Train Loss:0.0028818, Val MRR:0.08254, ice00001
2023-05-23 03:54:54,595 - [INFO] - [E:155| 100]: Train Loss:0.0029048, Val MRR:0.08254, ice00001
2023-05-23 03:56:32,323 - [INFO] - [E:155| 200]: Train Loss:0.0029031, Val MRR:0.08254, ice00001
2023-05-23 03:58:11,551 - [INFO] - [E:155| 300]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-23 03:59:47,908 - [INFO] - [E:155| 400]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 04:01:14,985 - [INFO] - [E:155| 500]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-23 04:02:52,954 - [INFO] - [E:155| 600]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-23 04:04:30,552 - [INFO] - [E:155| 700]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 04:06:08,005 - [INFO] - [E:155| 800]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 04:07:44,908 - [INFO] - [E:155| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 04:08:53,183 - [INFO] - [Epoch:155]: Training Loss:0.002902
2023-05-23 04:08:53,434 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 04:09:11,546 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 04:09:26,379 - [INFO] - [Evaluating Epoch 155 valid]:
MRR: Tail : 0.09519, Head : 0.06889, Avg : 0.08204
2023-05-23 04:09:26,379 - [INFO] - [Epoch 155]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 04:09:27,524 - [INFO] - [E:156| 0]: Train Loss:0.0029427, Val MRR:0.08254, ice00001
2023-05-23 04:11:04,904 - [INFO] - [E:156| 100]: Train Loss:0.0029041, Val MRR:0.08254, ice00001
2023-05-23 04:12:42,410 - [INFO] - [E:156| 200]: Train Loss:0.0029057, Val MRR:0.08254, ice00001
2023-05-23 04:14:19,522 - [INFO] - [E:156| 300]: Train Loss:0.0029045, Val MRR:0.08254, ice00001
2023-05-23 04:15:57,697 - [INFO] - [E:156| 400]: Train Loss:0.0029045, Val MRR:0.08254, ice00001
2023-05-23 04:17:34,748 - [INFO] - [E:156| 500]: Train Loss:0.0029034, Val MRR:0.08254, ice00001
2023-05-23 04:19:02,805 - [INFO] - [E:156| 600]: Train Loss:0.0029029, Val MRR:0.08254, ice00001
2023-05-23 04:20:41,105 - [INFO] - [E:156| 700]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-23 04:22:18,779 - [INFO] - [E:156| 800]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 04:23:56,035 - [INFO] - [E:156| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 04:25:05,279 - [INFO] - [Epoch:156]: Training Loss:0.002901
2023-05-23 04:25:05,530 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 04:25:28,171 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 04:25:50,503 - [INFO] - [Evaluating Epoch 156 valid]:
MRR: Tail : 0.09478, Head : 0.06858, Avg : 0.08168
2023-05-23 04:25:50,503 - [INFO] - [Epoch 156]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 04:25:51,620 - [INFO] - [E:157| 0]: Train Loss:0.0029522, Val MRR:0.08254, ice00001
2023-05-23 04:27:18,923 - [INFO] - [E:157| 100]: Train Loss:0.0029033, Val MRR:0.08254, ice00001
2023-05-23 04:28:55,328 - [INFO] - [E:157| 200]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-23 04:30:32,508 - [INFO] - [E:157| 300]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 04:32:09,739 - [INFO] - [E:157| 400]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-23 04:33:47,369 - [INFO] - [E:157| 500]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 04:35:19,048 - [INFO] - [E:157| 600]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-23 04:36:53,118 - [INFO] - [E:157| 700]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-23 04:38:31,066 - [INFO] - [E:157| 800]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-23 04:40:08,592 - [INFO] - [E:157| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 04:41:16,917 - [INFO] - [Epoch:157]: Training Loss:0.002902
2023-05-23 04:41:17,216 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 04:41:39,398 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 04:42:01,435 - [INFO] - [Evaluating Epoch 157 valid]:
MRR: Tail : 0.09307, Head : 0.07017, Avg : 0.08162
2023-05-23 04:42:01,435 - [INFO] - [Epoch 157]: Training Loss: 0.0029019, Valid MRR: 0.08254,
2023-05-23 04:42:02,400 - [INFO] - [E:158| 0]: Train Loss:0.0028715, Val MRR:0.08254, ice00001
2023-05-23 04:43:39,214 - [INFO] - [E:158| 100]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-23 04:45:06,839 - [INFO] - [E:158| 200]: Train Loss:0.002903, Val MRR:0.08254, ice00001
2023-05-23 04:46:46,644 - [INFO] - [E:158| 300]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-23 04:48:23,726 - [INFO] - [E:158| 400]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-23 04:50:01,242 - [INFO] - [E:158| 500]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-23 04:51:38,555 - [INFO] - [E:158| 600]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-23 04:53:05,461 - [INFO] - [E:158| 700]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 04:54:43,018 - [INFO] - [E:158| 800]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 04:56:19,491 - [INFO] - [E:158| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 04:57:27,259 - [INFO] - [Epoch:158]: Training Loss:0.002902
2023-05-23 04:57:27,576 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 04:57:49,836 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 04:58:12,045 - [INFO] - [Evaluating Epoch 158 valid]:
MRR: Tail : 0.0938, Head : 0.06926, Avg : 0.08153
2023-05-23 04:58:12,045 - [INFO] - [Epoch 158]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 04:58:13,167 - [INFO] - [E:159| 0]: Train Loss:0.0028919, Val MRR:0.08254, ice00001
2023-05-23 04:59:50,900 - [INFO] - [E:159| 100]: Train Loss:0.0028993, Val MRR:0.08254, ice00001
2023-05-23 05:01:23,080 - [INFO] - [E:159| 200]: Train Loss:0.0028997, Val MRR:0.08254, ice00001
2023-05-23 05:02:55,531 - [INFO] - [E:159| 300]: Train Loss:0.0029003, Val MRR:0.08254, ice00001
2023-05-23 05:04:33,710 - [INFO] - [E:159| 400]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-23 05:06:11,477 - [INFO] - [E:159| 500]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-23 05:07:49,207 - [INFO] - [E:159| 600]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-23 05:09:26,242 - [INFO] - [E:159| 700]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 05:10:53,379 - [INFO] - [E:159| 800]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 05:12:30,565 - [INFO] - [E:159| 900]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 05:13:39,517 - [INFO] - [Epoch:159]: Training Loss:0.002901
2023-05-23 05:13:39,998 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 05:14:02,795 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 05:14:25,183 - [INFO] - [Evaluating Epoch 159 valid]:
MRR: Tail : 0.09392, Head : 0.06835, Avg : 0.08113
MR: Tail : 764.68, Head : 891.91, Avg : 828.3
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.08739, Head : 0.05887, Avg : 0.07313
Hit-10: Tail : 0.21568, Head : 0.15499, Avg : 0.18534
2023-05-23 05:14:25,183 - [INFO] - [Epoch 159]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-23 05:14:26,216 - [INFO] - [E:160| 0]: Train Loss:0.002889, Val MRR:0.08254, ice00001
2023-05-23 05:16:03,754 - [INFO] - [E:160| 100]: Train Loss:0.0028921, Val MRR:0.08254, ice00001
2023-05-23 05:17:41,440 - [INFO] - [E:160| 200]: Train Loss:0.0028973, Val MRR:0.08254, ice00001
2023-05-23 05:19:09,069 - [INFO] - [E:160| 300]: Train Loss:0.0028998, Val MRR:0.08254, ice00001
2023-05-23 05:20:46,614 - [INFO] - [E:160| 400]: Train Loss:0.0029003, Val MRR:0.08254, ice00001
2023-05-23 05:22:24,900 - [INFO] - [E:160| 500]: Train Loss:0.0029003, Val MRR:0.08254, ice00001
2023-05-23 05:24:03,242 - [INFO] - [E:160| 600]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 05:25:40,416 - [INFO] - [E:160| 700]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-23 05:27:17,595 - [INFO] - [E:160| 800]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 05:28:44,618 - [INFO] - [E:160| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 05:29:54,154 - [INFO] - [Epoch:160]: Training Loss:0.002901
2023-05-23 05:29:54,487 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 05:30:17,492 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 05:30:39,842 - [INFO] - [Evaluating Epoch 160 valid]:
MRR: Tail : 0.09526, Head : 0.06905, Avg : 0.08216
2023-05-23 05:30:39,842 - [INFO] - [Epoch 160]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 05:30:40,983 - [INFO] - [E:161| 0]: Train Loss:0.0028982, Val MRR:0.08254, ice00001
2023-05-23 05:32:18,615 - [INFO] - [E:161| 100]: Train Loss:0.0028966, Val MRR:0.08254, ice00001
2023-05-23 05:33:59,384 - [INFO] - [E:161| 200]: Train Loss:0.0028986, Val MRR:0.08254, ice00001
2023-05-23 05:35:36,886 - [INFO] - [E:161| 300]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 05:37:05,222 - [INFO] - [E:161| 400]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 05:38:42,774 - [INFO] - [E:161| 500]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 05:40:19,775 - [INFO] - [E:161| 600]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 05:41:56,837 - [INFO] - [E:161| 700]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-23 05:43:34,418 - [INFO] - [E:161| 800]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-23 05:45:05,674 - [INFO] - [E:161| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 05:46:10,348 - [INFO] - [Epoch:161]: Training Loss:0.002901
2023-05-23 05:46:10,821 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 05:46:33,033 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 05:46:54,934 - [INFO] - [Evaluating Epoch 161 valid]:
MRR: Tail : 0.09559, Head : 0.06905, Avg : 0.08232
2023-05-23 05:46:54,934 - [INFO] - [Epoch 161]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-23 05:46:55,724 - [INFO] - [E:162| 0]: Train Loss:0.0029533, Val MRR:0.08254, ice00001
2023-05-23 05:48:33,016 - [INFO] - [E:162| 100]: Train Loss:0.0028986, Val MRR:0.08254, ice00001
2023-05-23 05:50:10,510 - [INFO] - [E:162| 200]: Train Loss:0.0028982, Val MRR:0.08254, ice00001
2023-05-23 05:51:47,705 - [INFO] - [E:162| 300]: Train Loss:0.0028991, Val MRR:0.08254, ice00001
2023-05-23 05:53:24,864 - [INFO] - [E:162| 400]: Train Loss:0.0028993, Val MRR:0.08254, ice00001
2023-05-23 05:54:51,597 - [INFO] - [E:162| 500]: Train Loss:0.0028995, Val MRR:0.08254, ice00001
2023-05-23 05:56:29,026 - [INFO] - [E:162| 600]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-23 05:58:07,801 - [INFO] - [E:162| 700]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 05:59:46,096 - [INFO] - [E:162| 800]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 06:01:23,772 - [INFO] - [E:162| 900]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 06:02:28,235 - [INFO] - [Epoch:162]: Training Loss:0.002902
2023-05-23 06:02:28,444 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 06:02:44,040 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 06:03:06,062 - [INFO] - [Evaluating Epoch 162 valid]:
MRR: Tail : 0.09351, Head : 0.06429, Avg : 0.0789
2023-05-23 06:03:06,062 - [INFO] - [Epoch 162]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 06:03:07,078 - [INFO] - [E:163| 0]: Train Loss:0.0028732, Val MRR:0.08254, ice00001
2023-05-23 06:04:43,494 - [INFO] - [E:163| 100]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-23 06:06:20,449 - [INFO] - [E:163| 200]: Train Loss:0.002904, Val MRR:0.08254, ice00001
2023-05-23 06:07:57,785 - [INFO] - [E:163| 300]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-23 06:09:33,326 - [INFO] - [E:163| 400]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-23 06:11:06,665 - [INFO] - [E:163| 500]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 06:12:38,474 - [INFO] - [E:163| 600]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 06:14:15,860 - [INFO] - [E:163| 700]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-23 06:15:53,637 - [INFO] - [E:163| 800]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 06:17:31,441 - [INFO] - [E:163| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 06:18:40,566 - [INFO] - [Epoch:163]: Training Loss:0.002902
2023-05-23 06:18:40,903 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 06:19:03,423 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 06:19:25,659 - [INFO] - [Evaluating Epoch 163 valid]:
MRR: Tail : 0.09475, Head : 0.06953, Avg : 0.08214
2023-05-23 06:19:25,659 - [INFO] - [Epoch 163]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 06:19:26,766 - [INFO] - [E:164| 0]: Train Loss:0.0028341, Val MRR:0.08254, ice00001
2023-05-23 06:20:54,667 - [INFO] - [E:164| 100]: Train Loss:0.0029052, Val MRR:0.08254, ice00001
2023-05-23 06:22:34,369 - [INFO] - [E:164| 200]: Train Loss:0.0029048, Val MRR:0.08254, ice00001
2023-05-23 06:24:10,851 - [INFO] - [E:164| 300]: Train Loss:0.002903, Val MRR:0.08254, ice00001
2023-05-23 06:25:48,557 - [INFO] - [E:164| 400]: Train Loss:0.0029036, Val MRR:0.08254, ice00001
2023-05-23 06:27:25,989 - [INFO] - [E:164| 500]: Train Loss:0.0029036, Val MRR:0.08254, ice00001
2023-05-23 06:28:53,042 - [INFO] - [E:164| 600]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-23 06:30:30,285 - [INFO] - [E:164| 700]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-23 06:32:08,239 - [INFO] - [E:164| 800]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 06:33:45,621 - [INFO] - [E:164| 900]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 06:34:54,378 - [INFO] - [Epoch:164]: Training Loss:0.002902
2023-05-23 06:34:54,629 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 06:35:17,449 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 06:35:39,904 - [INFO] - [Evaluating Epoch 164 valid]:
MRR: Tail : 0.09571, Head : 0.06846, Avg : 0.08208
2023-05-23 06:35:39,904 - [INFO] - [Epoch 164]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 06:35:41,033 - [INFO] - [E:165| 0]: Train Loss:0.0028819, Val MRR:0.08254, ice00001
2023-05-23 06:37:18,797 - [INFO] - [E:165| 100]: Train Loss:0.0028983, Val MRR:0.08254, ice00001
2023-05-23 06:38:45,080 - [INFO] - [E:165| 200]: Train Loss:0.0028999, Val MRR:0.08254, ice00001
2023-05-23 06:40:22,747 - [INFO] - [E:165| 300]: Train Loss:0.0028998, Val MRR:0.08254, ice00001
2023-05-23 06:42:00,393 - [INFO] - [E:165| 400]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-23 06:43:38,060 - [INFO] - [E:165| 500]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-23 06:45:15,528 - [INFO] - [E:165| 600]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-23 06:46:43,008 - [INFO] - [E:165| 700]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 06:48:20,708 - [INFO] - [E:165| 800]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-23 06:49:58,743 - [INFO] - [E:165| 900]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 06:51:07,386 - [INFO] - [Epoch:165]: Training Loss:0.002902
2023-05-23 06:51:07,743 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 06:51:30,275 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 06:51:51,852 - [INFO] - [Evaluating Epoch 165 valid]:
MRR: Tail : 0.09535, Head : 0.069, Avg : 0.08217
2023-05-23 06:51:51,852 - [INFO] - [Epoch 165]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 06:51:52,969 - [INFO] - [E:166| 0]: Train Loss:0.0029156, Val MRR:0.08254, ice00001
2023-05-23 06:53:29,093 - [INFO] - [E:166| 100]: Train Loss:0.0029038, Val MRR:0.08254, ice00001
2023-05-23 06:54:56,494 - [INFO] - [E:166| 200]: Train Loss:0.0029032, Val MRR:0.08254, ice00001
2023-05-23 06:56:33,837 - [INFO] - [E:166| 300]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-23 06:58:11,419 - [INFO] - [E:166| 400]: Train Loss:0.0029029, Val MRR:0.08254, ice00001
2023-05-23 06:59:49,660 - [INFO] - [E:166| 500]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 07:01:27,431 - [INFO] - [E:166| 600]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-23 07:03:05,320 - [INFO] - [E:166| 700]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-23 07:04:32,974 - [INFO] - [E:166| 800]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-23 07:06:09,955 - [INFO] - [E:166| 900]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 07:07:18,157 - [INFO] - [Epoch:166]: Training Loss:0.002901
2023-05-23 07:07:18,408 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 07:07:40,556 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 07:08:02,739 - [INFO] - [Evaluating Epoch 166 valid]:
MRR: Tail : 0.09508, Head : 0.06927, Avg : 0.08217
2023-05-23 07:08:02,740 - [INFO] - [Epoch 166]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-23 07:08:03,623 - [INFO] - [E:167| 0]: Train Loss:0.0028862, Val MRR:0.08254, ice00001
2023-05-23 07:09:42,736 - [INFO] - [E:167| 100]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 07:11:21,627 - [INFO] - [E:167| 200]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-23 07:12:49,057 - [INFO] - [E:167| 300]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-23 07:14:26,929 - [INFO] - [E:167| 400]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-23 07:16:05,261 - [INFO] - [E:167| 500]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-23 07:17:42,426 - [INFO] - [E:167| 600]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-23 07:19:20,616 - [INFO] - [E:167| 700]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 07:20:57,492 - [INFO] - [E:167| 800]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 07:22:25,010 - [INFO] - [E:167| 900]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 07:23:33,414 - [INFO] - [Epoch:167]: Training Loss:0.002901
2023-05-23 07:23:33,737 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 07:23:56,486 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 07:24:18,612 - [INFO] - [Evaluating Epoch 167 valid]:
MRR: Tail : 0.09452, Head : 0.0698, Avg : 0.08216
2023-05-23 07:24:18,612 - [INFO] - [Epoch 167]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 07:24:19,721 - [INFO] - [E:168| 0]: Train Loss:0.0029486, Val MRR:0.08254, ice00001
2023-05-23 07:25:57,348 - [INFO] - [E:168| 100]: Train Loss:0.0028964, Val MRR:0.08254, ice00001
2023-05-23 07:27:34,861 - [INFO] - [E:168| 200]: Train Loss:0.0028972, Val MRR:0.08254, ice00001
2023-05-23 07:29:12,553 - [INFO] - [E:168| 300]: Train Loss:0.0028999, Val MRR:0.08254, ice00001
2023-05-23 07:30:39,886 - [INFO] - [E:168| 400]: Train Loss:0.0028998, Val MRR:0.08254, ice00001
2023-05-23 07:32:17,538 - [INFO] - [E:168| 500]: Train Loss:0.0028993, Val MRR:0.08254, ice00001
2023-05-23 07:33:55,681 - [INFO] - [E:168| 600]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-23 07:35:32,558 - [INFO] - [E:168| 700]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-23 07:37:09,281 - [INFO] - [E:168| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 07:38:36,825 - [INFO] - [E:168| 900]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 07:39:45,816 - [INFO] - [Epoch:168]: Training Loss:0.002901
2023-05-23 07:39:46,172 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 07:40:08,758 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 07:40:30,886 - [INFO] - [Evaluating Epoch 168 valid]:
MRR: Tail : 0.09257, Head : 0.06378, Avg : 0.07817
2023-05-23 07:40:30,886 - [INFO] - [Epoch 168]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 07:40:31,966 - [INFO] - [E:169| 0]: Train Loss:0.0028986, Val MRR:0.08254, ice00001
2023-05-23 07:42:09,234 - [INFO] - [E:169| 100]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 07:43:46,642 - [INFO] - [E:169| 200]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-23 07:45:24,804 - [INFO] - [E:169| 300]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-23 07:47:02,353 - [INFO] - [E:169| 400]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-23 07:48:30,045 - [INFO] - [E:169| 500]: Train Loss:0.0029034, Val MRR:0.08254, ice00001
2023-05-23 07:50:06,693 - [INFO] - [E:169| 600]: Train Loss:0.0029032, Val MRR:0.08254, ice00001
2023-05-23 07:51:43,508 - [INFO] - [E:169| 700]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-23 07:53:20,802 - [INFO] - [E:169| 800]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-23 07:54:57,788 - [INFO] - [E:169| 900]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-23 07:55:59,878 - [INFO] - [Epoch:169]: Training Loss:0.002901
2023-05-23 07:56:00,087 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 07:56:18,371 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 07:56:40,893 - [INFO] - [Evaluating Epoch 169 valid]:
MRR: Tail : 0.09452, Head : 0.06961, Avg : 0.08207
MR: Tail : 830.41, Head : 994.53, Avg : 912.47
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.09892, Head : 0.06142, Avg : 0.08017
Hit-10: Tail : 0.21022, Head : 0.15864, Avg : 0.18443
2023-05-23 07:56:40,893 - [INFO] - [Epoch 169]: Training Loss: 0.0029013, Valid MRR: 0.08254,
2023-05-23 07:56:41,996 - [INFO] - [E:170| 0]: Train Loss:0.0029116, Val MRR:0.08254, ice00001
2023-05-23 07:58:22,407 - [INFO] - [E:170| 100]: Train Loss:0.002903, Val MRR:0.08254, ice00001
2023-05-23 08:00:00,075 - [INFO] - [E:170| 200]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-23 08:01:37,229 - [INFO] - [E:170| 300]: Train Loss:0.0029043, Val MRR:0.08254, ice00001
2023-05-23 08:03:15,023 - [INFO] - [E:170| 400]: Train Loss:0.0029034, Val MRR:0.08254, ice00001
2023-05-23 08:04:47,231 - [INFO] - [E:170| 500]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-23 08:06:18,796 - [INFO] - [E:170| 600]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-23 08:07:57,089 - [INFO] - [E:170| 700]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 08:09:34,616 - [INFO] - [E:170| 800]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 08:11:11,716 - [INFO] - [E:170| 900]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 08:12:20,169 - [INFO] - [Epoch:170]: Training Loss:0.002901
2023-05-23 08:12:20,612 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 08:12:43,225 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 08:13:05,446 - [INFO] - [Evaluating Epoch 170 valid]:
MRR: Tail : 0.09485, Head : 0.06876, Avg : 0.0818
2023-05-23 08:13:05,446 - [INFO] - [Epoch 170]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-23 08:13:06,258 - [INFO] - [E:171| 0]: Train Loss:0.002916, Val MRR:0.08254, ice00001
2023-05-23 08:14:33,495 - [INFO] - [E:171| 100]: Train Loss:0.0029043, Val MRR:0.08254, ice00001
2023-05-23 08:16:11,535 - [INFO] - [E:171| 200]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-23 08:17:49,012 - [INFO] - [E:171| 300]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-23 08:19:25,580 - [INFO] - [E:171| 400]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 08:21:02,679 - [INFO] - [E:171| 500]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-23 08:22:30,483 - [INFO] - [E:171| 600]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 08:24:08,343 - [INFO] - [E:171| 700]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-23 08:25:46,437 - [INFO] - [E:171| 800]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-23 08:27:24,087 - [INFO] - [E:171| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 08:28:32,724 - [INFO] - [Epoch:171]: Training Loss:0.002901
2023-05-23 08:28:33,049 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 08:28:56,045 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 08:29:18,425 - [INFO] - [Evaluating Epoch 171 valid]:
MRR: Tail : 0.09275, Head : 0.07023, Avg : 0.08149
2023-05-23 08:29:18,425 - [INFO] - [Epoch 171]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 08:29:19,550 - [INFO] - [E:172| 0]: Train Loss:0.0029421, Val MRR:0.08254, ice00001
2023-05-23 08:30:54,496 - [INFO] - [E:172| 100]: Train Loss:0.002903, Val MRR:0.08254, ice00001
2023-05-23 08:32:25,656 - [INFO] - [E:172| 200]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-23 08:34:01,996 - [INFO] - [E:172| 300]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-23 08:35:39,742 - [INFO] - [E:172| 400]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-23 08:37:18,276 - [INFO] - [E:172| 500]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 08:38:56,408 - [INFO] - [E:172| 600]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 08:40:23,733 - [INFO] - [E:172| 700]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 08:42:00,603 - [INFO] - [E:172| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 08:43:38,116 - [INFO] - [E:172| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 08:44:46,291 - [INFO] - [Epoch:172]: Training Loss:0.002902
2023-05-23 08:44:46,626 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 08:45:09,178 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 08:45:31,589 - [INFO] - [Evaluating Epoch 172 valid]:
MRR: Tail : 0.09339, Head : 0.06968, Avg : 0.08154
2023-05-23 08:45:31,589 - [INFO] - [Epoch 172]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 08:45:32,674 - [INFO] - [E:173| 0]: Train Loss:0.0029126, Val MRR:0.08254, ice00001
2023-05-23 08:47:13,184 - [INFO] - [E:173| 100]: Train Loss:0.0029073, Val MRR:0.08254, ice00001
2023-05-23 08:48:39,725 - [INFO] - [E:173| 200]: Train Loss:0.0029034, Val MRR:0.08254, ice00001
2023-05-23 08:50:15,722 - [INFO] - [E:173| 300]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-23 08:51:52,218 - [INFO] - [E:173| 400]: Train Loss:0.0029032, Val MRR:0.08254, ice00001
2023-05-23 08:53:28,325 - [INFO] - [E:173| 500]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-23 08:55:04,637 - [INFO] - [E:173| 600]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-23 08:56:39,605 - [INFO] - [E:173| 700]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-23 08:58:07,790 - [INFO] - [E:173| 800]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 08:59:44,699 - [INFO] - [E:173| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 09:00:52,886 - [INFO] - [Epoch:173]: Training Loss:0.002901
2023-05-23 09:00:53,135 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 09:01:15,507 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 09:01:37,621 - [INFO] - [Evaluating Epoch 173 valid]:
MRR: Tail : 0.09445, Head : 0.06949, Avg : 0.08197
2023-05-23 09:01:37,621 - [INFO] - [Epoch 173]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-23 09:01:38,749 - [INFO] - [E:174| 0]: Train Loss:0.0028754, Val MRR:0.08254, ice00001
2023-05-23 09:03:16,291 - [INFO] - [E:174| 100]: Train Loss:0.0029042, Val MRR:0.08254, ice00001
2023-05-23 09:04:51,820 - [INFO] - [E:174| 200]: Train Loss:0.002903, Val MRR:0.08254, ice00001
2023-05-23 09:06:20,012 - [INFO] - [E:174| 300]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-23 09:07:56,331 - [INFO] - [E:174| 400]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-23 09:09:32,588 - [INFO] - [E:174| 500]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-23 09:11:09,401 - [INFO] - [E:174| 600]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 09:12:45,287 - [INFO] - [E:174| 700]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-23 09:14:11,518 - [INFO] - [E:174| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 09:15:47,456 - [INFO] - [E:174| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 09:16:54,557 - [INFO] - [Epoch:174]: Training Loss:0.002902
2023-05-23 09:16:55,055 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 09:17:16,970 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 09:17:38,973 - [INFO] - [Evaluating Epoch 174 valid]:
MRR: Tail : 0.09336, Head : 0.07002, Avg : 0.08169
2023-05-23 09:17:38,973 - [INFO] - [Epoch 174]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-23 09:17:39,935 - [INFO] - [E:175| 0]: Train Loss:0.0028777, Val MRR:0.08254, ice00001
2023-05-23 09:19:16,133 - [INFO] - [E:175| 100]: Train Loss:0.0029062, Val MRR:0.08254, ice00001
2023-05-23 09:20:52,343 - [INFO] - [E:175| 200]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-23 09:22:18,709 - [INFO] - [E:175| 300]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-23 09:23:54,568 - [INFO] - [E:175| 400]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-23 09:25:30,398 - [INFO] - [E:175| 500]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 09:27:06,324 - [INFO] - [E:175| 600]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 09:28:41,959 - [INFO] - [E:175| 700]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 09:30:08,018 - [INFO] - [E:175| 800]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 09:31:43,954 - [INFO] - [E:175| 900]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 09:32:55,335 - [INFO] - [Epoch:175]: Training Loss:0.002901
2023-05-23 09:32:55,802 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 09:33:17,558 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 09:33:39,495 - [INFO] - [Evaluating Epoch 175 valid]:
MRR: Tail : 0.09547, Head : 0.06857, Avg : 0.08202
2023-05-23 09:33:39,495 - [INFO] - [Epoch 175]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-23 09:33:40,305 - [INFO] - [E:176| 0]: Train Loss:0.0029081, Val MRR:0.08254, ice00001
2023-05-23 09:35:16,685 - [INFO] - [E:176| 100]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-23 09:36:52,903 - [INFO] - [E:176| 200]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-23 09:38:19,184 - [INFO] - [E:176| 300]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 09:39:55,247 - [INFO] - [E:176| 400]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-23 09:41:31,730 - [INFO] - [E:176| 500]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-23 09:43:08,743 - [INFO] - [E:176| 600]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-23 09:44:45,887 - [INFO] - [E:176| 700]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-23 09:46:14,574 - [INFO] - [E:176| 800]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-23 09:47:48,169 - [INFO] - [E:176| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 09:48:56,042 - [INFO] - [Epoch:176]: Training Loss:0.002901
2023-05-23 09:48:56,293 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 09:49:18,625 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 09:49:40,454 - [INFO] - [Evaluating Epoch 176 valid]:
MRR: Tail : 0.09319, Head : 0.06945, Avg : 0.08132
2023-05-23 09:49:40,455 - [INFO] - [Epoch 176]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-23 09:49:41,576 - [INFO] - [E:177| 0]: Train Loss:0.0029397, Val MRR:0.08254, ice00001
2023-05-23 09:51:17,807 - [INFO] - [E:177| 100]: Train Loss:0.0029097, Val MRR:0.08254, ice00001
2023-05-23 09:52:53,811 - [INFO] - [E:177| 200]: Train Loss:0.0029044, Val MRR:0.08254, ice00001
2023-05-23 09:54:24,360 - [INFO] - [E:177| 300]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 09:55:57,255 - [INFO] - [E:177| 400]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 09:57:33,580 - [INFO] - [E:177| 500]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-23 09:59:09,822 - [INFO] - [E:177| 600]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-23 10:00:46,001 - [INFO] - [E:177| 700]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-23 10:02:22,312 - [INFO] - [E:177| 800]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-23 10:03:48,718 - [INFO] - [E:177| 900]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 10:04:56,700 - [INFO] - [Epoch:177]: Training Loss:0.002902
2023-05-23 10:04:57,162 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 10:05:19,206 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 10:05:40,950 - [INFO] - [Evaluating Epoch 177 valid]:
MRR: Tail : 0.09463, Head : 0.06937, Avg : 0.082
2023-05-23 10:05:40,950 - [INFO] - [Epoch 177]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-23 10:05:42,069 - [INFO] - [E:178| 0]: Train Loss:0.0029154, Val MRR:0.08254, ice00001
2023-05-23 10:07:18,752 - [INFO] - [E:178| 100]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 10:08:55,692 - [INFO] - [E:178| 200]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-23 10:10:32,548 - [INFO] - [E:178| 300]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-23 10:11:59,310 - [INFO] - [E:178| 400]: Train Loss:0.0029005, Val MRR:0.08254, ice00001
2023-05-23 10:13:35,968 - [INFO] - [E:178| 500]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 10:15:12,063 - [INFO] - [E:178| 600]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 10:16:48,783 - [INFO] - [E:178| 700]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 10:18:28,065 - [INFO] - [E:178| 800]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 10:19:54,676 - [INFO] - [E:178| 900]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-23 10:21:02,739 - [INFO] - [Epoch:178]: Training Loss:0.002901
2023-05-23 10:21:03,162 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 10:21:25,309 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 10:21:47,383 - [INFO] - [Evaluating Epoch 178 valid]:
MRR: Tail : 0.09372, Head : 0.06939, Avg : 0.08156
2023-05-23 10:21:47,383 - [INFO] - [Epoch 178]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-23 10:21:48,470 - [INFO] - [E:179| 0]: Train Loss:0.0029285, Val MRR:0.08254, ice00001
2023-05-23 10:23:24,800 - [INFO] - [E:179| 100]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-23 10:25:00,452 - [INFO] - [E:179| 200]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-23 10:26:36,618 - [INFO] - [E:179| 300]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 10:28:02,879 - [INFO] - [E:179| 400]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-23 10:29:39,028 - [INFO] - [E:179| 500]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-23 10:31:15,002 - [INFO] - [E:179| 600]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 10:32:51,342 - [INFO] - [E:179| 700]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-23 10:34:27,452 - [INFO] - [E:179| 800]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-23 10:35:53,971 - [INFO] - [E:179| 900]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 10:37:01,909 - [INFO] - [Epoch:179]: Training Loss:0.002901
2023-05-23 10:37:02,166 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 10:37:24,068 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 10:37:45,714 - [INFO] - [Evaluating Epoch 179 valid]:
MRR: Tail : 0.09406, Head : 0.06977, Avg : 0.08191
MR: Tail : 757.01, Head : 943.76, Avg : 850.38
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.09892, Head : 0.06142, Avg : 0.08017
Hit-10: Tail : 0.21568, Head : 0.15499, Avg : 0.18534
2023-05-23 10:37:45,715 - [INFO] - [Epoch 179]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 10:37:46,748 - [INFO] - [E:180| 0]: Train Loss:0.0028774, Val MRR:0.08254, ice00001
2023-05-23 10:39:23,258 - [INFO] - [E:180| 100]: Train Loss:0.0028997, Val MRR:0.08254, ice00001
2023-05-23 10:41:00,102 - [INFO] - [E:180| 200]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 10:42:37,242 - [INFO] - [E:180| 300]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 10:44:04,279 - [INFO] - [E:180| 400]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 10:45:41,059 - [INFO] - [E:180| 500]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 10:47:17,493 - [INFO] - [E:180| 600]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-23 10:48:53,283 - [INFO] - [E:180| 700]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 10:50:29,111 - [INFO] - [E:180| 800]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 10:51:56,380 - [INFO] - [E:180| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 10:53:03,350 - [INFO] - [Epoch:180]: Training Loss:0.002902
2023-05-23 10:53:03,689 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 10:53:25,992 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 10:53:47,379 - [INFO] - [Evaluating Epoch 180 valid]:
MRR: Tail : 0.09407, Head : 0.06961, Avg : 0.08184
2023-05-23 10:53:47,379 - [INFO] - [Epoch 180]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 10:53:48,521 - [INFO] - [E:181| 0]: Train Loss:0.0029134, Val MRR:0.08254, ice00001
2023-05-23 10:55:24,359 - [INFO] - [E:181| 100]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-23 10:57:00,570 - [INFO] - [E:181| 200]: Train Loss:0.0028996, Val MRR:0.08254, ice00001
2023-05-23 10:58:36,293 - [INFO] - [E:181| 300]: Train Loss:0.0028976, Val MRR:0.08254, ice00001
2023-05-23 11:00:05,073 - [INFO] - [E:181| 400]: Train Loss:0.0028995, Val MRR:0.08254, ice00001
2023-05-23 11:01:38,686 - [INFO] - [E:181| 500]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-23 11:03:16,785 - [INFO] - [E:181| 600]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-23 11:04:53,071 - [INFO] - [E:181| 700]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-23 11:06:29,102 - [INFO] - [E:181| 800]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 11:08:05,301 - [INFO] - [E:181| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 11:09:03,454 - [INFO] - [Epoch:181]: Training Loss:0.002901
2023-05-23 11:09:03,940 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 11:09:26,039 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 11:09:47,997 - [INFO] - [Evaluating Epoch 181 valid]:
MRR: Tail : 0.09468, Head : 0.06956, Avg : 0.08212
2023-05-23 11:09:47,997 - [INFO] - [Epoch 181]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 11:09:49,111 - [INFO] - [E:182| 0]: Train Loss:0.0029312, Val MRR:0.08254, ice00001
2023-05-23 11:11:25,770 - [INFO] - [E:182| 100]: Train Loss:0.0028972, Val MRR:0.08254, ice00001
2023-05-23 11:13:02,938 - [INFO] - [E:182| 200]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 11:14:39,447 - [INFO] - [E:182| 300]: Train Loss:0.0029003, Val MRR:0.08254, ice00001
2023-05-23 11:16:16,004 - [INFO] - [E:182| 400]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-23 11:17:42,161 - [INFO] - [E:182| 500]: Train Loss:0.0029032, Val MRR:0.08254, ice00001
2023-05-23 11:19:18,291 - [INFO] - [E:182| 600]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-23 11:20:54,694 - [INFO] - [E:182| 700]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 11:22:30,600 - [INFO] - [E:182| 800]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 11:24:06,707 - [INFO] - [E:182| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 11:25:04,384 - [INFO] - [Epoch:182]: Training Loss:0.002902
2023-05-23 11:25:04,715 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 11:25:27,204 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 11:25:48,768 - [INFO] - [Evaluating Epoch 182 valid]:
MRR: Tail : 0.09543, Head : 0.0692, Avg : 0.08231
2023-05-23 11:25:48,768 - [INFO] - [Epoch 182]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 11:25:49,839 - [INFO] - [E:183| 0]: Train Loss:0.0029452, Val MRR:0.08254, ice00001
2023-05-23 11:27:26,483 - [INFO] - [E:183| 100]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-23 11:29:02,996 - [INFO] - [E:183| 200]: Train Loss:0.0029031, Val MRR:0.08254, ice00001
2023-05-23 11:30:39,623 - [INFO] - [E:183| 300]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-23 11:32:15,891 - [INFO] - [E:183| 400]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-23 11:33:42,566 - [INFO] - [E:183| 500]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-23 11:35:18,919 - [INFO] - [E:183| 600]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 11:36:55,431 - [INFO] - [E:183| 700]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 11:38:31,295 - [INFO] - [E:183| 800]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-23 11:40:07,576 - [INFO] - [E:183| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 11:41:09,685 - [INFO] - [Epoch:183]: Training Loss:0.002901
2023-05-23 11:41:09,895 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 11:41:27,012 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 11:41:48,743 - [INFO] - [Evaluating Epoch 183 valid]:
MRR: Tail : 0.09465, Head : 0.06903, Avg : 0.08184
2023-05-23 11:41:48,743 - [INFO] - [Epoch 183]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-23 11:41:49,828 - [INFO] - [E:184| 0]: Train Loss:0.002883, Val MRR:0.08254, ice00001
2023-05-23 11:43:26,587 - [INFO] - [E:184| 100]: Train Loss:0.002905, Val MRR:0.08254, ice00001
2023-05-23 11:45:02,547 - [INFO] - [E:184| 200]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-23 11:46:38,285 - [INFO] - [E:184| 300]: Train Loss:0.002903, Val MRR:0.08254, ice00001
2023-05-23 11:48:18,393 - [INFO] - [E:184| 400]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 11:49:45,804 - [INFO] - [E:184| 500]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-23 11:51:22,960 - [INFO] - [E:184| 600]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 11:52:59,889 - [INFO] - [E:184| 700]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 11:54:36,443 - [INFO] - [E:184| 800]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 11:56:12,874 - [INFO] - [E:184| 900]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 11:57:20,940 - [INFO] - [Epoch:184]: Training Loss:0.002902
2023-05-23 11:57:21,278 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 11:57:39,785 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 11:57:53,950 - [INFO] - [Evaluating Epoch 184 valid]:
MRR: Tail : 0.09275, Head : 0.06412, Avg : 0.07843
2023-05-23 11:57:53,950 - [INFO] - [Epoch 184]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 11:57:55,003 - [INFO] - [E:185| 0]: Train Loss:0.0028647, Val MRR:0.08254, ice00001
2023-05-23 11:59:31,819 - [INFO] - [E:185| 100]: Train Loss:0.0029041, Val MRR:0.08254, ice00001
2023-05-23 12:01:08,474 - [INFO] - [E:185| 200]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 12:02:45,553 - [INFO] - [E:185| 300]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 12:04:22,872 - [INFO] - [E:185| 400]: Train Loss:0.0029036, Val MRR:0.08254, ice00001
2023-05-23 12:05:54,544 - [INFO] - [E:185| 500]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-23 12:07:25,782 - [INFO] - [E:185| 600]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 12:09:01,865 - [INFO] - [E:185| 700]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-23 12:10:37,977 - [INFO] - [E:185| 800]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 12:12:13,984 - [INFO] - [E:185| 900]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-23 12:13:21,832 - [INFO] - [Epoch:185]: Training Loss:0.002902
2023-05-23 12:13:22,161 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 12:13:44,482 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 12:14:02,950 - [INFO] - [Evaluating Epoch 185 valid]:
MRR: Tail : 0.09541, Head : 0.06917, Avg : 0.08229
2023-05-23 12:14:02,950 - [INFO] - [Epoch 185]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 12:14:03,580 - [INFO] - [E:186| 0]: Train Loss:0.0028872, Val MRR:0.08254, ice00001
2023-05-23 12:15:32,833 - [INFO] - [E:186| 100]: Train Loss:0.0029083, Val MRR:0.08254, ice00001
2023-05-23 12:17:09,189 - [INFO] - [E:186| 200]: Train Loss:0.002906, Val MRR:0.08254, ice00001
2023-05-23 12:18:45,122 - [INFO] - [E:186| 300]: Train Loss:0.002905, Val MRR:0.08254, ice00001
2023-05-23 12:20:21,003 - [INFO] - [E:186| 400]: Train Loss:0.0029043, Val MRR:0.08254, ice00001
2023-05-23 12:21:56,877 - [INFO] - [E:186| 500]: Train Loss:0.0029037, Val MRR:0.08254, ice00001
2023-05-23 12:23:23,516 - [INFO] - [E:186| 600]: Train Loss:0.0029026, Val MRR:0.08254, ice00001
2023-05-23 12:24:59,178 - [INFO] - [E:186| 700]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 12:26:35,252 - [INFO] - [E:186| 800]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 12:28:11,524 - [INFO] - [E:186| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 12:29:19,329 - [INFO] - [Epoch:186]: Training Loss:0.002902
2023-05-23 12:29:19,579 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 12:29:41,558 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 12:30:03,574 - [INFO] - [Evaluating Epoch 186 valid]:
MRR: Tail : 0.09317, Head : 0.06927, Avg : 0.08122
2023-05-23 12:30:03,574 - [INFO] - [Epoch 186]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 12:30:04,333 - [INFO] - [E:187| 0]: Train Loss:0.0029176, Val MRR:0.08254, ice00001
2023-05-23 12:31:30,574 - [INFO] - [E:187| 100]: Train Loss:0.0029006, Val MRR:0.08254, ice00001
2023-05-23 12:33:06,722 - [INFO] - [E:187| 200]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 12:34:46,315 - [INFO] - [E:187| 300]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 12:36:22,267 - [INFO] - [E:187| 400]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-23 12:37:58,437 - [INFO] - [E:187| 500]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-23 12:39:24,612 - [INFO] - [E:187| 600]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 12:41:01,344 - [INFO] - [E:187| 700]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-23 12:42:37,251 - [INFO] - [E:187| 800]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-23 12:44:13,270 - [INFO] - [E:187| 900]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-23 12:45:20,439 - [INFO] - [Epoch:187]: Training Loss:0.002902
2023-05-23 12:45:20,936 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 12:45:42,813 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 12:46:04,629 - [INFO] - [Evaluating Epoch 187 valid]:
MRR: Tail : 0.09531, Head : 0.06917, Avg : 0.08224
2023-05-23 12:46:04,629 - [INFO] - [Epoch 187]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-23 12:46:05,770 - [INFO] - [E:188| 0]: Train Loss:0.0028882, Val MRR:0.08254, ice00001
2023-05-23 12:47:31,681 - [INFO] - [E:188| 100]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 12:49:08,177 - [INFO] - [E:188| 200]: Train Loss:0.0029031, Val MRR:0.08254, ice00001
2023-05-23 12:50:44,527 - [INFO] - [E:188| 300]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-23 12:52:21,586 - [INFO] - [E:188| 400]: Train Loss:0.0029032, Val MRR:0.08254, ice00001
2023-05-23 12:53:58,575 - [INFO] - [E:188| 500]: Train Loss:0.0029033, Val MRR:0.08254, ice00001
2023-05-23 12:55:25,853 - [INFO] - [E:188| 600]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 12:57:01,682 - [INFO] - [E:188| 700]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 12:58:38,330 - [INFO] - [E:188| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 13:00:14,597 - [INFO] - [E:188| 900]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 13:01:22,284 - [INFO] - [Epoch:188]: Training Loss:0.002902
2023-05-23 13:01:22,616 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 13:01:44,375 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 13:02:06,387 - [INFO] - [Evaluating Epoch 188 valid]:
MRR: Tail : 0.09495, Head : 0.06875, Avg : 0.08185
2023-05-23 13:02:06,388 - [INFO] - [Epoch 188]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 13:02:07,317 - [INFO] - [E:189| 0]: Train Loss:0.0028492, Val MRR:0.08254, ice00001
2023-05-23 13:03:34,797 - [INFO] - [E:189| 100]: Train Loss:0.0029, Val MRR:0.08254, ice00001
2023-05-23 13:05:09,695 - [INFO] - [E:189| 200]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-23 13:06:46,528 - [INFO] - [E:189| 300]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 13:08:23,160 - [INFO] - [E:189| 400]: Train Loss:0.0029035, Val MRR:0.08254, ice00001
2023-05-23 13:09:59,228 - [INFO] - [E:189| 500]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-23 13:11:34,407 - [INFO] - [E:189| 600]: Train Loss:0.0029028, Val MRR:0.08254, ice00001
2023-05-23 13:13:01,552 - [INFO] - [E:189| 700]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 13:14:38,442 - [INFO] - [E:189| 800]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 13:16:14,511 - [INFO] - [E:189| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 13:17:22,536 - [INFO] - [Epoch:189]: Training Loss:0.002901
2023-05-23 13:17:22,869 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 13:17:45,256 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 13:18:07,017 - [INFO] - [Evaluating Epoch 189 valid]:
MRR: Tail : 0.09579, Head : 0.06923, Avg : 0.08251
MR: Tail : 745.83, Head : 903.94, Avg : 824.88
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.09831, Head : 0.06142, Avg : 0.07986
Hit-10: Tail : 0.21119, Head : 0.159, Avg : 0.1851
2023-05-23 13:18:07,017 - [INFO] - [Epoch 189]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 13:18:08,045 - [INFO] - [E:190| 0]: Train Loss:0.0029323, Val MRR:0.08254, ice00001
2023-05-23 13:19:47,572 - [INFO] - [E:190| 100]: Train Loss:0.0029007, Val MRR:0.08254, ice00001
2023-05-23 13:21:13,932 - [INFO] - [E:190| 200]: Train Loss:0.002902, Val MRR:0.08254, ice00001
2023-05-23 13:22:50,576 - [INFO] - [E:190| 300]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 13:24:26,512 - [INFO] - [E:190| 400]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-23 13:26:02,536 - [INFO] - [E:190| 500]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-23 13:27:38,530 - [INFO] - [E:190| 600]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 13:29:05,115 - [INFO] - [E:190| 700]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 13:30:41,594 - [INFO] - [E:190| 800]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 13:32:18,393 - [INFO] - [E:190| 900]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 13:33:25,981 - [INFO] - [Epoch:190]: Training Loss:0.002902
2023-05-23 13:33:26,480 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 13:33:48,506 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 13:34:10,740 - [INFO] - [Evaluating Epoch 190 valid]:
MRR: Tail : 0.09509, Head : 0.06953, Avg : 0.08231
2023-05-23 13:34:10,740 - [INFO] - [Epoch 190]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 13:34:11,611 - [INFO] - [E:191| 0]: Train Loss:0.0028915, Val MRR:0.08254, ice00001
2023-05-23 13:35:48,433 - [INFO] - [E:191| 100]: Train Loss:0.0029065, Val MRR:0.08254, ice00001
2023-05-23 13:37:15,529 - [INFO] - [E:191| 200]: Train Loss:0.0029038, Val MRR:0.08254, ice00001
2023-05-23 13:38:51,211 - [INFO] - [E:191| 300]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-23 13:40:27,115 - [INFO] - [E:191| 400]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 13:42:03,149 - [INFO] - [E:191| 500]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 13:43:38,919 - [INFO] - [E:191| 600]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-23 13:45:05,028 - [INFO] - [E:191| 700]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 13:46:41,341 - [INFO] - [E:191| 800]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 13:48:16,911 - [INFO] - [E:191| 900]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 13:49:24,369 - [INFO] - [Epoch:191]: Training Loss:0.002902
2023-05-23 13:49:24,702 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 13:49:47,018 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 13:50:08,618 - [INFO] - [Evaluating Epoch 191 valid]:
MRR: Tail : 0.0903, Head : 0.06511, Avg : 0.07771
2023-05-23 13:50:08,618 - [INFO] - [Epoch 191]: Training Loss: 0.0029016, Valid MRR: 0.08254,
2023-05-23 13:50:09,723 - [INFO] - [E:192| 0]: Train Loss:0.0029027, Val MRR:0.08254, ice00001
2023-05-23 13:51:45,841 - [INFO] - [E:192| 100]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-23 13:53:12,001 - [INFO] - [E:192| 200]: Train Loss:0.0029056, Val MRR:0.08254, ice00001
2023-05-23 13:54:48,068 - [INFO] - [E:192| 300]: Train Loss:0.0029069, Val MRR:0.08254, ice00001
2023-05-23 13:56:23,885 - [INFO] - [E:192| 400]: Train Loss:0.0029058, Val MRR:0.08254, ice00001
2023-05-23 13:58:00,030 - [INFO] - [E:192| 500]: Train Loss:0.0029037, Val MRR:0.08254, ice00001
2023-05-23 13:59:36,532 - [INFO] - [E:192| 600]: Train Loss:0.0029034, Val MRR:0.08254, ice00001
2023-05-23 14:01:06,893 - [INFO] - [E:192| 700]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-23 14:02:39,740 - [INFO] - [E:192| 800]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-23 14:04:16,414 - [INFO] - [E:192| 900]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-23 14:05:27,331 - [INFO] - [Epoch:192]: Training Loss:0.002902
2023-05-23 14:05:27,739 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 14:05:49,732 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 14:06:11,452 - [INFO] - [Evaluating Epoch 192 valid]:
MRR: Tail : 0.09509, Head : 0.06937, Avg : 0.08223
2023-05-23 14:06:11,452 - [INFO] - [Epoch 192]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 14:06:12,592 - [INFO] - [E:193| 0]: Train Loss:0.0029029, Val MRR:0.08254, ice00001
2023-05-23 14:07:48,526 - [INFO] - [E:193| 100]: Train Loss:0.0028991, Val MRR:0.08254, ice00001
2023-05-23 14:09:21,714 - [INFO] - [E:193| 200]: Train Loss:0.0029002, Val MRR:0.08254, ice00001
2023-05-23 14:10:50,973 - [INFO] - [E:193| 300]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 14:12:27,616 - [INFO] - [E:193| 400]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 14:14:03,995 - [INFO] - [E:193| 500]: Train Loss:0.0029029, Val MRR:0.08254, ice00001
2023-05-23 14:15:39,867 - [INFO] - [E:193| 600]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 14:17:15,943 - [INFO] - [E:193| 700]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 14:18:42,294 - [INFO] - [E:193| 800]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 14:20:18,175 - [INFO] - [E:193| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 14:21:25,568 - [INFO] - [Epoch:193]: Training Loss:0.002902
2023-05-23 14:21:25,968 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 14:21:48,201 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 14:22:09,742 - [INFO] - [Evaluating Epoch 193 valid]:
MRR: Tail : 0.09287, Head : 0.06993, Avg : 0.0814
2023-05-23 14:22:09,742 - [INFO] - [Epoch 193]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 14:22:10,873 - [INFO] - [E:194| 0]: Train Loss:0.0029544, Val MRR:0.08254, ice00001
2023-05-23 14:23:46,792 - [INFO] - [E:194| 100]: Train Loss:0.0029008, Val MRR:0.08254, ice00001
2023-05-23 14:25:23,033 - [INFO] - [E:194| 200]: Train Loss:0.0028995, Val MRR:0.08254, ice00001
2023-05-23 14:26:49,878 - [INFO] - [E:194| 300]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 14:28:25,558 - [INFO] - [E:194| 400]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-23 14:30:01,486 - [INFO] - [E:194| 500]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 14:31:37,603 - [INFO] - [E:194| 600]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 14:33:13,600 - [INFO] - [E:194| 700]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 14:34:40,339 - [INFO] - [E:194| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 14:36:16,469 - [INFO] - [E:194| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 14:37:24,137 - [INFO] - [Epoch:194]: Training Loss:0.002901
2023-05-23 14:37:24,388 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 14:37:46,469 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 14:38:08,246 - [INFO] - [Evaluating Epoch 194 valid]:
MRR: Tail : 0.09488, Head : 0.06903, Avg : 0.08196
2023-05-23 14:38:08,246 - [INFO] - [Epoch 194]: Training Loss: 0.0029014, Valid MRR: 0.08254,
2023-05-23 14:38:09,387 - [INFO] - [E:195| 0]: Train Loss:0.0028468, Val MRR:0.08254, ice00001
2023-05-23 14:39:45,333 - [INFO] - [E:195| 100]: Train Loss:0.0029037, Val MRR:0.08254, ice00001
2023-05-23 14:41:21,366 - [INFO] - [E:195| 200]: Train Loss:0.002906, Val MRR:0.08254, ice00001
2023-05-23 14:42:47,766 - [INFO] - [E:195| 300]: Train Loss:0.0029022, Val MRR:0.08254, ice00001
2023-05-23 14:44:23,765 - [INFO] - [E:195| 400]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 14:46:00,495 - [INFO] - [E:195| 500]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 14:47:36,916 - [INFO] - [E:195| 600]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 14:49:13,652 - [INFO] - [E:195| 700]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 14:50:45,492 - [INFO] - [E:195| 800]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 14:52:19,834 - [INFO] - [E:195| 900]: Train Loss:0.0029016, Val MRR:0.08254, ice00001
2023-05-23 14:53:27,584 - [INFO] - [Epoch:195]: Training Loss:0.002901
2023-05-23 14:53:28,070 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 14:53:50,069 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 14:54:11,665 - [INFO] - [Evaluating Epoch 195 valid]:
MRR: Tail : 0.0943, Head : 0.06974, Avg : 0.08202
2023-05-23 14:54:11,665 - [INFO] - [Epoch 195]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 14:54:12,571 - [INFO] - [E:196| 0]: Train Loss:0.0029452, Val MRR:0.08254, ice00001
2023-05-23 14:55:48,933 - [INFO] - [E:196| 100]: Train Loss:0.0029048, Val MRR:0.08254, ice00001
2023-05-23 14:57:24,678 - [INFO] - [E:196| 200]: Train Loss:0.0029034, Val MRR:0.08254, ice00001
2023-05-23 14:58:53,336 - [INFO] - [E:196| 300]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-23 15:00:27,275 - [INFO] - [E:196| 400]: Train Loss:0.0029001, Val MRR:0.08254, ice00001
2023-05-23 15:02:03,370 - [INFO] - [E:196| 500]: Train Loss:0.0029019, Val MRR:0.08254, ice00001
2023-05-23 15:03:40,110 - [INFO] - [E:196| 600]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 15:05:16,991 - [INFO] - [E:196| 700]: Train Loss:0.0029025, Val MRR:0.08254, ice00001
2023-05-23 15:06:52,537 - [INFO] - [E:196| 800]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-23 15:08:19,447 - [INFO] - [E:196| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 15:09:27,148 - [INFO] - [Epoch:196]: Training Loss:0.002902
2023-05-23 15:09:27,534 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 15:09:49,381 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 15:10:11,177 - [INFO] - [Evaluating Epoch 196 valid]:
MRR: Tail : 0.095, Head : 0.06916, Avg : 0.08208
2023-05-23 15:10:11,177 - [INFO] - [Epoch 196]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 15:10:12,234 - [INFO] - [E:197| 0]: Train Loss:0.0028602, Val MRR:0.08254, ice00001
2023-05-23 15:11:48,193 - [INFO] - [E:197| 100]: Train Loss:0.0029013, Val MRR:0.08254, ice00001
2023-05-23 15:13:23,850 - [INFO] - [E:197| 200]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 15:14:59,924 - [INFO] - [E:197| 300]: Train Loss:0.0028997, Val MRR:0.08254, ice00001
2023-05-23 15:16:26,337 - [INFO] - [E:197| 400]: Train Loss:0.0028998, Val MRR:0.08254, ice00001
2023-05-23 15:18:03,068 - [INFO] - [E:197| 500]: Train Loss:0.002901, Val MRR:0.08254, ice00001
2023-05-23 15:19:39,427 - [INFO] - [E:197| 600]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 15:21:15,283 - [INFO] - [E:197| 700]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 15:22:51,602 - [INFO] - [E:197| 800]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 15:24:17,617 - [INFO] - [E:197| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 15:25:25,245 - [INFO] - [Epoch:197]: Training Loss:0.002901
2023-05-23 15:25:25,575 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 15:25:47,661 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 15:26:09,512 - [INFO] - [Evaluating Epoch 197 valid]:
MRR: Tail : 0.09551, Head : 0.06906, Avg : 0.08228
2023-05-23 15:26:09,512 - [INFO] - [Epoch 197]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 15:26:10,534 - [INFO] - [E:198| 0]: Train Loss:0.0029275, Val MRR:0.08254, ice00001
2023-05-23 15:27:46,491 - [INFO] - [E:198| 100]: Train Loss:0.0029078, Val MRR:0.08254, ice00001
2023-05-23 15:29:22,606 - [INFO] - [E:198| 200]: Train Loss:0.0029031, Val MRR:0.08254, ice00001
2023-05-23 15:30:58,454 - [INFO] - [E:198| 300]: Train Loss:0.0029032, Val MRR:0.08254, ice00001
2023-05-23 15:32:24,891 - [INFO] - [E:198| 400]: Train Loss:0.0029023, Val MRR:0.08254, ice00001
2023-05-23 15:34:00,746 - [INFO] - [E:198| 500]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 15:35:36,804 - [INFO] - [E:198| 600]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 15:37:15,877 - [INFO] - [E:198| 700]: Train Loss:0.0029009, Val MRR:0.08254, ice00001
2023-05-23 15:38:51,924 - [INFO] - [E:198| 800]: Train Loss:0.0029011, Val MRR:0.08254, ice00001
2023-05-23 15:40:17,910 - [INFO] - [E:198| 900]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 15:41:25,816 - [INFO] - [Epoch:198]: Training Loss:0.002902
2023-05-23 15:41:26,068 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 15:41:48,207 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 15:42:10,113 - [INFO] - [Evaluating Epoch 198 valid]:
MRR: Tail : 0.09404, Head : 0.0688, Avg : 0.08142
2023-05-23 15:42:10,113 - [INFO] - [Epoch 198]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 15:42:11,159 - [INFO] - [E:199| 0]: Train Loss:0.0028617, Val MRR:0.08254, ice00001
2023-05-23 15:43:47,102 - [INFO] - [E:199| 100]: Train Loss:0.0029076, Val MRR:0.08254, ice00001
2023-05-23 15:45:23,269 - [INFO] - [E:199| 200]: Train Loss:0.0029038, Val MRR:0.08254, ice00001
2023-05-23 15:46:59,271 - [INFO] - [E:199| 300]: Train Loss:0.0029051, Val MRR:0.08254, ice00001
2023-05-23 15:48:25,214 - [INFO] - [E:199| 400]: Train Loss:0.002904, Val MRR:0.08254, ice00001
2023-05-23 15:50:00,923 - [INFO] - [E:199| 500]: Train Loss:0.0029021, Val MRR:0.08254, ice00001
2023-05-23 15:51:37,086 - [INFO] - [E:199| 600]: Train Loss:0.0029015, Val MRR:0.08254, ice00001
2023-05-23 15:53:12,833 - [INFO] - [E:199| 700]: Train Loss:0.0029017, Val MRR:0.08254, ice00001
2023-05-23 15:54:49,188 - [INFO] - [E:199| 800]: Train Loss:0.0029018, Val MRR:0.08254, ice00001
2023-05-23 15:56:22,141 - [INFO] - [E:199| 900]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 15:57:23,747 - [INFO] - [Epoch:199]: Training Loss:0.002902
2023-05-23 15:57:24,241 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 15:57:46,331 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 15:58:08,452 - [INFO] - [Evaluating Epoch 199 valid]:
MRR: Tail : 0.09459, Head : 0.06915, Avg : 0.08187
MR: Tail : 901.29, Head : 1054.1, Avg : 977.68
Hit-1: Tail : 0.03848, Head : 0.02779, Avg : 0.03314
Hit-3: Tail : 0.09892, Head : 0.06142, Avg : 0.08017
Hit-10: Tail : 0.20646, Head : 0.15876, Avg : 0.18261
2023-05-23 15:58:08,453 - [INFO] - [Epoch 199]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 15:58:09,270 - [INFO] - [E:200| 0]: Train Loss:0.0028972, Val MRR:0.08254, ice00001
2023-05-23 15:59:45,628 - [INFO] - [E:200| 100]: Train Loss:0.0029048, Val MRR:0.08254, ice00001
2023-05-23 16:01:21,630 - [INFO] - [E:200| 200]: Train Loss:0.0029024, Val MRR:0.08254, ice00001
2023-05-23 16:02:57,533 - [INFO] - [E:200| 300]: Train Loss:0.0029014, Val MRR:0.08254, ice00001
2023-05-23 16:04:31,103 - [INFO] - [E:200| 400]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-23 16:05:58,641 - [INFO] - [E:200| 500]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-23 16:07:35,839 - [INFO] - [E:200| 600]: Train Loss:0.0028993, Val MRR:0.08254, ice00001
2023-05-23 16:09:11,700 - [INFO] - [E:200| 700]: Train Loss:0.0028999, Val MRR:0.08254, ice00001
2023-05-23 16:10:48,333 - [INFO] - [E:200| 800]: Train Loss:0.0029004, Val MRR:0.08254, ice00001
2023-05-23 16:12:24,622 - [INFO] - [E:200| 900]: Train Loss:0.0029012, Val MRR:0.08254, ice00001
2023-05-23 16:13:22,005 - [INFO] - [Epoch:200]: Training Loss:0.002901
2023-05-23 16:13:22,344 - [INFO] - [Valid, Tail_Batch Step 0] ice00001
2023-05-23 16:13:44,761 - [INFO] - [Valid, Head_Batch Step 0] ice00001
2023-05-23 16:14:06,398 - [INFO] - [Evaluating Epoch 200 valid]:
MRR: Tail : 0.09471, Head : 0.06957, Avg : 0.08214
2023-05-23 16:14:06,398 - [INFO] - [Epoch 200]: Training Loss: 0.0029015, Valid MRR: 0.08254,
2023-05-23 16:14:07,538 - [INFO] - [E:201| 0]: Train Loss:0.0028923, Val MRR:0.08254, ice00001
2023-05-23 16:15:43,288 - [INFO] - [E:201| 100]: Train Loss:0.0028965, Val MRR:0.08254, ice00001
2023-05-23 16:17:19,346 - [INFO] - [E:201| 200]: Train Loss:0.0028998, Val MRR:0.08254, ice00001