
mixgcf's People

Contributors

huangtinglin


mixgcf's Issues

inconsistent learning curve & cannot reproduce the performance in the paper

The learning curve in my experiments is as follows; with it I cannot reproduce the performance reported in the original paper.
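For reference, the loss curve can be pulled out of a log like the one below with a small script. This is a hypothetical helper, not part of the repository; it only assumes the exact "training loss at epoch N: L" line format shown in the log.

```python
import re

# Matches log lines such as:
#   "using time 31.0734s, training loss at epoch 1: 247.9624"
LOSS_RE = re.compile(r"training loss at epoch (\d+): ([\d.]+)")

def parse_loss_curve(log_text):
    """Return a list of (epoch, loss) tuples found in the log text."""
    return [(int(epoch), float(loss)) for epoch, loss in LOSS_RE.findall(log_text)]

sample = """\
using time 31.0734s, training loss at epoch 1: 247.9624
using time 30.8402s, training loss at epoch 2: 246.7937
"""
print(parse_loss_curve(sample))  # [(1, 247.9624), (2, 246.7937)]
```

The resulting (epoch, loss) pairs can then be plotted to compare learning curves across runs.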

python main.py --dataset ali --dim 64 --lr 0.001 --batch_size 2048 --gpu_id 0 --context_hops 3 --pool mean --ns mixgcf --K 1 --n_negs 32
reading train and test user-item set ...
building the adj mat ...
loading over ...
start training ...
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------------------+------------------------------------+
| 0 | 31.091588735580444 | 65.43453598022461 | 248.38352966308594 | [0.00521209 0.00675287 0.00855975] | [0.00290582 0.00323882 0.00357563] | [0.00033437 0.00021855 0.00018362] | [0.00657713 0.00846616 0.0105896 ] |
| 0 | 31.091588735580444 | 61.18087434768677 | 248.38352966308594 | [0.00042361 0.00096173 0.00145551] | [0.00012074 0.00023461 0.0003275 ] | [2.36431073e-05 2.91844606e-05 3.05390136e-05] | [0.00047286 0.00116738 0.00183234] |
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------------------+------------------------------------+
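For readers comparing the metric columns above: recall, NDCG, precision, and hit ratio are each reported as an array over several cutoffs K. A generic per-user sketch of the two headline metrics is below; this is a standard textbook formulation with binary relevance, not necessarily the repository's exact evaluation code.

```python
import math

def recall_ndcg_at_k(ranked_items, relevant, k):
    """Generic recall@K and NDCG@K for a single user.

    ranked_items: item ids sorted by predicted score, best first
    relevant:     set of ground-truth positive items for this user
    """
    top_k = ranked_items[:k]
    hits = [1.0 if item in relevant else 0.0 for item in top_k]
    recall = sum(hits) / max(len(relevant), 1)
    # DCG with binary relevance; IDCG assumes all positives ranked first.
    dcg = sum(h / math.log2(i + 2) for i, h in enumerate(hits))
    idcg = sum(1.0 / math.log2(i + 2) for i in range(min(len(relevant), k)))
    ndcg = dcg / idcg if idcg > 0 else 0.0
    return recall, ndcg

# Toy check: 2 of 3 relevant items appear in the top-3.
recall, ndcg = recall_ndcg_at_k([1, 9, 2, 3], {1, 2, 3}, 3)
print(recall, ndcg)
```

The reported numbers are then these per-user values averaged over all test users, which is why they fall as K shrinks for precision but grow for recall and hit ratio.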
using time 31.0734s, training loss at epoch 1: 247.9624
using time 30.8402s, training loss at epoch 2: 246.7937
using time 30.1051s, training loss at epoch 3: 245.5179
using time 30.0405s, training loss at epoch 4: 244.6414
+-------+-------------------+--------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+--------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 5 | 30.29911184310913 | 66.6713593006134 | 243.82125854492188 | [0.03267104 0.05192487 0.0667467 ] | [0.01398339 0.01813131 0.02091478] | [0.00204967 0.0016398 0.00141378] | [0.03950416 0.06233798 0.07994595] |
| 5 | 30.29911184310913 | 55.695730686187744 | 243.82125854492188 | [0.00274051 0.00538618 0.00803865] | [0.00106481 0.00160977 0.0020967 ] | [0.00014999 0.00014925 0.00015196] | [0.00295539 0.00588122 0.0089105 ] |
+-------+-------------------+--------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 30.4237s, training loss at epoch 6: 242.8385
using time 29.9755s, training loss at epoch 7: 241.7055
using time 30.1087s, training loss at epoch 8: 240.0855
using time 30.0281s, training loss at epoch 9: 238.1280
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 10 | 30.125856161117554 | 68.09304714202881 | 236.12973022460938 | [0.04374692 0.06633123 0.08247408] | [0.01900985 0.02387068 0.0268808 ] | [0.00267636 0.00205208 0.00171185] | [0.05183112 0.07824996 0.09719541] |
| 10 | 30.125856161117554 | 54.29553246498108 | 236.12973022460938 | [0.00637994 0.01126802 0.01629268] | [0.00256767 0.00357393 0.00448176] | [0.00033618 0.00030108 0.00029357] | [0.00664962 0.01192499 0.01737768] |
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 30.4452s, training loss at epoch 11: 234.4755
using time 30.3866s, training loss at epoch 12: 232.9239
using time 29.9950s, training loss at epoch 13: 231.6173
using time 30.3294s, training loss at epoch 14: 230.3941
+-------+--------------------+------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+--------------------+------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 15 | 30.094027757644653 | 65.6173038482666 | 229.18226623535156 | [0.04888425 0.07312196 0.09081854] | [0.02147865 0.02666906 0.0299837 ] | [0.0029666 0.00223719 0.0018711 ] | [0.05749821 0.08576471 0.10641994] |
| 15 | 30.094027757644653 | 58.8669114112854 | 229.18226623535156 | [0.00720684 0.01260252 0.01798143] | [0.00288381 0.00400238 0.00497617] | [0.00037903 0.00033728 0.00032534] | [0.00752146 0.0133288 0.01931346] |
+-------+--------------------+------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 30.5914s, training loss at epoch 16: 228.0764
using time 30.2541s, training loss at epoch 17: 227.0421
using time 30.1063s, training loss at epoch 18: 225.9921
using time 30.0690s, training loss at epoch 19: 224.9205
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 20 | 29.969733953475952 | 64.87095975875854 | 223.89605712890625 | [0.05175932 0.07734852 0.09704839] | [0.02289635 0.02838933 0.03207255] | [0.00312241 0.00235577 0.00198923] | [0.06060063 0.09041145 0.11343831] |
| 20 | 29.969733953475952 | 61.04182410240173 | 223.89605712890625 | [0.0073636 0.01358592 0.01887642] | [0.00296721 0.00425694 0.00521806] | [0.00038789 0.00036167 0.00034135] | [0.00769879 0.01433363 0.02024441] |
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 30.5000s, training loss at epoch 21: 222.7980
using time 30.4034s, training loss at epoch 22: 221.7419
using time 30.0427s, training loss at epoch 23: 220.7088
using time 30.2865s, training loss at epoch 24: 219.6499
+-------+-------------------+-------------------+------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 25 | 30.06094980239868 | 66.04566931724548 | 218.554443359375 | [0.05453842 0.08171983 0.10125716] | [0.02430466 0.03011026 0.03376734] | [0.00327685 0.00248194 0.00207288] | [0.0636479 0.09541669 0.11816778] |
| 25 | 30.06094980239868 | 60.66896033287048 | 218.554443359375 | [0.00776775 0.01394068 0.01965047] | [0.00313556 0.00440959 0.00544411] | [0.00040784 0.00037312 0.00035588] | [0.00809776 0.01476217 0.02111625] |
+-------+-------------------+-------------------+------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 29.9762s, training loss at epoch 26: 217.4170
using time 30.3889s, training loss at epoch 27: 216.4077
using time 30.1408s, training loss at epoch 28: 215.2682
using time 30.2326s, training loss at epoch 29: 214.2147
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 30 | 30.09700036048889 | 63.66904306411743 | 213.06771850585938 | [0.05622589 0.08382674 0.10392809] | [0.0251786 0.03109215 0.03484458] | [0.00336165 0.00254019 0.0021193 ] | [0.0653301 0.09778832 0.12095306] |
| 30 | 30.09700036048889 | 60.36732816696167 | 213.06771850585938 | [0.00796773 0.0143803 0.02017037] | [0.00316652 0.00449104 0.0055425 ] | [0.00042188 0.00038457 0.00036622] | [0.00837853 0.01523503 0.0217221 ] |
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 30.2859s, training loss at epoch 31: 211.9128
using time 30.2449s, training loss at epoch 32: 210.7381
using time 30.4290s, training loss at epoch 33: 209.6218
using time 30.2707s, training loss at epoch 34: 208.4770
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 35 | 30.465541124343872 | 65.57251191139221 | 207.28221130371094 | [0.05790284 0.08554036 0.10670552] | [0.02583893 0.03174864 0.03569791] | [0.00345954 0.00258949 0.00217376] | [0.06721913 0.09976008 0.12406927] |
| 35 | 30.465541124343872 | 59.63603067398071 | 207.28221130371094 | [0.00799974 0.01463398 0.0204451 ] | [0.00311626 0.00449703 0.00555131] | [0.00042114 0.00039233 0.00037115] | [0.00836375 0.01554534 0.02201764] |
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 30.2523s, training loss at epoch 36: 206.0771
using time 30.2868s, training loss at epoch 37: 204.8871
using time 30.0030s, training loss at epoch 38: 203.6991
using time 30.0778s, training loss at epoch 39: 202.4524
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 40 | 30.12627100944519 | 68.39437079429626 | 201.25155639648438 | [0.05871282 0.08677589 0.107119 ] | [0.02635491 0.03235851 0.03615478] | [0.00350505 0.00262499 0.00217882] | [0.06818433 0.10102863 0.12430368] |
| 40 | 30.12627100944519 | 59.75320839881897 | 201.25155639648438 | [0.00839133 0.01536174 0.02102112] | [0.00326422 0.00470726 0.00573925] | [0.00044109 0.00041191 0.00038395] | [0.00876273 0.01628419 0.02266783] |
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 56.2995s, training loss at epoch 41: 199.9825
using time 56.2676s, training loss at epoch 42: 198.7187
using time 56.2418s, training loss at epoch 43: 197.4769
using time 59.8346s, training loss at epoch 44: 196.2529
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 45 | 54.07617378234863 | 81.81470656394958 | 194.9663543701172 | [0.05923396 0.0878324 0.10826348] | [0.02672194 0.03283609 0.03665353] | [0.00352435 0.00265326 0.00220134] | [0.06861177 0.10210413 0.12562738] |
| 45 | 54.07617378234863 | 61.48608136177063 | 194.9663543701172 | [0.00890668 0.01557355 0.02180065] | [0.00341919 0.00480017 0.00593079] | [0.00046843 0.00041893 0.00039799] | [0.00932425 0.01655018 0.02353967] |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 61.3913s, training loss at epoch 46: 193.6214
using time 61.5744s, training loss at epoch 47: 192.3505
using time 53.0488s, training loss at epoch 48: 191.0624
using time 57.3518s, training loss at epoch 49: 189.6828
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 50 | 55.75839567184448 | 73.32510209083557 | 188.50856018066406 | [0.05962149 0.08819743 0.10851732] | [0.02688112 0.03298699 0.03678932] | [0.00355262 0.00266188 0.00220433] | [0.0691771 0.10240748 0.12564117] |
| 50 | 55.75839567184448 | 62.73708772659302 | 188.50856018066406 | [0.00890853 0.01602211 0.02225209] | [0.00345502 0.0049262 0.00606238] | [0.00046991 0.00043001 0.00040661] | [0.00933903 0.01696393 0.02405686] |
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 52.4998s, training loss at epoch 51: 187.1704
using time 61.1787s, training loss at epoch 52: 185.9580
using time 60.7039s, training loss at epoch 53: 184.6470
using time 55.1603s, training loss at epoch 54: 183.2871
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 55 | 56.71571135520935 | 63.89519238471985 | 181.9259033203125 | [0.05994709 0.08813546 0.10800011] | [0.02702002 0.03304437 0.03675533] | [0.00356641 0.00265912 0.00219307] | [0.06938393 0.10232475 0.12507584] |
| 55 | 56.71571135520935 | 65.39995098114014 | 181.9259033203125 | [0.00916466 0.01660099 0.02300818] | [0.00355184 0.00509347 0.00626139] | [0.00048321 0.00044626 0.00042065] | [0.00960501 0.01761411 0.02486959] |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 61.0612s, training loss at epoch 56: 180.5965
using time 60.7789s, training loss at epoch 57: 179.2776
using time 55.0802s, training loss at epoch 58: 177.9432
using time 55.4764s, training loss at epoch 59: 176.6102
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 60 | 52.30838966369629 | 72.00008416175842 | 175.3128204345703 | [0.0604954 0.08878779 0.10761584] | [0.02719499 0.03324279 0.03676701] | [0.00359467 0.00267429 0.00218479] | [0.06997684 0.10294523 0.12467597] |
| 60 | 52.30838966369629 | 64.03703212738037 | 175.3128204345703 | [0.00957226 0.01719687 0.02354004] | [0.00369312 0.00526995 0.0064321 ] | [0.00050611 0.00046141 0.00043247] | [0.0100631 0.01824952 0.02553456] |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 61.5391s, training loss at epoch 61: 173.9547
using time 57.4665s, training loss at epoch 62: 172.6278
using time 49.1362s, training loss at epoch 63: 171.2968
using time 57.1014s, training loss at epoch 64: 169.8966
+-------+-------------------+-------------------+------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 65 | 61.46128273010254 | 72.70320224761963 | 168.510986328125 | [0.06078095 0.08864303 0.10725393] | [0.02744477 0.0334033 0.03687473] | [0.00361329 0.00267153 0.00217445] | [0.07030776 0.10264188 0.124152 ] |
| 65 | 61.46128273010254 | 65.27112889289856 | 168.510986328125 | [0.00984167 0.01741039 0.02402385] | [0.00383494 0.00540759 0.00661704] | [0.00051867 0.00046843 0.00044257] | [0.01031431 0.01854506 0.02611086] |
+-------+-------------------+-------------------+------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 51.4162s, training loss at epoch 66: 167.2891
using time 54.1614s, training loss at epoch 67: 166.0730
using time 56.3973s, training loss at epoch 68: 164.6640
using time 61.3277s, training loss at epoch 69: 163.2901
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 70 | 61.480873346328735 | 72.27342653274536 | 161.89915466308594 | [0.06080242 0.08839898 0.10695825] | [0.02754364 0.03343839 0.03690032] | [0.00361811 0.00265912 0.00216572] | [0.07040428 0.10233854 0.12377971] |
| 70 | 61.480873346328735 | 65.96440029144287 | 161.89915466308594 | [0.01048126 0.01807967 0.02429984] | [0.00403728 0.00561031 0.00674681] | [0.00055414 0.0004869 0.00044823] | [0.0110236 0.01926913 0.02645073] |
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 55.5515s, training loss at epoch 71: 160.5596
using time 57.3824s, training loss at epoch 72: 159.1740
using time 61.6448s, training loss at epoch 73: 157.8662
using time 57.8451s, training loss at epoch 74: 156.5100
+-------+------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 75 | 57.1681547164917 | 71.26590704917908 | 155.1714324951172 | [0.0608253 0.08766091 0.10613758] | [0.02754331 0.03327922 0.03672755] | [0.00361191 0.00263223 0.00214596] | [0.07032155 0.10127682 0.12263527] |
| 75 | 57.1681547164917 | 66.2656397819519 | 155.1714324951172 | [0.01059775 0.01825416 0.02477699] | [0.0041418 0.00573637 0.00692831] | [0.00056152 0.00049466 0.0004571 ] | [0.01115659 0.01957945 0.02696792] |
+-------+------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 55.5267s, training loss at epoch 76: 153.8907
using time 51.4084s, training loss at epoch 77: 152.6562
using time 61.3008s, training loss at epoch 78: 151.2225
using time 60.9705s, training loss at epoch 79: 150.0665
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 80 | 61.27260947227478 | 72.37369775772095 | 148.62730407714844 | [0.06048988 0.08703184 0.10484461] | [0.02752848 0.03321087 0.03653239] | [0.00358019 0.00261638 0.00211953] | [0.06981137 0.10055981 0.12107716] |
| 80 | 61.27260947227478 | 72.20736956596375 | 148.62730407714844 | [0.010988 0.01878256 0.02496756] | [0.00426408 0.00588429 0.0070164 ] | [0.00058073 0.00050833 0.00046153] | [0.01151124 0.02011142 0.02718957] |
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 57.9172s, training loss at epoch 81: 147.3903
using time 61.3224s, training loss at epoch 82: 146.0894
using time 61.2309s, training loss at epoch 83: 144.7690
using time 61.1550s, training loss at epoch 84: 143.5496
+-------+--------------------+-------------------+------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+--------------------+-------------------+------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 85 | 61.057267904281616 | 80.62557077407837 | 142.229248046875 | [0.060069 0.0864432 0.10467053] | [0.02742911 0.0330851 0.03648765] | [0.00356089 0.00259466 0.00211424] | [0.06942529 0.09976008 0.12063593] |
| 85 | 61.057267904281616 | 78.980961561203 | 142.229248046875 | [0.01115312 0.01902761 0.02539848] | [0.004374 0.00600581 0.0071688 ] | [0.00059182 0.00051424 0.00046868] | [0.01174767 0.02036263 0.0276181 ] |
+-------+--------------------+-------------------+------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 61.6484s, training loss at epoch 86: 140.9525
using time 61.5718s, training loss at epoch 87: 139.5781
using time 53.1574s, training loss at epoch 88: 138.5421
using time 61.0660s, training loss at epoch 89: 137.1131
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 90 | 56.21543478965759 | 74.0668478012085 | 135.8595428466797 | [0.06029673 0.08632698 0.10413783] | [0.02748119 0.03306076 0.0363788 ] | [0.0035733 0.0025919 0.00210114] | [0.06963212 0.09963598 0.11984998] |
| 90 | 56.21543478965759 | 65.78000569343567 | 135.8595428466797 | [0.011368 0.01943767 0.02577291] | [0.00443624 0.00610965 0.00727319] | [0.00060068 0.00052495 0.00047705] | [0.01195455 0.02077638 0.02807619] |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 61.4226s, training loss at epoch 91: 134.7610
using time 61.4137s, training loss at epoch 92: 133.4752
using time 61.3878s, training loss at epoch 93: 132.2789
using time 58.6677s, training loss at epoch 94: 130.9853
+-------+-------------------+-------------------+------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 95 | 54.90701508522034 | 74.9561927318573 | 129.793701171875 | [0.06031008 0.08549781 0.10354819] | [0.02758384 0.03298663 0.03635247] | [0.00356916 0.00256191 0.00208735] | [0.06954939 0.09867078 0.11911919] |
| 95 | 54.90701508522034 | 64.28854656219482 | 129.793701171875 | [0.01154913 0.01938977 0.02628916] | [0.0045309 0.0061671 0.0074286] | [0.00061177 0.00052606 0.00048542] | [0.01216142 0.02080593 0.02860816] |
+-------+-------------------+-------------------+------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 61.2051s, training loss at epoch 96: 128.5379
using time 61.5208s, training loss at epoch 97: 127.3532
using time 60.2401s, training loss at epoch 98: 126.0964
using time 48.6408s, training loss at epoch 99: 125.0246
+-------+-------------------+--------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+--------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 100 | 56.63795471191406 | 71.98336577415466 | 123.93643188476562 | [0.05981781 0.0848489 0.10252626] | [0.02742541 0.03280461 0.0360974 ] | [0.00353952 0.00254502 0.00206759] | [0.06899785 0.0979262 0.11791959] |
| 100 | 56.63795471191406 | 63.808828592300415 | 123.93643188476562 | [0.01212051 0.01987887 0.02651767] | [0.00469898 0.00631634 0.00753096] | [0.00064206 0.00054157 0.00049232] | [0.01276728 0.02138224 0.02897758] |
+-------+-------------------+--------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 61.6196s, training loss at epoch 101: 122.6348
using time 53.4672s, training loss at epoch 102: 121.5805
using time 56.3378s, training loss at epoch 103: 120.4517
using time 54.8895s, training loss at epoch 104: 119.2148
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 105 | 58.23894691467285 | 73.30016875267029 | 118.1908950805664 | [0.05975909 0.08438657 0.10125123] | [0.02719437 0.03247952 0.03562147] | [0.00352987 0.00252709 0.00204047] | [0.06876344 0.09723678 0.116458 ] |
| 105 | 58.23894691467285 | 64.98772764205933 | 118.1908950805664 | [0.01216596 0.01997112 0.02674816] | [0.00474076 0.00637197 0.00761583] | [0.0006428 0.00054305 0.00049724] | [0.01279683 0.0214709 0.02925835] |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 61.4688s, training loss at epoch 106: 117.0977
using time 56.0988s, training loss at epoch 107: 115.9931
using time 55.1008s, training loss at epoch 108: 114.9028
using time 57.8720s, training loss at epoch 109: 113.7076
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 110 | 56.25667762756348 | 69.23644590377808 | 112.66089630126953 | [0.05951459 0.08391025 0.10031907] | [0.02725291 0.03248109 0.03553701] | [0.00351952 0.00251262 0.00202048] | [0.0685704 0.09661629 0.11535492] |
| 110 | 56.25667762756348 | 64.54259514808655 | 112.66089630126953 | [0.01216557 0.02022392 0.02703194] | [0.00479262 0.00648006 0.00773212] | [0.00064649 0.00055155 0.00050414] | [0.01282639 0.02181077 0.02959821] |
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 58.0530s, training loss at epoch 111: 111.7549
using time 54.2299s, training loss at epoch 112: 110.6006
using time 52.6575s, training loss at epoch 113: 109.5813
using time 57.2345s, training loss at epoch 114: 108.5650
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 115 | 61.252628564834595 | 69.748779296875 | 107.52191925048828 | [0.05916762 0.08317484 0.09960298] | [0.02714653 0.03230273 0.03535207] | [0.00349539 0.00249331 0.0020021 ] | [0.06806023 0.09591308 0.11448624] |
| 115 | 61.252628564834595 | 63.68185591697693 | 107.52191925048828 | [0.01247269 0.02060615 0.02710984] | [0.00488623 0.00658658 0.00778157] | [0.00066053 0.000563 0.00050488] | [0.0131367 0.02225407 0.02965732] |
+-------+--------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 55.7055s, training loss at epoch 116: 106.5542
using time 49.4647s, training loss at epoch 117: 105.5309
using time 60.0435s, training loss at epoch 118: 104.4908
using time 61.3930s, training loss at epoch 119: 103.6087
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 120 | 61.34108638763428 | 70.57331323623657 | 102.66134643554688 | [0.05852152 0.08252356 0.09858378] | [0.02699763 0.03214772 0.03513272] | [0.00345265 0.00246953 0.00198141] | [0.06732944 0.09503061 0.113328 ] |
| 120 | 61.34108638763428 | 67.1350154876709 | 102.66134643554688 | [0.01265629 0.02077004 0.02746744] | [0.00496587 0.00666726 0.00789411] | [0.00067235 0.00057039 0.00051301] | [0.01335836 0.02249051 0.03013018] |
+-------+-------------------+-------------------+--------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 54.8440s, training loss at epoch 121: 101.6870
using time 57.6139s, training loss at epoch 122: 100.7853
using time 61.2308s, training loss at epoch 123: 99.6899
using time 61.0229s, training loss at epoch 124: 98.8919
+-------+--------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+--------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 125 | 53.175352573394775 | 72.36662292480469 | 98.02346801757812 | [0.05821516 0.08198879 0.09858603] | [0.02685921 0.03195622 0.03505021] | [0.00344024 0.00245195 0.00198371] | [0.06702609 0.0944377 0.11335558] |
| 125 | 53.175352573394775 | 68.25410676002502 | 98.02346801757812 | [0.01278178 0.0209246 0.02750931] | [0.00501148 0.00671738 0.00792492] | [0.000679 0.00057261 0.00051276] | [0.01349135 0.02260872 0.03013018] |
+-------+--------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 56.1590s, training loss at epoch 126: 97.0603
using time 61.5887s, training loss at epoch 127: 96.2543
using time 53.2827s, training loss at epoch 128: 95.3318
using time 61.3500s, training loss at epoch 129: 94.3620
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 130 | 61.43909931182861 | 72.56485509872437 | 93.71167755126953 | [0.05815347 0.08137359 0.09798078] | [0.02680698 0.03178421 0.03487999] | [0.00343266 0.00243437 0.00197176] | [0.06684684 0.09367933 0.11254206] |
| 130 | 61.43909931182861 | 66.42108464241028 | 93.71167755126953 | [0.01294198 0.02081229 0.02778291] | [0.00508303 0.00673805 0.00801623] | [0.00068787 0.0005715 0.00051842] | [0.01366867 0.02253484 0.0304405 ] |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 56.7190s, training loss at epoch 131: 92.7020
using time 61.4483s, training loss at epoch 132: 92.0627
using time 61.2446s, training loss at epoch 133: 91.0750
using time 61.3705s, training loss at epoch 134: 90.3120
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 135 | 59.90675187110901 | 74.92616176605225 | 89.52626037597656 | [0.0572607 0.08069166 0.09707933] | [0.02643825 0.03146974 0.03452139] | [0.00339198 0.00242023 0.0019559 ] | [0.06601952 0.09308643 0.11167338] |
| 135 | 59.90675187110901 | 68.7034010887146 | 89.52626037597656 | [0.01308878 0.02122309 0.02819235] | [0.00515679 0.0068574 0.00813695] | [0.00069599 0.00058147 0.00052631] | [0.01383122 0.02294859 0.03091336] |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 61.3432s, training loss at epoch 136: 88.6954
using time 61.5738s, training loss at epoch 137: 87.8204
using time 61.4004s, training loss at epoch 138: 87.2404
using time 53.3841s, training loss at epoch 139: 86.5673
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 140 | 56.09763717651367 | 74.97895050048828 | 85.81751251220703 | [0.05723308 0.08040669 0.09605745] | [0.02632316 0.03128503 0.03420119] | [0.00338991 0.00240955 0.00193315] | [0.06599195 0.09267277 0.11040483] |
| 140 | 56.09763717651367 | 66.08892011642456 | 85.81751251220703 | [0.01311721 0.02152382 0.02839172] | [0.00520957 0.00696639 0.00822819] | [0.00069673 0.00058849 0.00053098] | [0.01384599 0.02324413 0.03119412] |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 56.3472s, training loss at epoch 141: 85.0548
using time 58.7998s, training loss at epoch 142: 84.3047
using time 61.4294s, training loss at epoch 143: 83.7744
using time 58.6572s, training loss at epoch 144: 82.8952
+-------+--------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+--------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 145 | 55.205522775650024 | 75.96521496772766 | 82.24386596679688 | [0.05664996 0.07976399 0.09554302] | [0.02623511 0.03119453 0.03412855] | [0.0033451 0.00238921 0.00191959] | [0.06512327 0.09190061 0.10968783] |
| 145 | 55.205522775650024 | 64.1243736743927 | 82.24386596679688 | [0.01310208 0.02145829 0.02856967] | [0.00524888 0.0069997 0.00830331] | [0.00069673 0.0005896 0.00053394] | [0.01386077 0.02325891 0.03135667] |
+-------+--------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 61.4994s, training loss at epoch 146: 81.5304
using time 61.4106s, training loss at epoch 147: 80.9708
using time 56.8084s, training loss at epoch 148: 80.2756
using time 55.7782s, training loss at epoch 149: 79.6424
+-------+--------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+--------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 150 | 50.554200887680054 | 70.90792417526245 | 78.94828033447266 | [0.05640211 0.07950339 0.09474983] | [0.02601684 0.03095786 0.03380066] | [0.00332993 0.00237783 0.00190558] | [0.0648475 0.09148696 0.10892946] |
| 150 | 50.554200887680054 | 61.43150043487549 | 78.94828033447266 | [0.01309849 0.02139094 0.0286726 ] | [0.00526433 0.00700612 0.00834215] | [0.00069599 0.00058775 0.00053616] | [0.01383122 0.0231998 0.03148966] |
+-------+--------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 61.6005s, training loss at epoch 151: 78.3064
using time 59.4658s, training loss at epoch 152: 77.6508
using time 47.7446s, training loss at epoch 153: 76.9794
using time 55.2966s, training loss at epoch 154: 76.5720
+-------+------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 155 | 60.0142605304718 | 72.03431415557861 | 75.79754638671875 | [0.05617375 0.07911774 0.09454723] | [0.02598776 0.03090085 0.0337768 ] | [0.00331752 0.00236818 0.00190236] | [0.06461309 0.09103193 0.108764 ] |
| 155 | 60.0142605304718 | 63.03007531166077 | 75.79754638671875 | [0.0131688 0.0217521 0.02842926] | [0.00526362 0.00706775 0.00829078] | [0.00069969 0.00059921 0.00053074] | [0.01389033 0.02364311 0.03114979] |
+-------+------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 53.3853s, training loss at epoch 156: 75.3833
using time 55.3694s, training loss at epoch 157: 74.7915
using time 55.1050s, training loss at epoch 158: 74.2417
using time 58.6737s, training loss at epoch 159: 73.5959
+-------+--------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+--------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 160 | 61.216028690338135 | 69.82913279533386 | 73.18667602539062 | [0.05571866 0.07819073 0.0938651 ] | [0.02573167 0.03053834 0.03346321] | [0.00329684 0.0023406 0.00188903] | [0.06415807 0.09001158 0.10795047] |
| 160 | 61.216028690338135 | 61.38755416870117 | 73.18667602539062 | [0.0131379 0.02177722 0.02829308] | [0.00529164 0.00709775 0.00829777] | [0.00069969 0.00059773 0.00052926] | [0.0139051 0.02359878 0.03106113] |
+-------+--------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 57.2767s, training loss at epoch 161: 72.5360
using time 55.7104s, training loss at epoch 162: 72.0784
using time 57.1863s, training loss at epoch 163: 71.5115
using time 61.2707s, training loss at epoch 164: 70.9790
+-------+--------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+--------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 165 | 52.004945516586304 | 87.83489203453064 | 70.55828094482422 | [0.05562597 0.07809684 0.09320438] | [0.02567386 0.03046705 0.03328587] | [0.00329408 0.00233647 0.00187639] | [0.06408913 0.08985991 0.10716452] |
| 165 | 52.004945516586304 | 81.4789366722107 | 70.55828094482422 | [0.01326709 0.02195099 0.02848608] | [0.00533873 0.00715758 0.00835462] | [0.00070634 0.00060549 0.00053296] | [0.01405287 0.02387954 0.03125323] |
+-------+--------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 63.4631s, training loss at epoch 166: 69.9598
using time 67.4411s, training loss at epoch 167: 69.5398
using time 53.0752s, training loss at epoch 168: 69.1183
using time 60.7449s, training loss at epoch 169: 68.5918
+-------+-------------------+-------------------+------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 170 | 62.34258460998535 | 72.9967155456543 | 68.1095962524414 | [0.05486664 0.07713276 0.09259751] | [0.02538383 0.0301467 0.03302856] | [0.00324513 0.00230717 0.00186214] | [0.06313772 0.08875683 0.10641994] |
| 170 | 62.34258460998535 | 67.57762932777405 | 68.1095962524414 | [0.01318259 0.0220134 0.02812577] | [0.00535091 0.00719922 0.0083236 ] | [0.0007056 0.00060696 0.00052655] | [0.01399376 0.02392387 0.03089859] |
+-------+-------------------+-------------------+------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 59.3640s, training loss at epoch 171: 67.5686
using time 53.7982s, training loss at epoch 172: 67.1283
using time 62.7492s, training loss at epoch 173: 66.8620
using time 70.4158s, training loss at epoch 174: 66.3644
+-------+------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 175 | 70.5975353717804 | 74.8909866809845 | 65.78717041015625 | [0.05508117 0.07692773 0.0922598 ] | [0.02538126 0.03005674 0.03291411] | [0.00325203 0.002302 0.00185548] | [0.06328939 0.08848105 0.10607523] |
| 175 | 70.5975353717804 | 73.23875141143799 | 65.78717041015625 | [0.01332395 0.02190504 0.02814469] | [0.00539165 0.00719258 0.00833902] | [0.00071003 0.00060438 0.00052704] | [0.0140972 0.02382043 0.03091336] |
+-------+------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
using time 60.1980s, training loss at epoch 176: 65.4615
using time 70.9444s, training loss at epoch 177: 64.9919
using time 70.6525s, training loss at epoch 178: 64.5915
using time 70.9288s, training loss at epoch 179: 64.3561
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| Epoch | training time(s) | tesing time(s) | Loss | recall | ndcg | precision | hit_ratio |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+
| 180 | 64.04352879524231 | 89.00319290161133 | 63.91488265991211 | [0.05463113 0.07675966 0.09184289] | [0.02508424 0.02981529 0.0326235 ] | [0.00323134 0.00229648 0.00184652] | [0.06280679 0.0883018 0.10556505] |
| 180 | 64.04352879524231 | 79.35301399230957 | 63.91488265991211 | [0.01333036 0.02204507 0.02827239] | [0.00537992 0.00720917 0.00835354] | [0.00071373 0.00060918 0.00053 ] | [0.01414153 0.02404208 0.03113502] |
+-------+-------------------+-------------------+-------------------+------------------------------------+------------------------------------+------------------------------------+------------------------------------+

Pretrained Models

Hi! First of all, thank you for the great work. I was wondering if you could provide the trained models for reproducing the results in the paper?

main.py: error: unrecognized arguments: --agg concat

Hi, I run the recommended code
python main.py --dataset ali --gnn ngcf --dim 64 --lr 0.0001 --batch_size 1024 --gpu_id 0 --context_hops 3 --agg concat --ns mixgcf --K 1 --n_negs 64

And received the issue:
main.py: error: unrecognized arguments: --agg concat

I checked the file /utils/parser.py and found that the --agg parameter is not declared there.

Is there anything I missed?

Thanks for your time and have a nice day.
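In case it helps others hitting the same error: a minimal sketch of declaring the missing flag in utils/parser.py. The default and help text here are assumptions for illustration, not the repo's actual options.

```python
import argparse

parser = argparse.ArgumentParser()
# Declare the flag the recommended command passes; the default/help are guesses.
parser.add_argument("--agg", type=str, default="concat",
                    help="aggregator for NGCF layer outputs (assumed)")

# Simulate the command-line usage from the issue above.
args = parser.parse_args(["--agg", "concat"])
print(args.agg)  # -> concat
```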

NameError: name 'test_user_set' is not defined

Hello! I've run into a problem.

  • [ Command used ] "python main.py --dataset yelp2018 --dim 64 --lr 0.001 --batch_size 2048 --gpu_id 0 --context_hops 3 --pool mean --ns mixgcf --K 1 --n_negs 64"

  • [ After the first training epoch completes successfully ] the program moves on to testing

  • [ Then an error occurs ] NameError: name 'test_user_set' is not defined

What adjustments should I make so the code runs smoothly? I'd like to study your code, but I couldn't even get the example command to run. This may not be a big problem for you, but I really can't work it out myself, and I sincerely hope to get your help!

PinSAGE code

Hi, will you release your PinSAGE implementation?

Inconsistent learning curve between the suggested config and the presented logs

Hi, I tried the suggested config to run the code as follows. However, the evaluation results differ from those reported in the logs folder. Was a different config used to generate the logs?

"python main.py --dataset ali --gnn ngcf --dim 64 --lr 0.0001 --batch_size 1024 --gpu_id 0 --context_hops 3 --pool concat --ns mixgcf --K 1 --n_negs 64"

Taking Recall@20 as an example, I got Recall@20 = 0.025 at the 10th epoch evaluated on the test data, while the uploaded logs show it reaching 0.05. Am I missing something important to reproduce the learning curve shown in the logs? :-)

dynamic negative sampling

I wanted to apply DNS to LightGCN, but I couldn't find any code for it. I noticed that you did it in your work. I was wondering if you implemented DNS by selecting the highest-scored negative items.
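For what it's worth, DNS is usually implemented exactly like that: draw a pool of candidate negatives, score them with the current model, and keep the hardest (highest-scored) one. A hedged PyTorch sketch, with illustrative tensor names rather than the repo's actual ones:

```python
import torch

def dns_select(user_emb, cand_neg_emb):
    """Dynamic negative sampling: keep the highest-scored candidate per user.

    user_emb:     [batch, dim]
    cand_neg_emb: [batch, n_candidates, dim]
    returns:      [batch, dim] hardest negative embedding per user
    """
    # Inner-product score of each candidate against its user.
    scores = torch.einsum("bd,bnd->bn", user_emb, cand_neg_emb)
    hardest = scores.argmax(dim=1)  # index of highest-scored candidate
    idx = hardest.view(-1, 1, 1).expand(-1, 1, cand_neg_emb.size(-1))
    return cand_neg_emb.gather(1, idx).squeeze(1)

u = torch.randn(4, 8)
negs = torch.randn(4, 16, 8)
print(dns_select(u, negs).shape)  # torch.Size([4, 8])
```

The hard negative then replaces the uniformly sampled one in the BPR loss.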

Code is inconsistent with the paper

Hi, thank you for sharing the code. However, it looks like the code is inconsistent with the paper: the paper does not use dropout, and the coefficient for positive mixing is vector-wise rather than element-wise. Could you please explain the inconsistency?
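To make the distinction concrete, here is a hedged sketch of the two positive-mixing variants (interpolating each candidate negative toward the positive with a random coefficient in (0, 1)). Which variant matches the paper versus the released code is exactly the question raised above; both are shown neutrally:

```python
import torch

def positive_mixing(pos_emb, neg_emb, elementwise=False):
    """Interpolate negatives toward the positive: e' = a*pos + (1-a)*neg.

    pos_emb: [batch, dim], neg_emb: [batch, n_negs, dim]
    elementwise=True  -> one coefficient per embedding dimension
    elementwise=False -> one coefficient per (user, negative) pair (vector-wise)
    """
    b, n, d = neg_emb.shape
    shape = (b, n, d) if elementwise else (b, n, 1)
    alpha = torch.rand(shape, device=neg_emb.device)
    return alpha * pos_emb.unsqueeze(1) + (1 - alpha) * neg_emb

pos = torch.randn(4, 8)
negs = torch.randn(4, 16, 8)
print(positive_mixing(pos, negs).shape)  # torch.Size([4, 16, 8])
```

In both variants each output element is a convex combination of the positive and the negative; only the granularity of the coefficient differs.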

MixGCF for large graph version?

Hi! Thank you for the great work. I was wondering whether there is a way to adapt this approach to large user-item graphs?

Epochs for training

Hi, I am curious about the number of training epochs for each dataset. What epoch settings were used for each dataset to produce the results shown in Table 2 of the paper?

Best regards,

Number of negatives

Hi, I have noticed that the provided configs all have K == 1. However, the paper states: "Note that when K = 1, the K-pair loss will degenerate into the BPR loss."

I am not sure whether that means the reported configs are not the best-performing ones, and whether higher performance should be expected if K is increased.
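For reference, a hedged sketch of how a softmax-style K-pair loss collapses to BPR when K = 1. This follows the standard formulations, not necessarily the repo's exact implementation:

```python
import torch
import torch.nn.functional as F

def k_pair_loss(pos_scores, neg_scores):
    """K-pair loss: negative log-softmax of the positive against K negatives.

    pos_scores: [batch], neg_scores: [batch, K]
    """
    logits = torch.cat([pos_scores.unsqueeze(1), neg_scores], dim=1)  # [batch, K+1]
    return -F.log_softmax(logits, dim=1)[:, 0].mean()

def bpr_loss(pos_scores, neg_scores):
    """Standard BPR loss: -log sigmoid(pos - neg); neg_scores: [batch]."""
    return -F.logsigmoid(pos_scores - neg_scores).mean()

pos = torch.randn(8)
neg = torch.randn(8, 1)
# With K = 1, -log softmax reduces to -log sigmoid(pos - neg), i.e. BPR.
print(torch.allclose(k_pair_loss(pos, neg), bpr_loss(pos, neg.squeeze(1)), atol=1e-6))  # True
```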
