Record of the code runs and their results (updated Aug 30th)

Just keeping a record of the running tasks; I have already begun to get confused...

All of these are based on the files on the cluster:

Ongoing:

Job 32415: failed (because of a bus error...)

Job 32416: failed

Job 32419:

Job 32622: 1000, triple, NYC, radius_in = 5, radius_out = 10 (see the sketch below for how I read these radii)

Job 32623: 50000, triple, NYC, radius_in = 5, radius_out = 10

Job 32625: all, triple, NYC, radius_in = 5, radius_out = 10

Job 32825: all, binary, NYC, radius = 1

Job 32826: all, binary, NYC, radius = 2

Job 32829: all, binary, NYC, radius = 3

Job 32830: 50000, binary, NYC, radius = 3

Job 32832: radius = 0.5, binary, NYC, 5000 check-ins (failed)

Job 32833: radius = 0.5, binary, NYC, 100000 check-ins

Job 32835: radius = 0.5, binary, NYC, 50000 check-ins
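
A note to myself on what the radii above mean, since I keep mixing them up. The sketch below is only my shorthand reading, not the actual model code: in the binary runs a check-in counts as positive when it lies within radius km of the user's centre and negative otherwise; in the triple runs, points inside radius_in are positive, points beyond radius_out are negative, and the ring in between is dropped. The km unit and the function names are my own.

```python
# Sketch only: my reading of radius / radius_in / radius_out, not the cluster code.
# The km unit and the function names are assumptions.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def binary_label(center, poi, radius=1.0):
    # positive (1) inside the radius, negative (0) outside
    return 1 if haversine_km(*center, *poi) <= radius else 0

def triple_label(center, poi, radius_in=1.0, radius_out=5.0):
    d = haversine_km(*center, *poi)
    if d <= radius_in:
        return 1       # positive
    if d >= radius_out:
        return -1      # negative
    return None        # inside the ring: left out of training

print(triple_label((40.75, -73.98), (40.755, -73.985)))  # ~0.7 km from the centre -> 1
```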

Job 33035: triple, radius_in = 1, radius_out = 5, all, th_center = 0.2 (cancelled)

Job 33235: triple, radius_in = 1, radius_out = 5, 10000 check-ins, th_center = 0.2

Job 33236: triple, radius_in = 1, radius_out = 5, 50000 check-ins, th_center = 0.2

Job 33738: triple, radius_in = 1, radius_out = 5, 100000 check-ins, th_center = 0.2

Job 33754: pre_data running for the NYC whole dataset, train_test_split random_state = 42, radius_in = 1, radius_out = 5 (see the split sketch below)

Job 33755: pre_data running for the TKY whole dataset, train_test_split random_state = 42, radius_in = 1, radius_out = 5

Job 33756: pre_data running for the NYC & TKY whole dataset, train_test_split random_state = 42, shuffle random_state = 26, radius_in = 1, radius_out = 5
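
For my own reference: a minimal sketch of how the two random states (shuffle 26, split 42) could be applied, assuming sklearn's train_test_split; the variable names and the 0.2 test size are placeholders, not taken from pre_data.

```python
# Minimal sketch, not the actual pre_data script; test_size and names are placeholders.
import numpy as np
from sklearn.model_selection import train_test_split

samples = np.arange(1000)                  # stand-in for the preprocessed check-in samples

rng = np.random.RandomState(26)            # shuffle random_state = 26
samples = samples[rng.permutation(len(samples))]

train, test = train_test_split(samples, test_size=0.2, random_state=42)  # split seed 42
print(len(train), len(test))               # 800 200
```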

Job 33772: pre_data running for NYC 100000 check-ins, random states 26/42, radius_in = 1

Job 33773: pre_data running for NYC 500000 check-ins, random states 26/42, radius_in = 1

Job 33774: pre_data running for NYC 50000 check-ins, random states 26/42, radius_in = 1

Job 33775: pre_data running for NYC_TKY 50000 check-ins, random states 26/42, radius_in = 1

Job 33802: joint model running using file import, triple, NYC, all, radius_in = 0, radius_out = 1 (failed)

Job 33797: joint model, NYCTKY 50000, radius_center = 1, radius_in = 0, radius_out = 5

Job 33808: joint model, NYC 50000, radius_in = 0, radius_out = 1, random states 26/42

Job 33812: triple, random states 26/42, radius_in = 1, radius_out = 5, all, th_center = 0.2

Job 33818: pre_data, NYCTKY 100000, radius_center = 1, random states 26/42

Job 33837: global, original, 1 CPU, without evaluation, 1576 s

Job 33845: global without evaluation, 8 CPUs

Job 33843: pre_data, NYC, 5000

Job 33858: pre_data, global, 100000, radius_center = 1

Job 33856: pre_data, global, 50000, radius_center = 1

Job 33851: pre_data, global, 10000, radius_center = 1

Job 33838: whole global evaluation, 8 CPUs

Job 33872: pre_data, global, 500000, radius_center = 1

Job 33911: NYCTKY 50000, center = 1, in = 1, out = 5

Job 33912: NYCTKY 100000, center = 1, in = 1, out = 5

Jobs 33907, 33908: global dataset, center 15, pre_data

Completed: all data in NYC, center for user, corrected new model (binary, positive and negative)

Jobs 31019 and 31020: binary user model, all check-ins, AUC 0.526
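
The 0.526 is an AUC score; a toy sketch of how such a number is computed, assuming sklearn's roc_auc_score (the labels and scores below are made up):

```python
# Toy AUC example; y_true / y_score stand in for the test labels and model scores.
import numpy as np
from sklearn.metrics import roc_auc_score

y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0])                      # visited / not visited
y_score = np.array([0.9, 0.4, 0.35, 0.8, 0.5, 0.3, 0.6, 0.55])   # predicted scores

print("AUC:", roc_auc_score(y_true, y_score))                    # 0.5 would be random guessing
```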

Jobs 30606, 30607, 30609: all NYC data with different radii, binary model

Job 30603: 10000 check-ins in NYC, center for user, corrected new model (binary, positive and negative)

Job 28347: preprocessing the data with the changes to the optimized code and the cut data. Original data: NYC; cut data: 50000/227427 check-ins, 943 users, 15983 POIs
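
A rough sketch of one way such a cut and the counts could be produced with pandas; the file name, the column names, and the take-the-first-rows choice are my assumptions, not the actual preprocessing code.

```python
# Rough sketch only; file name and column names are assumptions.
import pandas as pd

COLUMNS = ["user_id", "venue_id", "venue_category_id", "venue_category",
           "lat", "lon", "tz_offset", "utc_time"]

checkins = pd.read_csv("dataset_TSMC2014_NYC.txt", sep="\t",
                       names=COLUMNS, encoding="latin-1")

cut = checkins.head(50000)                    # keep the first 50000 check-ins
print(len(cut), "/", len(checkins))           # cut size vs. full size
print(cut["user_id"].nunique(), "users")      # distinct users in the cut
print(cut["venue_id"].nunique(), "POIs")      # distinct POIs in the cut
```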

Job 28311: preprocessing the data my own way, randomly choosing 400 centers.

Job 28301: preprocessing the data the way I thought of, i.e. replacing the non-negative entries with 0.1.

Job 28327: preprocessing the data with changes to the code, but the code was not optimized...

Job 28345: preprocessing the whole dataset with the optimized, changed code (failed when using  )

Job 28345: test of different code versions, with and without cython.parallel (the results showed not much difference)

Job 28450: with 1000 check-ins of the NYC data, the results turned out to be good!

Job 28455: new model, with all the check-ins in NYC, expected to take a long time

Job 28458: 50000 check-ins from the NYC dataset, new model

Job 28462: max_sampled = 20, all data; not expecting a good result, as the experiment on the small dataset showed little promise (see the MAX_SAMPLED sketch further down)

Job 28499: MAX_SAMPLED = 10, RADIUS = 2, CHECK_NUM = all

Job 28497: MAX_SAMPLED = 10, RADIUS = 28, CHECK_NUM = all

Job 28492: MAX_SAMPLED = 30, RADIUS = 10, CHECK_NUM = 5000

Job 28490: MAX_SAMPLED = 30, RADIUS = 10, all check-ins

Job 28493: MAX_SAMPLED = 10, RADIUS = 10, all check-ins

Job 28494: MAX_SAMPLED = 10, RADIUS = 2, all check-ins
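
Sketch of the MAX_SAMPLED idea in the runs above, under the assumption that it caps how many negative POIs are drawn per positive check-in; the helper below is illustrative only, not the real sampling code.

```python
# Illustrative only: assumes MAX_SAMPLED caps the negatives drawn per positive check-in.
import random

def sample_negatives(all_pois, visited, max_sampled=10, seed=None):
    """Draw up to max_sampled POIs that the user has not visited."""
    rng = random.Random(seed)
    candidates = [p for p in all_pois if p not in visited]
    rng.shuffle(candidates)
    return candidates[:max_sampled]

pois = [f"poi_{i}" for i in range(100)]
print(sample_negatives(pois, visited={"poi_1", "poi_7"}, max_sampled=10, seed=0))
```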

Job 28479: max_sampled = 10, using the pre@k evaluation with k = 10, check-ins = 10000
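
A small sketch of the pre@k metric (read here as precision at k) with k = 10; the ranking and the visited set are toy values, not real data.

```python
# Toy precision@k: fraction of the top-k recommended POIs the user actually visited.
def precision_at_k(ranked_pois, relevant_pois, k=10):
    top_k = ranked_pois[:k]
    hits = sum(1 for poi in top_k if poi in relevant_pois)
    return hits / k

ranked = [7, 3, 11, 25, 2, 9, 40, 8, 1, 5, 17]   # POIs sorted by predicted score
visited = {3, 9, 5, 60}                          # held-out POIs for this user
print(precision_at_k(ranked, visited, k=10))     # 3 hits in the top 10 -> 0.3
```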

 
