tune.splsda and tune.multilevel

Issue #44 resolved
Kim-Anh Le Cao repo owner created an issue
  • see my code attached; make sure tune.splsda outputs the same parameters as my code (mean error, sd of the error, error rate per class, PER keepX and PER comp)

  • tune.multilevel (code attached): could you please help me implement parallelization or a more efficient way to run my code, as it is taking forever? (note: might not be in mixOmics yet)

  • we need to make sure all the 'tune' functions are consistent, i.e. if you put a repeat argument in tune.splsda then tune.multilevel must have a repeat too.

  • add already tested in tune.splsda

  • add sd in error.per.class
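The parallelization requested above could be sketched as follows. This is a minimal illustration, not the mixOmics implementation: `cross_validate_once` is a hypothetical stand-in for one full cross-validation repeat of the tuning loop, and the repeats (which are independent of one another) are the natural unit of parallelism.

```r
library(parallel)

# Hypothetical stand-in for one cross-validation repeat of the tuning loop;
# in tune.multilevel this would split the folds, fit the model for each
# candidate keepX, and return the error rate per keepX.
cross_validate_once <- function(rep.id, test.keepX) {
  set.seed(rep.id)                          # distinct but reproducible repeats
  sapply(test.keepX, function(k) runif(1))  # placeholder error rates
}

nrepeat    <- 10
test.keepX <- c(5, 10, 15)

# Forking is not available on Windows, so fall back to a single core there
n.cores <- if (.Platform$OS.type == "windows") 1 else max(1, detectCores() - 1)

# One forked process per repeat
errors <- mclapply(seq_len(nrepeat), cross_validate_once,
                   test.keepX = test.keepX, mc.cores = n.cores)

# length(test.keepX) x nrepeat matrix of error rates
mat.error <- do.call(cbind, errors)
```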
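The mean error / sd error / error-per-class outputs requested above amount to aggregating one error matrix per repeat. A minimal sketch, with illustrative class and keepX names (not the mixOmics internals):

```r
# One error matrix per repeat: rows = classes, columns = keepX values
error.per.repeat <- list(
  matrix(c(0.70, 0.50, 0.20, 0.60, 0.40, 0.10), nrow = 3,
         dimnames = list(c("A", "B", "C"), c("keepX=5", "keepX=10"))),
  matrix(c(0.80, 0.60, 0.30, 0.50, 0.30, 0.20), nrow = 3,
         dimnames = list(c("A", "B", "C"), c("keepX=5", "keepX=10")))
)

# Stack into a 3-D array: class x keepX x repeat
err.array <- simplify2array(error.per.repeat)

error.per.class    <- apply(err.array, c(1, 2), mean)  # mean over repeats
error.per.class.sd <- apply(err.array, c(1, 2), sd)    # sd over repeats
```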

Comments (12)

  1. Florian Rohart

    related to that: we need to do something about NAs in the predict functions, because we take them into account in plsda/splsda but not in predict...
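One common way to handle NAs at prediction time, sketched here with a plain `lm` fit rather than mixOmics internals: predict only on complete rows and return NA for the others, so the output preserves the original row order.

```r
# Illustrative new data with missing values in some rows
X.new <- data.frame(x1 = c(1.2, NA, 0.3), x2 = c(0.5, 0.9, NA))

ok   <- complete.cases(X.new)        # rows with no missing values
pred <- rep(NA_real_, nrow(X.new))   # placeholder prediction for every row

# Any fitted model would do; lm is used here purely for illustration
set.seed(1)
fit <- lm(y ~ x1 + x2,
          data = data.frame(y = rnorm(10), x1 = rnorm(10), x2 = rnorm(10)))

pred[ok] <- predict(fit, newdata = X.new[ok, , drop = FALSE])
```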

  2. Kim-Anh Le Cao reporter

    Issues to solve with tune.splsda (and probably tune.multilevel; I haven't had a look at that one).

    the method.predict argument should be called dist (consistent with tune.multilevel)

    the list.keepX argument should be called test.keepX (consistent with tune.multilevel)

    cat('comp', comp.real, nrep, sep = '-', '\t') is not needed, but you could add a progress bar controlled by a progress = TRUE/FALSE input (then do the same for tune.multilevel)

    Problem with the outputs when nrepeat = 1:

    $error.per.class.sd
                           comp 1 comp 2 comp 3
    Left_Antecubital_fossa     NA     NA     NA
    Stool                      NA     NA     NA
    Subgingival_plaque         NA     NA     NA

    $mat.sd.error
       comp 1 comp 2 comp 3
    10     NA     NA     NA

    here the numbers are identical across all repeats - are the folds re-drawn at random for each repeat?

    $error.per.class
    $error.per.class1
                                [,1]      [,2]      [,3]      [,4]      [,5]
    Left_Antecubital_fossa 0.7592593 0.7592593 0.7592593 0.7592593 0.7592593
    Stool                  0.5370370 0.5370370 0.5370370 0.5370370 0.5370370
    Subgingival_plaque     0.1296296 0.1296296 0.1296296 0.1296296 0.1296296

    Same for $mat.error.final across each repeat:

    $mat.error.final1
            [,1]      [,2]      [,3]      [,4]      [,5]
    10 0.4753086 0.4753086 0.4753086 0.4753086 0.4753086
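Two small facts of base R explain both symptoms above, sketched here outside of the mixOmics code. First, `sd()` of a single value is NA, which is why every `*.sd` output is NA when nrepeat = 1. Second, if the columns are identical across repeats, the fold assignment is probably not being re-drawn; each repeat should call `sample()` again.

```r
# sd() needs at least two values; with nrepeat = 1 there is only one,
# so the standard deviation is NA (with a warning)
single.repeat.sd <- sd(0.4753086)  # NA

# Each repeat should re-draw the fold assignment so the errors differ:
n     <- 12
folds <- 3
assignment1 <- sample(rep(1:folds, length.out = n))  # repeat 1
assignment2 <- sample(rep(1:folds, length.out = n))  # repeat 2
# With overwhelming probability assignment1 != assignment2, giving
# different error rates per repeat.
```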

  3. Kim-Anh Le Cao reporter

    Seems to work for tune.splsda. For the outputs, however, mat.error.final should be per component but should also report ALL the errors across ALL the list.keepX values; that is, mat.error.finalcomp = mat.error from my script. This is for users who want to see the evolution of the error rates across all possible keepX values per component.

    Please check that the same applies to tune.multilevel.

  4. fbartolo

    Hi,

    I am sorry, but is this not already the case?

    $mat.error.final
    $mat.error.final1
           [,1]    [,2]
    5  0.671875 0.71875
    10 0.671875 0.71875
    15 0.671875 0.71875

    $mat.error.final2
           [,1]     [,2]
    5  0.234375 0.265625
    10 0.234375 0.265625
    15 0.265625 0.296875

    The suffixes 1 and 2 are the components, the columns [,1] and [,2] are the repeats, and 5, 10 and 15 are the keepX values.

  5. Kim-Anh Le Cao reporter

    I get this output across 5 repeats, when testing list.keepX = seq(10, 50, 100):

    And so for each list item I should get a matrix of size length(list.keepX) * nrepeat.

    $mat.error.final
    $mat.error.final1
            [,1] [,2]      [,3]      [,4]      [,5]
    10 0.4506173  0.5 0.5185185 0.4753086 0.4382716

    $mat.error.final2
           [,1]     [,2]     [,3]     [,4]      [,5]
    10 0.191358 0.191358 0.191358 0.191358 0.1604938

    $mat.error.final3
            [,1]     [,2]       [,3]       [,4]      [,5]
    10 0.1049383 0.117284 0.09876543 0.08024691 0.1419753
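One possible explanation for why only the row keepX = 10 appears above: the third positional argument of `seq()` is the step size (`by`), not the length, so `seq(10, 50, 100)` never gets past its starting point and returns a single value.

```r
# Step of 100 overshoots the upper bound after the first element:
seq(10, 50, 100)
# [1] 10

# A grid from 10 to 50 in steps of 10:
seq(10, 50, 10)
# [1] 10 20 30 40 50
```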
