r - mlr - Ensemble Models


The mlr package is great, and the idea of the ModelMultiplexer helps a lot. However, the ModelMultiplexer "selects" a single model out of the models it is given.

Is there support, or planned support, for creating a bagged or boosted ensemble of the individual models?

bls = list(
  makeLearner("classif.ksvm"),
  makeLearner("classif.randomForest")
)
lrn = makeModelMultiplexer(bls)
ps = makeModelMultiplexerParamSet(lrn,
  makeNumericParam("sigma", lower = -10, upper = 10, trafo = function(x) 2^x),
  makeIntegerParam("ntree", lower = 1L, upper = 500L))

> print(res)
Tune result:
Op. pars: selected.learner=classif.randomForest; classif.randomForest.ntree=197
mmce.test.mean=0.0333
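(For reference, a tune result like the one printed above can be obtained by running tuneParams over the multiplexer's parameter set. The exact call is not shown in the question; the resampling and search strategy below are arbitrary choices for illustration only.)

library(mlr)
rdesc = makeResampleDesc("CV", iters = 3L)
ctrl = makeTuneControlRandom(maxit = 50L)
# tunes both the learner choice and the per-learner hyperparameters
res = tuneParams(lrn, iris.task, rdesc, par.set = ps, control = ctrl)
print(res)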

You have a few options in mlr. If you have a single model, you can use the BaggingWrapper:

lrn = makelearner("classif.part") bag.lrn = makebaggingwrapper(lrn, bw.iters = 50, bw.replace = true, bw.size = 0.8, bw.feats = 3/4) 

More details on this are in the tutorial.

For several learners, you can use stacking:

base.learners = list(
  makeLearner("classif.ksvm"),
  makeLearner("classif.randomForest")
)
lrn = makeStackedLearner(base.learners, super.learner = NULL, predict.type = NULL,
  method = "stack.nocv", use.feat = FALSE, resampling = NULL,
  parset = list())

You can combine the predictions of the base learners using different methods, including fitting another learner on top of them (see the sketch below). You can also combine this with bagging the individual learners.
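For example, a minimal sketch of stacking with a super learner fitted on cross-validated base-learner predictions (it reuses the base.learners list from above; the choice of classif.rpart as super learner, the CV setup, and iris.task are illustrative assumptions, not part of the original answer):

stacked.lrn = makeStackedLearner(base.learners,
  super.learner = "classif.rpart",
  method = "stack.cv",
  resampling = makeResampleDesc("CV", iters = 5L))
mod = train(stacked.lrn, iris.task)
pred = predict(mod, task = iris.task)
performance(pred, measures = mmce)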

Boosting is supported in a number of the learners that mlr integrates; see the list of learners.
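For instance, a boosted learner is requested like any other (a sketch; classif.gbm needs the gbm package, and the parameter values are arbitrary):

boost.lrn = makeLearner("classif.gbm", n.trees = 200, interaction.depth = 2)
# sonar.task ships with mlr and is a binary classification task
mod = train(boost.lrn, sonar.task)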

