Fix agghoo for tree / rpart
[agghoo.git] / man / agghoo.Rd
% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/agghoo.R
\name{agghoo}
\alias{agghoo}
\title{agghoo}
\usage{
agghoo(data, target, task = NULL, gmodel = NULL, params = NULL, loss = NULL)
}
\arguments{
\item{data}{Data frame or matrix containing the data, with observations in rows.}

\item{target}{The target values to predict. Generally a vector,
but possibly a matrix in the case of "soft classification".}

\item{task}{"classification" or "regression". Default:
regression if target is numerical, classification otherwise.}

\item{gmodel}{A "generic model": a function which, given the tuple
(dataHO, targetHO, param), returns a prediction function taking X as its
only argument. 'HO' stands for 'Hold-Out', referring to cross-validation,
which is run over the array of 'param's given in the params argument.
A hypothetical example is sketched in the Details section.
Default: see R6::Model.}

\item{params}{A list of parameters. Often, one list cell is just a
numerical value, but in general it could be of any type.
Default: see R6::Model.}

\item{loss}{A function assessing the error of a prediction:
loss(y1, y2) compares predicted values y1 to known values y2 and
returns a real number (the error). Default: see R6::AgghooCV.}
}
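\details{
The gmodel, params and loss arguments follow a simple functional contract.
The sketch below only illustrates the expected shapes and is not part of the
package: the names treeGmodel, treeParams and l2loss are hypothetical, it
assumes the rpart package is installed, and the actual defaults live in
R6::Model and R6::AgghooCV.
\preformatted{
# Generic model: (dataHO, targetHO, param) --> prediction function of X
treeGmodel <- function(dataHO, targetHO, param) {
  df <- data.frame(dataHO, target = targetHO)
  fitted <- rpart::rpart(target ~ ., data = df,
                         control = rpart::rpart.control(cp = param))
  function(X) predict(fitted, as.data.frame(X))
}

# Parameters: one candidate value (here, a tree complexity) per list cell
treeParams <- as.list(c(0.001, 0.01, 0.1))

# Loss: compares a prediction y1 to known values y2, returns a real number
l2loss <- function(y1, y2) mean((y1 - y2)^2)

a <- agghoo(iris[, -c(2, 5)], iris[, 2],
            gmodel = treeGmodel, params = treeParams, loss = l2loss)
# then a$fit() and a$predict(newData), as in the Examples
}
}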
\value{
An R6::AgghooCV object o. Call o$fit() to run the procedure, then
o$predict(newData) to predict on new data.
}
\description{
Run the agghoo procedure (or standard cross-validation).
Arguments specify the list of models, their parameters and the
cross-validation settings, among others.
}
\examples{
# Regression:
a_reg <- agghoo(iris[,-c(2,5)], iris[,2])
a_reg$fit()
# Predict on a noisy copy of the training data
pr <- a_reg$predict(iris[,-c(2,5)] + rnorm(450, sd=0.1))
# Classification:
a_cla <- agghoo(iris[,-5], iris[,5])
a_cla$fit()
pc <- a_cla$predict(iris[,-5] + rnorm(600, sd=0.1))
}
\references{
Guillaume Maillard, Sylvain Arlot, Matthieu Lerasle. "Aggregated hold-out".
Journal of Machine Learning Research 22(20):1--55, 2021.
}