% Generated by roxygen2: do not edit by hand
% Please edit documentation in R/agghoo.R
\name{agghoo}
\alias{agghoo}
\title{agghoo}
\usage{
agghoo(data, target, task = NULL, gmodel = NULL, params = NULL, loss = NULL)
}
\arguments{
\item{data}{Data frame or matrix with observations in rows.}

\item{target}{The target values to predict. Generally a vector,
but possibly a matrix in the case of "soft classification".}

\item{task}{"classification" or "regression". Default: "regression"
if the target is numerical, "classification" otherwise.}

\item{gmodel}{A "generic model": a function mapping a tuple
(dataHO, targetHO, param) to a predict function (taking X as its only
argument), where 'HO' stands for 'Hold-Out', referring to
cross-validation. Cross-validation is run over an array of 'param'
values; see the params argument. Default: see R6::Model.}

\item{params}{A list of parameters. Often each list cell is just a
numerical value, but in general it can be of any type.
Default: see R6::Model.}

\item{loss}{A function assessing the error of a prediction.
Its arguments are y1 and y2, comparing a prediction to known values:
loss(y1, y2) --> real number (the error). Default: see R6::AgghooCV.}
}
\value{
An R6::AgghooCV object o. Call o$fit(), then o$predict(newData).
}
\description{
Run the agghoo procedure (or standard cross-validation).
Arguments specify the list of models, their parameters and the
cross-validation settings, among other things.
}
\examples{
# Regression:
a_reg <- agghoo(iris[,-c(2,5)], iris[,2])
a_reg$fit()
pr <- a_reg$predict(iris[,-c(2,5)] + rnorm(450, sd=0.1))
# Classification:
a_cla <- agghoo(iris[,-5], iris[,5])
a_cla$fit()
pc <- a_cla$predict(iris[,-5] + rnorm(600, sd=0.1))
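# Custom generic model and loss (illustrative sketch only: the mean
# predictor and absolute-error loss below are arbitrary choices, written
# to match the (dataHO, targetHO, param) and loss(y1, y2) signatures
# documented in the arguments above):
gmodel_mean <- function(dataHO, targetHO, param) {
  m <- mean(targetHO) # 'param' is unused in this trivial model
  function(X) rep(m, nrow(X))
}
loss_l1 <- function(y1, y2) mean(abs(y1 - y2))
a_cst <- agghoo(iris[,-c(2,5)], iris[,2], task="regression",
                gmodel=gmodel_mean, params=list(1), loss=loss_l1)
a_cst$fit()
pm <- a_cst$predict(iris[,-c(2,5)])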

}
\references{
Guillaume Maillard, Sylvain Arlot, Matthieu Lerasle. "Aggregated hold-out".
Journal of Machine Learning Research 22(20):1--55, 2021.
}