When GBDT is used to train a classifier, how is the residual computed?
Everyone knows that for a regression task GBDT's loss is (y - pred)^2, so the residual, residual = 2*(y - pred) (i.e., the negative gradient of the loss with respect to the prediction), is easy to understand. But when GBDT does a classification task, what does the residual look like? "Gradient Boosting attempts to solve this minimization problem numerically via steepest descent…"
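To make the regression case above concrete, here is a minimal sketch (my own illustration, not from the original question) showing that the "residual" fitted at each boosting step is just the negative gradient of the per-sample loss; the function name and sample values are made up for the example:

```python
import numpy as np

def squared_loss_residual(y, pred):
    """Negative gradient of L = (y - pred)^2 with respect to pred, i.e. 2*(y - pred)."""
    return 2.0 * (y - pred)

y = np.array([3.0, -0.5, 2.0])
pred = np.array([2.5, 0.0, 2.0])

print(squared_loss_residual(y, pred))   # [ 1. -1.  0.]

# Numerical check that this really is -dL/dpred (central difference):
eps = 1e-6
num_grad = ((y - (pred + eps)) ** 2 - (y - (pred - eps)) ** 2) / (2 * eps)
print(-num_grad)                        # matches 2*(y - pred) up to rounding
```

Under this view, the classification case would follow the same recipe with a different loss, but that is exactly what the question is asking to have spelled out.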