Transformation Forests



Regression models for supervised learning problems with a continuous response are commonly understood as models for the conditional mean of the response given predictors. This notion is simple and therefore appealing for interpretation and visualisation. Information about the whole underlying conditional distribution is, however, not available from these models. A more general understanding of regression models as models for conditional distributions allows much broader inference from such models, for example the computation of prediction intervals. Several random forest-type algorithms aim at estimating conditional distributions, most prominently quantile regression forests (Meinshausen, 2006, JMLR). We propose a novel approach based on a parametric family of distributions characterised by their transformation function. A dedicated novel "transformation tree" algorithm able to detect distributional changes is developed. Based on these transformation trees, we introduce "transformation forests" as an adaptive local likelihood estimator of conditional distribution functions. The resulting predictive distributions are fully parametric yet very general and allow inference procedures, such as likelihood-based variable importances, to be applied in a straightforward way. The procedure allows general transformation models to be estimated without the necessity of a priori specifying the dependency structure of parameters. Applications include the computation of probabilistic forecasts, modelling differential treatment effects, and the derivation of counterfactual distributions for all types of response variables. An application to model-based survival forests will be discussed.
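To make the contrast with the cited baseline concrete, the following is a minimal sketch of Meinshausen-style quantile regression forest prediction intervals: a standard random forest is fit, and the conditional distribution at a new point is estimated from the training responses weighted by how often they share a leaf with that point. The dataset, hyperparameters, and the `conditional_quantiles` helper are illustrative assumptions, not code from the talk; this is the nearest-neighbour-weighting baseline, not the transformation forest method itself.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Toy regression data: y = sin(x) + Gaussian noise (illustrative only).
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(500, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=500)

forest = RandomForestRegressor(
    n_estimators=100, min_samples_leaf=10, random_state=0
).fit(X, y)

def conditional_quantiles(forest, X_train, y_train, x_new, quantiles):
    """Quantiles of the weighted empirical conditional distribution,
    with weights from leaf co-occurrence (Meinshausen, 2006)."""
    leaves_train = forest.apply(X_train)                 # (n_train, n_trees)
    leaves_new = forest.apply(x_new.reshape(1, -1))[0]   # (n_trees,)
    weights = np.zeros(len(y_train))
    for t in range(leaves_train.shape[1]):
        same = leaves_train[:, t] == leaves_new[t]
        weights[same] += 1.0 / same.sum()                # 1/leaf size per tree
    weights /= leaves_train.shape[1]                     # average over trees
    order = np.argsort(y_train)
    cdf = np.cumsum(weights[order])                      # estimated F(y | x_new)
    out = []
    for q in quantiles:
        idx = min(int(np.searchsorted(cdf, q)), len(cdf) - 1)
        out.append(float(y_train[order][idx]))
    return out

# A 90% prediction interval at x = 0.
lo, hi = conditional_quantiles(forest, X, y, np.array([0.0]), [0.05, 0.95])
```

The same weights define an estimate of the full conditional distribution function, which is the sense in which forests can go beyond conditional means; the transformation forests discussed in the talk instead plug local likelihood weights into a parametric transformation model.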

Technical Reports are available.

See also the Econometric Institute event page.