**GUIDE Classification and Regression Trees and Forests (version 29.7)**

### © Wei-Yin Loh 1997-2018

GUIDE is a multi-purpose machine learning algorithm for constructing classification and regression trees. It is designed and maintained by Wei-Yin Loh at the University of Wisconsin-Madison. GUIDE stands for *Generalized, Unbiased, Interaction Detection and Estimation.*

Development of GUIDE is supported in part by research grants from the U.S. Army Research Office, National Science Foundation, National Institutes of Health, Bureau of Labor Statistics, and Eli Lilly. Work on precursors of GUIDE was additionally supported by IBM and Pfizer.

**Properties and features:**

- Choice of classification or regression trees
- Negligible bias in split variable selection
- Importance ranking and identification of unimportant variables
- Power to detect local interactions between pairs of predictor variables
- Ability to use ordered (continuous) and unordered (categorical) predictor variables
- Automatic handling of missing values, including splits on missingness
- Automatic prediction for new (unseen) samples
- Choice of weighted least squares (Gaussian), least median of squares, Poisson, quantile (including median), proportional hazards, or multi-response (e.g., longitudinal) regression tree models
- Choice of piecewise constant, best simple polynomial, multiple, or stepwise linear regression models
- Choice of roles for predictor variables (splitting only, node modeling only, both, or none)
- Choice of using categorical variables for splitting only or both splitting and fitting through dummy 0-1 vectors (ANCOVA)
- Choice of stopping rules: no pruning, pruning by cross-validation, or pruning with a test sample
- Choice of batch or interactive mode of operation
- On-the-fly generation of products and powers of predictor variables as regressor variables
- Generation of LaTeX (MiKTeX for Windows) source code for the tree diagrams in PostScript (.ps) format. The LaTeX code requires the PSTricks package, which is included in most LaTeX distributions; the PSTricks User Guide and the TUG India document are excellent PSTricks references. The .ps files may be converted to PDF with ps2pdf (which comes with Ghostscript) or to enhanced Windows metafile (.emf) format with pstoedit; .emf is best for use in Word and PowerPoint documents. For a short introduction to LaTeX, look here.
- Generation of R source code for prediction of future cases
- Free executables for Windows, Macintosh, and Linux computers
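The .ps-to-.pdf and .ps-to-.emf conversions mentioned above can be sketched as follows. This is an illustrative snippet, not part of GUIDE itself: `tree.ps` is a placeholder name for the PostScript file produced from GUIDE's LaTeX output, and the two converters are the Ghostscript-bundled `ps2pdf` and the separately installed `pstoedit`.

```shell
# Create a minimal stand-in for the PostScript tree diagram
# (in practice this file comes from latex + dvips on GUIDE's output).
cat > tree.ps <<'EOF'
%!PS-Adobe-3.0
%%EOF
EOF

# PostScript -> PDF via Ghostscript's ps2pdf, if installed:
command -v ps2pdf >/dev/null && ps2pdf tree.ps tree.pdf

# PostScript -> enhanced Windows metafile via pstoedit, if installed:
command -v pstoedit >/dev/null && pstoedit -f emf tree.ps tree.emf

ls tree.*
```

The resulting `tree.pdf` embeds directly in LaTeX documents, while `tree.emf` pastes cleanly into Word and PowerPoint.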

See Table 1 for a feature comparison between GUIDE and other classification tree algorithms.

See Table 2 for a feature comparison between GUIDE and other regression tree algorithms.

**Documentation:**

- Loh, W.-Y., Man, M. and Wang, S. (2019), Subgroups from regression trees with adjustment for prognostic effects and post-selection inference, *Statistics in Medicine*, in press. DOI
- Loh, W.-Y., Eltinge, J., Cho, M. and Li, Y. (2019), Classification and regression trees and forests for incomplete data from sample surveys, *Statistica Sinica*, in press.
- Loh, W.-Y., Fu, H., Man, M., Champion, V. and Yu, M. (2016), Identification of subgroups with differential treatment effects for longitudinal and multiresponse variables, *Statistics in Medicine*, vol. 35, 4837-4855. DOI
- Loh, W.-Y., He, X., and Man, M. (2015), A regression tree approach to identifying subgroups with differential treatment effects, *Statistics in Medicine*, vol. 34, 1818-1833. DOI
- Loh, W.-Y. (2014), Fifty years of classification and regression trees (with discussion), *International Statistical Review*, vol. 82, 329-370. DOI
- Loh, W.-Y. and Zheng, W. (2013), Regression trees for longitudinal and multiresponse data, *Annals of Applied Statistics*, vol. 7, 496-522. DOI
- Loh, W.-Y. (2012), Variable selection for classification and regression in large p, small n problems, *Lecture Notes in Statistics: Proceedings*, A. Barbour, H.P. Chan and D. Siegmund (Eds.), vol. 205, Springer, pp. 133-157.
- Loh, W.-Y. (2011), Classification and regression trees, *Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery*, vol. 1, 14-23. DOI
- Loh, W.-Y. (2010), Tree-structured classifiers, *Wiley Interdisciplinary Reviews: Computational Statistics*, vol. 2, 364-369. DOI
- Loh, W.-Y. (2009), Improving the precision of classification trees, *Annals of Applied Statistics*, vol. 3, 1710-1737. DOI [The definitive reference for GUIDE classification.]
- Loh, W.-Y. (2008), Classification and regression tree methods, *Encyclopedia of Statistics in Quality and Reliability*, F. Ruggeri, R. Kenett, and F. W. Faltin (Eds.), Wiley, pp. 315-323.
- Loh, W.-Y. (2008), Regression by parts: Fitting visually interpretable models with GUIDE, *Handbook of Computational Statistics, vol. III*, 447-469, Springer.
- Loh, W.-Y., Chen, C.-W., and Zheng, W. (2007), Extrapolation errors in linear model trees, *ACM Transactions on Knowledge Discovery from Data*, vol. 1, issue 2, article 6. DOI
- Kim, H., Loh, W.-Y., Shih, Y.-S., and Chaudhuri, P. (2007), Visualizable and interpretable regression models with good prediction power, *IIE Transactions*, vol. 39, issue 6, pp. 565-579. DOI. Datasets
- Loh, W.-Y. (2006), Regression tree models for designed experiments, *Second Lehmann Symposium, Institute of Mathematical Statistics Lecture Notes-Monograph Series*, vol. 49, 210-228.
- Loh, W.-Y. (2002), Regression trees with unbiased variable selection and interaction detection, *Statistica Sinica*, vol. 12, 361-386. [The definitive reference for GUIDE regression.]
- Chaudhuri, P. and Loh, W.-Y. (2002), Nonparametric estimation of conditional quantiles using quantile regression trees, *Bernoulli*, vol. 8, 561-576.
- Chaudhuri, P., Lo, W.-D., Loh, W.-Y., and Yang, C.-C. (1995), Generalized regression trees, *Statistica Sinica*, vol. 5, 641-666.
- Chaudhuri, P., Huang, M.-C., Loh, W.-Y., and Yao, R. (1994), Piecewise-polynomial regression trees, *Statistica Sinica*, vol. 4, 143-167.
- Loh, W.-Y., and Vanichsetakul, N. (1988), Tree-structured classification via generalized discriminant analysis (with discussion), *Journal of the American Statistical Association*, vol. 83, 715-728. [This is the article that started it all.]

**(Mostly) third-party applications of GUIDE, QUEST, CRUISE, and LOTUS:** Look here.

**GUIDE compiled binaries:** The following executable files may be freely distributed but not sold for profit.

- guide.gz for 64-bit Linux (compiled with Intel Fortran Compiler 18.0.1, CentOS release 6.9). Puts scratch files in TMPDIR if the environment variable is defined, otherwise in the current folder. Faster on Intel processors.
- guide.gz for 64-bit Linux (compiled with NAG Fortran 6.2 (Chiyoda) Build 6207, CentOS release 6.9). Puts scratch files in TMPDIR if the environment variable is defined, otherwise in /tmp. Faster on AMD processors.
- guide.gz for 64-bit Linux (compiled with GFortran 5.4.0, Ubuntu 16.04). Puts scratch files in TMPDIR if the environment variable is defined, otherwise in /tmp.
- guide.gz for Mac OS X High Sierra 10.13.6 (compiled with NAG Fortran 6.1)
- guide.gz for Mac OS X High Sierra 10.13.6 (compiled with GFortran 8.1; requires GFortran to be installed--see manual)
- guide.zip for 64-bit Windows (compiled with Intel 64 16.0.3 and Visual Studio 2013 in Windows 10)
- guide.zip for 64-bit Windows (compiled with GFortran 6.3.0 in MSYS2 and Windows 10 Pro)
- guide.zip for 32-bit Windows (compiled with IA-32 16.0.3 and Visual Studio 2013 in Windows 10)
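On Linux or macOS, a downloaded binary is typically prepared as sketched below. This is an illustrative setup, not an official installer: the file name `guide.gz` matches the downloads above, but the scratch folder is an example, and the TMPDIR behavior is the one described for the Linux builds.

```shell
# Point GUIDE's scratch files at a folder of your choosing via TMPDIR
# (the Linux builds honor TMPDIR when it is defined).
export TMPDIR="${TMPDIR:-/tmp/guide-scratch}"
mkdir -p "$TMPDIR"

# Decompress the downloaded binary and mark it executable
# (these steps are skipped here if the file has not been downloaded).
[ -f guide.gz ] && gunzip guide.gz
[ -f guide ] && chmod +x guide

echo "scratch files will go to $TMPDIR"
```

GUIDE can then be started interactively with `./guide`, or run in batch mode with redirected input, per the interactive/batch feature noted above; see the manual for details.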

- GUIDE manual: guideman.pdf
- Manual example data files: datafiles.zip
- GUIDE revision history: history.txt

**Earlier algorithms developed by Wei-Yin Loh and his students:**

- QUEST: binary classification tree
- CRUISE: classification tree that splits each node into two or more subnodes
- LOTUS: logistic regression tree

**License:**

Copyright (c) 1997-2018 Wei-Yin Loh. All rights reserved.

Redistribution and use in binary forms, with or without modification, are permitted provided that the following condition is met:

Redistributions in binary form must reproduce the above copyright notice, this condition and the following disclaimer in the documentation and/or other materials provided with the distribution.

THIS SOFTWARE IS PROVIDED BY WEI-YIN LOH "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL WEI-YIN LOH BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

The views and conclusions contained in the software and documentation are those of the author and should not be interpreted as representing official policies, either expressed or implied, of the University of Wisconsin.

Last modified: November 30, 2018 by Wei-Yin Loh