Chapter 13 References

Agresti, Alan. 2002. Categorical Data Analysis. Vol. 482. John Wiley & Sons.
Ah-Pine, Julien, and Xinyu Wang. 2016. “Similarity Based Hierarchical Clustering with an Application to Text Collections.” In Advances in Intelligent Data Analysis XV: 15th International Symposium, IDA 2016, Stockholm, Sweden, October 13-15, 2016, Proceedings 15, 320–31. Springer.
Albert, Jim. 2009. Bayesian Computation with R. Springer.
Allaire, JJ, and François Chollet. 2021. Keras: R Interface to ’Keras’.
Altman, Douglas G. 1990. Practical Statistics for Medical Research. CRC Press.
Amos, DE, and WG Bulgren. 1972. “Computation of a Multivariate F Distribution.” Mathematics of Computation 26 (117): 255–64.
Anderson, Edgar. 1935. “The Irises of the Gaspé Peninsula.” Bull. Am. Iris Soc. 59: 2–5.
Arruda, Marcelo Leme de. 2000. “Poisson, Bayes, Futebol e DeFinetti.” PhD thesis, Universidade de São Paulo.
Asimov, Daniel. 1985. “The Grand Tour: A Tool for Viewing Multidimensional Data.” SIAM Journal on Scientific and Statistical Computing 6 (1): 128–43.
Bang-Jensen, Jørgen, and Gregory Z Gutin. 2008. Digraphs: Theory, Algorithms and Applications. Springer Science & Business Media.
Bauer, Jan O, and Bernhard Drabant. 2021. “Principal Loading Analysis.” Journal of Multivariate Analysis 184: 104754.
Bayes, Thomas. 1763. “An Essay Towards Solving a Problem in the Doctrine of Chances. By the Late Rev. Mr. Bayes, F.R.S. Communicated by Mr. Price, in a Letter to John Canton, A.M.F.R.S.” Philosophical Transactions of the Royal Society of London, no. 53: 370–418.
Belsley, David A, Edwin Kuh, and Roy E Welsch. 2004. Regression Diagnostics: Identifying Influential Data and Sources of Collinearity. John Wiley & Sons.
Beran, Rudolf, and Gilles R Ducharme. 1991. Asymptotic Theory for Bootstrap Methods in Statistics.
Berg, Sven. 1996. “Condorcet’s Jury Theorem and the Reliability of Majority Voting.” Group Decision and Negotiation 5 (3): 229–38.
Bishop, Christopher M. 1999. “Bayesian PCA.” In Advances in Neural Information Processing Systems, 382–88.
Blei, David M, Alp Kucukelbir, and Jon D McAuliffe. 2017. “Variational Inference: A Review for Statisticians.” Journal of the American Statistical Association 112 (518): 859–77.
Blei, David M, Andrew Y Ng, and Michael I Jordan. 2003. “Latent Dirichlet Allocation.” Journal of Machine Learning Research 3 (Jan): 993–1022.
Box, George EP. 1949. “A General Distribution Theory for a Class of Likelihood Criteria.” Biometrika 36 (3/4): 317–46.
———. 1950. “Problems in the Analysis of Growth and Wear Curves.” Biometrics 6 (4): 362–89.
———. 1979. “Robustness in the Strategy of Scientific Model Building.” In Robustness in Statistics, 201–36. Elsevier.
Box, George EP, and George C Tiao. 1973. Bayesian Inference in Statistical Analysis. John Wiley & Sons.
Breiman, Leo. 1996a. “Bagging Predictors.” Machine Learning 24 (2): 123–40.
———. 1996b. “Bias, Variance, and Arcing Classifiers.” University of California, Berkeley.
———. 1996c. “Heuristics of Instability and Stabilization in Model Selection.” The Annals of Statistics 24 (6): 2350–83.
———. 2001. “Random Forests.” Machine Learning 45 (1): 5–32.
Breiman, Leo, Jerome Friedman, Charles J Stone, and Richard A Olshen. 1984. Classification and Regression Trees. CRC Press.
Bryson Jr, Arthur E, Walter F Denham, and Stewart E Dreyfus. 1963. “Optimal Programming Problems with Inequality Constraints I: Necessary Conditions for Extremal Solutions.” AIAA Journal 1 (11): 2544–50.
Buchta, Christian, and Michael Hahsler. 2019. Cba: Clustering for Business Analytics.
Canty, Angelo, and B. D. Ripley. 2022. boot: Bootstrap R (S-Plus) Functions.
Cardot, Hervé. 2021. Gmedian: Geometric Median, k-Medians Clustering and Robust Median PCA.
Cardot, Hervé, Peggy Cénac, and Jean-Marie Monnez. 2012. “A Fast and Recursive Algorithm for Clustering Large Datasets with k-Medians.” Computational Statistics & Data Analysis 56 (6): 1434–49.
Cardot, Hervé, Peggy Cénac, Pierre-André Zitt, et al. 2013. “Efficient and Fast Estimation of the Geometric Median in Hilbert Spaces with an Averaged Stochastic Gradient Algorithm.” Bernoulli 19 (1): 18–43.
Census, United States Bureau of the. 1975. Statistical Abstract of the United States, 1975. Hoover’s.
Chambers, John M., and Trevor J. Hastie. 1993. Statistical Models in S. Chapman & Hall, London.
Chatterjee, Samprit, and Ali S Hadi. 2012. Regression Analysis by Example. John Wiley & Sons.
Cody, William J. 1988. “Algorithm 665: Machar: A Subroutine to Dynamically Determined Machine Parameters.” ACM Transactions on Mathematical Software (TOMS) 14 (4): 303–11.
Condorcet, Marquis de. 1785. Essai Sur l’application de l’analyse à La Probabilité Des Décisions Rendues à La Pluralité Des Voix. Paris, Imprimerie Royale.
Cortes, Corinna, and Vladimir Vapnik. 1995. “Support-Vector Networks.” Machine Learning 20 (3): 273–97.
Cortez, Paulo, and Aníbal de Jesus Raimundo Morais. 2007. “A Data Mining Approach to Predict Forest Fires Using Meteorological Data.”
Cramer, Jan Salomon. 2002. “The Origins of Logistic Regression.”
Cybenko, George. 1989. “Approximation by Superpositions of a Sigmoidal Function.” Mathematics of Control, Signals and Systems 2 (4): 303–14.
Dalgaard, Peter. 2008. Introductory Statistics with R. Springer New York.
Davison, Anthony Christopher, and David Victor Hinkley. 1997. Bootstrap Methods and Their Application. Cambridge University Press.
Drost, HG. 2018. “Philentropy: Information Theory and Distance Quantification with R.” Journal of Open Source Software 3 (26): 765.
Dua, Dheeru, and Casey Graff. 2019. “UCI Machine Learning Repository.” University of California, Irvine, School of Information and Computer Sciences.
Duda, Richard O, Peter E Hart, and David G Stork. 2001. Pattern Classification. John Wiley & Sons, Inc.
Efron, Bradley. 1979. “Bootstrap Methods: Another Look at the Jackknife.” Annals of Statistics 7 (1): 1–26.
Efron, Bradley, and Trevor Hastie. 2016. Computer Age Statistical Inference: Algorithms, Evidence, and Data Science. Cambridge University Press. Cambridge, UK.
Everitt, Brian S, and Anders Skrondal. 2006. The Cambridge Dictionary of Statistics. Cambridge University Press.
Ezekiel, Mordecai. 1930. Methods of Correlation Analysis. John Wiley & Sons.
Far From Alaska. 2014. “Rolling Dice.”
Farrar, Donald E, and Robert R Glauber. 1967. “Multicollinearity in Regression Analysis: The Problem Revisited.” The Review of Economics and Statistics, 92–107.
Fisher, N. I., and P. Switzer. 1985. “Chi-Plots for Assessing Dependence.” Biometrika 72 (2): 253–65.
———. 2001. “Graphical Assessment of Dependence: Is a Picture Worth 100 Tests?” The American Statistician 55 (3): 233–39.
Fisher, Ronald A. 1922. “On the Mathematical Foundations of Theoretical Statistics.” Philosophical Transactions of the Royal Society of London. Series A, Containing Papers of a Mathematical or Physical Character 222 (594-604): 309–68.
———. 1936. “The Use of Multiple Measurements in Taxonomic Problems.” Annals of Eugenics 7 (2): 179–88.
———. 1938. “The Statistical Utilization of Multiple Measurements.” Annals of Eugenics 8 (4): 376–86.
———. 1940. “The Precision of Discriminant Functions.” Annals of Eugenics 10 (1): 422–29.
Fokoue, E. 2020. “Speaker Accent Recognition Data Set.” University of California, Irvine, School of Information and Computer Sciences.
Forgy, Edward W. 1965. “Cluster Analysis of Multivariate Data: Efficiency Versus Interpretability of Classifications.” Biometrics 21: 768–69.
Freund, Yoav, and Robert E Schapire. 1997. “A Decision-Theoretic Generalization of on-Line Learning and an Application to Boosting.” Journal of Computer and System Sciences 55 (1): 119–39.
Freund, Yoav, and Robert E. Schapire. 1996. “Experiments with a New Boosting Algorithm.” In ICML, 96:148–56. Citeseer.
Friedman, Jerome H, and Werner Stuetzle. 1981. “Projection Pursuit Regression.” Journal of the American Statistical Association 76 (376): 817–23.
Friedman, Jerome H, and John W Tukey. 1974. “A Projection Pursuit Algorithm for Exploratory Data Analysis.” IEEE Transactions on Computers 100 (9): 881–90.
Friedman, Nir, and Daphne Koller. 2003. “Being Bayesian about Network Structure. A Bayesian Approach to Structure Discovery in Bayesian Networks.” Machine Learning 50: 95–125.
Friendly, Michael, John Fox, and Phil Chalmers. 2021. Matlib: Matrix Functions for Teaching and Learning Linear Algebra and Multivariate Statistics.
Gamerman, Dani, and Hedibert F Lopes. 2006. Markov Chain Monte Carlo: Stochastic Simulation for Bayesian Inference. CRC Press.
Gelfand, Alan E, and Adrian FM Smith. 1990. “Sampling-Based Approaches to Calculating Marginal Densities.” Journal of the American Statistical Association 85 (410): 398–409.
Geman, Stuart, and Donald Geman. 1984. “Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images.” IEEE Transactions on Pattern Analysis and Machine Intelligence, no. 6: 721–41.
Genz, Alan. 1992. “Numerical Computation of Multivariate Normal Probabilities.” Journal of Computational and Graphical Statistics 1 (2): 141–49.
Genz, Alan, and Frank Bretz. 2009. Computation of Multivariate Normal and t Probabilities. Lecture Notes in Statistics. Heidelberg: Springer-Verlag.
Genz, Alan, Frank Bretz, Tetsuhisa Miwa, Xuefei Mi, Friedrich Leisch, Fabian Scheipl, and Torsten Hothorn. 2021. mvtnorm: Multivariate Normal and t Distributions.
Glur, Christoph. 2020. Data.tree: General Purpose Hierarchical Data Structure.
González, Ignacio, and Sébastien Déjean. 2021. CCA: Canonical Correlation Analysis.
Goodfellow, Ian, Yoshua Bengio, and Aaron Courville. 2016. Deep Learning. MIT Press.
Guha, Sudipto, Rajeev Rastogi, and Kyuseok Shim. 2000. “ROCK: A Robust Clustering Algorithm for Categorical Attributes.” Information Systems 25 (5): 345–66.
Gunst, Richard F, and John T Webster. 1973. “Density Functions of the Bivariate Chi-Square Distribution.” Journal of Statistical Computation and Simulation 2 (3): 275–88.
Hand, David J, and Keming Yu. 2001. “Idiot’s Bayes—Not so Stupid After All?” International Statistical Review 69 (3): 385–98.
Hartigan, John A. 1975. Clustering Algorithms. John Wiley & Sons, Inc.
Hartigan, John A, and Manchek A Wong. 1979. “Algorithm AS 136: A k-Means Clustering Algorithm.” Journal of the Royal Statistical Society. Series C (Applied Statistics) 28 (1): 100–108.
Harvey, Andrew C, and Simon Peters. 1990. “Estimation Procedures for Structural Time Series Models.” Journal of Forecasting 9 (2): 89–108.
Hastie, Trevor, and Robert Tibshirani. 1986. “Generalized Additive Models.” Statistical Science 1 (3): 297–318.
Hastie, Trevor, Robert Tibshirani, and Jerome Friedman. 2009. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer Science & Business Media.
Hastings, W Keith. 1970. “Monte Carlo Sampling Methods Using Markov Chains and Their Applications.” Biometrika 57 (1): 97–109.
Hinton, Geoffrey. 2022. “The Forward-Forward Algorithm: Some Preliminary Investigations.” arXiv Preprint arXiv:2212.13345.
Holgate, Philip. 1964. “Estimation for the Bivariate Poisson Distribution.” Biometrika 51 (1-2): 241–87.
Hornik, Kurt, Christian Buchta, and Achim Zeileis. 2009. “Open-Source Machine Learning: R Meets Weka.” Computational Statistics 24 (2): 225–32.
Hornik, Kurt, Maxwell Stinchcombe, and Halbert White. 1989. “Multilayer Feedforward Networks Are Universal Approximators.” Neural Networks 2 (5): 359–66.
Hotelling, Harold. 1931. “The Generalization of Student’s Ratio.” The Annals of Mathematical Statistics 2 (3): 360–78.
———. 1933. “Analysis of a Complex of Statistical Variables into Principal Components.” Journal of Educational Psychology 24 (6): 417.
———. 1935. “The Most Predictable Criterion.” Journal of Educational Psychology 26 (2): 139.
Huang, Zhexue. 1997. “A Fast Clustering Algorithm to Cluster Very Large Categorical Data Sets in Data Mining.” DMKD 3 (8): 34–39.
Husson, Francois, Julie Josse, Sebastien Le, and Jeremy Mazet. 2023. FactoMineR: Multivariate Exploratory Data Analysis and Data Mining.
Hyndman, Rob. 2020. Fpp2: Data for "Forecasting: Principles and Practice" (2nd Edition).
———. 2021. Fpp3: Data for "Forecasting: Principles and Practice" (3rd Edition).
Hyndman, Rob J, and George Athanasopoulos. 2021. Forecasting: Principles and Practice, 3rd Ed. OTexts.
Imdad, M. U., M. Aslam, S. Altaf, and A. Munir. 2019. “Some New Diagnostics of Multicollinearity in Linear Regression Model.” Sains Malaysiana 48 (9): 2051–60.
Izbicki, Rafael, and Tiago Mendonça dos Santos. 2020. Aprendizado de Máquina: Uma Abordagem Estatística.
Jaynes, Edwin T. 1963. “Information Theory and Statistical Mechanics (Notes by the Lecturer).” Statistical Physics 3, 181.
Johnson, Norman Lloyd, Samuel Kotz, and Narayanaswamy Balakrishnan. 1997. Discrete Multivariate Distributions. John Wiley & Sons, Inc.
Johnson, Richard A., and Dean W. Wichern. 1998. Applied Multivariate Statistical Analysis. Prentice Hall Upper Saddle River, New Jersey.
Jones, M Chris, and Robin Sibson. 1987. “What Is Projection Pursuit?” Journal of the Royal Statistical Society: Series A (General) 150 (1): 1–18.
Jordan, Camille. 1875. “Essai Sur La géométrie à \(n\) Dimensions.” Bulletin de La Société Mathématique de France 3: 103–74.
Kaufman, L., and P. J. Rousseeuw. 1987. “Clustering by Means of Medoids.”
Kent, JT, John Bibby, and KV Mardia. 1979. Multivariate Analysis. Academic Press Amsterdam.
Kohavi, Ron. 1996. “Scaling up the Accuracy of Naive-Bayes Classifiers: A Decision-Tree Hybrid.” In Proceedings of the Second International Conference on Knowledge Discovery and Data Mining, 96:202–7.
Korb, Kevin B, and Ann E Nicholson. 2011. Bayesian Artificial Intelligence. 2nd ed. CRC Press.
Kotz, Samuel, N Balakrishnan, and Norman L Johnson. 2000. Continuous Multivariate Distributions, Vol. 1: Models and Applications. Wiley-Interscience, New York.
Kotz, Samuel, and Saralees Nadarajah. 2004. Multivariate t-Distributions and Their Applications. Cambridge University Press.
Kuhn, Max. 2022. Caret: Classification and Regression Training.
Kuhn, Max, and Ross Quinlan. 2021. C50: C5.0 Decision Trees and Rule-Based Models.
Kullback, Solomon. 1978. Information Theory and Statistics. Dover Publications, Inc.
Kullback, Solomon, and Richard A Leibler. 1951. “On Information and Sufficiency.” The Annals of Mathematical Statistics 22 (1): 79–86.
Lee, Eun-Kyung. 2018. “PPtreeViz: An R Package for Visualizing Projection Pursuit Classification Trees.” Journal of Statistical Software 83 (8): 1–30.
Legendre, Pierre, and Louis Legendre. 2012. Numerical Ecology. Elsevier.
Lehmann, Erich L, and Joseph P Romano. 2005. Testing Statistical Hypotheses, 3rd Edition. Springer.
Lloyd, Stuart P. 1957. “Least Squares Quantization in PCM.” Technical note, Bell Laboratories, 1957; published later in IEEE Transactions on Information Theory 28 (2): 129–37 (1982).
Maathuis, Marloes, Mathias Drton, Steffen Lauritzen, and Martin Wainwright. 2018. Handbook of Graphical Models. CRC Press.
MacQueen, James. 1967. “Some Methods for Classification and Analysis of Multivariate Observations.” In Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability, 1:281–97. 14. Oakland, CA, USA.
McCullagh, Peter, and John Ashworth Nelder. 1989. Generalized Linear Models. 2nd ed. Chapman & Hall, London.
McCulloch, Warren S., and Walter Pitts. 1943. “A Logical Calculus of the Ideas Immanent in Nervous Activity.” The Bulletin of Mathematical Biophysics 5 (4): 115–33.
McNeil, Donald R. 1977. Interactive Data Analysis: A Practical Primer. John Wiley & Sons.
Menzel, Uwe. 2022. CCP: Significance Tests for Canonical Correlation Analysis (CCA).
Metropolis, Nicholas, Arianna W Rosenbluth, Marshall N Rosenbluth, Augusta H Teller, and Edward Teller. 1953. “Equation of State Calculations by Fast Computing Machines.” The Journal of Chemical Physics 21 (6): 1087–92.
Metropolis, Nicholas, and Stanislaw Ulam. 1949. “The Monte Carlo Method.” Journal of the American Statistical Association 44 (247): 335–41.
Meyer, Paul L. 1970. Introductory Probability and Statistical Applications. 2nd ed. Addison-Wesley Publishing Company.
Milborrow, Stephen. 2020. Rpart.plot: Plot ’Rpart’ Models: An Enhanced Version of ’Plot.rpart’.
Murtagh, Fionn, and Pedro Contreras. 2011. “Methods of Hierarchical Clustering.” CoRR abs/1105.0121.
Murtagh, Fionn, and Pierre Legendre. 2014. “Ward’s Hierarchical Agglomerative Clustering Method: Which Algorithms Implement Ward’s Criterion?” Journal of Classification 31: 274–95.
Nagarajan, Radhakrishnan, Marco Scutari, and Sophie Lèbre. 2013. “Bayesian Networks in R.” Springer 122: 125–27.
Nason, Guy P. 2001. “Robust Projection Indices.” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 63 (3): 551–67.
———. 2006. “On the Sum of t and Gaussian Random Variables.” Statistics & Probability Letters 76 (12): 1280–86.
Neter, John, Michael H Kutner, Christopher J Nachtsheim, and William Wasserman. 2005. Applied Linear Statistical Models. 5th ed. McGraw Hill/Irwin New York.
O’Hagan, Anthony, and Jonathan Forster. 1994. Kendall’s Advanced Theory of Statistics, Volume 2B: Bayesian Inference.
Ossani, Paulo Cesar, and Marcelo Angelo Cirillo. 2021. Projection Pursuit.
Patel, Jagdish K., and Campbell B. Read. 1982. Handbook of the Normal Distribution. Marcel Dekker, New York.
Paula, Gilberto Alvarenga. 2013. Modelos de Regressão: Com Apoio Computacional. IME-USP São Paulo.
Paulino, Carlos Daniel Mimoso, Maria Antónia Amaral Turkman, and Bento Murteira. 2003. Estatística Bayesiana. Fundação Calouste Gulbenkian, Lisboa.
Pearl, Judea. 1988. Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann Publishers, San Mateo, CA.
Pearson, Karl. 1901. “On Lines and Planes of Closest Fit to Systems of Points in Space.” Philosophical Magazine 2 (11): 559–72.
Peng, Roger D. 2022. Advanced Statistical Computing. Work in Progress.
Peters, Andrea, and Torsten Hothorn. 2021. Ipred: Improved Predictors.
Pettitt, Anthony N, and Candice M Hincksman. 2013. “Introduction to MCMC.” Case Studies in Bayesian Statistical Modelling and Analysis, 17–29.
Pourzanjani, Arya A, Richard M Jiang, and Linda R Petzold. 2017. “Improving the Identifiability of Neural Networks for Bayesian Inference.” In NIPS Workshop on Bayesian Deep Learning, 4:29.
Quinlan, J. Ross. 1993. C4.5: Programs for Machine Learning. Morgan Kaufmann Publishers, Inc., San Mateo, CA.
Ripley, Brian D. 2007. Pattern Recognition and Neural Networks. Cambridge University Press.
———. 2019. Tree: Classification and Regression Trees.
Robert, Christian P, and George Casella. 2010. Introducing Monte Carlo Methods with R. Springer.
Hastie, Trevor, Robert Tibshirani, Friedrich Leisch, Kurt Hornik, and Brian D. Ripley. 2020. Mda: Mixture and Flexible Discriminant Analysis. Balasubramanian Narasimhan contributed to upgrading the code.
Rosenblatt, Frank. 1958. “The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain.” Psychological Review 65 (6): 386.
———. 1962. Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms. Spartan Books, Washington, DC.
Rousseeuw, Peter J, and L Kaufman. 1990. “Finding Groups in Data.” Hoboken: Wiley Online Library.
Rumelhart, David E, Geoffrey E Hinton, and Ronald J Williams. 1985. “Learning Internal Representations by Error Propagation.” Institute for Cognitive Science, University of California, San Diego.
Schapire, Robert E., and Yoav Freund. 1995. “A Decision-Theoretic Generalization of on-Line Learning and an Application to Boosting.” In Second European Conference on Computational Learning Theory, 23–37.
Scutari, Marco. 2010. “Learning Bayesian Networks with the bnlearn R Package.” Journal of Statistical Software 35 (3): 1–22.
Shannon, Claude E. 1948. “A Mathematical Theory of Communication.” The Bell System Technical Journal 27 (3): 379–423.
Sokal, Robert R, and F James Rohlf. 1962. “The Comparison of Dendrograms by Objective Methods.” Taxon 11 (2): 33–40.
Sparapani, Rodney, Charles Spanbauer, and Robert McCulloch. 2021. “Nonparametric Machine Learning and Efficient Computation with Bayesian Additive Regression Trees: The BART r Package.” Journal of Statistical Software 97 (1): 1–66.
Spedicato, Giorgio Alfredo. 2017. “Discrete Time Markov Chains with R.” The R Journal, July.
Tibshirani, Robert, and Friedrich Leisch. 2019. Bootstrap: Functions for the Book "An Introduction to the Bootstrap". S original from StatLib.
Stasinopoulos, D. Mikis, and Robert A. Rigby. 2007. “Generalized Additive Models for Location Scale and Shape (GAMLSS) in R.” Journal of Statistical Software 23 (7): 1–46.
Statisticat, LLC. 2021. LaplacesDemon: Complete Environment for Bayesian Inference. vignette("BayesianInference").
Tattar, Prabhanjan N., Suresh Ramaiah, and Bangalore G. Manjunath. 2016. A Course in Statistics with R. John Wiley & Sons.
Taylor, Sean J, and Benjamin Letham. 2018. “Forecasting at Scale.” The American Statistician 72 (1): 37–45.
Taylor, Sean, and Ben Letham. 2021. Prophet: Automatic Forecasting Procedure.
Therneau, Terry, and Beth Atkinson. 2019. Rpart: Recursive Partitioning and Regression Trees.
Thorndike, Edward Lee. 1939. Your City. Harcourt Brace College Publishers.
Thorndike, Robert L. 1953. “Who Belongs in the Family?” Psychometrika 18 (4): 267–76.
Tibshirani, Robert, Guenther Walther, and Trevor Hastie. 2001. “Estimating the Number of Clusters in a Data Set via the Gap Statistic.” Journal of the Royal Statistical Society: Series B (Statistical Methodology) 63 (2): 411–23.
Tong, Yung L. 1990. The Multivariate Normal Distribution. Springer-Verlag New York Inc.
Van Der Maaten, Laurens, Eric Postma, Jaap Van den Herik, et al. 2009. “Dimensionality Reduction: A Comparative Review.” Journal of Machine Learning Research 10 (66-71): 13.
Vapnik, Vladimir. 1995. The Nature of Statistical Learning Theory. 1st ed. Springer Science & Business Media.
———. 2000. The Nature of Statistical Learning Theory. 2nd ed. Springer Science & Business Media.
Venables, W. N., and B. D. Ripley. 2002. Modern Applied Statistics with S. Fourth. New York: Springer.
Wang, Zhendong, and Xingzhong Xu. 2020. “Testing High Dimensional Covariance Matrices via Posterior Bayes Factor.” Journal of Multivariate Analysis 181: 104674.
Ward Jr, Joe H. 1963. “Hierarchical Grouping to Optimize an Objective Function.” Journal of the American Statistical Association 58 (301): 236–44.
Wechsler, Sergio. 1993. “Exchangeability and Predictivism.” Erkenntnis, 343–50.
Weihs, Claus, Uwe Ligges, Karsten Luebke, and Nils Raabe. 2005. “klaR Analyzing German Business Cycles.” In Data Analysis and Decision Support, edited by D. Baier, R. Decker, and L. Schmidt-Thieme, 335–43. Berlin: Springer-Verlag.
Whittaker, Joe. 1990. Graphical Models in Applied Multivariate Statistics. Wiley Publishing.
Wickramasinghe, Indika, and Harsha Kalutarage. 2020. “Naive Bayes: Applications, Variations and Vulnerabilities: A Review of Literature with Code Snippets for Implementation.” Soft Computing 25 (3): 2277–93.
Wilkinson, GN, and CE Rogers. 1973. “Symbolic Description of Factorial Models for Analysis of Variance.” Journal of the Royal Statistical Society. Series C (Applied Statistics) 22 (3): 392–99.
Wishart, John. 1928. “The Generalised Product Moment Distribution in Samples from a Normal Multivariate Population.” Biometrika 20A (1-2): 32–52.
Witten, Ian H., and Eibe Frank. 2005. Data Mining: Practical Machine Learning Tools and Techniques. 2nd ed. San Francisco: Morgan Kaufmann.
Woods, Hubert, Harold H Steinour, and Howard R Starke. 1932. “Effect of Composition of Portland Cement on Heat Evolved During Hardening.” Industrial & Engineering Chemistry 24 (11): 1207–14.
Zabala, F. J. 2009. “Desempate Técnico.” PhD thesis, USP - Universidade de São Paulo.
Zhang, Harry. 2004. “The Optimality of Naive Bayes.” American Association for Artificial Intelligence 1 (2): 3.
Zhang, Huajie, and Charles X Ling. 2001. “Learnability of Augmented Naive Bayes in Nominal Domains.” A A 1 (2): 3.