References

1. E. H. L. Aarts and P. J. M. van Laarhoven. Statistical cooling: A general approach to combinatorial optimization problems. Philips Journal of Research, 40:193–226, 1985.

2. M. Abramowitz and I. A. Stegun, editors. Handbook of Mathematical Functions. National Bureau of Standards Applied Mathematics Series, No. 55. U.S. Government Printing Office, Washington, DC, 1964.

3. I. S. Abramson. On bandwidth variation in kernel estimates—a square root law. Annals of Statistics, 10:1217–1223, 1982.

4. D. Ackley. A Connectionist Machine for Genetic Hillclimbing. Kluwer, Boston, 1987.

5. D. H. Ackley. An empirical study of bit vector function optimization. In L. Davis, editor. Genetic Algorithms and Simulated Annealing. Morgan Kaufmann, Los Altos, CA, 1987.

6. R. P. Agarwal, M. Meehan, and D. O'Regan. Fixed Point Theory and Applications. Cambridge University Press, Cambridge, 2001.

7. H. Akaike. Information theory and an extension of the maximum likelihood principle. In B. N. Petrov and F. Csáki, editors. Proceedings of the Second International Symposium on Information Theory. Akadémiai Kiadó, Budapest, 1973.

8. J. T. Alander. On optimal population size of genetic algorithms. In Proceedings of CompEuro 92, pages 65–70. IEEE Computer Society Press, The Hague, The Netherlands, 1992.

9. M. Aldrin. Moderate projection pursuit regression for multivariate response data. Computational Statistics and Data Analysis, 21:501–531, 1996.

10. N. S. Altman. An introduction to kernel and nearest-neighbor nonparametric regression. The American Statistician, 46:175–185, 1992.

11. C. M. Anderson and R. P. Labelle. Update of comparative occurrence rates for offshore oil spills. Spill Science and Technology Bulletin, 6:303–321, 2000.

12. C. Andrieu and J. Thoms. A tutorial on adaptive MCMC. Statistics and Computing, 18(4):343–373, 2008.

13. J. Antonisse. A new interpretation of schema notation that overturns the binary encoding constraint. In J. D. Schaffer, editor. Proceedings of the 3rd International Conference on Genetic Algorithms. Morgan Kaufmann, Los Altos, CA, 1989.

14. L. Armijo. Minimization of functions having Lipschitz-continuous first partial derivatives. Pacific Journal of Mathematics, 16:1–3, 1966.

15. K. Arms and P. S. Camp. Biology, 4th ed. Saunders College Publishing, Fort Worth, TX, 1995.

16. Y. Atchadé, G. Fort, E. Moulines, and P. Priouret. Adaptive Markov chain Monte Carlo: Theory and methods. In Bayesian Time Series Models. Cambridge University Press, Cambridge, UK, 2011.

17. T. Bäck. Evolutionary Algorithms in Theory and Practice. Oxford University Press, New York, 1996.

18. J. E. Baker. Adaptive selection methods for genetic algorithms. In J. J. Grefenstette, editor. Proceedings of an International Conference on Genetic Algorithms and Their Applications. Lawrence Erlbaum Associates, Hillsdale, NJ, 1985.

19. J. E. Baker. Reducing bias and inefficiency in the selection algorithm. In J. J. Grefenstette, editor. Proceedings of the 2nd International Conference on Genetic Algorithms and Their Applications. Lawrence Erlbaum Associates, Hillsdale, NJ, 1987.

20. S. G. Baker. A simple method for computing the observed information matrix when using the EM algorithm with categorical data. Journal of Computational and Graphical Statistics, 1:63–76, 1992.

21. D. L. Banks, R. T. Olszewski, and R. Maxion. Comparing methods for multivariate nonparametric regression. Communications in Statistics—Simulation and Computation, 32:541–571, 2003.

22. G. A. Barnard. Discussion of paper by M. S. Bartlett. Journal of the Royal Statistical Society, Series B, 25:294, 1963.

23. O. E. Barndorff-Nielsen and D. R. Cox. Inference and Asymptotics. Chapman & Hall, London, 1994.

24. R. R. Barton and J. S. Ivey Jr. Nelder–Mead simplex modifications for simulation optimization. Management Science, 42:954–973, 1996.

25. L. E. Baum, T. Petrie, G. Soules, and N. Weiss. A maximization technique occurring in the statistical analysis of probabilistic functions of Markov chains. Annals of Mathematical Statistics, 41:164–171, 1970.

26. R. Beran. Prepivoting to reduce level error of confidence sets. Biometrika, 74:457–468, 1987.

27. R. Beran. Prepivoting test statistics: A bootstrap view of asymptotic refinements. Journal of the American Statistical Association, 83:687–697, 1988.

28. J. O. Berger. Statistical Decision Theory: Foundations, Concepts, and Methods. Springer, New York, 1980.

29. J. O. Berger and M.-H. Chen. Predicting retirement patterns: Prediction for a multinomial distribution with constrained parameter space. The Statistician, 42(4):427–443, 1993.

30. N. Bergman. Posterior Cramér-Rao bounds for sequential estimation. In A. Doucet, N. de Freitas, and N. Gordon, editors. Sequential Monte Carlo Methods in Practice. Springer, New York, 2001.

31. N. Bergman, L. Ljung, and F. Gustafsson. Terrain navigation using Bayesian statistics. IEEE Control Systems, 19:33–40, 1999.

32. A. Berlinet and L. Devroye. A comparison of kernel density estimates. Publications de l'Institute de Statistique de l'Université de Paris, 38:3–59, 1994.

33. D. Bertsimas and J. Tsitsiklis. Simulated annealing. Statistical Science, 8:10–15, 1993.

34. C. Berzuini, N. G. Best, W. R. Gilks, and C. Larizza. Dynamic conditional independence models and Markov chain Monte Carlo methods. Journal of the American Statistical Association, 87:493–500, 1997.

35. J. Besag. Spatial interaction and the statistical analysis of lattice systems (with discussion). Journal of the Royal Statistical Society, Series B, 36:192–236, 1974.

36. J. Besag. On the statistical analysis of dirty pictures (with discussion). Journal of the Royal Statistical Society, Series B, 48:259–302, 1986.

37. J. Besag. Comment on “Representations of knowledge in complex systems” by Grenander and Miller. Journal of the Royal Statistical Society, Series B, 56:591–592, 1994.

38. J. Besag and P. Clifford. Generalized Monte Carlo significance tests. Biometrika, 76:633–642, 1989.

39. J. Besag and P. Clifford. Sequential Monte Carlo p-values. Biometrika, 78:301–304, 1991.

40. J. Besag, P. Green, D. Higdon, and K. Mengersen. Bayesian computation and stochastic systems (with discussion). Statistical Science, 10:3–66, 1995.

41. J. Besag and P. J. Green. Spatial statistics and Bayesian computation. Journal of the Royal Statistical Society, Series B, 55(1):25–37, 1993.

42. J. Besag and C. Kooperberg. On conditional and intrinsic autoregressions. Biometrika, 82:733–746, 1995.

43. J. Besag, J. York, and A. Mollié. Bayesian image restoration, with two applications in spatial statistics (with discussion). Annals of the Institute of Statistical Mathematics, 43:1–59, 1991.

44. N. Best, S. Cockings, J. Bennett, J. Wakefield, and P. Elliott. Ecological regression analysis of environmental benzene exposure and childhood leukaemia: Sensitivity to data inaccuracies, geographical scale and ecological bias. Journal of the Royal Statistical Society, Series A, 164(1):155–174, 2001.

45. N. G. Best, R. A. Arnold, A. Thomas, L. A. Waller, and E. M. Conlon. Bayesian models for spatially correlated disease and exposure data. In J. O. Berger, J. M. Bernardo, A. P. Dawid, D. V. Lindley, and A. F. M. Smith, editors. Bayesian Statistics 6, pages 131–156. Oxford University Press, Oxford, 1999.

46. R. J. H. Beverton and S. J. Holt. On the Dynamics of Exploited Fish Populations, volume 19 of Fisheries Investment Series 2. UK Ministry of Agriculture and Fisheries, London, 1957.

47. P. J. Bickel and D. A. Freedman. Some asymptotics for the bootstrap. Annals of Statistics, 9:1196–1217, 1981.

48. C. Biller. Adaptive Bayesian regression splines in semiparametric generalized linear models. Journal of Computational and Graphical Statistics, 9(1):122–140, 2000.

49. P. Billingsley. Probability and Measure, 3rd ed. Wiley, New York, 1995.

50. C. M. Bishop. Neural Networks for Pattern Recognition. Oxford University Press, Oxford, UK, 1995.

51. C. M. Bishop, editor. Neural Networks and Machine Learning. Springer, 1998.

52. F. Black and M. Scholes. The pricing of options and corporate liabilities. Journal of Political Economy, 81:637–654, 1973.

53. C. L. Blake and C. J. Merz. UCI Repository of Machine Learning Databases, University of California, Irvine, Dept. of Information and Computer Sciences. Available from http://www.ics.uci.edu/~mlearn/MLRepository.html, 1998.

54. L. B. Booker. Improving search in genetic algorithms. In L. Davis, editor. Genetic Algorithms and Simulated Annealing. Morgan Kaufmann, Los Altos, CA, 1987.

55. D. L. Borchers, S. T. Buckland, and W. Zucchini. Estimating Animal Abundance. Springer, London, 2002.

56. A. Bowman. An alternative method of cross-validation for the smoothing of density estimates. Biometrika, 71:353–360, 1984.

57. G. E. P. Box and D. R. Cox. An analysis of transformations. Journal of the Royal Statistical Society, Series B, 26:211–246, 1964.

58. G. E. P. Box, W. G. Hunter, and J. S. Hunter. Statistics for Experimenters. Wiley, New York, 1978.

59. P. Boyle, M. Broadie, and P. Glasserman. Monte Carlo methods for security pricing. Journal of Economic Dynamics and Control, 21:1267–1321, 1997.

60. R. A. Boyles. On the convergence of the EM algorithm. Journal of the Royal Statistical Society, Series B, 45:47–50, 1983.

61. C. J. A. Bradshaw, R. J. Barker, R. G. Harcourt, and L. S. Davis. Estimating survival and capture probability of fur seal pups using multistate mark–recapture models. Journal of Mammalogy, 84(1):65–80, 2003.

62. C. J. A. Bradshaw, C. Lalas, and C. M. Thompson. Cluster of colonies in an expanding population of New Zealand fur seals (Arctocephalus forsteri). Journal of Zoology, 250:41–51, 2000.

63. L. Breiman. Bagging predictors. Machine Learning, 24:123–140, 1996.

64. L. Breiman and J. H. Friedman. Estimating optimal transformations for multiple regression and correlation (with discussion). Journal of the American Statistical Association, 80:580–619, 1985.

65. L. Breiman, J. H. Friedman, R. A. Olshen, and C. J. Stone. Classification and Regression Trees. Wadsworth, Boca Raton, FL, 1984.

66. L. Breiman, W. Meisel, and E. Purcell. Variable kernel estimates of multivariate densities. Technometrics, 19:135–144, 1977.

67. P. Brémaud. Markov Chains: Gibbs Fields, Monte Carlo Simulation, and Queues. Springer, New York, 1999.

68. R. P. Brent. Algorithms for Minimization without Derivatives. Prentice-Hall, Englewood Cliffs, NJ, 1973.

69. N. E. Breslow and D. G. Clayton. Approximate inference in generalized linear mixed models. Journal of the American Statistical Association, 88:9–25, 1993.

70. S. P. Brooks. Markov chain Monte Carlo method and its application. The Statistician, 47:69–100, 1998.

71. S. P. Brooks and A. Gelman. General methods for monitoring convergence of iterative simulations. Journal of Computational and Graphical Statistics, 7:434–455, 1998.

72. S. P. Brooks and P. Giudici. Markov chain Monte Carlo convergence assessment via two-way analysis of variance. Journal of Computational and Graphical Statistics, 9(2): 266–285, 2000.

73. S. P. Brooks, P. Giudici, and A. Philippe. Nonparametric convergence assessment for MCMC model selection. Journal of Computational and Graphical Statistics, 12(1):1–22, 2003.

74. S. P. Brooks, P. Giudici, and G. O. Roberts. Efficient construction of reversible jump Markov chain Monte Carlo proposal distributions. Journal of the Royal Statistical Society, Series B, 65(1):3–39, 2003.

75. S. P. Brooks and B. J. T. Morgan. Optimization using simulated annealing. The Statistician, 44:241–257, 1995.

76. S. P. Brooks and G. O. Roberts. Assessing convergence of Markov chain Monte Carlo algorithms. Statistics and Computing, 8:319–335, 1999.

77. W. J. Browne, F. Steele, M. Golalizadeh, and M. J. Green. The use of simple reparameterizations to improve the efficiency of Markov chain Monte Carlo estimation for multilevel models with applications to discrete time survival models. Journal of the Royal Statistical Society: Series A (Statistics in Society), 172(3):579–598, 2009.

78. C. G. Broyden. Quasi-Newton methods and their application to function minimization. Mathematics of Computation, 21:368–381, 1967.

79. C. G. Broyden. The convergence of a class of double-rank minimization algorithms. Journal of the Institute of Mathematics and Its Applications, 6:76–90, 1970.

80. C. G. Broyden. Quasi-Newton methods. In W. Murray, editor. Numerical Methods for Unconstrained Optimization, pages 87–106. Academic, New York, 1972.

81. P. Bühlmann. Sieve bootstrap for time series. Bernoulli, 3:123–148, 1997.

82. P. Bühlmann. Sieve bootstrap for smoothing nonstationary time series. Annals of Statistics, 26:48–83, 1998.

83. P. Bühlmann and H. R. Künsch. Block length selection in the bootstrap for time series. Computational Statistics and Data Analysis, 31:295–310, 1999.

84. A. Buja. Remarks on functional canonical variates, alternating least squares methods, and ACE. Annals of Statistics, 18:1032–1069, 1990.

85. Á. Bürmen, J. Puhan, and T. Tuma. Grid restrained Nelder–Mead algorithm. Computational Optimization and Applications, 34:359–375, 2006.

86. K. P. Burnham and D. R. Anderson. Model Selection and Inference: A Practical Information Theoretic Approach, 2nd ed. Springer, New York, 2002.

87. E. Cameron and L. Pauling. Supplemental ascorbate in the supportive treatment of cancer: Re-evaluation of prolongation of survival times in terminal human cancer. Proceedings of the National Academy of Sciences of the USA, 75(9):4538–4542, 1978.

88. R. Cao, A. Cuevas, and W. González-Manteiga. A comparative study of several smoothing methods in density estimation. Computational Statistics and Data Analysis, 17:153–176, 1994.

89. O. Cappé, C. P. Robert, and T. Rydén. Reversible jump, birth-and-death and more general continuous time Markov chain Monte Carlo samplers. Journal of the Royal Statistical Society, Series B, 65(3):679–700, 2003.

90. O. Cappé, S. J. Godsill, and E. Moulines. An overview of existing methods and recent advances in sequential Monte Carlo. Proceedings of the IEEE, 95:899–924, 2007.

91. B. P. Carlin, A. E. Gelfand, and A. F. M. Smith. Hierarchical Bayesian analysis of changepoint problems. Applied Statistics, 41:389–405, 1992.

92. B. P. Carlin and T. A. Louis. Bayes and Empirical Bayes Methods for Data Analysis. Chapman & Hall, London, 1996.

93. E. Carlstein. The use of subseries methods for estimating the variance of a general statistic from a stationary time series. Annals of Statistics, 14:1171–1179, 1986.

94. E. Carlstein, K.-A. Do, P. Hall, T. Hesterberg, and H. R. Künsch. Matched-block bootstrap for dependent data. Bernoulli, 4:305–328, 1998.

95. J. Carpenter, P. Clifford, and P. Fearnhead. Improved particle filter for nonlinear problems. IEE Proceedings Radar, Sonar, & Navigation, 146:2–7, 1999.

96. G. Casella and R. L. Berger. Statistical Inference, 2nd ed. Brooks/Cole, Pacific Grove, CA, 2001.

97. G. Casella and E. I. George. Explaining the Gibbs sampler. The American Statistician, 46(3):167–174, 1992.

98. G. Casella, K. L. Mengersen, C. P. Robert, and D. M. Titterington. Perfect samplers for mixtures of distributions. Journal of the Royal Statistical Society, Series B, 64(4):777–790, 2002.

99. G. Casella and C. Robert. Rao–Blackwellization of sampling schemes. Biometrika, 83:81–94, 1996.

100. G. Casella and C. P. Robert. Post-processing accept–reject samples: Recycling and rescaling. Journal of Computational and Graphical Statistics, 7:139–157, 1998.

101. J. M. Chambers and T. J. Hastie, editors. Statistical Models in S. Chapman & Hall, New York, 1992.

102. K. S. Chan and J. Ledolter. Monte Carlo EM estimation for time series models involving counts. Journal of the American Statistical Association, 90:242–252, 1995.

103. R. N. Chapman. The quantitative analysis of environmental factors. Ecology, 9:111–122, 1928.

104. M.-H. Chen and B. W. Schmeiser. Performance of the Gibbs, hit-and-run, and Metropolis samplers. Journal of Computational and Graphical Statistics, 2:251–272, 1993.

105. M.-H. Chen and B. W. Schmeiser. General hit-and-run Monte Carlo sampling for evaluating multidimensional integrals. Operations Research Letters, 19:161–169, 1996.

106. M.-H. Chen, Q.-M. Shao, and J. G. Ibrahim. Monte Carlo Methods in Bayesian Computation. Springer, New York, 2000.

107. M.-H. Chen and Q.-M. Shao. Monte Carlo estimation of Bayesian credible and HPD intervals. Journal of Computational and Graphical Statistics, 8(1):69–92, 1999.

108. Y. Chen, P. Diaconis, S. P. Holmes, and J. S. Liu. Sequential Monte Carlo methods for statistical analysis of tables. Journal of the American Statistical Association, 100:109–120, 2005.

109. Y. Chen, I. H. Dinwoodie, and S. Sullivant. Sequential importance sampling for multiway tables. Annals of Statistics, 34:523–545, 2006.

110. S. Chib and B. P. Carlin. On MCMC sampling in hierarchical longitudinal models. Statistics and Computing, 9(1):17–26, 1999.

111. S. Chib and E. Greenberg. Understanding the Metropolis–Hastings algorithm. The American Statistician, 49(4):327–335, 1995.

112. H. A. Chipman, E. I. George, and R. E. McCulloch. Bayesian CART model search (with discussion). Journal of the American Statistical Association, 93:935–960, 1998.

113. N. Chopin. A sequential particle filter method for static models. Biometrika, 89:539–551, 2002.

114. A. Ciampi, C.-H. Chang, S. Hogg, and S. McKinney. Recursive partitioning: A versatile method for exploratory data analysis in biostatistics. In I. B. MacNeil and G. J. Umphrey, editors. Biostatistics, pages 23–50. Reidel, Dordrecht, Netherlands, 1987.

115. L. A. Clark and D. Pregibon. Tree-based models. In J. M. Chambers and T. Hastie, editors. Statistical Models in S, pages 377–419. Duxbury, New York, 1991.

116. W. S. Cleveland. Robust locally weighted regression and smoothing scatter plots. Journal of the American Statistical Association, 74:829–836, 1979.

117. W. S. Cleveland, E. Grosse, and W. M. Shyu. Local regression models. In J. M. Chambers and T. J. Hastie, editors. Statistical Models in S. Chapman & Hall, New York, 1992.

118. W. S. Cleveland and C. Loader. Smoothing by local regression: Principles and methods (with discussion). In W. H. Härdle and M. G. Schimek, editors. Statistical Theory and Computational Aspects of Smoothing. Springer, New York, 1996.

119. M. Clyde. Discussion of “Bayesian model averaging: A tutorial” by Hoeting, Madigan, Raftery and Volinsky. Statistical Science, 14(4):382–417, 1999.

120. A. R. Conn, N. I. M. Gould, and P. L. Toint. Convergence of quasi-Newton matrices generated by the symmetric rank one update. Mathematical Programming, 50:177–195, 1991.

121. S. D. Conte and C. de Boor. Elementary Numerical Analysis: An Algorithmic Approach. McGraw-Hill, New York, 1980.

122. J. Corander and M. J. Sillanpää. A unified approach to joint modeling of multiple quantitative and qualitative traits in gene mapping. Journal of Theoretical Biology, 218(4):435–446, 2002.

123. J. N. Corcoran and R. L. Tweedie. Perfect sampling from independent Metropolis–Hastings chains. Journal of Statistical Planning and Inference, 104(2):297–314, 2002.

124. M. K. Cowles. Efficient model-fitting and model-comparison for high-dimensional Bayesian geostatistical models. Journal of Statistical Planning and Inference, 112:221–239, 2003.

125. M. K. Cowles and B. P. Carlin. Markov chain Monte Carlo convergence diagnostics: A comparative review. Journal of the American Statistical Association, 91(434):883–904, 1996.

126. M. K. Cowles, G. O. Roberts, and J. S. Rosenthal. Possible biases induced by MCMC convergence diagnostics. Journal of Statistical Computation and Simulation, 64(1):87–104, 1999.

127. D. R. Cox and D. V. Hinkley. Theoretical Statistics. Chapman & Hall, London, 1974.

128. N. A. C. Cressie. Statistics for Spatial Data. Wiley, New York, 1993.

129. J. D. Cryer and K. Chan. Time Series Analysis: With Applications in R. Springer, New York, 2008.

130. V. Černý. A thermodynamical approach to the travelling salesman problem: An efficient simulation algorithm. Journal of Optimization Theory and Applications, 45:41–51, 1985.

131. G. Dahlquist and Å. Björck, translated by N. Anderson. Numerical Methods. Prentice-Hall, Englewood Cliffs, NJ, 1974.

132. P. Damien, J. Wakefield, and S. Walker. Gibbs sampling for Bayesian non-conjugate and hierarchical models by using auxiliary variables. Journal of the Royal Statistical Society, Series B, 61:331–344, 1999.

133. G. B. Dantzig. Linear Programming and Extensions. Princeton University Press, Princeton, NJ, 1963.

134. W. C. Davidon. Variable metric methods for minimization. AEC Research and Development Report ANL-5990, Argonne National Laboratory, Argonne, IL, 1959.

135. L. Davis. Applying adaptive algorithms to epistatic domains. In Proceedings of the 9th Joint Conference on Artificial Intelligence, pages 162–164, 1985.

136. L. Davis. Job shop scheduling with genetic algorithms. In Proceedings of the 1st International Conference on Genetic Algorithms and Their Applications, pages 136–140, 1985.

137. L. Davis. Adapting operator probabilities in genetic algorithms. In J. D. Schaffer, editor. Proceedings of the 3rd International Conference on Genetic Algorithms. Morgan Kaufmann, San Mateo, CA, 1989.

138. L. Davis, editor. Handbook of Genetic Algorithms. Van Nostrand Reinhold, New York, 1991.

139. P. J. Davis and P. Rabinowitz. Methods of Numerical Integration. Academic, New York, 1984.

140. A. C. Davison and P. Hall. On studentizing and blocking methods for implementing the bootstrap with dependent data. Australian Journal of Statistics, 35:215–224, 1993.

141. A. C. Davison, D. Hinkley, and B. J. Worton. Bootstrap likelihoods. Biometrika, 79:113–130, 1992.

142. A. C. Davison and D. V. Hinkley. Bootstrap Methods and Their Applications. Cambridge University Press, Cambridge, 1997.

143. A. C. Davison, D. V. Hinkley, and E. Schechtman. Efficient bootstrap simulation. Biometrika, 73:555–566, 1986.

144. C. de Boor. A Practical Guide to Splines. Springer, New York, 1978.

145. F. R. de Hoog and M. F. Hutchinson. An efficient method for calculating smoothing splines using orthogonal transformations. Numerische Mathematik, 50:311–319, 1987.

146. K. A. DeJong. An Analysis of the Behavior of a Class of Genetic Adaptive Systems. Ph.D. thesis, University of Michigan, 1975.

147. P. Dellaportas and J. J. Forster. Markov chain Monte Carlo model determination for hierarchical and graphical log-linear models. Biometrika, 86(3):615–633, 1999.

148. P. Dellaportas, J. J. Forster, and I. Ntzoufras. On Bayesian model and variable selection using MCMC. Statistics and Computing, 12(2):27–36, 2002.

149. B. Delyon, M. Lavielle, and E. Moulines. Convergence of a stochastic approximation version of the EM algorithm. Annals of Statistics, 27:94–128, 1999.

150. A. P. Dempster, N. Laird, and D. B. Rubin. Maximum likelihood from incomplete data via the EM algorithm. Journal of the Royal Statistical Society, Series B, 39:1–38, 1977.

151. D. G. T. Denison, B. K. Mallick, and A. F. M. Smith. A Bayesian CART algorithm. Biometrika, 85:363–377, 1998.

152. J. E. Dennis Jr., D. M. Gay, and R. E. Welsch. An adaptive nonlinear least-squares algorithm. ACM Transactions on Mathematical Software, 7:369–383, 1981.

153. J. E. Dennis Jr. and R. B. Schnabel. Numerical Methods for Unconstrained Optimization and Nonlinear Equations. Prentice-Hall, Englewood Cliffs, NJ, 1983.

154. J. E. Dennis Jr. and D. J. Woods. Optimization on microcomputers: The Nelder–Mead simplex algorithm. In A. Wouk, editor. New Computing Environments, pages 116–122. SIAM, Philadelphia, 1987.

155. G. Der and B. S. Everitt. A Handbook of Statistical Analyses Using SAS, 2nd ed. Chapman & Hall/CRC, Boca Raton, FL, 2002.

156. E. H. Dereksdóttir and K. G. Magnússon. A strike limit algorithm based on adaptive Kalman filtering with application to aboriginal whaling of bowhead whales. Journal of Cetacean Research and Management, 5:29–38, 2003.

157. R. D. Deveaux. Finding transformations for regression using the ACE algorithm. Sociological Methods & Research, 18(2–3):327–359, 1989.

158. L. Devroye. Non-uniform Random Variate Generation. Springer, New York, 1986.

159. L. Devroye. A Course in Density Estimation. Birkhäuser, Boston, 1987.

160. L. Devroye and L. Györfi. Nonparametric Density Estimation: The L1 View. Wiley, New York, 1985.

161. P. Diaconis and M. Shahshahani. On non-linear functions of linear combinations. SIAM Journal on Scientific and Statistical Computing, 5:175–191, 1984.

162. R. Dias and D. Gamerman. A Bayesian approach to hybrid splines non-parametric regression. Journal of Statistical Computation and Simulation, 72(4):285–297, 2002.

163. T. J. DiCiccio and B. Efron. Bootstrap confidence intervals (with discussion). Statistical Science, 11:189–228, 1996.

164. P. Dierckx. Curve and Surface Fitting with Splines. Clarendon, New York, 1993.

165. X. K. Dimakos. A guide to exact simulation. International Statistical Review, 69(1):27–48, 2001.

166. P. Djuric, Y. Huang, and T. Ghirmai. Perfect sampling: A review and applications to signal processing. IEEE Transactions on Signal Processing, 50(2):345–356, 2002.

167. A. Doucet, N. de Freitas, and N. Gordon. Sequential Monte Carlo Methods in Practice. Springer, New York, 2001.

168. A. Doucet, S. Godsill, and C. Andrieu. On sequential Monte Carlo sampling methods for Bayesian filtering. Statistics and Computing, 10:197–208, 2000.

169. K. A. Dowsland. Simulated annealing. In C. R. Reeves, editor. Modern Heuristic Techniques for Combinatorial Problems. Wiley, New York, 1993.

170. N. R. Draper and H. Smith. Applied Regression Analysis. Wiley, New York, 1966.

171. R. P. W. Duin. On the choice of smoothing parameter for Parzen estimators of probability density functions. IEEE Transactions on Computing, C-25:1175–1179, 1976.

172. R. Durbin, S. Eddy, A. Krogh, and G. Mitchison. Biological Sequence Analysis: Probabilistic Models of Proteins and Nucleic Acids. Cambridge University Press, Cambridge, 1998.

173. E. S. Edgington. Randomization Tests, 3rd ed. Marcel Dekker, New York, 1995.

174. R. G. Edwards and A. D. Sokal. Generalization of the Fortuin–Kasteleyn–Swendsen–Wang representation and Monte Carlo algorithm. Physical Review D, 38(6):2009–2012, 1988.

175. B. Efron. Bootstrap methods: Another look at the jackknife. Annals of Statistics, 7:1–26, 1979.

176. B. Efron. Nonparametric standard errors and confidence intervals (with discussion). Canadian Journal of Statistics, 9:139–172, 1981.

177. B. Efron. The Jackknife, the Bootstrap, and Other Resampling Plans. Number 38 in CBMS–NSF Regional Conference Series in Applied Mathematics. SIAM, Philadelphia, 1982.

178. B. Efron. Better bootstrap confidence intervals (with discussion). Journal of the American Statistical Association, 82:171–200, 1987.

179. B. Efron. Computer-intensive methods in statistical regression. SIAM Review, 30:421–449, 1988.

180. B. Efron. Jackknife-after-bootstrap standard errors and influence functions (with discussion). Journal of the Royal Statistical Society, Series B, 54:83–111, 1992.

181. B. Efron and G. Gong. A leisurely look at the bootstrap, the jackknife, and cross-validation. The American Statistician, 37:36–48, 1983.

182. B. Efron and D. V. Hinkley. Assessing the accuracy of the maximum likelihood estimator: Observed versus expected Fisher information. Biometrika, 65:457–482, 1978.

183. B. Efron and R. J. Tibshirani. An Introduction to the Bootstrap. Chapman & Hall, New York, 1993.

184. R. J. Elliott and P. E. Kopp. Mathematics of Financial Markets. Springer, New York, 1999.

185. Environmental Monitoring and Assessment Program, Mid-Atlantic Highlands Streams Assessment, EPA-903-R-00-015, US Environmental Protection Agency, National Health and Environmental Effects Research Laboratory, Western Ecology Division, Corvallis, OR, 2000.

186. V. A. Epanechnikov. Non-parametric estimation of a multivariate probability density. Theory of Probability and Its Applications, 14:153–158, 1969.

187. L. J. Eshelman, R. A. Caruana, and J. D. Schaffer. Biases in the crossover landscape. In J. D. Schaffer, editor. Proceedings of the 3rd International Conference on Genetic Algorithms. Morgan Kaufmann, Los Altos, CA, 1989.

188. R. L. Eubank. Spline Smoothing and Nonparametric Regression. Marcel Dekker, New York, 1988.

189. M. Evans. Adaptive importance sampling and chaining. Contemporary Mathematics, 115 (Statistical Multiple Integration):137–143, 1991.

190. M. Evans and T. Swartz. Approximating Integrals via Monte Carlo and Deterministic Methods. Oxford University Press, Oxford, 2000.

191. U. Faigle and W. Kern. Some convergence results for probabilistic tabu search. ORSA Journal on Computing, 4:32–37, 1992.

192. J. Fan and I. Gijbels. Local Polynomial Modelling and Its Applications. Chapman & Hall, New York, 1996.

193. J. A. Fill. An interruptible algorithm for perfect sampling via Markov chains. Annals of Applied Probability, 8(1):131–162, 1998.

194. R. A. Fisher. Design of Experiments. Hafner, New York, 1935.

195. G. S. Fishman. Monte Carlo. Springer, New York, 1996.

196. J. M. Flegal, M. Haran, and G. L. Jones. Markov chain Monte Carlo: Can we trust the third significant figure? Statistical Science, 23(2):250–260, 2008.

197. R. Fletcher. A new approach to variable metric algorithms. Computer Journal, 13:317–322, 1970.

198. R. Fletcher. Practical Methods of Optimization, 2nd ed. Wiley, Chichester, UK, 1987.

199. R. Fletcher and M. J. D. Powell. A rapidly convergent descent method for minimization. Computer Journal, 6:163–168, 1963.

200. D. B. Fogel. Evolutionary Computation: Toward a New Philosophy of Machine Intelligence, 2nd ed. IEEE Press, Piscataway, NJ, 2000.

201. B. L. Fox. Simulated annealing: Folklore, facts, and directions. In H. Niederreiter and P. J. Shiue, editors. Monte Carlo and Quasi-Monte-Carlo Methods in Scientific Computing. Springer, New York, 1995.

202. E. J. Freireich, E. Gehan, E. Frei III, L. R. Schroeder, I. J. Wolman, R. Anabari, E. O. Burgert, S. D. Mills, D. Pinkel, O. S. Selawry, J. H. Moon, B. R. Gendel, C. L. Spurr, R. Storrs, F. Haurani, B. Hoogstraten, and S. Lee. The effect of 6-mercaptopurine on the duration of steroid-induced remissions in acute leukemia: A model for evaluation of other potentially useful therapy. Blood, 21(6):699–716, June 1963.

203. D. Frenkel and B. Smit. Understanding Molecular Simulation. Academic, New York, 1996.

204. H. Freund and R. Wolter. Evolution of bit strings II: A simple model of co-evolution. Complex Systems, 7:25–42, 1993.

205. J. H. Friedman. A variable span smoother. Technical Report 5, Dept. of Statistics, Stanford University, Palo Alto, CA, 1984.

206. J. H. Friedman. Exploratory projection pursuit. Journal of the American Statistical Association, 82:249–266, 1987.

207. J. H. Friedman. Multivariate adaptive regression splines (with discussion). Annals of Statistics, 19(1):1–141, 1991.

208. J. H. Friedman and W. Stuetzle. Smoothing of scatterplots. Technical Report ORION-003, Dept. of Statistics, Stanford University, Palo Alto, CA, 1982.

209. J. H. Friedman and W. Stuetzle. Projection pursuit regression. Journal of the American Statistical Association, 76:817–823, 1981.

210. J. H. Friedman, W. Stuetzle, and A. Schroeder. Projection pursuit density estimation. Journal of the American Statistical Association, 79:599–608, 1984.

211. K. Fukunaga. Introduction to Statistical Pattern Recognition. Academic, New York, 1972.

212. W. A. Fuller. Introduction to Statistical Time Series. Wiley, New York, 1976.

213. G. M. Furnival and R. W. Wilson, Jr. Regressions by leaps and bounds. Technometrics, 16:499–511, 1974.

214. M. R. Garey and D. S. Johnson. Computers and Intractability: A Guide to the Theory of NP-Completeness. Freeman, San Francisco, 1979.

215. C. Gaspin and T. Schiex. Genetic algorithms for genetic mapping. In J.-K. Hao, E. Lutton, E. Ronald, M. Schoenauer, and D. Snyers, editors. Artificial Evolution 1997, pages 145–156. Springer, New York, 1997.

216. A. Gelfand and A. F. M. Smith. Sampling based approaches to calculating marginal densities. Journal of the American Statistical Association, 85:398–409, 1990.

217. A. E. Gelfand, J. A. Silander Jr., S. Wu, A. Latimer, P. O. Lewis, A. G. Rebelo, and M. Holder. Explaining species distribution patterns through hierarchical modeling. Bayesian Analysis, 1(1):41–92, 2006.

218. A. E. Gelfand, S. K. Sahu, and B. P. Carlin. Efficient parametrisations for normal linear mixed models. Biometrika, 82(3):479–488, 1995.

219. A. E. Gelfand, S. K. Sahu, and B. P. Carlin. Efficient parametrizations for generalized linear mixed models (with discussion). In Bayesian Statistics 5. Oxford University Press, Oxford, UK, 1996.

220. A. Gelman. Iterative and non-iterative simulation algorithms. Computing Science and Statistics, 24:433–438, 1992.

221. A. Gelman, J. B. Carlin, H. S. Stern, and D. B. Rubin. Bayesian Data Analysis, 2nd ed. Chapman & Hall, London, 2004.

222. A. Gelman and X.-L. Meng. Simulating normalizing constants: From importance sampling to bridge sampling to path sampling. Statistical Science, 13:163–185, 1998.

223. A. Gelman, G. Roberts, and W. Gilks. Efficient Metropolis jumping rules. In Bayesian Statistics 5, pages 599–608. Oxford University Press, Oxford, UK, 1996.

224. A. Gelman and D. B. Rubin. Inference from iterative simulation using multiple sequences (with discussion). Statistical Science, 7:457–511, 1992.

225. A. Gelman, D. A. Van Dyk, Z. Huang, and W. J. Boscardin. Using redundant parameterizations to fit hierarchical models. Journal of Computational and Graphical Statistics, 17(1):95–122, 2008.

226. S. Geman and D. Geman. Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images. IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-6(6):721–741, 1984.

227. J. E. Gentle. Random Number Generation and Monte Carlo Methods. Springer, New York, 1998.

228. R. Gentleman and R. Ihaka. The Comprehensive R Archive Network. Available at http://lib.stat.cmu.edu/R/CRAN/, 2003.

229. E. I. George and R. E. McCulloch. Variable selection via Gibbs sampling. Journal of the American Statistical Association, 88:881–889, 1993.

230. E. I. George and C. P. Robert. Capture–recapture estimation via Gibbs sampling. Biometrika, 79(4):677–683, 1992.

231. C. J. Geyer. Burn-in is unnecessary. Available at http://www.stat.umn.edu/~charlie/mcmc/burn.html.

232. C. J. Geyer. Markov chain Monte Carlo maximum likelihood. In E. M. Keramidas, editor. Computing Science and Statistics: The 23rd Symposium on the Interface. Interface Foundation, Fairfax Station, VA, 1991.

233. C. J. Geyer. Practical Markov chain Monte Carlo (with discussion). Statistical Science, 7:473–511, 1992.

234. C. J. Geyer and E. A. Thompson. Constrained Monte Carlo maximum likelihood for dependent data. Journal of the Royal Statistical Society, Series B, 54:657–699, 1992.

235. C. J. Geyer and E. A. Thompson. Annealing Markov chain Monte Carlo with applications to ancestral inference. Journal of the American Statistical Association, 90:909–920, 1995.

236. Z. Ghahramani. An introduction to hidden Markov models and Bayesian networks. International Journal of Pattern Recognition and Artificial Intelligence, 15:9–42, 2001.

237. W. R. Gilks. Derivative-free adaptive rejection sampling for Gibbs sampling. In J. M. Bernardo, J. O. Berger, A. P. Dawid, and A. F. M. Smith, editors. Bayesian Statistics 4. Clarendon Press, Oxford, UK, 1992.

238. W. R. Gilks. Adaptive rejection sampling, MRC Biostatistics Unit, Software from the BSU. Available at http://www.mrc-bsu.cam.ac.uk/BSUsite/Research/software.shtml, 2004.

239. W. R. Gilks and C. Berzuini. Following a moving target—Monte Carlo inference for dynamic Bayesian systems. Journal of the Royal Statistical Society, Series B, 63:127–146, 2001.

240. W. R. Gilks, N. G. Best, and K. K. C. Tan. Adaptive rejection Metropolis sampling within Gibbs sampling. Applied Statistics, 44:455–472, 1995.

241. W. R. Gilks, S. Richardson, and D. J. Spiegelhalter. Markov Chain Monte Carlo in Practice. Chapman & Hall/CRC, London, 1996.

242. W. R. Gilks and G. O. Roberts. Strategies for improving MCMC. In W. R. Gilks, S. Richardson, and D. J. Spiegelhalter, editors. Markov Chain Monte Carlo in Practice, pages 89–114. Chapman & Hall/CRC, London, 1996.

243. W. R. Gilks, A. Thomas, and D. J. Spiegelhalter. A language and program for complex Bayesian modeling. The Statistician, 43:169–178, 1994.

244. W. R. Gilks and P. Wild. Adaptive rejection sampling for Gibbs sampling. Applied Statistics, 41:337–348, 1992.

245. P. E. Gill, G. H. Golub, W. Murray, and M. A. Saunders. Methods for modifying matrix factorizations. Mathematics of Computation, 28:505–535, 1974.

246. P. E. Gill and W. Murray. Newton-type methods for unconstrained and linearly constrained optimization. Mathematical Programming, 7:311–350, 1974.

247. P. E. Gill, W. Murray, and M. Wright. Practical Optimization. Academic, London, 1981.

248. P. Giudici and P. J. Green. Decomposable graphical Gaussian model determination. Biometrika, 86(4):785–801, 1999.

249. G. H. Givens. Empirical estimation of safe aboriginal whaling limits for bowhead whales. Journal of Cetacean Research and Management, 5:39–44, 2003.

250. G. H. Givens, J. R. Beveridge, B. A. Draper, P. Grother, and P. J. Phillips. How features of the human face affect recognition: A statistical comparison of three face recognition algorithms. In IEEE Conference on Computer Vision and Pattern Recognition, pages 381–388, June 2004.

251. G. H. Givens, J. R. Beveridge, B. A. Draper, and D. Bolme. A statistical assessment of subject factors in the PCA recognition of human faces. In IEEE Conference on Computer Vision and Pattern Recognition. December 2003.

252. G. H. Givens and A. E. Raftery. Local adaptive importance sampling for multivariate densities with strong nonlinear relationships. Journal of the American Statistical Association, 91:132–141, 1996.

253. J. R. Gleason. Algorithms for balanced bootstrap simulations. The American Statistician, 42:263–266, 1988.

254. F. Glover. Tabu search, Part I. ORSA Journal on Computing, 1:190–206, 1989.

255. F. Glover. Tabu search, Part II. ORSA Journal on Computing, 2:4–32, 1990.

256. F. Glover and H. J. Greenberg. New approaches for heuristic search: A bilateral link with artificial intelligence. European Journal of Operational Research, 39:119–130, 1989.

257. F. Glover and M. Laguna. Tabu search. In C. R. Reeves, editor. Modern Heuristic Techniques for Combinatorial Problems. Wiley, New York, 1993.

258. F. Glover and M. Laguna. Tabu Search. Kluwer, Boston, 1997.

259. F. Glover, E. Taillard, and D. de Werra. A user's guide to tabu search. Annals of Operations Research, 41:3–28, 1993.

260. S. Godsill and T. Clapp. Improvement strategies for Monte Carlo particle filters. In A. Doucet, N. de Freitas, and N. Gordon, editors. Sequential Monte Carlo Methods in Practice, pages 139–158. Springer, New York, 2001.

261. S. J. Godsill. On the relationship between Markov chain Monte Carlo methods for model uncertainty. Journal of Computational and Graphical Statistics, 10(2):230–248, 2001.

262. D. E. Goldberg. Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley, Reading, MA, 1989.

263. D. E. Goldberg. A note on Boltzmann tournament selection for genetic algorithms and population-oriented simulated annealing. Complex Systems, 4:445–460, 1990.

264. D. E. Goldberg and K. Deb. A comparative analysis of selection schemes used in genetic algorithms. In G. Rawlins, editor. Foundations of Genetic Algorithms and Classifier Systems. Morgan Kaufmann, San Mateo, CA, 1991.

265. D. E. Goldberg, K. Deb, and B. Korb. Messy genetic algorithms revisited: Studies in mixed size and scale. Complex Systems, 4:415–444, 1990.

266. D. E. Goldberg, K. Deb, and B. Korb. Don't worry, be messy. In R. K. Belew and L. B. Booker, editors. Proceedings of the 4th International Conference on Genetic Algorithms. Morgan Kaufmann, San Mateo, CA, 1991.

267. D. E. Goldberg, B. Korb, and K. Deb. Messy genetic algorithms: Motivation, analysis, and first results. Complex Systems, 3:493–530, 1989.

268. D. E. Goldberg and R. Lingle. Alleles, loci, and the travelling salesman problem. In J. J. Grefenstette, editor. Proceedings of an International Conference on Genetic Algorithms and Their Applications, pages 154–159. Lawrence Erlbaum Associates, Hillsdale, NJ, 1985.

269. D. Goldfarb. A family of variable metric methods derived by variational means. Mathematics of Computation, 24:23–26, 1970.

270. A. A. Goldstein. On steepest descent. SIAM Journal on Control and Optimization, 3:147–151, 1965.

271. P. I. Good. Permutation Tests: A Practical Guide to Resampling Methods for Testing Hypotheses, 2nd ed. Springer, New York, 2000.

272. P. I. Good. Resampling Methods: A Practical Guide to Data Analysis, 2nd ed. Birkhäuser, Boston, 2001.

273. N. J. Gordon. A hybrid bootstrap filter for target tracking in clutter. IEEE Transactions on Aerospace and Electronic Systems, 33:353–358, 1997.

274. N. J. Gordon, D. J. Salmond, and A. F. M. Smith. A novel approach to nonlinear/non-Gaussian Bayesian state estimation. IEE Proceedings F: Radar and Signal Processing, 140:107–113, 1993.

275. F. Götze and H. R. Künsch. Second-order correctness of the blockwise bootstrap for stationary observations. Annals of Statistics, 24:1914–1933, 1996.

276. B. S. Grant and L. L. Wiseman. Recent history of melanism in American peppered moths. Journal of Heredity, 93:86–90, 2002.

277. D. Graybill. Campito Mountain data set. IGBP PAGES/World Data Center for Paleoclimatology Data Contribution Series 1983-ca533.rwl. NOAA/NCDC Paleoclimatology Program, Boulder, CO, 1983.

278. P. J. Green. Reversible jump Markov chain Monte Carlo computation and Bayesian model determination. Biometrika, 82:711–732, 1995.

279. P. J. Green. Trans-dimensional Markov chain Monte Carlo. In P. J. Green, N. L. Hjort, and S. Richardson, editors. Highly Structured Stochastic Systems, pages 179–198. Oxford University Press, Oxford, 2003.

280. P. J. Green and B. W. Silverman. Nonparametric Regression and Generalized Linear Models. Chapman & Hall, New York, 1994.

281. J. W. Greene and K. J. Supowit. Simulated annealing without rejected moves. In Proceedings of the IEEE International Conference on Computer Design, 1984.

282. J. W. Greene and K. J. Supowit. Simulated annealing without rejected moves. IEEE Transactions on Computer-Aided Design, CAD-5:221–228, 1986.

283. U. Grenander and M. Miller. Representations of knowledge in complex systems (with discussion). Journal of the Royal Statistical Society, Series B, 56:549–603, 1994.

284. B. Grund, P. Hall, and J. S. Marron. Loss and risk in smoothing parameter selection. Journal of Nonparametric Statistics, 4:107–132, 1994.

285. C. Gu. Smoothing spline density estimation: A dimensionless automatic algorithm. Journal of the American Statistical Association, 88:495–504, 1993.

286. A. Guisan, T. C. Edwards Jr., and T. Hastie. Generalized linear and generalized additive models in studies of species distributions: Setting the scene. Ecological Modelling, 157:89–100, 2002.

287. F. Gustafsson, F. Gunnarsson, N. Bergman, U. Forssell, J. Jansson, R. Karlsson, and P-J. Nordlund. Particle filters for positioning, navigation, and tracking. IEEE Transactions on Signal Processing, 50:425–437, 2002.

288. H. Haario, E. Saksman, and J. Tamminen. An adaptive Metropolis algorithm. Bernoulli, 7(2):223–242, 2001.

289. J. D. F. Habbema, J. Hermans, and K. van den Broek. A stepwise discriminant analysis program using density estimation. In G. Bruckman, editor. COMPSTAT 1974, Proceedings in Computational Statistics. Physica, Vienna, 1974.

290. S. Haber. Numerical evaluation of multiple integrals. SIAM Review, 12:481–526, 1970.

291. R. P. Haining. Spatial Data Analysis: Theory and Practice. Cambridge University Press, Cambridge, 2003.

292. B. Hajek. Cooling schedules for optimal annealing. Mathematics of Operations Research, 13:311–329, 1988.

293. P. Hall. Large sample optimality of least squares cross-validation in density estimation. Annals of Statistics, 11:1156–1174, 1983.

294. P. Hall. Antithetic resampling for the bootstrap. Biometrika, 76:713–724, 1989.

295. P. Hall. The Bootstrap and Edgeworth Expansion. Springer, New York, 1992.

296. P. Hall and J. L. Horowitz. Bootstrap critical values for tests based on generalized-method-of-moments estimators. Econometrica, 64:891–916, 1996.

297. P. Hall, J. L. Horowitz, and B.-Y. Jing. On blocking rules for the bootstrap with dependent data. Biometrika, 82:561–574, 1995.

298. P. Hall and J. S. Marron. Extent to which least squares cross-validation minimises integrated squared error in nonparametric density estimation. Probability Theory and Related Fields, 74:567–581, 1987.

299. P. Hall and J. S. Marron. Lower bounds for bandwidth selection in density estimation. Probability Theory and Related Fields, 90:149–173, 1991.

300. P. Hall, J. S. Marron, and B. U. Park. Smoothed cross-validation. Probability Theory and Related Fields, 92:1–20, 1992.

301. P. Hall, S. J. Sheather, M. C. Jones, and J. S. Marron. On optimal data-based bandwidth selection in kernel density estimation. Biometrika, 78:263–269, 1991.

302. P. Hall and S. R. Wilson. Two guidelines for bootstrap hypothesis testing. Biometrics, 47:757–762, 1991.

303. J. M. Hammersley and K. W. Morton. Poor man's Monte Carlo. Journal of the Royal Statistical Society, Series B, 16:23–38, 1954.

304. J. M. Hammersley and K. W. Morton. A new Monte Carlo technique: Antithetic variates. Proceedings of the Cambridge Philosophical Society, 52:449–475, 1956.

305. M. H. Hansen and C. Kooperberg. Spline adaptation in extended linear models (with discussion). Statistical Science, 17:2–51, 2002.

306. P. Hansen and B. Jaumard. Algorithms for the maximum satisfiability problem. Computing, 44:279–303, 1990.

307. W. Härdle. Resistant smoothing using the fast Fourier transform. Applied Statistics, 36:104–111, 1986.

308. W. Härdle. Applied Nonparametric Regression. Cambridge University Press, Cambridge, 1990.

309. W. Härdle. Smoothing Techniques: With Implementation in S. Springer, New York, 1991.

310. W. Härdle, P. Hall, and J. S. Marron. How far are automatically chosen regression smoothing parameters from their optimum? (with discussion). Journal of the American Statistical Association, 83:86–99, 1988.

311. W. Härdle, P. Hall, and J. S. Marron. Regression smoothing parameters that are not far from their optimum. Journal of the American Statistical Association, 87:227–233, 1992.

312. W. Härdle, J. Horowitz, and J.-P. Kreiss. Bootstrap methods for time series. International Statistical Review, 71:435–459, 2003.

313. W. Härdle and J. S. Marron. Random approximations to an error criterion of nonparametric statistics. Journal of Multivariate Analysis, 20:91–113, 1986.

314. W. Härdle and M. G. Schimek, editors. Statistical Theory and Computational Aspects of Smoothing. Physica, Heidelberg, 1996.

315. W. Härdle and D. Scott. Smoothing by weighted averaging using rounded points. Computational Statistics, 7:97–128, 1992.

316. G. H. Hardy. Mendelian proportions in a mixed population. Science, 28:49–50, 1908.

317. J. A. Hartigan and M. A. Wong. A k-means clustering algorithm. Applied Statistics, 28:100–108, 1979.

318. D. I. Hastie and P. J. Green. Model choice using reversible jump Markov chain Monte Carlo. Statistica Neerlandica, 66(3):309–338, 2012.

319. T. J. Hastie. Principal curve library for S. Available at http://lib.stat.cmu.edu/, 2004.

320. T. J. Hastie and D. Pregibon. Generalized linear models. In J. M. Chambers and T. J. Hastie, editors. Statistical Models in S. Chapman & Hall, New York, 1993.

321. T. J. Hastie and W. Stuetzle. Principal curves. Journal of the American Statistical Association, 84:502–516, 1989.

322. T. J. Hastie and R. J. Tibshirani. Generalized Additive Models. Chapman & Hall, New York, 1990.

323. T. J. Hastie, R. J. Tibshirani, and J. Friedman. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer, New York, 2001.

324. W. K. Hastings. Monte Carlo sampling methods using Markov chains and their applications. Biometrika, 57:97–109, 1970.

325. P. Henrici. Elements of Numerical Analysis. Wiley, New York, 1964.

326. T. Hesterberg. Weighted average importance sampling and defensive mixture distributions. Technometrics, 37:185–194, 1995.

327. D. Higdon. Comment on “Spatial statistics and Bayesian computation,” by Besag and Green. Journal of the Royal Statistical Society, Series B, 55(1):78, 1993.

328. D. M. Higdon. Auxiliary variable methods for Markov chain Monte Carlo with applications. Journal of the American Statistical Association, 93:585–595, 1998.

329. M. D. Higgs and J. A. Hoeting. A clipped latent variable model for spatially correlated ordered categorical data. Computational Statistics & Data Analysis, 54(8):1999–2011, 2010.

330. J. S. U. Hjorth. Computer Intensive Statistical Methods: Validation, Model Selection, and Bootstrap. Chapman & Hall, New York, 1994.

331. J. A. Hoeting, D. Madigan, A. E. Raftery, and C. T. Volinsky. Bayesian model averaging: A tutorial (with discussion). Statistical Science, 14:382–417, 1999.

332. J. A. Hoeting, A. E. Raftery, and D. Madigan. Bayesian variable and transformation selection in linear regression. Journal of Computational and Graphical Statistics, 11(3):485–507, 2002.

333. J. H. Holland. Adaptation in Natural and Artificial Systems. University of Michigan Press, Ann Arbor, 1975.

334. C. C. Holmes and B. K. Mallick. Generalized nonlinear modeling with multivariate free-knot regression splines. Journal of the American Statistical Association, 98(462):352–368, 2003.

335. A. Homaifar, S. Guan, and G. E. Liepins. Schema analysis of the traveling salesman problem using genetic algorithms. Complex Systems, 6:533–552, 1992.

336. D. W. Hosmer and S. Lemeshow. Applied Logistic Regression. Wiley, New York, 2000.

337. Y. F. Huang and P. M. Djuric. Variable selection by perfect sampling. EURASIP Journal on Applied Signal Processing, 1:38–45, 2002.

338. P. J. Huber. Projection pursuit. Annals of Statistics, 13:435–475, 1985.

339. K. Hukushima and K. Nemoto. Exchange Monte Carlo method and application to spin glass simulations. Journal of the Physical Society of Japan, 65:1604–1608, 1996.

340. J. N. Hwang, S. R. Lay, and A. Lippman. Nonparametric multivariate density estimation: A comparative study. IEEE Transactions on Signal Processing, 42:2795–2810, 1994.

341. R. J. Hyndman. Time series data library. Available at http://robjhyndman.com/TSDL, 2011.

342. A. C. Jacinto, R. D. Ambrosini, and R. F. Danesi. Experimental and computational analysis of plates under air blast loading. International Journal of Impact Engineering, 25:927–947, 2001.

343. M. Jamshidian and R. I. Jennrich. Conjugate gradient acceleration of the EM algorithm. Journal of the American Statistical Association, 88:221–228, 1993.

344. M. Jamshidian and R. I. Jennrich. Acceleration of the EM algorithm by using quasi-Newton methods. Journal of the Royal Statistical Society, Series B, 59:569–587, 1997.

345. M. Jamshidian and R. I. Jennrich. Standard errors for EM estimation. Journal of the Royal Statistical Society, Series B, 62:257–270, 2000.

346. C. Z. Janikow and Z. Michalewicz. An experimental comparison of binary and floating point representations in genetic algorithms. In R. K. Belew and L. B. Booker, editors. Proceedings of the 4th International Conference on Genetic Algorithms. Morgan Kaufmann, San Mateo, CA, 1991.

347. B. Jansen. Interior Point Techniques in Optimization: Complementarity, Sensitivity and Algorithms. Kluwer, Boston, 1997.

348. P. Jarratt. A review of methods for solving nonlinear algebraic equations in one variable. In P. Rabinowitz, editor. Numerical Methods for Nonlinear Algebraic Equations. Gordon and Breach, London, 1970.

349. R. G. Jarrett. A note on the intervals between coal-mining disasters. Biometrika, 66:191–193, 1979.

350. H. Jeffreys. Theory of Probability, 3rd ed. Oxford University Press, New York, 1961.

351. D. S. Johnson. Bayesian Analysis of State-Space Models for Discrete Response Compositions. Ph.D. thesis, Colorado State University, 2003.

352. D. S. Johnson, C. R. Aragon, L. A. McGeoch, and C. Schevon. Optimization by simulated annealing: An experimental evaluation; Part I, graph partitioning. Operations Research, 37:865–892, 1989.

353. L. W. Johnson and R. D. Riess. Numerical Analysis. Addison-Wesley, Reading, MA, 1982.

354. R. W. Johnson. Fitting percentage of body fat to simple body measurements. Journal of Statistics Education, 4(1):265–266, 1996.

355. G. L. Jones, M. Haran, B. S. Caffo, and R. Neath. Fixed-width output analysis for Markov chain Monte Carlo. Journal of the American Statistical Association, 101(476):1537–1547, 2006.

356. M. C. Jones. Variable kernel density estimates. Australian Journal of Statistics, 32:361–371, 1990.

357. M. C. Jones. The roles of ISE and MISE in density estimation. Statistics and Probability Letters, 12:51–56, 1991.

358. M. C. Jones, J. S. Marron, and B. U. Park. A simple root n bandwidth selector. Annals of Statistics, 19:1919–1932, 1991.

359. M. C. Jones, J. S. Marron, and S. J. Sheather. A brief survey of bandwidth selection for density estimation. Journal of the American Statistical Association, 91:401–407, 1996.

360. M. C. Jones, J. S. Marron, and S. J. Sheather. Progress in data-based bandwidth selection for kernel density estimation. Computational Statistics, 11:337–381, 1996.

361. B. H. Juang and L. R. Rabiner. Hidden Markov models for speech recognition. Technometrics, 33:251–272, 1991.

362. N. Karmarkar. A new polynomial-time algorithm for linear programming. Combinatorica, 4:373–395, 1984.

363. J. R. Karr and D. R. Dudley. Ecological perspectives on water quality goals. Environmental Management, 5(1):55–68, 1981.

364. R. E. Kass, B. P. Carlin, A. Gelman, and R. M. Neal. Markov chain Monte Carlo in practice: A roundtable discussion. The American Statistician, 52:93–100, 1998.

365. R. E. Kass and A. E. Raftery. Bayes factors. Journal of the American Statistical Association, 90:773–795, 1995.

366. D. E. Kaufman and R. L. Smith. Direction choice for accelerated convergence in hit-and-run sampling. Operations Research, 46:84–95, 1998.

367. B. Kégl. Principal curve webpage. Available at http://www.iro.umontreal.ca/~kegl/research/pcurves/.

368. C. T. Kelley. Detection and remediation of stagnation in the Nelder-Mead algorithm using a sufficient decrease condition. SIAM Journal on Optimization, 10:43–55, 1999.

369. C. T. Kelley. Iterative Methods for Optimization. Society for Industrial and Applied Mathematics, Philadelphia, PA, 1999.

370. A. G. Z. Kemna and A. C. F. Vorst. A pricing method for options based on average asset values. Journal of Banking and Finance, 14:113–129, 1990.

371. M. Kendall and A. Stuart. The Advanced Theory of Statistics, volume 1, 4th ed. Macmillan, New York, 1977.

372. J. Kennedy and R. Eberhart. Particle swarm optimization. In Proceedings of the IEEE International Conference on Neural Networks, volume 4, pages 1942–1948, November/December 1995.

373. J. Kennedy and R. C. Eberhart. Swarm Intelligence. Morgan Kaufmann, San Francisco, 2001.

374. W. J. Kennedy, Jr. and J. E. Gentle. Statistical Computing. Marcel Dekker, New York, 1980.

375. H. F. Khalfan, R. H. Byrd, and R. B. Schnabel. A theoretical and experimental study of the symmetric rank-one update. SIAM Journal on Optimization, 3:1–24, 1993.

376. D. R. Kincaid and E. W. Cheney. Numerical Analysis. Wadsworth, Belmont, CA, 1991.

377. R. Kindermann and J. L. Snell. Markov Random Fields and Their Applications, volume 1 of Contemporary Mathematics. American Mathematical Society, Providence, 1980.

378. S. Kirkpatrick, C. D. Gelatt, and M. P. Vecchi. Optimization by simulated annealing. Science, 220:671–680, 1983.

379. G. Kitagawa. Monte Carlo filter and smoother for non-Gaussian nonlinear state space models. Journal of Computational and Graphical Statistics, 5:1–25, 1996.

380. S. Klinke and J. Grassmann. Projection pursuit regression. In M. G. Schimek, editor. Smoothing and Regression: Approaches, Computation, and Application, pages 277–327. Wiley, New York, 2000.

381. T. Kloek and H. K. Van Dijk. Bayesian estimates of equation system parameters: An application of integration by Monte Carlo. Econometrica, 46:1–20, 1978.

382. L. Knorr-Held and H. Rue. On block updating in Markov random field models for disease mapping. Scandinavian Journal of Statistics, 29(4):567–614, 2002.

383. D. Knuth. The Art of Computer Programming, Volume 2: Seminumerical Algorithms, 3rd ed. Addison-Wesley, Reading, MA, 1997.

384. M. Kofler. Maple: An Introduction and Reference. Addison-Wesley, Reading, MA, 1997.

385. T. G. Kolda, R. M. Lewis, and V. Torczon. Optimization by direct search: New perspectives on some classical and modern methods. SIAM Review, 45:385–482, 2003.

386. A. Kong, J. S. Liu, and W. H. Wong. Sequential imputations and Bayesian missing data problems. Journal of the American Statistical Association, 89:278–288, 1994.

387. A. S. Kronrod. Nodes and Weights of Quadrature Formulas. Consultants Bureau Enterprises, New York, 1966.

388. C. Kooperberg. Polspline. Available at http://cran.r-project.org/src/contrib/Descriptions/polspline, 2004.

389. C. Kooperberg and C. J. Stone. Logspline density estimation. Computational Statistics and Data Analysis, 12:327–347, 1991.

390. C. Kooperberg and C. J. Stone. Logspline density estimation for censored data. Journal of Computational and Graphical Statistics, 1:301–328, 1992.

391. C. Kooperberg, C. J. Stone, and Y. K. Truong. Hazard regression. Journal of the American Statistical Association, 90:78–94, 1995.

392. T. Koski. Hidden Markov Models for Bioinformatics. Kluwer, Dordrecht, Netherlands, 2001.

393. J.-P. Kreiss. Bootstrap procedures for AR(∞)-processes. In K.-H. Jöckel, G. Rothe, and W. Sendler, editors. Bootstrapping and Related Techniques, pages 107–113. Springer, Berlin, 1992.

394. K. Kremer and K. Binder. Monte Carlo simulation of lattice models for macromolecules. Computer Physics Reports, 7:259–310, 1988.

395. V. I. Krylov, translated by A. H. Stroud. Approximate Calculation of Integrals. Macmillan, New York, 1962.

396. H. R. Künsch. The jackknife and the bootstrap for general stationary observations. Annals of Statistics, 17:1217–1241, 1989.

397. J. C. Lagarias, J. A. Reeds, M. H. Wright, and P. E. Wright. Convergence properties of the Nelder-Mead simplex algorithm in low dimensions. SIAM Journal on Optimization, 9:112–147, 1998.

398. S. N. Lahiri. Edgeworth correction by “moving block” bootstrap for stationary and nonstationary data. In R. LePage and L. Billard, editors. Exploring the Limits of the Bootstrap, pages 183–214. Wiley, New York, 1992.

399. S. N. Lahiri. On Edgeworth expansion and moving block bootstrap for studentized M-estimators in multiple linear regression models. Journal of Multivariate Analysis, 56:42–59, 1996.

400. S. N. Lahiri. Theoretical comparisons of block bootstrap methods. Annals of Statistics, 27:386–404, 1999.

401. S. N. Lahiri. On the jackknife-after-bootstrap method for dependent data and its consistency properties. Econometric Theory, 18:79–98, 2002.

402. S. N. Lahiri. Resampling Methods for Dependent Data. Springer, New York, 2003.

403. S. N. Lahiri. Consistency of the jackknife-after-bootstrap variance estimator for block bootstrap quantiles of a studentized statistic. Annals of Statistics, 33:2475–2506, 2005.

404. S. N. Lahiri, K. Furukawa, and Y.-D. Lee. A nonparametric plug-in rule for selecting optimal block lengths for block bootstrap methods. Statistical Methodology, 4:292–321, 2007.

405. C. Lalas and B. Murphy. Increase in the abundance of New Zealand fur seals at the Catlins, South Island, New Zealand. Journal of the Royal Society of New Zealand, 28:287–294, 1998.

406. D. Lamberton and B. Lapeyre. Introduction to Stochastic Calculus Applied to Finance. Chapman & Hall, London, 1996.

407. K. Lange. A gradient algorithm locally equivalent to the EM algorithm. Journal of the Royal Statistical Society, Series B, 57:425–437, 1995.

408. K. Lange. A quasi-Newton acceleration of the EM algorithm. Statistica Sinica, 5:1–18, 1995.

409. K. Lange. Numerical Analysis for Statisticians. Springer, New York, 1999.

410. K. Lange, D. R. Hunter, and I. Yang. Optimization transfer using surrogate objective functions (with discussion). Journal of Computational and Graphical Statistics, 9:1–59, 2000.

411. A. B. Lawson. Statistical Methods in Spatial Epidemiology. Wiley, New York, 2001.

412. S. Z. Li. Markov Random Field Modeling in Image Analysis. Springer, Tokyo, 2001.

413. R. J. A. Little and D. B. Rubin. Statistical Analysis with Missing Data, 2nd ed. Wiley, Hoboken, NJ, 2002.

414. E. L. Little, Jr. Atlas of United States Trees, Minor Western Hardwoods, volume 3 of Miscellaneous Publication 1314. U.S. Department of Agriculture, 1976.

415. C. Liu and D. B. Rubin. The ECME algorithm: A simple extension of EM and ECM with faster monotone convergence. Biometrika, 81:633–648, 1994.

416. J. S. Liu. Nonparametric hierarchical Bayes via sequential imputations. Annals of Statistics, 24:910–930, 1996.

417. J. S. Liu. Monte Carlo Strategies in Scientific Computing. Springer, New York, 2001.

418. J. S. Liu and R. Chen. Blind deconvolution via sequential imputations. Journal of the American Statistical Association, 90:567–576, 1995.

419. J. S. Liu and R. Chen. Sequential Monte Carlo methods for dynamical systems. Journal of the American Statistical Association, 93:1032–1044, 1998.

420. J. S. Liu, F. Liang, and W. H. Wong. The multiple-try method and local optimization in Metropolis sampling. Journal of the American Statistical Association, 95:121–134, 2000.

421. J. S. Liu, D. B. Rubin, and Y. Wu. Parameter expansion to accelerate EM: The PX-EM algorithm. Biometrika, 85:755–770, 1998.

422. C. R. Loader. Bandwidth selection: Classical or plug-in? Annals of Statistics, 27:415–438, 1999.

423. D. O. Loftsgaarden and C. P. Quesenberry. A nonparametric estimate of a multivariate probability density function. Annals of Mathematical Statistics, 36:1049–1051, 1965.

424. T. A. Louis. Finding the observed information matrix when using the EM algorithm. Journal of the Royal Statistical Society, Series B, 44:226–233, 1982.

425. R. Y. Liu and K. Singh. Moving blocks jackknife and bootstrap capture weak dependence. In R. LePage and L. Billard, editors. Exploring the Limits of the Bootstrap, pages 225–248. Wiley, New York, 1992.

426. M. Lundy and A. Mees. Convergence of an annealing algorithm. Mathematical Programming, 34:111–124, 1986.

427. D. J. Lunn, N. Best, and J. C. Whittaker. Generic reversible jump MCMC using graphical models. Statistics and Computing, 19(4):395–408, 2009.

429. S. N. MacEachern and L. M. Berliner. Subsampling the Gibbs sampler. The American Statistician, 48(3):188–190, 1994.

430. S. N. MacEachern, M. Clyde, and J. S. Liu. Sequential importance sampling for nonparametric Bayes models: The next generation. Canadian Journal of Statistics, 27:251–267, 1999.

431. A. Maddison. Dynamic Forces in Capitalist Development: A Long-Run Comparative View. Oxford University Press, New York, 1991.

432. N. Madras. Lecture Notes on Monte Carlo Methods. American Mathematical Society, Providence, RI, 2002.

433. N. Madras and M. Piccioni. Importance sampling for families of distributions. Annals of Applied Probability, 9:1202–1225, 1999.

434. B. A. Maguire, E. S. Pearson, and A. H. A. Wynn. The time intervals between industrial accidents. Biometrika, 39:168–180, 1952.

435. C. L. Mallows. Some comments on Cp. Technometrics, 15:661–675, 1973.

436. E. Mammen. Resampling methods for nonparametric regression. In M. G. Schimek, editor. Smoothing and Regression: Approaches, Computation, and Application. Wiley, New York, 2000.

437. J.-M. Marin and C. Robert. Importance sampling methods for Bayesian discrimination between embedded models. In Frontiers of Statistical Decision Making and Bayesian Analysis, pages 513–527. Springer, New York, 2010.

438. E. Marinari and G. Parisi. Simulated tempering: A new Monte Carlo scheme. Europhysics Letters, 19:451–458, 1992.

439. J. S. Maritz. Distribution Free Statistical Methods, 2nd ed. Chapman & Hall, London, 1996.

440. J. S. Marron and D. Nolan. Canonical kernels for density estimation. Statistics and Probability Letters, 7:195–199, 1988.

441. G. Marsaglia. Random variables and computers. In Transactions of the Third Prague Conference on Information Theory, Statistical Decision Functions and Random Processes. Czechoslovak Academy of Sciences, Prague, 1964.

442. G. Marsaglia. The squeeze method for generating gamma variates. Computers and Mathematics with Applications, 3:321–325, 1977.

443. G. Marsaglia. The exact-approximation method for generating random variables in a computer. Journal of the American Statistical Association, 79:218–221, 1984.

444. G. Marsaglia and W. W. Tsang. A simple method for generating gamma variables. ACM Transactions on Mathematical Software, 26:363–372, 2000.

445. W. L. Martinez and A. R. Martinez. Computational Statistics Handbook with MATLAB. Chapman & Hall/CRC, Boca Raton, FL, 2002.

446. P. McCullagh and J. A. Nelder. Generalized Linear Models. Chapman & Hall, New York, 1989.

447. S. McGinnity and G. W. Irwin. Multiple model bootstrap filter for maneuvering target tracking. IEEE Transactions on Aerospace and Electronic Systems, 36:1006–1012, 2000.

448. K. I. M. McKinnon. Convergence of the Nelder-Mead simplex method to a nonstationary point. SIAM Journal on Optimization, 9:148–158, 1998.

449. G. J. McLachlan and T. Krishnan. The EM Algorithm and Extensions. Wiley, New York, 1997.

450. I. Meilijson. A fast improvement to the EM algorithm on its own terms. Journal of the Royal Statistical Society, Series B, 51:127–138, 1989.

451. J. Meinguet. Multivariate interpolation at arbitrary points made simple. Journal of Applied Mathematics and Physics, 30:292–304, 1979.

452. X.-L. Meng. On the rate of convergence of the ECM algorithm. Annals of Statistics, 22:326–339, 1994.

453. X.-L. Meng and D. B. Rubin. Using EM to obtain asymptotic variance–covariance matrices: The SEM algorithm. Journal of the American Statistical Association, 86:899–909, 1991.

454. X.-L. Meng and D. B. Rubin. Maximum likelihood estimation via the ECM algorithm: A general framework. Biometrika, 80:267–278, 1993.

455. X.-L. Meng and D. B. Rubin. On the global and componentwise rates of convergence of the EM algorithm. Linear Algebra and Its Applications, 199:413–425, 1994.

456. X.-L. Meng and D. van Dyk. The EM algorithm—an old folk-song sung to a fast new tune. Journal of the Royal Statistical Society, Series B, 59:511–567, 1997.

457. X.-L. Meng and W. H. Wong. Simulating ratios of normalizing constants via a simple identity: A theoretical exploration. Statistica Sinica, 6:831–860, 1996.

458. K. L. Mengersen, C. P. Robert, and C. Guihenneuc-Jouyaux. MCMC convergence diagnostics: A review (with discussion). In J. O. Berger, J. M. Bernardo, A. P. Dawid, D. V. Lindley, and A. F. M. Smith, editors. Bayesian Statistics 6, pages 415–440. Oxford University Press, Oxford, 1999.

459. R. C. Merton. Theory of rational option pricing. Bell Journal of Economics and Management Science, 4:141–183, 1973.

460. N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller, and E. Teller. Equation of state calculations by fast computing machines. Journal of Chemical Physics, 21:1087–1091, 1953.

461. N. Metropolis and S. Ulam. The Monte Carlo method. Journal of the American Statistical Association, 44:335–341, 1949.

462. S. P. Meyn and R. L. Tweedie. Markov Chains and Stochastic Stability. Springer, New York, 1993.

463. Z. Michalewicz. Genetic Algorithms + Data Structures = Evolution Programs. Springer, New York, 1992.

464. Z. Michalewicz and D. B. Fogel. How to Solve It: Modern Heuristics. Springer, New York, 2000.

465. A. J. Miller. Subset Selection in Regression, 2nd ed. Chapman & Hall/CRC, Boca Raton, FL, 2002.

466. A. Mira, J. Møller, and G. O. Roberts. Perfect slice samplers. Journal of the Royal Statistical Society, Series B, 63(3):593–606, 2001.

467. A. Mira and L. Tierney. Efficiency and convergence properties of slice samplers. Scandinavian Journal of Statistics, 29(1):1–12, 2002.

468. J. Møller. Perfect simulation of conditionally specified models. Journal of the Royal Statistical Society, Series B, 61(1):251–264, 1999.

469. J. F. Monahan. Numerical Methods of Statistics. Cambridge University Press, Cambridge, 2001.

470. A. M. Mood, F. A. Graybill, and D. C. Boes. Introduction to the Theory of Statistics, 3rd ed. McGraw-Hill, New York, 1974.

471. R. J. Muirhead. Aspects of Multivariate Statistical Theory. Wiley, New York, 1982.

472. P. Müller. A generic approach to posterior integration and Gibbs sampling. Technical Report 91-09, Statistics Department, Purdue University, 1991.

473. D. J. Murdoch and P. J. Green. Exact sampling from a continuous state space. Scandinavian Journal of Statistics, 25(3):483–502, 1998.

474. D. J. Murdoch and J. S. Rosenthal. Efficient use of exact samples. Statistics and Computing, 10:237–243, 2000.

475. W. Murray, editor. Numerical Methods for Unconstrained Optimization. Academic, New York, 1972.

476. E. A. Nadaraya. On estimating regression. Theory of Probability and Its Applications, 9:141–142, 1964.

477. Y. Nagata and S. Kobayashi. Edge assembly crossover: A high-power genetic algorithm for the traveling salesman problem. In T. Bäck, editor. Proceedings of the 7th International Conference on Genetic Algorithms. Morgan Kaufmann, Los Altos, CA, 1997.

478. J. C. Naylor and A. F. M. Smith. Applications of a method for the efficient computation of posterior distributions. Applied Statistics, 31:214–225, 1982.

479. L. Nazareth and P. Tseng. Gilding the lily: A variant of the Nelder-Mead algorithm based on golden-section search. Computational Optimization and Applications, 22:133–144, 2002.

480. R. Neal. Sampling from multimodal distributions using tempered transitions. Statistics and Computing, 6:353–366, 1996.

481. R. M. Neal. Slice sampling. Annals of Statistics, 31(3):705–767, 2003.

482. J. A. Nelder and R. Mead. A simplex method for function minimization. Computer Journal, 7:308–313, 1965.

483. J. Neter, M. H. Kutner, C. J. Nachtsheim, and W. Wasserman. Applied Linear Statistical Models. Irwin, Chicago, 1996.

484. M. A. Newton and C. J. Geyer. Bootstrap recycling: A Monte Carlo alternative to the nested bootstrap. Journal of the American Statistical Association, 89:905–912, 1994.

485. M. A. Newton and A. E. Raftery. Approximate Bayesian inference with the weighted likelihood bootstrap (with discussion). Journal of the Royal Statistical Society, Series B, 56:3–48, 1994.

486. J. Nocedal and S. J. Wright. Numerical Optimization. Springer, New York, 1999.

487. I. Ntzoufras, P. Dellaportas, and J. J. Forster. Bayesian variable and link determination for generalised linear models. Journal of Statistical Planning and Inference, 111(1–2):165–180, 2003.

488. J. Null. Golden Gate Weather Services, Climate of San Francisco. Available at http://ggweather.com/sf/climate.html.

489. Numerical Recipes Home Page. Available at http://www.nr.com, 2003.

490. M. S. Oh and J. O. Berger. Adaptive importance sampling in Monte Carlo integration. Journal of Statistical Computation and Simulation, 41:143–168, 1992.

491. M. S. Oh and J. O. Berger. Integration of multimodal functions by Monte Carlo importance sampling. Journal of the American Statistical Association, 88:450–456, 1993.

492. I. Oliver, D. Smith, and J. R. Holland. A study of permutation crossover operators on the traveling salesman problem. In J. J. Grefenstette, editor. Proceedings of the 2nd International Conference on Genetic Algorithms, pages 224–230. Lawrence Erlbaum Associates, Hillsdale, NJ, 1987.

493. D. M. Olsson and L. S. Nelson. The Nelder–Mead simplex procedure for function minimization. Technometrics, 17:45–51, 1975.

494. J. M. Ortega and W. C. Rheinboldt. Iterative Solution of Nonlinear Equations in Several Variables. SIAM, Philadelphia, 2000.

495. A. M. Ostrowski. Solution of Equations and Systems of Equations, 2nd ed. Academic, New York, 1966.

496. F. O'Sullivan. Discussion of “Some aspects of the spline smoothing approach to nonparametric regression curve fitting” by Silverman. Journal of the Royal Statistical Society, Series B, 47:39–40, 1985.

497. C. H. Papadimitriou and K. Steiglitz. Combinatorial Optimization: Algorithms and Complexity. Prentice-Hall, Englewood Cliffs, NJ, 1982.

498. E. Paparoditis and D. N. Politis. Tapered block bootstrap. Biometrika, 88:1105–1119, 2001.

499. E. Paparoditis and D. N. Politis. The tapered block bootstrap for general statistics from stationary sequences. Econometrics Journal, 5:131–148, 2002.

500. B. U. Park and J. S. Marron. Comparison of data-driven bandwidth selectors. Journal of the American Statistical Association, 85:66–72, 1990.

501. B. U. Park and B. A. Turlach. Practical performance of several data driven bandwidth selectors. Computational Statistics, 7:251–270, 1992.

502. J. M. Parkinson and D. Hutchinson. An investigation into the efficiency of variants on the simplex method. In F. A. Lootsma, editor. Numerical Methods for Nonlinear Optimization, pages 115–135. Academic, New York, NY, 1972.

503. C. Pascutto, J. C. Wakefield, N. G. Best, S. Richardson, L. Bernardinelli, A. Staines, and P. Elliott. Statistical issues in the analysis of disease mapping data. Statistics in Medicine, 19:2493–2519, 2000.

504. J. J. Pella and P. K. Tomlinson. A generalized stock production model. Inter-American Tropical Tuna Commission Bulletin, 13:419–496, 1969.

505. A. Penttinen. Modelling Interaction in Spatial Point Patterns: Parameter Estimation by the Maximum Likelihood Method. Ph.D. thesis, University of Jyväskylä, 1984.

506. A. Philippe. Processing simulation output by Riemann sums. Journal of Statistical Computation and Simulation, 59:295–314, 1997.

507. A. Philippe and C. P. Robert. Riemann sums for MCMC estimation and convergence monitoring. Statistics and Computing, 11:103–115, 2001.

508. D. B. Phillips and A. F. M. Smith. Bayesian model comparison via jump diffusions. In W. R. Gilks, S. Richardson, and D. J. Spiegelhalter, editors. Markov Chain Monte Carlo in Practice, pages 215–240. Chapman & Hall/CRC, London, 1996.

509. E. J. G. Pitman. Significance tests which may be applied to samples from any population. Supplement to the Journal of the Royal Statistical Society, 4:119–130, 225–232, 1937.

510. E. J. G. Pitman. Significance tests which may be applied to samples from any population. Part III. The analysis of variance test. Biometrika, 29:322–335, 1938.

511. M. Plummer, N. Best, K. Cowles, and K. Vines. Coda: Convergence diagnosis and output analysis for MCMC. R News, 6(1):7–11, 2006.

512. D. N. Politis and J. P. Romano. A circular block-resampling procedure for stationary data. In R. LePage and L. Billard, editors. Exploring the Limits of the Bootstrap, pages 263–270. Wiley, New York, 1992.

513. D. N. Politis and J. P. Romano. The stationary bootstrap. Journal of the American Statistical Association, 89:1303–1313, 1994.

514. M. J. D. Powell. A view of unconstrained optimization. In L. C. W. Dixon, editor. Optimization in Action, pages 53–72. Academic, London, 1976.

515. M. J. D. Powell. Direct search algorithms for optimization calculations. Acta Numerica, 7:287–336, 1998.

516. G. Pozrikidis. Numerical Computation in Science and Engineering. Oxford University Press, New York, 1998.

517. W. H. Press, S. A. Teukolsky, W. T. Vetterling, and B. P. Flannery. Numerical Recipes: The Art of Scientific Computing. Cambridge University Press, Cambridge, UK, 2007.

518. C. J. Price, I. D. Coope, and D. Byatt. A convergent variant of the Nelder–Mead algorithm. Journal of Optimization Theory and Applications, 113:5–19, 2002.

519. J. Propp and D. Wilson. Coupling from the past: A user's guide. In D. Aldous and J. Propp, editors. Microsurveys in Discrete Probability, volume 41 of DIMACS Series in Discrete Mathematics and Theoretical Computer Science, pages 181–192. American Mathematical Society. Princeton, NJ, 1998.

520. J. G. Propp and D. B. Wilson. Exact sampling with coupled Markov chains and applications to statistical mechanics. Random Structures and Algorithms, 9:223–252, 1996.

521. M. H. Protter and C. B. Morrey. A First Course in Real Analysis. Springer, New York, 1977.

522. J. R. Quinlan. C4.5: Programs for Machine Learning. Morgan Kaufmann, San Mateo, CA, 1993.

523. L. R. Rabiner and B. H. Juang. An introduction to hidden Markov models. IEEE Acoustics, Speech, and Signal Processing Magazine, 3:4–16, 1986.

524. N. J. Radcliffe. Equivalence class analysis of genetic algorithms. Complex Systems, 5:183–205, 1991.

525. A. E. Raftery and V. E. Akman. Bayesian analysis of a Poisson process with a change point. Biometrika, 73:85–89, 1986.

526. A. E. Raftery and S. M. Lewis. How many iterations in the Gibbs sampler? In J. M. Bernardo, J. O. Berger, A. P. Dawid, and A. F. M. Smith, editors. Bayesian Statistics 4, pages 763–773. Oxford University Press, Oxford, 1992.

527. A. E. Raftery, D. Madigan, and J. A. Hoeting. Bayesian model averaging for linear regression models. Journal of the American Statistical Association, 92:179–191, 1997.

528. A. E. Raftery and J. E. Zeh. Estimating bowhead whale, Balaena mysticetus, population size and rate of increase from the 1993 census. Journal of the American Statistical Association, 93:451–463, 1998.

529. M. B. Rajarshi. Bootstrap in Markov sequences based on estimates of transition density. Annals of the Institute of Statistical Mathematics, 42:253–268, 1990.

530. R. A. Redner and H. F. Walker. Mixture densities, maximum likelihood and the EM algorithm. SIAM Review, 26:195–239, 1984.

531. C. R. Reeves. Genetic algorithms. In C. R. Reeves, editor. Modern Heuristic Techniques for Combinatorial Problems. Wiley, New York, 1993.

532. C. R. Reeves. A genetic algorithm for flowshop sequencing. Computers and Operations Research, 22(1):5–13, 1995.

533. C. R. Reeves and J. E. Rowe. Genetic Algorithms—Principles and Perspectives. Kluwer, Norwell, MA, 2003.

534. C. R. Reeves and N. C. Steele. A genetic algorithm approach to designing neural network architecture. In Proceedings of the 8th International Conference on Systems Engineering. 1991.

535. J. R. Rice. Numerical Methods, Software, and Analysis. McGraw-Hill, New York, 1983.

536. S. Richardson and P. J. Green. On Bayesian analysis of mixtures with an unknown number of components (with discussion). Journal of the Royal Statistical Society, Series B, 59:731–792, 1997. Correction, p. 661, 1998.

537. C. J. F. Ridders. 3-Point iterations derived from exponential curve fitting. IEEE Transactions on Circuits and Systems, 26:669–670, 1979.

538. B. Ripley. Computer generation of random variables. International Statistical Review, 51:301–319, 1983.

539. B. Ripley. Stochastic Simulation. Wiley, New York, 1987.

540. B. D. Ripley. Pattern Recognition and Neural Networks. Cambridge University Press, 1996.

541. C. Ritter and M. A. Tanner. Facilitating the Gibbs sampler: The Gibbs stopper and the griddy-Gibbs sampler. Journal of the American Statistical Association, 87(419):861–868, 1992.

542. C. P. Robert. Discretization and MCMC Convergence Assessment, volume 135 of Lecture Notes in Statistics. Springer, New York, 1998.

543. C. P. Robert and G. Casella. Monte Carlo Statistical Methods, 2nd ed. Springer, New York, 2004.

544. C. P. Robert and G. Casella. Convergence monitoring and adaptation for MCMC algorithms. In Introducing Monte Carlo Methods with R, pages 237–268. Springer, New York, 2010.

545. G. O. Roberts, A. Gelman, and W. R. Gilks. Weak convergence and optimal scaling of random walk Metropolis algorithms. Annals of Applied Probability, 7(1):110–120, 1997.

546. G. O. Roberts and S. K. Sahu. Updating schemes, correlation structure, blocking and parameterization for the Gibbs sampler. Journal of the Royal Statistical Society, Series B, 59(2):291–317, 1997.

547. G. O. Roberts and R. L. Tweedie. Exponential convergence of Langevin diffusions and their discrete approximations. Bernoulli, 2:344–364, 1996.

548. G. O. Roberts and J. S. Rosenthal. Optimal scaling of discrete approximations to Langevin diffusions. Journal of the Royal Statistical Society, Series B, 60(1):255–268, 1998.

549. G. O. Roberts and J. S. Rosenthal. Optimal scaling for various Metropolis-Hastings algorithms. Statistical Science, 16(4):351–367, 2001.

550. G. O. Roberts and J. S. Rosenthal. Coupling and ergodicity of adaptive Markov chain Monte Carlo algorithms. Journal of Applied Probability, 44(2):458–475, 2007.

551. G. O. Roberts and J. S. Rosenthal. Examples of adaptive MCMC. Journal of Computational and Graphical Statistics, 18(2):349–367, 2009.

552. C. Roos, T. Terlaky, and J. P. Vial. Theory and Algorithms for Linear Optimization: An Interior Point Approach. Wiley, Chichester, UK, 1997.

553. M. Rosenbluth and A. Rosenbluth. Monte Carlo calculation of the average extension of molecular chains. Journal of Chemical Physics, 23:356–359, 1955.

554. J. S. Rosenthal. Optimal proposal distributions and adaptive MCMC. In Handbook of Markov Chain Monte Carlo. Chapman & Hall/CRC, Boca Raton, FL, 2011.

555. S. M. Ross. Simulation, 2nd ed. Academic, San Diego, CA, 1997.

556. S. M. Ross. Introduction to Probability Models, 7th ed. Academic, 2000.

557. R. Y. Rubinstein. Simulation and the Monte Carlo Method. Wiley, New York, 1981.

558. D. B. Rubin. The Bayesian bootstrap. Annals of Statistics, 9:130–134, 1981.

559. D. B. Rubin. A noniterative sampling/importance resampling alternative to the data augmentation algorithm for creating a few imputations when fractions of missing information are modest: The SIR algorithm. Discussion of M. A. Tanner and W. H. Wong. Journal of the American Statistical Association, 82:543–546, 1987.

560. D. B. Rubin. Using the SIR algorithm to simulate posterior distributions. In J. M. Bernardo, M. H. DeGroot, D. V. Lindley, and A. F. Smith, editors. Bayesian Statistics 3, pages 395–402. Clarendon, Oxford, 1988.

561. M. Rudemo. Empirical choice of histograms and kernel density estimators. Scandinavian Journal of Statistics, 9:65–78, 1982.

562. W. Rudin. Principles of Mathematical Analysis, 3rd ed. McGraw-Hill, New York, 1976.

563. H. Rue. Fast sampling of Gaussian Markov random fields. Journal of the Royal Statistical Society, Series B, 63:325–338, 2001.

564. D. Ruppert, S. J. Sheather, and M. P. Wand. An effective bandwidth selector for local least squares regression. Journal of the American Statistical Association, 90:1257–1270, 1995.

565. A. S. Rykov. Simplex algorithms for unconstrained minimization. Problems of Control and Information Theory, 12:195–208, 1983.

566. S. M. Sait and H. Youssef. Iterative Computer Algorithms with Applications to Engineering: Solving Combinatorial Optimization Problems. IEEE Computer Society Press, Los Alamitos, CA, 1999.

567. D. B. Sanders, J. M. Mazzarella, D. C. Kim, J. A. Surace, and B. T. Soifer. The IRAS revised bright galaxy sample (RBGS). Astronomical Journal, 126:1607–1664, 2003.

568. G. Sansone. Orthogonal Functions. Interscience Publishers, New York, 1959.

569. D. J. Sargent, J. S. Hodges, and B. P. Carlin. Structured Markov chain Monte Carlo. Journal of Computational and Graphical Statistics, 9(2):217–234, 2000.

570. L. Scaccia and P. J. Green. Bayesian growth curves using normal mixtures with nonparametric weights. Journal of Computational and Graphical Statistics, 12(2):308–331, 2003.

571. J. D. Schaffer, R. A. Caruana, L. J. Eshelman, and R. Das. A study of control parameters affecting online performance of genetic algorithms for function optimization. In J. D. Schaffer, editor. Proceedings of the 3rd International Conference on Genetic Algorithms. Morgan Kaufmann, Los Altos, CA, 1989.

572. T. Schiex and C. Gaspin. CARTHAGENE: Constructing and joining maximum likelihood genetic maps. In T. Gaasterland, P. D. Karp, K. Karplus, C. Ouzounis, C. Sander, and A. Valencia, editors. Proceedings of the 5th International Conference on Intelligent Systems for Molecular Biology, pages 258–267. AAAI Press, Menlo Park, CA, 1997.

573. M. G. Schimek, editor. Smoothing and Regression: Approaches, Computation, and Application. Wiley, New York, 2000.

574. M. G. Schimek and B. A. Turlach. Additive and generalized additive models. In M. G. Schimek, editor. Smoothing and Regression: Approaches, Computation, and Application, pages 277–327. Wiley, New York, 2000.

575. U. Schneider and J. N. Corcoran. Perfect simulation for Bayesian model selection in a linear regression model. Journal of Statistical Planning and Inference, 126(1):153–171, 2004.

576. C. Schumacher, D. Whitley, and M. Vose. The no free lunch and problem description length. In Genetic and Evolutionary Computation Conference, GECCO-2001, pages 565–570. Morgan Kaufmann, San Mateo, CA, 2001.

577. L. L. Schumaker. Spline Functions: Basic Theory. Wiley, New York, 1993.

578. E. F. Schuster and G. G. Gregory. On the nonconsistency of maximum likelihood density estimators. In W. G. Eddy, editor. Proceedings of the Thirteenth Interface of Computer Science and Statistics, pages 295–298. Springer, New York, 1981.

579. G. Schwarz. Estimating the dimension of a model. Annals of Statistics, 6:461–464, 1978.

580. D. W. Scott. Average shifted histograms: Effective nonparametric estimators in several dimensions. Annals of Statistics, 13:1024–1040, 1985.

581. D. W. Scott. Multivariate Density Estimation: Theory, Practice, and Visualization. Wiley, New York, 1992.

582. D. W. Scott and L. E. Factor. Monte Carlo study of three data-based nonparametric density estimators. Journal of the American Statistical Association, 76:9–15, 1981.

583. D. W. Scott and G. R. Terrell. Biased and unbiased cross-validation in density estimation. Journal of the American Statistical Association, 82:1131–1146, 1987.

584. J. M. Scott, P. J. Heglund, M. L. Morrison, J. B. Haufler, M. G. Raphael, W. Q. Wall, and F. B. Samson, editors. Predicting Species Occurrences—Issues of Accuracy and Scale. Island Press, Washington, DC, 2002.

585. G. A. F. Seber. The Estimation of Animal Abundance and Related Parameters, 2nd ed. Charles Griffin, London, 1982.

586. R. Seydel. Tools for Computational Finance. Springer, Berlin, 2002.

587. K. Shahookar and P. Mazumder. VLSI cell placement techniques. ACM Computing Surveys, 23:143–220, 1991.

588. D. F. Shanno. Conditioning of quasi-Newton methods for function minimization. Mathematics of Computation, 24:647–657, 1970.

589. J. Shao and D. Tu. The Jackknife and Bootstrap. Springer, New York, 1995.

590. X. Shao. The dependent wild bootstrap. Journal of the American Statistical Association, 105:218–235, 2010.

591. X. Shao. Extended tapered block bootstrap. Statistica Sinica, 20:807–821, 2010.

592. S. J. Sheather. The performance of six popular bandwidth selection methods on some real data sets. Computational Statistics, 7:225–250, 1992.

593. S. J. Sheather and M. C. Jones. A reliable data-based bandwidth selection method for kernel density estimation. Journal of the Royal Statistical Society, Series B, 53:683–690, 1991.

594. Y. Shi and R. Eberhart. A modified particle swarm optimizer. In Proceedings of the 1998 IEEE International Conference on Evolutionary Computation (IEEE World Congress on Computational Intelligence), pages 69–73, May 1998.

595. G. R. Shorack. Probability for Statisticians. Springer, New York, 2000.

596. B. W. Silverman. Kernel density estimation using the fast Fourier transform. Applied Statistics, 31:93–99, 1982.

597. B. W. Silverman. Some aspects of the spline smoothing approach to non-parametric regression curve fitting (with discussion). Journal of the Royal Statistical Society, Series B, 47:1–52, 1985.

598. B. W. Silverman. Density Estimation for Statistics and Data Analysis. Chapman & Hall, London, 1986.

599. J. S. Simonoff. Smoothing Methods in Statistics. Springer, New York, 1996.

600. S. Singer and S. Singer. Efficient implementation of the Nelder–Mead search algorithm. Applied Numerical Analysis and Computational Mathematics, 1:524–534, 2004.

601. K. Singh. On the asymptotic accuracy of Efron's bootstrap. Annals of Statistics, 9:1187–1195, 1981.

602. D. J. Sirag and P. T. Weisser. Towards a unified thermodynamic genetic operator. In J. J. Grefenstette, editor. Proceedings of the 2nd International Conference on Genetic Algorithms and Their Applications. Lawrence Erlbaum Associates, Hillsdale, NJ, 1987.

603. S. A. Sisson. Transdimensional Markov chains: A decade of progress and future perspectives. Journal of the American Statistical Association, 100(471):1077–1089, 2005.

605. A. F. M. Smith and G. O. Roberts. Bayesian computation via the Gibbs sampler and related Markov chain Monte Carlo methods (with discussion). Journal of the Royal Statistical Society, Series B, 55:3–23, 1993.

606. A. F. M. Smith, A. M. Skene, J. E. H. Shaw, and J. C. Naylor. Progress with numerical and graphical methods for practical Bayesian statistics. The Statistician, 36:75–82, 1987.

607. B. J. Smith. boa: An R package for MCMC output convergence assessment and posterior inference. Journal of Statistical Software, 21(11):1–37, 2007.

608. P. J. Smith, M. Shafi, and H. Gao. Quick simulation: A review of importance sampling techniques in communications systems. IEEE Journal on Selected Areas in Communications, 15:597–613, 1997.

609. D. Sorenson and D. Gianola. Likelihood, Bayesian and MCMC Methods in Quantitative Genetics. Springer, New York, 2002.

610. D. Spiegelhalter, D. Thomas, N. Best, and D. Lunn. WinBUGS User Manual, Version 1.4. MRC Biostatistics Unit, Institute of Public Health, Cambridge, 2003. Available at http://www.mrc-bsu.cam.ac.uk/bugs.

611. P. Stavropoulos and D. M. Titterington. Improved particle filters and smoothing. In A. Doucet, N. de Freitas, and N. Gordon, editors. Sequential Monte Carlo Methods in Practice, pages 295–317. Springer, New York, 2001.

612. D. Steinberg. Salford Systems. Available at http://www.salford-systems.com, 2003.

613. M. Stephens. Bayesian analysis of mixture models with an unknown number of components—an alternative to reversible jump methods. Annals of Statistics, 28(1):40–74, 2000.

614. C. J. Stone. An asymptotically optimal window selection rule for kernel density estimation. Annals of Statistics, 12:1285–1297, 1984.

615. C. J. Stone, M. Hansen, C. Kooperberg, and Y. K. Truong. Polynomial splines and their tensor products in extended linear modeling (with discussion). Annals of Statistics, 25:1371–1470, 1997.

616. M. Stone. Cross-validatory choice and assessment of statistical predictions. Journal of the Royal Statistical Society, Series B, 36:111–147, 1974.

617. O. Stramer and R. L. Tweedie. Langevin-type models I: Diffusions with given stationary distributions, and their discretizations. Methodology and Computing in Applied Probability, 1:283–306, 1999.

618. O. Stramer and R. L. Tweedie. Langevin-type models II: Self-targeting candidates for MCMC algorithms. Methodology and Computing in Applied Probability, 1:307–328, 1999.

619. A. H. Stroud. Approximate Calculation of Multiple Integrals. Prentice-Hall, Englewood Cliffs, NJ, 1971.

620. A. H. Stroud and D. Secrest. Gaussian Quadrature Formulas. Prentice-Hall, Englewood Cliffs, NJ, 1966.

621. R. H. Swendsen and J.-S. Wang. Nonuniversal critical dynamics in Monte Carlo simulations. Physical Review Letters, 58(2):86–88, 1987.

622. G. Syswerda. Uniform crossover in genetic algorithms. In J. D. Schaffer, editor. Proceedings of the 3rd International Conference on Genetic Algorithms, pages 2–9. Morgan Kaufmann, Los Altos, CA, 1989.

623. G. Syswerda. Schedule optimization using genetic algorithms. In L. Davis, editor. Handbook of Genetic Algorithms, pages 332–349. Van Nostrand Reinhold, New York, 1991.

624. M. A. Tanner. Tools for Statistical Inference: Methods for the Exploration of Posterior Distributions and Likelihood Functions, 2nd ed. Springer, New York, 1993.

625. M. A. Tanner. Tools for Statistical Inference: Methods for the Exploration of Posterior Distributions and Likelihood Functions, 3rd ed. Springer, New York, 1996.

626. R Development Core Team. R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria, 2012.

627. G. R. Terrell. The maximal smoothing principle in density estimation. Journal of the American Statistical Association, 85:470–477, 1990.

628. G. R. Terrell and D. W. Scott. Variable kernel density estimation. Annals of Statistics, 20:1236–1265, 1992.

629. T. Therneau and B. Atkinson. An introduction to recursive partitioning using the RPART routines. Technical Report, Mayo Clinic, 1997. Available at http://lib.stat.cmu.edu.

630. R. A. Thisted. Elements of Statistical Computing: Numerical Computation. Chapman & Hall, New York, 1988.

631. R. Tibshirani. Estimating optimal transformations for regression via additivity and variance stabilization. Journal of the American Statistical Association, 82:559–568, 1988.

632. R. Tibshirani and K. Knight. Model search by bootstrap “bumping.” Journal of Computational and Graphical Statistics, 8:671–686, 1999.

633. L. Tierney. Markov chains for exploring posterior distributions (with discussion). Annals of Statistics, 22:1701–1786, 1994.

634. D. M. Titterington. Recursive parameter estimation using incomplete data. Journal of the Royal Statistical Society, Series B, 46:257–267, 1984.

635. H. Tjelmeland and J. Besag. Markov random fields with higher-order interactions. Scandinavian Journal of Statistics, 25:415–433, 1998.

636. P. Tseng. Fortified-descent simplicial search method: A general approach. SIAM Journal on Optimization, 10:269–288, 2000.

637. E. Turro, A. Lewin, A. Rose, M. J. Dallman, and S. Richardson. MMBGX: A method for estimating expression at the isoform level and detecting differential splicing using whole-transcript Affymetrix arrays. Nucleic Acids Research, 38(1):e4, 2010.

638. G. L. Tyler, G. Balmino, D. P. Hinson, W. L. Sjogren, D. E. Smith, R. Woo, J. W. Armstrong, F. M. Flasar, R. A. Simpson, S. Asmar, A. Anabtawi, and P. Priest. Mars Global Surveyor Radio Science Data Products. Data can be obtained at http://www-star.stanford.edu/projects/mgs/public.html, 2004.

639. U.S. Environmental Protection Agency, Environmental Monitoring and Assessment Program (EMAP). Available at http://www.epa.gov/emap.

640. D. A. van Dyk and X.-L. Meng. The art of data augmentation (with discussion). Journal of Computational and Graphical Statistics, 10(1):1–111, 2001.

641. P. J. M. van Laarhoven and E. H. L. Aarts. Simulated Annealing: Theory and Applications. Kluwer, Boston, 1987.

642. W. N. Venables and B. D. Ripley. Modern Applied Statistics with S-Plus. Springer, New York, 1994.

643. W. N. Venables and B. D. Ripley. Modern Applied Statistics with S-Plus, 3rd ed. Springer, New York, 2002.

644. J. J. Verbeek. Principal curve webpage. Available at http://carol.wins.uva.nl/~jverbeek/pc/index_en.html.

645. C. Vogl and S. Xu. QTL analysis in arbitrary pedigrees with incomplete marker information. Heredity, 89(5):339–345, 2002.

646. M. D. Vose. The Simple Genetic Algorithm: Foundations and Theory. MIT Press, Cambridge, MA, 1999.

647. M. D. Vose. Form invariance and implicit parallelism. Evolutionary Computation, 9:355–370, 2001.

648. R. Waagepetersen and D. Sorensen. A tutorial on reversible jump MCMC with a view toward applications in QTL-mapping. International Statistical Review, 69(1):49–61, 2001.

649. G. Wahba. Spline Models for Observational Data. SIAM, Philadelphia, 1990.

650. F. H. Walters, L. R. Parker, S. L. Morgan, and S. N. Deming. Sequential Simplex Optimization. CRC Press, Boca Raton, FL, 1991.

651. M. P. Wand and M. C. Jones. Kernel Smoothing. Chapman & Hall, New York, 1995.

652. M. P. Wand, J. S. Marron, and D. Ruppert. Transformations in density estimation. Journal of the American Statistical Association, 86:343–353, 1991.

653. X. Wang, C. Z. He, and D. Sun. Bayesian population estimation for small sample capture–recapture data using noninformative priors. Journal of Statistical Planning and Inference, 137(4):1099–1118, 2007.

654. M. R. Watnik. Pay for play: Are baseball salaries based on performance? Journal of Statistics Education, 6(2), 1998.

655. G. S. Watson. Smooth regression analysis. Sankhyā, Series A, 26:359–372, 1964.

656. G. C. G. Wei and M. A. Tanner. A Monte Carlo implementation of the EM algorithm and the poor man's data augmentation algorithms. Journal of the American Statistical Association, 85:699–704, 1990.

657. M. West. Modelling with mixtures. In J. M. Bernardo, M. H. DeGroot, and D. V. Lindley, editors. Bayesian Statistics 2, pages 503–524. Oxford University Press, Oxford, 1992.

658. M. West. Approximating posterior distributions by mixtures. Journal of the Royal Statistical Society, Series B, 55:409–422, 1993.

659. S. R. White. Concepts of scale in simulated annealing. In Proceedings of the IEEE International Conference on Computer Design. 1984.

660. D. Whitley. The GENITOR algorithm and selection pressure: Why rank-based allocation of reproductive trials is best. In J. D. Schaffer, editor. Proceedings of the 3rd International Conference on Genetic Algorithms. Morgan Kaufmann, Los Altos, CA, 1989.

661. D. Whitley. A genetic algorithm tutorial. Statistics and Computing, 4:65–85, 1994.

662. D. Whitley. An overview of evolutionary algorithms. Journal of Information and Software Technology, 43:817–831, 2001.

663. D. Whitley, T. Starkweather, and D. Fuquay. Scheduling problems and traveling salesman: The genetic edge recombination operator. In J. D. Schaffer, editor. Proceedings of the 3rd International Conference on Genetic Algorithms, pages 133–140. Morgan Kaufmann, Los Altos, CA, 1989.

664. D. Whitley, T. Starkweather, and D. Shaner. The traveling salesman and sequence scheduling: Quality solutions using genetic edge recombination. In L. Davis, editor. Handbook of Genetic Algorithms, pages 350–372. Van Nostrand Reinhold, New York, 1991.

665. P. Wilmott, J. Dewynne, and S. Howison. Option Pricing: Mathematical Models and Computation. Oxford Financial Press, Oxford, 1997.

666. D. B. Wilson. How to couple from the past using a read-once source of randomness. Random Structures and Algorithms, 16(1):85–113, 2000.

667. D. B. Wilson. Web site for perfectly random sampling with Markov chains. Available at http://dbwilson.com/exact, August 2002.

668. G. Winkler. Image Analysis, Random Fields and Markov Chain Monte Carlo Methods, 2nd ed. Springer, Berlin, 2003.

669. P. Wolfe. Convergence conditions for ascent methods. SIAM Review, 11:226–235, 1969.

670. R. Wolfinger and M. O'Connell. Generalized linear models: A pseudo-likelihood approach. Journal of Statistical Computation and Simulation, 48:233–243, 1993.

671. S. Wolfram. Mathematica: A System for Doing Mathematics by Computer. Addison-Wesley, Redwood City, CA, 1988.

672. D. H. Wolpert and W. G. Macready. No free lunch theorems for search. Technical Report SFI-TR-95-02-010, Santa Fe Institute, NM, 1995.

673. M. A. Woodbury. Discussion of “The analysis of incomplete data” by Hartley and Hocking. Biometrics, 27:808–813, 1971.

674. B. J. Worton. Optimal smoothing parameters for multivariate fixed and adaptive kernel methods. Journal of Statistical Computation and Simulation, 32:45–57, 1989.

675. M. H. Wright. Direct search methods: Once scorned, now respectable. In D. F. Griffiths and G. A. Watson, editors. Numerical Analysis 1995, Proc. 1995 Dundee Biennial Conference in Numerical Analysis, pages 191–208. Addison-Wesley Longman, Harlow, U.K., 1996.

676. C. F. J. Wu. On the convergence properties of the EM algorithm. Annals of Statistics, 11:95–103, 1983.

677. H. Youssef, S. M. Sait, K. Nassar, and M. S. T. Benton. Performance driven standard-cell placement using genetic algorithm. In GLSVLSI'95: Fifth Great Lakes Symposium on VLSI. 1995.

678. B. Yu and P. Mykland. Looking at Markov samplers through cusum plots: A simple diagnostic idea. Statistics and Computing, 8:275–286, 1998.

679. J. L. Zhang and J. S. Liu. A new sequential importance sampling method and its application to the two-dimensional hydrophobic-hydrophilic model. Journal of Chemical Physics, 117:3492–3498, 2002.

680. P. Zhang. Nonparametric importance sampling. Journal of the American Statistical Association, 91:1245–1253, 1996.

681. W. Zhao, A. Krishnaswamy, R. Chellappa, D. L. Swets, and J. Weng. Discriminant analysis of principal components for face recognition. In H. Wechsler, P. J. Phillips, V. Bruce, F. F. Soulie, and T. S. Huang, editors. Face Recognition: From Theory to Applications, pages 73–85. Springer, Berlin, 1998.

682. Z. Zheng. On swapping and simulated tempering algorithms. Stochastic Processes and Their Applications, 104:131–154, 2003.
