Handbook of Big Data: Learning Structured Distributions
41. D. L. Hanson and G. Pledger. Consistency in concave regression. Annals of Statistics,
4(6):1038–1050, 1976.
42. R. Hasminskii and I. Ibragimov. On density estimation in the view of Kolmogorov’s
ideas in approximation theory. Annals of Statistics, 18(3):999–1010, 1990.
43. D. Haussler and M. Opper. Mutual information, metric entropy and cumulative relative
entropy risk. Annals of Statistics, 25(6):2451–2492, 1997.
44. A. J. Izenman. Recent developments in nonparametric density estimation. Journal of
the American Statistical Association, 86(413):205–224, 1991.
45. H. K. Jankowski and J. A. Wellner. Estimation of a discrete monotone density.
Electronic Journal of Statistics, 3:1567–1605, 2009.
46. M. Kearns, Y. Mansour, D. Ron, R. Rubinfeld, R. Schapire, and L. Sellie. On the
learnability of discrete distributions. In Proceedings of the 26th Annual ACM Symposium
on Theory of Computing, pp. 273–282, 1994.
47. G. Kerkyacharian and D. Picard. Density estimation in Besov spaces. Statistics &
Probability Letters, 13(1):15–24, 1992.
48. G. Kerkyacharian, D. Picard, and K. Tribouley. Lp adaptive density estimation.
Bernoulli, 2(3):229–247, 1996.
49. R. Koenker and I. Mizera. Quasi-concave density estimation. Annals of Statistics,
38(5):2998–3027, 2010.
50. A. N. Kolmogorov and V. M. Tihomirov. ε-entropy and ε-capacity of sets in function
spaces. Uspekhi Matematicheskikh Nauk, 14:3–86, 1959.
51. B. Lindsay. Mixture Models: Theory, Geometry and Applications. Institute of
Mathematical Statistics, Hayward, CA, 1995.
52. G. G. Lorentz. Metric entropy and approximation. Bulletin of the American Mathemat-
ical Society, 72:903–937, 1966.
53. Y. Makovoz. On the Kolmogorov complexity of functions of finite smoothness. Journal
of Complexity, 2(2):121–130, 1986.
54. A. Moitra and G. Valiant. Settling the polynomial learnability of mixtures of Gaussians.
In Proceedings of the 51st Annual IEEE Symposium on Foundations of Computer Science
(FOCS), pp. 93–102, 2010.
55. K. Pearson. Contributions to the mathematical theory of evolution. II. Skew variation
in homogeneous material. Philosophical Transactions of the Royal Society of London,
186:343–414, 1895.
56. B. L. S. Prakasa Rao. Estimation of a unimodal density. Sankhya Series A, 31:23–36,
1969.
57. R. A. Redner and H. F. Walker. Mixture densities, maximum likelihood and the EM
algorithm. SIAM Review, 26(2):195–239, 1984.
58. D. W. Scott. Multivariate Density Estimation: Theory, Practice and Visualization.
Wiley, New York, 1992.
59. D. W. Scott and S. R. Sain. Multidimensional density estimation. In Handbook of
Statistics, volume 24, pp. 229–261, 2005.
60. B. W. Silverman. Density Estimation. Chapman & Hall, London, 1986.
61. C. J. Stone. The use of polynomial splines and their tensor products in multivariate
function estimation. Annals of Statistics, 22(1):118–171, 1994.
62. C. J. Stone, M. H. Hansen, C. Kooperberg, and Y. K. Truong. Polynomial splines and
their tensor products in extended linear modeling: 1994 Wald memorial lecture. Annals
of Statistics, 25(4):1371–1470, 1997.
63. A. T. Suresh, A. Orlitsky, J. Acharya, and A. Jafarpour. Near-optimal-sample
estimators for spherical Gaussian mixtures. In Advances in Neural Information Processing
Systems, pp. 1395–1403, 2014.
64. D. M. Titterington, A. F. M. Smith, and U. E. Makov. Statistical Analysis of Finite
Mixture Distributions. Wiley, 1985.
65. A. B. Tsybakov. Introduction to Nonparametric Estimation. Springer, 2008.
66. L. Valiant. A theory of the learnable. Communications of the ACM, 27(11):1134–1142,
1984.
67. L. G. Valiant. A theory of the learnable. In Proceedings of the 16th Annual ACM
Symposium on Theory of Computing, pp. 436–445. ACM Press, New York, 1984.
68. A. W. van der Vaart and J. A. Wellner. Weak Convergence and Empirical Processes:
With Applications to Statistics. Springer Series in Statistics. Springer-Verlag, New York,
1996.
69. G. Walther. Inference and modeling with log-concave distributions. Statistical Science,
24(3):319–327, 2009.
70. E. J. Wegman. Maximum likelihood estimation of a unimodal density, I and II. The
Annals of Mathematical Statistics, 41:457–471 and 2169–2174, 1970.
71. E. J. Wegman and I. W. Wright. Splines in statistics. Journal of the American Statistical
Association, 78(382):351–365, 1983.
72. R. Willett and R. D. Nowak. Multiscale Poisson intensity and density estimation. IEEE
Transactions on Information Theory, 53(9):3171–3187, 2007.
73. Y. Yang and A. Barron. Information-theoretic determination of minimax rates of
convergence. Annals of Statistics, 27(5):1564–1599, 1999.
74. Y. G. Yatracos. Rates of convergence of minimum distance estimators and Kolmogorov’s
entropy. Annals of Statistics, 13:768–774, 1985.