Bibliography

[1] M. Herman, S. Rivera, S. Mills, J. Sullivan, P. Guerra, A. Cosmas, D. Farris, E. Kohlwey, P. Yacci, B. Keller, A. Kherlopian, and M. Kim, The Field Guide to Data Science. McLean, VA: Booz Allen, Nov. 2013.

[2] M. F. Smith, Software Prototyping: Adoption, Practice and Management. New York, NY: McGraw-Hill, Inc., 1991.

[3] Agile Alliance, “12 Principles Behind the Agile Manifesto,” Nov. 2015. https://www.agilealliance.org/agile101/12-principles-behind-the-agile-manifesto/

[4] I. Goodfellow, Y. Bengio, and A. Courville, Deep Learning. Cambridge, MA: MIT Press, 2016.

[5] R. Kohavi, R. Longbotham, D. Sommerfield, and R. M. Henne, “Controlled Experiments on the Web: Survey and Practical Guide,” Data Min. Knowl. Discov., vol. 18, no. 1, pp. 140–181, Feb. 2009.

[6] R. Kohavi and R. Longbotham, “Online Controlled Experiments and A/B Testing,” in Encyclopedia of Machine Learning and Data Mining, pp. 922–929, Boston, MA: Springer, 2017.

[7] T. Crook, B. Frasca, R. Kohavi, and R. Longbotham, “Seven Pitfalls to Avoid when Running Controlled Experiments on the Web,” in Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD ’09, (New York, NY), pp. 1105–1114, ACM, 2009.

[8] D. C. Montgomery, Design and Analysis of Experiments. New York: John Wiley & Sons, 2006.

[9] S. Newman, Building Microservices. Sebastopol, CA: O’Reilly Media, Inc., 1st ed., 2015.

[10] “TimeComplexity,” Python Wiki. https://wiki.python.org/moin/TimeComplexity

[11] L. Breiman, “Random Forests,” Mach. Learn., vol. 45, no. 1, pp. 5–32, Oct. 2001.

[12] T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer Series in Statistics, New York: Springer-Verlag, 2nd ed., 2009.

[13] T. P. Minka, “Algorithms for maximum-likelihood logistic regression,” Carnegie Mellon University, Statistics Tech. Rep. 758, Sept. 2003. http://www.stat.cmu.edu/tr/tr758/tr758.pdf

[14] D. Arthur and S. Vassilvitskii, “How Slow is the K-means Method?” in Proceedings of the Twenty-second Annual Symposium on Computational Geometry, SCG ’06, (New York, NY), pp. 144–153, ACM, 2006.

[15] Y. Zhang, A. J. Friend, A. L. Traud, M. A. Porter, J. H. Fowler, and P. J. Mucha, “Community structure in Congressional cosponsorship networks,” Physica A: Statistical Mechanics and its Applications, vol. 387, no. 7, pp. 1705–1712, Mar. 2008.

[16] M. E. J. Newman, “Analysis of Weighted Networks,” Physical Review E, vol. 70, p. 056131, Dec. 2004.

[17] S. Fortunato and M. Barthélemy, “Resolution limit in community detection,” Proceedings of the National Academy of Sciences, vol. 104, no. 1, pp. 36–41, Jan. 2007.

[18] B. H. Good, Y.-A. de Montjoye, and A. Clauset, “Performance of modularity maximization in practical contexts,” Physical Review E, vol. 81, no. 4, p. 046106, Apr. 2010.

[19] S. M. Omohundro, “Five Balltree Construction Algorithms,” International Computer Science Institute, Berkeley, CA, Tech. Rep. TR-89-063, 1989.

[20] J. L. Bentley, “Multidimensional Binary Search Trees Used for Associative Searching,” Commun. ACM, vol. 18, no. 9, pp. 509–517, Sept. 1975.

[21] “1.6. Nearest Neighbors,” scikit-learn 0.19.1 documentation.

[22] J. Pearl, Causality: Models, Reasoning and Inference. New York, NY: Cambridge University Press, 2nd ed., 2009.

[23] D. J. C. MacKay, Information Theory, Inference & Learning Algorithms. New York, NY: Cambridge University Press, 2002.

[24] K. P. Murphy, Machine Learning: A Probabilistic Perspective. Cambridge, MA: MIT Press, 2012.

[25] E. J. Elton and M. J. Gruber, “A Practitioner’s Guide to Factor Models,” Research Foundation Books, vol. 1994, no. 4, pp. 1–85, Mar. 1994.

[26] D. Lay, Linear Algebra and Its Applications. Hoboken, NJ: Pearson, 3rd ed., 2016.

[27] I. M. Johnstone and A. Y. Lu, “Sparse Principal Components Analysis,” arXiv:0901.4392 [math, stat], Jan. 2009. arXiv: 0901.4392.

[28] “sklearn.decomposition.PCA,” scikit-learn 0.19.1 documentation.

[29] M. E. Tipping and C. M. Bishop, “Probabilistic Principal Component Analysis,” Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol. 61, no. 3, pp. 611–622, 1999.

[30] J. V. Stone, Independent Component Analysis: A Tutorial Introduction. Cambridge, MA: MIT Press, 2004.

[31] A. Hyvärinen, “Fast and Robust Fixed-point Algorithms for Independent Component Analysis,” IEEE Transactions on Neural Networks, vol. 10, no. 3, pp. 626–634, May 1999.

[32] S. Shwartz, M. Zibulevsky, and Y. Y. Schechner, “ICA Using Kernel Entropy Estimation with NlogN Complexity,” in Independent Component Analysis and Blind Signal Separation, Lecture Notes in Computer Science, pp. 422–429, Berlin, Heidelberg: Springer, Sept. 2004.

[33] D. M. Blei, A. Y. Ng, and M. I. Jordan, “Latent Dirichlet Allocation,” Journal of Machine Learning Research, vol. 3, pp. 993–1022, Jan. 2003.

[34] R. Řehůřek and P. Sojka, “Software Framework for Topic Modelling with Large Corpora,” in Proceedings of the LREC 2010 Workshop on New Challenges for NLP Frameworks, pp. 45–50, 2010. https://radimrehurek.com/gensim/lrec2010_final.pdf
