↑ Han, Jun; Moraga, Claudio (1995). "The influence of the sigmoid function parameters on the speed of backpropagation learning". In Mira, José; Sandoval, Francisco (eds.). From Natural to Artificial Neural Computation. Lecture Notes in Computer Science. Vol. 930. pp. 195–201. doi:10.1007/3-540-59497-3_175. ISBN 978-3-540-59497-0.
↑ Gibbs, M. N. (November 2000). "Variational Gaussian process classifiers". IEEE Transactions on Neural Networks. 11 (6): 1458–1464. doi:10.1109/72.883477. PMID 18249869.
References
Mitchell, Tom M. (1997). Machine Learning. WCB–McGraw–Hill. ISBN 978-0-07-042807-2. See in particular "Chapter 4: Artificial Neural Networks" (pp. 96–97), where Mitchell uses "logistic function" and "sigmoid function" synonymously – he also calls it the "squashing function" – and where the sigmoid (logistic) function is used to compress the outputs of the "neurons" in multi-layer neural nets.
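As a brief illustration of the "squashing" behaviour Mitchell describes (a minimal sketch, not taken from the book; the example values and the use of NumPy are assumptions):

    import numpy as np

    def sigmoid(x):
        # Logistic "squashing" function: maps any real input into the interval (0, 1).
        return 1.0 / (1.0 + np.exp(-x))

    # Hypothetical pre-activations of a hidden layer in a multi-layer net.
    pre_activations = np.array([-3.0, -0.5, 0.0, 2.0, 6.0])
    outputs = sigmoid(pre_activations)  # every value now lies strictly between 0 and 1
    print(outputs)

Regardless of how large or small the pre-activations become, the outputs stay bounded, which is the compression of neuron outputs referred to above.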
Humphrys, Mark. "Continuous output, the sigmoid function". Archived from the original on 2 February 2015. Retrieved 26 May 2019. Properties of the sigmoid, including how it can shift along axes and how its domain may be transformed.
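As a hedged illustration of the shifting and domain transformation that page discusses (the parameter names x_0 and T are illustrative, not from the source):

    \sigma_{x_0, T}(x) \;=\; \frac{1}{1 + e^{-(x - x_0)/T}}

Here x_0 shifts the midpoint of the curve along the horizontal axis, and T rescales the input domain, stretching or compressing the transition region around that midpoint.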