Kunihiko Fukushima

Kunihiko Fukushima
Born: 16 March 1936, Japan
Citizenship: Japan
Alma mater: Kyoto University
Known for: Artificial neural networks, Neocognitron, Convolutional neural network architecture, Unsupervised learning, Deep learning, ReLU activation function
Awards: IEICE Achievement Award and Excellent Paper Awards, IEEE Neural Networks Pioneer Award, APNNA Outstanding Achievement Award, JNNS Excellent Paper Award, INNS Helmholtz Award, Bower Award and Prize for Achievement in Science
Scientific career
Fields: Computer science
Institutions: Fuzzy Logic Systems Institute

Kunihiko Fukushima (Japanese: 福島 邦彦, born 16 March 1936) is a Japanese computer scientist, most noted for his work on artificial neural networks and deep learning. He is currently working part-time as a senior research scientist at the Fuzzy Logic Systems Institute in Fukuoka, Japan.[1]

Notable scientific achievements

In 1980, Fukushima published the neocognitron,[2][3] the original deep convolutional neural network (CNN) architecture.[4][5] Fukushima proposed several supervised and unsupervised learning algorithms to train the parameters of a deep neocognitron so that it could learn internal representations of incoming data.[3][6] Today, however, CNNs are usually trained through backpropagation, and networks of this kind are heavily used in computer vision.[5][7]
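
The following is a minimal, illustrative sketch in Python/NumPy (not Fukushima's original implementation; the image and kernel are arbitrary placeholders) of one feature-extraction-plus-pooling stage, mirroring the alternating S-cell and C-cell layers of the neocognitron that correspond to the convolution and pooling stages of a modern CNN.

```python
import numpy as np

def convolve2d(image, kernel):
    """Slide a small kernel over a single-channel image (valid cross-correlation,
    as used in most CNN implementations)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

def max_pool(feature_map, size=2):
    """Downsample by taking the maximum over non-overlapping size-by-size windows,
    giving tolerance to small shifts, as in the neocognitron's C-cells."""
    h, w = feature_map.shape
    h, w = h - h % size, w - w % size
    return feature_map[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# One stage: feature extraction, rectification, then pooling.
# Deep networks stack several such stages; modern CNNs learn the kernels
# by backpropagation, whereas Fukushima's original training rules differed.
image = np.random.rand(8, 8)        # placeholder input image
kernel = np.random.randn(3, 3)      # placeholder feature detector
features = np.maximum(0, convolve2d(image, kernel))  # ReLU activation
pooled = max_pool(features)
print(pooled.shape)                 # (3, 3)
```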

In 1969, Fukushima introduced the ReLU (rectified linear unit) activation function in the context of visual feature extraction in hierarchical neural networks.[8][9][10] It was later argued to have strong biological motivations and mathematical justifications.[11][12] In 2011 it was found to enable better training of deeper networks[13] than the activation functions widely used before then, e.g., the logistic sigmoid (which is inspired by probability theory; see logistic regression) and its more practical[14] counterpart, the hyperbolic tangent. As of 2017, the rectifier is the most popular activation function for deep neural networks.[15]
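
As a point of reference, a short sketch of the rectifier alongside the sigmoid and tanh activations mentioned above; these are the standard textbook definitions, not code from the cited papers.

```python
import numpy as np

def relu(x):
    """Rectified linear unit: f(x) = max(0, x)."""
    return np.maximum(0, x)

def sigmoid(x):
    """Logistic sigmoid, squashing inputs into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))      # negative inputs become 0, positive inputs pass through
print(sigmoid(x))   # values in (0, 1)
print(np.tanh(x))   # values in (-1, 1)
```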

Education and career

In 1958, Fukushima received his Bachelor of Engineering in electronics from Kyoto University.[1] He became a senior research scientist at the NHK Science & Technology Research Laboratories. In 1989, he joined the faculty of Osaka University.[1] In 1999, he joined the faculty of the University of Electro-Communications. In 2001, he joined the faculty of Tokyo University of Technology. From 2006 to 2010, he was a visiting professor at Kansai University.[1]

Fukushima served as founding president of the Japanese Neural Network Society (JNNS). He was also a founding member of the board of governors of the International Neural Network Society (INNS), serving on that board again in 2003, and was president of the Asia-Pacific Neural Network Assembly (APNNA).[1]

Awards

In 2020 Fukushima received the Bower Award and Prize for Achievement in Science.[16] He also received the IEICE Achievement Award and Excellent Paper Awards, the IEEE Neural Networks Pioneer Award, the APNNA Outstanding Achievement Award, the JNNS Excellent Paper Award and the INNS Helmholtz Award.[1]

External links

  1. ResearchMap profile

References

  1. ^ a b c d e f CIS Oral History Project (Don Wunsch) (2015). "Interview with Kunihiko Fukushima". IEEE TV. Retrieved 2019-02-27.
  2. ^ Fukushima, Kunihiko (1980). "A self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position". Biological Cybernetics. 36 (4): 193–202. doi:10.1007/bf00344251. PMID 7370364. S2CID 206775608.
  3. ^ a b Fukushima, K. (2007). "Neocognitron". Scholarpedia. 2 (1): 1717. Bibcode:2007SchpJ...2.1717F. doi:10.4249/scholarpedia.1717.
  4. ^ Fogg, Andrew (2017). "A History of Deep Learning". import.io. Retrieved 2019-02-27.
  5. ^ a b Schmidhuber, Jürgen (2015). "Deep Learning". Scholarpedia. 10 (11): 1527–54. CiteSeerX 10.1.1.76.1541. doi:10.1162/neco.2006.18.7.1527. PMID 16764513. S2CID 2309950.
  6. ^ Fukushima, Kunihiko (2018). "Video: Artificial Vision by Deep CNN Neocognitron". YouTube. Retrieved 2019-03-25.
  7. ^ LeCun, Yann; Bengio, Yoshua; Hinton, Geoffrey (2015). "Deep learning". Nature. 521 (7553): 436–444. Bibcode:2015Natur.521..436L. doi:10.1038/nature14539. PMID 26017442. S2CID 3074096.
  8. ^ Fukushima, K. (1969). "Visual feature extraction by a multilayered network of analog threshold elements". IEEE Transactions on Systems Science and Cybernetics. 5 (4): 322–333. doi:10.1109/TSSC.1969.300225.
  9. ^ Fukushima, K.; Miyake, S. (1982). "Neocognitron: A Self-Organizing Neural Network Model for a Mechanism of Visual Pattern Recognition". In Competition and Cooperation in Neural Nets. Lecture Notes in Biomathematics. Vol. 45. Springer. pp. 267–285. doi:10.1007/978-3-642-46466-9_18. ISBN 978-3-540-11574-8.
  10. ^ Schmidhuber, Jürgen (2022). "Annotated History of Modern AI and Deep Learning". arXiv:2212.11279 [cs.NE].
  11. ^ Hahnloser, R.; Sarpeshkar, R.; Mahowald, M. A.; Douglas, R. J.; Seung, H. S. (2000). "Digital selection and analogue amplification coexist in a cortex-inspired silicon circuit". Nature. 405 (6789): 947–951. Bibcode:2000Natur.405..947H. doi:10.1038/35016072. PMID 10879535. S2CID 4399014.
  12. ^ Hahnloser, R.; Seung, H. S. (2001). Permitted and Forbidden Sets in Symmetric Threshold-Linear Networks. NIPS 2001.
  13. ^ Glorot, Xavier; Bordes, Antoine; Bengio, Yoshua (2011). "Deep Sparse Rectifier Neural Networks" (PDF). AISTATS.
  14. ^ LeCun, Yann; Bottou, Léon; Orr, Genevieve B.; Müller, Klaus-Robert (1998). "Efficient BackProp" (PDF). In G. Orr; K. Müller (eds.). Neural Networks: Tricks of the Trade. Springer.
  15. ^ Ramachandran, Prajit; Zoph, Barret; Le, Quoc V. (October 16, 2017). "Searching for Activation Functions". arXiv:1710.05941 [cs.NE].
  16. ^ "Kunihiko Fukushima". The Franklin Institute. 2020-01-25. Retrieved 2020-01-27.