Donald School Journal of Ultrasound in Obstetrics and Gynecology


Volume 15, Issue 3 (July–September, 2021)

REVIEW ARTICLE

Recognition of Fetal Facial Expressions Using Artificial Intelligence Deep Learning

Yasunari Miyagi, Toshiyuki Hata, Saori Bouno, Aya Koyanagi, Takahito Miyake

Keywords : Artificial intelligence, Deep learning, Facial recognition, Fetus, Machine learning, Ultrasonography

Citation Information : Miyagi Y, Hata T, Bouno S, Koyanagi A, Miyake T. Recognition of Fetal Facial Expressions Using Artificial Intelligence Deep Learning. Donald School J Ultrasound Obstet Gynecol 2021; 15 (3):223-228.

DOI: 10.5005/jp-journals-10009-1710

License: CC BY-NC 4.0

Published Online: 30-09-2021

Copyright Statement: Copyright © 2021, Jaypee Brothers Medical Publishers (P) Ltd.


Abstract

Fetal facial expressions are useful parameters for assessing brain function and development in the latter half of pregnancy. Previous investigations have relied on subjective assessment of fetal facial expressions using four-dimensional ultrasound. Artificial intelligence (AI) can enable objective assessment of these expressions. AI recognition of fetal facial expressions may open the door to a new scientific field, such as an “AI science of the fetal brain”, and fetal neurobehavioral science using AI is at the dawn of a new era. Our knowledge of fetal neurobehavior and neurodevelopment will be advanced through AI recognition of fetal facial expressions, and AI may become an important modality in current and future research on fetal facial expressions and in the evaluation of fetal brain function.
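The abstract describes deep-learning recognition of fetal facial expressions from ultrasound images but does not specify an architecture. As an illustrative sketch only, the following numpy-based forward pass shows the general shape of such an image classifier (convolution, ReLU, pooling, dense layer, softmax); the layer sizes, random weights, and expression labels below are hypothetical stand-ins, not the network or categories used in the article.

```python
import numpy as np

# Hypothetical expression classes for illustration only.
LABELS = ["mouthing", "yawning", "smiling", "blinking", "neutral"]

def conv2d(image, kernel):
    """Valid 2D cross-correlation of a grayscale image with one kernel."""
    h, w = image.shape
    kh, kw = kernel.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling."""
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())  # shift for numerical stability
    return e / e.sum()

def classify(image, kernel, weights, bias):
    """conv -> ReLU -> max-pool -> flatten -> dense -> softmax."""
    features = max_pool(relu(conv2d(image, kernel)))
    logits = features.ravel() @ weights + bias
    return softmax(logits)

rng = np.random.default_rng(0)
image = rng.random((16, 16))            # stand-in for one ultrasound frame
kernel = rng.standard_normal((3, 3))    # untrained filter, illustration only
feat_dim = ((16 - 3 + 1) // 2) ** 2     # pooled feature size: 7 * 7 = 49
weights = rng.standard_normal((feat_dim, len(LABELS))) * 0.1
bias = np.zeros(len(LABELS))

probs = classify(image, kernel, weights, bias)
print(LABELS[int(np.argmax(probs))])
```

In a real system the kernel and dense weights would be learned from labeled 4D ultrasound frames rather than drawn at random, and many convolutional layers would be stacked; the sketch only makes the objective-classification idea in the abstract concrete.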


