Artificial intelligence used in genome analysis studies

Abstract

Next Generation Sequencing (NGS), or deep sequencing, enables the parallel reading of many individual DNA fragments, so that millions of base pairs can be identified within a few hours. Recent research has shown that machine learning methods can efficiently analyse large genomic datasets and help identify novel gene functions and regulatory regions. A deep artificial neural network consists of a group of artificial neurons that mimic the properties of living neurons. These mathematical models, termed Artificial Neural Networks (ANNs), can be used to solve artificial intelligence problems in several scientific fields (e.g., biology, genomics, proteomics, and metabolomics). In practical terms, neural networks are non-linear statistical modelling tools that capture complex relationships between genomic inputs and outputs. To date, Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs) have proved to be the most effective tools for improving performance on prediction and classification tasks in genomics.
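To make this input-output view concrete, the minimal sketch below (in Python, using the TensorFlow/Keras API; it is not code from the article or from the studies it surveys) shows how a small convolutional network can map one-hot encoded DNA sequences to a binary label. The sequence length, layer sizes, labels, and synthetic training data are illustrative assumptions only.

```python
# Minimal sketch, assuming TensorFlow/Keras is available. Sequence length,
# layer sizes, labels and the synthetic data below are illustrative
# assumptions, not values taken from the cited studies.
import numpy as np
import tensorflow as tf

SEQ_LEN = 200      # assumed fixed input length in base pairs
ALPHABET = "ACGT"  # one input channel per nucleotide

def one_hot(seq: str) -> np.ndarray:
    """Encode a DNA string as a (SEQ_LEN, 4) one-hot matrix; unknown bases stay zero."""
    mat = np.zeros((SEQ_LEN, len(ALPHABET)), dtype=np.float32)
    for i, base in enumerate(seq[:SEQ_LEN].upper()):
        j = ALPHABET.find(base)
        if j >= 0:
            mat[i, j] = 1.0
    return mat

# Convolution filters act like motif detectors scanned along the sequence;
# pooling summarises where matches occur; dense layers combine the evidence
# into a single probability (e.g., "regulatory region" vs "background").
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(SEQ_LEN, len(ALPHABET))),
    tf.keras.layers.Conv1D(filters=32, kernel_size=12, activation="relu"),
    tf.keras.layers.MaxPooling1D(pool_size=4),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Synthetic stand-in data, used only to show the expected tensor shapes.
x = np.stack([one_hot("".join(np.random.choice(list(ALPHABET), SEQ_LEN)))
              for _ in range(64)])
y = np.random.randint(0, 2, size=(64, 1))
model.fit(x, y, epochs=1, batch_size=16, verbose=0)
```

The recurrent architectures mentioned above play a similar role when long-range dependencies along the sequence matter, with LSTM or GRU layers taking the place of the convolution and pooling stages.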

