Abstract

The purpose of this article is to determine the influence of various methods of selecting diagnostic features on the sensitivity of classification. Three feature selection options are examined: the parametric method using the sum of the elements of a column of the correlation coefficient matrix (option I), the same method using the median of those column elements (option II), and the inverse matrix method (option III). The efficiency of the resulting groupings was verified with indicators of homogeneity, heterogeneity, and grouping correctness; group efficiency was assessed using the Weber median. The problem is illustrated by a study of the tourist attractiveness of the voivodships of Poland in 2011.
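The two column-statistic options and the inverse matrix method can be sketched in code. This is a minimal illustration of the general techniques, not the authors' exact procedure: the function names, the correlation threshold, and the synthetic data are all assumptions introduced here for demonstration.

```python
# Hedged sketch of correlation-matrix feature screening (assumed details:
# function names, threshold 0.5, synthetic data). X holds numeric data,
# rows = objects, columns = diagnostic features.
import numpy as np

def select_by_column_stat(X, threshold=0.5, stat=np.sum):
    """Parametric-style screening: repeatedly drop the feature whose column
    of |R| has the largest aggregate correlation with the remaining features
    (sum for option I, median for option II), until every remaining pairwise
    |r| falls below `threshold`."""
    keep = list(range(X.shape[1]))
    while len(keep) > 1:
        R = np.abs(np.corrcoef(X[:, keep], rowvar=False))
        np.fill_diagonal(R, 0.0)
        if R.max() < threshold:           # remaining features weakly correlated
            break
        scores = stat(R, axis=0)          # per-feature aggregate correlation
        keep.pop(int(np.argmax(scores)))  # drop the most redundant feature
    return keep

def inverse_matrix_diagnostics(X):
    """Inverse matrix method: the diagonal of R^{-1} equals each feature's
    variance inflation factor; large values flag near-redundant features."""
    R = np.corrcoef(X, rowvar=False)
    return np.diag(np.linalg.inv(R))

# Synthetic example: three near-duplicate features plus two independent ones.
rng = np.random.default_rng(0)
base = rng.normal(size=(200, 1))
X = np.hstack([base + 0.1 * rng.normal(size=(200, 1)),
               base + 0.1 * rng.normal(size=(200, 1)),
               base + 0.1 * rng.normal(size=(200, 1)),
               rng.normal(size=(200, 2))])

print(select_by_column_stat(X, threshold=0.5))          # keeps one of cols 0-2, plus cols 3 and 4
print(np.round(inverse_matrix_diagnostics(X), 2))       # large diagonal values for cols 0-2
```

Passing `stat=np.median` instead of `np.sum` switches between option-I-style and option-II-style aggregation of the correlation matrix columns.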