{"id":13307,"date":"2022-06-26T10:53:27","date_gmt":"2022-06-26T07:53:27","guid":{"rendered":"http:\/\/journals.khnu.km.ua\/vestnik\/?p=13307"},"modified":"2023-02-01T23:19:06","modified_gmt":"2023-02-01T21:19:06","slug":"metod-klasteryzacziyi-obyektiv-na-zobrazhenni-na-osnovi-vyboru-oznak","status":"publish","type":"post","link":"https:\/\/journals.khnu.km.ua\/vestnik\/?p=13307","title":{"rendered":"\u041c\u0435\u0442\u043e\u0434 \u043a\u043b\u0430\u0441\u0442\u0435\u0440\u0438\u0437\u0430\u0446\u0456\u0457 \u043e\u0431\u2019\u0454\u043a\u0442\u0456\u0432 \u043d\u0430 \u0437\u043e\u0431\u0440\u0430\u0436\u0435\u043d\u043d\u0456 \u043d\u0430 \u043e\u0441\u043d\u043e\u0432\u0456 \u0432\u0438\u0431\u043e\u0440\u0443 \u043e\u0437\u043d\u0430\u043a"},"content":{"rendered":"<p><!--more--><\/p>\n<p style=\"text-align: center;\">OBJECT CLUSTERIZATION METHOD IN PICTURES BASED ON FEATURE SELECTION<\/p>\n<p><strong>Pages: 260-264. Issue: No. 3, 2022 (309)\u00a0\u00a0<\/strong> <a href=\"http:\/\/journals.khnu.km.ua\/vestnik\/wp-content\/uploads\/2022\/07\/vknu-ts-2022-n3-260-264.pdf\"> <img loading=\"lazy\" class=\"size-full wp-image-69 alignnone\" src=\"http:\/\/journals.khnu.km.ua\/vestnik\/wp-content\/uploads\/2021\/01\/pdf.png\" alt=\"\" width=\"76\" height=\"32\" \/><\/a><br \/>\n<strong>Authors:<\/strong><br \/>\nOleksiy SHAMURATOV<br \/>\nLviv Polytechnic National University<br \/>\n<a href=\"https:\/\/orcid.org\/0000-0003-1913-5362\">https:\/\/orcid.org\/0000-0003-1913-5362<\/a><br \/>\ne-mail: <a href=\"mailto:oleksii.y.shamuratov@lpnu.ua\">oleksii.y.shamuratov@lpnu.ua<\/a><br \/>\n<strong>DOI:<\/strong>\u00a0<a href=\"https:\/\/www.doi.org\/10.31891\/2307-5732-2022-309-3-260-264\">https:\/\/www.doi.org\/10.31891\/2307-5732-2022-309-3-260-264<\/a><\/p>\n<p style=\"text-align: center;\"><strong>Abstract (translated from the original Ukrainian)<\/strong><\/p>\n<p>The article describes the development of a method for creating clusters based on feature selection. 
In today's world, the online entertainment industry is developing rapidly, creating demand for higher-quality products. This, in turn, has led to the use of artificial intelligence not only in science but also in entertainment. Applications that animate objects in photographs are currently gaining popularity. 
This article presents an approach to solving the problem of identifying objects for animation.<br \/>\n<strong>Keywords:<\/strong> clustering, feature selection, animation<\/p>\n<p style=\"text-align: center;\"><strong>Extended abstract in English<\/strong><\/p>\n<p>The article describes the development of a method for creating clusters based on feature selection. In today's world, the entertainment industry on the Internet is developing rapidly, creating a demand for better products. This has led to the use of artificial intelligence not only in science but also in entertainment. Applications that create animations of objects in photos are currently gaining popularity. This article presents an approach to solving the problem of identifying objects for animation. To classify and subsequently identify objects, their characteristics must first be determined. 
This is one of the possible forms of abstraction, in which the input set of object properties is reduced to the minimum number of features needed to identify the object.<br \/>\nThe algorithm can be used to determine the main features of objects, such as area and perimeter, the radii of the inscribed and circumscribed circles, the sides of the bounding rectangle, the number and relative position of corners, and the gradient histogram of the object. Clustering and classification of the image are implemented on the basis of these features. The artificial neural network was trained on image samples; each class contained from 2528 to 16185 images of 64&#215;64 pixels. Then 1000 images of objects of each class were selected for testing. The recognition success of a convolutional neural network was evaluated. The results indicate that the smaller the invariance of a class, the higher the recognition accuracy. The amount of data in the training sample has little effect on the accuracy of the algorithm. After calculating the intensity gradient, the image should be divided into cells and a histogram of gradient orientations built for each cell; the magnitude of each histogram bin corresponds to the intensity gradient at the corresponding points.<br \/>\n<strong>Keywords:<\/strong> clustering, feature selection, animation<\/p>\n<p style=\"text-align: center;\"><strong>References<\/strong><\/p>\n<ol>\n<li>Tien D. B., Ching Y. S., Zi-Cai L., Yuan Y. T., \u201cComputer Transformation of Digital Images and Patterns\u201d, p. 276, 1989.<\/li>\n<li>Y.-Q. Wang, \u201cAn Analysis of Viola-Jones Face Detection Algorithm\u201d, IPOL Journal, 2013.<\/li>\n<li>Khan H. Abdullah, M. Shamian Bin Zainal, \u201cEfficient eyes and mouth detection algorithm using combination of viola jones and skin color pixel detection\u201d, International Journal of Engineering and Applied Sciences, Vol. 3, No. 4, 2013.<\/li>\n<li>V. Gaede and O. Gunther, \u201cMultidimensional Access Methods\u201d, ACM Computing Surveys, pp. 170-231, 1998.<\/li>\n<li>S. Khan, H. Rahmani, Syed Afaq Ali Shah, M. Bennamoun, G. Medioni, S. Medioni, \u201cA Guide to Convolutional Neural Networks for Computer Vision\u201d, Morgan &amp; Claypool, p. 207, 2018.<\/li>\n<li>Sibt ul Hussain, \u201cMachine Learning Methods for Visual Object Detection\u201d, p. 160, 2012.<\/li>\n<li>P. Arabie, L. J. Hubert, G. De Soete, \u201cClustering and Classification\u201d, p. 500, 1996.<\/li>\n<li>D. Parks, \u201cObject Detection and Analysis: A Coherency Filtering Approach\u201d, p. 172, 2008.<\/li>\n<li>Yongqiang Z., Chen Y., Seong G. K., Quan P., Yongmei C., \u201cMulti-band Polarization Imaging and Applications\u201d, 1st ed., p. 204, 2016.<\/li>\n<li>Manikandan S., \u201cVision Based Assistive System for Label and Object Detection\u201d, p. 64, 2015.<\/li>\n<li>Salma H., \u201cObject Detection Using Histogram of Gradients\u201d, p. 52, 2018.<\/li>\n<li>Wu J., \u201cAdvances in K-means Clustering: A Data Mining Thinking\u201d, Springer Science &amp; Business Media, p. 180, 2021.<\/li>\n<li>J. Loy, \u201cNeural Network Projects with Python: The ultimate guide to using Python to explore the true power of neural networks through six projects\u201d, Packt Publishing, p. 308, 2019.<\/li>\n<li>Brannon W. C., \u201cObject Detection in Low-spatial-resolution Aerial Imagery Using Convolutional Neural Networks\u201d, p. 79, 2019.<\/li>\n<li>Dataset: https:\/\/knowyourdata-tfds.withgoogle.com<\/li>\n<\/ol>\n<p><!--more--><\/p>\n","protected":false},"excerpt":{"rendered":"","protected":false},"author":3,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[66],"tags":[],"_links":{"self":[{"href":"https:\/\/journals.khnu.km.ua\/vestnik\/index.php?rest_route=\/wp\/v2\/posts\/13307"}],"collection":[{"href":"https:\/\/journals.khnu.km.ua\/vestnik\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/journals.khnu.km.ua\/vestnik\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/journals.khnu.km.ua\/vestnik\/index.php?rest_route=\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/journals.khnu.km.ua\/vestnik\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=13307"}],"version-history":[{"count":3,"href":"https:\/\/journals.khnu.km.ua\/vestnik\/index.php?rest_route=\/wp\/v2\/posts\/13307\/revisions"}],"predecessor-version":[{"id":15750,"href":"https:\/\/journals.khnu.km.ua\/vestnik\/index.php?rest_route=\/wp\/v2\/posts\/13307\/revisions\/15750"}],"wp:attachment":[{"href":"https:\/\/journals.khnu.km.ua\/vestnik\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=13307"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/journals.khnu.km.ua\/vestnik\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=13307"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/journals.khnu.km.ua\/vestnik\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=13307"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}