Using deep learning methods for visualization and detection of malware
DOI: https://doi.org/10.30837/rt.2025.4.223.09

Keywords: deep learning, convolutional neural network, malware, visualization, PE file, transfer learning, cybersecurity

Abstract
The purpose of this article is a comparative analysis of the effectiveness of common convolutional neural network architectures for classifying malware that has been converted into graphic images. The approach is based on converting executable PE files into grayscale images, which are then classified using transfer learning. The paper experimentally compares seven architectures (XceptionNet, DenseNet169, EfficientNetB0, MobileNetV2, InceptionV3, ResNet50V2, and VGG16) on the Accuracy, Precision, Recall, and F1-score metrics to determine the most effective and computationally balanced model for threat analysis.
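The byte-to-image conversion described above can be sketched as follows. This is a minimal illustration, not the authors' exact pipeline; the fixed image width of 256 and zero-padding of the last row are assumed parameters.

```python
import math
import numpy as np
from PIL import Image

def bytes_to_grayscale(path: str, width: int = 256) -> Image.Image:
    """Read a binary (e.g. PE) file and reshape its raw bytes into a
    2-D grayscale image of fixed width, zero-padding the last row."""
    data = np.fromfile(path, dtype=np.uint8)   # each byte becomes one pixel, 0-255
    height = math.ceil(len(data) / width)
    padded = np.zeros(width * height, dtype=np.uint8)
    padded[:len(data)] = data
    return Image.fromarray(padded.reshape(height, width), mode="L")
```

The resulting image height varies with file size, so a resizing step is typically applied before feeding the images to a fixed-input CNN.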
The article will be useful to cybersecurity and machine-learning specialists involved in developing intelligent systems for detecting zero-day and polymorphic threats.
References
Brosolo M. et al. Security Through the Eyes of AI: How Visualization is Shaping Malware Detection. arXiv:2505.07574. 2025. Available: https://doi.org/10.48550/arXiv.2505.07574.
Shahnawaz M. et al. Dynamic Malware Classification of Windows PE Files using CNNs and Greyscale Images Derived from Runtime API Call Argument Conversion. arXiv:2505.24231. 2025. Available: https://doi.org/10.48550/arXiv.2505.24231.
Bensaoud A. et al. A Survey of Malware Detection Using Deep Learning. arXiv:2407.19153. 2024. Available: https://doi.org/10.48550/arXiv.2407.19153.
Ashawa M., Owoh N., Hosseinzadeh S., Osamor J. Enhanced Image-Based Malware Classification Using Transformer-Based Convolutional Neural Networks (CNNs) // Electronics. 2024. Vol. 13(20). P. 4081. Available: https://doi.org/10.3390/electronics13204081.
Kim H., Kim M. Malware Detection and Classification System Based on CNN-BiLSTM // Electronics. 2024. Vol. 13. P. 2539. Available: https://doi.org/10.3390/electronics13132539.
Huang G., Liu Z., Van Der Maaten L., Weinberger K. Q. Densely Connected Convolutional Networks // 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA. 2017. P. 2261–2269. DOI: 10.1109/CVPR.2017.243.
Simonyan K., Zisserman A. Very Deep Convolutional Networks for Large-Scale Image Recognition // Proceedings of the 3rd International Conference on Learning Representations (ICLR). 2015. Available: https://doi.org/10.48550/arXiv.1409.1556.
He K., Zhang X., Ren S., Sun J. Identity Mappings in Deep Residual Networks // Proceedings of the European Conference on Computer Vision (ECCV). 2016. Available: https://doi.org/10.48550/arXiv.1603.05027.
Chollet F. Xception: Deep Learning with Depthwise Separable Convolutions // 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA. 2017. P. 1800–1807. DOI: 10.1109/CVPR.2017.195.
MaleVis Dataset Home Page. Available: https://web.cs.hacettepe.edu.tr/~selman/malevis/.
ESultanik/bin2png: A simple cross-platform script for encoding any binary file into a lossless PNG. GitHub. Available: https://github.com/ESultanik/bin2png.
Fediushyn O. I., Khyzhniak K. M. CNNs and Their Use for Malware Classification // Proceedings of the Eleventh International Scientific and Technical Conference "Problems of Informatization", Kharkiv, 16–17 November 2023. P. 49.
License
Authors who publish with this journal agree to the following terms:
1. Authors retain copyright and grant the journal right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgement of the work's authorship and initial publication in this journal.
2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgement of its initial publication in this journal.
3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).


