A Comparative Analysis of Convolutional Neural Networks for Trash Classification
Abstract
Automation is one of the most active research areas of the twenty-first century. Although much has been accomplished, a great deal remains to be done. Growing volumes of trash could become a serious threat to human well-being in the future. To automate waste handling, machines must recognize the type of trash so that recyclable items can be identified and similar items grouped for recycling. Accurate trash classification therefore plays a vital role in building a better life and a cleaner world. Many popular convolutional neural network (CNN) models exist for image classification. In this work, we examine and analyze the performance of several Residual Network (ResNet) and Visual Geometry Group (VGG) CNN models on a trash dataset. Our main investigation is an accuracy evaluation of the different VGG and ResNet models, with the training set, test set, number of epochs, and batch size held identical for all models. Finally, we compare the VGG and ResNet models with one another. ResNet152 achieves the highest accuracy among the ResNet models, and VGG16 achieves the highest accuracy among the VGG models. Overall, ResNet152 attains the maximum accuracy across all ResNet and VGG models, approximately 94%.
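As a minimal, illustrative sketch (not the authors' published code), the following PyTorch/torchvision script shows how such a fixed-setting comparison could be set up: every model sees the same preprocessing, training set, test set, epoch count, and batch size, so only the architecture varies. The dataset paths, class count, epoch count, and learning rate are assumptions, and ImageNet-pretrained weights are assumed as the starting point.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_CLASSES = 6   # assumption: six trash categories (e.g., glass, paper, plastic, ...)
BATCH_SIZE = 32   # held fixed across all models, as in the paper
EPOCHS = 10       # illustrative; the paper fixes one value for every model
DEVICE = "cuda" if torch.cuda.is_available() else "cpu"

# Identical preprocessing for every model so only the architecture differs.
tfm = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("trash/train", transform=tfm)  # hypothetical path
test_set = datasets.ImageFolder("trash/test", transform=tfm)    # hypothetical path
train_loader = DataLoader(train_set, batch_size=BATCH_SIZE, shuffle=True)
test_loader = DataLoader(test_set, batch_size=BATCH_SIZE)

def build(name):
    """Load an ImageNet-pretrained backbone and replace its classifier head."""
    net = getattr(models, name)(weights="DEFAULT")
    if name.startswith("resnet"):
        net.fc = nn.Linear(net.fc.in_features, NUM_CLASSES)
    else:  # VGG variants keep their classifier as an nn.Sequential
        net.classifier[6] = nn.Linear(net.classifier[6].in_features, NUM_CLASSES)
    return net.to(DEVICE)

def evaluate(net):
    """Top-1 accuracy on the shared test set."""
    net.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in test_loader:
            pred = net(x.to(DEVICE)).argmax(1).cpu()
            correct += (pred == y).sum().item()
            total += y.numel()
    return correct / total

for name in ["vgg11", "vgg16", "vgg19", "resnet18", "resnet50", "resnet152"]:
    net = build(name)
    opt = torch.optim.Adam(net.parameters(), lr=1e-4)  # assumed optimizer settings
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(EPOCHS):
        net.train()
        for x, y in train_loader:
            opt.zero_grad()
            loss_fn(net(x.to(DEVICE)), y.to(DEVICE)).backward()
            opt.step()
    print(f"{name}: test accuracy = {evaluate(net):.4f}")
```

Keeping the data pipeline and training loop identical across architectures is what makes the resulting accuracy figures directly comparable between the VGG and ResNet families.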
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.