A New Paradigm for Waste Classification Based on YOLOv5
Author: Mohammed SAJID 1, Nimali T MEDAGEDARA 2
Affiliation:

1. Department of Mechanical Engineering, University of the West of England;
2. Department of Mechanical Engineering, Faculty of Engineering Technology, The Open University of Sri Lanka, Nugegoda, Sri Lanka


Abstract:

Garbage must be classified before it can be physically sorted, and this process helps manage waste while maintaining a pollution-free environment. Systems capable of segregating garbage are on the rise, but efficient and accurate segmentation and recognition mechanisms continue to draw the attention of researchers. A computer vision approach that classifies garbage into its respective recyclable categories could be an effective and economical way of processing waste. This project focused on capturing real-time images of a single piece of garbage and classifying it into one of three categories: paper, metal, or biodegradable (food) waste. Each garbage class contains around 2000 images drawn from an open-source dataset, images collected from Google, and personally collected custom images. The developed intelligent models demonstrate the effectiveness of machine and deep learning for classification with structured and unstructured data. The model used was YOLOv5, a Convolutional Neural Network (CNN). The project showcased a vision-based approach that maintained an accuracy of 61%. The CNN was not trained to its full capacity because of the difficulty of finding optimal hyperparameters, as most of the images were gathered from Google Images.
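As a back-of-envelope illustration of the evaluation described above — a hypothetical sketch, not code from the paper — the reported 61% figure is a top-1 accuracy over the three waste classes. Assuming the trained model emits one class index per image, accuracy is simply the fraction of predictions matching the ground-truth labels; the class names and the helper below are illustrative only:

```python
# Illustrative sketch (not from the paper): the three waste classes and a
# top-1 accuracy computation over predicted vs. ground-truth class indices.

CLASSES = ["paper", "metal", "biodegradable"]  # class order is an assumption


def top1_accuracy(predictions, labels):
    """Return the fraction of predicted class indices that match the labels."""
    if not labels:
        return 0.0
    correct = sum(p == t for p, t in zip(predictions, labels))
    return correct / len(labels)


# Toy example: 3 of 5 predictions are correct, giving 0.6 (i.e. 60%)
preds = [0, 1, 2, 2, 0]
truth = [0, 1, 1, 2, 1]
print(top1_accuracy(preds, truth))  # 0.6
```

In practice the predictions would come from running YOLOv5 inference over a held-out test split, but the metric itself reduces to this per-image comparison.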

Get Citation

Mohammed SAJID, Nimali T MEDAGEDARA. A New Paradigm for Waste Classification Based on YOLOv5 [J]. Instrumentation, 2021, 8(4): 9-17.

History
  • Online: May 26,2022
License
  • Copyright (c) 2023 by the authors. This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.