Red-green-blue to normalized difference vegetation index translation: a robust and inexpensive approach for vegetation monitoring using machine vision and generative adversarial networks

Aitazaz A. Farooque, Hassan Afzaal, Rachid Benlamri, Salem Al-Naemi, Evan MacDonald, Farhat Abbas, Kaelyn MacLeod, Hassan Ali

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)

Abstract

High-resolution multispectral imaging of agricultural fields is expensive but helpful in detecting subtle variations in plant health and stress symptoms before visible indications appear. To aid precision agriculture (PA) practices, an innovative and inexpensive protocol for robust and timely monitoring of vegetation symptoms has been evaluated. This protocol uses machine vision (MV) and generative adversarial networks (GANs) to translate red-green-blue (RGB) imagery captured with an unmanned aerial vehicle (UAV) into a valuable normalized difference vegetation index (NDVI) map. This study translated RGB imagery directly into an NDVI map, in contrast with similar studies that used GANs for near-infrared (NIR) translation. The protocol was tested by flying a fixed-wing UAV developed by senseFly Inc. (Cheseaux-sur-Lausanne, Switzerland), model Ebee-X, equipped with a RedEdge-MX sensor, to capture images from five different potato fields located in Prince Edward Island, Canada, during the 2021 growing season. The images were captured throughout the growing season under the vegetative (15–30 DAP; days after planting), tuber formation (30–45 DAP), tuber bulking (75–110 DAP), and tuber maturation (> 110 DAP) stages. The NDVI was calculated from the captured UAV aerial surveys using the NIR and red bands to develop pairwise datasets for training the GANs. Five hundred pairwise images were used (80% training, 10% validation, and 10% testing) for training and evaluation of the GANs. Two well-known GANs, Pix2Pix and Pix2PixHD, were compared using various training and evaluation indicators. Pix2PixHD outperformed Pix2Pix by recording a lower root mean square error (RMSE) (5.40 vs. 13.73) and a higher structural similarity index measure (SSIM) score (0.90 vs. 0.69) during the evaluation of the protocol.
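The NDVI targets used for the pairwise training data follow the standard band-ratio definition, NDVI = (NIR − Red) / (NIR + Red). A minimal sketch of that per-pixel computation (the `eps` guard and function name are illustrative, not from the paper):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Per-pixel normalized difference vegetation index.

    nir, red: reflectance arrays of the same shape (or scalars).
    eps guards against division by zero on dark pixels.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Healthy vegetation reflects strongly in NIR, so NDVI approaches +1;
# bare soil or stressed canopy yields values closer to 0.
print(ndvi(0.50, 0.10))  # ~0.667
```

The resulting NDVI maps, paired with the co-registered RGB frames, form the image-to-image translation dataset described above.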
Once the models are trained, the results of this study represent a breakthrough for economical vegetation and orchard health monitoring. The trained GANs can translate simple RGB imagery into useful vegetation index maps for variable-rate PA practices. This innovative protocol can also translate remote sensing imagery of large-scale agricultural fields and commercial orchards into NDVI to extract useful information about plant health indicators.
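The two evaluation indicators reported above, RMSE and SSIM, can be sketched as follows. This is a simplified illustration, not the paper's evaluation code: SSIM is computed here over a single global window with the conventional constants, whereas standard implementations use a sliding Gaussian window.

```python
import numpy as np

def rmse(pred, target):
    """Root mean square error between a translated and a reference NDVI map."""
    pred, target = np.asarray(pred, float), np.asarray(target, float)
    return float(np.sqrt(np.mean((pred - target) ** 2)))

def ssim_global(x, y, data_range=255.0):
    """Single-window (global) SSIM; 1.0 means identical images.

    Uses the usual stabilizing constants C1=(0.01*L)^2, C2=(0.03*L)^2.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    c1, c2 = (0.01 * data_range) ** 2, (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return float(((2 * mx * my + c1) * (2 * cov + c2)) /
                 ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2)))
```

A lower RMSE and an SSIM closer to 1.0 indicate that the GAN-translated map better matches the sensor-derived NDVI reference, which is the basis of the Pix2PixHD-versus-Pix2Pix comparison reported above.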

Original language: English
Pages (from-to): 1097-1115
Number of pages: 19
Journal: Precision Agriculture
Volume: 24
Issue number: 3
DOIs
Publication status: Published - Jun 2023
Externally published: Yes

Keywords

  • Conditional GANs
  • Deep learning
  • Image to image translation
  • Pix2Pix
  • Vegetation Indices
