TY - JOUR
T1 - Red-green-blue to normalized difference vegetation index translation
T2 - a robust and inexpensive approach for vegetation monitoring using machine vision and generative adversarial networks
AU - Farooque, Aitazaz A.
AU - Afzaal, Hassan
AU - Benlamri, Rachid
AU - Al-Naemi, Salem
AU - MacDonald, Evan
AU - Abbas, Farhat
AU - MacLeod, Kaelyn
AU - Ali, Hassan
N1 - Publisher Copyright:
© 2023, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2023/6
Y1 - 2023/6
N2 - High-resolution multispectral imaging of agricultural fields is expensive but helpful in detecting subtle variations in plant health and stress symptoms before visible indications appear. To aid precision agriculture (PA) practices, an innovative and inexpensive protocol for robust and timely monitoring of vegetation symptoms was evaluated. The protocol uses machine vision (MV) and generative adversarial networks (GANs) to translate red-green-blue (RGB) imagery captured with an unmanned aerial vehicle (UAV) into a valuable normalized difference vegetation index (NDVI) map. This study translated RGB imagery directly into the NDVI, in contrast with similar studies that used GANs for near-infrared (NIR) translation. The protocol was tested by flying a fixed-wing UAV (model Ebee-X, senseFly Inc., Cheseaux-sur-Lausanne, Switzerland) equipped with a RedEdge-MX sensor to capture images from five potato fields in Prince Edward Island, Canada, during the 2021 growing season. Images were captured throughout the season at the vegetative (15–30 DAP; days after planting), tuber formation (30–45 DAP), tuber bulking (75–110 DAP), and tuber maturation (> 110 DAP) stages. The NDVI was calculated from the captured UAV aerial surveys using the NIR and red bands to build pairwise datasets for GAN training. Five hundred pairwise images were used (80% training, 10% validation, and 10% testing) for training and evaluation of the GANs. Two well-known GANs, Pix2Pix and Pix2PixHD, were trained and compared using various training and evaluation indicators. Pix2PixHD outperformed Pix2Pix by recording a lower root mean square error (RMSE; 5.40 to 13.73) and a higher structural similarity index measure (SSIM) score (0.69 to 0.90) during evaluation of the protocol. 
The results of this study enable economical vegetation and orchard health monitoring once the models are trained. The trained GANs can translate simple RGB imagery into useful vegetation index maps for variable rate PA practices. The protocol can also translate remote sensing imagery of large-scale agricultural fields and commercial orchards into NDVI maps to extract useful information about plant health indicators.
KW - Conditional GANs
KW - Deep learning
KW - Image-to-image translation
KW - Pix2Pix
KW - Vegetation indices
UR - http://www.scopus.com/inward/record.url?scp=85149362763&partnerID=8YFLogxK
U2 - 10.1007/s11119-023-10001-3
DO - 10.1007/s11119-023-10001-3
M3 - Article
AN - SCOPUS:85149362763
SN - 1385-2256
VL - 24
SP - 1097
EP - 1115
JO - Precision Agriculture
JF - Precision Agriculture
IS - 3
ER -