Estimating Grass Sward Quality and Quantity Parameters Using Drone Remote Sensing with Deep Neural Networks
Access rights
openAccess
publishedVersion
URL
Journal Title
Journal ISSN
Volume Title
A1 Original article in a scientific journal
This publication is imported from Aalto University research portal.
View publication in the Research portal
View/Open full text file from the Research portal
Other link related to publication
Date
2022-06-01
Department
Major/Subject
Mcode
Degree programme
Language
en
Pages
21
Series
Remote Sensing, Volume 14, Issue 11, pp. 1-21
Abstract
The objective of this study is to investigate the potential of novel neural network architectures for measuring the quality and quantity parameters of silage grass swards, using drone RGB and hyperspectral images (HSI), and to compare the results with the random forest (RF) method and handcrafted features. The parameters included fresh and dry biomass (FY, DMY), the digestibility of organic matter in dry matter (D-value), neutral detergent fiber (NDF), indigestible neutral detergent fiber (iNDF), water-soluble carbohydrates (WSC), nitrogen concentration (Ncont) and nitrogen uptake (NU); datasets from spring and summer growth were used. Deep pre-trained neural network architectures, the VGG16 and the Vision Transformer (ViT), and simple 2D and 3D convolutional neural networks (CNN) were studied. In most cases, the neural networks outperformed RF. The normalized root-mean-square errors (NRMSE) of the best models, evaluated on an independent test dataset, were 19% (2104 kg/ha) for FY, 21% (512 kg DM/ha) for DMY, 1.2% (8.6 g/kg DM) for D-value, 12% (5.1 g/kg DM) for iNDF, 1.1% (6.2 g/kg DM) for NDF, 10% (10.5 g/kg DM) for WSC, 9% (2 g N/kg DM) for Ncont, and 22% (11.9 kg N/ha) for NU. The RGB data provided good results, particularly for FY, DMY, WSC and NU. The HSI datasets provided advantages for some parameters. The ViT and VGG provided the best results with the RGB data, whereas the simple 3D-CNN was the most consistent with the HSI data.
Description
Funding Information: This research was funded by Academy of Finland ICT 2023 Smart-HSI—"Smart hyperspectral imaging solutions for new era in Earth and planetary observations" (Decision no. 335612), by the European Agricultural Fund for Rural Development: Europe investing in rural areas, Pohjois-Savon Ely-keskus (Grant no. 145346) and by the European Regional Development Fund for "CyberGrass I—Introduction to remote sensing and artificial intelligence assisted silage production" project (ID 20302863) in European Union Interreg Botnia-Atlantica programme. This research was carried out in affiliation with the Academy of Finland Flagship "Forest-Human-Machine Interplay—Building Resilience, Redefining Value Networks and Enabling Meaningful Experiences (UNITE)" (Decision no. 337127) ecosystem.
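The abstract above reports model accuracy as normalized root-mean-square error (NRMSE) alongside absolute errors. As a minimal illustrative sketch, not the authors' code, the snippet below computes RMSE and normalizes it by the range of the reference measurements; the paper may instead normalize by, for example, the mean of the reference values, and the function name and example values are hypothetical.

import numpy as np

def nrmse_percent(y_true, y_pred):
    """RMSE expressed as a percentage of the reference-value range."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    rmse = np.sqrt(np.mean((y_pred - y_true) ** 2))
    return 100.0 * rmse / (y_true.max() - y_true.min())

# Hypothetical dry matter yield values (kg DM/ha), for illustration only.
reference = [1800.0, 2400.0, 3100.0, 4200.0]
predicted = [1950.0, 2300.0, 3350.0, 4000.0]
print(f"NRMSE = {nrmse_percent(reference, predicted):.1f}%")

Normalizing the RMSE in this way makes errors comparable across parameters with very different units and magnitudes, which is presumably why the abstract reports NRMSE together with the absolute errors in parentheses.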
Keywords
CNN, drone, grass sward, hyperspectral, image transformer, remote sensing, RGB, silage production
Other note
Citation
Karila, K, Oliveira, R A, Ek, J, Kaivosoja, J, Koivumäki, N, Korhonen, P, Niemeläinen, O, Nyholm, L, Näsi, R, Pölönen, I & Honkavaara, E 2022, 'Estimating Grass Sward Quality and Quantity Parameters Using Drone Remote Sensing with Deep Neural Networks', Remote Sensing, vol. 14, no. 11, 2692, pp. 1-21. https://doi.org/10.3390/rs14112692