Unmanned Aerial System (UAS)-Based Phenotyping of Soybean using Multi-sensor Data Fusion and Extreme Learning Machine


Estimating crop biophysical and biochemical parameters with high accuracy at low cost is imperative for high-throughput phenotyping in precision agriculture. Although fusion of data from multiple sensors is a common application in remote sensing, less is known about the contribution of low-cost RGB, multispectral, and thermal sensors to rapid crop phenotyping. This is because (1) simultaneous collection of multi-sensor data from satellites is rare, and (2) multi-sensor data collected during a single flight were not accessible until recent developments in Unmanned Aerial Systems (UASs) and UAS-friendly sensors made efficient information fusion possible. The objective of this study was to evaluate the power of high-spatial-resolution RGB, multispectral, and thermal data fusion for estimating soybean (Glycine max) biochemical parameters, including chlorophyll content and nitrogen concentration, and biophysical parameters, including Leaf Area Index (LAI) and above-ground fresh and dry biomass. Multiple low-cost sensors integrated on UASs were used to collect RGB, multispectral, and thermal images throughout the growing season at a site established near Columbia, Missouri, USA. From these images, vegetation indices were extracted, a Crop Surface Model (CSM) was generated, and a model to extract the vegetation fraction was developed. Spectral indices/features were then combined to model and predict crop biophysical and biochemical parameters using Partial Least Squares Regression (PLSR), Support Vector Regression (SVR), and Extreme Learning Machine based Regression (ELR) techniques.
Results showed that: (1) For biochemical variable estimation, multispectral and thermal data fusion provided the best estimates of nitrogen concentration and chlorophyll (Chl) a content (RMSE of 9.9% and 17.1%, respectively), while fusion of RGB color-based indices with multispectral data exhibited the largest RMSE (22.6%); the highest accuracy for Chl a + b content estimation was obtained by fusing information from all three sensors, with an RMSE of 11.6%. (2) Among the plant biophysical variables, LAI was best predicted by RGB and thermal data fusion, while multispectral and thermal data fusion was found to be best for biomass estimation. (3) For estimating these soybean plant traits from multi-sensor data fusion, ELR yielded promising results compared to PLSR and SVR in this study. This research indicates that fusion of data from multiple low-cost sensors within a machine learning framework can provide relatively accurate estimation of plant traits and valuable insight for high-throughput phenotyping in precision agriculture and plant stress assessment.
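The ELR step in the abstract can be illustrated with a minimal Extreme Learning Machine regression sketch: hidden-layer weights are drawn at random and frozen, and only the output weights are solved analytically by least squares. This is an assumption-laden toy example, not the authors' implementation; the three input features standing in for fused sensor data (a vegetation index, canopy height, and canopy temperature) and all hyperparameters are hypothetical.

```python
import numpy as np

def elm_fit(X, y, n_hidden=40, seed=0):
    """Fit a single-hidden-layer ELM regressor.

    Random input weights/biases are frozen; output weights are
    obtained in closed form via the least-squares pseudoinverse.
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (frozen)
    b = rng.normal(size=n_hidden)                 # random biases (frozen)
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # analytic output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy usage: recover a smooth nonlinear trait-feature relationship
# (e.g. LAI as a function of hypothetical fused-sensor features).
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(200, 3))    # e.g. NDVI, canopy height, temperature
y = np.sin(2 * X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.1 * X[:, 2]
W, b, beta = elm_fit(X, y)
y_hat = elm_predict(X, W, b, beta)
rmse = np.sqrt(np.mean((y - y_hat) ** 2))
```

Because training reduces to one linear solve, ELM is fast to refit across many plots and trait variables, which is one reason it suits high-throughput phenotyping pipelines.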


Civil, Architectural and Environmental Engineering


Funding for this work was provided by the National Science Foundation (IIA-1355406 and IIA-1430427), NASA (NNX15AK03H), and the Center for Sustainability at Saint Louis University.

Keywords and Phrases

Agriculture; Amino acids; Biophysics; Chlorophyll; Cost estimating; Costs; Crops; Data fusion; Knowledge acquisition; Learning systems; Least squares approximations; Nitrogen; Nitrogen fixation; Regression analysis; Remote sensing; Unmanned aerial vehicles (UAV); Vegetation; Extreme learning machine (ELM); High-throughput phenotyping; Multisensor data fusion; Nitrogen concentrations; Partial least squares regressions (PLSR); Phenotyping; Support vector regression (SVR); Unmanned aerial systems; Sensor data fusion; Agricultural modeling; Biochemistry; Biomass; Data assimilation; Leaf area index; Machine learning; Phenotype; Satellite imagery; Sensor; Soybean; Spatial resolution; Statistical analysis; Unmanned vehicle; Columbia [Missouri]; Missouri; Glycine max; Extreme learning machine based regression (ELR); Unmanned aerial system (UAS)

Document Type

Article - Journal

© 2017 Elsevier. All rights reserved.

Publication Date

01 Dec 2017