
Field-based rapid phenotyping with Unmanned Aerial Vehicles (UAV)

Eileen M. Perry1, Jason Brand2, Surya Kant3 and Glenn J. Fitzgerald4

1 Future Farming Systems Research Division, Department of Primary Industries. Email eileen.perry@dpi.vic.gov.au
2 Future Farming Systems Research Division, Department of Primary Industries. Email jason.brand@dpi.vic.gov.au
3 Biosciences Research Division, Department of Primary Industries. Email surya.kant@dpi.vic.gov.au
4 Future Farming Systems Research Division, Department of Primary Industries. Email glenn.fitzgerald@dpi.vic.gov.au

Abstract

Improving the phenotype (i.e. observable traits related to morphology, development and physiology) is the ultimate aim of plant breeders and biologists, and is essential to ensure improved productivity of cereal, pulse and oilseed crops in Australia. Traditional field-based crop phenotyping methods can be slow and destructive (e.g. biomass estimates) or variable (e.g. visual disease ratings or flowering time estimates). Various sensor-based crop monitoring methods have been used at our research facility; the equipment can be deployed on motorbikes, used handheld, or installed in full-sized aircraft. However, there is a need for techniques that rapidly acquire data at high spatial and temporal resolution during critical crop development periods. Because of cost and logistical issues, a full-size aircraft-based imaging system with multispectral and thermal infrared sensors is being replaced with miniaturized sensors that can be flown on an unmanned aerial vehicle (UAV). The system is being designed for flights of less than 30 minutes' duration covering large-plot and paddock-scale research experiments, and will include an autopilot system to guide the UAV on planned circuits. The objective is to deliver crop information such as drought tolerance, senescence progression, percent ground cover, biomass estimates and canopy nitrogen amounts, and to use this information for trait selection and confirmation under field conditions.

Key Words

Unmanned aerial vehicle, remote sensing, sensors, phenotyping

Introduction

Identification of novel agronomic and phenotypic traits for higher crop yield is critical to ensure improved agricultural productivity. The phenotype of a plant is the outcome of the interaction between its genetic make-up and environmental factors. Improving the plant phenotype (i.e. observable morphological traits related to growth, development and physiology) is the ultimate aim of plant biologists seeking to improve crop productivity. Field-based phenotyping is still mostly based on traditional methods that can be time consuming and destructive (e.g. biomass estimates) or variable (e.g. visual disease scoring or flowering time estimates).

Sensor-based crop phenotyping methods have been deployed for our experiments on a variety of platforms: motorbikes and tractors, handheld above the canopy, and flown in full-sized aircraft. The airborne system has provided multispectral and thermal imagery at a spatial resolution of 0.5 m or better (Tilling et al. 2007). However, the logistics and cost of deploying a full-size aircraft during critical crop development periods (e.g., late winter in Victoria) can negatively impact research. To overcome these cost and logistical issues, the full-size aircraft-based imaging system is being replaced with an unmanned aerial vehicle (UAV) equipped with miniaturized sensors (Figure 1). While UAV platforms that provide mainly digital photographs are available (e.g., CropCam, Manitoba, Canada), the new miniaturized sensors are an enabling technology that extends the use of UAVs to acquiring calibrated reflectance and temperature measurements of crops. The miniaturized sensors and UAV system are not available as a ‘turnkey’ package, so we face the task of integrating the sensors, on-board computing and UAV platform ourselves. To our knowledge, this will be the first UAV system in Australia designed to collect high resolution thermal infrared imagery for crop research. The system will be similar to that described by Zarco-Tejada et al. (2012), which is used for crop research in Spain.

Methods

Design of the UAV system

The system is being designed for short-duration flights (less than 30 minutes) covering large-plot (e.g., plots greater than 1 m per side) and paddock-scale (e.g., 5 ha) research trials, and will include an autopilot system to guide the UAV on planned circuits. The system will carry multispectral and thermal infrared cameras that simultaneously acquire images over the targets. The multispectral imaging sensor (Tetracam, Chatsworth, California, USA) has six bands covering visible and near infrared wavelengths (Table 1). Each of the six spectral bands has a separate lens and removable filter, and produces a 1280 x 1024 pixel image stored on compact flash memory on the camera. The thermal infrared sensor (Thermoteknix Systems Ltd, Cambridge, UK) measures radiometric surface temperature in the 8 to 12 μm range and produces digital video at 25 frames per second with a frame resolution of 640 x 480 pixels. The total payload is slightly less than 5 kg, including the two camera systems, the computer system, guidance, and the batteries to power the equipment and UAV. The fixed-wing airframe is being designed and built for this specific sensor payload, and is constructed of expanded polystyrene (EPS) foam with carbon fibre and wood structural reinforcements for the wings.

Figure 1. The new UAV system is designed to carry a six band multispectral camera (left panel) and a thermal infrared camera (right panel).

Table 1. Specifications of the multispectral and thermal infrared cameras.

Sensor: Multispectral camera
Spectral wavelengths: 470, 550, 660, 710, 730, 810 nm (each band 10 nm wide)
Image dimensions: 1280 x 1024 pixels
Lens: 8.5 mm; field of view approx. 80° x 68°
Output: 8 or 10 bit; individual frames or video

Sensor: Thermal infrared camera
Spectral wavelengths: 8-12 μm
Image dimensions: 640 x 480 pixels
Lens: 14.95 mm (f/1.3); field of view 56.3° x 43.7°
Output: digital video (USB 2.0); 25 frames per second, 14 bits per pixel
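The camera specifications above determine the ground resolution achievable at a given flying height. As a rough check (the 100 m flying height here is an assumed figure for illustration, not part of the system design), the across-track footprint and ground sample distance (GSD) can be estimated from the field of view and pixel count with a simple pinhole-camera model:

```python
import math

def ground_sample_distance(altitude_m, fov_deg, pixels):
    """Approximate across-track ground footprint and per-pixel ground size
    for a nadir-pointing camera over flat terrain (pinhole model)."""
    footprint = 2 * altitude_m * math.tan(math.radians(fov_deg / 2))
    return footprint, footprint / pixels

# Hypothetical 100 m flying height (not specified in the paper)
ms_footprint, ms_gsd = ground_sample_distance(100, 80.0, 1280)   # multispectral
tir_footprint, tir_gsd = ground_sample_distance(100, 56.3, 640)  # thermal

print(f"multispectral: {ms_footprint:.0f} m swath, {ms_gsd*100:.0f} cm/pixel")
print(f"thermal:       {tir_footprint:.0f} m swath, {tir_gsd*100:.0f} cm/pixel")
```

Under these assumptions both cameras resolve well under 0.5 m per pixel, i.e. comfortably finer than the full-size aircraft system they replace.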

From imagery to data

During each flight, in-scene ground control targets will be used to relate image pixels to surface reflectance (multispectral camera) and canopy temperature (thermal infrared camera). Post-processing of the images will include geo-referencing with the onboard GPS, and calibration to reflectance and temperature. Various vegetation indices and other analyses will then be applied to the corrected imagery to determine information about current crop condition (Table 2).
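One common way to use such in-scene targets is an empirical line correction: panels of known reflectance are imaged in each scene, and a per-band gain and offset are fitted to map raw digital numbers (DN) to surface reflectance. A minimal sketch, assuming hypothetical panel values (the DN and reflectance figures below are illustrative, not measurements from this system):

```python
def fit_empirical_line(dns, reflectances):
    """Least-squares fit of reflectance = gain * DN + offset,
    using ground control targets of known reflectance."""
    n = len(dns)
    mean_x = sum(dns) / n
    mean_y = sum(reflectances) / n
    gain = (sum((x - mean_x) * (y - mean_y) for x, y in zip(dns, reflectances))
            / sum((x - mean_x) ** 2 for x in dns))
    offset = mean_y - gain * mean_x
    return gain, offset

# Illustrative targets: dark, grey and bright calibration panels in one band
panel_dns = [120, 510, 930]       # raw camera digital numbers (hypothetical)
panel_refl = [0.04, 0.25, 0.48]   # known panel reflectances (hypothetical)

gain, offset = fit_empirical_line(panel_dns, panel_refl)

def to_reflectance(dn):
    """Convert a raw DN to surface reflectance using the fitted line."""
    return gain * dn + offset
```

Each spectral band would get its own fit, and the thermal imagery would be calibrated analogously against surface temperature references.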

Table 2. Examples of remote sensing vegetation indices correlated with crop phenotypic information.

Index: Normalized Difference Vegetation Index (NDVI)
Phenotypic information: green fractional cover, senescence progression
Reference: Fitzgerald (2010)

Index: Canopy Chlorophyll Content Index (CCCI)
Phenotypic information: canopy nitrogen assessment
Reference: Perry et al. (2012); Fitzgerald et al. (2010)

Index: Crop Water Stress Index (CWSI)
Phenotypic information: crop water stress, drought tolerance
Reference: Abuzar et al. (2009)

Discussion

Current status

We have performed initial testing of the multispectral camera and have completed initial integration of the thermal infrared camera and on-board computer. System integration of the UAV platform and sensor components is scheduled for completion by September 2012, after which the complete UAV system with imaging sensors installed will be deployed for field crop phenotyping. Initially the system will be tested in conjunction with the Australian Grains Free Air CO2 Enrichment (AgFACE) experiment near Horsham, VIC (Mollah et al. 2011). Later in 2012, the system will be used for field experiments in horticultural crops.

Challenges

While an operational UAV will certainly improve deployment time, weather and sky conditions during winter and spring can still limit image acquisition. The strategy is to take advantage of operational ‘windows’ that would be too short to risk the expense of deploying a full-size aircraft. Another challenge, given successful image acquisition, is the volume of data generated. The six-band multispectral camera generates images of approximately 8 megabytes each, while the thermal imagery is acquired as digital video files of approximately 7 megabytes per minute. For the planned experiments, image processing will therefore focus on selecting frames acquired directly over the desired plots, rather than producing mosaics of large areas.
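The per-image and per-minute figures above give a rough upper bound on the data volume from a maximum-length flight. The capture interval in this sketch is an assumption for illustration; the paper does not specify one:

```python
def flight_data_mb(flight_min, capture_interval_s, ms_image_mb=8, tir_mb_per_min=7):
    """Approximate data volume for one flight: multispectral frames captured
    at a fixed interval, plus continuous thermal video."""
    n_frames = (flight_min * 60) // capture_interval_s
    return n_frames * ms_image_mb + flight_min * tir_mb_per_min

# 30-minute flight with one multispectral capture every 10 s (assumed interval)
total = flight_data_mb(30, 10)
print(f"{total} MB per flight")   # 180 frames * 8 MB + 30 min * 7 MB/min = 1650 MB
```

Even under these modest assumptions a single flight approaches 2 GB, which motivates selecting frames over plots rather than mosaicking everything.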

Conclusion

The use of UAV systems for field-based crop phenotyping is novel, but may become an important tool for crop trait characterization under field conditions. Imaging sensor technology should continue to improve in the future, providing even higher quality imagery from sensors that are smaller and lighter. This will, in turn, reduce the size of the UAV platform required. Ultimately, we hope that thermal infrared and multispectral sensors integrated with UAV systems will be commercially available to researchers and advisors providing crop information to growers.

References

Abuzar M, O'Leary G and Fitzgerald GJ (2009). Measuring water stress in a wheat crop on a spatial scale using airborne thermal and multispectral imagery. Field Crops Research 112, 55-65.

Fitzgerald G, Rodriguez D and O'Leary G (2010). Measuring and predicting canopy nitrogen nutrition in wheat using a spectral index - the canopy chlorophyll content index (CCCI). Field Crops Research 116, 318-324.

Fitzgerald GJ (2010). Comparison of vegetation indices from active and passive sensors. International Journal of Remote Sensing 31, 4335 - 4348.

Mollah M, Partington D and Fitzgerald G (2011). Understand distribution of carbon dioxide to interpret crop growth data: Australian grains free-air carbon dioxide enrichment experiment. Crop & Pasture Science 62, 883-891.

Perry EM, Fitzgerald GJ, Nuttall JG, O’Leary GJ, Schulthess U and Whitlock A (2012). Rapid Estimation of Canopy Nitrogen of Cereal Crops at Paddock Scale Using a Canopy Chlorophyll Content Index. Field Crops Research, accepted.

Tilling AK, O'Leary GJ, Ferwerda JG, Jones SD, Fitzgerald GJ, Rodriguez D and Belford R (2007). Remote sensing of nitrogen and water stress in wheat. Field Crops Research 104, 77-85.

Zarco-Tejada PJ, González-Dugo V and Berni JAJ (2012). Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera. Remote Sensing of Environment 117, 322-337.
