
Fallow Management affects the Risk of Deep Water Loss

Kirsten Verburg, Warren J. Bond and Chris J. Smith

CSIRO Land and Water / APSRU, GPO Box 1666, Canberra, ACT 2601.
Email kirsten.verburg@csiro.au, warren.bond@csiro.au, and chris.j.smith@csiro.au

Abstract

In southern Australia a fallow period provides an opportunity to store water for the subsequent crop. Experimental data are presented that show that fallow management, in particular that of weeds and residue cover, not only affects the amount of water stored or lost during the summer, but also has an effect on the loss of water past the root zone during the subsequent growing season. Model simulations capture these effects and a scenario analysis indicates that retaining residues past sowing increases the risk of deep water loss relatively rapidly. This suggests that managing weeds and residues according to seasonal conditions has the potential to balance the agronomic benefits and environmental impacts of water storage.

Media summary

Analysis of experimental data and model simulations shows that summer fallow management in dryland cropping affects the risk of deep water loss.

Key Words

stubble management, dryland cropping, APSIM, farming systems, water balance, deep drainage

Introduction

In southern Australia annual crops such as spring wheat (Triticum aestivum) and canola (Brassica napus) are grown over winter to make use of the relatively short period when rainfall exceeds potential evapotranspiration. Highly variable growing season rainfall means that these systems often rely on soil water conserved during the preceding short summer fallow (December – May) or long fallow (August/September – May). While summer weeds are sometimes retained for grazing, the fallow period is also used to break the cycles of leaf and root disease, which requires strict control of weeds. A combination of weed control and residue retention maximises soil water storage. In wetter years, however, this increases the loss of water and nutrients beyond the roots of crops and pastures. Although research on residue retained (no tillage) systems has focussed on their benefit in storing soil water, their impact on increasing deep water loss (deep drainage) has also been observed (O’Leary 1996; Turpin et al. 1998; Kirkegaard et al. 2001). Minimising deep water loss is required to avoid land degradation and dryland salinity (Keating et al. 2002; Williams and Gascoigne 2003).

In this paper we explore the relation between fallow management and the risk of deep water loss through analysis of experimental data and model simulations. We focus on summer fallows in the cropping phase for a temperate climate with a mean annual rainfall of 558 mm, 63% of which falls between April and October (Wagga Wagga, NSW, Australia). While many of the observations will apply equally to drier and/or more Mediterranean climates, the magnitudes of certain effects may differ.

Materials and methods

Soil water content was monitored in four paddocks (all Red Kandosol soils) at three sites near Wagga Wagga, NSW between 1998 and 2000. Measurements were made to a depth of 3 or 6 m with the neutron moisture meter method at intervals of 2 to 6 weeks. Net summer water storage or loss within the root zone (1.3 m), and annual drainage loss from it, were calculated. One of the paddocks (at Charles Sturt University, Wagga Wagga) had two replicate weighing lysimeters (‘north’ and ‘south’) allowing direct assessment of evapotranspiration (Et) and drainage. The lysimeters were managed to represent the conditions of the field as closely as possible, but had 90-95% of crop residues removed when biomass was measured at harvest.
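
The bookkeeping behind these estimates can be sketched as follows. This is a minimal illustration only: the function names are ours, and drainage is estimated here as a simple water balance residual assuming negligible runoff, not with the exact procedure used in the study.

```python
# Illustrative root-zone water balance bookkeeping (our sketch, not the study's code).
# S = total soil water (mm) in the root zone (0-1.3 m) from neutron probe readings.

def net_summer_change(s_start, s_end):
    """Net water stored (positive) or lost (negative) over the fallow, in mm."""
    return s_end - s_start

def drainage_residual(rain, et, s_start, s_end):
    """Deep water loss past the root zone as the water balance residual,
    D = P - Et - (S_end - S_start), assuming runoff is negligible."""
    return rain - et - (s_end - s_start)

# Invented example numbers: 300 mm rain, 250 mm Et, storage rising by 24 mm.
print(drainage_residual(rain=300.0, et=250.0, s_start=180.0, s_end=204.0))  # 26.0
```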

Simulations of the lysimeter data reflected the experimental history of annual crops and a 4-year lucerne phase. They were carried out with the APSIM model (Agricultural Production Systems Simulator; Keating et al. 2003; version 2.1). Configuration and parameterisation details were given by Verburg and Bond (2003). Rainfall for the simulations was obtained directly from the continuous lysimeter output when possible. In addition, long-term scenarios were set up to reflect a continuous wheat cropping system. These simulations used historical weather data (1957-2003) obtained from the SILO Patched Point Dataset (Jeffrey et al. 2001) for the nearby Australian Bureau of Meteorology station 73127. Wheat was sown in response to rainfall within a sowing window (1 May – 15 June), or sown “dry” on 15 June if no sowing opportunity arose. Simulation scenarios were run for four years and repeated 44 times, starting in each year from 1957 to 2000. The first three years of each simulation were identical across scenarios to minimise initialisation effects; fallow management was varied after the third year and its effect evaluated during that fallow and the subsequent growing season.
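
The scenario structure can be illustrated with a short sketch. The actual sowing rule was implemented within APSIM; the function interface below and the 15 mm rainfall trigger are our assumptions for illustration only.

```python
from datetime import date, timedelta

START_YEARS = range(1957, 2001)   # 44 overlapping 4-year runs, one per start year

def sowing_date(daily_rain, year, trigger_mm=15.0):
    """Return the sowing date for `year`: the first day in the 1 May - 15 June
    window with rainfall >= trigger_mm (the trigger value is our assumption),
    otherwise sow 'dry' on 15 June. `daily_rain` maps date -> rainfall (mm)."""
    d, end = date(year, 5, 1), date(year, 6, 15)
    while d <= end:
        if daily_rain.get(d, 0.0) >= trigger_mm:
            return d
        d += timedelta(days=1)
    return end

# Example: a single 12 mm event on 20 May 1960 does not meet the trigger,
# so the crop is sown dry on 15 June.
print(sowing_date({date(1960, 5, 20): 12.0}, 1960))  # 1960-06-15
print(len(START_YEARS))                               # 44
```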

Experimental results

Field evidence

Deep water loss measured in the four paddocks was most affected by growing season rainfall and soil water deficit at sowing (Fig. 1). While it is not possible to control rainfall, the soil water deficit at sowing can be managed. It depends on the soil water deficit left by the preceding crop and on the storage or loss of water during summer. The former is optimised by good management of the crop, while the latter is affected by summer rainfall and by management of plant growth and residues during the fallow. This is illustrated by the contrasting behaviour of two of the experimental sites during the 1999-2000 summer fallow (Table 1). At site 1 ("Waerawi", Old Junee, paddock D4) canola resprouted in response to 174 mm of rain between windrowing in early November and 31 December, depleting soil water storage by 29 mm over the summer fallow. In contrast, site 2 (Charles Sturt University, Wagga Wagga, paddock 14) was covered with thick triticale residue and, although its rainfall for the period was slightly less than at site 1, there was a net gain of 38 mm over summer. During the subsequent growing season this led to deep water loss (past 1.3 m) of 26 mm at site 2, whereas none was observed at site 1.

Table 1: Comparison of water balances of two sites between December 1999 and September 2000.

                                                    Site 1         Site 2
Soil water deficit at harvest (Nov/Dec 1999)        62 mm          76 mm
Change over summer fallow                           29 mm used     38 mm stored
Soil water deficit at sowing (June 2000)            91 mm          38 mm
Deep water loss (below 1.3 m), winter 2000          0 mm           26 mm
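
The deficits in Table 1 are internally consistent: the deficit at sowing equals the deficit at harvest minus the net change over summer, counting storage as positive and depletion as negative. A two-line check (our illustration, not part of the original analysis):

```python
# Sign convention: deficit at sowing = deficit at harvest - net summer change.
site1 = 62 - (-29)   # 29 mm used over summer: deficit grows to 91 mm
site2 = 76 - 38      # 38 mm stored over summer: deficit shrinks to 38 mm
print(site1, site2)  # 91 38
```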

Figure 1: Correlation between deep water loss and growing season rainfall and soil water deficit at sowing.

Lysimeter evidence

There was a difference in Et between the two lysimeters in all but two summer fallow seasons between December 1992 and July 2002, ranging from 11 to 54 mm. Observations during the last three summer fallows indicated that different weed growth was responsible. During the 2001-02 summer fallow, measurements were sufficiently detailed to show that including the weeds in APSIM simulations reproduced the different Et of the two lysimeters (Verburg and Bond 2003).

Different weed growth during the summer fallow was also found to impact quite strongly on drainage collected from the lysimeters in 1993 (Fig. 2). Although the north lysimeter received supplementary nitrogen fertiliser in August, cumulative Et from the two lysimeters differed by less than 6.5 mm until early October. Most drainage had already occurred by late September, with significant differences between the two lysimeters in both amount and timing. The lysimeters were shown to have the same water storage at harvest in 1992. This implies that the different drainage patterns observed in 1993 were a consequence of different water loss during the 1992-93 summer fallow. Simulations discussed in the next section indicate that this can be explained by different weed growth.

Differences in drainage between the two lysimeters for the whole 9½ year record are striking (Table 2). The lysimeters were managed identically during the cropping seasons, except for extra fertiliser applied to the north lysimeter in 1993, and an irrigation experiment carried out on only the south lysimeter in 1996, which contributed a difference of < 5 mm in drainage in 1993, and 40 mm in 1996. The remaining difference is attributed to differences during the summer fallow, as confirmed by the simulations discussed below.

Table 2: Measured and predicted cumulative drainage (mm) from the lysimeters (1.8 m) between December 1992 and July 2002.

            North lysimeter    South lysimeter
Measured          69                 197
Simulated         59                 177

Figure 2: Observed cumulative Et (a) and observed and predicted drainage at 1.8 m from lysimeters (b) during the 1993 growing season.

Figure 3: Effect of time of residue removal on predicted long-term (1960-2003) average deep water loss at 1.2 m.

Simulations

Verburg and Bond (2003) presented a detailed evaluation of the water balance capabilities of APSIM using data from the lysimeters. They showed that the model captured the different drainage behaviours of the two lysimeters, with simulated drainage agreeing closely with measurements over the whole 9½ year period (Table 2). When detailed information on weed dynamics was available, weed water use was also simulated very well (Verburg and Bond 2003). There has been little testing of residue impacts, largely due to a lack of adequate data, but the limited evidence presented by Verburg and Bond (2003) suggests that the model captures the effects on water storage satisfactorily, subject to uncertainties caused by residue configuration (standing vs. flat). This provides the necessary confidence to carry out a simulation analysis of the implications of retaining residues.

Previous studies on the impact of surface residues on soil water storage have pointed out that the effects are most marked when rainfall occurs regularly (Felton et al. 1987; Fischer 1987). The effects of different residue management will, therefore, be greater in wetter summers. We hypothesise that the effects of residue cover on soil water storage may also be more marked following the autumn break, when rainfall events tend to be more frequent and evaporative demand is lower. To test this hypothesis we studied the effect of the date of 90% residue removal on deep water loss during the subsequent growing season. By resetting the residue amount to 4 t/ha at the start of each fallow and not allowing decomposition to take place, the effect of removal timing was evaluated across the natural year-to-year variation in soil water and rainfall, without the confounding effect of varying residue amounts. An amount of 4 t/ha was chosen because it would not interfere with the sowing operation. It was further assumed that crop establishment was not affected by the residue cover and that summer weeds were controlled perfectly.
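
A minimal sketch of this scenario set-up is given below. The `model` interface (`set_residue`, `remove_residue_on`, and so on) is hypothetical and stands in for the APSIM manager logic actually used; only the residue settings come from the text above.

```python
RESIDUE_RESET_T_HA = 4.0   # residue reset at the start of each fallow
REMOVAL_FRACTION = 0.9     # 90% of residues removed on the test date

def deep_water_loss_for(removal_date, model):
    """Run one scenario and return simulated deep water loss (mm) at 1.2 m
    for the growing season following the fallow. `model` is a hypothetical
    wrapper around an APSIM run, used here for illustration only."""
    model.set_residue(RESIDUE_RESET_T_HA)    # fixed initial residue load
    model.disable_residue_decomposition()    # no decay during the fallow
    model.remove_residue_on(removal_date, fraction=REMOVAL_FRACTION)
    return model.run().deep_water_loss_mm
```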

The results indicate that the effect of residue cover was indeed most critical late in the fallow period and early in the growing season, before transpiration becomes significant (Fig. 3). During this period the long-term average deep water loss increased relatively rapidly for every extra week that residue cover was retained. Removal of residues between sowing and harvest is of course not possible in practice, but the result indicates that the risk of deep water loss is reduced by removing residues before the growing season, whether by a light burn or by practices that enhance decomposition, such as mulching or rolling. Removal of 90% of residues on 1 May resulted in 42% less drainage (44-year average) than if they were left until harvest. This benefit was, however, accompanied by a 14% reduction in yield under the well fertilised and disease-free conditions simulated. The effect of residue management on deep water loss and yield in individual years depends on seasonal conditions: in very dry years, for example, the extra yield obtained by retaining residue is not accompanied by an increase in deep water loss, while in very wet years the reverse is true. Residue management should therefore be adjusted to seasonal conditions to minimise the risk of deep water loss while maximising yield. Deriving guidelines to balance these benefits and penalties is the subject of ongoing research and will include socio-economic as well as biophysical considerations.
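
For readers reproducing such comparisons from their own simulation output, the percentage figures are simply changes in long-term means. The per-year numbers below are invented for illustration; only the 42% figure comes from the APSIM runs.

```python
# Percent change in the long-term mean between two residue managements.

def mean(xs):
    return sum(xs) / len(xs)

def pct_change(base, alt):
    """Percent change in the mean of `alt` relative to the mean of `base`."""
    return 100.0 * (mean(alt) - mean(base)) / mean(base)

drainage_keep = [31.0, 0.0, 55.0, 12.0]   # residue kept until harvest (mm/yr)
drainage_may1 = [18.0, 0.0, 32.0, 7.0]    # 90% removed on 1 May (mm/yr)
print(round(pct_change(drainage_keep, drainage_may1)))  # -42 (less drainage)
```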

Concluding remarks

The experimental and simulation analyses presented here suggest that water storage or loss during the fallow impacts quite strongly on the risk of deep water loss. This means that fallow management needs to find a balance between agronomic benefits and environmental impacts of water storage. In particular it has to be more responsive to seasonal conditions and aim for less year to year variation in soil water deficits at sowing than would be the case if one managed for optimal productivity or minimal environmental impact alone.

Acknowledgements

The research described here was funded by CSIRO with supporting funds from GRDC (Projects CSO197 and CSO232) and Land and Water Australia (Project CDS20). We acknowledge assistance from landholders Bernard Hart, Neil Munro, and CSU farm manager Jim Mellor, in whose paddocks the measurements reported here were made. We also thank Frank Dunin for many thought provoking discussions and access to the lysimeters at CSU and the early data collected from them. This work relied heavily on the tireless contributions of our technical staff: Gordon McLachlan, Seija Tuomi, Aimee Walker, and many others who have made contributions over a number of years. We thank John Kirkegaard for helpful comments on an earlier draft of the manuscript.

References

Felton WL, Freebairn DM, Fettell NA, Thomas JB (1987) In ‘Tillage: new directions in Australian agriculture’. (Eds PS Cornish, JE Pratley) pp. 171-193. (Inkata: Melbourne).

Fischer RA (1987) In ‘Tillage: new directions in Australian agriculture’. (Eds PS Cornish, JE Pratley) pp. 194-221. (Inkata: Melbourne).

Jeffrey SJ, Carter JO, Moodie KB, Beswick AR (2001) Using spatial interpolation to construct a comprehensive archive of Australian climate data. Environmental Modelling and Software 16, 309-330.

Keating BA, Gaydon D, Huth NI, Probert ME, Verburg K, Smith CJ, Bond WJ (2002) Use of modelling to explore the water balance of dryland farming systems in the Murray-Darling Basin, Australia. European Journal of Agronomy 18, 159-169.

Keating BA, Carberry PS, Hammer GL, Probert ME, Robertson MJ, Holzworth D, Huth NI, Hargreaves JNG, Meinke H, Hochman Z, McLean G, Verburg K, Snow V, Dimes JP, Silburn M, Wang E, Brown S, Bristow KL, Asseng S, Chapman S, McCown RL, Freebairn DM, Smith CJ (2003) An overview of APSIM, a model designed for farming systems simulation. European Journal of Agronomy 18, 267-288.

Kirkegaard J, Howe GN, Simpfendorfer S, Angus JF, Gardner PA, Hutchinson P (2001) Poor wheat yield response to conservation cropping - causes and consequences during 10 years of the Harden tillage trial. Proceedings of the 10th Australian Agronomy Conference, Hobart. (Australian Society of Agronomy). www.regional.org.au/au/asa/2001/4/c/kirkegaard.htm

O’Leary GJ (1996) The effects of conservation tillage on potential groundwater recharge. Agricultural Water Management 31, 65-73.

Turpin JE, Thompson JP, Waring SA, MacKenzie J (1998) Nitrate and chloride leaching in vertosols for different tillage and stubble practices in fallow-grain cropping. Australian Journal of Soil Research 36, 31-44.

Verburg K and Bond WJ (2003) Use of APSIM to simulate water balances of dryland farming systems in south eastern Australia, Technical Report 50/03, (CSIRO Land and Water). www.clw.csiro.au/publications/technical2003/

Williams J, Gascoigne H (2003) Redesign of plant production systems for Australian landscapes. Proceedings of the 11th Australian Agronomy Conference, Geelong, (Australian Society of Agronomy). www.regional.org.au/au/asa/2003/i/4/williams.htm
