Choosing the right nutrient rate for your cropping system
Sunday, March 3, 2013
There are economic and environmental consequences of applying either too much or too little nutrient. The answer will depend on your focus as much as on the cropping system on your farm
by KEITH REID
In my last column, I wrote about choosing the right source and right time for nutrient applications. This month, I am going to tackle the most difficult of the "4Rs" for many farmers – choosing the right rate of nutrients to apply.
Why is rate important? The response to this may seem obvious but, as with many "simple" questions, the answer will depend as much on the way you think as on the cropping system on your farm. If your focus is on crop yield, your bias may be towards applying rates to make sure you don't run short of nutrients. If your focus is on environmental protection, you will likely have the opposite bias and look for reasons to apply lower rates of nutrients. These goals are not mutually exclusive, but the challenge is finding the rate that gives the best economic return while minimizing impacts off the field.
There are economic and environmental consequences of applying either too much or too little nutrient. We know there comes a point where adding more nutrients doesn't generate any extra yield. Beyond this point, the cost of growing the crop increases, but the value of the crop stays the same, so profits go down. Going far beyond this point may actually cause yields to go down, so there is a squeeze on both the income and expense side.
These excess nutrients are also the ones at greatest risk for loss into the environment, so there are both environmental and economic incentives to avoid rates beyond what the crop needs.
At the other end of the scale, we tend to focus on the yield lost to nutrient deficiencies if not enough nutrients are applied, so there is a strong economic incentive to apply enough. The risk of losses to the environment doesn't decline with lower fertilizer rates once you are below the optimum, but the counter-intuitive part is that the risk can actually increase if the crop is significantly short of nutrients. Severe nutrient deficiencies will limit the growth of the crop, leaving more soil open to erosion by wind and water, and reducing the transpiration of water so more nutrients can be leached down through the soil profile.
Because there are differences between mobile and immobile nutrients, there are differences in how close you need to be to the right rate each year. Immobile nutrients, like phosphorus or potassium, will stay in the soil, so what is not used this year will generally be available in future years. Nitrogen (N), on the other hand, can be easily lost through the winter by leaching or denitrification, so we don't count on being able to capitalize on any leftover N.
Steps to determine nutrient needs. Determining how much fertilizer to apply should be fairly simple. You need to answer three questions: How much will the crop take up? How much can the soil supply, as estimated from soil properties and measured by soil tests? And what other nutrient sources (manure, compost, biosolids) have been applied?
The difference between crop needs and the nutrients available from the soil or added materials is the shortfall that would normally be filled by fertilizer.
Not all of the fertilizer is immediately taken up by the plant, so the rate would need to be adjusted to reflect the uptake efficiency. There will also be losses from the system that will need to be replaced.
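The balance described in these steps can be sketched as a simple calculation. The numbers below are hypothetical placeholders for illustration, not agronomic recommendations:

```python
def fertilizer_rate(crop_uptake, soil_supply, other_sources, uptake_efficiency):
    """Estimate the fertilizer rate (kg/ha) needed to fill the shortfall.

    crop_uptake: nutrient the crop is expected to take up (kg/ha)
    soil_supply: nutrient the soil can supply, from soil tests (kg/ha)
    other_sources: credits from manure, compost or biosolids (kg/ha)
    uptake_efficiency: fraction of applied fertilizer the crop actually uses
    """
    shortfall = crop_uptake - soil_supply - other_sources
    if shortfall <= 0:
        return 0.0  # soil and added materials already meet crop needs
    # Not all fertilizer is taken up, so scale the rate up by the efficiency.
    return shortfall / uptake_efficiency

# Hypothetical example: crop needs 150 kg N/ha, soil supplies 60,
# manure credits 30, and roughly half of applied N is taken up.
print(fertilizer_rate(150, 60, 30, 0.5))  # 120.0
```

The efficiency adjustment is why the applied rate ends up larger than the raw shortfall, which is the point the paragraph above makes.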
These steps are good in theory, but fall short for practical application in the field. Crop uptake will vary, not just between crop species but from year to year. The weather will drive crop growth, so better weather will mean a bigger crop that needs more nutrients. It will also mean a bigger root system and more mineralization of nutrients from the soil organic matter.
Weather conditions will also affect the availability of nutrients from organic sources and the losses from fertilizer applications. Until we develop better tools to predict the impact of weather on nutrient availability from the soil, the best indicator of optimum fertilizer rates is the accumulation of many years of field trials.
Should you adjust rates for different crop yield expectations? The answer is yes and no. Yes, for nitrogen, since there is evidence to show that higher-yielding crops need, on average, more N than low-yielding crops. Yes, for P (phosphorus) and K (potassium) if your goal is to maintain soil fertility, since you will want to replace what you take out. This replacement can be done after the fact, so it is quite all right to base P and K applications on last year's yield map.
No, if your goal is to maximize the return to fertilizer in the year you apply it. Higher-yielding crops will have larger and more efficient root systems, so they will take up more nutrients from the soil than a poor-yielding crop. In fact, the poor-yielding crop may need more nutrients to overcome some of the inherent limitations in the soil.
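Basing P and K replacement on last year's yield map amounts to multiplying yield by a removal coefficient. The coefficients below are hypothetical placeholders; actual removal values vary by crop and region and come from local agronomic tables:

```python
# Hypothetical removal coefficients (kg of nutrient removed per tonne of
# grain). Real values vary by crop and region; consult local agronomic tables.
REMOVAL_PER_TONNE = {"P2O5": 8.0, "K2O": 5.0}

def replacement_rate(last_year_yield_t_ha, nutrient):
    """Rate (kg/ha) needed to replace what last year's crop removed."""
    return last_year_yield_t_ha * REMOVAL_PER_TONNE[nutrient]

# Example: a spot on last year's yield map that recorded 10 t/ha.
print(replacement_rate(10, "P2O5"))  # 80.0
```

Because removal is figured after harvest, this calculation can be run zone by zone across a yield map, which is why after-the-fact replacement works for immobile nutrients like P and K but not for N.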
In a future column, I'll finish up the discussion of the 4Rs with the hows and whys of putting nutrients in the right place. BF
Keith Reid is manager (Eastern Canada), Soil Nutrient and GHG Management, Agriculture and Agri-Food Canada, Guelph.