Author: Kathy Voth | Published on: March 20, 2017
In June of 2014, Grist reporter Nathanael Johnson reported on a battle between two men in New South Wales, Australia. Clive Kirkby and John Kirkegaard were having it out over the proper handling of crop residues after harvest. Kirkby was trying to get farmers to stop torching wheat stubble. Rather than letting fire release all that carbon into the atmosphere, he told them, they could increase soil organic matter and build healthier, carbon-rich soils by leaving the stubble in the field. John Kirkegaard, an agronomist, told Kirkby he was wrong: the practice of burning and cultivating was what was growing the best crops.
As most folks will tell you nowadays, cultivating, or plowing, disrupts soil microbes and releases even more carbon into the air. That’s why no-till is becoming increasingly popular. But the practice Kirkby was promoting didn’t seem to be making a difference either. After six years of leaving stubble in the field, Kirkegaard’s data showed that soil organic matter and the carbon it holds weren’t increasing, and in some cases were even decreasing.
Farmers have been encouraged to leave stubble in the field for the same reason that management-intensive grazing proponents leave plenty of forage behind in pasture: it’s food for the soil. Put more precisely, it’s fuel for a complex, not entirely understood food web of fungi, insects, and microbes that eat the residue and each other, transforming plant remains into stable, carbon-rich soil.