It's easier to show than explain.
I went to the NOAA website (NCDC: U.S. Climate Normals) and downloaded the historical normals for Iowa. I don't know where in Iowa BigTiller is; his profile says "Central Iowa," so I picked a weather station in the middle of the state, Ankeny.
On page 17 it has heating degree-days by month. This is the line for Ankeny:
009 ANKENY HDD 1450 1148 894 497 199 26 6 23 120 431 866 1301 6961
The numbers after HDD are the months: 1450 is January, 1301 is December, and 6961 is the total for the year. I'm going to assume that the heating season is October 15 to April 15, 182 days. Since the season covers roughly half of October and half of April, I'll count half of each of those months' HDD. The total HDD for the heating season is then 215.5 + 866 + 1301 + 1450 + 1148 + 894 + 248.5 = 6123. That's for an indoor temperature of 65F, the base temperature the normals are computed against. With a heating season of 182 days, each extra degree of indoor temperature adds 182 HDD, so an indoor temperature of 68F means 6123 + 3 × 182 = 6669 HDD. Raising that to 70F means 6123 + 5 × 182 = 7033 HDD.
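If anyone wants to play with the numbers, here's the arithmetic as a short Python sketch. The monthly values are the Ankeny normals from the line above; the 182-day season and the half-October/half-April split are my assumptions, not anything from NOAA.

```python
# Monthly heating degree-days (base 65F) for Ankeny, IA,
# taken from the NOAA normals line quoted above (Jan..Dec).
MONTHLY_HDD = {
    "Jan": 1450, "Feb": 1148, "Mar": 894, "Apr": 497,
    "May": 199, "Jun": 26, "Jul": 6, "Aug": 23,
    "Sep": 120, "Oct": 431, "Nov": 866, "Dec": 1301,
}

SEASON_DAYS = 182  # assumed heating season: Oct 15 - Apr 15

# Full months Nov-Mar, plus half of October and half of April,
# since the season covers roughly half of each of those months.
season_hdd_65 = (
    sum(MONTHLY_HDD[m] for m in ("Nov", "Dec", "Jan", "Feb", "Mar"))
    + MONTHLY_HDD["Oct"] / 2
    + MONTHLY_HDD["Apr"] / 2
)
print(season_hdd_65)  # 6123.0

# Each degree of indoor temperature above the 65F base adds
# one extra degree-day per day of the heating season.
def season_hdd(indoor_temp_f: float) -> float:
    return season_hdd_65 + (indoor_temp_f - 65) * SEASON_DAYS

print(season_hdd(68))  # 6669.0
print(season_hdd(70))  # 7033.0
```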
The difference between an indoor temperature of 68F and 70F is 364 HDD, or about 5.5%. If 68F means you need 1000 gallons, 70F would need 5.5% more, or about 1055 gallons.
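And the fuel scaling, using the season totals from the sketch above and the 1000 gallons as a hypothetical baseline:

```python
# Fuel use is assumed proportional to season HDD, so scale by the ratio.
HDD_68, HDD_70 = 6669.0, 7033.0  # season totals computed above

increase = HDD_70 / HDD_68 - 1
print(f"{increase:.1%}")  # 5.5%

GALLONS_AT_68 = 1000  # hypothetical consumption at 68F
print(round(GALLONS_AT_68 * HDD_70 / HDD_68))  # 1055
```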
It's a simplistic model to assume that energy consumed is directly proportional to the difference in temperature, but if your heating system was engineered, it's the model the designer used.