</font><font color="blue" class="small">( A rule of thumb for mowing time is the mower's width, in inches, times the tractor's speed, in MPH, divided by 100. This gives you a rough acres-per-hour rate. A 6 ft (72") mower at 4 MPH will cover about 2.9 acres per hour. This easy estimate leaves a little slack for fuel fills, short breaks, and overlap. With this mower and speed, it will take about 35 hours of seat time to mow your 100 acres. A 15 ft batwing at 5.5 MPH will cover 9.9 acres per hour and take a little over 10 hours to cover 100 acres. I think these rates are very realistic, since the bigger the tractor, the faster you can drive comfortably. You may want to figure the time you want to spend mowing and work backwards to the size of tractor and mower needed to get the job done in your time frame. :)
)</font>
I read a similar "formula" on a Purdue University website several years back. The only difference was that it included an "efficiency factor" -- a multiplier applied to your final number to get a truer indication of acres per hour.
It accounted for time spent turning, overlap, fueling, breaks, and any other loss of productivity.
I've found over the years that it usually works out to as much as 20% INefficiency (inches of width X MPH, divided by 100, X 0.80).
I usually mow with a 7' cutter at 5 MPH. That would equate to 4.2 acres per hour "in a perfect world". I've found that "in reality" I can depend on mowing something like 3 to 3-1/4 acres per hour.
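The thread's rule of thumb, with the Purdue-style efficiency factor folded in, can be sketched in a few lines. The function names and the 0.80 default are my own choices for illustration; the arithmetic is exactly the width-times-speed-over-100 estimate described above.

```python
# Rough mowing-time estimate from the rule of thumb in this thread:
# acres/hour ~= (cut width in inches) x (speed in MPH) / 100,
# scaled by an efficiency factor for turns, overlap, fueling, and breaks.

def acres_per_hour(width_in, mph, efficiency=0.80):
    """Estimated field capacity in acres per hour."""
    return width_in * mph / 100 * efficiency

def hours_to_mow(acres, width_in, mph, efficiency=0.80):
    """Estimated seat time to mow a given acreage."""
    return acres / acres_per_hour(width_in, mph, efficiency)

# 7 ft (84") cutter at 5 MPH:
print(acres_per_hour(84, 5, efficiency=1.0))  # "perfect world": 4.2 acres/hr
print(acres_per_hour(84, 5))                  # with 80% efficiency: 3.36 acres/hr
print(round(hours_to_mow(100, 72, 4, efficiency=1.0), 1))  # 72" @ 4 MPH, 100 acres
```

At 80% efficiency the 7' cutter comes out to about 3.4 acres per hour, which lines up with the 3 to 3-1/4 figure observed in practice.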