Why should the volume of standard base (or acid) needed to reach the equivalence point in a titration be more than 10.0 mL?

The usual buret is 50.0 mL, and the reading error is about 0.02 mL. If you make a 0.02 mL reading error in 10 mL, that is an error of (0.02/10)*100 = 0.2%, or 2 parts per thousand. The same reading error in 40 mL is (0.02/40)*100 = 0.05%, or 0.5 parts per thousand, which is essentially negligible. That 2 parts per thousand at 10 mL is the best you can do even if every other part of the procedure is perfect. A really good analytical chemist tries for 1 or 2 parts per thousand on "standard procedures" (perhaps I should say CAN DO, since we all TRY TO DO the best we can).
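
To make the scaling concrete, here is a minimal sketch (assuming a single fixed reading error of 0.02 mL, as above; the volumes chosen are just illustrative) that computes the relative error for a few delivered volumes:

```python
# Assumed fixed buret reading error, in mL (per the discussion above).
READING_ERROR_ML = 0.02

def relative_error_ppt(volume_ml: float) -> float:
    """Return the reading error as parts per thousand of the delivered volume."""
    return READING_ERROR_ML / volume_ml * 1000

# Illustrative delivered volumes: the error shrinks as the volume grows.
for v in (5.0, 10.0, 20.0, 40.0):
    print(f"{v:5.1f} mL delivered -> {relative_error_ppt(v):.2f} parts per thousand")
```

Running this prints 4, 2, 1, and 0.5 parts per thousand respectively, which shows why a titration using well under 10 mL of titrant cannot reach the 1-2 parts per thousand a careful analyst aims for, no matter how good the rest of the procedure is.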