I frequently hear people make statements that somehow do not seem right. Today I heard a good example. During a discussion of laser levels, the topic of required accuracy came up. I heard a contractor make the following statement:
Who cares if the laser level is accurate to within 1/16th of an inch at 100 feet? The Earth curves away from a horizontal line by 1/8th of an inch for every 100 feet of horizontal distance. The Earth's curvature will swamp out your instrument error in less than 100 feet.
The statement about the curvature of the Earth got me thinking. How much does the Earth's surface deviate from a horizontal line over a distance of 100 feet? The contractor's number intuitively seemed wrong because the Earth is round, and the deviation from horizontal should not be a simple linear function of distance. A little math will give me the answer. For consistency's sake, I will perform all computations in US customary units.
Figure 1 illustrates the situation and contains the derivation of both an exact and an approximate solution. The triangle formed by x, R + δ, and R is a right triangle, which means that the Pythagorean theorem can be used to produce an exact solution. In addition, a simple approximation for δ is also developed by assuming R >> x and using a linear approximation for the square root. In Appendix A, I give examples of the computations in Mathcad.
Given the situation shown in Figure 1, we can compute the deviation from horizontal as follows.
- R is the radius of the Earth (3963.2 miles)
- x is the horizontal distance of interest (100 ft)
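For readers following along without the figure, the derivation contained in Figure 1 can be sketched as follows (my reconstruction of the figure's math, using the symbols just defined):

```latex
% Equation 1 (exact): right triangle with legs R, x and hypotenuse R + \delta
(R + \delta)^2 = R^2 + x^2
  \;\Rightarrow\; \delta = \sqrt{R^2 + x^2} - R

% Equation 2 (approximate): assume R \gg x and linearize the square root
\delta = R\sqrt{1 + \left(\tfrac{x}{R}\right)^2} - R
  \approx R\left(1 + \tfrac{1}{2}\left(\tfrac{x}{R}\right)^2\right) - R
  = \frac{x^2}{2R}
```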
The contractor had stated that the curvature of the Earth causes level to deviate from horizontal by 1/8th of an inch (125 thousandths of an inch) over 100 feet of horizontal distance. The actual deviation is ~2.9 thousandths of an inch over 100 feet of horizontal distance, which is about 44 times less than the contractor claimed. So it is worthwhile to buy a laser level that is accurate to 1/16th of an inch over 100 feet, i.e. the laser level's error is not swamped by the curvature of the Earth.
Why did the contractor claim that the Earth's curvature is 1/8th of an inch over 100 feet? He made a simple mistake. He did not understand that the deviation from horizontal for short distances is given by a square-law relationship, shown in Equation 2. In Equation 2, I include an approximation that is only valid when R is much greater than x, which is true in typical construction problems.
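Equations 1 and 2 are easy to evaluate in any language. Here is a minimal Python sketch (a stand-in for my Mathcad worksheet, using the values given above):

```python
import math

R = 3963.2 * 5280.0   # Earth's radius converted from miles to feet
x = 100.0             # horizontal distance of interest, in feet

# Equation 1: exact deviation, from the Pythagorean theorem
delta_exact = math.sqrt(R**2 + x**2) - R

# Equation 2: square-law approximation, valid when R >> x
delta_approx = x**2 / (2.0 * R)

print(f"exact:  {delta_exact * 12:.6f} in")   # ~0.002867 in
print(f"approx: {delta_approx * 12:.6f} in")  # ~0.002867 in
```

At 100 feet the two results agree to far more digits than any level can resolve, which is why the simple x²/(2R) form is all a surveyor ever needs.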
If we use Equation 1 or Equation 2 to calculate the deviation from horizontal at 1 mile, we get 8 inches. This value is quoted in a number of references on surveying (e.g. here is one, here is another, yet another). What the contractor did was erroneously assume that the deviation varies linearly with distance, which would make a deviation of 8 inches at 1 mile equivalent to roughly 1/8th of an inch at 100 feet (8 in ÷ 52.8 ≈ 0.15 in).
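The contractor's error is easy to reproduce numerically. This short sketch contrasts the erroneous linear scaling with the correct square-law result:

```python
R = 3963.2 * 5280.0  # Earth's radius in feet

# Equation 2 at 1 mile: the "8 inches per mile" figure from the surveying references
deviation_at_1_mile = 5280.0**2 / (2.0 * R) * 12.0   # ~8 inches

# Erroneous linear scaling: 8 inches per mile prorated down to 100 feet
linear_estimate = deviation_at_1_mile * (100.0 / 5280.0)   # ~0.15 in

# Correct square-law scaling at 100 feet
actual = 100.0**2 / (2.0 * R) * 12.0                       # ~0.0029 in
```

The linear estimate overstates the true deviation by a factor of 5280/100 ≈ 53, which is exactly the ratio you expect when a quadratic relationship is mistaken for a linear one.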
For those of you who may be interested in the related question of the error in horizontal distances caused by living on a spherical planet, see this blog post.
Aside: Here is an interesting discussion that references this web page.
Appendix A: Computation Examples
Figure 2 shows a few computation examples. Normally, I let Mathcad do the unit conversion, but I do show one example with explicit unit conversion.
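For readers without Mathcad, the same computation can be sketched in Python with the unit conversions written out by hand (Mathcad normally handles these automatically; the conversion constants below are standard US customary factors):

```python
import math

# Explicit unit conversion factors
MILES_TO_FEET = 5280.0
FEET_TO_INCHES = 12.0

R_miles = 3963.2     # Earth's radius in miles
x_feet = 100.0       # horizontal distance in feet

R_feet = R_miles * MILES_TO_FEET

# Equation 1 (exact), then convert the result from feet to inches
delta_feet = math.sqrt(R_feet**2 + x_feet**2) - R_feet
delta_inches = delta_feet * FEET_TO_INCHES

print(f"deviation at {x_feet:.0f} ft: {delta_inches:.4f} in")  # 0.0029 in
```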