Evenly Spaced Points on Logarithmic Graph Using Excel

Quote of the Day

If I had my life to live over, I would do it all again, but this time I would be nastier.

Jeannette Rankin, the only member of Congress to vote against the declaration of war on Japan after Pearl Harbor.


Figure 1: Example of Noise Injected as Part of a Susceptibility Test.

I am doing some testing at an Electromagnetic Compatibility (EMC) facility this week. Part of the test specification requires that we inject audio-frequency interference on the power supply lines at discrete frequencies ranging from 10 Hz to 100+ kHz, with 30 frequencies selected from each decade of frequencies (e.g. 10 Hz to 100 Hz, 100 Hz to 1 kHz, etc.). Figure 1 shows a specification similar to the one I am performing. My test facility has chosen the discrete frequencies to be evenly spaced on a logarithmic axis. I started to wonder how the frequencies were selected – let's work through it.

The key to determining the points is to observe that consecutive, evenly-spaced points on a logarithmic axis share a common ratio. We can use this fact to derive a solution using Mathcad as shown in Figure 2. Excel users can see the solution here.
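To make the common-ratio observation concrete, here is a minimal Python sketch. It assumes that "30 frequencies per decade" means each frequency is the previous one multiplied by the constant ratio r = 10^(1/30), so that 30 steps span exactly one decade; the function name and the 10 Hz–100 Hz example are illustrative, not from the original test specification.

```python
def log_spaced(f_start, f_stop, points_per_decade=30):
    """Return frequencies evenly spaced on a logarithmic axis,
    from f_start up to (and including) f_stop."""
    r = 10 ** (1 / points_per_decade)   # common ratio between consecutive points
    freqs = []
    f = f_start
    while f <= f_stop * (1 + 1e-9):     # small tolerance for float round-off
        freqs.append(f)
        f *= r
    return freqs

# One decade, 10 Hz to 100 Hz: 31 points (both endpoints included).
freqs = log_spaced(10, 100)
```

In Excel the same idea is a single fill-down formula: put the starting frequency in one cell, then compute each following cell as the one above times `10^(1/30)`.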

Figure 2: Mathcad Calculation of 30 Points Evenly Spaced on a Log Scale.

 

One Response to Evenly Spaced Points on Logarithmic Graph Using Excel

  1. Mark Tucker says:

    Mark -

    This is only remotely related to the point of this blog but does relate to logarithmic intervals (in a way).

    Recently read this about a novel way to bound the input values in the infamous "Drake Equation" for determining the number of Civilizations in the cosmos:

    https://arxiv.org/pdf/1806.02404.pdf

    The authors of the paper are honest enough to realize no one has the foggiest idea of the values for most of the terms in the equation. Instead, they used very wide ranges of estimates (10^-11 to 10^-3 or something similar) for each variable and then used a Monte Carlo analysis to come up with probabilities of N. The novelty (and I think I've got this terminology right) is that they used the log-normal probability for each variable (where each decade of uncertainty was weighted equally) rather than the arithmetic range (i.e. 0.000000 to 1). This produced a huge range of values for N but centered roughly on 1-100 civilizations. Their use of the logarithmic scale came to mind as I read this blog post.
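    The sampling idea the comment describes can be sketched in a few lines of Python: draw each uncertain factor uniformly in log10 space, so every decade of the range gets equal weight, then multiply the factors together. The two ranges below are made up for illustration and are not the paper's actual estimates.

    ```python
    import math
    import random

    def sample_log_uniform(lo, hi, rng):
        """Draw a value whose log10 is uniform on [log10(lo), log10(hi)],
        giving each decade of the range equal probability."""
        return 10 ** rng.uniform(math.log10(lo), math.log10(hi))

    rng = random.Random(0)
    ranges = [(1e-11, 1e-3), (1e-6, 1.0)]   # illustrative factor ranges only
    samples = [math.prod(sample_log_uniform(lo, hi, rng) for lo, hi in ranges)
               for _ in range(10_000)]
    ```

    Sampling uniformly in log space spreads the draws across many orders of magnitude, which is why the resulting product has such a huge spread compared with uniform (arithmetic) sampling over the same intervals.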

    Sorry for the tangential nature of this comment but thought you might be interested.

    Mark Tucker