• Mean vs. Median Salary: Why was mean ever used? (Part III)

    In a previous post, we saw how time-consuming it is to calculate the median salary with pencil and paper, or indeed the median of any data set with more than a few points.

    Even after computers became common in the 1950s and could start doing the work, the mean was still used, because of one further nasty property of medians. Medians require retaining information about every value until the end of the period for which a median is calculated. If you want to know the median salary, you need to save every employee's salary.

    Means do not require nearly as much information: a running total and a count are enough. In the early days of computers, storing information was expensive, so the mean was still favored for "typical".
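The storage difference is easy to see in code. A minimal Python sketch (the salary figures below are made up for illustration): a running mean needs only two numbers at any moment, while a median needs every value kept until the end.

```python
import statistics

salaries = [52000, 48000, 61000, 45000, 250000]  # hypothetical salaries

# Running mean: only a total and a count ever need to be stored,
# no matter how many salaries stream past.
total, count = 0.0, 0
for salary in salaries:
    total += salary
    count += 1
mean = total / count

# Median: every value must be retained, then sorted at the end.
median = statistics.median(salaries)

print(mean)    # 91200.0
print(median)  # 52000
```

Note how one large salary (250,000) pulls the mean far above the median, which is the very problem these posts discuss.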

  • Mean vs. Median Salary: Why was mean ever used? (Part II)

    In the last post, I looked at how hard it is to calculate the median vs. the arithmetic mean ("average"), to understand why we ever got into the mess of using mean salary to identify a typical annual salary. To keep things simple, I used the example of counting checks and check sizes.

    Even for the small data set of 7 days and 15 checks, calculating the median number of checks per day and dollars per check was starting to get laborious. What if you were interested in these numbers for a whole year? How much harder is it to calculate medians vs. means for 365 days and ~750 checks?
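By hand, a year's worth is daunting; on a computer, the bookkeeping vanishes. A short Python sketch using invented random data (the daily check counts and dollar amounts are hypothetical, generated just to match the ~750-check scale of the question):

```python
import random
import statistics

random.seed(1)  # fixed seed so the illustration is reproducible

# Hypothetical data: checks written on each of 365 days, and the
# dollar amount of each check.
checks_per_day = [random.randint(0, 5) for _ in range(365)]
check_amounts = [round(random.uniform(10, 500), 2)
                 for _ in range(sum(checks_per_day))]

# Medians that would be laborious by hand are one call each here.
print(statistics.median(checks_per_day))
print(statistics.median(check_amounts))
print(statistics.mean(check_amounts))
```

The sorting that makes a hand-calculated median so tedious is exactly what `statistics.median` does for you internally.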

  • Mean vs. Median Salary: Why was mean ever used?

    In a previous post, I showed how the mean salary and median salary can be very different.

    The median is much better than the arithmetic mean for giving a "typical" annual salary; median is the method that we favor in our PayScale salary survey. In fact, the median is better for characterizing "typical" in almost any data set. So then why and how did the mean become the standard for "average" or "typical"?

    A Mean Mistake

    A historical accident caused the mean to be used for typical or average. Before the first personal computers were introduced, it was much easier to calculate the mean than the median. Scientists in the 1800s, when statistics was being developed, were, like scientists today, lazy (I speak from experience). Hence they settled on the easier, but less accurate, way of computing typical values.

  • Average Salaries: Are they really highest in San Francisco?

    A recent article in the Portland Business Journal claims that average salaries in San Francisco are the highest in the country. Is this true? Does it pay to live in San Francisco? Is it really that bad a deal to live in Birmingham, Alabama, the place with the lowest average salaries?

    If you have read my earlier posts, you know to be skeptical about any statement about "median", "typical", or "average" salaries. This particular study reports on the median salary in each city for jobs that have a national median salary of $30,000. This was a little too vague for me, so I looked in the PayScale salary survey data for my own two typical jobs: Certified Public Accountant (CPA) and Registered Nurse (RN).

  • Why is median better than mean for a typical salary?

    In a previous post, I commented that PayScale's Salary Survey preferentially reports typical salaries based on the median instead of the arithmetic mean (average).

    Why is the median better than the mean for measuring "typical" values? The best way to understand what is wrong with the mean is to look at how both behave in answering a simple question: how well have Stephon Marbury's Lincoln High School basketball teammates done in their careers in the last 10 years?

  • Why is there no salary standard deviation on PayScale?

    I am sometimes asked, "Why doesn't the PayScale Salary Report and Salary Research Center show the standard deviation of the data?" (See Wikipedia for the (useless) mathematical definition of standard deviation.)

    People are interested in the standard deviation because it attempts to give the typical variation in salaries. The first thing they calculate with it is a typical range of salaries, by simply adding it to and subtracting it from the mean.
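The trouble with that simple mean ± standard deviation range shows up immediately on skewed data like salaries. A small Python sketch with made-up figures (one high earner, as in real salary data):

```python
import statistics

# Hypothetical salaries: one high earner skews the distribution.
salaries = [40000, 42000, 45000, 48000, 50000, 300000]

mean = statistics.mean(salaries)
stdev = statistics.stdev(salaries)  # sample standard deviation

# The "typical range" people compute: mean +/- one standard deviation.
low, high = mean - stdev, mean + stdev

# The single outlier inflates the standard deviation so much that the
# lower bound of the "typical range" comes out negative -- an
# impossible salary.
print(low < 0)  # True
```

A range that includes negative salaries is a hint that the standard deviation is not describing what is "typical" here, which is why PayScale does not report it.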

  • Average Salary vs. Median Salary: Which should I use?

    In my second post, I gave the mathematical definitions of the median and the arithmetic mean (average). These were pretty useless, like all mathematical definitions, because I did not explain when to use the median vs. the mean.

    O.K., time for everyone to cringe: remember "word problems" from 4th grade mathematics? It turns out life is a word problem :-) A computer can do math calculations for you (including calculus), but computers are really bad at turning word problems into meaningful answers. A human has to decide which is the best equation to solve a word problem.

  • Average salary vs. median: What's the difference?