• Hourly Wage vs. Salary, Exempt vs. Non-Exempt

    Post by Dr. Al Lee, Payscale.com

    Most people in the US workforce have heard the terms “exempt” and “non-exempt,” but what do they mean? While many websites talk about pay rates, there is not much explanation of exempt and non-exempt status.

    While I am not a lawyer, or even an HR specialist, I am an employee, and also hopelessly curious about all things related to pay and employment. The basic law is that employers are required by the Fair Labor Standards Act (FLSA) to classify their employees as either exempt or non-exempt.

    The more I read about the meaning of "exempt" vs. "non-exempt", the more a lyric of the Paul Simon song "Train in the Distance" goes through my head, "...with disagreements about the meaning of a marriage contract, conversations hard and wild." Like a marriage, in the US an employee/employer relationship is governed by a little law, and a lot of social convention. Since much is not written down, misunderstandings are common.

    Before we delve into the details, why not check out where your salary fits into all of this controversy? Find out with our ever-handy salary calculator.

  • How Large a Salary Survey Sample is Enough? (II)

    In a previous post, I claimed that as few as 5 employee profiles are enough to report accurately on the typical pay for a job. How can that be?

    In this post, we'll look at how the statistics work, and why a small, targeted data set is often preferable to a much larger but poorly characterized one. You don't even need fancy math to see why; a short simulation at the end of this post shows the idea.

    If you are curious what kind of sample we have for your job, try the PayScale salary calculator.
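
    Here is a minimal simulation of that idea, assuming a tightly clustered pay distribution for a single well-defined job (the numbers are invented for illustration, not PayScale data). Repeatedly drawing just 5 profiles, the sample median usually lands within a few percent of the true median:

        import random
        import statistics

        random.seed(42)

        # Hypothetical pay distribution for one tightly defined job:
        # salaries cluster near $60,000 with a modest $5,000 spread.
        population = [random.gauss(60_000, 5_000) for _ in range(100_000)]
        true_median = statistics.median(population)

        # Draw many samples of just 5 profiles and measure how far
        # the sample median typically lands from the true median.
        errors = []
        for _ in range(10_000):
            sample = random.sample(population, 5)
            errors.append(abs(statistics.median(sample) - true_median) / true_median)

        print(f"true median: ${true_median:,.0f}")
        print(f"typical error with n=5: {statistics.median(errors):.1%}")

    The catch, of course, is the "tightly clustered" assumption: 5 profiles are only enough when the job, location, and experience level are well matched.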

  • How Large a Salary Survey Sample is Enough?

    PayScale often receives questions about how many salary survey employee profiles we have.

    Our answer is we have enough, and the number is growing rapidly. :-)

    This raises the question: how large a salary survey data set is enough? How many data points are required for PayScale data to be statistically meaningful? The number needed depends on what questions are being asked. In this post, I'll look at the questions the United States Bureau of Labor Statistics and PayScale typically ask, and the amount of data each needs to handle statistical fluctuations; a back-of-the-envelope sketch follows at the end of this post.

    You can experience our data techniques first-hand by trying the PayScale salary survey.
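
    As a back-of-the-envelope sketch of why the answer depends on the question, here is the standard normal-approximation sample-size formula, n = (z * sigma / margin)**2, with invented spreads (these are not BLS or PayScale figures):

        import math

        def sample_size(sigma, margin, z=1.96):
            # Smallest n so the estimate is within `margin` of the truth
            # about 95% of the time (normal approximation).
            return math.ceil((z * sigma / margin) ** 2)

        # Broad question (all US workers, spread ~$25,000), answer to +/- $1,000:
        print(sample_size(sigma=25_000, margin=1_000))   # 2,401 profiles

        # Narrow question (one job, one city, spread ~$4,000), to +/- $1,000:
        print(sample_size(sigma=4_000, margin=1_000))    # 62 profiles

        # Same narrow question, accepting +/- $4,000:
        print(sample_size(sigma=4_000, margin=4_000))    # 4 profiles

    The narrower and better targeted the question, the smaller the spread, and the fewer profiles are needed.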

  • Mean vs. Median Salary: Why was mean ever used? (Part III)

    In a previous post, we saw how time-consuming it is to calculate the median salary with pencil and paper, or indeed any median of a data set with more than a few data points.

    Even after computers became common in the 1950s, and could start doing the work, the mean was still used, because of one further nasty property of medians. Medians require retaining information about every value until the end of the period for which a median is calculated. If you want to know the median salary, you need to save every employee's salary.

    Means do not require nearly as much storage: a running total and a count are all you need. In the early days of computers, storing information was expensive, so the mean was still favored as the measure of "typical".
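
    A small sketch of the difference, with invented salaries: a running mean needs only two numbers regardless of how many values arrive, while the median must keep every value until the end.

        import statistics

        # A running mean: two numbers of storage, no matter the stream length.
        total = 0.0
        count = 0
        kept_for_median = []  # the median's storage cost: everything

        for salary in [52_000, 48_000, 61_000, 55_000, 250_000]:
            total += salary
            count += 1
            kept_for_median.append(salary)  # must retain every value

        print(f"mean:   ${total / count:,.0f}  (stored 2 numbers)")
        print(f"median: ${statistics.median(kept_for_median):,.0f}  (stored {count} numbers)")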

  • Mean vs. Median salary: Why was mean ever used? (Part II)

    In the last post, I looked at how hard it was to calculate the median vs. the arithmetic mean ("average") to understand why we ever got into the mess of using mean salary to identify a typical annual salary. To make things simple, I used the example of counting checks and check sizes.

    Even for the small data set of 7 days and 15 checks, calculating the median number of checks per day and dollars per check was starting to get laborious. What if you were interested in these numbers for a whole year? How much harder is it to calculate medians vs. means for 365 days and ~750 checks?
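
    For a computer the scaled-up version is trivial, which is rather the point. Here is a sketch with an invented year of roughly 750 check amounts (not the post's actual figures): the mean needs one running total, while the median needs a full sort of all 750 values, which is exactly the part that is miserable by hand.

        import random
        import statistics

        random.seed(7)

        # Stand-in for a year of ~750 checks; the amounts are invented.
        checks = [round(random.uniform(5, 500), 2) for _ in range(750)]

        # One pass for the mean; a full sort of 750 values for the median.
        print(f"mean check:   ${statistics.mean(checks):,.2f}")
        print(f"median check: ${statistics.median(checks):,.2f}")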

  • Mean vs. Median Salary: Why was mean ever used?

    In a previous post, I showed how the mean salary and median salary can be very different.

    The median is much better than the arithmetic mean for giving a “typical” annual salary; the median is the method we favor in our PayScale salary survey. In fact, the median is better for characterizing “typical” in almost any data set. So why and how did the mean become the standard for “average” or “typical”?
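
    To make the difference concrete, here is a five-salary example (invented numbers): one executive outlier drags the mean far above what anyone typical earns, while the median stays put.

        import statistics

        # Five hypothetical salaries; one outlier skews the mean.
        salaries = [40_000, 45_000, 50_000, 55_000, 400_000]

        print(f"mean:   ${statistics.mean(salaries):,.0f}")   # $118,000
        print(f"median: ${statistics.median(salaries):,.0f}") # $50,000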

    A Mean Mistake

    A historical accident caused the mean to be used for “typical” or “average.” Before the first personal computers were introduced, it was much easier to calculate the mean than the median. Scientists in the 1800s, when statistics was being developed, were, like scientists today, lazy (I speak from experience). Hence they settled on the easier, but less accurate, way of computing typical values.

  • Average Salaries: Are they really highest in San Francisco?

    A recent article in the Portland Business Journal claims that average salaries in San Francisco are the highest in the country. Is this true? Does it pay to live in San Francisco? Is it really that bad a deal to live in Birmingham, Alabama, the place with the lowest average salaries?

    If you have read my earlier posts, you know to be skeptical about any statement about "median", "typical", or "average" salaries. This particular study reports on the median salary in each city for jobs that have a national median salary of $30,000. This was a little too vague for me, so I looked in the PayScale salary survey data at two typical jobs: Certified Public Accountant (CPA) and Registered Nurse (RN).

  • Why is median better than mean for a typical salary?

    In a previous post, I commented that PayScale's Salary Survey preferentially reports typical salaries based on the median instead of the arithmetic mean (average).

    Why is the median better than the mean for measuring "typical" values? The best way to understand what is wrong with the mean is to look at how both behave in answering a simple question: how well have Stephon Marbury's Lincoln High School basketball teammates done in their careers in the last 10 years?
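
    I won't spoil the post's answer, but the shape of the problem is easy to sketch with invented figures: one teammate reaching the NBA moves the mean enormously and the median barely at all.

        import statistics

        # Hypothetical annual earnings for nine teammates; one made the NBA.
        # These numbers are invented for illustration only.
        earnings = [35_000, 40_000, 42_000, 48_000, 55_000,
                    60_000, 65_000, 70_000, 8_000_000]

        print(f"mean:   ${statistics.mean(earnings):,.0f}")   # $935,000
        print(f"median: ${statistics.median(earnings):,.0f}") # $55,000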

  • Why is there no salary standard deviation on PayScale?

    I am sometimes asked, "Why doesn't the PayScale Salary Report and Salary Research Center show the standard deviation of the data?" (See Wikipedia for the (useless) mathematical definition of standard deviation.)

    People are interested in the standard deviation because it attempts to capture the typical variation in salaries. The first thing they calculate with it is a typical range of salaries, by simply adding it to and subtracting it from the mean.
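
    A quick sketch of why that recipe goes wrong for salaries (invented numbers again): with skewed data, one outlier inflates both the mean and the standard deviation, so mean +/- standard deviation produces a nonsense range.

        import statistics

        salaries = [40_000, 45_000, 50_000, 55_000, 400_000]

        mean = statistics.mean(salaries)   # $118,000, pulled up by the outlier
        std = statistics.stdev(salaries)   # ~$157,700, also inflated

        # "Typical range" by the naive recipe:
        print(f"${mean - std:,.0f} to ${mean + std:,.0f}")
        # The low end is negative (no one earns that), and the high end
        # exceeds every salary except the outlier's.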

  • Average Salary vs. Median Salary: Which should I use?

    In my second post, I gave the mathematical definitions of median and arithmetic mean (average). These were pretty useless, like all mathematical definitions, because I did not explain when to use median vs. mean.

    O.K., time for everyone to cringe: remember "word problems" from 4th grade mathematics? It turns out life is a word problem :-) A computer can do math calculations for you (including calculus), but computers are really bad at turning word problems into meaningful answers. A human has to decide which is the best equation to solve a word problem.

  • Average salary vs. median: What's the difference?