How employers get more value for money
There are at least three components to the employer-employee contract:
- Work week (quantity)
- Worker productivity on the job (quality)
- Pay (price)
As discussed in my recent post on average wage increases in a recession, lowering a worker's pay is usually the last tool employers use.
Ahh, I'm also gonna need you to go ahead and come in on Sunday, too...
The first strategy employers try is to increase the number of hours worked per week. This works best for "exempt" (salaried) workers, who are not paid per hour worked, like most high-tech knowledge workers.
For example, one way to execute this strategy is to lay off 20% of the exempt workers and expect the remaining workers to continue to produce the same total team output.
As long as worker productivity is proportional to hours worked (not the best assumption, but good in the short run), having each worker put in 50 hours a week instead of 40 should make up for the missing staff.
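The arithmetic behind that claim can be sketched in a few lines of Python. The 20% layoff and 40-hour base are the figures from above, and the proportional-output assumption is the same short-run simplification just noted:

```python
def required_hours(base_hours: float, layoff_fraction: float) -> float:
    """Hours each remaining worker must put in to keep total team output
    constant, assuming output is proportional to hours worked."""
    return base_hours / (1 - layoff_fraction)

# Lay off 20% of a team working 40-hour weeks:
print(required_hours(40, 0.20))  # 50.0
```

The same function shows why deeper cuts get painful fast: a 33% layoff pushes the survivors to 60-hour weeks.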
This is trickier for hourly "non-exempt" workers, since those longer hours in principle have to be paid at higher overtime rates. However, there are tricks for non-exempt workers too: "salaried non-exempt" is an amazing, legal device that seems, based on emails to Dr. Salary, to be the approach of choice in this downturn. That is a subject for another column.
Work smarter, not longer
The second strategy is to increase worker productivity per hour. At first blush, this seems hard. Aren't workers already as productive per hour as they can be? It is not as hard as one might think, because worker pay does not grow linearly with productivity.
For example, laying off the 20% of workers who are least productive raises the average productivity of those who remain.
Since more productive workers are not paid the full value of their greater productivity, total production per dollar can go up even if the remaining workers earn more per hour than those who were laid off.
As I discussed previously, this is why average wages can go up at a company that has had layoffs.
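A toy model makes both effects concrete. The productivity and pay figures below are purely illustrative, chosen only so that pay rises with productivity but less than proportionally:

```python
# Purely illustrative team: (output units/hour, pay in $/hour).
# Pay rises with productivity, but less than proportionally.
workers = [
    (1.0, 30),
    (2.0, 40),
    (3.0, 48),
    (4.0, 55),
    (5.0, 60),
]

def output_per_dollar(team):
    """Total output produced per dollar of payroll."""
    return sum(u for u, _ in team) / sum(p for _, p in team)

def average_pay(team):
    """Mean hourly wage across the team."""
    return sum(p for _, p in team) / len(team)

survivors = workers[1:]  # lay off the least productive 20%

# Output per payroll dollar rises after the layoff...
print(output_per_dollar(workers), output_per_dollar(survivors))
# ...and so does the average wage, as described in the earlier post.
print(average_pay(workers), average_pay(survivors))
```

With these made-up numbers, output per dollar and average pay both increase after the cut, which is exactly the pattern described above.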
This is particularly true in software development, because of the incredible range of productivity (see the Mythical Man-Month for a historical view).
As a software program manager, not a developer, I was enough of an outsider to see this phenomenon first hand. The higher productivity of the best developers can be much more valuable to the company than the extra money they earn.
Who was the cheapest developer I knew at Microsoft, in terms of amount of useful software produced per dollar paid? The one who earned over $500,000 per year (salary and stock). He produced more flawless code in 2 days than most produced in a month, and the other developers were no slouches.
He also was a better program manager, in the sense of knowing what the right features to produce were and in his ability to write specs, than most (all?) of the program managers. Needless to say, his productivity was hard on morale for the team who worked with him. :-)
Hire only the best...for less
When unemployment is high, an employer can effectively get more, higher-quality work per dollar without actually paying less.
For example, in Seattle as of January 1, experienced senior software developers working for small tech companies are paid between $77,000 and $121,000 per year (10th to 90th percentile).
Consider a company that advertises for this position with a budget of $90,000.
When times were good, they might have hired someone with a bachelor's degree from a lesser school, 8 years of experience, and some solid experience building internal corporate applications.
Now they may be able to hire someone with 10 years of experience, a CS degree from Stanford, and impressive multi-threaded server-side coding experience at a high-scale consumer web startup that went belly up.
The company isn't paying less; it is just getting more for the same price, often in characteristics that are not tracked in traditional compensation surveys.
But employers do pay less in a recession, right?
Reducing pay of existing employees is the third, and least used, knob in an employer’s tool set for controlling employee cost in a recession. It does happen that existing salaried employees have their salaries cut, but this is not the common case.
Traditionally, not giving raises and letting inflation take the money back was the approach of choice for cutting wages (think the 1970s). However, we are currently in a low-inflation to deflationary environment, so this trick doesn't work.
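The erosion from a frozen salary is easy to quantify. The salary and inflation rate below are hypothetical, chosen to echo 1970s-style conditions:

```python
def real_wage(nominal: float, inflation: float, years: int) -> float:
    """Purchasing power of a frozen nominal wage after compounding inflation."""
    return nominal / (1 + inflation) ** years

# A $60,000 salary frozen through three years of 7% inflation loses
# roughly 18% of its purchasing power.
print(round(real_wage(60_000, 0.07, 3)))  # 48978
```

No paycheck ever shrinks, yet the employer's real labor cost falls every year, which is why this was historically the least confrontational way to cut wages.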
The second approach is to cut back "variable" pay. Part of the reason most large companies give ~10% bonuses to most of their salaried employees is the ease with which this can be trimmed. If everyone gets a less-than-stellar review and the average "performance" bonus is reduced to 6%, wages have effectively been cut by roughly 4%.
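A quick check of that arithmetic, using a round $100,000 base salary purely for illustration: the trim is 4 points of base salary, which works out to about 3.6% of total compensation.

```python
def effective_cut(base: float, old_bonus: float, new_bonus: float) -> float:
    """Fractional drop in total compensation when the bonus rate is trimmed."""
    old_total = base * (1 + old_bonus)
    new_total = base * (1 + new_bonus)
    return (old_total - new_total) / old_total

# Trimming a 10% bonus to 6% on a $100,000 base:
print(round(effective_cut(100_000, 0.10, 0.06) * 100, 1))  # 3.6
```

Note the result is independent of the base salary; only the two bonus rates matter.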
At PayScale, we are tracking wages closely, looking for actual decreases in wages.
We are definitely seeing that, for people who still have a job, wages for most jobs are not going up 5% a year, as was common for a lot of Seattle tech jobs last spring.
But wages also are not, yet, going down.
Since we track these more subtle differences in worker quality, we should be able to see employers getting more for the same money. We will let you know when we do.
In the current economy, it is important to know what you are worth. When you want powerful salary data and comparisons customized for your exact position and qualifications, be sure to build a complete profile and get an accurate report by taking PayScale's Full Salary Survey.