When looking at the discourse around jobs, careers, and work, one of the most striking aspects is how precious a “stable job”, a “stable career”, and “job security” are to so many people. While those are real advantages, what if the pursuit of career stability and job security is, in the final analysis, a trap? What if embracing volatility and uncertainty, in a measured way, leads to a more lucrative and, paradoxically, even more stable career?
After all, it’s now well-known that accepting volatility when investing leads to higher returns, to the extent that investing all your money (and then some, by using leverage!) in the stock market leads to higher returns over the long run. Not until the 1990s was this wisdom completely accepted by the mainstream, less-volatile and, as it turns out, less-lucrative bonds having long been preferred. The embrace of stocks, volatility and all, has made many individual investors millions since then. Might a similar leap in the world of careers be possible?
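The arithmetic behind that leap is simple compounding. As a toy illustration (the 10% and 5% average annual returns are assumptions chosen for the sake of the example, not historical figures), here is what the gap looks like over 30 years:

```python
# A toy illustration of the reward for accepting volatility: compounding a
# hypothetical 10% average annual stock return vs. a 5% bond return.
# Both return figures are illustrative assumptions, not historical data.

def compound(initial, annual_return, years):
    """Grow an initial sum at a fixed annual return for a number of years."""
    return initial * (1 + annual_return) ** years

stocks = compound(10_000, 0.10, 30)
bonds = compound(10_000, 0.05, 30)

print(f"Stocks: ${stocks:,.0f}")  # roughly $174,494
print(f"Bonds:  ${bonds:,.0f}")   # roughly $43,219
```

Even with these made-up numbers, the point stands: a modest difference in average return, bought by tolerating swings along the way, compounds into a multiple of the final sum.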
For all the discussion of how to avoid having an unstable or insecure job, it’s striking how safe, secure, and stable most jobs are, or at least appear to be. Employees don’t usually think about it much, but the way their labor schedules and paychecks work doesn’t reflect how (for want of a better phrase) real life works.
Employment: a Bubble of Unreality?
It might seem like a strange thing to say, and in some sense I suppose it is, but consider that employees get a large and predictable stream of income: the same amount coming to them week after week, month after month, year after year, with the occasional merit raise or cost-of-living adjustment giving a small boost. Employees often also have the same amount of work to do, with the same number of hours required to do it, week after week, month after month, year after year.
To most people this might seem normal, but entrepreneurs, business owners, and investors usually experience a starkly different reality. In their world income streams are large but very unpredictable, ranging from outright losing money one week to a large windfall the next, and they can only guess at what income they will have a year or more from now. They almost never have the same amount of work to do or the same number of hours to do it in; sometimes there’s so much work there aren’t enough hours in the day to do it all, other times there’s virtually no work at all.
Since entrepreneurs, business owners, and investors, i.e. the owning class, are the people who pay employees their wages, it’s safe to say that their experience of the economy is more in tune with how it actually works than employees’ is. The economy and the financial markets don’t go up in straight lines, they move up and down in quick bursts of change. The markets ebb and they flow, but the one thing they don’t do is remain unchanging over a long period of time.
So why don’t most jobs work the same way? The answer is complex and multifaceted, but one reason is fairness. If pay tracked corporate earnings and upper management screwed up and tanked those earnings, employees who performed well would nevertheless take pay cuts, which reduces morale and the incentive to put in good work. The exceptions are small companies and the upper management of big ones: as it turns out, start-ups and executive suites are the two settings where it’s standard to be paid in stock or in some other way correlated with corporate earnings rather than a steady cash income.
One might also object that employees have large fixed expenses (like mortgage payments) that would go unpaid if they made no income during lean years, but that’s what saving and investing a large share of the fat years’ income is for; in principle this presents no problem, just as it doesn’t for those with volatile incomes today. Nevertheless, some people like the predictability of a fixed-rate mortgage and a fixed-wage job.
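That saving-through-the-fat-years logic can be sketched in a few lines. The income stream and the $50,000 of annual fixed expenses below are made-up numbers, purely for illustration:

```python
# A minimal sketch of smoothing a volatile income with a savings buffer.
# All incomes and expenses here are hypothetical illustrative numbers.

def run_buffer(incomes, annual_expenses, starting_savings=0):
    """Track a savings balance as fat years fund lean years."""
    balance = starting_savings
    history = []
    for income in incomes:
        balance += income - annual_expenses
        history.append(balance)
    return history

# Hypothetical entrepreneur: some years boom, some years bust.
incomes = [120_000, 20_000, 90_000, 0, 150_000]
history = run_buffer(incomes, annual_expenses=50_000)
print(history)  # [70000, 40000, 80000, 30000, 130000]
```

Even with a year of zero income in the middle, the balance never goes negative, so the fixed expenses always get paid; the volatility is absorbed by the buffer rather than by the mortgage lender.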
Fixed Rates, Fixed Wages, Fixed Life?
Fixed-wage jobs have real drawbacks, though. Let’s take the example of a fixed-rate mortgage. You take out a loan from the bank to buy a house and the bank guarantees you an unchanging rate of interest, usually for the next 30 years. That sounds like a good deal to most people, but it’s a problem for the bank, because the rates of interest the bank itself pays and takes in from its own ventures are variable; there is a risk they will rise or fall over those 30 years, costing the bank money it could otherwise harvest. So what does the bank do? It charges you a higher rate of interest to insure against that risk.
This is the reason why fixed-rate mortgages usually carry a higher interest rate than adjustable-rate mortgages, and the reason why on the whole fixed-rate mortgages inflict losses on homeowners. All other things being equal you will pay more to the bank with a fixed-rate mortgage than you will with an adjustable-rate mortgage, just like an investment portfolio with a protective put option costs more than one without it. Alan Greenspan was right to point out in his infamous 2004 speech extolling the virtues of adjustable-rate mortgages that security comes at a price.
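To put a rough number on that price, here is a back-of-the-envelope amortization comparison. The $300,000 principal and the 6.5% fixed vs. 6.0% adjustable rates are hypothetical, and the adjustable rate is held flat for simplicity (in reality it moves, which is precisely the risk being insured against):

```python
# A back-of-the-envelope comparison of total interest paid on a fixed-rate
# mortgage vs. a cheaper adjustable-rate one, using the standard amortization
# formula. Rates and principal are hypothetical illustrative numbers.

def total_interest(principal, annual_rate, years):
    """Total interest paid over a fully amortizing loan with monthly payments."""
    r = annual_rate / 12          # monthly rate
    n = years * 12                # number of payments
    monthly_payment = principal * r / (1 - (1 + r) ** -n)
    return monthly_payment * n - principal

fixed = total_interest(300_000, 0.065, 30)       # hypothetical fixed rate
adjustable = total_interest(300_000, 0.060, 30)  # hypothetical ARM, held flat
print(f"Extra interest for the fixed rate: ${fixed - adjustable:,.0f}")
```

On these assumed numbers the half-point of rate protection costs tens of thousands of dollars over the life of the loan, money that could instead have been saved and invested.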
Is it a price worth paying? Greenspan conceded the possibility that for many it might be, but curiously few people ever seem to view it through that lens. Is sending those thousands of extra dollars a year to the bank for interest rate protection more worthwhile than saving and investing the money yourself? Interesting question.
An even more interesting question that’s seldom discussed is whether the price of job and wage security is worth paying. When a business suffers losses it usually tries to preserve as many jobs as feasible until the recession or other crisis passes. Since the business isn’t earning money during these periods, it stands to reason that the insurance against the employees losing their wages is paid for by the employees during the fatter years. It also stands to reason that, like all insurance, the participants lose money on net.
Businesses also make an effort to pay employees the same amount of money they’ve been accustomed to; efforts to normalize the practice of cutting pay during lean years have only gone mainstream very recently (as in last year), and even then the savings for the company aren’t usually very substantial. Pay cuts have been offered as an alternative to cutting the number of employees, the standard way for companies to reduce labor costs.
Job Security: not so Secure after all?
And this brings us to one of the biggest risks of the conventional employee model. Much like selling put options on stock market indices or shorting the volatility index, conventional employment is characterized by a steady accumulation of revenue as long as the expected happens, but when the unexpected happens large losses are suffered in one blow.
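The payoff profile described above can be simulated in a few lines. This is a sketch, not a market model: the premium, the loss size, and the 3% per-period loss probability are arbitrary illustrative parameters:

```python
# A toy payoff profile analogous to selling puts or shorting volatility:
# collect a small steady premium in most periods, eat a rare large loss.
# All parameters are arbitrary illustrative choices, not market estimates.

import random

def short_vol_pnl(periods, premium=1.0, loss=-30.0, loss_prob=0.03, seed=42):
    """Simulate cumulative P&L: a premium most periods, a big loss rarely."""
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    total = 0.0
    for _ in range(periods):
        total += loss if rng.random() < loss_prob else premium
    return total

print(short_vol_pnl(100))
```

Most runs look like a smooth, reliable climb; then a single bad draw wipes out many periods of “income” at once, much as a layoff erases the apparent safety of a steady paycheck.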
In employment’s case, the large loss is being laid off, which, as most people who have careers can attest, is in fact very common. Overnight you go from having a steady and seemingly safe income to having no income, not until you can find another job anyway. A foundational premise of the planning you’ve done for your life, including most prominently your large fixed expenses, is swept away.
Between lay-offs and quitting, the “failure rate” of new jobs is actually higher than the failure rate of new businesses! It might sound crazy, but it’s true: while 45% of new businesses will still be operating in five years, only 32% of new jobs are still being worked in five years.
Of course, the 68% of jobs that end within five years don’t all “fail”. People may simply have been promoted or left for better or more exciting opportunities. But by the same token, the 55% of businesses that close within five years don’t all “fail” either; many of those entrepreneurs start newer or better businesses or find a better opportunity as an employee.
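One way to make the two survival figures comparable is to convert each into an implied constant per-year survival rate, assuming (a simplification) that attrition is spread evenly across the five years:

```python
# Convert the quoted 5-year survival fractions (45% of new businesses,
# 32% of new jobs) into implied constant per-year survival rates,
# assuming attrition is spread evenly over the five years.

def annual_survival(five_year_survival):
    """Implied constant per-year survival rate from a 5-year survival fraction."""
    return five_year_survival ** (1 / 5)

print(f"Businesses: {annual_survival(0.45):.1%} per year")  # ~85.2%
print(f"Jobs:       {annual_survival(0.32):.1%} per year")  # ~79.6%
```

Viewed this way, a given year of a new business is a somewhat better bet than a given year of a new job, not a worse one.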
It turns out new businesses are more robust beasts than they’re usually given credit for. This makes some amount of sense, since it’s easier to retool or adapt a business than a conventional job, and having multiple clients instead of one employer minimizes the risk of all of your income disappearing. Flexibility in the face of change is advantageous.
Flexibility: a win for Workers and Owners alike
It’s so advantageous, in fact, that the secular trend is to introduce such flexibility into the realm of the employee. The “contingent workforce”, the vast collection of part-timers, temporary workers, and independent contractors, has been growing as a share of the workforce since around the end of the Second World War.
It makes a lot of sense. If you’re a business, why take the risk of giving someone lifetime employment when you could try them before hiring them? Why hire a fixed number of people working a fixed number of hours instead of adding and subtracting personnel and hours as needed? If you’re a worker, why work for just one employer your whole life when you could work for a succession of employers that all offer something new and exciting? Why commit to showing up week after week when you could have the choice to accept or reject an assignment? In the case of independent contractors, why work to someone else’s schedule when you could choose your own hours?
The strange Triumph of Job Security
Indeed, it makes so much sense one has to wonder why full-time, long-term, permanent employment ever took off in the first place. This model centered on safety, stability, and security was not the norm historically. Until the early 20th century, flexible work was the norm, with most workers being independent. Even at the Ford Motor Company, turnover exceeded 400% before Henry Ford introduced the model that would come to define the 20th century workforce.
What exactly drove this development, usually associated with “Fordism”, is a matter of great speculation, but whatever it was it seems to be on its way out. Its heyday, such as it was, didn’t even seem to last very long; elements of “post-Fordism” were becoming significant components of the economy as early as the 1940s.
The 20th century labor model is often described as an advancement for workers, and is still held up by (so-called) liberals, progressives, and social-democrats as the ideal, but it should not be forgotten that “Fordism” and its offering of “job security” rose hand-in-hand with not only the welfare state but also corporate paternalism.
The latter is particularly interesting, since the age of “welfare capitalism”, starting in the late 19th century but only really taking off in the 20th, was the birthplace of the still-common practice of paying employees benefits instead of actual money. Many people, usually the same types who want “stable careers”, love benefits packages, but it’s not often considered that all benefits are, in the final analysis, deducted from employees’ paychecks.
Paternalism: the Dark Side of Job Security
This is a rather big deal, because it robs employees of choices and thus power over their own lives. When you get your compensation in the form of money, you can buy anything with it you want. When you get your compensation in the form of a retirement plan, a health insurance plan, and the like you can only access the goods and services your employer wants you to have. Indeed, at one point in the late 19th century William Hesketh Lever, one of the pioneers of the paternalist model, essentially admitted that was the point:
It would not do you much good if you send it [the profit-sharing] down your throats in the form of bottles of whisky, bags of sweets, or fat geese at Christmas. On the other hand, if you leave the money with me, I shall use it to provide for you everything that makes life pleasant—nice houses, comfortable homes, and healthy recreation.
As one can see, an elite obsession with controlling the diets of the working class for their own good is not a recent development driven by the “obesity epidemic”. Ford Motor Company, the namesake of “Fordism” itself, as part of its famous “$5 a day wage” initiative, took corporate paternalism to the extreme in the 1910s, imposing strict conditions on the higher wages and benefits, enforced by investigators who scrutinized employees’ spending habits, marital relations, child-rearing practices, and other aspects of their personal lives. This was the so-called “Sociological Department”, which amounted to a corporate morality police. Even at the time it was controversial, and it proved to be short-lived.
The larger project of controlling the working class through doling out benefits controlled by elites continued, however, and not just at the corporate level. At the governmental level the welfare state, pioneered in the late 19th century by characters like Otto von Bismarck (no radical advocate of workers’ liberation), was built up in the first half of the 20th century, crowding out mutual-aid societies and individual initiative controlled by the workers themselves in favor of an apparatus controlled by the state, and hence the ruling class.
The Rebirth of Flexibility
Through the Second World War this model went from strength to strength, and it had something of an Indian summer in the post-war period even in the face of the challenge from the contingent workforce. As it turns out, the capitalist class likes having a flexible, rather than rigid, workforce, even if other members of the ruling (and especially middle) classes, more conservative and ensconced in bureaucracies, prefer stability above all else.
A more important challenge came from the costs the model incurred. As it turns out, maintaining large pensions and health insurance plans is expensive, especially in the face of “cost disease”, i.e. the middle class’s ever-escalating demands for bureaucratic make-work jobs, which was already starting to exert effects on health care even then.
This very factor is what doomed Henry Ford’s corporate morality police in the 1910s, and when the bills started coming due for less-intensive models of corporate paternalism in the 1970s businesses were forced to cut either wages or benefits. Given workers’ allergy to wage cuts, benefits were the obvious choice, and so the “post-Fordist” or “neoliberal” workforce model entered the mainstream.
In many ways this is just a rebirth of the regime of flexibility that characterized the pre-“Fordist” era. Considering in retrospect how brief the regime of security over flexibility was in employment, the age of lifetime employment, as it’s sometimes known, will I suspect go down in history as an aberration that previous and subsequent generations alike will marvel at with incomprehension.
In the “post-Fordist” era welfare states, being run by governments that are less sensitive to costs, have continued to expand, albeit at a slower rate. Even so, unlike the “Fordist” period, in recent times we see the odd spasm of welfare spending cuts.
Credentialism, Licensing, Cost Disease: the Ruling Class strikes back?
What’s really interesting is that this same period, the 1970s up to the present, has also seen an explosion in occupational licensing requirements. When the laboring classes can’t be regulated by offering benefits, it seems, outright state violence will be used.
Over the past half century the ruling and middle classes have seen few problems they believed couldn’t be solved by imposing or tightening licensing laws. And it’s not just licensing laws; regulation of the whole economy, while it was already increasing before then, has exploded in scope and depth over the past half century.
Another interesting aspect is that credentialism has also exploded during this time. Although the years of schooling required to get a good-paying job have been steadily increasing since the 19th century, only relatively recently has a college degree been necessary. Even more interestingly, the cost to get these degrees has exploded. Like health care, “cost disease” has long affected education, but over the past half century cost increases have accelerated markedly.
This has served to effectively bar anyone outside the upper and middle classes from a great many opportunities in the professional and even business world. While the owning class have their investments and thus don’t care much about jobs, the middle class of professionals and managers immediately beneath them do, and that makes their experience interesting: they are the people the discourse says enjoy “stable careers” involving “real jobs”.
Job Security: cheating the Power Law?
Although most of the professional class no longer enjoy lifetime employment, and some (doctors among them) have been stripped of much of the autonomy they formerly enjoyed, it is true that in much of the professional class you are more or less guaranteed a certain salary if you jump through the right hoops.
This is actually quite curious, since almost all these occupations have high barriers to entry. There are even occupations that aren’t upper-middle-class in salary but are highly regulated, and they follow the same pattern. On the other hand, in occupations, even high-paying ones, that have low barriers to entry, there are no guarantees. Earnings tend to follow a power-law distribution, with most of the proceeds being earned by a small fraction of the workers.
This kind of distribution holds across a shockingly diverse array of human endeavors. In a given workplace most of the work is done by a small fraction of people. It’s common for a small fraction of artists’ works to generate most of the revenue, for a small fraction of stocks to generate most of the gains of a given portfolio or index, or even for a small fraction of the up days to generate most of the gains in a bull market. It’s also safe to say that most economic growth is generated by a small fraction of the population.
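A quick way to see what such a distribution implies: draw hypothetical “earnings” from a Pareto distribution (the shape parameter 1.16 is the textbook value associated with a roughly 80/20 split; everything here is illustrative) and check what share of the total the top fifth of earners captures:

```python
# A sketch of power-law earnings: deterministic Pareto "earnings" via the
# inverse CDF on an even grid of quantiles, then measure the share of total
# income captured by the top 20% of earners. Parameters are illustrative.

def pareto_sample(n, alpha=1.16):
    """Deterministic Pareto(x_min=1) draws: inverse CDF x = (1 - q)^(-1/alpha)."""
    return [(1 - (i + 0.5) / n) ** (-1 / alpha) for i in range(n)]

earnings = sorted(pareto_sample(10_000), reverse=True)
top_fifth = sum(earnings[: len(earnings) // 5])
share = top_fifth / sum(earnings)
print(f"Top 20% of earners take {share:.0%} of total earnings")
```

With a finite sample the extreme tail is truncated, so the measured share comes in somewhat under the theoretical 80%, but the qualitative point survives: a small fraction of the participants captures most of the proceeds.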
Yet here we have many occupations that don’t really follow this distribution. Admittedly, there are many innocent explanations for such a result, but the heavily-regulated high-barrier-to-entry nature of these professions being so correlated with this result has to make one wonder if there are structural factors at work masking an underlying more volatile reality. As in the other instances discussed in this post, the price for this job security, i.e. insurance against volatility, might be preventing the realization of greater earnings.
I suppose my goal with this blog post is to point out that, amid all the exhortations to get a stable career and all the nostalgia for a lost golden age of job security, the price of job security is substantial, both in lost optionality and in lost opportunity; so high, in many cases, as not to be worth paying.
Being the master of your own fate, as either a rugged individual or together with others of like mind, is far better than being a ward of a bureaucracy. Making your own security, through saving, investment, mutual aid, and above all the flexibility to do what you know is right for your life, is far better than relying on someone else to provide it for you.
Once you escape the mindset of being an object for others to act upon, which the siren song of job security sweetly whispers into the ears of anyone susceptible to its lies, and start approaching your life with a mindset of being a subject that acts, a mindset of ownership over your present and future, a mindset more aristocratic and independent, you will have opened yourself up to any opportunities that might come your way, and taken the first step down the path to joining the owning class.