Wednesday, November 21, 2012

Measuring inflation

Inflation, in its classic form, happens when a one-ounce coin of pure silver is diluted so that it is now made of 50 per cent silver and 50 per cent nickel. The same size coin now contains half as much silver and, if the person accepting it knows that, it should be worth half as much as the pure coin. Buyers would need to pay two diluted coins for the same product that one pure coin bought before the dilution.
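The dilution arithmetic can be sketched in a few lines; the figures are just the 50/50 split from the example above, not real coinage data:

```python
# Sketch of the coin-dilution example: prices are really quoted in silver content.
pure_silver_oz = 1.0       # silver in the original one-ounce coin
diluted_silver_oz = 0.5    # silver after the 50/50 silver-nickel dilution

price_in_pure_coins = 1    # a product that cost one pure coin

# The same silver value must change hands, so more diluted coins are needed.
price_in_diluted_coins = price_in_pure_coins * pure_silver_oz / diluted_silver_oz
print(price_in_diluted_coins)  # 2.0 diluted coins for the same product
```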

This is the classic definition of inflation, and while other theories exist, it has been accepted as holy writ since money started being made from intrinsically valuable materials. The theory carried over when gold backed paper currency: each unit of paper was worth some known fraction of an ounce of gold. Print more paper and each paper unit can be redeemed for that much less real gold.

That definition has been outmoded since the US dollar stopped being backed by gold; without that gold, only someone's opinion of what the paper is worth sets its value. In 1971, then-President Richard Nixon eliminated the fixed gold price for US currency.

I’d like to challenge that classic definition. I believe a better measure of inflation is when my hour of labor, however it is paid – silver, gold, or paper dollars – buys me less of the same goods. If I have to work 15 minutes to buy a quart of milk today but have to work 16 minutes tomorrow, then the cost of a quart of milk has inflated by one minute of my time.
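The labor-time measure above reduces to simple division. Here is a minimal sketch; the $12 wage and milk prices are hypothetical numbers chosen only so the result matches the 15- and 16-minute figures in the example:

```python
def labor_minutes(price, hourly_wage):
    """Minutes of work needed to afford an item at a given hourly wage."""
    return price / hourly_wage * 60

# Hypothetical numbers: the milk price rises while the wage stays flat.
wage = 12.00                               # dollars per hour (assumed)
milk_today, milk_tomorrow = 3.00, 3.20     # dollars per quart (assumed)

today = labor_minutes(milk_today, wage)        # 15.0 minutes
tomorrow = labor_minutes(milk_tomorrow, wage)  # 16.0 minutes
print(f"Labor-time inflation: {tomorrow - today:.1f} minutes per quart")
```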

Why reduce it to hours of labor? First, because that’s the product most of us sell – our time – and I only have so many hours to sell. Before you start shouting that you are paid for your skill, ask how your paycheck is calculated. If you’re expected to work 40 hours and your boss will get upset if you only show up for 30, you are really selling time. Commission sales are not tied to the number of hours worked but directly to results. Those jobs exist, and some mix an hourly rate with commission or piecework, but the vast majority of workers get paid for showing up for a fixed number of hours.

Second, because if the price of the products I need goes up and I can’t pass that increase on to my employer in real time, I will have to absorb it by reducing what I buy or by working more hours.

My theory is that the general population will always be the financial losers in any financial system that doesn’t hold their labor as the measure of economic health.

Current financial theory includes the “rising tide lifts all boats” concept, where any growth in the economy will produce a gain for the majority of individuals. This rising-tide theory fails to include the rope tying the boat to the anchor dug into the mud on the sea bottom. The rope represents your salary, and if the rope doesn’t get longer to let the tide lift your boat, the bow gets pulled under and that rising tide swamps and sinks your boat. As the tide rises, the rope must get longer; in other words, your salary must increase.
Whether your income increases from an individual raise or a general cost of living allowance (COLA), your salary must keep pace with costs or you get sucked under.

If you buy that, whose wage do we use as a standard: the president of General Motors, the national average of all wages, or the official minimum wage? The extreme high and low wages are outliers, while the average by its nature incorporates those extremes; if the high is too far from the low, the average gets distorted. Even the median wage is skewed when the highest wage is too many multiples of the lowest. The official minimum wage would be a fair measure of salaries in general, since there is a direct link between the minimum and the higher wages paid for harder-to-find skills.

One complaint against raising the minimum wage is that it drives up all wages. As the minimum wage goes up, the cost of goods and services created by minimum wage workers goes up, and everyone pays more for those goods. That in turn drives up the rest of the salaries to account for those cost increases.

If the theory of spiraling costs caused by raising minimum wages is real, then the buying power of the minimum wage would be a better measure of inflation than the arbitrary value of a US dollar as estimated against other currencies by a limited number of currency traders.
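One way to sketch that buying-power measure: divide the minimum wage by the cost of a fixed basket of daily goods, and track the change year over year. The wage and basket figures below are hypothetical, chosen only to show the arithmetic:

```python
# Inflation measured as the change in minimum-wage buying power.
# All figures are hypothetical illustrations, not published statistics.
min_wage_2011 = 7.25      # dollars per hour
min_wage_2012 = 7.25      # unchanged wage
basket_2011 = 58.00       # cost of a fixed basket of daily goods
basket_2012 = 61.00       # same basket, one year later

power_2011 = min_wage_2011 / basket_2011   # baskets earned per hour worked
power_2012 = min_wage_2012 / basket_2012

# Positive result means an hour of minimum-wage work buys less than before.
inflation = (power_2011 - power_2012) / power_2011
print(f"Minimum-wage buying power fell {inflation:.1%}")
```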

Changing the measurement of inflation will help display what the economy is actually doing with greater clarity and precision, supporting better decision-making by planners. It will also create a more obvious link between effort and reward, between labor and purchasing power.

A bold statement, you say? If that theory is correct – that changing the minimum wage really does change prices and labor costs across the economy – then my proposed measure, the ratio of the minimum wage to the cost of the goods you buy daily, will display those changes closer to real time. The closer our measures are to cause and effect, and the closer to real time, the more quickly we can identify and respond to economic changes.

I submit that current economics clings to outdated measures of economic health. Just as checking the oil on your car tells you nothing about the condition of the brakes, using these outdated measures tells us little about the economy. If you check the oil and take that as a report that the brakes are good, you are much more likely to suffer brake failure. In the same way, looking at the wrong economic indicators means you are much more likely to be caught by surprise and make the wrong decisions.