During her speech to the National Association of Business Economics on Tuesday, Federal Reserve Chair Janet Yellen made a rather startling admission: The Fed may have “misspecified” its models for inflation and “misjudged” the strength of wages and the job market. Leaving aside the odd choice of words, Yellen—true to her training and temperament—proved herself more interested in understanding the world as it is than in being right, a rarity in a policy world that often rewards hubris over wisdom.
But her acknowledgment that economic patterns, and inflation especially, are not unfolding as she and the Fed expected should be taken as a sign that the world has changed and that the Fed, and other policymakers, have not yet grasped the extent of those shifts. It is still fighting the last war, and that can be problematic.
To wit, Yellen said inflation and wages are not rising as expected. Nonetheless, she believes the Fed should continue on its path of raising interest rates, because deviating from it would risk letting inflation spiral out of control once it starts to rise, as she believes it inevitably must.
To which the question should be: Really, must it? What if the combined and continuing effects of technology and a globalized market of goods and labor are so altering commerce and prices that the 20th century script is as outmoded as an IBM Selectric typewriter?
For most of the 20th century, inflation was perceived as the critical threat to economic stability, in the US, Europe, and much of the world. The Fed was created in 1913 to manage that threat, under the guise of “price stability.” After a series of missteps in the early years of the Great Depression, the Fed refined its methods, and built experience and data to understand the warp and woof of inflation. It also became a supreme crisis-manager, most evident during the 2008-2009 global financial meltdown.
Since then, the Fed has remained deeply concerned about inflation, anticipating its return. So too have other central banks, which all rely on the same 20th century script: economies ebb and flow, and when they ebb, the Fed provides money, ballast and juice to get the financial system, and the overall economy, moving again. As economic output picks up, companies start hiring, the labor market tightens, wages rise, which spurs prices to rise, hence inflation, at which point the Fed prepares to raise interest rates and slow things down.
That is a simplified version, for sure. But when Yellen acknowledges that the Fed may have misjudged, she is speaking to the fact that over the past eight years, economic output has picked up and employment has grown, but neither wages nor prices have risen much. Inflation has barely nudged 2% in the past decade.
The question is why. For the Fed, the safest assumption is that it is taking longer for inflation and economic patterns to return to “normal,” but they will do so soon enough. The alternative, which Yellen admirably admits, is that structural changes are invalidating past assumptions and patterns.
But what if these trends are not just short-term blips? We need to start taking that possibility more seriously. The Fed has increased its benchmark rate four times since late 2015 and signaled it would pare its financial holdings. But interest rates on bonds haven’t risen, and inflation is nowhere to be found (statistically). How long does something remain anomalous before we consider the possibility that it’s a new normal? Unemployment has plunged from a high of 10% to the mid-4% range, about as low as it has ever been. But many of the newly created jobs pay less than their pre-2008 predecessors. Crucially, there’s still little growth in wages.
The absence of inflation doesn’t mean that everyone can afford basic necessities; health care, for one, remains stubbornly expensive. But that reflects massive inefficiencies in how we deliver care as much as it does underlying cost trends.
Instead, the cost of most of life’s necessities, from food to clothing to shelter, has stabilized or dropped over the past two decades, courtesy of the deflationary effects of technology. It isn’t just that you can get a large flat-screen TV for next to nada. You can get a car that uses less fuel and is far safer for less money (inflation adjusted) than a gas guzzler of yesteryear. Thank, in part, composite materials, which also require less energy to produce than 20th century steel. You can get a smorgasbord of caloric abundance for a fraction of the cost of a much less varied diet in 1950; you can access new medicines that extend lives by years; and you can find incalculable reams of data for free on the Internet, costing you nothing but your time.
For some aspects of our lives, there is no apples-to-apples comparison with the past. With Moore’s Law and the compression of data and power, today’s smartphones are the equivalent of yesterday’s supercomputers, which cost 1,000 times as much, guzzled electricity, and demanded expensive cooling systems. Electrifying a grid to power those machines and billions of incandescent bulbs was costly compared with the dollop of energy LEDs need. That washing machine, with its smart chips monitoring the size of your load? That smart thermostat in your home dynamically adjusting heat and air-conditioning? They also reduce costs, and overall electric demand, even in their limited numbers so far.
And this doesn’t even begin to account for the possible efficiencies and benefits of the app economy, which can connect buyers of goods and services with sellers while shedding the frictional costs of middlemen scheduling, booking, and coordinating. The TaskRabbit and Uber economy has pitfalls, to be sure, but it surely does not drive prices up.
Continue at: https://www.wired.com/story/no-inflation-technology-may-have-left-it-back-in-the-20th-century/