A Socialist by Necessity: The Anonymous Societies

Witnessing the effects of business deregulation was a significant element of my decades-long leftward shift from Centrist Republican to Socialist.  The earliest deregulatory actions of the Reagan Administration and the later capture of the legislative and regulatory functions created the environment for what has become a wholesale pillaging by the corporate sector and the uber-wealthy.  It has been the route by which millions of jobs have been sacrificed to short-term profit goals and the American Middle Class has been strangled.  This progressive regulatory neutering has fertilized the bloom of kudzu-like moral hazard and allowed the return of a corporate culture that was shackled by our predecessors:  the Anonymous Societies.

What exactly is a corporation?  The phrase that sticks with me from a distant business law class – a class more notable for the presence of a fellow student’s bandana-clad Labrador retriever that sat at a desk two rows over – is “fictitious legal entity”, a construct that forms the groundwork for giving a non-breathing, bloodless entity the same legal standing as a real person.  There were other descriptions, but the full meaning of a corporation didn’t sink in until years later.  At its core, a corporation is simply a legal mechanism to pool capital, gathering the critical mass necessary to fund a new venture and generate wealth.  Jeff Bezos and Sergey Brin might have the financial capital necessary to fund a new venture on their own, but the great mass of humanity has to figure out another way, and the corporation is a decent mechanism if you don’t have $25 million in pocket.  You come up with a plan and seed money, sell the idea to a group of interested individuals and in return for a promise to share in the profits, they provide the necessary funding.

Corporations go back to the Middle Ages, when they existed for finite periods of time to raise money for specific purposes such as chartering a school or building a cathedral.  That changed at the beginning of the 17th century when the English Crown provided a charter to the East India Company; that company then raised money for the express purpose of turning a profit for the new shareholders.  These early corporations, including the Dutch East India Company and the Dutch West India Company, existed not only to make money but also to assist their respective governments in colonization.  They finally failed by the 1800s as their colonial regions either became independent or impossible to administer.  It was only in the early years of the 19th century Industrial Revolution that the modern variant of the corporation came into being, existing in perpetuity (hopefully) to turn a profit for the shareholders.

If you understand the comment that evil is twice around the block before good has even put on its shoes, then you have a sense of the relationship between corporate behavior and government oversight.  Throughout history, regulation has largely lagged behind the various corporate misbehaviors.  Before any meaningful regulation began in the late 19th and early 20th centuries, men grown so wealthy as to be known as robber barons engaged in such activities as bribing Civil War telegraphers to obtain advance notice of battle results so they could sell or purchase gold ahead of the public.  They manipulated the price of company stocks like a Duncan yo-yo.  They engaged in bare-knuckle price fixing to eliminate competition.  They became the earliest lobbyists by camping in the lobby of Civil War Washington’s Willard Hotel to buttonhole Union officials and procure contracts.  They fought – and sometimes killed – labor organizers in disputes about working conditions.  And in one of the more entertaining episodes, now known as the Erie Railroad War, two robber barons swindled another by simply printing thousands of new stock certificates to sell to him as he attempted to buy up control of their railroad.  The point is that in the absence of meaningful regulatory oversight, gross illegalities – with significant collateral damage – occurred in the pursuit of profit.

The term Anonymous Society is foreboding, the image evoking shadowy figures moving in the background to satisfy their own ends.  It is also a term explicitly linked to the corporation.  I first learned it in a college Spanish class when a professor corrected me in conversation, clarifying that the correct Spanish term for corporation was Sociedad Anónima and that any Spanish corporation would carry the identifier SA after its name.  But it was decades later that I learned the history behind the phrase.  Unlike today, where ownership of shares is recorded and reportable, the practice in 19th century Europe was that share ownership was not recorded.  A European corporation of that period didn’t know who owned its shares; the shareholders were literally anonymous and company dividends could only be paid to those individuals who showed up with their certificate chits as proof of ownership.  Corporate directors did not know if the shares changed hands between owners, and these early European corporations were literally anonymous societies of shareholders.  Because the practice fed illegalities, most often tax evasion, that anonymity was eventually eliminated but the original terminology, SA, remained.

The collective decline of the American Middle Class since the Reagan Administration is rooted in the notion of shareholder capitalism.  It was early in that period that I witnessed one of the countless actions taken in its name.  At the time, in the early 1990s, I worked on the corporate staff of a multinational telecommunications company that provided long-distance services.  The firm had an immensely successful marketing program, Friends and Family, which offered lower phone service rates to anyone enrolled who was calling a friend or family member who was likewise enrolled.  The program was marketed and sold via phone sales from multiple call centers located across the Midwestern United States.

The call centers were a mix of fixed and variable costs, equipment and labor respectively.  To keep costs down, corporate would place them in economically distressed areas; they would find a locality with cheap property, empty buildings and a population with high unemployment.  The farm debt-ravaged Midwest met those criteria in spades.  The company would establish a site, lease and retrofit an unused warehouse or empty supermarket as a call center and then hire the locals to work there as phone bank operators to sell the Friends and Family program.  Understand this about the corporate mindset:  labor is viewed as an accounting concept, a variable cost.  You cannot just tear up hundreds of miles of fiber-optic cable nor recoup the cost of switching equipment installed in temperature controlled rooms.  Those costs are fixed and woe to the executive who advocated for those decisions if they don’t pan out.  But people can be hired and terminated, in many states at will.

My position was in Risk Management, but my small department was oddly located in the Treasury Operations Group.  A part of the job was to make periodic trips to the division that managed the F&F program and that entailed flights to those call centers.  Senior executives could take the corporate jets to Hong Kong or London, but I caught an evening flight to Minneapolis and a subsequent crop-duster to Iowa or Missouri.  There were instances in which I sat as an observer with the call center workers.  The system would auto-dial a number and route it to the employee, who would commence the sales pitch when the call was answered.  Upon the call’s ending, often unpleasantly, the employee would have a few seconds to mentally reset before the system repeated the process.  During breaks, I did what I frequently do:  I chatted with the people.  They were not there for a career but solely to make ends meet in a difficult place at a difficult time.  They were college students, divorced parents and ex-farmers working for second income money and modest benefits, supporting immediate or extended family.  They were real people doing their best to handle real financial situations.

Fast forward to a late-Spring Friday afternoon in the downtown Washington, DC headquarters.  My cubicle was located on the third floor immediately outside the front conference room adjacent to the Treasurer’s office.  As I worked, I overheard him and others entering the conference room, joined shortly afterwards by the CFO and other senior executives and staff.  It was notable because the CFO and those executives typically stayed on an upper floor with a view towards the Washington Monument and the White House.  When everyone had later gone and the day was winding down, I stepped into the office of a cash management director and made a crack about the presence of the gods amongst us mortals.  She didn’t respond at first, unlike other times, but then commented that the senior management was concerned about the share price and that it had been stalled near a particular level.  The executives wanted to make a gesture to the market to demonstrate that they were “serious” about controlling costs and they would announce that they would be shuttering multiple call centers.  I didn’t think that the decision made great sense since this program was a certifiable marketing phenomenon with wide brand recognition and yes, the company was in the black.  Was it as profitable as it could have been?  Probably not.  But the decision was framed within the context of proving a point to the stock market and as executives with stock options, the decision makers in that room had a vested interest in seeing that price rise.

The decision to close centers, with the resultant loss of hundreds of jobs, was announced the following week.  Like tufts of dried dandelions in a stiff breeze, the jobs were simply gone.  Mortgages, health insurance, family circumstances, whatever…all were meaningless to those executives so long as the market understood that they were serious.

Although my own job was safe, the experience was educational.  When we learned that my wife was pregnant about a year later, we talked at length and decided that it made more sense if one of us stayed home with the child.  This experience was not the principal reason behind my decision to resign and stay home but it certainly lurked in the background of my thinking.  This act, and the countless others like it throughout the economy, proved that corporate loyalty to the employee was dead and that I could be unemployed regardless of my competence or job performance.  We would take a significant short-term pay cut but my wife’s long-term employment would be far more stable.

Just a few years ago, almost twenty-five years after the closures, I wandered the local high school auditorium lobby during a play’s intermission.  I was perusing the plaques honoring notable alumni and stopped abruptly at one, which honored the CFO of that corporation for his contribution to establishing a gift to the school.  I looked him up upon returning home and found that he had left the company a few years after me and was now a principal at an established investment firm.  He had managed to avoid the final implosion of the company after it was purchased by WorldCom, which itself ceased to exist because of a massive accounting scandal.  The Treasurer was by then the CFO of an oil company and the CEO, who wasn’t in the room but would certainly have signed off on the closure decision, was now an independent investor specializing in tech start-ups.  My thought now, as it was that night?  You got yours, you Bastards.  What about everyone else?

A corporation is a valuable tool but in the end, it is only that, a tool.  There are certainly decent corporate leaders who abide by the rule of law, but the cumulative acts of the others are sufficiently damaging that we can no longer allow the conflicts of interest arising through stock options.  We can no longer take them at their word that the books weren’t cooked (Enron and WorldCom), that their scientific research wasn’t flawed (Theranos), that their actions weren’t damaging to the marketplace (Amazon).  I am sure at this point that you can identify any number of other corporations that fit the bill here.

People, many of them supporters of President Trump, fear the impact of a Democratic Party victory upon shareholder capitalism.  They neither recall nor understand that much of the damage to their middle class is a result of shareholder capitalism, a Randian and morally bankrupt conceit that has served for decades as window dressing to justify legalized looting and pillaging by corporate elites.  Without the re-imposition of government oversight and regulation, the pillaging will continue until the Anonymous Societies stand as neo-feudal lords amongst hundreds of millions of American serfs.

A Socialist by Necessity: Capitalism? Let Them Eat Cake

“To admire, and almost to worship, the rich and powerful, and to despise, or, at least, to neglect persons of poor and mean condition, though necessary both to establish and to maintain the distinction of ranks and the order of society, is, at the same time, the great and most universal cause of the corruption of our moral sentiments.”

Adam Smith, The Theory of Moral Sentiments (1759)

“It is not from the benevolence of the butcher, the brewer, or the baker that we expect our dinner, but from their regard to their own self-interest.  We address ourselves, not to their humanity, but to their self-love, and never talk to them of our own necessities, but of their advantages.”

Adam Smith, The Wealth of Nations (1776)

“Follow the money.”

Deep Throat to Bob Woodward, 1973

If you want to understand why I shifted from Centrist Republican to Socialist, you need to think about Adam Smith and Capitalism.  What we presently call Capitalism has spawned a massive divide in wealth – gutting an entire Middle Class – because it leans upon a bastardized, propagandized version of Smith.

Capitalism is a young concept relative to the history of the world.  First elucidated by Smith in his 1776 The Wealth of Nations, it gained traction during the early 19th century with the Industrial Revolution.  It spawned fortunes for a very few at the expense of the many.  Early capitalism grew explosively during the technological upheaval of that period, fostered by weak legislative and judicial systems which not only permitted, but supported, a system free of any regulation.  Toss in Western Civilization’s social and religious structure, recognizing and reinforcing class distinctions, and it was off to the proverbial races for the wealthy few.  It was only through decades of muck-raking journalism, labor unrest and early legislative efforts that society’s ball was slowly moved forward on the field against the power of concentrated wealth.  In the United States, these efforts largely failed during what we know as the Roaring Twenties as public sentiment bowed to the supposed acumen of the business and corporate class.  When Great Depression congressional hearings uncovered just how badly that class had screwed up, it was a series of regulatory actions and social programs that set the stage for the version of mid-20th century American Capitalism that most reflected the full range of Adam Smith’s work.

According to the International Monetary Fund, “Capitalism is an economic system in which private actors own and control property in accord with their interest, and demand and supply freely set prices in markets in a way that can serve the best interests of society.”  In the event that you think that “in the best interests of society” is a socialist phrase added by some IMF pinko leftist, just remember that Adam Smith was a strict Scottish Presbyterian who wrote The Theory of Moral Sentiments almost two decades before he penned The Wealth of Nations.  While everyone gloms onto his phrase about the rational self-interest of the butcher and baker as justification for the inherent success of capitalism, they ignore the fact that Smith saw capitalism as a means of raising the proverbial tide of all boats in the waters of society.  Fine, says Smith, people will act in their own self-interest, and that is the engine by which they make money and improve their condition.  But understand that fixating upon the accumulation of wealth is self-corruptive and ignores the larger obligations that we owe to society.

How did we manage to divorce capitalism from any concern for the welfare of society at large?

Start with the question, what exactly is Capital?  We have an entire economic system predicated upon the word and I frankly doubt that many have even thought about it.  Google the question what is capital? and the response can vary by site.  Investopedia is the first response and defines capital as “…financial assets such as funds held in deposit accounts and/or funds obtained from special financing sources…Capital assets can include cash, cash equivalents, and marketable securities as well as manufacturing equipment, production facilities, and storage facilities.”  Wikipedia (yeah, I know, I know…) defines capital as “human created assets that can enhance one’s power to perform economically useful work.”  Most sites ignore the concept that people can also be capital; the Corporate Finance Institute notes that capital has a human component including social, intellectual, physical and talent skills.  And this is the crux of our present issue.  There is supposed to be a degree of balance between the various aspects of capital, a general equilibrium between the human, financial and physical assets with all of them coming together to create wealth and grow the economy.  But in the four decades since the election of Ronald Reagan, that equilibrium has vanished and those who view capital as solely the purview of financial instruments have gained control of the economy.

The wealthy are like the poor in that they have always been with us, their power flowing and ebbing over the political events of generations and constrained by taxation, regulation and occasionally, the guillotine.  Their most recent ebb came after the Great Depression, when Congressional inquiries revealed the depth and extent of their financial misbehavior.  But they didn’t leave and they continually lobbied over decades for the lowering of taxes and the repeal of the regulations that held them in check.  My kids have heard me say on multiple occasions that the seed for our era was planted by Ronald Reagan, who rode the promise of a renewed America and lower taxation into office in a landslide.  Tax rates were cut and that freed money to sustain the pent-up demand unleashed after the problematic 1970s.  But there were two singular changes in Reagan’s first term that set the stage for what has happened since.

Like an Amazonian butterfly whose flapping wings spawn a Bangladeshi monsoon, the first of these changes was notable only in the stunningly dull pages of the B section of the Wall Street Journal.  As a college business/economics student during that term, I was required to subscribe to the WSJ and read it regularly.  My morning routine consisted of early classes followed by a walk to the post box for the paper and mail, and then to the cafeteria for a coffee and morning read.  That read began with the stock results in the C section and then moved on to what I now recognize as the utterly schizophrenic A section editorial page, then the print counterpart to today’s Fox News.  The news side was essentially reliable but the editorial side was highly conservative and occasionally batshit crazy.  What popped out one morning in the B section was a wonkish accounting article discussing a proposal to alter CEO pay packages to allow stock options in addition to the standard salary.  The rationale was that the presence of equity options would provide a greater sense of ownership in the company and by extension, a greater willingness to accept risk.  It was a shift of philosophy from company steward to company owner.  There were a few articles afterwards and the practice was quietly adopted.

Score an early point for potential conflict of interest.

The second change was the innocuous-sounding adoption of SEC Rule 10b-18 in 1982, which again made corporate stock buybacks a practical option by providing a safe harbor against charges of market manipulation.  Buying back one’s own stock had effectively been off the table since the passage of the Securities Exchange Act of 1934, after Congressional inquiries discovered that more than a few high-flyers of the 1920s markets had used the practice to manipulate their share prices upwards.  The rationale behind the 1982 change was rooted ostensibly in the high interest rate environment of that period.  In corporate finance, the idea is to invest in those projects which provide a sufficiently large rate of return relative to the cost of money, or whatever management deems the appropriate internal rate of return.  In an environment where the cost of money was 15% and higher, there were few projects which could even come close to that benchmark.  The proposal was that instead of companies just sitting on cash, it was better to simply return it to the shareholders and let them use it to their own best advantage.  The optimal way to do that was the buyback of stock instead of a higher dividend, since shareholders would pay the lower capital gains rate on a buyback rather than the higher income tax rate on a dividend.
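
A rough back-of-the-envelope sketch of that tax argument, in Python.  The tax rates, cost basis and dollar amounts are purely hypothetical illustrations, not figures from the period or from this post:

    # Hypothetical: $100 returned to a shareholder either as a dividend
    # (taxed as ordinary income) or via a buyback (taxed as a capital gain
    # on the shares sold back). All rates and amounts are assumptions.
    cash_returned = 100.00
    ordinary_income_rate = 0.50   # assumed marginal income tax rate
    capital_gains_rate = 0.20     # assumed capital gains rate
    cost_basis = 40.00            # assumed original cost of the shares sold

    after_tax_dividend = cash_returned * (1 - ordinary_income_rate)
    gain = cash_returned - cost_basis
    after_tax_buyback = cash_returned - gain * capital_gains_rate

    print(f"Dividend nets the shareholder ${after_tax_dividend:.2f}")   # $50.00
    print(f"Buyback nets the shareholder  ${after_tax_buyback:.2f}")    # $88.00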

Score another point for potential conflict of interest and manipulation.

After these two early acts set the stage for the later triumphant looting, events took place in distinct strands that ultimately came together to literally throttle the American Middle Class and contribute to our immense wealth gap.

The first strand was conservatism’s foothold through and after the Reagan years.  Conservatives were no longer stodgy middle-aged lodge members who sold insurance and met for Wednesday league golf.  They became acolytes to a gospel cult of Wall Street and profitability.  I began working in the corporate office of a now-defunct multinational telecommunications firm in 1992.  As I first entered the Treasury Group area, I met the prototype for the Young Republican capitalist:  tanned and handsome with wavy sandy blonde hair, dressed in an impeccable suit with matching red power tie and suspenders.  As he shook my hand and introduced himself, his next words were Y’know, Reagan is a God.  I don’t recall my response apart from a mental note to the effect of well, this is special.  Our conversation afterwards was an inquiry about my weekend and then a storyline about how wasted he had been at the beach the weekend before.  He was an avatar of what was to follow – supremely self-confident, narcissistic and completely vacuous.

Capitalism during the Reagan years began morphing into what is now termed shareholder capitalism, which was proposed and promoted by Nobel laureate economist Milton Friedman in the 1970s.  Shareholder capitalism posited that executives were to work at the behest of shareholders to the exclusion of all others, including customers and employees.  I don’t recall anybody pointing out the inherent conflict of interest as top executives were now shareholders as well.  This was the point at which the concept of capital became the purview of financial instruments to the practical exclusion of all else.

Corporate finance shifted with the rise of corporate raiders such as Carl Icahn and firms such as Kohlberg Kravis Roberts.  These adhered to shareholder capitalism principles and readily used large amounts of debt – the so-called Other People’s Money – to acquire companies and dismember them, selling off profitable divisions.  They then took their proceeds and left the surviving portion of the firm to deal with a massively increased debt load.  More than a few companies perished because they were unable to cope; more recent examples include retailers Toys R Us and Neiman Marcus.

My beach capitalist former co-worker would have felt at home amidst these proceedings.

Assisting in this period was the creation and growth of conservative media, which paved the way with a new attack narrative questioning the role of government and painting the poor as lazy and undeserving.

The key event for shareholder capitalism, however, was the 1999 repeal of the 1933 Glass-Steagall Act.  This was the fundamental act that regulated the banking sector, effectively separating commercial banking from investment banking.  The repeal was actively promoted by Limbaugh and his peers and permitted money center banks such as JP Morgan Chase and Citibank to again become involved in investment banking.  They became active participants in the subsequent use and sale of Mortgage Backed Securities and other derivative instruments which ultimately led to the Financial Crisis of 2008, as they became actual threats to the stability of the entire financial system.  They were now effectively Too Big to Fail.

The next strand was woven within the political arena.  Conservative principles of deregulation also led to the neutering of regulatory bodies such as the SEC via both defunding and executive direction to scale back investigations.  The promise of lucrative private sector careers for cooperative regulatory officials put the cherry on the sundae of what is known as regulatory capture.  The legislative branch was compromised by the failure to control lobbying and campaign financing.  My own personal bugaboo is the growth of the American Legislative Exchange Council, which acts as a conduit of legislation between corporate beneficiaries and legislators, mostly at the state level.  ALEC was a pre-Reagan entity but came into its own after Reagan took office.

Score ten points for self-interest.

The third strand was a new attitude towards failure.  A key feature of capitalism is that success is met with reward and failure with loss; more power to you if you succeed but you had better be ready for the consequences of failure.  But remove the consequences of failure and what begins to occur is moral hazard.  Market participants begin to expect that they will be spared the cost of failure and are thus willing to undertake greater and more irrational risk.  Perhaps the first financial event to evoke this was the collapse of a private fund named Long-Term Capital Management in 1998.  The gist of the firm’s strategy was to make money arbitraging the differences in interest rates of different instruments, a transaction with a very small profit margin.  To maximize their return, the owners borrowed larger and larger sums of money to create a critical mass of capital to make the strategy lucrative.  But when some of the supporting bonds defaulted, the remaining assets could not support the firm’s nosebleed levels of leverage.  The government was forced to step in and arrange, through multiple large banks, an orderly winding down of the trades to prevent a collapse that would have likewise destroyed other firms interlocked via a web of financial relationships.  Yes, the firm failed and people lost money, but there was a lesson for those few paying attention.

Go ahead and engage in the excessive risk-taking; the system will be protected should things go south.  I can argue that this was the precursor event signaling to shareholder capitalists that it was a new game.  Hey…we can do all kinds of things now and take our rewards.  The market might eventually crack, but we can make money until then.  We just have to get out first.

The final strand was a new philosophy about interest rates.  The arrival of Ben Bernanke as Federal Reserve Chair in 2006 was a signal moment for the American financial sector.  Bernanke’s academic focus was the study of the Great Depression and his premise was that the 1929 Fed exacerbated the collapse by raising interest rates and siphoning liquidity from the system.  His proposal would be to lower interest rates and flood the markets with liquidity.  The hiring was a tacit acknowledgment of market over-value and that a crash was to be expected.  Any issues with the market would be met with a flood of liquidity.  The 2008 collapse put this proposal to the test and rates were lowered to historic levels.  This was additionally matched by Fed mechanisms, aka “the windows”, to flood the severely damaged financial sector with gob-smackingly stupid amounts of liquidity.

It was here that cheap credit became the new crack.

Significant market issues would be met with downward rate adjustments and despite efforts in the twelve years since then, rates have not returned to pre-crisis levels.  So how is this crack?  Corporate executives saw the opportunity to use ultra-low rate debt to their advantage:  they began borrowing significant sums of money on their corporate books and using it…to buy back shares of stock.  BBB rated corporate debt grew by 400% between 2008 and 2018.  The effect on cumulative Earnings Per Share has been a rise more than seven times greater than the rise in sales per share over the same period.  Oh, and by the way, the executives are concerned that the market will tank if we ban stock buybacks.
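
A minimal sketch of that mechanism, using invented figures:  earnings per share rises even when earnings and sales are flat, simply because borrowed money retires shares.

    # Hypothetical company: flat earnings, flat sales, but debt-funded buybacks
    # shrink the share count. All figures are invented for illustration.
    earnings = 1_000_000_000          # $1B in net income, unchanged
    shares_before = 500_000_000
    shares_repurchased = 150_000_000  # retired with cheap borrowed money
    shares_after = shares_before - shares_repurchased

    eps_before = earnings / shares_before
    eps_after = earnings / shares_after

    print(f"EPS before buyback: ${eps_before:.2f}")   # $2.00
    print(f"EPS after buyback:  ${eps_after:.2f}")    # $2.86
    print(f"EPS 'growth' with zero sales growth: {(eps_after / eps_before - 1):.0%}")   # 43%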

The Federal Reserve has become the enabling mother of narcissistic sociopaths.

So after Milton Friedman and Ronald Reagan, four strands were woven:

  • Inherent conflicts of interest were legalized and allowed;
  • A national narrative attacking government regulation took hold and the subsequent financial decline was blamed upon out-groups such as the poor and illegal immigrants;
  • The legislative and regulatory processes were seized and neutered;
  • Monetary policy and the Federal Reserve were co-opted by the practices of Mutually Assured Financial Destruction brought about by the first three bullet points.

These strands have been woven into the rope that is literally strangling the nation.

The tributes and nods to Adam Smith from the conservative media are a joke.  The conservative movement has seized upon a singular work, cherry-picked it and beaten it like a drum to justify decades-long chicanery, theft and wholesale looting.  They have helped to rig a system that purposefully strip-mines the wealth of large segments of American society and used their media allies to clothe it in respectability.

Smith would be appalled by what has happened and would likely say this:  Read the other book.

Pandemic Food Price Index: 7/20

In order to ascertain the impact of the Covid-19 Pandemic and its dislocations on food pricing, I re-booted an old project to track the pricing of a market-basket of typical food items in three local grocery stores.  The original pricing for this new project was done in May 2020 and the pricing for July was completed last week.  The results from May 2020 serve as the base index level of 100.  Following are the results and some comments.

First, the Index rose slightly from June’s 100.66 ($89.09/basket) to 100.77 ($89.19/basket).  Although prices were relatively stable, increasing by about .1% over the previous month, there was considerable turbulence within the basket of foodstuffs.  Specifically:

  1. As with last month, 7 of the 111 items priced (37 items in the basket at 3 stores) were not available at all, for a missing rate of 6.3%.  Not only were they missing from the shelf, but the shelf label was either completely gone or carried a notice that the item was temporarily discontinued.  What was also notable about these missing items was that five of the seven were in the locally owned supermarket; the shelves at multiple points in that store also had temporarily discontinued labels on a number of other foodstuff items that were not considered since they were not in the sampled basket.  (Note:  The attached monthly pricing at the bottom has eight prices missing instead of seven; there hasn’t been a price for 80% ground beef for one store from the outset in May.  That store no longer carries 80% ground beef in any form and the missing prices thus exclude that item.)
  2. Because these items were missing, they were not priced and were thus removed from calculation for the monthly result.  It is impossible to say what the basket value would have been had they been on the shelves.
  3. The largest change, however, was seen in the staples section, specifically for five pound bags of flour and four pound bags of sugar ($.09/lb and $.61/lb respectively).  Topping that off, the largest supermarket chain (internationally based) simply did not carry its store brand sugar at all and was sparse on stocks of the name brand sugar.

My wife, BH, linked an article from the NY Times to my phone.  The writer noted that foodstuffs were reappearing, most especially toilet paper and flour, but that the breadth of product offerings was narrower.  In other words, I can get toilet paper and flour but not in the variety that was offered previously.  This would perhaps account for what I am seeing in the locally owned grocery, whose store brand is offered via an independent grocers’ Group Purchasing Organization.  The food suppliers are shifting their production to the larger and more flush entities, crippling the offerings of the independent supermarkets.  With greater economies of scale among their stores, the larger entities will hold a greater advantage over the smaller independent stores moving forward.  So, yeah, Mick was right:  you can’t always get what you want, but you can get what you need.

Basket Results (7/2020)


The Consumer Economy Headshot

The truth is that the consumer-driven model is now functionally dead, an economic zombie shambling along and awaiting the merciful head shot that drops it, allowing it to be kicked into the gutter and out of the way.

PracticalDad, Post-Consumer Parenting (April 8, 2016)

The consumer-driven model that has powered this nation’s economy for three quarters of a century is now officially dead, the head shot delivered by…a virus.  Like any zombie, it was compelled to mindlessly consume yet was malnourished by an increasingly severe lack of purchasing power.  Finding that Betty White had been cast as the new villain on The Walking Dead would have surprised me less than this manner of death.

It starved for years, certainly since well before April 2016, when I wrote the post quoted above.  Zombification occurred in stages over the course of decades.  One contributing factor was the effort by corporate employers to shift from pensions to 401(k) plans, citing the need to cut costs and free funding to compete against other companies.  Another was the claimed throttling cost of benefits, which led employers to cut back on health care coverage in the face of rising costs.  Yet another was the drive to maximize shareholder value by decreasing labor costs, shipping jobs – even entire plants – overseas or automating them away.  Even with these ongoing hits, the process was accelerated by a labor market that now mandated a college degree for entry into the fabled American Middle Class.

The condition however, became terminal with the Financial Crisis of 2008, from which it never recovered.

The symptoms have been there for years in any number of news articles.

To quote Captain America:  I can do this all day.

Understanding the impact of this collapse is helped by understanding how the model came about in the first place and for that, you have to return to the period immediately prior to the Great Depression.  Economists were developing the Expenditures Theory of Gross Domestic Product – C + I + G + (X – M) – to help create a systematic framework for understanding the national economy.  Simply put, a nation’s GDP is a function of the aggregate spending of Consumers (C), Businesses (I), Governments (G) as well as the aggregate international balance of trade between exports (X) and imports (M).
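
Expressed in code, the identity is just a sum; the figures below are hypothetical placeholders in trillions of dollars, not actual U.S. data:

    # Expenditures approach to GDP: GDP = C + I + G + (X - M)
    # All values are invented for illustration, in trillions of dollars.
    C = 14.0   # consumer spending
    I = 3.5    # business investment
    G = 3.8    # government spending
    X = 2.5    # exports
    M = 3.1    # imports

    GDP = C + I + G + (X - M)
    print(f"GDP = {GDP:.1f} trillion")   # 20.7 trillion in this toy example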

We might recall the decade as the “Roaring Twenties” but the reality was different.  There was supreme confidence in the business community and many industrialists and financiers bought into the notion that the historic business cycle of boom-and-bust had been eliminated.  But there was an awareness among others that significant problems still existed.  The agricultural sector was mired in an economic depression as crop prices had collapsed years before the Wall Street collapse.  Some were aware of the inequitable distribution of wealth in society and others noted that the lion’s share of the economy’s productivity gains through the decade had accrued to the wealthiest class.  The average American worker saw significant wage gains but the top 5% of wage earners garnered 34% of the disposable income, up from 24% in 1920.  Then came the collapse of 1929.

President Herbert Hoover’s responses to the Great Depression were constrained by the philosophy – one he shared with almost everyone else – that the Federal Government must annually balance its budget.  There was plenty of rah-rah jawboning and some effort to run a small deficit and create additional relief programs but in the end, he was bound by his personal belief that it was up to American individualism, rather than government action, to find a way out.  Relief programs were left to charity and local and state governments but the continued downward spiral left everyone without money so that by 1932, destitution reigned; the economy was at the point of real collapse and Senators were warning of open revolt by the election of that year.

Yet debate among economists continued during that period and it was in 1930 that John Maynard Keynes wrote A Treatise on Money, which laid the groundwork for what we now know as Keynesian Economics.  Another economist caught the ear of nominee FDR in that period however, and his name was William Trufant Foster.

The heart of Foster’s concept was that the Depression was ultimately caused by under-consumption, that the average person simply didn’t have the financial wherewithal to support the purchasing power required for all of the economic output produced.  If there was to be renewed growth of output and through it, employment and wage growth, it had to come via increased consumption in any fashion, whether by the individual, the business sector or the government.  Keynes’ work provided the intellectual justification to allow government deficit spending to spur that aggregate demand in economic downturns.

I don’t know that we can now appreciate the level of political and economic chaos in the period between FDR’s election in November 1932 and his inauguration in March of 1933 (the inauguration date was later moved to January).  Farmers were banding together to actively deter farm foreclosures via threat and in some cases, actual violence.  The Soviet government actively supported a rising Communist party and through it all, hundreds of banks across the country were closing their doors, destroying the little savings that were left to the individual.  Two states independently declared bank holidays, temporarily closing all banks within the state for a one or two week period.  Why?  Fear.  Average Americans had so lost faith in the system and the government that, anticipating the collapse of their own banks, they began pulling all of their remaining money out.  By doing so, they themselves guaranteed a collapse.

Fear.

This was the backdrop for FDR’s now-famous First 100 Days.  It was the backdrop for the creation of new and untested programs to get people working and money once again flowing through the economy.  Fear was the enemy that FDR fought in that early period of his Administration and was the basis for his statement in his first inaugural address, The only thing we have to fear is fear itself.  FDR understood that money must flow and consumption must be restored, and in the short term he was willing to use the government budget to do it.  He also acknowledged the power of the budget and knew that in the longer term, the average citizen would have to step up, and that this could not happen until fear was lessened and purchasing power restored.

Why the introduction of bank deposit insurance via the FDIC?  To lessen the fear of bank collapse with the resulting loss of savings.

Why the introduction of Social Security?  To lessen the fear of poverty in old age.

Why the creation of multiple job and agricultural programs?  To lessen the fear of poverty, bankruptcy and ultimately, starvation.

And why the creation of multiple public authorities such as the Rural Electrification Administration and the Tennessee Valley Authority?  To spur the development of the physical infrastructure necessary for future growth and keep it out of the hands of the private sector, most particularly the financiers.

All of this was undertaken to rebuild the purchasing power of the American citizen and ultimately, to diminish fear because fear eroded faith in the system.

Remember that phrase:  purchasing power.

Government spending alone was insufficient however, and it was clear by the severe recession of 1937 that something new had to be tried.  This was interrupted by the Second World War and any other activity was shelved for the duration.  What happened through the post-war period however, was a series of measures that, by design or happenstance, assisted not only the economy and purchasing power of the American consumer but diminished the fear that kept it from being exercised.

  • The wide-spread availability of health insurance from employers meant that Americans were relieved of the fear of crippling medical bills.
  • Higher education was made more available to the large number of returning veterans via the GI Bill of 1944 and the quality of that education was increased with significantly higher public funding for facilities at state universities.
  • The existence of Social Security and the availability of company sponsored pension plans meant that Americans were, to a considerable extent, relieved of the fear of poverty in their old age.

This is where we find ourselves today.  The consumer-driven economic model was predicated not just upon the wealth and incomes to support reasonable purchasing power, but also upon the assurance that there was a sufficient safety net to protect the constituent consumers.  The high cost of healthcare via premiums, deductibles and co-pays has shifted to the family with a subsequent loss of purchasing power.  The high cost of the college degree that is now a prerequisite to a job that at least promises stability has shifted first to the family, and then to the youth, with a subsequent loss of purchasing power.  The decline of pension plans and the rise of self-funded retirement has shifted that cost to the family as well, with a subsequent loss of purchasing power.  Couple these with the disproportionate rise in the actual costs of healthcare and higher education?  Disaster.

There is a terminal lack of purchasing power.  That the average American had nothing upon which to rely when social distancing shutdowns occurred, with no economic support forthcoming while the financial system and corporations were backstopped, fed a smoldering anger.  That small business was forced to shutter while certain large retailers were declared essential spiked that anger.  People can talk all that they want about the pandemic measures impinging upon their rights, but the underlying fear is that they face economic ruin unless they can return to their jobs.  Regardless of where you are on the political spectrum, it is ultimately an anger built upon the practical implications of the economic inequality that we have allowed to take root.

Perhaps the only remote silver lining to this freakishly misbegotten shit show is that it is occurring in an election year.  What we have known as an economy is functionally dead.  The national savings rate has spiked to 33% in April 2020 and the economic establishment states that we are hoarding cash.  Do you know what I say?

Good.

Why should we now spend for anything other than necessities?  Why should we spend when government and corporate policies make it clear that our families will receive no meaningful support?  Why should we upgrade and consume when the products, although ostensibly American, are built overseas and profits are disseminated only to shareholders and senior executives?

There is now a debate brewing in Washington as to whether the temporary additional weekly unemployment benefits of $600 should be extended past their July 31, 2020 expiration.  This is occurring because research finds that fully 68% of workers receiving unemployment benefits now collect more than their weekly wage.  Conservative legislators fear – understandably – that there is no longer an incentive to work and that such benefits constitute a moral hazard.  Yet they oppose an increase in the minimum wage.  They oppose any public sector financing for healthcare.  They oppose any increase in public funding for higher education and some even support decreasing funding for elementary and secondary education.  And they support a President whose 2020 budget proposal called for Medicare and Medicaid cuts to address a trillion dollar deficit.

And they do not answer the underlying question:  How have we come to the juncture at which wages are so disproportionate to what is required to survive in America today?

This is why the election year timing matters now.  There must be clear and progressive – even radical – policy choices made to help create a new model driving economic growth, one that is not piled upon the back of an American citizen again bereft of purchasing power and crippled by fear.

And yes, one that actively encompasses a real core of social justice.

Pandemic Food Price Index (6/2020): You Can’t Always Get What You Want…

May of 2020 saw the inaugural edition of the Pandemic Food Price Index, a 37 item market basket of foodstuffs to measure the impact of the Pandemic on grocery store food prices.  Here are the results and observations for June, 2020.

  • The nominal cost of the 37 item basket, shown at the bottom, was $89.09 in June.  This was an increase of $.58 from the May 2020 baseline cost of $88.51.  The Pandemic FPI is now at 100.66:

(((89.09-88.51)/88.51)*100) + 100 = 100.66

  • An increase of .66% seems minor, but this is monthly and extrapolates to an annualized rate of 7.92%.
  • A basket of 37 items priced at three stores yields a total count of 111 individual items.  Of these 111 items, seven were not in stock at the time, for an out-of-stock rate of 6.3%.  Note that for each of the seven items, there were still labels on the shelves, so the items were still for sale but just not in stock at that moment.  This compares to the three items out-of-stock in May 2020 (a rate of 2.7%).
  • Five of the seven out-of-stocks were at the largest, internationally owned grocer and involved three of the four Staple category items (Canola Oil, Sugar, Flour generics) and two of the three cereals (Frosted Flakes, Rice Chex generics).  In each of the cases however, there were other alternatives for sale so it really is a case of you can’t always get what you want…
  • The anomaly was a case of stealth inflation.  The local grocer was no longer offering the 32 ounce jar of generic grape jelly and had replaced it with an 18 ounce package at a lower price.  I took the per ounce price of the 18 ounce package and recalculated it to an equivalent price for the original 32 ounces, and this recalculated amount was used in the Index.  The nominal impact is a price decrease of 36% for the lesser amount, but the real impact, had the original package size been priced, would have been a 14% increase (a short worked sketch of this arithmetic follows this list).
  • The meat stocks were plentiful and what was notable in June versus May was the presence of the three pound “chub” of ground beef.  These packages first appeared in my local markets in approximately 2014 and were produced from factory-style meat processing plants; I suspect that the grocers were introducing them in response to the declining purchasing power of their customers.  These items were absent when I did the initial pricing on May 2 and their absence was due to the issues with Covid exposures at the factory-style meat processing plants.
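
A minimal sketch of the index arithmetic above, in Python.  The basket totals are the actual May and June figures from this post; the jelly prices are placeholders chosen only to reproduce the reported percentage changes for the package-size normalization step.

    # Index level and simple annualization from the composite basket totals.
    base_basket = 88.51    # May 2020 composite basket = index level 100
    june_basket = 89.09    # June 2020 composite basket

    index = ((june_basket - base_basket) / base_basket) * 100 + 100
    monthly_change = round((june_basket / base_basket - 1) * 100, 2)   # 0.66%
    annualized = monthly_change * 12                                   # simple, non-compounded extrapolation
    print(f"June index level: {index:.2f}")        # 100.66
    print(f"Annualized rate:  {annualized:.2f}%")  # 7.92%

    # Package-size normalization: reprice the shrunken package at the original size.
    def normalize_price(new_price, new_ounces, original_ounces):
        return (new_price / new_ounces) * original_ounces

    old_32oz = 2.50    # hypothetical original 32 oz price
    new_18oz = 1.60    # hypothetical replacement 18 oz price
    equivalent = normalize_price(new_18oz, 18, 32)
    print(f"Nominal change: {new_18oz / old_32oz - 1:.0%}")                        # -36%
    print(f"Real change at the 32 oz equivalent: {equivalent / old_32oz - 1:.0%}") # 14%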

Comments

As I noted, an increase of .66% doesn’t seem like much, but annualized to almost 8% in a nation in which family income has been smashed and Q1 GDP has dropped by 5%, it is highly problematic.  Fortunately, enhanced SNAP benefits are in place for the remainder of FY 2020 and WIC benefits are likewise enhanced through the end of September, 2021.  Where it is an immediate problem is for senior citizens relying upon their Social Security benefits.  Why?  Because food is only one component of the index used to determine the COLA increases that occur in the latter half of each year, so a spike in grocery prices barely moves the benefit.  If prices continue to rise, the effective purchasing power of the senior citizen will actually lessen, as the COLA amounts for the past five years have averaged 1.34%.
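
A rough illustration of that squeeze, using the figures cited in this post; this is a simplification for the sake of the argument, not a formal COLA projection:

    # Benefit grows at the average COLA while food prices grow at the annualized FPI rate.
    average_cola = 0.0134      # average COLA over the past five years
    food_inflation = 0.0792    # annualized Pandemic FPI rate from June 2020

    real_change = (1 + average_cola) / (1 + food_inflation) - 1
    print(f"Change in food purchasing power over a year: {real_change:.1%}")   # about -6.1%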

The grocery shelves are not bare and in each of the cases that I noted, there were other options available.  Even toilet paper is back in each store, albeit in smaller package sizing.  Rice was there but typically in larger sized bags and even in April and May, there was rice available but it was in brands sold to the Hispanic community and hence in a separate aisle.  The generic brand staples (canola oil, sugar, flour) were missing in the one store and although the stocks were minimal in the other two as well, there were reasonable amounts of name brand items available.  And no, I’m not going to count the damned bags of flour.  The indication is thus that people are doing what they can to stretch their dollar as they hunker down.

The renewed presence of the lowest price per pound “chub” had a significant impact here.  Had I used even last month’s lowest per pound price for ground beef, the FPI would have had a June Index level of 101.52 versus the actual 100.66.  Extrapolated to an annualized rate, the impact would be a hypothetical 18.24% rise versus the present annualized rate of 7.92%.  But I have severe qualms about those low prices, since those plants are operating under the aegis of the invoked Defense Production Act despite increased infection rates among meat packing workers caused by their working conditions.

And I wonder whether or not, were we asked as part of a communal sacrifice for the common good, we would be willing and able to forego so much meat as part of our diet.  It was what our great-great-grandparents did for the common good on the home front of the Second World War, sacrificing something for those on the front lines.  But in times of Pandemic, the front lines are here.

Pandemic FPI (6/2020)


Re-Boot: The Pandemic Food Price Index (FPI)

What is happening to the price of a market-basket of food as a result of the Covid-19 Pandemic and its economic effects:  (1) the upheaval in the supply chain and (2) the collapse in aggregate family income?

This two-pronged question is the reason for the resumption of the PracticalDad Price Index after an almost four year hiatus.  We know from the US Department of Labor’s report for April 2020 that food prices rose by the highest monthly amount in 46 years due to the supply chain upheaval.  It’s bad, but the report only gives figures for general food groups (meats, vegetables, etc.) and doesn’t go into further depth than that.  That kind of information will also miss the impact of the supply chain’s efforts to mitigate the cost increases and attempt to keep foods affordable for the shopper.

The modified index (for the original Index introduction, see here) will focus solely upon the original 37 foodstuff items, broken into categories of Meat, Dairy, Bread, Staples, Cereal, Produce and Grocery items.  The pricing will occur within the first five days of each month at the same three groceries, each unrelated to the others.  The prices for each item from the three stores will be averaged and the mean prices added together to find the total cost of the composite basket.  The total for the composite basket as of the May 2020 pricing will serve as the index baseline of 100.  The composite baskets for subsequent months will likewise be totaled and their results shown as a comparison to the initial index level of 100.
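
A minimal sketch of that calculation, assuming a simple mapping from each item to its three store prices, with None marking a price that could not be collected; the item names and prices below are placeholders, not survey data:

    # Average each item across the three stores, sum the averages, and index
    # the total against the May 2020 base of $88.51.
    MAY_2020_BASE = 88.51

    def composite_basket(prices_by_item):
        """prices_by_item maps an item name to its three store prices; None = not priced."""
        total = 0.0
        for item, store_prices in prices_by_item.items():
            available = [p for p in store_prices if p is not None]
            if available:                   # items missing everywhere drop out of the total
                total += sum(available) / len(available)
        return total

    def index_level(basket_total, base=MAY_2020_BASE):
        return ((basket_total - base) / base) * 100 + 100

    sample = {
        "white bread (store brand)": [1.19, 0.99, None],   # hypothetical prices; one store missing
        "5 lb flour (store brand)":  [2.09, 1.89, 2.29],
    }
    print(f"Sample two-item basket total: ${composite_basket(sample):.2f}")
    print(f"Index level for the full June 2020 basket reported above: {index_level(89.09):.2f}")   # 100.66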

Caveats:

  1.  This is not meant to be representative of national trends.  This is a three store survey in a single county and is meant to be a data point in a larger picture.
  2. I will discontinue the Index if I believe that the supply chain is so kinked that I cannot present an accurate picture in good faith.
  3. I cannot objectively verify inventory quantities within the stores, but I can provide commentary about what I am seeing and that can serve as anecdotal evidence.
  4. Within the past month, I have noted that products that are simply gone from the supply chain have not only disappeared from the shelves, but the shelf labels themselves have been removed or covered over with blank labels.  If the item is not on the shelves but the label is present, I will treat that item as temporarily out of stock and report the price as listed on the shelf label.
  5. The items priced are almost all store- or off-brand, which would be purchased by a shopper attempting to extend a fixed food budget.  There are rare instances in the Index in which a name-brand product is used and that same product will be priced moving forward to assure consistency.
  6. Pricing will occur, whenever possible, in the morning.
  7. If new alternatives are offered for an item, as happened continuously in the old Index, that alternative will be used then and afterwards to assure consistency.  The same will happen with package sizing.

The May 2020 FPI results are shown in the linked pdf below.  Note that a few items are not listed in two of the columns; these items were part of the original survey but they were not available in those stores, even after searching for shelf labels.

The Total Cost of the May 2020 FPI is $88.51 and that amount will serve as the Index level of 100.

FPI Base Results – 5/20


Resurrecting the Price Index: Rationale

We are again in Terra Incognita and our only guides are a few accumulated studies of a hundred year-old predecessor pandemic.  This is like trying to find the most efficient route from New York to Salt Lake City using a 1932 Rand McNally map.  The fear is palpable, not least the concern about the national food supply, especially since John Tyson of Tyson Foods took out full page ads warning that the food supply chain was breaking.  While that warning was going national, there were also warnings about the virus having an inflationary impact on food prices.  Are there serious problems?  Absolutely.  Are they as terrible and fear-inducing as it appears?  Not necessarily.

Societal shocks happen and they are always followed by fear, if not outright panic.  Our history is that we have had problems with food prices and supply before, most notably during the Second World War, and we managed to survive.  What is different is that Franklin Roosevelt’s government had sufficient advance notice that there would be another war and had begun thinking ahead.  Today?  Well…that depends upon whom you ask.  The simple reality is that there are no easily ascertainable data points to assess developments at the retail grocery level and the lack of data feeds uncertainty and fear.

This is why I am resurrecting the original PracticalDad Price Index – which I calculated monthly from November 2010 to September 2016 – and modifying it to follow activity within the grocery stores.  The original index was created as a kitchen table project to ascertain the impact of the Federal Reserve’s novel Quantitative Easing programs upon food prices in my vicinity.  It was calculated both to satisfy my own curiosity and to serve as a small data point for a larger online community.  It’s one thing to read wonkery, but another to actually see it in concrete terms.

The modified market basket, methodology and results will be covered at length in the next article.  For now, let’s go from pondering questions of inflation at the molecular level of three local grocery stores to a more global perspective on prices and inflation.

Understand that inflation is simply the decrease in a currency’s value, as measured by its purchasing power for goods and services.  There are three principal reasons for this.

First, there is demand:  people are willing to pay more for an item while the supply of that item remains relatively constant.  The stunning rise in the price of a single Bitcoin is an example, as is that glorious moment in our history when Americans believed that a house was an ever-appreciating investment.  Our realization that a house wasn’t such an investment is a good example of the inverse, deflation.

Second, an inflationary or deflationary effect can arise out of a good’s supply.

The oil supply shock of the 1973 OPEC oil embargo caused prices to spike simply because there was an immediate halt to the flow of oil to the US with no corresponding decrease in demand to offset it.  In economics parlance, this was a simple shift in the supply curve to the left with demand being equal.  The shift from Intersection A to Intersection B resulted in a rise from a price of P1 to a price of P2.  Real life, unfortunately, was quite a bit messier.
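To put toy numbers on that shift (these are invented figures for illustration, not the data behind the chart), here is a small Python sketch that solves for the equilibrium price before and after a leftward supply shift, with demand held constant:

# Toy illustration of a leftward supply shift with demand held constant.
# All curves and numbers are invented for illustration only.

def equilibrium(demand_intercept, demand_slope, supply_intercept, supply_slope):
    """Solve Qd = Qs for linear curves:
       demand: Q = demand_intercept - demand_slope * P
       supply: Q = supply_intercept + supply_slope * P
    """
    price = (demand_intercept - supply_intercept) / (demand_slope + supply_slope)
    quantity = demand_intercept - demand_slope * price
    return price, quantity

p1, q1 = equilibrium(100, 2, 10, 1)    # original supply: P1 = 30, Q1 = 40
p2, q2 = equilibrium(100, 2, -20, 1)   # supply shifted left: P2 = 40, Q2 = 20

print(f"Price rises from {p1:.0f} to {p2:.0f}; quantity falls from {q1:.0f} to {q2:.0f}")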

Finally, the purchasing power of a currency can be affected simply by the sheer amount of money available within the system.  What most have forgotten is that inflation for food – particularly meats – was already an issue prior to the 1973 Embargo.

There were calls by housewives for “Meatless Meals” as a push-back against grocers and protests broke out across the country.  Housewives blamed grocers and the grocers pointed their fingers at farmers, who kicked the can of blame to the feed producers.  Where did the final responsibility rest amidst this idiot firing squad?  It actually rested with the persistent and significant increase in the amount of money within the economy starting in the 1960s.  The Federal Reserve itself terms that era The Great Inflation and notes that the period began in 1965 and ended in 1982.  Why those years?  Because 1965 saw the beginning of LBJ’s Vietnam build-up as well as the inception of his Great Society programs.  And 1982?  That’s when then-Fed Chairman Paul Volcker turned off the tap and ratcheted interest rates to nosebleed levels to rein in the resultant inflation.
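One conventional way to formalize this third channel is the quantity-of-money identity, M x V = P x Q:  if the money supply grows much faster than real output while the velocity of money stays roughly constant, the price level has to rise to absorb the difference.  A rough sketch with invented numbers, not actual Great Inflation data:

# Rough sketch of the quantity-of-money identity: M * V = P * Q.
# The numbers are invented for illustration, not actual 1965-1982 data.

def price_level(money_supply, velocity, real_output):
    """P = M * V / Q"""
    return money_supply * velocity / real_output

p_start = price_level(1000, 2, 500)   # money supply 1000, output 500 -> price level 4.00
p_later = price_level(2000, 2, 600)   # money supply doubles, output grows 20% -> 6.67

print(f"Implied price level rises from {p_start:.2f} to {p_later:.2f}, "
      f"about {(p_later / p_start - 1) * 100:.0f}% cumulative inflation")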

There is nobody – absolutely nobody – who can tell how this plays out in the grocery aisles.  There are competing articles about incipient inflation and incipient deflation, which is really where the mass of Americans has been stuck since 2009.  So take a side and argue away because each argument has merit and honestly, the act of arguing serves little more than to satisfy a primal intellectual urge, a form of mental masturbation.

What this C-19 Pandemic has managed to do, however, is create a situation in which all three factors are now simultaneously in play amidst the real economy.  In the short term, the supply of specific food groups is curtailed and, all things being equal, there should be a spike in the prices for those groups.  But all things are not equal here because while there’s the supply question, the American people have suffered a cataclysmic – and that’s an appropriate word for these circumstances – demand shock in lost income, strained as it already was for the past decade.  Refer back to the Supply/Demand chart shown above.  The family-income-supported demand curve hasn’t so much been shifted left as it’s been tossed into the bottom corner like a broken corpse.

Our present drama is playing out amidst these first two factors of compromised Food Supply and cratered Family Income, contesting one another like fighters in the late rounds of a championship bout.

But the third factor, the Money Supply, is waiting quietly outside of the ring and that is what literally awakens me on the occasional night.  In the wake of the 2008 Financial Crisis, the Fed’s three QE programs created trillions of dollars and in the process expanded the Fed’s balance sheet to amounts previously unimagined.

Growth in M1 (1975 – 2015)

That it didn’t tear Joe Six-Pack a new one is a testament only to the fact that the economic, legislative and electoral policies since the repeal of Glass-Steagall literally created two stand-alone economies:  one awash in wealth for the uber-wealthy and another that diminishes the American Middle Class a little bit more each day.

My wife once asked me, if the goal was to create inflation, where is it?  The inflation is there.  It is encased in the equity markets and the prices for exclusive properties in places like New York, San Francisco, the Hamptons and Potomac, Maryland.  It is encrusted in the wealth accrued to billionaires and near-billionaires and their purchases of art, excess consumption and doom shelters in New Zealand.  It is wrapped up in projects such as Jeff Bezos’ effort to create a ten thousand year clock as a monument to long-term thinking, the vicious irony being that it’s financed by a predatory reliance on short-term quarterly results.  And the inflation is locked away in the purchases of items of alternate value such as cryptocurrencies and precious metals, which are now physically almost unobtainable despite having a stable paper price.  Go figure that one out.

As global economies pursued this race to the bottom with their respective currency values, the Fed acknowledged that it had to begin raising interest rates to something remotely approaching historic normalcy.  It’s not surprising that the stock market became cranky during this period because its flow of cheap credit was threatened.  There’s a reason that President Trump demanded zero and negative rates from the Fed, regardless of the damage that these rates do to real activity.  But in the immediate aftermath of the Pandemic’s onset, the response was to salvage the economy by again dropping rates and extending lifelines to a wide variety of corporate and financial entities.  The result of these lifelines from the government and the Fed?  $6 Trillion in the course of the two-month period ending April 15, 2020.  That money is now coursing through the bond and equity markets, which have stabilized since the roll-out of the various programs.

Yet the average American gets a one-time check of $1200 with an additional $500 per child.

At the end of the day however, our economy is built upon the premise that Americans must spend for any recovery to happen.  That’s why the Administration pushes to get the economy re-opened and money flowing, even though the infection and fatality numbers in many areas still fail the President’s own criteria for re-opening.  That is why we hear establishment commentary conflating legitimate saving with ridiculous terms like “hoarding cash”.  Sure dude, I can’t cover a $400 car repair but yeah, I’m good for a beach week to help the economy.  Ultimately, the average American will not be able to consume unless the Federal Government renders real and meaningful assistance and the two bifurcated economies are rejoined in even the loosest fashion.  Whether it is debt relief, guaranteed income or any number of other programs that remove the noose from the neck of the 99% and/or ratchet down upon the 1%, the economies must be rejoined and a re-balancing must take place.

That’s when the trillions of dollars set loose since 2009 are liable to return.  If it happens, that money will begin flowing through the real economy and we will be set up for a replay of The Great Inflation, except that the Americans of this generation won’t have the financial health of their great-grandparents to survive it.  The resultant inflation will ignite, and what we witness in the next year will look like child’s play in comparison.

It is possible that these fears won’t be realized.  But make no mistake that the American economy – and the political body – is seriously ill.  One of the criticisms of the repeated actions of the Fed’s QE programs is that it’s akin to treating cancer with copious amounts of painkillers.  The patient feels better but the cancer continues unabated.  At some point, the treatment must occur in all of its unpleasantness.  As a survivor of cancer and any number of other medical issues, I attest to the value of a painkiller in the moment; I also understood in the moment that my survival was predicated upon a simple submission to the treatment and all of the side-effects.

Apart from the sheer ability to draw breath for yet another day, there’s an upside to survival.  It is the understanding that despite the worst fears in the moment, they are, at that time, only fears and not guaranteed realities.  You learn to acknowledge the fears and set them aside, managing your life one step at a time and taking each step as it comes.  The fears are there.  They are real.  But until they actually occur, they can be managed one step at a time.

So it will go in the grocery store.  We will manage as best that we can because that is ultimately all that we can do:  our best.  In the meantime, I will work to put a recognizable face to the abstract notion of the cost of food and the reality of the supply chain.  That will be the next article:  The new Market-basket.

 

Notes on the Supply Chain

As the country leans into a lockdown and fear intensifies, there is another side-bar conversation about the strength and/or fragility of our supply chains.  Our out-sourcing of pharma and manufacturing has bitten us in profound ways but apart from ventilators and PPE, that is a step removed for many.  The immediate concern for most pertains to the food supply chain, which adds yet another layer of tension to an already fearful scenario.  Large numbers of people now enter the grocery store intent on finding what they can before they are potentially gob-smacked with someone’s aerosolized germs.  But what is notable about this grocery scenario and can we draw inferences for moving forward?

Yes, we can.

Let me start by explaining my background.  First, I have not only been the stay-at-home parent who has done the bulk of the shopping and cooking over the years, but I am also a data-driven economics geek.  My wife, BH, now takes a greater role than she did earlier and much of what lands on the shopping lists now emanates from her facile mind.  But in the early years of toddlers and small children, this was predominantly my responsibility.

Where this merged with economics was in 2009-2010, in the aftermath of the Financial Crisis and deep recession.  At that moment, the Federal Reserve Chairman was Benjamin Bernanke, and it was clear that The Powers That Be had advance notice of problems at his 2006 start; his area of academic expertise was the Fed’s errors in responding to the 1929 crash and the Depression that followed.  Bernanke had argued that the Fed exacerbated the stock market collapse by failing to provide liquidity for the market as it struggled.  His response, unproven and academic, was that the Fed should have provided as much liquidity as possible, and the collapse in late 2008 provided the opportunity to test his theory by supplying liquidity via the first of multiple rounds of Quantitative Easing.  The debate, loud and rancorous on economics and finance blogs, was whether this untested theory would work or instead spark rounds of runaway inflation.  My own questions went to how this would affect my own family.  Because I had been involved in the establishment of a local cost-of-living survey in my distant past and had spent time conferring with its national creators, I decided to leverage this experience into the creation of a kitchen table project, the PracticalDad Price Index.

The Index kicked off in November 2010 and focused upon a market basket of 47 common grocery items.  My intent was to see what happened to the prices of this local basket as the QE program – and its successors – rolled through the economy.  It continued monthly until personal circumstances dictated its ending in September 2016.  An offshoot of this focus upon pricing for almost six years was a new appreciation for the food supply chain.  It’s not typically notable unless something is wildly amiss, such as a run on toilet paper in a pandemic, but over that span, there were distinct changes in the grocery supply chain as grocers and suppliers adjusted to the ongoing decline in purchasing power by a weakening American consumer.  What is notable about the supply chain?

First, the name itself is misleading.  We talk about the supply chain as though it were a singular, monolithic entity with a single controller, but it isn’t.  The supply chain is a dynamic – almost organic – entity, involving the input of thousands of retailers and suppliers in a geographically and economically diverse nation.  It evolves over time to respond to the data fed to it via market and economic research and the sheer volume of literally billions of transactions involving an untold number of products at different price points.  It is continually changing as grocers and producers meet consumer changes in spending power, habits and trends.  Some entities fail in bankruptcy or are taken over by competitors.  Others offer cheaper alternatives for sale to the consumer.  The point is that it can and does change in real time.  Personally, I envision not so much a chain as the visible sinews and tendons of the economic body, working both individually and collectively at the same time.  One sinew would be dairy and another produce, yet others involving meat sources and consumer non-durables such as health and beauty products.  Each sinew answers to distinct inputs and trends with the collective result of an economy reliably providing needed goods to the consumers.

Second, the supply chain is built to respond reliably within a certain timeframe BUT the pandemic has shortened that ruinously.  The inputs that drive the process are now wildly disordered and the processes are momentarily overwhelmed.  The consumer, already declining, has had a catastrophic loss of income.  Entire sectors of the economy are suddenly and completely closed.  There is an immense and out-sized need for certain items, particularly related to disinfectants and cleaners, that utterly outstrips the ability of those sinews to meet those needs.  There is concern that the food sinews will be compromised for fear of viral infection among those workers.  This doesn’t even touch upon toilet paper, the disappearance of which suggests that most Americans believe Covid-19 will completely deforest the continental United States.

It was reported this week that dairy farmers in some regions were forced to dump raw milk, an infuriating development when millions are suddenly unemployed and food banks are increasingly stressed.  My original take was that it reflected a collapse in dairy pricing, as occurred during the Great Depression; in that period, farmers and herdsmen destroyed crops and dumped milk because it was the only way to bring supply into equilibrium with a break-even point that supported even minimal prices.  Another article explained the rationale behind the decision to dump and while immensely frustrating, it makes sense.  In the Great Depression, episodes of dumping only occurred after years of being mired in prices that wouldn’t support even a break-even.  This episode is again founded upon the concept of time:  the inability of dairy producers to find the packaging that would allow the product to come to market to meet the suddenly soaring demand.  The supply chain is not built for and cannot adapt to a suddenly shortened time frame.

Third, the factor of time now also drives many of our shopping habits.  American workers and families have felt the pinch of demands upon their time and this has carried over to the grocery shelves.  Many products are now processed in some way or pre-packaged with the intent of minimizing the time required to cook and serve.  The cost of such products, however, is driven upwards because much of the labor of preparation has been taken up into the manufacturing process.  In essence, time truly is money and it’s a trade-off that many Americans have made for decades.

Fourth, observations from recent grocery trips indicate several things.

  • The scarcest items are those that either require the least amount of household labor to prepare or require a higher amount of pre-market processing or travel in order to bring to the shelves.
  • The produce sections at the entrances of multiple groceries have been consistently well-stocked, except for lettuce (which is hilarious since my wife routinely reminds me that lettuce is mostly water and the least-vegetable vegetable on the planet).  I have been surprised to find that bananas and citrus are still plentiful although that might change as the travel network degrades.
  • Canned goods have been in persistently high demand for their long shelf life but they have remained available; this is liable to change if the virus depletes the workforce in the plants.  Likewise for canned soups, pastas and sauces, peanut butters and orange juice.  There are instances in which there are less popular types of canned vegetables or beans in greater quantity as people ignore them for the more commonly preferred types.
  • Non-dairy and specialty milks (Lactaid/soy and almond) have been depleted but there has been a reasonable supply of locally sourced standard milk (whole, 2%, non-fat).  Likewise for yogurts and cheeses.  Specialty yogurts requiring greater processing are depleted while simpler yogurts have been there in sufficient amounts.  Locally sourced and block cheeses are available while the shredded variety is more depleted.
  • Stocking of the meat cases was sporadic.  I noted smaller amounts of ground beef and boneless/skinless chicken while there were still sufficient amounts of other meats.  This observation was confirmed on the grocer’s website with the note that prices were higher and availability more limited, but that this should remedy itself in the near future.  Eggs were available in sufficient amounts but the price per dozen had almost doubled; the grocer noted that this should revert to normal shortly.
  • Breads were completely out of whack as those products requiring further processing are in short supply:  Schmidt’s 647 loaves are a prime example.  Other popular standards were sold out and one local grocer was replacing them with simple store-baked white bread loaves.  My experience with one of those was that it grew mold far more quickly than its commercial-bakery counterpart, indicating a lack of preservatives.
  • While there has been consistently brisk movement in canned vegetables, I noted on occasion that their 19th century predecessor, glass jars of pickled vegetables, sat almost untouched.

What are the takeaways moving forward?  I’m operating under the premise that this pandemic will come in waves, like its 1918 Spanish Flu predecessor, and is likely to last into 2021 before ending.

First, the supply chain will reassert itself and adapt to the new conditions of problematic supply/processing and fewer consumer dollars.  The gist will be to save dollars by shifting the labor cost back out of the factory and into the household.  For example, instead of spending money on highly specialized yogurts, consumers will opt instead to purchase the simpler variety and add their own fruit or flavoring.  Instead of spending on canned beans, consumers will opt to reassert their time in the kitchen by remembering to put dry beans in a pot of water the evening before cooking.  Food preparation will become a more deliberative and time intensive activity as it was for our great-grandparents and forebears.

Second, consciously or not, people will begin to expand their own food supply chains so that they aren’t reliant on a grocery store.  I expect a return to gardening with the rise of the Corona Garden, much as the Second World War saw the rise of the Victory Garden.  As stay-at-home orders have rolled out across the country, there has been a significant increase in seed sales as well as a near sell-out of chicks.  Communities are likely to follow their 1970s  predecessors and set aside lands for more community gardens for those who do not have sufficient personal space to support a garden.  Another example of this would be our joining a CSA last year for produce, cheese and eggs.

Expand your supply chain within the store itself.  Seek out alternative foods that are more plentiful than the standards and try them.  Middle is presently back in the household for the duration.  When he joined me the other week at the store, we were discussing his new appreciation for Indian and Halal food, and when we went to the small foreign food section, it was almost fully stocked with rices, sauces, spices and chickpeas.  And yeah, the guy did a creditable job on an Indian meal.  Think of it as an adventure if you’re an optimist and a “you’ll eat it and you’ll like it” experience if you’re a pessimist.

Third, take time to do more planning.  Consider your menu choices as you walk through the next one to two weeks and buy accordingly.  As a society, we will no longer have the money nor the inclination to meander through a grocery store browsing for the next great impulse buy.  I suspect that lingering will be a thing of the past in stores.

Finally, be mindful of others when you are shopping.  Our community’s church-sponsored food bank noted a 360% increase in the number of families requesting food assistance over the course of a single week.  During non-growing periods, the food banks are going to be more dependent on canned and processed foods, and those still able to get to the store will be in a better position to purchase fresher foods and cook for themselves.  Also consider essential workers and their families and leave the more easily prepared foods for them, because cooking isn’t likely to be on their agenda after a busy shift.

 

 

So the Millennials Like Socialism…

It started as an online survey by victimsofcommunism.org and has wound its way through the media, both news and social.  “It” is a survey finding that approximately 70% of American Millennials (born between 1981 and 1996) would vote for a socialist candidate instead of a non-socialist.  It has fed a breadth of spin-off articles breathlessly reporting the results as well as a slew of memes – many troll-created – mocking millennials.  One such meme crossed my Facebook feed several weeks ago.

My response?

Why so surprised?

If folks are surprised that upwards of 70% of millennials would support a socialist, then consider this percentage:  80% of millennials don’t expect to receive Social Security when they reach what we consider retirement age.  I’m surprised that so many of the X, Boomer and Silent generations are so obtuse as to consider this news.  What Millennials have witnessed from their earliest youth is the Great Reversion, a thorough dismantling of benefits and privileges that were earned by and afforded to their elder generations:  income, education, health insurance, job opportunities…all of it.  Millennials are the first generation to be raised and come of age in this period, while their generational elders had at least some benefits of the preceding society and economy.

Maybe we need to first determine if Millennials are talking about the same Socialism as their elders.  Just remember this at the outset:  most individuals don’t reach a meaningful state of political awareness until at least their teens and what they witness during that period will largely shape their long-term political outlook.  So…what is Socialism?

There’s a distinct difference between what is meant by the two generational groups.  That the original survey came from victimsofcommunism.org is telling;  it is a non-profit organization created as an “educational and human rights foundation” (per their website) by unanimous Congressional action after the collapse of the Soviet Union.  The two principal nations – the Soviet Union and Communist China – billed themselves as Socialist and those most affected by their atrocities – including the generations of Americans who engaged in a sometimes deadly Cold War against them – will identify Socialism with the death and damage wrought by them.

Millennials view Socialism as something different, however.  In the earliest years of the Millennial period, the Soviet Union was in decline and a distinct political resistance had formed in Poland.  When Millennials reached elementary school, the Berlin Wall fell and was followed within two years by the collapse of the Soviet Union itself.  The existential threat of totalitarian Socialism ceased and Millennials came of age without noting it as a meaningful factor in their lives.  As the earliest Millennials aged and were joined by their younger peers, they found a new brand of Socialism in the countries of Europe, later the European Union.  In many countries, there was free or minimally priced healthcare for the citizenry.  There was also heavily subsidized and reasonably priced higher education, as well as a network of state-supported social programs that assisted citizens.  That these nations had free and democratic elections put a stake in the notion that Socialism, as experienced by their elders, was evil and deadly.

What Millennials hadn’t experienced, which their elders had, was determining how these programs were funded.  There’s an aphorism of uncertain origin:  if a person isn’t a socialist at 25, then he has no heart.  If he isn’t a conservative at 50, then he has no head.  Generations disagree with one another.  I once argued with my parents about taxes and drove my mother to a near-stroke by arguing that we should be willing and ready to pay our fair share of taxes; my father reminded her that I would soon be paying my own taxes as an adult and my attitude would probably change.  He was right and my willingness to yield my earnings to the government declined  dramatically when I was responsible for putting a roof over my own head.  But that dinner conversation was decades ago and despite graduating from college in the midst of a serious recession, my wife and I were privileged to enjoy the benefits of that period before Things Economic went seriously south.

How far south?

Far enough south that the youngest Millennials are through college and recognizing that the economic odds are stacked against them.  Think about it:  your hope for a middle-class life is dependent upon having some form of higher education, yet obtaining that degree will leave you with an average student loan balance of $35,359.  If you land a job with health insurance, it’s increasingly likely to be a high-deductible plan since more employers are shifting in that direction to offset the rising cost of offering insurance at all.  Fully 66% of all personal bankruptcy filings are attributable to the impact of medical bills, even with the presence of health insurance.  Housing is going to be costlier as the student debt load impacts your ability to save for a down payment to buy a house, yet the median rental cost increased by almost 50% from 2001 through 2015 while median household income was static over that same period.  You will be responsible for your own eventual retirement via personal savings and you can expect that the Social Security net will be exhausted and closed.  And honestly, if persistent mass shootings in public venues and schools elicit nothing more than thoughts and prayers from those in power, can you actually believe that any meaningful assistance will be forthcoming from that same group?

Millennials are learning how deceptive the American economic system has become.  It has been based for decades upon the notion that we are consumers with a crucial role as an economic driver first for the domestic, and later, for the global economy.  What we are experiencing is that we have instead become the consumed, fodder for the corporate predators who have gained a disproportionate level of control in society.

Yeah, it’s daunting.  If I were a Millennial, I would find it daunting.  So they will  band together as a voting bloc to push for a public response that helps them, much as their great-great-grandparents did when they elected FDR in a landslide over Herbert Hoover.  As the American Middle Class continues to erode, the Millennials are living the deterioration and are willing to forego a larger percentage of their present earnings in the expectation that their futures aren’t those of poverty and hopelessness.

One final comment.  I like Sam Elliott and if there is such a thing as reincarnation, I want to come back as his gloriously badass mustache.  But let’s do it homage by not taking it in vain on what is a meme likely created by trolls to sow further discord.  Take a moment to try to walk in the shoes of a Millennial and you’re liable to find that they can’t afford the kind with good arch support.

 

 

 

The American Family Changes…

…these are serious structural changes to the economy that will necessarily flow into so many other facets of our lives – food and cooking, housing, education, medicine, child-rearing.

      –  PracticalDad, The Great Reversion  June 27, 2013 

The Great Reversion, which kicked into overdrive with the Financial Crisis of 2007, has now run headfirst into the social institution that the Conservative movement exalts:  the American Family.  Change is constant although most is ebb and flow.  But now, multiple separate data-points about the American family support the concept that its structure is changing in response to its long-term financial circumstances.

Let’s be clear.  There is no longer a true monolithic model for the contemporary American family and no one can lay claim to it, despite what the Religious Right likes to think.  But the separate data-points indicate that the great mass of families – religious or not – are looking at their respective long-term circumstances and making rational, family-unit-level decisions to best situate themselves for what they perceive to be their future.  We all know the mass of economic data-points showing what’s amiss:

These kinds of circumstances have an impact, however, and that impact is now reflected in the long-term decisions of the family adults.  How so?

First, America’s Total Fertility Rate – the number of children that the average woman would bear over her lifetime at current rates, and the figure measured against the replacement level – declined in 2018 to 1.73, the lowest point since the Oil Crisis/Inflation period of the mid-1970s.  That was a bleak period two generations ago and I recall a conversation with a gentleman who commented that he and his wife were nervous about bringing new life into a world that was, in the moment, intimidating.  Circumstances improved, however:  the Berlin Wall fell and the Soviet Union collapsed; even with 9/11, we entered a period in which homes were larger than ever and housing prices would only ever go up.  Money was cheap and anybody who could fog a mirror was able to borrow large amounts for increasingly unpopular McMansions.  And with that increase went the Total Fertility Rate.

Until approximately 2007, however, when the wheels came off.  Since then, the TFR has dropped and its low point is confirmed by a second fertility statistic, the General Fertility Rate, which measures the rate at which women are currently having children.
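For readers keeping the two statistics straight, the General Fertility Rate is normally reported as births per 1,000 women of childbearing age in a given year, while the Total Fertility Rate sums the age-specific rates to estimate how many children the average woman would bear over a lifetime at current rates.  A small sketch with made-up numbers:

# Sketch of the two fertility statistics discussed above; all numbers are invented.

def general_fertility_rate(births, women_of_childbearing_age):
    """GFR: births per 1,000 women of childbearing age in a single year."""
    return births / women_of_childbearing_age * 1000

def total_fertility_rate(age_specific_rates_per_1000, years_per_age_group=5):
    """TFR: expected births per woman over a lifetime, summing age-specific rates."""
    return sum(age_specific_rates_per_1000) * years_per_age_group / 1000

asfr = [15, 70, 95, 90, 45, 18, 3]     # hypothetical rates for seven 5-year age groups
print(total_fertility_rate(asfr))       # 1.68 children per woman
print(general_fertility_rate(62_000, 1_000_000))  # 62 births per 1,000 women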

US Fertility Rates/courtesy Pew Research

The typical family is looking at its prospects and saying Nah Bruh, I think that I’m good for now…  And this is playing out in the second data-point.

Next, more younger couples are having only one child.  This is now the fastest growing cohort of families; it doubled in the four decades from 1976 to 2015, from 11% to 22%, and if the article is correct, it’s not going to slow appreciably in the near future.  It hasn’t been a purely financial decision, since part of the interplay is delayed motherhood arising from greater participation in the workforce and the opening of previously closed career pathways for women.  But my suspicion, gut at best, is that people are looking at the cost of raising a child, even excluding higher education, and holding at one.

Third, the American family itself is quietly morphing from its historic nuclear structure to a multi-generational model.  What we consider the traditional nuclear family has been rooted for generations in the two-generation unit, parents and biological children together.  It has shifted as racial, cultural and gender lines have blurred, so that a modern nuclear family can be headed by parents of two separate races or the same sex, and the children can be adopted instead of related by birth.  Studies have shown that this particular unit structure can be found in records as far back as the 13th century in England, but 20th century sociologists linked the economic development since the 19th century Industrial Revolution, as well as our own domestic economic growth, to the widespread prevalence of the nuclear family; it was this foundational unit that was able to move to where the opportunities for economic advancement were then available.

One particular economic issue today pertains to this very concept of labor mobility.  Economists have noted in the past several years that the percentage of Americans willing to move for employment has dropped by half from the 1980s to today, from 20% to about 10%.  It’s notable that from 2012 to 2017 alone, this number declined from one in eight Americans to one in ten.  Labor mobility matters because it allows for the best match of labor demand and supply so that productivity is maximized at the greatest benefit to labor.  Consider Detroit’s auto industry in the early to mid 20th century.  American automakers were able to turn out autos at such a rate because they were able to obtain a healthy supply of labor, much of it from the Black communities of the American South.  For all of the social issues that were engendered, the pay for Black workers in Detroit was still higher than what they were able to earn in the Jim Crow South and significant numbers of Black families moved northwards to take advantage of it.  But when labor mobility declines, as it has, then there is a mismatch between the demand and supply of labor and each side suffers.  Middle had a college classmate who graduated with a degree in video sound editing and his goal was to move to Silicon Valley; but with the cost of housing so wildly out of reach for the average person, this youngster would have joined others living in vehicles as they worked in their chosen field.  The result?  He stayed on the east coast.

If the nuclear family is a two-generation unit, parent(s) and children, what then is the multi-generational model?  The first thought of many Americans is that of The Waltons, the Depression-era family portrayed on the iconic television show of the 1970s.  They were a nuclear family that became a multi-generational family by dint of having the grandparents live under their roof.  But multi-generational is more than that.  According to sociologists at Pew Research, the multi-generational family model is composed of parents and adult children past the age of 25, or grandparents and grandchildren, or any other combination of more than two generations.  Right away, we recognize two circumstances that come into focus from this definition.  First, the number of young adults now living at home because of their student debt load:  as of 2016, approximately one third of adults between 25 and 29 lived with their parents, triple the percentage who did so in the 1970s.  The second is the rise in the number of grandparents caring for grandchildren because of the parents’ instability due to economic factors or, more tragically, drug addiction.  The raw numbers aren’t huge, but the percentage of grandparent-headed households has still increased in less than a decade.

Percent/Nominal Rise in Multigenerational Families/courtesy Pew Research

When you note the rise in the percentage of multi-generational families since 1970, also consider the arrival of the immigrant family; both Asian and Hispanic families tend to have more than two generations under the same roof, often for financial reasons.  Even so, the percentage of multi-generational families has risen across all racial demographics.

But these factors account for what has happened thus far and don’t necessarily reflect the impact of what will come; expect the multi-generational  model to make far greater inroads as we move ahead.  Simply put, there are going to be far more elderly Americans with far less savings to support themselves through their remaining years and the existing social infrastructure for their care is seriously insufficient.

The first thing to understand is that there is no longer a single demographic cohort for the elderly, and these cohorts aren’t growing at the same rate; there are seniors, the elderly and the Old AF.  The demographic models are such that the number of elderly Americans, 65 and above, will outnumber young Americans by 2035.  Moreover, the number of those between 75 and 84 will increase by 100% and those above 85 by 200% by 2050.  The growth of the elderly population is going to outpace the number of workers available to support them via government-financed social programs.

The second factor to consider is the state of the seniors’ finances.  According to the Transamerica Center for Retirement Studies, the median savings for people in their 50s and 60s is $117,000 and $172,000, respectively.  Many in those age cohorts recognize that it isn’t enough and fully expect to continue working past the traditional retirement age of 65; the percentage that actually do is now 20%, double the 10% level of 1985.  Coincidentally, the percentage of older Americans still working was 29% in 1949, about the same time at which the percentage of multi-generational families was as high as it is now.

The paucity of savings is further complicated by the global experiment with artificially low interest rates.  Our national monetary policy for longer than a decade has been to push interest rates to the lower ranges to both encourage consumption – I have to hold back a laugh when I consider the prevailing credit card rates – and assist in managing the interest costs of our national debt.  This is good for the federal government and for companies, which have persistently taken on large amounts of debt for the purpose of buying back their stock.  But it is horrendous for middle-aged and elderly savers who, at one time, depended upon the interest income from a lifetime of accumulated savings to fund their nest egg.  As rates have remained consistently low for more than a decade, those in or approaching their senior years have been forced to shift their investment focus to riskier investments in the hope of obtaining a higher return.  This is a sea-change from the traditional approach of shifting to safer and more stable portfolios as retirement is reached.  If you are 56 and have $172,000 in savings, you are going to run a greater risk of losing it before you hit your final years.
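To put rough numbers on that squeeze (the rates here are my own illustrative assumptions, not figures from the Transamerica study), consider the annual interest thrown off by that median nest egg at a historically ordinary rate versus the rates savers have actually seen since 2009:

# Back-of-the-envelope: annual interest income on a nest egg at different rates.
# The $172,000 figure echoes the median savings cited above; the rates are
# illustrative assumptions, not data from the Transamerica study.

nest_egg = 172_000

for label, rate in [("historically ordinary rate", 0.05),
                    ("post-2009 savings rate", 0.005)]:
    print(f"{label}: ${nest_egg * rate:,.0f} per year")

# Roughly $8,600 per year at 5% versus $860 per year at 0.5%, a drop of
# about 90% in income that pushes savers toward riskier assets.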

The last factor is that the infrastructure for elder care is simply inadequate for the number of older Americans coming down the pike.  Elderly Americans are covered for many, if not all, medical expenses via the Medicare program; most importantly for very elderly Americans, this can include some, but not all, aspects of nursing home residency.  Corollary expenses, such as hands-on assistance with Activities of Daily Living (ADLs), are not covered and are left to the individual or family to pay.  In addition, there is a cap on the per diem fee that Medicare will pay nursing homes for Medicare patients, so there is limited profitability for nursing homes in the Medicare program.  The upshot?  There might be a specific number of nursing beds available in a locality, but only a subset of those are available to elderly citizens whose primary coverage is via Medicare, simply because of insufficient revenue; this isn’t a question of the rate of return on the business but of maintaining any profitability at all.

The other aspect to the infrastructure question is simply one of labor.  The dispersion of the American elderly demographic isn’t uniform and some areas are harder hit than others.  Maine is now what the World Bank classifies as a super-aged entity, meaning that a fifth of the population is older than 65 years of age; it is the first state to reach this milestone and by 2030 – 11 years from now – more than half of the states in the country will cross that threshold.  If there are an insufficient number of nursing home beds available for the most infirm, then the next best step is to do everything possible to keep them in their own homes.  It is less expensive and theoretically possible to make this work – except that there are vicinities in which there isn’t enough available trained labor to support that goal of in-home services.  Maine is the first state to face the situation and service providers are simply declining to take on new clients because they just do not have the people to provide the services; the families who are in the area are then forced into situations, often intense, for which they have neither the resources nor the training.

Let’s connect the dots.

  • The elder generations will grow significantly and disproportionately, relative to younger generations.
  • These generations lack the assets to support the care that is likely to be required in their much later years as debility and deteriorating medical conditions require greater spending.
  • The infrastructure, both physical and labor, for elder care is insufficient at present in many locations for this growth.
  • The present conservative political sentiment will preclude significantly increased spending on elder care programs and much of the burden will continue to be shifted back to the family unit, as has already happened with retirement, the cost of higher education and health care costs.  Even if there is a massive shift towards greater social spending, the conversation among the Democratic candidates relates to healthcare and higher education benefits with little mention of eldercare.

There are simply too many soon-to-be elders and they don’t have the money.  Any correction of the hollowing out of the American Middle Class will likely take decades, which means that even the younger generations aren’t going to have the resources; they in turn will have to rely in some measure on their own adult children when they reach their own elder years.  This is the upshot of living through the Great Reversion:  our forebears often had to assume some measure of responsibility for their own parents and grandparents, and this is how it’s going to be moving forward.

Raising children can be difficult and teens even more so.  But our grandparents could get through those years with a sigh of relief at the lifting of responsibility because their own parents had the assets to largely support themselves in their dotage.  Many of us are only going to have a few years of respite before we are forced to re-assume that responsibility for our own elders as they navigate their final years.

Understand that your own children are watching you and you’ll want to set a decent example for them when they are, in turn, responsible for you.