
Low-paid workers face the highest risk of losing their jobs when the UK government’s furlough scheme ends in September, according to analysis by a leading thinktank.

The government’s coronavirus job retention scheme, which pays 80% of furloughed workers’ wages, is widely believed to have prevented millions of job losses when the economy collapsed but economists are concerned that some jobs will still not be viable when it ends on 30 September.

The expected increase in unemployment is likely to hit lower-paid workers hardest, in line with the experience of the financial crisis more than a decade ago, according to an annual assessment by the Resolution Foundation on the prospects for workers on low incomes.

It said that by March the lowest-paid fifth of workers were three times more likely to have lost their job, been furloughed or lost income when compared with the top fifth. Twenty-one per cent of lower-paid workers had lost out, compared with only 7% of the highest paid.

There were still 3.4m furloughed jobs at 30 April, according to the latest available government data. That was lower than the most recent peak of 5.1m jobs in January, and many more workers are likely to have restarted work since the lockdown easing began. Retail, hospitality and leisure are the biggest low-pay sectors, and they accounted for the majority of people returning to work in April.

The foundation noted that prospects for those on low incomes had improved markedly since last year when the pandemic caused economic chaos, particularly in low-pay sectors. Unemployment has fallen back to 4.9%, and some sectors are reporting difficulties hiring workers. Private sector economists have scaled back their expectations for unemployment at the end of the year, from 6.3% to 5.9%, according to forecasts collated by the Treasury.

However, the foundation said the government must not assume that rising minimum wages and the economy reopening will be sufficient to improve the lot of the lowest paid. It called for faster minimum wage increases and new rights to a regular contract, and more notice ahead of shift changes for those on zero-hours contracts.

Nye Cominetti, a senior economist at the Resolution Foundation, said: “Low-paid workers have been at the heart of the economic crisis. Fortunately, low-paid workers also look set to be at the heart of the recovery by coming off furlough in huge numbers and returning to their previous jobs.

“However, big risks still lie ahead. Low-paid workers are most at risk from the expected rise in unemployment later this year, which also risks causing greater job insecurity.”




Cybersecurity truisms have long been described in simple terms of trust: Beware email attachments from unfamiliar sources and don’t hand over credentials to a fraudulent website. But increasingly, sophisticated hackers are undermining that basic sense of trust and raising a paranoia-inducing question: what if the legitimate hardware and software that makes up your network has been compromised at the source?

That insidious and increasingly common form of hacking is known as a “supply chain attack,” a technique in which an adversary slips malicious code or even a malicious component into a trusted piece of software or hardware. By compromising a single supplier, spies or saboteurs can hijack its distribution systems to turn any application they sell, any software update they push out, even the physical equipment they ship to customers, into Trojan horses. With one well-placed intrusion, they can create a springboard to the networks of a supplier’s customers—sometimes numbering hundreds or even thousands of victims.

“Supply chain attacks are scary because they’re really hard to deal with, and because they make it clear you’re trusting a whole ecology,” says Nick Weaver, a security researcher at UC Berkeley’s International Computer Science Institute. “You’re trusting every vendor whose code is on your machine, and you’re trusting every vendor’s vendor.”



Andrew Lloyd Webber last night said he could take ministers to court if they do not allow theatres to operate at full capacity from June 21.

The impresario said it would be ‘the final death blow’ if the relaxation of restrictions does not go ahead as planned later this month.

Indoor entertainment venues were able to reopen on May 17 at half capacity, but many theatres have remained shut because it is not profitable to play to smaller audiences.

Lord Lloyd-Webber said if theatres cannot reopen ‘100 per cent’ after June 21, the issue becomes ‘what is the legality of the whole thing?’.

‘If the Government’s own science has told them that buildings are safe… I’m advised that at that point things could get quite difficult,’ he said.

‘This is the very last thing that anybody wants to do, but there would become a legal case at that point because it’s their science – not ours. I would passionately hope that we don’t have to, but I think we would have to consider it.’

Lord Lloyd-Webber, who hopes to open his new musical Cinderella starring Carrie Hope Fletcher at the Gillian Lynne theatre in the West End next month, said he would be happy to ask theatre-goers to wear face coverings.

He added: ‘We would conform with anything the Government asks us to do to get 100 per cent open – but we have to be 100 per cent.’

He also pointed to the success of the Government’s indoor events trials, such as at the Brit Awards at London’s O2 arena and the World Snooker Championships at the Crucible in Sheffield.

Lord Lloyd-Webber said the snooker had ‘shown there is no increased risk of transmission of Covid in a theatre’.

‘If scientists really are so worried about everything, then they should be saying there should be a total circuit breaker surely and lock everything down again for two weeks.’

But, he said, keeping the arts sector effectively closed suggests the Government does not care, though he praised Culture Secretary Oliver Dowden for ‘fighting really hard for us’.



The second wave took almost everyone by surprise: the government had all but declared victory over Covid in late February and March, and people had lowered their guard in following Covid-appropriate behavior.

The contagion started surging around late March in Mumbai, spreading like wildfire through Maharashtra, Delhi, Karnataka and many other states by April. There were photographs of dead bodies floating in the Ganga, and unprecedented panic as people died in hospitals and many faced oxygen shortages.

The big cities and states that faced the second wave first have shown early signs of recovery, but the virus now shows signs of penetrating rural areas, which is worrying because rural healthcare infrastructure is not sufficient to handle an upsurge in cases. Compounding this, rural people have less literacy about the virus and healthcare.

In its early stages the disease is indistinguishable from the common flu, and as a result, people in the villages treat themselves with herbal or home remedies for a few days before even thinking of going to a public health center for a checkup.

This means that by the time they get tested and receive the report, which takes roughly another day to confirm whether a person is positive, at least five days have passed between the first symptoms and the commencement of Covid treatment. This delay between the first symptom and the start of treatment has proved to increase the severity of infection in the body, which can be damaging.

During the first wave, the infection did not have an easy passage to rural areas, but there is now growing concern about smaller towns and villages as the virus spreads inward.

The incorrect assumption of countrywide herd immunity and the failure to anticipate a second wave became the Trojan horses by which the virus evaded rural defenses. It is now time to think and act quickly to contain the spread of the virus in the villages.


States such as Kerala and Tamil Nadu have strong healthcare systems that reach rural areas, but many states in central and northern India have weak healthcare systems, especially lacking physical capacity in rural areas. Infrastructure, the healthcare workforce, timely availability of medical supplies, population health literacy, and connectivity to higher levels of care are deficient in many districts.


What is needed now is proper awareness of vaccination among rural people and simultaneous upgrades to rural health infrastructure, especially Primary Healthcare Centers (PHCs).

Since vaccines are in short supply, primary healthcare teams should conduct household visits for symptom surveillance and case detection, timely testing, triage and referral as needed, and home-care support and monitoring.

Engagement of the local community and ASHA healthcare workers is vital; community-based organizations with a grassroots presence can assist in the delivery of health and social services. Emergency transport systems should be organized with assured availability, affordability, and equity, to move seriously ill patients to pre-determined advanced care clinics and hospitals.

 

Written By – Lakshay Khichar

Edited By – Sohini Roy

The post Covid Inside Rural India appeared first on The Economic Transcript.


[This article is part of the Understanding Money Mechanics series, by Robert P. Murphy. The series will be published as a book in 2021.]

In chapter 7 we summarized some of the major changes in how central banks have operated since the 2008 financial crisis. In the present chapter, we detail some of the even more recent changes in Federal Reserve operations since the onset of the coronavirus panic in March 2020.

Size of the Fed’s Balance Sheet

The most obvious change in Fed policy has been the dramatic expansion of its balance sheet since March 2020.

Figure 1: Total Assets Held by the Federal Reserve

As figure 1 indicates, the explosion in Fed asset purchases since March 2020 dwarfs even the three rounds of QE (quantitative easing) following the 2008 financial crisis. Indeed, from March 4, 2020, through March 3, 2021, the Fed increased its assets from $4.2 trillion to $7.6 trillion, an incredible one-year jump of $3.3 trillion (or 78 percent). Furthermore, as the graph reveals, the upward trajectory continues as of this writing.

Composition of the Fed’s Balance Sheet

Besides the quantitative change in the Fed’s asset purchases, there has been a qualitative change in the type of asset. In particular, the Fed is now buying large amounts of private sector corporate bonds (both individual bonds and exchange-traded funds); as of the mid-May 2021 balance sheet report, the Fed’s “Corporate Credit Facilities LLC” held almost $26 billion in assets.[1] This change in policy would have been extremely controversial (if only for the potential corruption) prior to the financial crisis, but it is now a seemingly natural outgrowth of the expansion of Fed discretionary power that began in the fall of 2008.

The Fed announced the creation of the Primary and Secondary Corporate Credit Facilities LLC in March 2020 (though it did not begin aggressively buying corporate debt—which had to have been rated “investment grade” before the pandemic hit—until June 2020[2]). At the same time, the Fed announced expansions of preexisting asset purchase programs, as well as the creation of a “Term Asset-Backed Securities Loan Facility (TALF), to support the flow of credit to consumers and businesses,” which would “enable the issuance of asset-backed securities (ABS) backed by student loans, auto loans, credit card loans, loans guaranteed by the Small Business Administration (SBA), and certain other assets.”[3]

The Federal Reserve now has the capability of influencing credit markets not just for commercial banks, but also for commercial and residential real estate, corporate bonds, commercial paper, cars, student loans, and even personal credit cards.

Abolition of Reserve Requirements for US Banks

In an emergency statement issued in the evening on Sunday, March 15, 2020, the Fed announced a host of new policies in light of the then emerging alarm over the coronavirus.[4] In addition to cutting the target for the federal funds rate back down to 0 percent (with a range of up to 0.25 percent) and pledging to increase the scale of its asset purchases, the Federal Open Market Committee (FOMC) statement concluded with this tantalizing paragraph:

In a related set of actions to support the credit needs of households and businesses, the Federal Reserve announced measures related to the discount window, intraday credit, bank capital and liquidity buffers, reserve requirements, and—in coordination with other central banks—the U.S. dollar liquidity swap line arrangements. More information can be found on the Federal Reserve Board’s website. (bold added)

The final word, “website,” contained a hyperlink to the Fed’s main website. Yet if one looked at the compilation of press releases, there was an additional item posted on March 15, 2020, titled “Federal Reserve Actions to Support the Flow of Credit to Households and Businesses,” which was alluded to in the official FOMC statement.[5] For our purposes, we will highlight the last measure listed in this supplemental statement:

Reserve Requirements

For many years, reserve requirements played a central role in the implementation of monetary policy by creating a stable demand for reserves. In January 2019, the FOMC announced its intention to implement monetary policy in an ample reserves regime. Reserve requirements do not play a significant role in this operating framework.

In light of the shift to an ample reserves regime, the Board has reduced reserve requirement ratios to zero percent effective on March 26, [2020,] the beginning of the next reserve maintenance period. This action eliminates reserve requirements for thousands of depository institutions and will help to support lending to households and businesses. (bold added)

Since the Fed’s actions following the financial crisis of 2008, the US banking system as a whole has been awash with excess reserves (see the second chart in chapter 14). This is because following the Fed’s injections of new reserves under the various rounds of quantitative easing, the commercial banks did not create new loans for their own customers to the maximum amount legally allowed. Therefore, the immediate impact of the Fed’s 2020 decision to abolish reserve requirements should be minimal, since the original reserve requirements were not binding at the time of the change.
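To make this point concrete, here is a minimal numerical sketch with hypothetical balance-sheet figures (and assuming, for illustration only, the roughly 10 percent ratio that applied to most transaction accounts before March 2020) of why a non-binding requirement can be abolished with little immediate effect:

```python
# Stylized illustration (hypothetical figures) of why abolishing reserve
# requirements had little immediate effect: banks already held reserves far
# above the required amount, so the requirement was not the binding constraint.
checkable_deposits = 2_000  # $ billions, hypothetical
reserve_ratio_old = 0.10    # assumed pre-March 2020 ratio on most transaction accounts
actual_reserves = 1_700     # $ billions, hypothetical "ample reserves" level

required_old = reserve_ratio_old * checkable_deposits  # 200
excess_old = actual_reserves - required_old            # 1,500: requirement not binding

required_new = 0.0 * checkable_deposits                # ratio cut to zero in March 2020
excess_new = actual_reserves - required_new            # all reserves now count as excess

print(f"required before: {required_old}, excess before: {excess_old}")
print(f"required after:  {required_new}, excess after:  {excess_new}")
```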

However, even though the US banking system had more than enough reserves to cover its requirements, it is still the case that the level of required reserves rose dramatically—quintupling from about $40 billion to more than $200 billion—since the financial crisis, as the following chart reveals:

Figure 2: Required Reserves of US Depository Institutions

(In the chart, the Required Reserves line falls vertically to zero at the end, because the Fed’s policy change abolished reserve requirements.)

To avoid confusion, the reader should remember that in addition to the Fed’s direct actions that caused the monetary base to soar, money “held by the public” (which we can summarize by the monetary aggregate M1) also dramatically increased following the 2008 crisis. Later in this chapter we will explain the redefinition of M1 in 2020, but the graph of M1 we present in chapter 14 shows the measure in its old definition; the reader can see that it rose steadily after 2008, and jumped sharply in 2020. To the extent that much of this increase in money held by the public took the form not of actual physical currency, but of checking account balances at commercial banks, the statutorily required reserves rose correspondingly—as reflected in the chart above.

Some analysts argue that the Fed’s abolition of reserve requirements merely reflects the new realities of modern banking. With the 1994 introduction of retail “sweep accounts”[6] and especially in the post-2008 era of large central bank balance sheets, some have argued that reserve requirements are anachronistic and no longer influence commercial bank lending decisions, except to necessitate cumbersome maneuvers.[7]

Although the situation is no doubt nuanced, some of the more glib defenses of the new Fed policy prove too much. For example, the Fed’s own explanation (quoted above) says, “This action eliminates reserve requirements for thousands of depository institutions and will help to support lending to households and businesses.” If it were indeed the case that the reserve requirements did not constrain bank lending—as claimed by some of those dismissing the announcement as a bit of trivia—then abolishing the requirements wouldn’t support lending to households and businesses.

To put it simply, if the abolition of reserve requirements really had no effect, then one wonders why the Fed decided to implement the move along with the other emergency measures activated at the onset of the coronavirus crisis. At the very least, abolishing the requirements will give the commercial banks freer rein to make loans down the road, if conditions return to a scenario where the original rules would have provided a check on additional bank credit inflation.

Redefinition of M1

On February 23, 2021, the Fed announced:

As announced on December 17, 2020, the Board’s Statistical Release H.6, “Money Stock Measures,” will recognize savings deposits as a type of transaction account, starting with the publication today. This recognition reflects the Board’s action on April 24, 2020, to remove the regulatory distinction between transaction accounts and savings deposits by deleting the six-per-month transfer limit on savings deposits in Regulation D. This change means that savings deposits have had a similar regulatory definition and the same liquidity characteristics as the transaction accounts reported as “Other checkable deposits” on the H.6 statistical release since the change to Regulation D. Consequently, today’s H.6 statistical release combines release items “Savings deposits” and “Other checkable deposits” retroactively back to May 2020 and includes the resulting sum, reported as “Other liquid deposits,” in the M1 monetary aggregate. This action increases the M1 monetary aggregate significantly while leaving the M2 monetary aggregate unchanged.[8]

In other words, in late April of 2020, the Fed removed some of the limits on savings deposits in a way that made them equivalent to checking account deposits. As such, savings deposits from May 2020 forward are now included in M1, whereas before they had been excluded from it. Yet either way, savings deposits were always included in M2. Consequently, we can look at the Fed’s graphs of both M1 and M2 to isolate the impact of the reclassification:

Figure 3: M1 and M2 Money Stock, Showing Effect of May 2020 Redefinition

As the figure indicates, there was a massive spike in the official M1 measure in May 2020, largely (though not entirely) reflecting the reclassification of savings deposits as part of M1. However, note that M2 also rose sharply at exactly this time, reflecting a genuine increase in money held by the public because of the coronavirus panic and Fed policy. (Also remember that the M1 chart shown in chapter 14 was made based on the original M1 numbers, before the retroactive reclassification occurred. The chart in chapter 14 shows that M1, even according to the old definition, truly did spike in the spring of 2020.)
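A stylized sketch with hypothetical dollar amounts (not the Fed’s actual figures) shows why the reclassification raises measured M1 while leaving M2 untouched: savings deposits move inside the M1 boundary, but they were already counted in M2 under both definitions.

```python
# Hypothetical components (trillions of dollars), chosen only for illustration.
currency = 2.0
checkable_deposits = 2.0     # demand deposits plus "other checkable deposits"
savings_deposits = 11.0      # reclassified into M1 as "other liquid deposits"
other_m2_components = 1.5    # small-denomination time deposits, retail money funds

m1_old = currency + checkable_deposits                     # savings deposits excluded
m1_new = currency + checkable_deposits + savings_deposits  # savings deposits included
m2 = currency + checkable_deposits + savings_deposits + other_m2_components

print(m1_old, m1_new, m2)  # M1 jumps from 4.0 to 15.0; M2 is 16.5 either way
```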

Given the change in Regulation D, the reclassification of M1 made perfect sense. Some economists have speculated that the motivation for the Fed’s decision to discontinue publication of certain monetary measures—which occurred at the same time as the retroactive M1 reclassification—may have been to obscure the large increase in US Treasury and foreign bank deposits with the Fed, as such data might fuel concerns that the Fed is acting to monetize US government spending.[9]

Switch to Average Inflation Targeting

On August 27, 2020, the Fed posted its “2020 Statement on Longer-Run Goals and Monetary Policy Strategy,” which amended the original statement adopted back in 2012. The following excerpt highlights the major change in the 2020 amendment:

The inflation rate over the longer run is primarily determined by monetary policy, and hence the Committee has the ability to specify a longer-run goal for inflation. The Committee reaffirms its judgment that inflation at the rate of 2 percent, as measured by the annual change in the price index for personal consumption expenditures, is most consistent over the longer run with the Federal Reserve’s statutory mandate. The Committee judges that longer-term inflation expectations that are well anchored at 2 percent foster price stability and moderate long-term interest rates and enhance the Committee’s ability to promote maximum employment in the face of significant economic disturbances. In order to anchor longer-term inflation expectations at this level, the Committee seeks to achieve inflation that averages 2 percent over time, and therefore judges that, following periods when inflation has been running persistently below 2 percent, appropriate monetary policy will likely aim to achieve inflation moderately above 2 percent for some time. (bold added)[10]

Before the August 2020 change, the Fed had adopted a constant (price) inflation target, which reset anew each period. For example, if the Fed wanted inflation (in the Personal Consumption Expenditure index) to average 2 percent in 2020, but in actuality the desired inflation measure came in at only 1 percent, then under the old system, the Fed in 2021 would try again to hit 2 percent. But under the new system, the Fed might shoot for inflation of 2.5 percent for both 2021 and 2022 to make up for the initial undershooting of the target back in 2020. (We are ignoring the complications of exponential growth to keep the arithmetic simple.) This is what the Fed authors mean by saying they are switching to an average inflation target: in our example, if the Fed undershoots the target in 2020, the average over the three-year period can only hit the target if the Fed overshoots in 2021 and 2022.
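The following back-of-the-envelope sketch reproduces the arithmetic of this example (ignoring compounding, as above; the figures are purely illustrative):

```python
target = 2.0          # percent per year
realized_year1 = 1.0  # percent: the undershoot in 2020 in the example above
horizon = 3           # years over which the average is taken

# Rate needed in each of the two remaining years so the simple average hits the target.
makeup_rate = (target * horizon - realized_year1) / (horizon - 1)
print(makeup_rate)                                   # 2.5
print((realized_year1 + 2 * makeup_rate) / horizon)  # 2.0: the three-year average
```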

At the Jackson Hole monetary conference held in late August 2020, Fed chair Jay Powell gave the opening remarks. He first summarized some of the major changes in the global economy and central bank practice since 2012, and then explained the new Fed policy by saying:

The key innovations in our new consensus statement reflect the changes in the economy I described. Our new statement explicitly acknowledges the challenges posed by the proximity of interest rates to the effective lower bound. By reducing our scope to support the economy by cutting interest rates, the lower bound increases downward risks to employment and inflation. To counter these risks, we are prepared to use our full range of tools to support the economy.[11]

Specifically, Powell argued that the fall in real interest rates, as well as muted (price) inflationary expectations, made the “zero lower bound” a much more potent threat in 2020 than it had been a decade earlier. When short-term nominal interest rates hit 0 percent, it is difficult for the central bank to cut further; why would people lend out their money at a negative interest rate when they could just hold cash and earn 0 percent? According to some economists, at the zero lower bound conventional monetary policy loses traction and other measures are needed.

In theory, the switch to average inflation targeting can help alleviate the problem posed by the zero lower bound. Investors know that if the Fed runs into trouble during a sluggish year and inflation falls short of the target, then the Fed is required to let the economy “run hot” for a while in order to make up for the lost ground. Even if nominal interest rates stay at 0 percent, the increase in expectations of future inflation lowers real interest rates and has the same impact as if the Fed had more room to cut nominal interest rates in the present.

In contrast to this optimistic interpretation of the Fed’s new regime, a more cynical take is that Federal Reserve officials knew that their massive monetary expansion in 2020 would lead to higher price inflation, and they wanted to provide themselves with a framework to justify their failure to stay within their own guidelines.

1. For the current summary of the Fed’s balance sheet, see Board of Governors of the Federal Reserve System, “Factors Affecting Reserve Balances of Depository Institutions and Condition Statement of Federal Reserve Banks,” H.4.1 (Factors Affecting Reserve Balances) statistical release, May 27, 2021, https://www.federalreserve.gov/releases/h41/current/h41.pdf.
2. See Nancy Marshall-Genzer, “The Fed Starts Buying Corporate Bonds,” Marketplace, June 16, 2020, https://www.marketplace.org/2020/06/16/the-fed-starts-buying-corporate-bonds/.
3. See the Fed’s announcement of its new facilities in its March 23, 2020, press release: Board of Governors of the Federal Reserve System, “Federal Reserve Announces Extensive New Measures to Support the Economy,” press release, Mar. 23, 2020, last modified July 28, 2020, https://www.federalreserve.gov/newsevents/pressreleases/monetary20200323b.htm.
4. See the FOMC statement of March 15, 2020: Board of Governors of the Federal Reserve System, “Federal Reserve Issues FOMC Statement,” press release, Mar. 15, 2020, https://www.federalreserve.gov/newsevents/pressreleases/monetary20200315a.htm.
5. The supplemental Fed posting from March 15, 2020, is Board of Governors of the Federal Reserve System, “Federal Reserve Actions to Support the Flow of Credit to Households and Businesses,” press release, Mar. 15, 2020, https://www.federalreserve.gov/newsevents/pressreleases/monetary20200315b.htm.
6. For the connection between sweep accounts and reserve requirements, see Richard G. Anderson and Robert H. Rasche, “Retail Sweep Programs and Bank Reserves, 1994–1999,” Review 83, no. 1 (January/February 2001): 51–72, https://files.stlouisfed.org/files/htdocs/publications/review/01/0101ra.pdf.
7. For example, in his February 10, 2010, testimony before the House Committee on Financial Services, then Fed chair Ben Bernanke said that the “Federal Reserve believes it is possible that, ultimately, its operating framework will allow the elimination of minimum reserve requirements, which impose costs and distortions on the banking system.” Benjamin Bernanke, “Federal Reserve’s Exit Strategy” (statement before the Committee on Financial Services, US House of Representatives, Washington, DC, Feb. 10, 2010), quoted in Vijay Boyapati, “Why Credit Deflation Is More Likely Than Mass Inflation,” Libertarian Papers 2, art. no. 43 (2010): 1–28, https://cdn.mises.org/-2-43_2.pdf.
8. The block quotation is taken from the February 23, 2021, announcement available at this feed: Board of Governors of the Federal Reserve System, “Money Stock Revisions,” H.6 (Money Stock Measures) statistical release, Mar. 23, 2021, https://www.federalreserve.gov/feeds/h6.html.
9. See the discussion and citation in Joseph T. Salerno, “The Fed’s Money Supply Measures: The Good News—and the Really, Really Bad News,” Mises Wire, Mar. 6, 2021, https://mises.org/wire/feds-money-supply-measures-good-news-and-really-really-bad-news.
10. See Board of Governors of the Federal Reserve System, “2020 Statement on Longer-Run Goals and Monetary Policy Strategy,” Aug. 27, 2020, last modified Jan. 14, 2021, https://www.federalreserve.gov/monetarypolicy/review-of-monetary-policy-strategy-tools-and-communications-statement-on-longer-run-goals-monetary-policy-strategy.htm.
11. Jerome H. Powell, “Opening Remarks: New Economic Challenges and the Fed’s Monetary Policy Review” (speech given at the Navigating the Decade Ahead: Implications for Monetary Policy economic policy symposium, Jackson Hole, WY, August 2020), https://www.kansascityfed.org/documents/7832/JH2020-Powell.pdf.



A ransomware attack has struck the world’s biggest meat producer, causing it to halt some operations in the US, Canada, and Australia while threatening shortages throughout the world, including up to a fifth of the American supply.

Brazil-based JBS SA said on Monday that it was the target of an organized cyberattack that had affected servers supporting North American and Australian IT operations. A White House spokeswoman later said the meat producer had been hit by a ransomware attack “from a criminal organization likely based in Russia” and that the FBI was investigating.

Existential threat

The weekend attack came three weeks after a separate ransomware attack on Colonial Pipeline disrupted the availability of gasoline and jet fuel up and down the US East Coast. Late last year, ransomware attacks on hospitals hamstrung their ability to provide emergency services just as the coronavirus was already straining their capacity.



A couple months ago, in arguing that “The Fed should give everyone a bank account,” journalist Matt Yglesias cited what he took to be an instructive precedent: “Once upon a time, governments didn’t issue paper currency, and instead banknotes were printed privately by banks. But over time, we came to see this as a worthwhile public service.” His first sentence is certainly correct. Banknotes were redeemable paper claims on the issuing banks, which circulated as currency. Systems of freely competitive note-issue worked quite well, as in Canada and Scotland, with the notes of different banks trading at par against one another.

But what to make of his second sentence? I take it to mean that governments now issue paper currency because “we” learned that it is beneficial for governments to do so, rather than (or along with) private banks. Both the origin story and the evaluation are dubious. Yglesias seems to have consulted his imagination rather than the historical record about how governments came to provide paper currency. Getting that history straight was not the point of his piece, so his casual narrative may be forgiven. But for those who do want to get it straight, the story of how the Federal Reserve System of the United States was authorized to issue banknotes, and how private U.S. banks were de-authorized, may be of interest.

On the evaluative question, it should be noted that private banknotes still predominate today in the few places where their issue remains legal: Scotland, Northern Ireland, Hong Kong, and Macau. There is no compelling efficiency rationale for government provision (much less exclusive provision) of paper currency.

Before the Federal Reserve Act

Private commercial banks issued banknotes in the United States from 1781 up to 1935, with only occasional governmental and semi-governmental issues. Before the Civil War there were two federally chartered note-issuing banks, namely the first and second Banks of the United States (1791-1811 and 1816-36). Congress owned one-fifth of their initial share capital, but their notes were not obligations of the federal government. The governments of Kentucky and Vermont owned banks. Otherwise, all paper currency notes were the obligations of private institutions, even when state governments held minority shares.[1] Finally, during the Civil War, the federal government issued “greenbacks,” or United States Notes. These were not banknotes but legal-tender obligations of the U.S. Treasury, and they were not payable in gold (the nation’s metallic standard before and after the war) until 1879.

A long-lasting change in the legal rules for private note-issue came in the National Banking Acts of 1863 and 1864, and a further Act of 1866. The National Banking Acts authorized federal charters for note-issuing banks (called “National Banks” although they could not branch interstate). The charters linked the right of note issue to a National Bank’s purchase of eligible federal bonds. This requirement was a device to sell federal war bonds to the federally chartered banks, inspired by similar requirements that many state governments had instituted to sell their own bonds to state-chartered banks. These were called “bond-collateral” requirements because if a bank failed, the bonds would be sold to reimburse customers holding its banknotes. Initially a National Bank could issue notes up to 90% of the par value of the eligible bonds it held; after 1900 it could issue up to 100%.
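As a toy illustration of the bond-collateral ceiling just described (the dollar figures are hypothetical; only the 90 percent and post-1900 100 percent ratios come from the text above):

```python
def max_note_issue(par_value_of_eligible_bonds: float, year: int) -> float:
    """Maximum note circulation allowed against eligible federal bonds on deposit."""
    ceiling_ratio = 0.90 if year < 1900 else 1.00  # 90% of par before the 1900 change, 100% thereafter
    return ceiling_ratio * par_value_of_eligible_bonds

print(max_note_issue(100_000, 1885))  # 90000.0 in notes against $100,000 of bonds
print(max_note_issue(100_000, 1905))  # 100000.0
```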

The Act of 1866 imposed a deliberately prohibitive tax on banknotes issued by state-chartered banks—a tax high enough to drive them out of the note-issuing business. The tax was upheld by the Supreme Court in Veazie Bank v. Fenno (1869). Only National Banks thereafter issued banknotes, and only on terms dictated by the federal government.

The bond-collateral requirement on National Banks had an unintended consequence: it made the quantity of banknotes in circulation inflexible or “inelastic,” unable to vary to meet seasonal or cyclical variation in the public’s desired mix of banknotes and deposits. Inelasticity was a major factor in causing the five U.S. banking panics of the National Banking era. The Canadian banking system during the same era, by contrast, had no bond-collateral requirement on banknotes, no seasonal spike in interest rates at crop-moving time, and no financial panics.

The U.S. system could have avoided panics by adopting Canadian-style reforms: removing bond collateral requirements and allowing nationwide branch banking. Note-issue would have stayed a private business. But this solution was a political non-starter: The thousands of small state-chartered banks wouldn’t stand for the competition that liberalization of branching would bring.[2]

Enter the Fed, and Exit National Bank Notes

Instead, the Federal Reserve Act of 1913 was passed to, as its preamble says, “provide a more elastic currency.” National Bank Notes would remain in circulation, and their volume would remain tied to the volume of available federal bonds eligible to serve as collateral, but Federal Reserve Notes would provide elasticity to the total stock of currency by varying as necessary to meet variations in demand for banknotes. Note-issue by a government agency was a “worthwhile public service” only in a second-best sense, private note-issue having been hobbled by legal restrictions that rendered its supply inelastic. In Canada, with no inelasticity problem and no panics, there was no case for a central bank in 1913.

The figure below shows the volume of National Bank Notes in circulation between 1914 and 1935, together with the volume of Federal Reserve Notes. The Federal Reserve Act authorized the Fed to replace National Bank Notes in circulation with Federal Reserve Notes, by purchasing the eligible bonds from any National Banks that decided to retire from note-issue. Only a few did. The public did not show any preference for Federal Reserve Notes. The volume of National Bank Notes dropped about 30 percent in 1914-16. Between 1916 and 1932 the volume of National Bank Notes was rather steady. In 1932 the volume of paper currency in circulation was about 20 percent National Bank Notes (about $650 million), and about 80 percent Federal Reserve Notes (about $2600 million). National Bank Notes bulged in 1932-34 with the passage of legislation that expanded the range of eligible collateral to include higher-yielding bonds.

The coexistence of Federal Reserve Notes and National Bank Notes ended after 1935. What ended private note-issue was a further tightening of the noose of legal restrictions. As of 1930, the Treasury bonds that bore the “circulation privilege” were callable. On August 1, 1935, the U.S. Treasury, following an executive order given  by President Franklin Roosevelt that March, called in and retired all of the bonds that bore the circulation privilege. National Banks then held $658 million of such bonds as collateral against their $658 million of notes in circulation. With the required bonds unavailable, National Banks lost the right to issue. The Federal Reserve paid par value for the bonds in its own liabilities, allowing the banks to recall and redeem their banknotes by paying their customers with Federal Reserve Notes.

An article in the Indianapolis Times on the 11th of March 1935 about the plan to retire National Bank Notes was appropriately headlined: “U. S. to Take Control over All Currency.” Treasury officials rationalized the measure as giving the federal government greater power over the monetary system, as though a more centralized system were ipso facto better: “Government officials said the move was another step in the simplification of the monetary system and a vesting of complete power in the hands of the Federal government. Previously, national banks have been permit[ted] to issue their money independent of whether or not it was needed in circulation and sometimes in conflict with other monetary steps of the government.” The supposed conflict was not specified. No conflict is evident in the figure, showing hardly any variation in the volume of National Bank Notes between 1916 and 1932.

The imagined benefits of centralized control rested on wishful thinking. Few economic historians today would give a passing grade to the Federal Reserve’s conduct of monetary policy in the decade before or in the decade after 1935. The Fed’s post-Depression performance has also left a lot to be desired, but not for lack of control over the currency.

 

______________

[1] See Susan Hoffman, Politics and Banking: Ideas, Public Policy, and the Creation of Financial Institutions (Johns Hopkins University Press, 2001), pp. 75-76.

[2] George A. Selgin and Lawrence H. White, “Monetary Reform and the Redemption of National Bank Notes, 1863-1913,” Business History Review 68 (Summer 1994), pp. 205-43.

The post How U.S. Government Paper Currency Began, and How Private Banknotes Ended appeared first on Alt-M.


Yoram Bauman: 

This is the fifth such review I’ve been involved in and it is almost certainly the last review I’ll be doing, for the simple reason that the vast majority of textbooks now have excellent content on climate change! (If desired you can skip directly to the report card, or read on for some context and big-picture thoughts.)

The state of affairs today is very different from that of 10 years ago—my previous reviews were in 2010, 2012, 2014, and 2017—much less 20 years ago, when I had an astonishing and hilarious email exchange with University of Houston professors Roy Ruffin and Paul Gregory about the wacky climate-skeptic claims (“no matter how much contrary evidence is presented, it just doesn’t matter”) in their now-defunct textbook.

In past years I have given out a Ruffin and Gregory Award for the Worst Treatment of Climate Change in an Economics Textbook, and I am pleased to say that no book merits that award this year.

This is good news … the economics profession won’t be participating (as much) in the training of undergraduates in climate skepticism.


Over the course of 2020, FiveThirtyEight’s visual journalists covered a historic election, an unprecedented year in sports, a raging pandemic and an economy in free fall. So to cap off this long, strange, difficult year, we’re continuing our tradition of celebrating the best — and weirdest — charts we’ve published in the last 12 months. Charts are grouped by topic, but they’re not listed in any particular order beyond that. Click any of them to read the story where they originally ran. Enjoy!

Politics

Election 2020

Sports

COVID-19 and its economic fallout

Did you enjoy this long list of weird charts? Then boy do we have content in the archives for you! Check out our lists from 2019, 2018, 2016, 2015 and 2014.



DMT (2020) draw attention to my treatment of the weighted WTP estimates. The regression model for the second scenario has a negative sign for the constant and a positive sign for the slope. When I “mechanically” calculate WTP for the second scenario it is a positive number which adds weight to the sum of the WTP parts. This is in contrast to the unweighted data for which WTP is negative. Inclusion of the data from this scenario biases the adding-up tests in favor of the conclusion that the WTP data does not pass the adding-up test. 

The motivation for my consideration of the weighted data was DMT’s (2015) claim that they found similar results with the weighted data. My analysis uncovered validity problems with two of the five scenarios which, when included in an adding-up test, led to a failure to reject adding-up. At this point in the conversation it will be instructive to visually examine the weighted data to see if it even passes the “laugh” test. In my opinion, it doesn’t.

Below are the weighted votes and the Turnbull for the whole scenario (note that the weights are scaled to equal the sub-sample sizes). The dots and dotted lines represent the raw data. Instead of a downward slope, these data are “roller-coaster” shaped (two scary hills with a smooth ride home). The linear probability model (with weighted data) has a constant equal to 0.54 (t=9.73) and a slope equal to -0.00017 (t=-0.69). This suggests to me that the whole scenario data, once weighted, lacks validity. While lacking validity, the solid-line Turnbull illustrates how a researcher can obtain a WTP estimate with data that does not conform to rational choice theory. The Turnbull smooths the data over the invalid stretches of the bid curve (the “non-monotonicities,” in CVM jargon) and the WTP estimate is the area of the rectangles. In this case WTP = $191, which is very close to the unweighted Turnbull estimate. But a researcher should consider this estimate questionable since the underlying data does not conform to theory. As a reminder, the WTP for the whole scenario is key to the adding-up test as it is compared to the sum of the parts. The WTP estimate from the linear logit model is $239, with Delta Method [-252, 731] and Krinsky-Robb [-8938, 9615] confidence intervals. Given the statistical uncertainty of the WTP estimate, it is impossible to conduct any sort of hypothesis test with these data.
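For readers unfamiliar with the estimator, here is a minimal sketch, on simulated data rather than DMT’s survey responses, of the Turnbull lower-bound calculation described above: compute the share of “no” votes at each bid, pool adjacent bids wherever the empirical distribution is non-monotonic, and take WTP as the area of the resulting rectangles. The function and variable names are mine, not from the original analysis.

```python
import numpy as np
import pandas as pd

def turnbull_lower_bound(bids, votes):
    """Lower-bound mean WTP from single-bounded dichotomous-choice data."""
    data = pd.DataFrame({"bid": bids, "vote": votes})
    grouped = data.groupby("bid")["vote"].agg(["mean", "count"]).sort_index()
    # One cell per bid: [bid, F = share of "no" votes, number of respondents].
    cells = [[b, 1.0 - row["mean"], row["count"]] for b, row in grouped.iterrows()]

    # Pool adjacent violators until F is non-decreasing in the bid.
    pooled = []
    for cell in cells:
        pooled.append(cell)
        while len(pooled) > 1 and pooled[-1][1] < pooled[-2][1]:
            hi = pooled.pop()
            lo = pooled.pop()
            n = lo[2] + hi[2]
            f = (lo[1] * lo[2] + hi[1] * hi[2]) / n
            pooled.append([hi[0], f, n])  # merged cell keeps the upper bid as its boundary

    # Area of the rectangles: everyone in [t_j, t_{j+1}) is assigned WTP = t_j.
    boundaries = [0.0] + [c[0] for c in pooled]
    F = [0.0] + [c[1] for c in pooled] + [1.0]
    return sum(boundaries[j] * (F[j + 1] - F[j]) for j in range(len(boundaries)))

# Hypothetical data: five bid levels, probability of a "yes" falling with the bid.
rng = np.random.default_rng(1)
bid = rng.choice([50.0, 100.0, 200.0, 400.0, 800.0], size=1000)
vote = (rng.random(1000) < np.exp(-bid / 400.0)).astype(int)
print(round(turnbull_lower_bound(bid, vote), 2))
```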

Below are the weighted votes and the (pooled) Turnbull for the second scenario. The dots and dotted lines represent the raw data. Instead of a downward slope, these data are “Nike swoosh” shaped. The linear probability model (with weighted data) has a constant equal to 0.13 (t=2.46) and a slope equal to 0.00107 (t=4.19). This suggests to me that the second scenario data, once weighted, lacks validity. Again, the Turnbull estimator masks the weakness of the underlying data. In this case, the Turnbull is essentially a single rectangle: with pooling, the probability of a vote in favor is 28.06% for the lower bid amounts and 27.56% for the higher bids. The Turnbull WTP estimate is $112, which appears to be a reasonable number, hiding the problems with the underlying data.

DMT reestimated the full data model with the cost coefficients constrained to be equal. In a utility difference model the cost coefficient is the estimate of the marginal utility of income. There is no reason for the marginal utility of income to vary across treatments unless the clean-up scenarios and income are substitutes or complements. This theoretical understanding does not explain why the weighted models for the whole and second scenarios are not internally valid (i.e., the cost coefficient is not negative and statistically different from zero). The model that DMT refer to passes a statistical test, i.e., the model that constrains the cost coefficient to be equal is not worse statistically than an unconstrained model, but it should be considered inappropriate due to the lack of validity in the weighted whole- and second-scenario data sets. Use of the model with a constrained cost coefficient amounts to hiding a poor result. The reason the weighted model with the full data set takes the correct sign is that the scenarios with correct signs outweigh the scenarios with incorrect or statistically insignificant signs. The reader should attach little import to DMT’s (2015) claim that their result is robust to the use of sample weights.
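To make the constrained-versus-unconstrained comparison concrete, here is a minimal sketch, on simulated data with hypothetical variable names (not DMT’s model or data), of testing a logit with a common cost coefficient against one with treatment-specific cost coefficients via a likelihood-ratio test:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(0)
n = 900
df = pd.DataFrame({
    "treatment": rng.choice(["whole", "first", "second"], size=n),
    "bid": rng.choice([25.0, 50.0, 100.0, 200.0, 400.0], size=n),
})
# Simulated votes: Pr(yes) falls with the bid; intercepts differ by treatment.
intercepts = {"whole": 1.5, "first": 0.8, "second": 0.5}
xb = df["treatment"].map(intercepts) - 0.006 * df["bid"]
df["vote"] = (rng.random(n) < 1.0 / (1.0 + np.exp(-xb))).astype(int)

# Constrained: one cost (bid) coefficient shared across treatments.
constrained = smf.logit("vote ~ C(treatment) + bid", data=df).fit(disp=False)
# Unconstrained: the cost coefficient is allowed to differ by treatment.
unconstrained = smf.logit("vote ~ C(treatment) + C(treatment):bid", data=df).fit(disp=False)

lr = 2.0 * (unconstrained.llf - constrained.llf)
dof = unconstrained.df_model - constrained.df_model
print(f"LR = {lr:.2f}, p = {stats.chi2.sf(lr, dof):.3f}")
# Failing to reject the equal-coefficient restriction is a statistical result only;
# it says nothing about whether each scenario's data are internally valid.
```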
