
Beltway Bandits

by

Chris Edwards

Next year, the new president will face budget deficits and interest costs spiraling upwards. He or she will need to find spending to cut. How about all the waste and overcharging in federal contracting?

The federal government procures more than $750 billion of goods and services a year, with the Pentagon accounting for 60 percent of the total. Federal contractors are known for cost overruns, inflated profits, and sometimes corruption.

Raytheon is currently in the spotlight for a major Department of Defense (DOD) rip-off. From the US Department of Justice last week:

[F]rom 2012 through 2013 and again from 2017 through 2018, Raytheon employees provided false and fraudulent information to the DOD during contract negotiations concerning two contracts with the United States for the benefit of a foreign partner — one to purchase PATRIOT missile systems and the other to operate and maintain a radar system. In both instances, Raytheon employees provided false and fraudulent information to DOD in order to mislead DOD into awarding the two contracts at inflated prices. These schemes to defraud caused the DOD to pay Raytheon over $111 million more than Raytheon should have been paid on the contracts.

… Raytheon also entered into a civil False Claims Act settlement to resolve allegations that it provided untruthful certified cost or pricing data when negotiating prices with the DOD for numerous government contracts and double billed on a weapons maintenance contract. Under the False Claims Act settlement, which is the second largest government procurement fraud recovery under the Act, Raytheon will pay $428 million for knowingly failing to provide truthful certified cost and pricing data during negotiations on numerous government contracts between 2009 and 2020, in violation of the Truth in Negotiations Act (TINA).

… Raytheon also admitted that by misrepresenting its costs during contract negotiations it overcharged the United States on these contracts and received profits in excess of the negotiated profit rates. Further, Raytheon admitted that it failed to disclose truthful cost or pricing data on a contract to staff a radar station. Raytheon also admitted that it billed the same costs twice on a DOD contract.

Raytheon was also hit for violations of foreign corruption and arms control laws for a total of $950 million in fines. How often does such Raytheon-style cheating take place? After a six-month investigation last year, CBS News reported, “Military contractors overcharge the Pentagon on almost everything the Department of Defense buys each year, experts told 60 Minutes.”

If true, procurement reform has a lot of potential. This Government Accountability Office chart suggests which agencies the new administration should target for savings.

For more information, R Street held a recent forum on waste in federal contracting. 


Jai Kedia

In September, the Fed cut the target range for its policy rate by 50 basis points amid much publicity, with several media outlets claiming that this was the first step toward reducing borrowing costs and easing the debt burdens faced by many Americans. Of course, the implicit assumption is that economy-wide borrowing rates such as auto loans, credit cards, or mortgages are intrinsically linked to the Fed's policy instrument—the federal funds rate (FFR). It may come as a surprise, then, that several borrowing rates, ranging from Treasury securities to mortgage rates, have increased since the Fed's rate cut.

In fact, the discrepancy between the FFR and other borrowing rates has been prevalent for some time. The rates had started to increase before the series of rate hikes executed by the Fed in response to post-pandemic inflation. On the surface, it is fair to assume that changes to the FFR should be transmitted through to other borrowing rates. After all, the FFR is believed to represent the cost borne by financial institutions to acquire liquid reserves for themselves in the interbank lending market. If the cost to acquire funds is low, then the cost of lending these funds to consumers must be equivalently low. However, since the overhaul of the Fed’s operating procedures in the aftermath of the financial crisis, the FFR no longer holds sway as it once did.

Following the financial crisis of 2008-09 and its associated recession, the Fed shifted to a “floor system” that would ensure abundant reserves in the financial sector. That is, it instituted policies such as interest on excess reserves and quantitative easing that ensured banks would maintain a large surplus of reserves, reducing their need to engage in interbank lending and thereby reducing the importance of the FFR. Of course, the FFR still matters as a numeric stance of monetary policy, but it is hard to see how it can hold any importance beyond this signaling effect.

Data on borrowing costs confirm this hypothesis. Figures 1–3 show the correlation between monthly FFR and three other key borrowing rates. These include the 30-year mortgage rate (Figure 1), Moody’s Baa corporate bond yield (Figure 2), and the 10-year Treasury yield (Figure 3)—which represent benchmarks for consumer, corporate, and government debt, respectively. Additionally, correlations were computed for various lag structures so that any leading or lagging effects between the FFR and other borrowing rates may be observed. This analysis was conducted over two periods—the Great Moderation (1984 through 2006) and the post–financial crisis (2009 to present)—to see how the post-2008 change to the Fed’s operating system affected these correlations.

As all figures show, before the financial crisis, the FFR was linked almost perfectly (i.e., a near 100 percent correlation) with other borrowing costs. Within the same month, changes to the FFR were matched by near identical changes to other borrowing costs. This is no longer true following the financial crisis. Under the Fed’s revised framework, the same-month correlations fall drastically. FFR correlation with the 30-year mortgage rate falls to 70 percent, with Baa bond yield falling to 58 percent, and with the 10-year Treasury yield falling to 71 percent.

Additionally, the skew of the cross-correlation distribution has changed since the financial crisis. Before 2007, the correlation distribution was symmetrical around a zero lag. That is, changes to the FFR and other borrowing rates occurred simultaneously with neither being a leading nor lagging indicator of the other. However, under the post-crisis regime, the distribution shows a significant left skew. This implies that the FFR is much more likely to be correlated with other borrowing rates at a lag and much less likely at a lead. Put differently, the results suggest that other borrowing costs move first (probably in response to economic conditions) and are then followed by movements to the FFR.

Naturally, the FFR is still correlated with other rates even though this correlation has fallen significantly over time. This is because all borrowing rates respond to general economic conditions that may warrant tighter or looser lending conditions. However, the results presented here indicate that the mechanism by which this transmission occurs from the broader economy to relevant borrowing costs seems to bypass the Federal Reserve. If anything, the Fed seems to take its cue from the markets, not the other way around.

This analysis only adds to the litany of evidence we have presented that the Fed matters much less than people think. Of course, some may claim that the Fed is still a primary determinant of lending conditions but that how it controls such markets has changed. If that is true, then the onus is on the Fed to provide a detailed explanation of which policy instrument now matters instead of the FFR and the mechanism by which this instrument affects lending standards and, thereby, its dual mandate objectives of stable prices and maximum employment. So far, it has not provided any such explanations.

The author thanks Jerome Famularo for his excellent research assistance in the preparation of this blog.


Travis Fisher and Joshua Loucks

Last week, the Supreme Court issued an order that left many in disbelief. The Court denied several motions for stay (a legal pause) regarding the Environmental Protection Agency’s (EPA’s) Clean Power Plan (CPP) 2.0 rule after granting a stay of the original CPP in 2016 and elaborating on the Major Questions Doctrine in overturning the CPP on its merits in West Virginia v. EPA in 2022.

The EPA rule, which we call CPP 2.0 because it's the second attempt at a CPP under section 111 of the Clean Air Act, hurts the reliability and affordability of electricity when both are already at risk. The EPA now requires existing coal and new natural gas power plants to change their operations significantly or shut down entirely. CPP 2.0 is a costly and unlawful mandate for the unproven technology of carbon capture and sequestration/​storage (CCS).

The fate of CPP 2.0 will be the same when the Supreme Court reviews it on the merits—it will be overturned because it plainly violates the statute it cites as authority from Congress. But it will hurt many more Americans than it needs to because the Supreme Court took a narrow view of “irreparable harm.”

The irreparable harm ignored by the Supreme Court is that demand for electricity in the United States is growing again in exceptional ways (in part due to unforeseen growth in computing load), and CPP 2.0 is preventing economic growth—and possibly causing electricity shortages—by mandating impossible standards for existing and new electricity supplies.

The CPP 2.0 Mandate

In its most basic form, CPP 2.0 requires existing coal and new natural gas power plants to implement CCS on an unprecedented scale. For existing coal-fired power plants, CPP 2.0 requires the owners of any plant that might remain operational after 2039 to capture and store 90 percent of its carbon dioxide emissions by 2032. (Some less stringent options are available for coal plants that will be closed by 2032 or 2040.)

Likewise, the rule requires any new combined-cycle natural gas–fired power plants operating above “baseload” levels (at an annual capacity factor above 40 percent) to reduce their carbon emissions by 90 percent by 2032 by implementing CCS.

How did the EPA come up with this rule? Dating back to the 1970 Clean Air Act amendments, Congress authorized the EPA to issue nationally binding emissions standards for stationary sources like power plants through section 111 using proven technology as a baseline.

Specifically, EPA’s standards of performance under section 111 must be based on “the best system of emission reduction which (taking into account the cost of achieving such reduction and any nonair quality health and environmental impact and energy requirements) the Administrator determines has been adequately demonstrated.” (emphasis added)

In CPP 2.0, the EPA claimed CCS at a 90 percent capture rate had been adequately demonstrated, which is a patently false account of the facts on the ground. This is the key issue of statutory interpretation relevant to CPP 2.0, and we believe the rule will fail on EPA’s fundamental misreading of the statute and/​or its misreading of the facts in the record.

The short version of the merits argument is this: CCS depends on a mind-bogglingly large set of new infrastructure (rivaling the existing network of fossil fuel infrastructure itself), including CO2 pipelines to carry enormous amounts of CO2 from power plants to injection sites. Such infrastructure may be impossible to build and certainly has not been “adequately demonstrated.” For more detailed legal and technical arguments, see comments on the proposed rule.

The only power plant in the United States that captures anywhere near 90 percent of CO2 emissions is perhaps the Petra Nova plant in Texas, which has not operated continuously and does not technically sequester CO2 at all—it injects CO2 into oil wells in a process known as enhanced oil recovery. Another power plant often cited by proponents of CCS (and explicitly cited by the EPA) is the Boundary Dam project in Canada, which has consistently underperformed on its CCS goal of—you guessed it—90 percent.

Thus, EPA’s emissions standard is far too stringent because it is based on plants that are located near profitable CO2 injection and storage sites or are falling woefully short of the EPA’s goal. A rule mandating 90 percent CCS nationwide is therefore at odds with the part of the statute that says the “best system of emission reduction” must be “adequately demonstrated.”

CPP 2.0’s Irreparable Harm

The mandate to close existing coal plants and prevent the building of new “baseload” natural gas plants is a recipe for electricity shortages, skyrocketing electricity prices, or a mix of both. The lack of certainty regarding which set of rules a power plant owner is likely to face in the coming years is itself a deterrent to building or retaining needed supplies.

In practice, the much-needed new electricity supplies are likely to come from less efficient simple-cycle natural gas power plants—essentially methane-fueled jet engines—which will increase costs and preclude more efficient investments until CPP 2.0 is finally overturned.

At the wholesale level, prices will be set more often by these less efficient units with higher marginal costs, meaning wholesale electricity prices will be higher than necessary. Further, the US Energy Information Administration (EIA) estimates only 2.6 gigawatts of new natural gas–fired power plants will come online in 2024, while 3.8 GW will be retired.

Coal plant closures will also reduce power supplies. PJM Interconnection LLC, the largest electricity market in North America by revenue and volume, has issued stark warnings about the collision course we are on between growing demand and falling supply, stating that there is a “timing mismatch between resource retirements, load growth and the pace of new generation entry.” CPP 2.0 exacerbates such a mismatch because it would force the retirement of coal units, which produced 16 percent of the electricity in the United States last year.

To be clear, we don’t expect anyone at the EPA, the Supreme Court, or any government agency to accurately predict the timing and scale of Americans’ future electricity needs. Efforts to centrally plan electricity markets are likely to lead to supply shortages, increased costs, top-down rationing, and rolling blackouts. But that is precisely why CPP 2.0 is so harmful—it allows the EPA to be the national gatekeeper for new electricity supplies, which will have disastrous consequences.

The Court’s Mistake

In his statement about the denial of applications for stay, Justice Brett Kavanaugh argued that applicants “are unlikely to suffer irreparable harm before the Court of Appeals for the DC Circuit decides the merits” because “applicants need not start compliance work until June 2025.” Unfortunately, that is untrue. For would-be builders of new natural gas power plants, the irreparable harm likely began in May 2023 (the date of the proposed rule) and was cemented in the final version of CPP 2.0, which featured an effective date of July 8, 2024.

While Justice Kavanaugh’s approach may make sense in the legal compliance world, it ignores economic decisions that predate compliance. As Frederic Bastiat might say, the court focused on what is seen—the compliance measures undertaken by plants that are already built—and failed to recognize the unseen harms. We cannot see, for example, the business activities or consumer savings that might have occurred if the Supreme Court had granted the motions for stay. In other words, CPP 2.0 is already causing irreparable harm because it’s preventing much-needed electricity supplies that would be built in its absence. (We note that Justice Clarence Thomas would have granted the applications for stay, and Justice Samuel Alito did not participate.)

PJM and other grid operators articulated this harm in their amicus brief, stating that the EPA has “failed to adequately consider the impact of premature retirements driven by the Rule’s compliance timelines.” They also highlighted how investments in the grid, particularly large power plants, are based on “the expected revenues associated with continuing operation of the unit. Unit owners may decide to retire units early rather than incur additional expense and risk.” Premature power plant closures—and the stalling of new supplies in an era of demand growth—are the irreparable harm the Court failed to see.

Conclusion

Supreme Court justices clearly understand the law. However, the order in this case demonstrated that many of them do not understand market processes and fall into the knowledge problem trap of attempting to assume the unknowable. By denying the motions to stay CPP 2.0, the Supreme Court squandered a perfect opportunity to limit executive branch overreach in the new post-Chevron era and protect millions of Americans from government-induced harm.


Social Security Is a Legal Ponzi Scheme

by

Romina Boccia

Ida May Fuller, the first person to receive a Social Security check, worked for just three years before receiving her first benefit (in 1940). Over that time, the total taxes deducted from her salary amounted to a mere $24.75. Yet her first monthly check came in at $22.54, almost matching her entire contribution. Over the course of her lifetime, Fuller collected $22,888.92 in Social Security benefits. That’s equivalent to nearly half a million in today’s dollars and about 1,000 times what she had paid in taxes.
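A quick check of the arithmetic above, using only the figures already cited:

```python
# Figures as reported for Ida May Fuller.
taxes_paid = 24.75          # total payroll taxes deducted over three years
first_check = 22.54         # first monthly benefit check (1940)
lifetime_benefits = 22_888.92

# Her first check recouped roughly 91 percent of everything she paid in.
print(f"first check / taxes paid: {first_check / taxes_paid:.0%}")

# Lifetime benefits came to roughly 925 times her total contributions --
# "about 1,000 times" as stated above.
print(f"benefits / taxes paid: {lifetime_benefits / taxes_paid:,.0f}x")
```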

Some people continue to believe that Social Security functions as a mandatory savings system. But that’s not how this program works. Unlike a genuine savings account, where individual contributions accumulate and grow over time, Social Security’s structure is far more akin to a classic Ponzi scheme. Early recipients received far more in benefits than they ever paid in, while future generations face escalating costs to sustain the system.

Fuller’s case is a glaring example of how Social Security was never designed as a true savings system. Early beneficiaries, like her, reaped enormous gains because the program relied on payroll taxes from younger, growing workforces to cover payouts. Today’s workers, however, face an increasingly unbalanced equation. They are being asked to pay ever-higher taxes to support a system that offers far less in return than what earlier generations enjoyed.

The math has fundamentally changed. In 1950, there were about 16 workers paying into Social Security for every retiree. Today, that number has dwindled to just 2.7 workers per retiree, and it’s projected to fall further to 2.4 workers per retiree by 2035. As the worker-to-retiree ratio shrinks, the system faces increasing strain. This demographic shift is one of the primary drivers of Social Security’s financing issues. Fewer workers supporting more retirees means higher taxes or reduced benefits—or both—to keep the program afloat.

The required changes are substantial. The Congressional Budget Office (CBO) estimates that the payroll tax would need to immediately increase from 12.4 percent to 16.7 percent to cover the program's long-term actuarial deficit of more than $25 trillion. In dollar terms, a median worker earning $61,000 would need to pay an extra $2,600 in payroll taxes, bringing their total payroll tax burden above $10,000. On the flip side, the CBO projects that benefits would need to be cut by 23 percent in 2035 to meet incoming payroll tax revenues.
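The payroll tax arithmetic works out as follows (a simple sketch using the rates and median wage cited above):

```python
median_wage = 61_000
current_rate = 0.124    # current combined payroll tax rate
required_rate = 0.167   # CBO-estimated rate to close the actuarial deficit

extra_tax = median_wage * (required_rate - current_rate)
total_tax = median_wage * required_rate

print(f"extra payroll tax for a median worker: ${extra_tax:,.0f}")  # ~$2,600
print(f"total payroll tax burden: ${total_tax:,.0f}")               # >$10,000
```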

Another major reason Social Security is financially unsustainable is that Congress repeatedly expanded benefits. From including spouses and survivors, to indexing initial benefits to wage growth, to adopting automatic cost-of-living adjustments, Congress has a long history of expanding Social Security, especially come election time.

Even Ida May Fuller herself grew concerned about the program’s unsustainable expansion. When Congress proposed yet another benefit increase in 1970, Fuller voiced her opposition. “It’s been raised as far as it ought to go,” she said. “Every time they raise it, they raise the amount taken away from the working people who pay into it and it’s just getting to be too much of a burden.”

Fuller’s warning rings true today. Social Security extracts significant resources from workers while offering them less in return. Most workers would be better off if Social Security didn’t exist and they had saved the money they paid in payroll taxes in accounts that they owned and controlled instead.

Alas, Congress has made millions of Americans largely dependent on Social Security for their retirement income by incentivizing them to work and save less than they otherwise would have. Transitioning away from this unsustainable system will be costly and politically painful. We can begin by reducing the growth of future benefits, increasing the age at which new beneficiaries can claim Social Security, and discontinuing cost-of-living adjustments for wealthier Americans to reduce their benefits slowly over time. Bolder changes would transition the program away from an earnings-related benefit to a poverty-targeted, predictable benefit. Significant benefit changes are necessary to reduce payroll taxes and enable workers to save more in private accounts that they personally own.

It’s time to confront the painful but necessary truth that no matter what story politicians told, Social Security has always been an income transfer program, not a savings system. Today’s Social Security is increasingly burdening future generations with the threats of higher taxes and inflation. A more effective approach would design Social Security as a safety net to prevent senior poverty while empowering workers to have greater control over their retirement security.

Next month I’ll publish a new Cato Policy Analysis that will dive deeper into exposing the myths surrounding the Social Security trust fund, revealing its true nature as a legal Ponzi scheme. This report will uncover the fiscal realities of Social Security’s financing challenge and highlight structural reforms that will reduce the rising burden on younger workers while protecting vulnerable seniors. 


Alex Nowrasteh

Steven Malanga, a senior fellow at the Manhattan Institute and a senior editor of City Journal, argues that there is a crime wave caused by immigrants in his recent piece, “No, You’re Not Imagining a Migrant Crime Spree.” This is not accurate and the article overall is not a useful source for those interested in understanding illegal immigrant criminality. 

Malanga’s piece leans heavily on rhetoric and individual cases of migrant crime and doesn’t give nearly as much attention to migrant criminal conviction, arrest, or incarceration rates as he should have. He begins his piece by recounting the tragic murder of Laken Riley in Georgia and a political debate that ensued when President Biden mentioned the murder in his 2024 State of the Union address. Malanga then brushed aside the best empirical evidence on illegal immigrant criminality when he wrote:

The elite press rode to Biden’s defense. The idea of a migrant crime wave was a myth, media outlets proclaimed, noting studies of Texas incarceration data from years ago, which seemed to suggest that illegals commit crimes at low rates.

What are those “studies of Texas incarceration data from years ago”? They don’t exist because those studies are of Texas criminal conviction and arrest rates by immigration status, not incarceration rates. 

Researchers and the media focus on Texas state crime data because it’s the only state that records criminal convictions and arrests by immigration status. We’re fortunate that Texas does because it’s a border state with the second highest population of illegal immigrants after California, and it’s governed by Republicans, which short-circuits the common conservative argument that you can’t trust crime data reported from Democratic jurisdictions. The data are also more recent than Malanga makes it seem—although they weren’t available when Riley was murdered.

After her murder, I wrote a study of criminal convictions in Texas by immigration status focusing on homicides that includes 2022 data. My study shows that illegal immigrants have lower homicide conviction and arrest rates than native-born Americans but higher rates than legal immigrants. 

The Texas data are pretty good but they contain some hidden landmines, as I have explained here. The most recent Texas data from 2022 were not recent enough for him, but a piece on the supposedly high rate of illegal immigrant gang membership published 20 years ago made the cut.

In 2022, native-born Americans had a homicide conviction rate of 4.9 per 100,000 in Texas (Figure 1). In other words, 4.9 native-born Americans were convicted of homicide in that year for every 100,000 native-born Americans living in Texas. In the same year, the illegal immigrant homicide conviction rate was 3.1 per 100,000 illegal immigrants living in Texas. The legal immigrant homicide conviction rate was the lowest of all at 1.8 per 100,000. 

There were 1,336 people convicted of homicide in Texas in 2022. Native-born Americans were convicted of 1,209 of those homicides, illegal immigrants were convicted of 67, and legal immigrants were convicted of 60. Homicide conviction rates for illegal immigrants and legal immigrants were 36 percent and 62 percent, respectively, below those of native-born Americans in Texas in 2022.
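Using the rounded counts and per-100,000 rates reported above, the implied group populations and relative gaps can be reconstructed (rounding in the published rates explains the small drift from the 36/62 percent figures cited above):

```python
# Texas, 2022: homicide convictions and conviction rates as reported above.
convictions = {"native": 1209, "illegal": 67, "legal": 60}
rate_per_100k = {"native": 4.9, "illegal": 3.1, "legal": 1.8}

# Implied group populations, since rate = convictions / population * 100,000.
population = {g: convictions[g] / rate_per_100k[g] * 100_000
              for g in convictions}

# Conviction-rate gap relative to the native-born rate.
for g in ("illegal", "legal"):
    gap = 1 - rate_per_100k[g] / rate_per_100k["native"]
    print(f"{g}: {gap:.0%} below the native-born rate")
```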

The results of the above analysis of Texas crime data were criticized by the Center for Immigration Studies (CIS). Malanga cites CIS’ criticisms in his piece but he does not cite my responses. The major disagreement is that I maintain that CIS double-counted some illegal immigrant convictions based on how they requested the data. To be clear, this was not intentional on their part. CIS disagrees. 

Furthermore, these data are presented without any controls for age, sex, race and ethnicity, and socioeconomic status (SES). Even if CIS is correct and my data analysis is wrong, the difference between illegal immigrant and native-born criminal conviction rates for homicide is small. Introducing normal controls for age, sex, race and ethnicity, SES, or any combination would surely produce a far lower illegal immigrant criminal conviction rate, as Richard Hanania points out. Although introducing controls would be best practice, it’s not necessary to have such controls because I’m happy fighting in the methodological landscape crafted by the nativists at CIS.

Malanga faults pro-immigration analysts and advocates for countering “that significant immigrant wrongdoing can’t be going on, as crime rates are now falling.” He’s correct about that. The foreign-born share of the US population is just shy of 15 percent, so fluctuations in immigrant criminality would have to be extreme to drive the nationwide crime rate. That’s why it’s important to look at illegal immigrant and immigrant crime rate data where they are available, which brings us back to his erroneous dismissal of the Texas criminal conviction data.

On this point, Cato also published research estimating legal and illegal immigrant incarceration rates. Our latest estimate of these is for the year 2018 (we're updating it now), and there are methodological challenges that expose it to criticism. But the findings are in line with other research on immigrant criminality.

Relative to native-born Americans, we found that illegal immigrants were 41 percent less likely to be incarcerated and legal immigrants were 74 percent less likely. But Malanga didn’t have to rely on Cato’s research (although he definitely should have); he could have looked at a wonderful working paper by Ran Abramitzky, Leah Platt Boustan, Elisa Jácome, Santiago Pérez, and Juan David Torres that took a long-term look at immigrant criminality in American history. They find:

We provide the first nationally representative long-run series (1870–2020) of incarceration rates for immigrants and the US-born. As a group, immigrants have had lower incarceration rates than the US-born for 150 years. Moreover, relative to the US-born, immigrants’ incarceration rates have declined since 1960: immigrants today are 60% less likely to be incarcerated (30% relative to US-born whites). This relative decline occurred among immigrants from all regions and cannot be explained by changes in immigrants’ observable characteristics or immigration policy. Instead, the decline is part of a broader divergence of outcomes between less-educated immigrants and their US-born counterparts.

Abramitzky and his coauthors don’t estimate illegal immigrant incarceration rates and their data only go through 2020, but the figure from their paper is striking.

Malanga’s more fundamental point is that there’s an immigrant crime wave in the United States. He recounts individual high-profile crimes like the murder of Laken Riley and others, but those don’t tell us whether there’s a crime wave—regardless of how tragic those examples of individual crimes are. Only crime rates, incarceration rates, criminal conviction or arrest rates, and similar measures can tell us whether there is a crime wave. 

Furthermore, US-wide crime rates can't tell us the source of a crime wave, as Malanga suggests they can. That's why we need the detailed evidence above on immigrant criminality relative to native criminality to figure it out.

Crime rates are more important than relying on individual cases of crimes committed by immigrants because they allow us to see whether there is more crime controlling for the number of people, which is important because more people generally lead to more crime. Crime rates tell us whether Americans are more at risk of being a victim of crime; they tell us whether certain subpopulations are more or less likely to commit crime than others; and they tell us whether some jurisdictions are more dangerous than others. The criminality of different subpopulations can indicate where higher or lower crime rates are coming from. 

As I wrote earlier this year, “the focus on crime rates matters when discussing the relative criminality of different groups and evaluating whether immigrants bring more crime than they add people to the United States.”

Malanga doesn’t spend much time discussing crime rates, criminal conviction rates, incarceration rates, or other similar measures over time, which makes it impossible to judge whether we’re living in the middle of a crime wave largely or partially caused by illegal immigrant criminals. The recent crime surge began in May 2020 after the murder of George Floyd, which was also the month with the second-lowest number of illegal immigrant border apprehensions in a very long time. We’re constantly told that Trump controlled the border in the preceding years, which is exaggerated for political purposes, but there were still many fewer crossings and apprehensions (“encounters” are what they’re called now) than after Biden was sworn in. Odd timing for an illegal immigrant crime wave, to say nothing of subsequently escalating border encounters and falling crime rates. So, where were these criminals born? Overwhelmingly in the United States.

Malanga writes that ICE removals of illegal immigrant criminals are down compared to 2019, although they are up since the last month of 2020 during the pandemic when Trump was president. Another wrinkle is that ICE has released fewer criminals under the Biden administration than during the Trump administration. There were 182,870 book-ins of criminal migrants into ICE detention in 2020, which rose to 211,450 in 2021, rose by another 100,000 in 2022, reached 273,220 in 2023, and was on track for more in 2024. The above administrative actions are complex and paint a confusing picture. 

Regardless, migrant crime cannot be a major driver of nationwide crime rates and small changes in ICE book-in or removals policy can’t explain much of the variation in crime rates because the numbers are so small. Again, this is why focusing on illegal immigrant criminality measured by criminal conviction, arrest, and incarceration rates is so valuable.

Malanga then cites data from the State Criminal Alien Assistance Program (SCAAP), which allows state governments to apply for partial federal reimbursement for incarcerating illegal immigrants. Before mentioning the SCAAP data, Malanga refers to other surveys of federal multistate data that supposedly “show a far more troubling reality.” He provides no link to these surveys, but I assume he means the SCAAP data.

Malanga leans heavily on a report from the Federation for American Immigration Reform (FAIR) that analyzes the SCAAP data and finds that illegal immigrants have a much higher incarceration rate than other populations. I criticized FAIR’s interpretation of the data here and here. In the years analyzed, illegal immigrants were about 6.5 percent of Texas’s population, 4.7 percent of arrests, 3.9 percent of criminal convictions, and accounted for 3.6 percent of all days incarcerated. The patterns are similar for other states.

The recent letter from ICE to Rep. Tony Gonzales (R‑TX) on the 662,556 noncitizens convicted of crimes or with pending charges also makes an appearance in Malanga’s piece. He says that these criminals aren’t being deported, but that’s because they are on ICE’s non-detained docket. That mostly means the migrants are currently incarcerated, deceased, or from countries to which they can’t be deported for lack of a treaty, among other reasons. It’s not as if the Biden administration just decided not to deport them to create a crisis that undermined the Democratic Party’s electoral chances.

Malanga links to a piece by Peter Kirsanow about a GAO report on SCAAP criminal aliens. Kirsanow makes the same error that FAIR did when he assumed that the number of criminal aliens incarcerated was a stock figure. In fact, it was the total number of criminal alien incarcerations—a sum of stocks and flows. In response to Kirsanow’s piece, I wrote that this means “if a criminal alien was incarcerated for 10 short sentences, released after each one, and then re-incarcerated, then that single alien would account for 10 incarcerations under the SCAAP figure for that year. But Kirsanow counts that as 10 individuals [at a single point in time].

“However, when it comes to estimating the incarceration rate of natives, Kirsanow compares the number of individuals incarcerated with their total population [at a single point in time].” In other words, he’s comparing the GAO’s measure of flows plus stocks of criminal alien incarcerations over a long period of time to the native-born stock of incarcerations in a single period. Kirsanow’s analysis is an apples-to-oranges comparison that can only yield a relatively higher criminal alien incarceration rate.
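The arithmetic behind this apples-to-oranges problem is easy to see with a toy example. The numbers below are invented purely for illustration; none of them come from the GAO report or SCAAP data:

```python
# Hypothetical toy numbers illustrating the stock-vs-flow error described
# above; these figures are NOT from the GAO report or SCAAP data.

population = 1_000_000        # size of each (identical) group

# Native-born side: a point-in-time stock of incarcerated individuals.
native_inmates_today = 500

# Criminal-alien side: a SCAAP-style count of incarceration *events* over a
# year. Suppose the same 500 individuals each serve 10 short sentences.
alien_individuals = 500
sentences_each = 10
alien_incarceration_events = alien_individuals * sentences_each  # 5,000 events

# Comparing events to a point-in-time stock makes two identical groups
# look ten times apart, which is exactly the apples-to-oranges problem.
inflation_factor = alien_incarceration_events / native_inmates_today
print(inflation_factor)  # 10.0
```

By construction the two groups are equally criminal, yet the mixed stock/flow comparison inflates one group’s apparent incarceration rate tenfold.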

Much of Malanga’s piece recounts individual instances of migrants committing crimes. Almost all the cases he mentions are examples of individuals who should never have been allowed into the United States or who should be removed. But his mention of eight nationals from Tajikistan with supposed terrorist ties is really too much. Eight Tajik men who crossed the US-Mexico border in 2023 and 2024 were arrested in early June 2024 on immigration charges after the government learned they may have had contacts with ISIS or with people who had potential ties to ISIS. There was no evidence that a specific terrorist attack was planned, no evidence of an imminent threat to the homeland, no terrorism charges have been filed against them, and “ties” is an ambiguous term with little consistent meaning. Zero people have been murdered in domestic terrorist attacks committed by migrants who entered illegally.

Zero Americans have been murdered by foreign-born terrorists in attacks on US soil since President Biden took office, and there has only been one injury in an attack, which was committed by Canadian-born David DePape, who was inspired by his right-wing ideology to attack Nancy Pelosi and instead injured her husband. 

In contrast, 12 people were murdered and 38 were injured in attacks committed by foreign-born terrorists on US soil during the Trump administration. It’s certainly unfair to blame Trump or credit Biden for such different records, just as it’s unfair to blame presidents for the number of people with “terrorism ties” who cross the border even when none of them are terrorists.

Malanga’s piece begins and ends by criticizing the “elite press” and “media” for mentioning low illegal immigrant criminal conviction rates or describing Donald Trump’s immigration platform as “extreme.” It wouldn’t be a conservative piece without criticizing the media. But the only reason we know about these cases of illegal immigrants committing crimes is because of reporting by the media and elite press, so let’s give credit where it’s due. 

More importantly, press reporting on illegal immigrant criminal conviction rates alongside cases of individual migrants committing crimes is a welcome development. Much of the press breathlessly reports tragedies without informing their readers of the baseline hazard, so I can’t be too upset when they provide that information. My only complaint is that the media should do this when they report every rare event, like the chance of being killed or injured in domestic terrorist attacks, the chance of being killed in a mass shooting, the chance of being unlawfully killed by a police officer, and other hazards.

Malanga and I agree on a few points. First, immigrant noncitizens who commit violent or property offenses should be punished and removed from the United States. Our immigration laws make this more difficult because immigration enforcement authorities waste most of their time trying to block peaceful migrants from selling their labor to Americans. Liberalizing immigration laws would reduce the black market and allow the government to focus on blocking, identifying, and removing migrant criminals. It’s unclear whether Malanga would agree with the last part, but he certainly would agree with the first. 

The second area of agreement is on data availability. Every state should keep data on arrests and convictions by crime and immigration status. After all, it’s hard to adjust policy if we don’t know what’s going on. There’s no good reason for American law enforcement to not report these data.

Malanga’s piece ignores significant evidence, pays scant attention to actual crime rates, sometimes misdescribes research, is occasionally unspecific, and focuses overwhelmingly on individual instances of crimes committed by migrants. His piece is not a useful source for those interested in understanding illegal immigrant criminality and what to do about it.


Jeffrey Miron

The standard argument for scope-of-practice (SOP) laws—and more generally, for occupational licensing—is that such regulation improves the quality of services by keeping out lower-skill providers. If so, then these government-created barriers to entry are potentially beneficial, even if they reduce supply and raise prices.

One problem with this perspective is that existing evidence does not find improvements in quality from SOP or licensing laws, and a new study confirms as much:

The United States is experiencing a shortage of physicians—exacerbated by the COVID-19 pandemic—and the shortage is expected to worsen primarily because of population growth and aging. Notably, the availability of ophthalmologists is trending downward despite growing demand for eye care. … Given the limited availability of ophthalmologists, some have suggested leveraging optometrists, who also have skills in eye care. …

And, lo and behold,

Beginning in the 1970s, optometrists gradually obtained the authority to prescribe medications. This scope-of-practice expansion has allowed optometrists to diagnose and treat patients with eye diseases or disorders without referrals to ophthalmologists. …

States have introduced and expanded optometrists’ prescription authority in multiple phases. Initially, optometrists were permitted to administer medications only for diagnostic purposes. Then, states allowed optometrists to prescribe medications for treatment purposes—known as therapeutic pharmaceutical agent (TPA) prescription authority. …

Our research estimates the effects of optometrists’ TPA prescription authority…. Specifically, it examines the effect of laws that allow optometrists to prescribe drugs for glaucoma treatment on public eye health. …

Our estimates provide evidence that granting TPA prescription authority to optometrists improved public eye health and increased optometrists’ earnings. Vision impairment declined by 12 percent on average over a 15-year period after the policy change.

Thus, rather than improving consumer outcomes, restrictive SOP laws made them worse. So why do such laws exist? Presumably as protectionism for suppliers who want to avoid competition (but claim they are helping consumers). Bootleggers and Baptists lives!

This article appeared on Substack on October 22, 2024.


Brandan P. Buck

Fall may finally have arrived in DC, but one discussion never seems to go out of season: pontificating on the need for a return of military conscription. While the leaves may have turned, the substance of the case from conscription’s advocates remains the same, blending arguments of geopolitical necessity and sociocultural improvement and claiming that such an institution “would contribute to maintaining the integrity of our domestic political life.”

As radical as such an idea may sound, it has gained traction in recent months. And even when dressed with a veneer of moderation, it still holds alarming potential for undermining American liberty. Such was the case for the Center for a New American Security’s (CNAS) June report, “Back to the Drafting Board.” Part of what author and activist Edward Hasbrouck has dubbed the “summer of the draft,” CNAS’s report advocated for an overhauled Selective Service System (SSS) to serve as a backstop enabling total military mobilization and as a deterrent against near-peer competitors.

While the paper noticeably and laudably eschewed two traditional talking points, namely the use of the draft as a domestic political tool and as a means of coercive recruitment for peacetime, the proposal by authors Katherine L. Kuzminski and Taren Sylvester should cause alarm for those who value liberty and restrained government.

Chief among their ideas was an overhauled registration system that would collect personal data beyond the current limits of one’s name, Social Security number, and address. Kuzminski and Sylvester claim that the changing nature of war requires a 21st-century military with diverse skill sets, arguing that the All-Volunteer Force (AVF) “could further benefit from the use of conscripts who bring technical skill sets not readily available in the professionalized force.” They propose, as other draft proponents have argued, that a future American military would need its share of keyboard warriors too, and only a draft could adequately field such a force. 

To prepare for such a requirement, the authors assert that “the federal government would benefit from additional information, including educational attainment, chronic medical conditions precluding military service, skill sets, and preferences regarding assignment to the military services and career fields or military occupational specialties.” In short, Uncle Sam would need to build dossiers on every American man (and possibly every American woman) from 18–26 and maintain them via a virtual panopticon, the likes of which the US has never constructed before.

Putting the moral arguments aside for a second, such data collection requirements would pose a logistical nightmare for a federal government that has continually proven itself inept at protecting the private information of its citizens. Such collection efforts look nearly impossible, for, as Hasbrouck has noted, current SSS databases are woefully neglected, with dead addresses and other shortcomings. This raises the question of enforcement: how do Kuzminski and Sylvester propose to enforce such requirements? We do not know, because they do not tell us.

Enforcement is a glaring oversight on their part, especially since they acknowledge that “a higher percentage of Americans would object to compulsory service in a future scenario than in the past—even higher than the rate of protests seen during the Vietnam War,” a widely shared expectation about any future conscription exercise, the implications of which lie at the heart of their article.

Back to the drafting board, indeed.

Additionally, because most initial selective service enrollees would conceivably have very similar educational training and skills, they would have to regularly update their information with the SSS for such a database to be effective. This requirement would insert the federal government into the lives of every American citizen, placing them as a target of constant surveillance. 

Finally, in their report, they note that social media “has the potential to play a role in American compliance with future draft mobilization,” using Washington’s latest buzzword, “disinformation.” The implications here, of course, are disturbing, yet another casus belli for undermining the First Amendment through social media regulation. 

Past opponents of conscription argued that the practice would turn America into a garrison state. Whether in war or peace, Kuzminski and Sylvester’s proposals would do just that.


Andy Craig

As we get closer to the election, a number of articles have been published offering legal explainers of how the process would work under the new Electoral Count Reform Act of 2022 (ECRA). These have been broadly correct in most details, but one common claim reflects a confusion worth clarifying: specifically, the conditions under which a “contingent election” could be triggered, sending the race to the House under the procedures of the Twelfth Amendment.

A contingent election has been a particularly acute fear among Democrats because, in the House, it is not each representative who would have one vote but each state’s delegation. An absolute majority of states, twenty-six, is required to win. Republicans will likely control more state delegations than Democrats even if Democrats hold an overall majority of the House. It is also possible, with some states evenly split, that no party would control the needed twenty-six votes. That would have been the case last time even if a contingent election had been triggered, due to Republican members in Wyoming and Michigan who opposed the effort.

The most obvious way a contingent election could happen, though it’s unlikely, would be an exact 269–269 tie in the Electoral College. Even less likely would be a third candidate winning enough states, a strategy attempted several times in American history but never successful. There are no prominent third party candidates with such a chance in this election. Under these scenarios, no candidate would then have “a majority of the whole number of Electors appointed,” which the Twelfth Amendment requires to avoid a contingent election. There are 538 electoral votes, so normally a majority would be 270. 

The other possibility frequently discussed is that Congress, during the electoral count on January 6, 2025, would reject enough electors so that no candidate reaches 270 votes. However, the form of the objection matters here, and the kind of objection most often floated by Trump supporters would not result in a contingent election. 

This distinction arises from the implications of the Constitution itself, but is codified by ECRA with two different provisions: 3 USC §15(d)(2)(B)(ii)(I) and 3 USC §15(d)(2)(B)(ii)(II). Try reading those citations out loud five times fast. But more simply, the first kind of objection is that the electors were not “lawfully certified.” The second kind of objection is that the votes of valid electors were not “regularly given.” 

An objection that electors were not lawfully certified would be the implication of claiming a candidate did not really win a state’s popular vote, or did not do so lawfully, and so that party’s electors were not lawfully certified. This kind of objection is intended to be foreclosed by ECRA altogether, by requiring only a single certified slate of electors as determined under state law and confirmed by possible litigation, as provided under other sections of the law. The whole point is that sitting in judgment of a state’s popular vote result is not within the proper purview of Congress. The provision remains only for the slim possibility that those mechanisms have failed and Congress is presented with a slate of electors not in compliance with the law’s procedural definition of “lawfully certified.”

An objection that votes were not “regularly given” would mean the electors themselves were properly appointed but somehow cast invalid votes. This would cover, for example, electors voting for an ineligible candidate, not entirely hypothetical in this race, but lacking serious support in Congress. It would also apply if the electors voted for candidates for president and vice president who are both residents of the same state as the electors, which is not allowed. It would in theory also include failure of the electors to meet and vote on the required day, which the Constitution requires to be the same throughout the United States, or failure to certify their votes in the form required.

The first kind of objection, lawfully certified, would have the effect of reducing the “whole number of electors appointed,” because its contention is that the rejected electors were not really electors at all. This would mean the denominator for determining a winning majority is likewise reduced. It would be the same as saying the state simply failed to appoint any electors, as southern states did during the Civil War. In such a case, whoever has a majority of the votes cast by the remaining electors, unless there is a tie, would be directly elected. There would be no contingent election in the House. 

Only the second kind of objection could result in the total number of electors still being 538, thus leaving the winning majority threshold at 270. But this kind of objection has little relevance to the common theories advanced by Trump supporters in the last election and being readied for a possible repeat attempt. To make an objection of this nature would be conceding that Harris really did win the relevant state, the Democratic electors were properly certified, and yet those electors somehow cast invalid votes. 

The law makes this distinction clear, carefully tracking the Constitution’s text. 3 USC §15(e)(2) provides that the whole number of electors is reduced only if a state has failed to appoint the number of electors to which it is entitled, or else an objection is sustained under “subsection (d)(2)(B)(ii)(I)” (emphasis added). That is, a “lawfully certified” objection, but not a “regularly given” objection.

In other words, the kind of objection which could trigger a contingent election is also the kind of objection made effectively impossible by ECRA, because “lawfully certified” objections can only be raised if Congress is somehow presented with a slate of electors in defiance of a federal court ruling. ECRA requires such a question to be decided in the courts prior to January 6, with that outcome being binding on Congress. The concept of Congress choosing between dueling slates of claimant electors, as envisioned under the previous Electoral Count Act of 1887, is no longer applicable. 

As a practical matter, even with the possibility of narrow Republican majorities in both the House and Senate, it is plain that not enough members would vote in favor of an attempt to overturn the election. ECRA was co-sponsored by ten Republican senators and passed out of committee with only one Republican vote against it. In 2021, about two-thirds of House Republicans voted for the objections, while in the Senate only eight voted for one or both of the two objections considered. Considering that any plausible Republican majorities would have only a few votes to spare, possibly as few as one, majorities for sustaining such objections in both chambers will be out of reach. Even if higher shares of GOP members vote that way, it would require near-unanimity for an idea already firmly rejected by many Republicans.

Attempts to subvert the outcome of the 2024 election are serious and ongoing, and merit serious opposition. But a contingent election in the House is not how such an effort would play out, and popular explainers of the law should avoid erroneously suggesting otherwise. 


Clark Packard

During a recent stop in Pennsylvania, US Trade Representative Katherine Tai was asked by Eric Martin of Bloomberg about the Biden administration’s decision to continue and expand tariffs on Chinese goods imposed under President Trump. She replied that the tariffs remain in place because “… we really haven’t seen the PRC (People’s Republic of China) make any changes to fundamental systemic structural policies that would make sense for us to provide any relaxation” and that the tariffs could provide leverage in some future, unspecified negotiation with the Chinese.

In 2018, the Trump administration issued its Section 301 report on Chinese trade and investment practices. Though flawed in certain respects, the lengthy report documented a litany of legitimately concerning policies employed by the Chinese government in pursuit of its 21st-century high-tech mercantilist agenda, which burden American commerce and hurt American workers. Based on the report’s findings, the Trump administration imposed a series of heavy tariffs that they claimed would force Beijing to make structural changes to its international trade and investment practices.

Over the last six-plus years, those tariffs have imposed significant costs on Americans as Cato scholars have repeatedly highlighted. Perhaps those costs could be justified if the tariffs had forced a wholesale reorientation of Chinese economic policies.

Instead, Americans have the worst of both worlds: tariffs continue to harm American firms and families while Beijing’s abusive practices continue largely unabated as Ambassador Tai and others—like the US-China Economic and Security Review Commission—acknowledge.

After more than six years of costly failure, the United States needs a different approach. The World Trade Organization’s (WTO) dispute settlement system needs to be reinvigorated, but when operational it offers a forum to hold Beijing accountable for violating its WTO commitments, as Cato analysis has shown. Likewise, rejoining the Trans-Pacific Partnership can raise commercial standards—and help offset China’s gravitational pull in the Pacific region—by establishing a large trading bloc of countries committed to high-quality, rules-based trade and investment practices.

Ultimately, Washington’s ability to force wholesale changes onto the Chinese economy is limited. Instead, growth should be the north star of US economic policy vis-à-vis China and policymakers should trust the United States’ traditional strengths: a commitment to the rule of law, openness to international trade and immigration, and dedication to dynamic, market-based innovation. 


Marc Joffe

With tens of thousands of California residents living on the streets and widespread concerns over a housing affordability crisis, we might expect political leaders to build a large volume of new housing as quickly and inexpensively as possible. But, of course, that has not been the case. Thanks to a combination of special interest influence and the phenomenon of “everything bagel liberalism”, under which progressives try to solve a multitude of often-conflicting problems with one policy, California governments and their nonprofit partners have been creating housing very slowly and at a high cost. A better alternative is for governments to get out of the way and allow private entities to build low-cost housing quickly without the overhead of other political objectives.

How Not to Do It

In 2016, a fire originating from a neighboring building seriously damaged 3300 Mission Street, a building that housed 28 single-room occupancy (SRO) hotel units, two stores, and a bar. Although the owner originally planned to fix the building, he sold it the following year to Oak Funds, a local real estate firm. This firm also declared an intention to restore the SRO but never did so. Both owners may have been deterred from fixing the building because they would have been required to offer most of the units to their previous tenants, who were covered by the city’s rent control ordinance. The fact that San Francisco obliged the owners to rent out much of the property at below-market prices may have made repairing the building uneconomic, especially given San Francisco’s high construction costs.

Last year Oak Funds sold the still vacant building to affordable housing developer Bernal Heights Housing Corporation (BHHC) for $1 million more than it paid for the property in 2017. Since then, BHHC has been assembling financing to rebuild and expand 3300 Mission, albeit at a high cost to taxpayers.

Under a plan recently approved by San Francisco supervisors, BHHC will spend $41 million (including the already incurred acquisition cost) to produce 35 residential units plus a community space and a small retail space. The cost works out to about $1.1 million per residential unit, even though all the units will be small studios ranging in size from 267 to 406 square feet.
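A quick back-of-the-envelope check of that per-unit figure, using only the budget numbers cited above:

```python
# Per-unit cost check for the 3300 Mission project, using figures from the
# approved plan cited above.

total_cost = 41_000_000   # project budget, including acquisition cost
units = 35                # residential studio units

cost_per_unit = total_cost / units
print(round(cost_per_unit))  # 1171429, i.e., roughly $1.1-$1.2 million per studio
```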

One reason the project is so expensive is that San Francisco requires construction workers on the affordable housing projects it funds to be paid so-called prevailing wages determined by the California Department of Industrial Relations. This means that construction laborers must receive a salary and benefits package worth $69.41 per hour on weekdays, with large premiums for Saturday and Sunday work.

Although construction costs themselves are high, there are other drivers for the high price of these units. The project budget includes $3.3 million in construction financing costs, $2.6 million in developer fees, $2.2 million in architect fees, and $590,000 in legal fees, so a range of professionals are being well compensated.

BHHC will receive $16.6 million of city funds, covering 40 percent of overall project costs. The rest will be privately funded by investors taking advantage of a 9 percent Low Income Housing Tax Credit (LIHTC). The credit equals 9 percent of qualifying project cost, claimed annually for each of the ten years after the building is completed. The credits can also be bought and sold on the secondary market. The Tax Foundation has characterized LIHTC as “an exceptionally complex tax expenditure.”
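To see how generous the 9 percent credit is over its ten-year life, here is a rough sketch. It assumes, purely for illustration, that the eligible basis equals the privately financed share of the budget (total cost minus city funds); actual LIHTC basis rules are more complicated than this:

```python
# Illustrative LIHTC arithmetic for the 3300 Mission project. The eligible
# basis below is an ASSUMPTION (total budget minus city funds), not a figure
# from the project's actual tax filings.

total_cost = 41_000_000
city_funds = 16_600_000
eligible_basis = total_cost - city_funds       # $24.4M privately financed share

annual_credit = eligible_basis * 9 // 100      # 9 percent of basis per year
total_credits = annual_credit * 10             # claimed over ten years

print(annual_credit)   # 2196000
print(total_credits)   # 21960000
```

Under these assumptions, investors would recover roughly 90 percent of the eligible basis in credits over the decade, which is why the 9 percent LIHTC attracts private capital despite its complexity.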

Occupancy is now expected in late 2026, ten years after the original building became uninhabitable. And the City Attorney has determined that rent-controlled tenants living in the building before the fire will no longer have a right to return.

A More Efficient Option

While government-funded projects typically require on-site construction with laborers receiving prevailing wages, private affordable housing projects can rely on lower-cost prefabricated units. Two miles north of 3300 Mission at 33 Gough Street, nonprofit Dignity Moves created a 70-unit tiny home village. Construction began in January 2022 with occupancy occurring six months later. The cost per unit, including shared amenities, worked out to $32,000.

The homes do not include separate kitchens or bathrooms, but, as Dignity Moves CEO Elizabeth Funk recently told me, sharing dining and washing facilities is not a major issue for most individuals experiencing homelessness. More important to them is having a roof over their heads in a unit that can be locked.

The 33 Gough tiny home village is one of several “interim supportive housing” projects Dignity Moves has completed or is developing in California. These projects provide services such as life coaching and addiction treatment, in addition to the homes themselves. While the lifespan of the tiny homes is likely shorter than permanent supportive housing units, they are less expensive to maintain, insure, and replace.

Some government officials are embracing interim housing, including San Jose Mayor Matt Mahan. His city plans to have about 1,300 interim units available by the end of 2024. The California state legislature has also gotten into the act, recently passing a bill exempting tiny home villages from the California Environmental Quality Act (CEQA) process.

It remains to be seen whether more government involvement in the creation of tiny home villages will slow their completion and increase their costs. Hopefully, tiny homes will not become the schmear on the everything bagels so often baked by California’s public sector.
