The Cost of Obamacare’s “Slacker Mandate”

What with the rapid unraveling of the Obamacare health insurance exchanges, Americans might be excused for having forgotten one of Obamacare’s first intrusions: The “slacker mandate.” This is the provision that requires employer-based health plans to cover “children” on their parents’ plans until they are 26.

It took effect in 2010. No other law requires parents to take care of their kids until they are 26. But health care is different, as we all know. The slacker mandate increased premiums by one to three percent, and it is associated with the introduction of more finely tiered employer-based benefits, according to research discussed by Bruce Japsen.

A couple of things to note: The costs of the slacker mandate were not socialized. The entire cost of the mandate is borne by the parents, through increased premiums. Pre-Obamacare, employer-based benefits tended to have three premium tiers: single, couple, or family. It did not matter how many kids you had (because covering kids is so inexpensive that it’s not worth the administrative hassle of adding a surcharge for each kid). The slacker mandate changed that, and more employer-based plans now charge a premium for each dependent.

Anyway, the results are in, according to a new study published by the National Bureau of Economic Research:

If, as suggested by prior work, the provision reduced the amount of time young adults work, the question arises, what have these adults done with the extra time?

The extra time has gone into socializing, and to a lesser extent, into education and job search. Availability of insurance and change in work time appear to have increased young adults’ subjective well-being, enabling them to spend time on activities they view as more meaningful than those they did before insurance became available.

(Gregory Colman & Dhaval Dave, “It’s About Time: Effects of the Affordable Care Act Dependent Coverage Mandate on Time Use,” NBER Working Paper No. 21725, November 2015.)

The Washington Post has reproduced some of the charts. Here are some examples:

  • Socializing has increased about 30 minutes per day among 23- to 25-year-olds.
  • Sleeping has gone up about 10 minutes per day among 19- to 25-year-olds.
  • Work has gone down about 20 minutes per day among 23- to 25-year-olds.
  • Exercise has gone up about 10 minutes per day among 23- to 25-year-olds.

Well, I am glad they are staying fit.

Look, there was never a law preventing parents from paying for their kids’ health insurance after they reach the age of majority. Mandating that parents pay so these young adults can spend more “meaningful” time socializing instead of working is likely to cost those young adults more, in long-term earnings and fulfillment, than any benefit they get from Obamacare.

* * *

For the pivotal alternative to Obamacare, please see the Independent Institute’s book, A Better Choice: Healthcare Solutions for America, by John C. Goodman.

A Minor Victory for Privacy: NSA’s Bulk Phone Collection Ends

The expiration of the National Security Agency’s power to collect and indefinitely store all phone records is neither cause for raucous celebration among privacy advocates nor cause for predictions of doom among national security hawks, such as this one from Fox News:

The National Security Agency’s sweeping authority to collect phone-record data expired Sunday, despite evidence that such programs helped European officials track down the perpetrators of the recent Paris suicide bombing attacks and prevented other attacks.

American security agencies retain every tool the French drew on when they used cell phone records to track down some of the perpetrators—after, we may add, the fact.

The more salient fact is that French security agencies, like American agencies before 9/11, had plenty of data in hand—they were simply inept at using it to prevent either the Charlie Hebdo attack or the more recent attacks.


Hospital Ownership of Physicians Drives Up Costs

New research published in the journal JAMA Internal Medicine supports, with rigorous data analysis, the notion that hospital ownership of medical practices drives up costs:

Among the 240 Metropolitan Statistical Areas, physician-hospital integration increased from 2008 to 2012 by a mean of 3.3 percentage points, with considerable variation in increases across MSAs. For our study sample of 7,391,335 nonelderly enrollees, an increase in physician-hospital integration equivalent to the 75th percentile of changes experienced by MSAs was associated with a mean increase of $75 per enrollee in annual outpatient spending from 2008 to 2012, a 3.1% increase relative to mean outpatient spending in 2012. This increase in outpatient spending was driven almost entirely by price increases because associated changes in utilization were minimal (corresponding change in price-standardized spending, $14). Changes in physician-hospital integration were not associated with significant changes in inpatient spending ($22 per enrollee) or utilization ($10 per enrollee).

(Note: I have edited out the measures of statistical significance from the abstract, for ease of reading.)

$75 per enrollee is not a huge increase, but it certainly supports the thesis that hospital acquisition of physician practices does not reduce the total cost of care, even when the hospital does not push its physicians to fill the hospital’s beds. This is an important topic, because providers are reorganizing themselves to take advantage of new forms of reimbursement in which they bear financial risk.
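To put that $75 in perspective, here is a quick back-of-the-envelope calculation (my own, not the study’s): if $75 represents a 3.1 percent increase relative to mean 2012 outpatient spending, the implied baseline is roughly $75 ÷ 0.031 ≈ $2,400 in annual outpatient spending per enrollee.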


The Pilgrims’ Real Thanksgiving Lesson

With Thanksgiving upon us once again, we offer a reminder of the economic lesson that made our first Thanksgiving possible:

The Pilgrims’ Real Thanksgiving Lesson
by Benjamin Powell

Feast and football. That’s what many of us think about at Thanksgiving. Most people identify the origin of the holiday with the Pilgrims’ first bountiful harvest. But few understand how the Pilgrims actually solved their chronic food shortages.

Many people believe that, after the Pilgrims suffered through a severe winter, their food shortages were resolved the following spring when the Native Americans taught them to plant corn, and that a Thanksgiving celebration resulted. In fact, the Pilgrims continued to face chronic food shortages for three years, until the harvest of 1623. Bad weather or lack of farming knowledge did not cause the Pilgrims’ shortages. Bad economic incentives did.

In 1620 Plymouth Plantation was founded with a system of communal property rights. Food and supplies were held in common and then distributed based on “equality” and “need” as determined by Plantation officials. People received the same rations whether or not they contributed to producing the food, and residents were forbidden from producing their own food. Governor William Bradford, in his 1647 history, Of Plymouth Plantation, wrote that this system “was found to breed much confusion and discontent and retard much employment that would have been to their benefit and comfort.” The problem was that “young men, that were most able and fit for labour, did repine that they should spend their time and strength to work for other men’s wives and children without any recompense.” Because of the poor incentives, little food was produced.

Faced with potential starvation in the spring of 1623, the colony decided to implement a new economic system. Every family was assigned a private parcel of land. They could then keep all they grew for themselves, but now they alone were responsible for feeding themselves. While not a complete private property system, the move away from communal ownership had dramatic results.

This change, Bradford wrote, “had very good success, for it made all hands very industrious, so as much more corn was planted than otherwise would have been.” Giving people economic incentives changed their behavior. Once the new system of property rights was in place, “the women now went willingly into the field, and took their little ones with them to set corn; which before would allege weakness and inability.”

Once the Pilgrims in the Plymouth Plantation abandoned their communal economic system and adopted one with greater individual property rights, they never again faced the starvation and food shortages of the first three years. It was only after allowing greater property rights that they could feast without worrying that famine was just around the corner.

We are direct beneficiaries of the economics lesson the Pilgrims learned in 1623. Today we have a much better-developed and well-defined set of property rights. Our economic system offers incentives for us—in the form of prices and profits—to coordinate our individual behavior for the mutual benefit of all, even those we may not personally know.

It is customary in many families to “give thanks to the hands that prepared this feast” during the Thanksgiving dinner blessing. Perhaps we should also be thankful for the millions of other hands that helped get the dinner to the table: the grocer who sold us the turkey, the truck driver who delivered it to the store, and the farmer who raised it all contributed to our Thanksgiving dinner because our economic system rewards them. That’s the real lesson of Thanksgiving. The economic incentives provided by private competitive markets where people are left free to make their own choices make bountiful feasts possible.

Giving Thanks for Stores that Open on Thanksgiving

It’s that time of year again. Thanksgiving is upon us. Tomorrow, millions of us will join together with friends and family to celebrate the holiday. The day after, “Black Friday,” kicks off the holiday shopping season with a variety of sales.

Once again, these sales are bringing controversy with them. The issue is not that these sales are taking place, but when.

Every year we see the same uproar surrounding Thanksgiving and Black Friday sales: many stores open on Thanksgiving Day as part of their holiday promotions. Many individuals take issue with this practice, arguing that opening stores on Thanksgiving corrupts a holiday that’s all about spending time with family and friends.


Police Take More Property from People than Burglars

Most readers of The Beacon are probably familiar with the rise in civil asset forfeiture, which gives police the power to seize property they claim was used in criminal activity, often without accusing the property owner of a crime. They don’t have to. It’s up to property owners to prove they are innocent to get their property back.

Martin Armstrong posts on his blog that in 2014 the value of property taken through civil asset forfeiture exceeded the value of property taken by burglars. This article analyzes that claim in more detail, and it appears that the statistics Armstrong uses actually undercount the losses from civil asset forfeiture. For one thing, he looks only at civil asset forfeitures by the federal government.

It is unsettling to think that Americans’ property is more at risk of being confiscated by police than of being stolen by burglars.

How Lord Acton Trumps George Orwell in The Hunger Games

Katniss Everdeen makes a choice in a pivotal scene in Mockingjay, the third book in The Hunger Games trilogy by Suzanne Collins, that had the potential to elevate her into the pantheon of pro-freedom heroines in contemporary fiction. Unfortunately, neither the book nor the movies built on this act enough to place her there, much to the chagrin of libertarians (including me) who had hoped for more. Indeed, in previous blog posts, I suggested that The Hunger Games might be the Millennial Generation’s version of George Orwell’s classic 1984 (see here, here, and here). After reading the trilogy and watching all four movies, I no longer think the series holds that promise.

In fact, the fourth installment of the movie series, The Hunger Games: Mockingjay, Part 2, does even more to dispel any notion that Katniss Everdeen is anything more than a survivor (albeit a heroic one) trapped in a world controlled by the State. She eschews any leadership role or place in the revolution for freedom, making her quest a personal one rather than a blow for a higher principle or value. She is fighting against tyranny, but she doesn’t have much of an alternative to offer. This takes her out of the running as a true leader able to galvanize others around a common idea (and reduces her value as a strong female character as well).

Nevertheless, the pivotal scene in Mockingjay (and the movie Part 2) is worth discussing because it reflects a critical plot point in Katniss Everdeen’s character arc, and the decision she makes would likely have pleased Lord Acton (1834-1902). On the surface, Katniss accepts a Faustian bargain for the privilege of executing the secular tyrant President Snow. Indeed, this is her goal, since she recognizes that as long as Snow is alive, the Capitol and its ideas will never truly die. She accepts, it seems, a deal with rebel president Alma Coin to hold one last Hunger Games featuring the children of the Capitol. It’s a cynical effort by Coin to channel the bloodlust of the rebellion as a way to pave a road to peace, or at least that’s what Katniss and her peers are led to believe.


U.S. Department of Education Flunks Data Security...Again

Earlier this week the full House Committee on Oversight and Government Reform blasted the U.S. Department of Education for its lax security surrounding student data. But this isn’t the first time ED’s been taken to the woodshed.

The Government Accountability Office (GAO) reported in 2011 that ED still hadn’t implemented security controls recommended in 2009 by the department’s own Office of Inspector General (IG). And just this week the GAO again documented ED’s numerous information security weaknesses and deficiencies.

As Rep. Mark Meadows (R-NC) summed up, “You know, the headline should read: ‘Department of Education Gets an F’.” (Starting at 46:31, first video)


Rosa Parks Day: The Triumph of Colorblindness and Capitalism

Sixty years ago, Rosa Parks refused to give up her bus seat to a white man and was arrested for disobeying Montgomery, Alabama’s segregation ordinance. The story is well-known, even today, as we celebrate “Rosa Parks Day” (December 1). Following her arrest, African Americans organized a boycott of the city’s privately owned bus company. Martin Luther King, Jr. became spokesman for street protests and, ever since, the civil rights movement has been remembered as a militant expression of civil disobedience and “taking it to the streets.” Within a year, the city’s buses were desegregated, but not for the reasons you might think. The real heroes behind Rosa Parks were the NAACP lawyers who battered down the walls of institutional racism with the force of the Constitution, color-blind law, and capitalist forces that worked against racism—hallmarks of the classical liberal tradition of civil rights.

Laws segregating the races created separate spaces for each: separate bathrooms, water fountains, schools, theaters, and even beaches. Trolley cars and buses posed a challenge because it was economically impracticable to run separate bus lines: one for whites, the other for blacks. Therefore, as virulent white racism swept the South in the 1890s, cities passed ordinances requiring private bus companies to create separate sections for blacks and whites. As I demonstrate in Race and Liberty in America: The Essential Reader, bus companies strongly opposed this interference with their business. Company drivers would have to identify who was black or white. Furthermore, the creation of white and “colored” sections forced the bus companies to eliminate a benign form of segregation that they had created in response to consumer demand: a section for nonsmokers (smokers, not blacks, were relegated to the back of the bus). First-class “ladies” cars also had to go. Eliminating the “ladies” section placed respectable women next to prostitutes and male patrons who were not always gentlemen.


The Hobgoblin of Separation of Ownership and Control

The Great Recession heightened people’s search for scapegoats. One common target was corporate management, accused of harming shareholders and consumers rather than advancing their interests, with more government regulation put forward as the necessary solution. We saw it when the self-styled 99% blamed the 1% for their frustrations, when Hillary Clinton blamed weak economic growth on “quarterly capitalism,” and in other manifestations.

However, this line of argument is far from new. Its pedigree traces at least as far back as the doctrine of “the separation of ownership and control” in Adolf Berle and Gardiner Means’ 1932 book The Modern Corporation and Private Property.

The “separation of ownership and control” story focused on the widely dispersed ownership of corporations. In a nutshell, it argued that because corporate shares were spread among many small holders, no one had enough at stake to keep close tabs on corporate managers. Since managers knew that was the case, they could take advantage of shareholders, rather than advancing their interests. Therefore they did.